Old, forgotten, lonely? SQL Server 2016 will sling you into Azure

Microsoft has today announced SQL Server 2016 at its sysadmin-focused Ignite conference in Chicago. The software will be available as a preview release this summer. The Windows giant is continuing its commitment to hybrid cloud by adding a new feature called Stretch Database, which dynamically moves less active data to the Azure cloud …

  1. Anonymous Coward
    Anonymous Coward

    If you're using Stretch technology in SQL Server, always remember to add the /NSA:DENY (*) switch to the server command line to ensure your data is routed to the 'correct' cloud.

    (* documentation errata: this switch is merely a hint to the storage block redirector manager, and may be ignored by some revisions.)

  2. Anonymous Coward
    Anonymous Coward

    Haha!

    Let's move your data to the other side of a WAN link - to improve performance!

    I presume this is foolish reporting and not what MS is actually trying to do.

    1. big_D Silver badge

      Re: Haha!

      It sounds like a modern take on the old practice of moving stale data to tape, keeping local performance higher by not having to index low-access data. If the data can't be found locally, you can then load up the relevant tape - or in this case the SQL Server will "stretch" out to the off-site store to pull down that data.

      For those queries, it will be slower, but for the rest, it should keep performance in the acceptable range. If it works dynamically, then I would guess that data which starts to be used regularly again gets moved back locally.

      I used to work for an oil exploration company: the current surveys were held online, while the older data, which wasn't needed very often, was held in a huge warehouse with hundreds of thousands of tapes and transferred by van back to the data center as needed.

      1. Anonymous Coward
        Anonymous Coward

        Just a random thought, signifying nothing.

        ... or perhaps not so stale. Were it possible to "stale" the data quicker over to Azure, it might be a mechanism for doing the monthlies, quarterlies, and annuals off the "local cloud" entirely, which has to have some utility. BEA and similar gov't analytic types, even if not business types, should surely be customers.

    2. Tim Anderson

      Re: Haha!

      I think the idea is that your active data is always local so that perf is maintained. Microsoft also says "As core transactional tables grow in size, you may need to archive historical data to lower cost and to maintain fast performance" so I guess shunting stale data to the cloud could help with that.

      Tim
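
      [Editor's note: to make the archiving idea above concrete, here is a sketch of how Stretch is enabled, based on early SQL Server 2016 preview documentation. The exact syntax may change before release; the server name and the table dbo.OrderHistory are invented for illustration.]

      ```sql
      -- Instance-level opt-in to the Stretch feature (preview-era configuration option)
      EXEC sp_configure 'remote data archive', 1;
      RECONFIGURE;

      -- Link the database to an Azure SQL endpoint (server name is a placeholder;
      -- the preview also requires a credential for the remote server)
      ALTER DATABASE CURRENT
          SET REMOTE_DATA_ARCHIVE = ON (SERVER = N'myserver.database.windows.net');

      -- Mark a cold-history table for outbound migration; queries against the table
      -- transparently span rows still held locally and rows already moved to Azure
      ALTER TABLE dbo.OrderHistory
          ENABLE REMOTE_DATA_ARCHIVE WITH (MIGRATION_STATE = OUTBOUND);
      ```

      [As big_D notes above, queries that touch the remote rows pay the WAN round-trip; queries satisfied from local data should be unaffected.]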

  3. Hans 1 Silver badge
    Coffee/keyboard

    Can I use LIKE on in-memory tables, yet, or is that for 2020?

    1. deadlockvictim Silver badge

      In-Memory OLTP

      The LIKE operator can be used with In-Memory tables in ordinary (i.e. interop) SQL queries.

      It can't be used in the natively-compiled procedures, but then almost nothing can. Microsoft specifies what *can* be used, rather than what can't be used.

      What I would like to see in SQL Server 2016 In-Memory OLTP is:

      • the ability to create indexes on nullable columns;

      • Constraints — foreign key, check and unique;

      • Outer joins in natively-compiled procedures;

      • Alter Table — at the moment you have to drop and recreate a table to make any change (add a column, rename it, etc.);

      • cross-database queries — although linked servers work;

      • Sub-queries in natively-compiled procedures;

      This technology is only half-way here. It reminds me of using MySQL back in the 1990s — promising, not yet ready for migrations of currently used systems and maybe suitable for a new application so that the DB and application can be designed around its limitations.
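
      [Editor's note: a short sketch of the interop/native split described above. It assumes a database that already has a MEMORY_OPTIMIZED_DATA filegroup; the table and procedure names are invented.]

      ```sql
      -- A memory-optimized table (requires a MEMORY_OPTIMIZED_DATA filegroup)
      CREATE TABLE dbo.Sessions (
          SessionId INT NOT NULL
              PRIMARY KEY NONCLUSTERED HASH WITH (BUCKET_COUNT = 1024),
          UserName  NVARCHAR(100) NOT NULL
      ) WITH (MEMORY_OPTIMIZED = ON, DURABILITY = SCHEMA_AND_DATA);

      -- Interop: ordinary T-SQL against the in-memory table, LIKE included
      SELECT SessionId FROM dbo.Sessions WHERE UserName LIKE N'adm%';

      -- Natively compiled: only the documented surface area is allowed, so no LIKE
      CREATE PROCEDURE dbo.GetSessionUser @Id INT
      WITH NATIVE_COMPILATION, SCHEMABINDING, EXECUTE AS OWNER
      AS
      BEGIN ATOMIC WITH
          (TRANSACTION ISOLATION LEVEL = SNAPSHOT, LANGUAGE = N'us_english')
          SELECT UserName FROM dbo.Sessions WHERE SessionId = @Id;
      END;
      ```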

  4. Richie 1

    > Earlier this year, Microsoft acquired Revolution Analytics...

    Oracle has its own R distribution for use with the Oracle Big Data Appliance, so hopefully Microsoft wants to rip off this idea and add an R-inside-SQL-Server feature. Access to databases from R is pretty good these days (particularly using the dplyr package), but you still have to pass the data out of the database to R, which doesn't make sense for big datasets.

  5. J J Carter Silver badge
    Trollface

    select * from Users where Clue > 0

    ** 0 rows returned **
