The announcement of Microsoft's SQL Server Data Services was almost buried near the end of chief software architect Ray Ozzie's Mix 08 keynote last week. It was a curious move for an offering that, as Microsoft's database service for the internet, marks the company's foray into database-based utility computing and promises an …
Microsoft: "SQL Server not suitable for large datacentres"
When I see:
"Microsoft technical fellow Dave Campbell told Reg Dev: 'We are not using an off-the-shelf SQL Server to power this, we've taken the technology and shifted it around to make it more suitable for large-scale datacenter deployment.'"
...what I read is: "SQL Server is not suitable for large-scale datacentre deployment".
A property bag of name/value pairs
"A container is a collection of entities and is what you search within, analogous to a single database in a traditional model. An entity is a property bag of name/value pairs, where each item may have the same or different properties.
...every item in SSDS has a few fixed properties: an identity that is a globally unique identifier, a "kind" that identifies the category of the entity, and a version datestamp used for concurrency checking."
Sounds remarkably like a Notes database. Very useful for a lot of purposes, not new but there we go... Maybe good quality replication and offline edit access will result.
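The data model quoted above — a container of entities, each a property bag with a fixed GUID identity, a "kind", and a version stamp — can be sketched roughly like this. A minimal Python sketch under stated assumptions: the `Entity`/`Container` names and the `query` method are illustrative, not the actual SSDS API.

```python
import uuid
from dataclasses import dataclass, field
from typing import Any

@dataclass
class Entity:
    """A property bag of name/value pairs, plus the fixed properties
    every SSDS item is said to carry: identity, kind, and version."""
    kind: str                                    # category of the entity
    properties: dict[str, Any] = field(default_factory=dict)
    id: str = field(default_factory=lambda: str(uuid.uuid4()))  # globally unique identifier
    version: int = 0                             # stands in for the version datestamp

class Container:
    """A collection of entities -- the unit you search within,
    analogous to a single database in the traditional model."""
    def __init__(self) -> None:
        self.entities: dict[str, Entity] = {}

    def insert(self, entity: Entity) -> None:
        self.entities[entity.id] = entity

    def query(self, **criteria: Any) -> list[Entity]:
        # Entities of the same kind may carry the same or different
        # properties, so we match only on the names the caller asks for.
        return [e for e in self.entities.values()
                if all(e.properties.get(k) == v for k, v in criteria.items())]

c = Container()
c.insert(Entity(kind="book", properties={"title": "Dune", "pages": 412}))
c.insert(Entity(kind="book", properties={"title": "Ubik"}))  # different property set, same kind
```

The schemaless shape is indeed what makes it feel Notes-like: two entities of the same kind need not share a single property.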
‘Microsoft reveals its database for the cloud’
you beat me to it...
To clarify a couple of points...
I just want to clarify a couple of points in the comments. Firstly, there are a number of very large Internet properties that are powered by SQL Server including Microsoft's own Live properties. I believe there are > 15,000 SQL Server instances across all of our Internet properties. There are other big names you'd recognize as well. The point I was trying to make is that when you're designing for a data center containing 10^5 or 10^6 servers you do things differently than when you're designing for a typical enterprise deployment. Done right, you can drop both your capital and operational costs significantly.
Secondly, on the data model. We're starting with a pretty simple universal model to get the service up and running. Obviously this will evolve over time as we have the capabilities available in the back end to offer very rich schema and query support. What we're finding is that there are a pretty large number of interesting scenarios you can build out with just what we've announced thus far. We're looking forward to the feedback to figure out what to add next.
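One scenario the simple universal model already supports is optimistic concurrency via the fixed version stamp on every item. A minimal sketch, assuming a compare-on-write check; the `Store` name and `put`/`get` signatures are illustrative, not the real SSDS service interface, and an integer counter stands in for the version datestamp:

```python
class ConcurrencyError(Exception):
    """Raised when an item changed between read and write."""

class Store:
    def __init__(self) -> None:
        self._items: dict[str, tuple[int, dict]] = {}  # id -> (version, properties)
        self._clock = 0  # monotonic counter standing in for a datestamp

    def get(self, item_id: str) -> tuple[int, dict]:
        return self._items[item_id]

    def put(self, item_id: str, properties: dict, expected_version=None) -> int:
        current = self._items.get(item_id)
        # Optimistic check: refuse the write if the stored version
        # no longer matches the version the caller read.
        if current is not None and expected_version != current[0]:
            raise ConcurrencyError("item changed since it was read")
        self._clock += 1
        self._items[item_id] = (self._clock, properties)
        return self._clock
```

A stale writer then fails cleanly: read at version v, let someone else write, and a second write that still presents v raises `ConcurrencyError` instead of silently clobbering the newer data.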