This is awesome. OK, it's a simple idea, but a good, open library to do this is long overdue. Anything that removes the excuse for a permanent cache is very welcome.
Here's hoping for a Go port (insert Java memory management snipe here).
Just as the National Security Agency in 2005 came to the conclusion that it would be easier to store everything, Netflix has decided to store all of its content metadata with its customers rather than serving data from a central repository and caching frequently accessed data at the network edge. The streaming media service on …
Netflix is now offering Downloadable content on some platforms.
For those with slower Internet, a possible road to partial salvation. Download a 400MB movie overnight for viewing the next evening. Repeat. Welcome to civilization, with your 1.8 Mbps link.
Plus a huge opportunity for backseat screens to be served from a wifi-refilled Netflix gadget in the car.
True, but of course it is only for select titles, select formats and, of course, select hardware. As someone who has been paying for a Netflix account for 11 years, there are *better* ways to do what you describe that just make more sense.
Thanks for this reminder though, as it appears Netflix is going back to how it was around 2005, almost identically (even down to the local cache). Scratch that, literally going back to the same setup... that's strange in today's world.
Xbox 360 owners might remember how all those early 720p files were ripped off the FATX partition and uploaded to the net (local file, local cache, local license).
A bunch of hoopla for "It's going on your back" or "We're getting cheap" or "Hello 1998.xml". I'm thinking Netflix wants YOU to do the meta-tagging for them, just as Google had you "contribute" to Maps. That, or they are just getting cheaper and charging more (most likely).
Anyway, I'd vote for .xml, given there is already a DTD to validate against. As an added bonus, there are _MANY_MANY_ tag systems besides this "Hollow" that have been using .xml for metadata for nearly 20 years, so that's a serious kick start.
Lastly, has Netflix really never seen the <meta type="Title">SCAM!</meta> tag before? This sounds like a gimmick to create a standard by breaking all standards to do so... in the name of profit.
What this sounds like to me is a binary storage mechanism with ease of read-only access into it.
Think MongoDB's BSON, or maybe a pre-computed protocol-buffers-style data structure for the metadata.
XML was the problem - large, bulky datasets that had to be validated and all that crap. Here we are serving known-good, read-only data. Simplify and compact to death, and provide an easy shim layer to read the data directly. Maybe just store it ready to put onto the network without transform - literally just arraycopy the required records out of the data structure into the datagram packet :p
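To make the "arraycopy straight into the packet" idea concrete, here's a minimal Java sketch of fixed-length records in a flat byte array, served with zero parsing. The record layout, field names and sizes are made up for illustration; this is not Netflix's actual format or Hollow's API.

```java
import java.nio.ByteBuffer;
import java.nio.charset.StandardCharsets;

public class FixedRecords {
    // Hypothetical layout: 4-byte id + 12-byte ASCII title, zero-padded.
    static final int RECORD_SIZE = 16;

    // Write one record into its fixed-length slot in the backing array.
    static void put(byte[] store, int slot, int id, String title) {
        ByteBuffer buf = ByteBuffer.wrap(store, slot * RECORD_SIZE, RECORD_SIZE);
        buf.putInt(id);
        byte[] t = title.getBytes(StandardCharsets.US_ASCII);
        buf.put(t, 0, Math.min(t.length, RECORD_SIZE - 4));
    }

    // "Serve" a record: no validation, no transform -- just arraycopy
    // the fixed-length slice into an outgoing packet buffer.
    static byte[] serve(byte[] store, int slot) {
        byte[] packet = new byte[RECORD_SIZE];
        System.arraycopy(store, slot * RECORD_SIZE, packet, 0, RECORD_SIZE);
        return packet;
    }

    public static void main(String[] args) {
        byte[] store = new byte[4 * RECORD_SIZE]; // room for 4 records
        put(store, 0, 1, "Metropolis");
        put(store, 1, 2, "Nosferatu");
        byte[] packet = serve(store, 1);
        System.out.println(ByteBuffer.wrap(packet).getInt()); // prints 2
    }
}
```

Because every record is the same size, a lookup is just `slot * RECORD_SIZE` - no index, no schema validation on the read path.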
Welcome to 1986? Or maybe even earlier, on larger systems? It looks like most of the disruptive innovation today is reinventing the wheel. And rediscovering good programming practices the Java/Web developer never learnt.
That's pretty much what I was thinking. I saw "compact, fixed-length encoding" and my bullshit-o-meter hit a 9.0 and I thought "You mean a big in-memory array, you twat!".
I'm a grey-beard and reserve the right to be a miserable twat in an office full of 20 something graduates that don't know what machine code is.
User --- network --- caching client --- netflix edge network --- internal services --- Database
Netflix has moved the problem from an overloaded database to a scalable (just deploy more) read-only caching 'client' that sits somewhere the story doesn't make clear. Could be in Netflix's network, could be in the cloud (Amazon, etc.), could be Akamai, could be colocated with ISPs...
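The shape of that read-only caching client can be sketched in a few lines of Java: readers only ever see a complete, immutable snapshot of the metadata, and a refresh swaps in a whole new one. Class and method names here are hypothetical, assumed for illustration - this is not Hollow's actual API.

```java
import java.util.Map;
import java.util.concurrent.atomic.AtomicReference;

public class SnapshotCache {
    // Readers only ever see a complete, immutable snapshot.
    private final AtomicReference<Map<Integer, String>> snapshot =
            new AtomicReference<>(Map.of());

    // Reads are lock-free: an atomic load plus a map lookup.
    public String lookup(int id) {
        return snapshot.get().get(id);
    }

    // The refresh path (e.g. a periodic pull from the origin) swaps
    // the whole snapshot atomically; in-flight readers keep the old one.
    public void refresh(Map<Integer, String> next) {
        snapshot.set(Map.copyOf(next));
    }

    public static void main(String[] args) {
        SnapshotCache cache = new SnapshotCache();
        cache.refresh(Map.of(1, "Metropolis", 2, "Nosferatu"));
        System.out.println(cache.lookup(2)); // prints Nosferatu
    }
}
```

The point is that "just deploy more" works precisely because the clients never write: there's nothing to coordinate except the occasional snapshot swap.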
The companies most likely to do this are those that can't afford a proper distribution platform and/or are looking for any way to cut resource usage on their end. One of many such examples is in the game world, especially on Steam, where companies install Windows services or a launcher that torrents updates. Sometimes these continue to run when the game isn't active. So the developer saves money, and the consumer gets left with a slower internet connection.
Is Netflix doing it for this reason? Maybe. They're being pretty evasive about the size of the database, and this could just be a trial balloon to see if anyone notices before flipping the switch on some darknet. That's usually how deployments like this go. Or it could be just what they say -- a way to speed up performance. I'm skeptical because they're keeping a tight lip on how it will be implemented. Will the data stream be compressed? Daily delta updates for new content? How will the cache be synced? There's no technical detail, not even which platforms it is deploying to, which, considering this is a *technical* update, I'd expect more of. The lack of technical data suggests to me that management is in the loop -- and is hiding something.
What, is anyone's guess.
Biting the hand that feeds IT © 1998–2019