Re: Miserable much?
Nice tail. The purple suits you.
You lot are miserable; we're rather enjoying this series. Some good dialogue, blink-and-you'll-miss-it references, Capaldi no longer worrying about whether he's allowed to be Doctor Who, and at last stories long enough to catch breath.
Sure, it's not as good as a classic film with a million times the budget (let's just ignore how bad the rest of the Alien franchise got, and the dozens of other films of the time that tried and failed to go there), but it plays with some nice ideas and still brings the occasional surprise. Maybe I'm just not cynical/smug/self-referential enough to see through this dumb kiddies' programme, but on a Saturday evening, it goes down rather well.
You can... usually made very cheaply in China, which means an increasingly unreliable starter, a questionable safety brake, poor ergonomics (which actually matters when you're holding a heavy yet efficient cutting tool) and fragile materials.
The question is always: spend three times the price on one that should last three times as long, or go cheap? I went cheap... and at this point the punchline should be that I'm typing this with my one remaining hand. However, I do need to buy a new chainsaw.
..is that those of us with the technical ability to create a decent X-as-a-service app/website/whatever, and the self-awareness and maturity to recognise the difference between a good idea and a very silly one, are then faced with the challenge of getting press attention for our projects.
Yet a train-wreck like this can sail through everyone's sanity screens and land slap in the middle of a few international publications as a funded 'ongoing' (for the moment) business.
I appreciate that the ability to market an idea is a separate skill from the abilities to conceive or implement it, but it still grates.
I believe you can trace the White Heat policies through to the Cambridge Phenomenon, which has begotten a number of billion-dollar companies. Not that I disagree with the general conclusions of the piece, but that indirect investment in centres of excellence... oh, hold on, those are universities aren't they?
"Does the HL7 standard class as a "rich and immutable API"?"
.. I think it classes as 'just enough rope'.
I suspect you have misinterpreted the problems as being technical rather than political.
Certainly you could envisage a centralised core service that provides data store for patient information in an efficient and timely manner. You could imagine APIs that provide access, and means to extend the data and access methods as the use cases build up. I would imagine even the greenest developer could draw that on a whiteboard somewhere.
In practice, though, you discover that department A and department B want two different features delivered first, that department C wants to use a legacy system until they have the budget to replace it, and department D will not sign off on anything until you can guarantee the system is a complete replacement for their entire patient record system. Not only that, but you must prove the entire project meets various bits of legislation around confidentiality and accuracy, but no-one knows which bits of legislation, or can understand the legalese that they're written in. There are mutterings that new legislation is coming along shortly that will change things anyway. Worse still, that suits all the departments who have been relying on the vagueness of paper-based records to 'work around' legal requirements. They will, of course, expect your system to both meet the legal requirements and bend them to fit the existing processes at the same time.
Whilst all of this has been going on, some clever soul signs a three million pound contract to replace all the blood pressure monitors in the hospital with something that is completely incompatible with your data collection hardware. Despite the fact that another company provides a similar device that would work, you cannot amend the order as the manager in question doesn't want to lose face. In the meantime, the heart rate monitors you had planned to incorporate will not be available for another six months, and the current equipment won't support the minimum requirements laid down by someone you've never heard of and cannot contact.
On top of all this, it turns out the provider of the tablets cannot support your APIs as they are, and due to restrictions on hardware procurement, you cannot simply go for a bog standard Android device and write your own client. You therefore have to write an entire translation layer that turns your realtime data collection into a series of web forms accessed through a third party database via SOAP calls. The project only takes a few weeks, but it's long enough for the company to go bust and the procurement process has to be restarted. When it is, the new requirement will be that the tablets must support the old manufacturer's obscure SOAP calls, preventing you from being able to ditch the unnecessary translation layer, and doubling the cost as the new manufacturer charges you to re-engineer their APIs.
Finally, in every meeting with the end users, you discuss workflows and current practices regarding data collection. In every meeting, they think of three new use cases, and outright contradict two of their previous descriptions. Like a good techie, you decide to make it all data driven so things can change in future, and you are told to embark on an expensive side-project to provide access and control over this meta-data so that the various stakeholders can feel in control. After six months of development it emerges that none of the stakeholders actually understands a thing about this project and that you will be solely responsible for collecting and maintaining the metadata anyway. And the users still can't actually agree what their workflow is.
Hell is other people.
..that the reason for the layout of the drives next to the screen was that Amstrad found a large batch of cases manufactured for normal televisions being sold off cheaply when another manufacturer cancelled their order. So Alan Sugar bought them up for pennies and then had the machine designed to fit.
As with any of the many Alan Sugar legends, I've no idea quite how true it is though.
You went to the lengths of hiring a real meerkat, but you're going to CGI the laptop in? I'm RADA trained, darling, of course I can type!
.. my experience suggests that someone has been rather over zealous with the claims here.
The issue here is that most data collection simply doesn't have enough information to draw deep conclusions about end users, beyond simple 'people who like X also like Y' relationships. Most companies' views of users are restricted to browser sessions and occasional logins, and most interactions are of the form 'looked at X, bought Y'. You don't know who is actually behind the keyboard at any given time, and inferring the reasons for their choices has to be based on extremely limited information.
Hence I visit Amazon regularly, buying items for my niece and nephews' birthdays and occasional things for my own wife and kids. In a recent list of 'recommended for you' I had a crochet kit and a hand axe beside each other - both utterly irrelevant, reflecting neither the actual purpose of my visit that day nor of any visit in the following year. Worse still, that's for a site that I visit (depressingly) regularly. Most sites suffer from customer loyalty that barely registers on the chart, meaning predictions have to be based on little more than the time of day and the location you logged in from.
Now undoubtedly you can improve the accuracy and timeliness of recommendations (the base level being random guesses from your marketing department), but the vision of precognition and overthrowing governments is far from the truth.
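To illustrate the 'people who like X also like Y' point: at its simplest, that signal is just co-occurrence counting over purchase sessions, which is why one odd basket (a niece's birthday) can skew what gets recommended. A minimal sketch, with entirely invented baskets, not any real recommender:

```python
# Toy 'also bought' recommender: count how often item pairs appear
# in the same session, then rank co-occurring items. The sessions
# here are made up for the example.
from collections import Counter
from itertools import combinations

sessions = [
    {"crochet kit", "birthday card"},  # gift run for a niece
    {"hand axe", "firelighters"},      # one-off garden job
    {"crochet kit", "wool"},
    {"hand axe", "work gloves"},
]

pairs = Counter()
for basket in sessions:
    for a, b in combinations(sorted(basket), 2):
        pairs[(a, b)] += 1

def also_bought(item):
    """Items most often seen alongside `item`, best first."""
    related = Counter()
    for (a, b), n in pairs.items():
        if a == item:
            related[b] += n
        elif b == item:
            related[a] += n
    return [x for x, _ in related.most_common()]

print(also_bought("crochet kit"))  # → ['birthday card', 'wool']
```

Note the failure mode: the one-off birthday basket carries exactly as much weight as a genuine preference, which is how you end up with a crochet kit next to a hand axe.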
When it's being used for deciding what drinks to have in the office kitchen, your paranoia is perhaps unjustified. No-one is suggesting we ditch our current system of corrupt politicians, lobby groups and uninformed masses for the moment.
Because a lot of people have been trained to think that these things aren't worth anything, so why pay for a game?
Because the traditional games developers have found it very hard to monetise social and mobile platforms, so they don't develop decent games for them.
Because the vast majority of gamers are casual players who don't want to invest in a 'real' game.
Because Kate Upton.
Because sadly we're all a bunch of slightly evolved apes and we don't place much value on quality entertainment over a quick fix.
I heard it took a little longer... maybe a minute and a huff.
Just for a short while, my app has more regular users than the total download of the Reg's offering.
Pity I can't get any decent money from it.
We seem to be importing US-style prurient outrage wholesale.
Still, it's so much easier to see the world in black and white terms than to actually talk about education, understanding, support, diversity and the human condition.
El Reg regularly reports on research, and regularly research goes nowhere, or takes decades to reach the consumer. Much of the excited research announcements turn out to be impossible to turn into a product that can be manufactured reliably, at scale and at a sane cost (see all of the articles on new battery technologies over the last 15 years).
So, the companies that make the money aren't just taking ideas and pressing the magic 'sell one of these' button, but putting in immense product development effort to deliver them to end users.
... so where's the picture in the article?
Whilst the step to fully autonomous everything is clearly a step too far for many to contemplate, it makes sense that all the small improvements, from vehicle-aware cruise control to self-parking, are steadily changing our relationship with cars. You can understand that we're not just going to wake up one morning and hand the keys over to our smartphone, but the little conveniences are going to accumulate until we only need to take control of the car for the interesting bits.
Keen drivers might baulk at this, but we have to recognise that most car journeys are dull - commuting, school runs and supermarket trips. Who hasn't set off on an unusual journey and automatically turned left to go to work out of sheer habit? Between Uber, car sharing and cars that can deliver themselves, these boring journeys are ripe for handing over to the machines.
Now, the important question is - how much of this work is being done in the UK? We do some class-leading engineering in the automotive sector, but vehicle IT is a new niche and we're historically more interested in the greasy bits than the wiring. The integrated solutions that are going to be needed will likely have to be developed holistically, so there's a real possibility that we could get locked out of the industry. I'm sure there will be universities looking into this, but are there manufacturers out there ready to move this from theory to practice? It sounds like a great area to be involved in.
Whilst the government bothered to engage with IPSE before the election, it has since delivered a budget that staunchly ignores every point they raised. The biggest disappointment is that the new proposals are unnecessarily complex.
It's taken a long time for many advisors to figure out what the changes are likely to mean, and even then we're left waiting for more clarity. How the various tax bands interact, what allowances will and won't be available and how contractors can efficiently deliver their services has become a mire of paperwork and fag-packet maths. This is not the sign of an efficient taxation system, but of one driven by political positioning.
Where I'm working, there is a serious shortage of skilled workers able to move company IT on to the new platforms and tools. On my team of 12 people, just two are British citizens and the company struggles to find people to expand the team further. This end of the workforce needs highly mobile, specialist workers who take the risks on behalf of the larger companies that bring in their skills and experience. However, the government treats the sector as being indistinguishable from day rate brick layers and offers us similar levels of support - i.e. none whatsoever.
Differences in attitude towards entrepreneurial and small businesses here and across the pond are highlighted by the startup and high tech scenes, but it appears that we're too busy counting the pennies to take lessons from more supportive regimes.
The trap this immediately falls into is assuming that being open source is some measure of the saintliness of a project or company. You may as well check whether the owners have made charitable donations or take in sick animals (no, not developers).
The intent behind open sourcing a project, and its actual end effect, don't sit on some single continuum between 'evil and cynical' and 'advancing the cause of mankind' any more than the software itself is purely saintly or nasty.
When you engage with any project, open source or otherwise, the question has to be whether doing so will meet your goals - and you must recognise that your goals and the owner's goals may be many and varied and wildly different. Not good or bad, just more or less aligned. A single corporate committer may be quite acceptable if you simply need their current release to perform a task in a nice stable environment. Equally, a project may be of no use if its large and active community wish to introduce breaking changes or pursue new developments that don't sit with your specific use case. Some projects open source a component so small that it's useless in isolation - like a new type of bolt for building an oil rig. Others open source the entire world, safe in the knowledge that the chances of you being able to replicate a functional environment are nil - like being given the plans for a steam train with a note saying that building the rail network is a task left up to the developer. None of these are necessarily signs of good or bad intent - just different ways projects may be run.
The only consistent warning sign I've come across is the project whose owner is enthusiastic to point out that it is good simply because it's open source. Suddenly I get the strong whiff of snake oil.
As our boy (age 10) was pretty invested in the Wii, we eventually upgraded to the Wii U. The GamePad is a boon - he plays on the large screen, then carries on playing on the pad when someone else wants to watch TV. The production values remain as high as ever (he's just rattled through Yoshi's Woolly World, which is beautifully realised).
The problem is that the Wii U is not seen as a success - it doesn't get talked about the way the PS4 and XBOne, or even tablets do. That's perhaps the effect of Nintendo's attitude to third party developers finally biting them on the backside. The massive number of throwaway games in browser or on tablets and phones only serve to highlight how few third parties develop for Nintendo. Whilst the quality benefits, it means there are only a handful of cheerleaders for the platform. Rovio and the like benefit from network effects that mean in certain circles you only hear about their preferred channels. Our son knows all about every variation on Angry Birds etc. and that keeps him coming back to the tablet for yet more throwaway games as much as he will spend hours on a single game on the Wii U. In our household, the two platforms have parity, but only because we cared to find out about Wii U titles.
On top of that, the positioning leaves people confused - it's neither cheap nor powerful. What are you actually getting for your money besides access to well-known Nintendo IP? The other consoles still sell on the idea of 'new stuff' (whether that's genuinely the case or not), whereas Nintendo has fallen into 'same old stuff, but prettier'. The benefits of the GamePad are not obvious when you see it in the store, and they've failed to capitalise on the extra screen where every other platform has embraced 'two-screen media'. Every game and TV show has a back channel these days, yet Nintendo fail to provide that on their own platform that's designed from the outset to have a second screen.
Concrete sets through chemical reaction, not drying. It'll happily set under water.
I presume there are patents at battle here...
One line of C code. That's formatted to look like a dog p***ing on a Christmas Tree and with every other word rhyming.
The number of new build homes is a tiny fraction of the existing housing stock. At the current rate of building, we'd replace all the existing homes in around 160 years. Not only that, but it's the existing (inefficient) homes that most need accurate metering if you accept the argument that accurate metering reduces consumption.
As it is, it's a poorly thought out project, with poor technical specification and even poorer oversight.
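The 160-year figure above is easy to sanity-check with back-of-envelope numbers (the housing stock and build rate below are rough assumptions for illustration, not official statistics):

```python
# Back-of-envelope check: years to replace the existing housing
# stock at the current build rate. Both figures are assumed round
# numbers, not official data.
existing_homes = 28_000_000    # assumed UK housing stock
new_builds_per_year = 175_000  # assumed annual completions

years_to_replace = existing_homes / new_builds_per_year
print(round(years_to_replace))  # → 160
```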
John thought hard about what he was going to do.. the most outrageous, dangerous and yet romantic action he had ever dared take in his life. He would..
(cont. on page 2.)
Ha, If you're so clever, YOU tell us what colour it should be.
..the Twittersphere lurches into action, ready to take major offence and vilify anyone unfortunate enough not to have pre-written and proofread every statement they make in public.
Yes, it was not very smart, or sensitive, or appropriate. However I doubt the sentiment behind it quite justifies the strength of the response.
Very funny if deliberate. Very embarrassing if not.
I thought that dearth of right wing humour was because they were all out getting jobs and didn't have time to hang around student bars trying to get into the knickers of the cute girl on the Revue committee?
There really is a humour problem for the posters who're trying to prove how sophisticated they are by taking apart comedy on the Register Forums. Take the hint from the original review - you can describe why you find something funny for the benefit of others who find the same things funny. You can't make something 'not funny' because you're simply excluding yourself from the group.
Maybe I'm optimistic, but those posters complaining on the grounds that this won't run their web service any faster (if at all, with its new OS) have missed the point.
In the data crunching corner of the world, most of the innovation is around describing 'non traditional' processing tasks and then mapping those (very painfully) on to traditional hardware. Everyone will tell you that adding another off the shelf node to a compute cluster is cheap and you can expand to build a cluster capable of handling the large loads that big data, inference and graph compute problems throw up. The problem is that big clusters do not scale linearly when it comes to reliability, and network and disk effects mean that at least half of a cluster's energy goes into overcoming the dead weight of having compute resources that don't match the task. We don't actually want a vaster, more manageable cluster of Linux boxes, we want a compute resource matched to the process description.
Soooo.. as Mars shots go, this could make some sense. We're starting to describe processing in terms of directed graphs of actions, which can be mapped to both batch and real time work loads. An architecture that starts with the premise of many actors consuming a vast store of messages in a robust and scalable way would potentially outperform today's clusters by orders of magnitude. Given the cost of provisioning and maintaining a modern cluster, the exotic nature of the Machine may be a small price to pay.
I'm reminded of the early introduction of NUMA machines, which suddenly introduced capabilities that allowed tasks that used to be done by a building full of mainframes to be done by a box that sat under your desk. This architecture could potentially do the same to clusters - and not by virtualising thousands of machines into one box.
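The 'directed graphs of actions' idea mentioned above can be sketched in a few lines: nodes are processing steps, edges are data dependencies, and the same abstract description could in principle be mapped onto batch or streaming back-ends. A minimal illustration (node names and data are invented):

```python
# Toy dataflow graph: each node is a function, the graph maps each
# node to the set of nodes it depends on, and execution just walks
# the graph in topological order, feeding results forward.
from graphlib import TopologicalSorter

def ingest():
    return [1, 2, 3, 4]          # stand-in for a real data source

def double(xs):
    return [x * 2 for x in xs]   # stand-in transform

def total(xs):
    return sum(xs)               # stand-in aggregation

graph = {"ingest": set(), "double": {"ingest"}, "total": {"double"}}
funcs = {"ingest": ingest, "double": double, "total": total}

results = {}
for node in TopologicalSorter(graph).static_order():
    deps = [results[d] for d in sorted(graph[node])]
    results[node] = funcs[node](*deps)

print(results["total"])  # → 20
```

The point is that nothing in the graph description says *where* each step runs - which is exactly the freedom an architecture like the Machine could exploit.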
I waited for the comments to come in after this article arrived. On my score card I got points for "language is verbose", "Desktop sucks" and "Java security is terrible". I'm quite surprised not to have got "It runs too slow" and "the VM is too big".
Despite all the revisionist history, Java got in there because it solved a whole bunch of problems that no other platform quite managed. For a start, outside of Delphi, it was just about the only 'platform' that you could get hold of. Sure, you could mix up a nightmare brew of C, X Windows, SQL libraries and so on, but that was so painfully vendor-dependent that each and every project required a fresh start. In contrast, you could knock up a Java app with a GUI that talked to the network, database and file system of your choice and not have to re-build it every time you needed to run it on a different machine or for a different client. The browser integration promised (but never delivered) that those apps would eventually be delivered automagically by the internet, but even without it, software houses could deliver tools and demos and apps far more easily than they had before.
Things like the standard runtime libraries, easy dependency injection, automatic documentation generation and a very consistent approach to API behaviour meant that picking up Java to do a job was an order of magnitude easier than the equivalent in just about any other language at the time. That's nothing to do with CS students or managers' preference - it was a handy tool that also happened to scratch the itch that most developers have to learn something new (which has since benefited Ruby, Python, Scala et al.).
The Java guys have never really got UI development, and that pretty much killed off browser integration and desktop apps, despite repeated attempts like JavaFX. It's funny, therefore, to see Android use Java for a UI-heavy platform. (It's also probably worth mentioning that Minecraft seems to have done fairly well for a little game written in Java.) Meanwhile, web development and now big data have absolutely thrived on the ability of Java to evolve into new areas.
Whilst Scala has been an interesting diversion, the small size and relatively limited resources of the development team have been a constraining factor - resulting in painfully slow tools and weirdly inconsistent core libraries. With Java 8 making useful inroads into functional styles, and the solidity of things like the collection classes, I'm not sure that the 'other JVM language' will ever get out of its niche.
So - thank you Java. I still remember evaluating it for an early project (and having to wait for it to be delivered by post on a CD) and twenty years later, it's still relevant to my clients and still evolving into new areas. It's far from perfect, but it does a lot of jobs pretty well. Long may it continue.
I use Windows, Linux, Android, OSX and a host of other operating systems in the course of my work - I'm fairly platform agnostic, but avoid the Apple consumer products as I'm not keen on the lock-in. That said, I've been using an MBP for two years and it's a fantastic developer's machine - powerful, fast and mechanically robust. It gets dropped in a backpack at the end of each day (no sleeve) and bounced around on the commute and (touch wood) remains in perfect condition. Compare this to any Windows laptop and it's head and shoulders ahead in terms of longevity and utility.
My FIL was developing home automation, switches and sensors back in the '80s - you could switch on your kettle from across the Atlantic, answer your door with a remote control and all of the 'not quite useful' concepts that the IoT brigade are currently getting over-excited about.
The point has to be that we build our environment to suit the simple needs of a simple bag of flesh - simple light switches, the front door not so far away from the rest of the house, the heating more or less able to keep the temperature within tolerable bounds and so on. Sure, you can impose multi-zone ultra-precise control and feedback on all that, but (a) we don't naturally think that way and (b) we don't actually care. Do I want complex lighting schemes controllable from a smart-phone? No - I switch the light on when I walk into the room, and off when I walk out. A bit brighter, or a bit dimmer occasionally, but none of that justifies spending several hundred pounds to make a job that is already pretty convenient more complex and more prone to problems.
I don't doubt that someone will come up with something interesting eventually, but unless the device makes a significant difference to the way you live your life, there's no reason to spend so much effort and money when there are simple alternatives in the depressingly analogue world.
Java's actually very good for this sort of stack - portable, robust and flexible enough to embrace the steady shift of development frameworks as we better understand the problems we're trying to resolve. It's far from inefficient and the tooling around it is excellent. If you're still comparing your iPhone sized database with the sort of thing we're doing with Hadoop, you need to re-read the part of the manual that explains how not all databases are the same. The system I'm currently working on consumes about a terabyte of data a day and retains that indefinitely. Not only that, but it delivers value - we're talking eight figure sums annually here for a single use case, but that is a long, long way from the sort of thing you'd achieve with SQLite.
What is a problem is wrangling a cluster of a few hundred machines and providing production-level SLAs on processes running on that cluster. The tooling is catching up, but we're working from the ground up on technology that is still desperately immature. There are a LOT of moving parts in a typical deployment, and whilst Hortonworks, Cloudera and the others are getting better at bringing a working system up, there is much to be done to ensure BAU services really are business as usual, not a series of experiments. It doesn't help that there are dozens of different frameworks and approaches to implementing a given solution - nearly everyone I talk to has found a new combination of tools to use, and that's preventing companies from focussing their efforts on making one particular toolset feature-complete and robust.
So when it turns out we have accidentally declared war on the peace loving Mercurians, we'll know who to blame.
And yet Microsoft, Apple, Google, and the Linux community all survive with automatic patching and no-one has managed to hold the entire world to ransom.
That's not to say it can't happen, and of course updates have caused problems in the past. However, I think most companies understand their exposure.
When it comes to terrorist plots, stopping middle class managers from getting to their meetings on time is hardly going to have the impact they'd want.
The only reason I can conceive for not using a general purpose device like the Pi is if you can make it significantly cheaper. Given the low cost of the Pi, I'd be interested to see the specs of the MicroBit.
Do I want the dominant nation (America) to completely dominate the media I watch? No.
Do I want advertising and (expensive) subscriptions to be the only ways I pay for my media? No. (And whichever way you cut it, the monthly charges for Sky, Netflix etc. are easily as high as the BBC's when you consider the range of content it provides.)
Do I think the BBC has innovated in the digital space? Absolutely. Whilst the US argued over content rights for streaming media, and fought patent battles, the BBC delivered iPlayer and made swathes of quality current and historical content available.
Do I use the BBC news, weather and other sites on a regular basis? Yes.
Do I listen to Radio Three, or Six? No - but I'm glad they're there to support artists who'd otherwise have to fight angry tigers on "I Have X-factor Get Me Out Of Here" in order to be heard.
Do I think the BBC is biased? Probably, and it varies across channels and programmes, but so is every other media outlet.
Should the BBC be funded to continue to punch way above its weight on the global stage? Absolutely, though how, I have no idea.
I wouldn't suggest YT just puts up a paywall - and of course most visitors cannot or will not ever pay.
At the same time, whilst you say you won't put your credit card into your son's play account, you probably did to let him play Minecraft. If you could 'charge up' his YT account with a one-off payment of, say £2 and that would then let him watch 2000 videos free of advertising, would you find that so onerous?
YT already distinguishes between monetised and non-monetised videos, it wouldn't be a stretch to identify 'premium' video channels that require a subscription, or to specify as a content producer that you will accept a specific combination of free/paid/advert-laden views.
None of this stops YouTube from carrying on exactly as it is, but would open up a more concrete revenue stream for people who don't want to only watch/produce hilarious cat videos.
Zoella and a few others are exceptions rather than the rule. Given the vast number of videos and channels that get posted to YT, a small set of outliers getting large amounts of attention (and therefore money) is to be expected.
However, if you only get a 'normal' amount of attention (say, the sort of viewing figures that many BBC programmes get and are happy with), the revenue is dramatically smaller. Typically smaller than the production costs of even a modest 'proper' video.
The economics seem to mean that unless you're pushing out something new on a near-daily basis, and your production costs are nil (ie. you're a vlogger), you're basically not going to make money.
If Google implemented something like micropayments and paid content authors a tenth of a penny per view, the economics would change dramatically - and the skew away from a tiny handful of mega-stars might allow for better quality content. I suspect a lot of viewers would gladly pay that sort of money just to view videos with no interruptions.
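The micropayment arithmetic is worth making concrete. At a tenth of a penny per view (the view counts below are invented for illustration):

```python
# Rough illustration of the tenth-of-a-penny-per-view maths
# suggested above. View counts are invented examples.
rate_per_view = 0.001  # £0.001 = a tenth of a penny

for views in (2_000, 100_000, 1_000_000):
    print(f"{views:>9,} views -> £{views * rate_per_view:,.2f}")
```

So 2,000 views earns £2, a 'normal' hundred-thousand-view channel earns £100, and you need a million views before the money becomes interesting - small sums per viewer, but a very different distribution from today's ad-funded model.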
I think the issue is that you may indeed heat your house with a GSHP (nice big garden?), but I bet you still have lights, washing machine, dishwasher, tumble-dryer, wifi, tv and a dozen other electronic gizmos plugged in or on charge. I would bet you travel to work in a car (or rail, bus) and fly abroad at least once a year. I imagine your house is relatively new to be efficiently heated by GSHP - so the materials are likely to have high carbon cost.
The cost of moving to renewables is not just magically switching on the GSHP we all have hiding in our gardens, but building them, delivering them, ensuring the housing stock is sufficiently energy efficient to use them. On top of that, some huge percentage of the population are not in locations where GSHPs can be installed. Even if they were, they would still commute, use modern tech and fly abroad on holiday. You can airily wave away the needs of the third world countries ("let them all use solar!"), but beyond nice, very low powered (and not very bright) LED lights, the moment they start pulling themselves up to a better standard of living, they'll need massively more energy too.
Your personal heating is a gnat's testicle compared to the steaming pile of energy we all use all the time. This is from someone who has a low energy home, heats it through solar and on-site biofuels (logs!) and has looked into the options for off-grid and renewables.
Yes, but have you heard how *loud* it is... you'll hear that coming from a mile away
There's an ideal that, for a culturally healthy society, talented creators should be encouraged (and let's not kid ourselves that these are plentifully available; we've all seen X-Factor trying to find a single talent from a pool of tens of thousands). As such, I'm all for elevating those few to a point of wealth and success. Those who argue that historically musicians had to roam the country to earn their crust miss the fact that historically we lived in mud huts and the majority of the population were illiterate.
Both extremes of the argument (big music corps vs. "music should be free") are exactly that: extremes. I'm all for the current system being realigned, but to go to the other extreme and destroy the incentive to create is just foolish. Arguing that people will create anyway also misses the point: why should we restrict our culture to only those who happen to have the rare combination of talent and dumb pig-headedness needed to keep creating when they are punished for doing so?
Notably, the same argument goes for any intellectual work: we've mentally downgraded its value as the supply suddenly seems so plentiful, but in doing so we're making a strong economic case for lowest-common-denominator work rather than brilliance. If we restrict the supply (or flood the market from a single source), we lessen the likelihood that outliers crop up and produce something amazing.
This article has a certain amount of stating the obvious, and could have been written at any time over the last couple of years. How about actually comparing some of the current kit available to buy/build?
Agreed. We have a Hubsan X4 - it's cheap, it records acceptable video, it's easy to fly and it's a good introduction to all the things you have to consider when owning a drone: maintenance, repairs, batteries, planning flights and so on.
I like PC, like the one-liners and the darker approach.
What doesn't work so well for me is the harsh line between story arc and episodic format. Moffat seems to have fallen into the rhythm of one concept per episode (too often borrowed from a film), which gives little time to explore an idea before he has to leap to a quick conclusion (*cough* shooting a spaceship with a gold arrow). The heavy-handed teasers that something else is going on don't constitute a story arc so much as build to a series end that cannot possibly satisfy once you've got past the "so that's what it meant" discovery.
A little more continuity between episodes would help, as would one or two properly identifiable baddies that the Doctor can bash up against, rather than ones he simply doesn't understand before producing a rabbit from his hat. At the moment it's like watching a pinball machine: individual events are exciting, but the lack of any flow makes it a bit exhausting.
There are some companies who use big data systems solely so they can tell their clients that their services need big data systems. Equally, some use the technology because if their employees thought they were working in yet another low-end service shed, they'd go and find somewhere more CV-positive to work.
On the other hand, you can find yourself working on a project where the company's behaviour is sufficiently well characterised (MVT et al.) and the product sufficiently data-led (e.g. large-volume online sales) that you can plot a straight line between insight and action, then close the loop with feedback on sales uplift. At that point, yes, you can realise value. If you can't identify what you are going to change as a result of the vague insights you hope you might get, or if you can't measure the difference between your approaches, then big data (however you define it) is not going to pay its way.
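To make "close the loop with feedback on sales uplift" concrete, here's a minimal sketch (Python, stdlib only). The `uplift` helper and the numbers are made up for illustration: it compares conversion rates between a control group and a variant using a standard two-sample proportion z-test, which is one common way to check whether a measured uplift is more than noise.

```python
import math

def uplift(control_conv, control_n, variant_conv, variant_n):
    """Compare conversion rates between control and variant groups.

    Returns (relative uplift, z statistic). Hypothetical helper,
    for illustration only.
    """
    p_c = control_conv / control_n
    p_v = variant_conv / variant_n
    # Pooled proportion and standard error for the two-sample z-test
    p = (control_conv + variant_conv) / (control_n + variant_n)
    se = math.sqrt(p * (1 - p) * (1 / control_n + 1 / variant_n))
    z = (p_v - p_c) / se
    return (p_v - p_c) / p_c, z

# Hypothetical numbers: 200/10,000 baseline vs 260/10,000 after the change
lift, z = uplift(200, 10_000, 260, 10_000)
print(f"relative uplift: {lift:.1%}, z = {z:.2f}")
# → relative uplift: 30.0%, z = 2.83  (|z| > 1.96 ≈ significant at 5%)
```

If you can't feed numbers like these back into the decision, the "insight" never pays for the infrastructure that produced it.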
How many companies have tried to build a persistent online world that builds communities and has interest outside of the hardcore gaming community? How many companies successfully support 100m users? Clearly there is service knowledge and technical understanding that is of value to any company wanting to scale online environments. Add in a large user-base and it makes some sense.
From another perspective, this sort of success isn't something you can produce to order. Notch was lucky to hit the right combination of elements and skilled enough to respond to the early community and fine-tune the recipe. Having got there, Minecraft has successfully seen off many imitators. At the same time, it's been clear for a while that Notch himself is far less comfortable managing the expectations of a global audience, so his exit is understandable.
My understanding is that a lot of the costs for Mojang are in maintaining a global server infrastructure, so you could imagine that being brought under Microsoft's wing could result in savings and hence more easily reached profits.