So when it turns out we have accidentally declared war on the peace loving Mercurians, we'll know who to blame.
And yet Microsoft, Apple, Google, and the Linux community all survive with automatic patching and no-one has managed to hold the entire world to ransom.
That's not to say it can't happen, and of course updates have caused problems in the past. However, I think most companies understand their exposure.
When it comes to terrorist plots, stopping middle class managers from getting to their meetings on time is hardly going to have the impact they'd want.
The only reason I can conceive for not using a general purpose device like the Pi is if you can make it significantly cheaper. Given the low cost of the Pi, I'd be interested to see the specs of the MicroBit.
Do I want the dominant nation (America) to completely dominate the media I watch? No.
Do I want advertising and (expensive) subscription fees to be the only ways I pay for my media? No. (And whichever way you cut it, the monthly charges for Sky, Netflix etc. are easily as high as the BBC's when you consider the range of content it provides.)
Do I think the BBC has innovated in the digital space? Absolutely. Whilst the US argued over content rights for streaming media, and fought patent battles, the BBC delivered iPlayer and made swathes of quality current and historical content available.
Do I use the BBC news, weather and other sites on a regular basis? Yes.
Do I listen to Radio Three, or Six? No - but I'm glad they're there to support artists who'd otherwise have to fight angry tigers on "I Have X-factor Get Me Out Of Here" in order to be heard.
Do I think the BBC is biased? Probably, and it varies across channels and programmes, but so is every other media outlet.
Should the BBC be funded to continue to punch way above its weight on the global stage? Absolutely, though how I have no idea.
I wouldn't suggest YT just puts up a paywall - and of course most visitors cannot or will not ever pay.
At the same time, whilst you say you won't put your credit card into your son's play account, you probably did to let him play Minecraft. If you could 'charge up' his YT account with a one-off payment of, say £2 and that would then let him watch 2000 videos free of advertising, would you find that so onerous?
YT already distinguishes between monetised and non-monetised videos; it wouldn't be a stretch to identify 'premium' video channels that require a subscription, or to specify as a content producer that you will accept a specific combination of free/paid/advert-laden views.
None of this stops YouTube from carrying on exactly as it is, but would open up a more concrete revenue stream for people who don't want to only watch/produce hilarious cat videos.
Zoella and a few others are exceptions rather than the rule. Given the vast number of videos and channels that get posted to YT, a small set of outliers getting large amounts of attention (and therefore money) is to be expected.
However, if you only get a 'normal' amount of attention (say, the sort of viewing figures that many BBC programmes get and are happy with), the revenue is dramatically smaller. Typically smaller than the production costs of even a modest 'proper' video.
The economics seem to mean that unless you're pushing out something new on a near-daily basis, and your production costs are nil (i.e. you're a vlogger), you're basically not going to make money.
If Google implemented something like micropayments and paid content authors a tenth of a penny per view, the economics would change dramatically - and the skew away from a tiny handful of mega-stars might allow for better quality content. I suspect a lot of viewers would gladly pay that sort of money just to view videos with no interruptions.
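As a back-of-the-envelope sketch of that claim: the tenth-of-a-penny rate comes from the comment above, but the effective ad revenue per thousand views is an illustrative assumption, not a real YouTube figure.

```python
# Rough comparison of creator revenue: ads vs. a hypothetical
# micropayment of a tenth of a penny (GBP 0.001) per view.
# The ad revenue-per-1000-views (RPM) figure is an assumed
# round number for illustration only.

def ad_revenue(views, rpm_per_1000=0.30):
    """Revenue in GBP from advertising at an assumed effective RPM."""
    return views / 1000 * rpm_per_1000

def micropayment_revenue(views, per_view=0.001):
    """Revenue in GBP at a tenth of a penny per view."""
    return views * per_view

views = 500_000  # a 'normal' amount of attention for a channel
print(f"Ads:           £{ad_revenue(views):,.2f}")
print(f"Micropayments: £{micropayment_revenue(views):,.2f}")
```

Under those assumptions the same audience is worth several times more to the creator, which is the skew the comment is pointing at.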
I think the issue is that you may indeed heat your house with a GSHP (nice big garden?), but I bet you still have lights, washing machine, dishwasher, tumble-dryer, wifi, tv and a dozen other electronic gizmos plugged in or on charge. I would bet you travel to work in a car (or rail, bus) and fly abroad at least once a year. I imagine your house is relatively new to be efficiently heated by GSHP - so the materials are likely to have high carbon cost.
The cost of moving to renewables is not just magically switching on the GSHP we all have hiding in our gardens, but building them, delivering them, ensuring the housing stock is sufficiently energy efficient to use them. On top of that, some huge percentage of the population are not in locations where GSHPs can be installed. Even if they were, they would still commute, use modern tech and fly abroad on holiday. You can airily wave away the needs of the third world countries ("let them all use solar!"), but beyond nice, very low powered (and not very bright) LED lights, the moment they start pulling themselves up to a better standard of living, they'll need massively more energy too.
Your personal heating is a gnat's testicle compared to the steaming pile of energy we all use all the time. This is from someone who has a low energy home, heats it through solar and on-site biofuels (logs!) and has looked into the options for off-grid and renewables.
Yes, but have you heard how *loud* it is... you'll hear that coming from a mile away
There's some ideal that, for a culturally healthy society, the talented creators should be encouraged (and let's not kid ourselves that these are plentifully available; we've all seen X-Factor trying to find a single talent from a pool of tens of thousands). As such, I'm all for elevating those few to a point of wealth and success. Those that argue that historically musicians used to have to roam the country to earn their crust miss out the fact that historically we lived in mud huts and the majority of the population were illiterate.
Both extremes of the argument (big music corps vs. 'the music should be free') are exactly that - extremes. I'm all for the current system being realigned, but to go to the other extreme and destroy the incentive to create is just foolish. Arguing that people will create anyway also misses the point - why should we restrict our culture to only those who happen to have the rare combination of talent and dumb pig-headedness needed to create when they are punished for doing so?
Notably the same argument goes for any intellectual work - we've mentally downgraded its value as the supply suddenly seems so plentiful, but in doing so we're making a strong economic case for lowest common denominator work rather than brilliance. If we restrict the supply (or flood the market from a single source) we lessen the likelihood that outliers can crop up that produce something amazing.
This article has a certain amount of stating the obvious, and could have been written at any time over the last couple of years. How about actually comparing some of the current kit available to buy/build?
Agreed. We have a Hubsan X4 - it's cheap, it records acceptable video, it's easy to fly and it's a good introduction to all the things you have to consider when owning a drone: maintenance, repairs, batteries, planning flights and so on.
I like PC, like the one-liners and the darker approach.
What doesn't work so well for me is the harsh line between story arc and episodic format. Moffat seems to have fallen into the rhythm of one-concept per episode (too often borrowed from a film), which gives little time to explore an idea before they have to leap to a quick conclusion (*cough* shooting a spaceship with a gold arrow). The heavy handed teasers that something else is going on do not constitute a story arc, so much as build to a series end that cannot possibly satisfy once you've got past the "so that's what it meant" discovery.
A little more continuity between episodes would help, as would one or two properly identifiable baddies that the doctor can bash up against rather than simply not understand before producing a rabbit from his hat. At the moment, it's like watching a pin-ball machine: individual events are exciting, but the lack of any flow makes it a bit exhausting.
There are some companies who use big data systems solely so they can tell their clients that their services need big data systems. Equally some use the technology because if their employees thought they were working in yet another low-end service shed, they'd go and find somewhere more CV positive to work.
On the other hand, you can find yourself working on a project where the company behaviour is sufficiently well characterised (MVT et al.) and the product sufficiently data-led (e.g. large volume online sales) that you can plot a straight line between insight and action, and then close the loop with feedback on sales uplift. At that point, yes, you can realise value. If you are unable to identify what you are going to change as a result of the vague insights you hope you might get, or if you are unable to measure the difference between your approaches, then big data (however you define it) is not going to pay its way.
How many companies have tried to build a persistent online world that builds communities and has interest outside of the hardcore gaming community? How many companies successfully support 100m users? Clearly there is service knowledge and technical understanding that is of value to any company wanting to scale online environments. Add in a large user-base and it makes some sense.
From another perspective, this sort of success isn't something you can produce to order. Notch was lucky to hit the right combination of elements and skilled enough to respond to the early community to fine tune the recipe. Having got there, Minecraft has successfully seen off many imitators. At the same time it's been clear for a while that Notch himself has been far less comfortable managing the expectations of a global audience, so his exit is understandable.
My understanding is that a lot of the costs for Mojang are in maintaining a global server infrastructure, so you could imagine that being brought under Microsoft's wing could result in savings and hence more easily reached profits.
Crunch - why not Cascading or Pig? The point is, there are a lot of options in this space right now, most of which are moving/already run on the new execution engines. I'm glad you feel you can call the 'winner' on this, but from where I'm sitting we're back in the era of fighting over which text editor to use. It all generates plenty of work for the 'serious' end of the industry (yeah, and our production cluster is bigger than yours), but jumping from framework to framework to keep up with the latest trends is not ultimately productive.
It's fairly clear that we can see beyond Map/Reduce to more sophisticated distributed processing. There are quite a few contenders for the next generation platform and it looks like this is yet another.
However, I don't think we're there yet. Most options are about reducing the pain of M/R when you've got iterative jobs and more complex work flows, but a lot of arguably unnecessary pain remains. At some point I'd expect a generic way to describe such work flows to emerge and to become the de-facto standard. For now, none of the proposals are so compelling that developers are stopping coming up with proposal n+1.
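To make the "generic way to describe such work flows" idea concrete, here is a purely hypothetical sketch - not any real framework's API - of a declarative DAG of named steps that a scheduler could then map onto M/R, or any of the newer execution engines:

```python
# Hypothetical sketch of an engine-agnostic workflow description:
# a DAG of named steps with explicit dependencies. A real scheduler
# would map these onto a distributed engine; here we just run them
# locally in dependency order.

from collections import deque

class Workflow:
    def __init__(self):
        self.steps = {}  # name -> (func, dependency names)

    def step(self, name, deps=()):
        def register(func):
            self.steps[name] = (func, tuple(deps))
            return func
        return register

    def run(self):
        """Execute steps once their dependencies have completed."""
        done, pending = {}, deque(self.steps)
        while pending:
            name = pending.popleft()
            func, deps = self.steps[name]
            if all(d in done for d in deps):
                done[name] = func(*(done[d] for d in deps))
            else:
                pending.append(name)  # dependencies not ready; retry
        return done

wf = Workflow()

@wf.step("extract")
def extract():
    return [1, 2, 3, 4]

@wf.step("transform", deps=["extract"])
def transform(xs):
    return [x * x for x in xs]

@wf.step("load", deps=["transform"])
def load(xs):
    return sum(xs)

results = wf.run()
print(results["load"])  # 1 + 4 + 9 + 16 = 30
```

The point of the sketch is that the workflow is data, not engine-specific code - which is roughly what a de-facto standard description would need to look like.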
To paraphrase: "A positive result will mean we come out of this experiment knowing less than we started!"
Working out we're holograms from within the hologram is cool. When it gets interesting is when we work out a way to do something about it.
I think that's a gross misunderstanding of MS's technique - unlike digital image stabilization techniques where you move successive images around to minimise 'shake', MS are generating 3D geometry along the entire path of the movie and using it to back fill missing segments.
So rather than aligning images, they're creating a completely artificial camera path and using images and computed geometry to render that path. I'm not aware of F/OSS doing that, and maybe you need to take that chip off your shoulder?
We're desperate for good Hadoop engineers, with solid Java, web services and NoSQL experience.
Foolishly we were looking at people who'd been in the industry for a few years, when we should be interviewing 6 year olds.
If they did that in the UK, pranksters would have put it on the Eurostar by now and be partying with it in Ibiza within the week.
Our boy has just had his birthday and we got him a Wii U to replace an aging Wii. With Super Mario 3D World and Mario Kart 8, he's delighted. It's a nice system and the fact that there are few games doesn't matter to him because the few that do exist are outstanding and games he plays for years (literally, he restarted Mario Galaxy recently and played it through, missing nothing).
Nintendo have never played well with third parties (Rare were the one exception and they played a clever political game to get in there), so they've always been more dependent on their own titles. It makes the consoles look more limited compared to Playstation and XBox, but doesn't hurt the owner if they're happy with the Nintendo 'style'.
As a techie, of course I'd love a system that gives photorealistic graphics, real online environments and so on, but as a Dad I play casually and cannot see the point of investing in the Xbox or Playstation ecosystems when the vast majority of games are just cannon fodder and very expensive for the limited time I can put into them. We bought the Wii U knowing there were enough games to 'last until Christmas' and the upcoming releases look like they'll go far beyond that. I enjoy dipping into Mario and if I want something more 'sophisticated', it'll go on my PC.
We've had smart meters in our energy-efficient home for five years now. As has been pointed out, usage goes like this:
1. Install meter
2. Gasp at how much power appliance X uses
3. Get used to it and do nothing, as we need appliance X and a replacement costs hundreds
The really hungry appliances are easily matched by 'background' kit (lights, things on charge, fridge, broadband) any one of which costs a lot of money to make a relatively small cost change.
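To put rough numbers on that comparison (every wattage and the unit price below is an assumed round figure for illustration, not a measurement):

```python
# Illustrative only: wattages and unit price are assumed round
# numbers. Shows how a constant 'background' load compares with
# an occasionally used hungry appliance over a year.

UNIT_PRICE = 0.15  # assumed GBP per kWh

def annual_cost(watts, hours_per_day):
    kwh_per_year = watts / 1000 * hours_per_day * 365
    return kwh_per_year * UNIT_PRICE

# A 2.5 kW tumble-dryer run for a few hours a week...
dryer = annual_cost(watts=2500, hours_per_day=0.5)
# ...versus a modest always-on load (fridge, broadband, chargers).
background = annual_cost(watts=150, hours_per_day=24)

print(f"Dryer:      £{dryer:,.2f}/year")
print(f"Background: £{background:,.2f}/year")
```

Under those assumptions the always-on kit costs more over the year than the headline-grabbing appliance, which is why gasping at the dryer rarely changes the bill much.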
For homes where it would be possible to make a bigger saving (e.g. electric hot water), there is often a reason why the expensive option is there (no gas, family can't afford boiler replacement etc.), so a meter isn't actually going to make a difference either. Of course consumer education is a good thing and some will make the switch, but you could achieve the same thing with a TV campaign explaining how expensive it is to heat your home via different routes.
I heartily applaud an MP who has considered these issues and realises that Smart Meters are an expensive commitment in a rapidly evolving field - but the question then is: What should we be doing? I quite like the Smart home hub (e.g. OpenDCU) idea where instead of putting in a closed bit of kit, we support an adaptable smart home infrastructure and standards that allow many to use it in interesting ways. Much like France introducing Minitel it could have a much bigger effect than just measuring our energy bills.
As an app and web developer I can say that no, existing web standards are not good enough to be able to give up native code. That's not to say that a lot of standard press-and-select apps wouldn't happily exist online, but the hoops you have to jump through to provide a slick, reliable and immersive experience are quite nasty. Sometimes even apparently simple UIs require that many threads run in close synchronization in the background to ensure everything turns up exactly where and when the user expects them to.
Of course the other issue is one of revenue. If it takes me X weeks to code up a half way decent experience, how do I put it on the web and pay my bills? Ad revenue is a miserable compensation and including adverts only serves to interrupt the user's enjoyment. Paid apps at least connect the idea of some value back to the work you've put in.
On the whole though, the current ecosystem and platforms are tending towards lowest-common-denominator experiences. It's painful to develop for the web, and just not worth putting months of work into apps that have shelf-lives measured in weeks. Users bemoan paying for something, even if it provides many hours of entertainment or use, and there are no discovery mechanisms for genuinely different experiences.
Regardless, the current crop of browsers do not provide a platform for delivering meaningfully better tools and entertainment to the user.
Clearly a character, and for magazines such as Your Sinclair that informed my young life, I couldn't be more grateful. The fact that there was so much more to him makes him all the more wonderful, and all the more of a pity to see him gone.
Strangely, I find myself disagreeing. It's true you can strap a phone to your head for pennies, but I suspect it demonstrates (as per the review here) just how much virtual 3D sucks, even with modern mobile graphics hardware driving it.
The Oculus seems to be showing that there's a bunch of 'other stuff' that has to be solved to get properly immersive VR to work - display refresh rates, latency and lag, accurate position sensing, robust optics and so on. Much of that is just not worth integrating in a phone on the off-chance you'll strap it to your head, and some of it is no doubt patentable. The many failed attempts at virtual 3D over the years seem to show that 'good enough' is not good enough, because we're highly sensitive to artificial reality being not quite right.
It's not clear to me whether Oculus are even going to solve this. They're iterating over the hardware and the experience is 'getting better' with each iteration. That suggests it's not quite there yet. I don't see this becoming commodity hardware any time soon - and that lag has historically been enough for new entrants to move in and dominate a market (e.g. Sony with the Playstation, Apple with the iPhone).
Unfortunately in the race to compete with Apple, I doubt Google are going to do a thing about the clone app issue - despite the fact it's clearly easily automatable.
That a third party is identifying 'odd' Google store placements and ratings is depressing. Surely the 'giant of search and metrics' should be capable of curating their own collection? As an app developer that sees a well rated and successful app out-placed by competitors who last updated their offerings two years ago, this is very frustrating. Google make it very, very difficult to compete on quality of service and user experience.
Oh dear, name fail.
It's an old story (urban legend?) that BT found it got consistently better microphones out of the charcoal supplied by one specific charcoal burner in Cornwall. They spent a lot of time analysing the type of wood, how he burned it and how it was granulated, trying to work out why his charcoal was better than any other. They could find no difference in the materials or process, and the guy was particularly unhelpful. In the end they sent someone to spy on him to see what he was doing. It turned out he held a grudge against BT and would routinely pee on the charcoal sacks before they were sent off to the company.
My favourite sign heading north from Cambridge, sadly now lost to an idiot named Dave who is apparently big.
You must know how important it is in space to know exactly where your towel is?
To some extent, the manufacturers have brought this on themselves. Many of them have treated the smartphone market as an exercise in throwing boxes over the wall and forgetting about them - slow updates to already out of date OS installs, very flakey 'own brand' apps that are poor relations to the sort of thing you see on iPhones and modern, stock Android and so on. This doesn't have to be the end of Samsung - they already want to differentiate themselves on user experience - but it will clear out some of the low end players who damage the reputation of the platform. That's going to happen anyway as the smartphone market matures.
It might also help if they apply the same logic to the App market and discard some of the dross that still turns up when you search for something useful. The Silver brand might be a smart way to step away from that legacy shovelware without admitting that the Play Store (just like iTunes) plays host to a vast number of shoddy rip-off apps.
With Google playing to the budget market with the excellent Moto-G, I don't think the consumer is going to lose out really.
..now I've got the Tetris music going round my head.
Given that Amazon are going to be offering games, it's very relevant - OK it's not going to replace a PS4, but might give Nintendo a few sleepless nights.
It seems unfair to lambast a site with such lofty goals, but having watched it develop at a snail's pace it's hard to feel that there is much going on behind the facade. If this were a startup I'd expect to see rapid evolution and a focus on responding to its own users' requests. Instead, the site has made minor changes and there have been no significant alterations to the current model of interaction, which is gaining such poor traction.
In short, I'd expect a lot more to be done with the money, and to see iteration (or indeed pivoting) to take full advantage of the global media coverage. The impression it leaves is that not only has Cole 'conned' the Tax-payer, but the Impossible team have 'conned' her into supporting a badly managed project.
For a site that aims to encourage social interaction, it's notable how opaque the whole process is. We don't know the who, the how or why this money is being spent.
It's very noticeable that installing an app for free, or for pennies, seems to grant the installer the right to slag off the developer, berate them for not adding features and get angry (properly angry) when it doesn't behave exactly as expected. It's even worse if the installer thinks that the developer is (gasp!) making money from it.
If he's been making $50,000 a day, that is a LOT of messages from people who feel they are entitled to a triple-A experience tailored to their exact playing style.
> Kind of makes one realize how little a single user is worth.
Or, to put it another way, how little users value a good experience.
" The quinones are dissolved in water, which prevents them from catching fire..."
Woah there! I was all for this vegetable-based storage plot, right up until they dropped the bombshell that it might be a.. well, bombshell. Catching FIRE?! It all sounded so safe, environmentally friendly and (um) wet. Not fiery death. Not from rhubarb.
I'm off to get a fire extinguisher for our garden. Just in case.
I admire your commitment, but sadly you're in the minority. My observation is that
1) Most people prefer free to paying up front for an unknown experience
2) The deluge of low quality apps has lowered people's perceptions of the value of apps to near zero.
In other words, most people expect to be given disposable or poor experiences, so they just don't pay. In essence the market has moved away from paying up front for the developer's effort, to paying (by in game purchases etc.) for each additional hour of entertainment/use. The end users have been shunted over to a model where they only have to pay for stuff that they actually use - but then they get hammered. It's a vicious cycle - developers produce dross, app purchasers don't pay. Perhaps the biggest issue is not which party is most to blame, but that the app stores do very little to help higher quality apps rise to the surface. I know apps that have been broken for the last year and yet still get shown as first results in searches because they had a massive following on some early version of Android.
I'd love to be the developer that breaks the mould, spends a year creating a AAA experience and then charges 10 quid for it, but as I'm not EA or Rockstar, I can't.
The issue is that Nesta and similar schemes like the TSB are presented as means by which the government supports innovation, particularly in technology. Unlike VC money, they do not require the innovator to hand over some part of their ownership. From that description, they sound ideal, and I've been sucked into various projects that got involved with them for exactly those reasons.
The reality is that they are run by people who are quite distant from what we'd recognise as technological innovation. Despite the headline title, they're run by people who have not experienced the cut and thrust of modern engineering - i.e. civil servants and arts graduates. This is not obvious from the glossy brochures and the 'pitch' you get from officers involved in the schemes, and it's that disconnect that the article highlights.
With a daughter (and a son) whom I'm keen to get engaged with science and engineering, I'm all for products that interest and excite children. I can understand the drive behind the original advert, but the further this saga goes on, the more I feel manipulated. Finding the product is poorly reviewed makes it worse - don't tell me that they've painted some tat pink and sold it to me on the back of 'empowerment'.
Personal recent experience with government 'support' for innovation has been anything but good - woolly targets, big headline figures and a protracted series of hoops to jump through. The end result is that the opportunity for chance success is eliminated, replaced by the job of fitting to a lumberingly slow process which offers marginal financial support to small businesses.
That said, I'm not sure how you make it better. It should be faster, more flexible and not afraid of failure. In that respect the startup scene shows a lot of possibility - and it's here that the government should be able to outpace commerce by removing (or at least reducing) the cold hard financial requirements in favour of enabling people to 'take a punt' on new technology. The TSB funding is not enabling - it supports businesses that can prove they are capable of standing alone, and in return demands they fit the unique pace of government entities.
Given how the Register produces fascinating 'history of computing' articles, the fuzziness around the development of the first ARM processor, and what exactly Acorn did is a bit disappointing.
As is the lack of technical detail of how exactly Apple surprised ARM with their implementation. If someone tells you that they were surprised by something, wouldn't you ask what and why?
If you're building thousands of devices, the same device is available as a solder on module and I imagine the company can sell you a private bit of cloud that you can operate yourself. They've solved the integration and configuration issues that you would otherwise have to deal with if you buy one of those other SoCs.
The fact that we're still seeing different offerings like this suggests to me that none of the other products have created the universal solution to IoT devices. To claim this is less valid when all of its competitors fall down in various ways is somewhat mean spirited.
Except (as I understand it) your ZigBee module (which offers no real cost savings or simpler integration than this unit) needs a separate base station and further integration. If I want to control a ZigBee unit from my Android phone, how much extra kit do I need to buy and configure?
Don't get me wrong, ZigBee, RasPi and all the others have specific niches - and it seems to me that this has its own useful niche too. All the posters attacking it because it isn't one of those other bits of technology make no sense to me.
I think a few of the commentards are missing the point here.
Ignoring the 'I can hack anything' market (which is small, but noisy), this is aimed at the 'deliver something to the consumer' market which is much more boring but far larger in scale. If I, as a manufacturer of light fittings, wanted to make a light fitting that I could control from a mobile, I'd want a module that does all the boring stuff for me, leaving me to do the last step (switch the light on). I'd want it in a tiny form factor and available as a solder-down module that I can just fit into my light fittings.
Arduino, RasPi and so on don't do that. They're large and designed for hobbyists - general purpose hacking devices. That's great, but they're the most expensive option when it comes to integrating with a bit of consumer kit that just needs to perform simple functions and be controlled over the 'net.
I'm not sure if this is the 'right' answer, but it's a lot better than being told to hack around with a Pi just to perform the most basic of activities.
I'm sorry, but good grief, your idea of computer science sounds terrifyingly like the typical gormless middle manager who's convinced he doesn't need to understand the technical details because he's got a firm hand on the budget.
To be honest, the thing that would most invigorate the entire industry is returning the focus to getting actual value from the work we're doing. Creating something. A real result rather than the hyped up nonsense of improved social media penetration and merry-go-round startups whose only existence seems to be to insert themselves in the middle of a perfectly functional value chain to no good end.
We can create and transform industries with the work we do. We can discover new science and help people live longer, healthier lives. We can deliver outstanding entertainment and we can improve every one of the human senses. We can teach. We can save lives and predict deaths. We can create delightful experiences. We can aid discovery and remember for you. Yet all of these things get lost in the mire of big data projects with no discernible outcome or over-hyped startups with paper-thin business models that boil down to selling more adverts. If you ask people today what computers do for them, they'll tell you Google and Amazon - not for the feats of engineering that uphold those companies, but for the experience of being sold stuff at every point of interaction with a machine. If you want people to be excited about computers, we need to start being excited ourselves, and to throw off the hype around businesses whose only value is coincidental to the actual technology and function being created.
Doctors and lawyers get a good rap because people can see what they do. Make people well, prosecute the guilty, protect the innocent. Computer scientists have lost their identity to telephone sanitisers and snake oil salesmen. Real outcomes excite people, not vague nonsense.
The same applies to kids. They want to see outcomes. In our day, getting an LED flashing was still relatively novel - and a sufficiently big step that moving on to a fully working robot seemed only another small step away. The excitement and imagined possibilities drew us in and we learnt around them. These days, getting a RasPi to light an LED or launch a website is utterly mundane and children are left asking where they can go next. The things that excite them (high quality games, Facebook et al.) seem just as far away as they were when they didn't know how to get a linux partition to boot. The challenge to educators is to get children to a platform where they can achieve things that involve them before having to understand decades worth of technological advancement.
That's the problem with left wing readers - they get confused between economic arguments and moral/social ones. You can have a 'right wing' view on the world and still believe granny should get her hip - but you don't start mixing it up with some made up economic benefit.
Try re-reading the article again without the knee jerk reaction. Just because his politics are different to yours doesn't make the discussion moot. And, again, the article (as I read it) doesn't argue against national health care.
Seriously, why do people think they have some monopoly on caring? This is the biggest fraud perpetrated in modern politics - that somehow any given political group has the unique ability to care about old ladies and children. This comment will be downvoted, of course, but it doesn't stop me wanting to see my dear old mum with a knee replacement, or my kids getting a good education.
Agreed. Teabags only need a quick swill round or the tannins overwhelm the taste.
At the risk of losing any tea tasting credentials, I've fond memories of tea served by a rather lovely flatmate who would add a generous slug of brandy to a mix of earl grey and assam. Most evenings would end up with us around the kitchen table setting the world to rights.
I have one in my loft, which has only been switched on a couple of times. I have a feeling the keyboard membrane has suffered in the meantime - when it was last tried out, half a row of keys didn't respond.
One day I'll find the time to have a proper play with it.