Re: I applaud this and hope it is clean. However we could have written this in Perl in 1 line.
One line of C code. That's formatted to look like a dog p***ing on a Christmas Tree and with every other word rhyming.
The number of new build homes is a tiny fraction of the existing housing stock. At the current rate of building, we'd replace all the existing homes in around 160 years. Not only that, but it's the existing (inefficient) homes that most need accurate metering if you accept the argument that accurate metering reduces consumption.
As it is, it's a poorly thought out project, with poor technical specification and even poorer oversight.
John thought hard about what he was going to do.. the most outrageous, dangerous and yet romantic action he had ever dared take in his life. He would..
(cont. on page 2.)
Ha, If you're so clever, YOU tell us what colour it should be.
..the Twittersphere lurches into action, ready to take major offence and vilify anyone unfortunate enough not to have pre-written and proofread every statement they make in public.
Yes, it was not very smart, or sensitive, or appropriate. However, I doubt the sentiment behind it quite justifies the strength of the response.
Very funny if deliberate. Very embarrassing if not.
I thought that dearth of right wing humour was because they were all out getting jobs and didn't have time to hang around student bars trying to get into the knickers of the cute girl on the Revue committee?
There really is a humour problem for the posters who're trying to prove how sophisticated they are by taking apart comedy on the Register Forums. Take the hint from the original review - you can describe why you find something funny for the benefit of others who find the same things funny. You can't make something 'not funny' because you're simply excluding yourself from the group.
Maybe I'm optimistic, but those posters complaining on the grounds that this won't run their web service any faster (if at all, with its new OS) have missed the point.
In the data crunching corner of the world, most of the innovation is around describing 'non-traditional' processing tasks and then mapping those (very painfully) on to traditional hardware. Everyone will tell you that adding another off-the-shelf node to a compute cluster is cheap, and that you can expand to build a cluster capable of handling the large loads that big data, inference and graph compute problems throw up. The problem is that big clusters do not scale linearly when it comes to reliability, and network and disk effects mean that at least half of a cluster's energy goes into overcoming the dead weight of having compute resources that don't match the task. We don't actually want a vaster, more manageable cluster of Linux boxes; we want a compute resource matched to the process description.
Soooo.. as Mars shots go, this could make some sense. We're starting to describe processing in terms of directed graphs of actions, which can be mapped to both batch and real-time workloads. An architecture that starts with the premise of many actors consuming a vast store of messages in a robust and scalable way would potentially outperform today's clusters by orders of magnitude. Given the cost of provisioning and maintaining a modern cluster, the exotic nature of the Machine may be a small price to pay.
I'm reminded of the early introduction of NUMA machines, which suddenly introduced capabilities that allowed tasks that used to be done by a building full of mainframes to be done by a box that sat under your desk. This architecture could potentially do the same to clusters - and not by virtualising thousands of machines into one box.
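To make that less abstract, here's a toy sketch of the shape I'm describing - entirely my own invention (all the names are made up for illustration, and it has nothing to do with HP's actual design): many actors draining a shared message store, with the processing expressed as a chain of actions rather than baked into the machine layout.

```java
import java.util.Arrays;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;

public class ActorGraphSketch {
    // A node in a directed graph of actions (purely illustrative).
    interface Action { String apply(String message); }

    public static void main(String[] args) throws InterruptedException {
        // Stand-in for the vast, shared message store.
        BlockingQueue<String> store = new LinkedBlockingQueue<>(
                Arrays.asList("event-1", "event-2", "event-3", "event-4"));

        // Two chained actions, i.e. two nodes of the graph.
        Action parse  = m -> m.toUpperCase();
        Action enrich = m -> m + ":enriched";

        // Each thread is an 'actor' draining the shared store; add more
        // actors and the same description scales without being rewritten.
        Runnable actor = () -> {
            String msg;
            while ((msg = store.poll()) != null) {
                System.out.println(enrich.apply(parse.apply(msg)));
            }
        };
        Thread a = new Thread(actor);
        Thread b = new Thread(actor);
        a.start(); b.start();
        a.join(); b.join();
    }
}
```

The point being that the description of the work (the actions) is decoupled from the resources doing it (the actors) - which is exactly the match of compute to process description that clusters get wrong.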
I waited for the comments to come in after this article arrived. On my score card I got points for "language is verbose", "Desktop sucks" and "Java security is terrible". I'm quite surprised not to have got "It runs too slow" and "the VM is too big".
Despite all the revisionist history, Java got in there because it solved a whole bunch of problems that no other platform quite managed. For a start, outside of Delphi, it was just about the only 'platform' that you could get hold of. Sure, you could mix up a nightmare brew of C, X Windows, SQL libraries and so on, but that was so painfully vendor-dependent that each and every project required a fresh start. In contrast, you could knock up a Java app with a GUI that talked to the network, database and file system of your choice and not have to re-build it every time you needed to run it on a different machine or for a different client. The browser integration promised (but never delivered) that those apps would eventually be delivered automagically by the internet, but even without it, software houses could deliver tools and demos and apps far more easily than they had before.
Things like the standard runtime libraries, easy dependency injection, automatic documentation generation and a very consistent approach to API behaviour meant that picking up Java to do a job was an order of magnitude easier than the equivalent in just about any other language at the time. That's nothing to do with CS students or managers' preference - it was a handy tool that also happened to scratch the itch that most developers have to learn something new (which has since benefited Ruby, Python, Scala et al.).
The Java guys have never really got UI development, and that pretty much killed off browser integration and desktop apps, despite repeated attempts like JavaFX. It's funny therefore to see Android use Java for a UI-heavy platform. (It's also probably worth mentioning that Minecraft seems to have done fairly well for a little game written in Java.) Meanwhile, web development and now big data have absolutely thrived on the ability of Java to evolve into new areas.
Whilst Scala has been an interesting diversion, the small size and relatively limited resources of the development team have been a constraining factor - resulting in painfully slow tools and weirdly inconsistent core libraries. With Java 8 making useful inroads into functional styles, and the solidity of things like the collection classes, I'm not sure that the 'other JVM language' will ever get out of its niche.
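For anyone who hasn't looked at Java 8 yet, this is the sort of functional inroad I mean - a minimal, self-contained sketch (my own toy example, not from any particular project) of the new style over the standard collections:

```java
import java.util.Arrays;
import java.util.List;
import java.util.stream.Collectors;

public class Java8Sketch {
    public static void main(String[] args) {
        List<String> names = Arrays.asList("Ada", "Grace", "Linus", "Bjarne", "James");

        // Filter, transform and collect in one declarative pass - the
        // kind of job that needed explicit loops and temporary lists
        // before Java 8's lambdas and streams landed.
        List<String> shortNames = names.stream()
                .filter(n -> n.length() <= 5)
                .map(String::toUpperCase)
                .sorted()
                .collect(Collectors.toList());

        System.out.println(shortNames); // [ADA, GRACE, JAMES, LINUS]
    }
}
```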
So - thank you Java. I still remember evaluating it for an early project (and having to wait for it to be delivered by post on a CD) and twenty years later, it's still relevant to my clients and still evolving into new areas. It's far from perfect, but it does a lot of jobs pretty well. Long may it continue.
I use Windows, Linux, Android, OS X and a host of other operating systems in the course of my work - I'm fairly platform agnostic, but avoid the Apple consumer products as I'm not keen on the lock-in. That said, I've been using an MBP for two years and it's a fantastic developer's machine - powerful, fast and mechanically robust. It gets dropped in a backpack at the end of each day (no sleeve) and bounced around on the commute and (touch wood) remains in perfect condition. Compare this to any Windows laptop and it's head and shoulders ahead in terms of longevity and utility.
My FIL was developing home automation, switches and sensors back in the '80s - you could switch on your kettle from across the Atlantic, answer your door with a remote control and all of the 'not quite useful' concepts that the IoT brigade are currently getting over-excited about.
The point has to be that we build our environment to suit the simple needs of a simple bag of flesh - simple light switches, the front door not so far away from the rest of the house, the heating more or less able to keep the temperature within tolerable bounds and so on. Sure, you can impose multi-zone ultra-precise control and feedback on all that, but (a) we don't naturally think that way and (b) we don't actually care. Do I want complex lighting schemes controllable from a smart-phone? No - I switch the light on when I walk into the room, and off when I walk out. A bit brighter, or a bit dimmer occasionally, but none of that justifies spending several hundred pounds to make a job that is already pretty convenient more complex and more prone to problems.
I don't doubt that someone will come up with something interesting eventually, but unless the device makes a significant difference to the way you live your life, there's no reason to spend so much effort and money when there are simple alternatives in the depressingly analogue world.
Java's actually very good for this sort of stack - portable, robust and flexible enough to embrace the steady shift of development frameworks as we better understand the problems we're trying to resolve. It's far from inefficient and the tooling around it is excellent. If you're still comparing your iPhone-sized database with the sort of thing we're doing with Hadoop, you need to re-read the part of the manual that explains how not all databases are the same. The system I'm currently working on consumes about a terabyte of data a day and retains that indefinitely. Not only that, but it delivers value - we're talking eight-figure sums annually here for a single use case, but that is a long, long way from the sort of thing you'd achieve with SQLite.
The real problem is wrangling a cluster of a few hundred machines and providing production-level SLAs on processes running on that cluster. The tooling is catching up, but we're working from the ground up on technology that is still desperately immature. There are a LOT of moving parts in a typical deployment, and whilst Hortonworks, Cloudera and the others are getting better at bringing a working system up, there is much to be done to ensure BAU services are business as usual, not a series of experiments. It doesn't help that there are dozens of different frameworks and approaches to implementing a given solution - nearly everyone I talk to has found a new combination of tools to use, and that's preventing companies from focussing their efforts on making one particular toolset feature complete and robust.
So when it turns out we have accidentally declared war on the peace loving Mercurians, we'll know who to blame.
And yet Microsoft, Apple, Google, and the Linux community all survive with automatic patching and no-one has managed to hold the entire world to ransom.
That's not to say it can't happen, and of course updates have caused problems in the past. However, I think most companies understand their exposure.
When it comes to terrorist plots, stopping middle class managers from getting to their meetings on time is hardly going to have the impact they'd want.
The only reason I can conceive for not using a general purpose device like the Pi is if you can make it significantly cheaper. Given the low cost of the Pi, I'd be interested to see the specs of the MicroBit.
Do I want the dominant nation (America) to completely dominate the media I watch? No.
Do I want advertisement and (expensive) subscription rates to be the only way I pay for my media? No. (and whichever way you cut it, the monthly charges for Sky, Netflix etc. are easily as high as the BBC when you consider the range of content they provide).
Do I think the BBC has innovated in the digital space? Absolutely. Whilst the US argued over content rights for streaming media, and fought patent battles, the BBC delivered iPlayer and made swathes of quality current and historical content available.
Do I use the BBC news, weather and other sites on a regular basis? Yes.
Do I listen to Radio Three, or Six? No - but I'm glad they're there to support artists who'd otherwise have to fight angry tigers on "I Have X-factor Get Me Out Of Here" in order to be heard.
Do I think the BBC is biased? Probably, and it varies across channels and programmes, but so is every other media outlet.
Should the BBC be funded to continue to punch way above its weight on the global stage? Absolutely, though how, I have no idea.
I wouldn't suggest YT just puts up a paywall - and of course most visitors cannot or will not ever pay.
At the same time, whilst you say you won't put your credit card into your son's play account, you probably did to let him play Minecraft. If you could 'charge up' his YT account with a one-off payment of, say, £2 and that would then let him watch 2000 videos free of advertising, would you find that so onerous?
YT already distinguishes between monetised and non-monetised videos, it wouldn't be a stretch to identify 'premium' video channels that require a subscription, or to specify as a content producer that you will accept a specific combination of free/paid/advert-laden views.
None of this stops YouTube from carrying on exactly as it is, but would open up a more concrete revenue stream for people who don't want to only watch/produce hilarious cat videos.
Zoella and a few others are exceptions rather than the rule. Given the vast number of videos and channels that get posted to YT, a small set of outliers getting large amounts of attention (and therefore money) is to be expected.
However, if you only get a 'normal' amount of attention (say, the sort of viewing figures that many BBC programmes get and are happy with), the revenue is dramatically smaller. Typically smaller than the production costs of even a modest 'proper' video.
The economics seem to mean that unless you're pushing out something new on a near-daily basis and your production costs are nil (i.e. you're a vlogger), you're basically not going to make money.
If Google implemented something like micropayments and paid content authors a tenth of a penny per view, the economics would change dramatically - and the skew away from a tiny handful of mega-stars might allow for better quality content. I suspect a lot of viewers would gladly pay that sort of money just to view videos with no interruptions.
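To put rough numbers on that (mine, purely illustrative - the rates and view counts are assumptions, not anything Google has proposed):

```java
public class MicropaymentSums {
    public static void main(String[] args) {
        double perView = 0.001; // £0.001 - a tenth of a penny per view

        // A viewer's one-off top-up: 2000 ad-free videos for £2.
        System.out.printf("Viewer top-up:  £%.2f%n", 2000 * perView);

        // A 'normal' channel vs. an outlier, revenue per video.
        System.out.printf("Modest channel: £%.2f (100k views)%n", 100_000 * perView);
        System.out.printf("Viral outlier:  £%.2f (10m views)%n", 10_000_000 * perView);
    }
}
```

£100 per video for a modest audience won't make anyone rich, but it starts to cover real production costs without needing Zoella-scale viewing figures.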
I think the issue is that you may indeed heat your house with a GSHP (nice big garden?), but I bet you still have lights, washing machine, dishwasher, tumble-dryer, wifi, tv and a dozen other electronic gizmos plugged in or on charge. I would bet you travel to work in a car (or rail, bus) and fly abroad at least once a year. I imagine your house is relatively new to be efficiently heated by GSHP - so the materials are likely to have high carbon cost.
The cost of moving to renewables is not just magically switching on the GSHP we all have hiding in our gardens, but building them, delivering them, ensuring the housing stock is sufficiently energy efficient to use them. On top of that, some huge percentage of the population are not in locations where GSHPs can be installed. Even if they were, they would still commute, use modern tech and fly abroad on holiday. You can airily wave away the needs of the third world countries ("let them all use solar!"), but beyond nice, very low powered (and not very bright) LED lights, the moment they start pulling themselves up to a better standard of living, they'll need massively more energy too.
Your personal heating is a gnat's testicle compared to the steaming pile of energy we all use all the time. This is from someone who has a low energy home, heats it through solar and on-site biofuels (logs!) and has looked into the options for off-grid and renewables.
Yes, but have you heard how *loud* it is... you'll hear that coming from a mile away
There's some ideal that, for a culturally healthy society, the talented creators should be encouraged (and let's not kid ourselves that these are plentifully available - we've all seen X-Factor trying to find a single talent from a pool of tens of thousands). As such, I'm all for elevating those few to a point of wealth and success. Those who argue that historically musicians used to have to roam the country to earn their crust miss out the fact that historically we lived in mud huts and the majority of the population were illiterate.
Both extremes of the argument (big music corps vs. 'the music should be free') are exactly that - extremes. I'm all for the current system being realigned, but to go to the other extreme and destroy the incentive to create is just foolish. Arguing that people will create anyway also misses the point - why should we restrict our culture to only those who happen to have the rare combination of talent and dumb pig-headedness needed to create when they are punished for doing so?
Notably the same argument goes for any intellectual work - we've mentally downgraded its value as the supply suddenly seems so plentiful, but in doing so we're making a strong economic case for lowest-common-denominator work rather than brilliance. If we restrict the supply (or flood the market from a single source) we lessen the likelihood that outliers can crop up that produce something amazing.
This article has a certain amount of stating the obvious, and could have been written at any time over the last couple of years. How about actually comparing some of the current kit available to buy/build?
Agreed. We have a Hubsan X4 - it's cheap, it records acceptable video, it's easy to fly and it's a good introduction to all the things you have to consider when owning a drone: maintenance, repairs, batteries, planning flights and so on.
I like PC, like the one-liners and the darker approach.
What doesn't work so well for me is the harsh line between story arc and episodic format. Moffat seems to have fallen into the rhythm of one-concept per episode (too often borrowed from a film), which gives little time to explore an idea before they have to leap to a quick conclusion (*cough* shooting a spaceship with a gold arrow). The heavy handed teasers that something else is going on do not constitute a story arc, so much as build to a series end that cannot possibly satisfy once you've got past the "so that's what it meant" discovery.
A little more continuity between episodes would help, as would one or two properly identifiable baddies that the Doctor can bash up against rather than simply not understand before producing a rabbit from his hat. At the moment, it's like watching a pinball machine: individual events are exciting, but the lack of any flow makes it a bit exhausting.
There are some companies who use big data systems solely so they can tell their clients that their services need big data systems. Equally some use the technology because if their employees thought they were working in yet another low-end service shed, they'd go and find somewhere more CV positive to work.
On the other hand, you can find yourself working on a project where the company behaviour is sufficiently well characterised (MVT et al.) and the product sufficiently data-led (e.g. large-volume online sales) that you can plot a straight line between insight and action, and then close the loop with feedback on sales uplift. At that point, yes, you can realise value. If you are unable to identify what you are going to change as a result of the vague insights you hope you might get, or if you are unable to measure the difference between your approaches, then big data (however you define it) is not going to pay its way.
How many companies have tried to build a persistent online world that builds communities and has interest outside of the hardcore gaming community? How many companies successfully support 100m users? Clearly there is service knowledge and technical understanding that is of value to any company wanting to scale online environments. Add in a large user-base and it makes some sense.
From another perspective, this sort of success isn't something you can produce to order. Notch was lucky to hit the right combination of elements and skilled enough to respond to the early community to fine tune the recipe. Having got there, Minecraft has successfully seen off many imitators. At the same time it's been clear for a while that Notch himself has been far less comfortable managing the expectations of a global audience, so his exit is understandable.
My understanding is that a lot of the costs for Mojang are in maintaining a global server infrastructure, so you could imagine that being brought under Microsoft's wing could result in savings and hence more easily reached profits.
Crunch - why not Cascading or Pig? The point is, there are a lot of options in this space right now, most of which are moving/already run on the new execution engines. I'm glad you feel you can call the 'winner' on this, but from where I'm sitting we're back in the era of fighting over which text editor to use. It all generates plenty of work for the 'serious' end of the industry (yeah, and our production cluster is bigger than yours), but jumping from framework to framework to keep up with the latest trends is not ultimately productive.
It's fairly clear that we can see beyond Map/Reduce to more sophisticated distributed processing. There are quite a few contenders for the next generation platform and it looks like this is yet another.
However, I don't think we're there yet. Most options are about reducing the pain of M/R when you've got iterative jobs and more complex workflows, but a lot of arguably unnecessary pain remains. At some point I'd expect a generic way to describe such workflows to emerge and to become the de-facto standard. For now, none of the proposals is so compelling that developers have stopped coming up with proposal n+1.
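To be concrete about 'a generic way to describe such workflows': here's a minimal sketch of the declarative style all these contenders are converging on, using plain Java 8 streams as a stand-in for a distributed engine (deliberately not any particular framework's API):

```java
import java.util.Arrays;
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

public class FlowSketch {
    public static void main(String[] args) {
        List<String> lines = Arrays.asList("to be or not to be", "be quick");

        // The canonical word count, described as a flow of transformations
        // rather than hand-written map and reduce phases. Point the same
        // description at a distributed collection and an engine is free
        // to schedule it however it likes.
        Map<String, Long> counts = lines.stream()
                .flatMap(line -> Arrays.stream(line.split("\\s+")))
                .collect(Collectors.groupingBy(w -> w, Collectors.counting()));

        System.out.println(counts); // be=3, to=2, or=1, not=1, quick=1 (order varies)
    }
}
```

When one description like that can be handed unchanged to whichever execution engine wins, the text-editor wars in this space can finally stop.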
To paraphrase: "A positive result will mean we come out of this experiment knowing less than when we started!"
Working out we're holograms from within the hologram is cool. When it gets interesting is when we work out a way to do something about it.
I think that's a gross misunderstanding of MS's technique - unlike digital image stabilization techniques where you move successive images around to minimise 'shake', MS are generating 3D geometry along the entire path of the movie and using it to back fill missing segments.
So rather than aligning images, they're creating a completely artificial camera path and using images and computed geometry to render that path. I'm not aware of F/OSS doing that, and maybe you need to take that chip off your shoulder?
We're desperate for good Hadoop engineers, with solid Java, web services and Nosql experience..
Foolishly we were looking at people who'd been in the industry for a few years, when we should be interviewing 6 year olds.
If they did that in the UK, pranksters would have put it on the Eurostar by now and be partying with it in Ibiza within the week.
Our boy has just had his birthday and we got him a Wii U to replace an aging Wii. With Super Mario 3D World and Mario Kart 8, he's delighted. It's a nice system and the fact that there are few games doesn't matter to him because the few that do exist are outstanding and games he plays for years (literally, he restarted Mario Galaxy recently and played it through, missing nothing).
Nintendo have never played well with third parties (Rare were the one exception, and they played a clever political game to get in there), so they've always been more dependent on their own titles. It makes the consoles look more limited compared to Playstation and Xbox, but doesn't hurt the owner if they're happy with the Nintendo 'style'.
As a techie, of course I'd love a system that gives photorealistic graphics, real online environments and so on, but as a Dad I play casually and cannot see the point of investing in the Xbox or Playstation ecosystems when the vast majority of games are just cannon fodder and very expensive for the limited time I can put into them. We bought the Wii U knowing there were enough games to 'last until Christmas' and the upcoming releases look like they'll go far beyond that. I enjoy dipping into Mario and if I want something more 'sophisticated', it'll go on my PC.
We've had smart meters in our energy-efficient home for five years now. As has been pointed out, usage goes like this:
1. Install meter
2. Gasp at how much power appliance X uses
3. Get used to it and do nothing, as we need appliance X and a replacement costs hundreds
The really hungry appliances are easily matched by 'background' kit (lights, things on charge, fridge, broadband), any one of which would cost a lot of money to change for a relatively small saving.
For homes where it would be possible to make a bigger saving (e.g. electric hot water), there is often a reason why the expensive option is there (no gas, family can't afford boiler replacement etc.), so a meter isn't actually going to make a difference either. Of course consumer education is a good thing and some will make the switch, but you could achieve the same thing with a TV campaign explaining how expensive it is to heat your home via different routes.
I heartily applaud an MP who has considered these issues and realises that Smart Meters are an expensive commitment in a rapidly evolving field - but the question then is: what should we be doing? I quite like the smart home hub (e.g. OpenDCU) idea where, instead of putting in a closed bit of kit, we support an adaptable smart home infrastructure and standards that allow many to use it in interesting ways. Much like France introducing Minitel, it could have a much bigger effect than just measuring our energy bills.
As an app and web developer I can say that no, existing web standards are not good enough to be able to give up native code. That's not to say that a lot of standard press-and-select apps wouldn't happily exist online, but the hoops you have to jump through to provide a slick, reliable and immersive experience are quite nasty. Sometimes even apparently simple UIs require that many threads run in close synchronization in the background to ensure everything turns up exactly where and when the user expects them to.
Of course the other issue is one of revenue. If it takes me X weeks to code up a half way decent experience, how do I put it on the web and pay my bills? Ad revenue is a miserable compensation and including adverts only serves to interrupt the user's enjoyment. Paid apps at least connect the idea of some value back to the work you've put in.
On the whole though, the current ecosystem and platforms are tending towards lowest-common-denominator experiences. It's painful to develop for the web, and just not worth putting months of work into apps that have shelf lives measured in weeks. Users bemoan paying for something, even if it provides many hours of entertainment or use, and there are no discovery mechanisms for genuinely different experiences.
Regardless, the current crop of browsers do not provide a platform for delivering meaningfully better tools and entertainment to the user.
Clearly a character, and for magazines such as Your Sinclair that informed my young life, I couldn't be more grateful. The fact that there was so much more to him makes him all the more wonderful, and all the more of a pity to see him gone.
Strangely, I find myself disagreeing. It's true you can strap a phone to your head for pennies, but I suspect it demonstrates (as per the review here) just how much virtual 3D sucks, even with modern mobile graphics hardware driving it.
The Oculus seems to be showing that there's a bunch of 'other stuff' that has to be solved to get properly immersive VR to work - display refresh rates, latency and lag, accurate position sensing, robust optics and so on. Much of that is just not worth integrating in a phone on the off-chance you'll strap it to your head, and some of it is no doubt patentable. The many failed attempts at virtual 3D over the years seem to show that 'good enough' is not good enough, because we're highly sensitive to artificial reality being not quite right.
It's not clear to me whether Oculus are even going to solve this. They're iterating over the hardware and the experience is 'getting better' with each iteration. That suggests that it's not quite there yet. I don't see this becoming commodity hardware any time soon - and that lag has historically been enough for new entrants to move in to dominate a market (e.g. Sony with the Playstation, Apple with the iPhone).
Unfortunately in the race to compete with Apple, I doubt Google are going to do a thing about the clone app issue - despite the fact it's clearly easily automatable.
That a third party is identifying 'odd' Google store placements and ratings is depressing. Surely the 'giant of search and metrics' should be capable of curating their own collection? As an app developer that sees a well rated and successful app out-placed by competitors who last updated their offerings two years ago, this is very frustrating. Google make it very, very difficult to compete on quality of service and user experience.
Oh dear, name fail.
It's an old story (urban legend?) that BT found it got consistently better microphones out of the charcoal supplied by one specific charcoal burner in Cornwall. They spent a lot of time analysing the type of wood, how he burned it and how it was granulated to try to work out why his charcoal was better than any other. They could find no difference in the materials or process, and the guy was particularly unhelpful. In the end they sent someone to spy on him to see what he was doing. It turned out he held a grudge against BT and would routinely pee on the charcoal sacks before they were sent off to the company.
My favourite sign heading north from Cambridge, sadly now lost to an idiot named Dave who is apparently big.
You must know how important it is in space to know exactly where your towel is?
To some extent, the manufacturers have brought this on themselves. Many of them have treated the smartphone market as an exercise in throwing boxes over the wall and forgetting about them - slow updates to already out of date OS installs, very flakey 'own brand' apps that are poor relations to the sort of thing you see on iPhones and modern, stock Android and so on. This doesn't have to be the end of Samsung - they already want to differentiate themselves on user experience - but it will clear out some of the low end players who damage the reputation of the platform. That's going to happen anyway as the smartphone market matures.
It might also help if they apply the same logic to the App market and discard some of the dross that still turns up when you search for something useful. The Silver brand might be a smart way to step away from that legacy shovelware without admitting that the Play Store (just like iTunes) plays host to a vast number of shoddy rip-off apps.
With Google playing to the budget market with the excellent Moto G, I don't think the consumer is going to lose out, really.
..now I've got the Tetris music going round my head.
Given that Amazon are going to be offering games, it's very relevant - OK it's not going to replace a PS4, but might give Nintendo a few sleepless nights.
It seems unfair to lambast a site with such lofty goals, but having watched it develop at a snail's pace it's hard to feel that there is much going on behind the facade. If this were a startup I'd expect to see rapid evolution and a focus on responding to its own users' requests. Instead, the site has made minor changes and there have been no significant alterations to the current model of interaction, which is gaining such poor traction.
In short, I'd expect a lot more to be done with the money, and to see iteration (or indeed pivoting) to take full advantage of the global media coverage. The impression it leaves is that not only has Cole 'conned' the taxpayer, but the Impossible team have 'conned' her into supporting a badly managed project.
For a site that aims to encourage social interaction, it's notable how opaque the whole process is. We don't know the who, the how or the why of this money being spent.
It's very noticeable that installing an app for free, or costing pennies seems to grant the installer the right to slag off the developer, berate them for not adding features and get angry (properly angry) when it doesn't behave exactly as expected. It's even worse if the installer thinks that the developer is (gasp!) making money from it.
If he's been making $50,000 a day, that is a LOT of messages from people who feel they are entitled to a triple-A experience tailored to their exact playing style.
> Kind of makes one realize how little a single user is worth.
Or, to put it another way, how little users value a good experience.
" The quinones are dissolved in water, which prevents them from catching fire..."
Woah there! I was all for this vegetable-based storage plot, right up until they dropped the bombshell that it might be a.. well, bombshell. Catching FIRE?! It all sounded so safe, environmentally friendly and (um) wet. Not fiery death. Not from rhubarb.
I'm off to get a fire extinguisher for our garden. Just in case.