104 posts • joined 11 Jan 2011
Re: Cost @ km123
Au contraire yourself, mate. I've never read a bigger pile of crap or any other comment that's as poor as yours. I couldn't downvote you hard enough
FWIW, the Pi was designed in Britain and is now also manufactured in Britain.
@Pascal Monett Re: "an experiment in education of pirates"
You're almost there but there's an aspect to Minecraft you forgot to include: the near-constant release of extra content for it (which is free, I might add, and not paid for like most "DLC"). The model for Minecraft is essentially: "Sure, pirate our game all you like - you can even have this free version (which has a limited subset of the features from the main game) - but you won't get all the cool extra stuff we add if you pirate it". There *is* a protection/"DRM"-ish component to it as the game validates against your Mojang account whenever it launches, and multiplayer servers validate your account whenever you connect. No account, no Minecraft for you.
Of course you could pirate each new version that's released, but you still can't play multiplayer - a major component of the game if you ask me - unless you've bought the game or you use a hacked server.
To correct a few further factual errors... Minecraft was indeed originally built by one person (Markus Persson, since you asked), but he didn't sell his game to a company; he founded Mojang with the money he'd made from Minecraft, because he couldn't hope to keep up on his own. He might not work on Minecraft any more - others in Mojang do however - but he's still a developer.
Downvote me all you like for being a pedant, but I can't help myself - when you create your own mods & texture packs it's hard to avoid describing yourself as a massive Minecraft nerd.
Bigger watermarks it is then
I've used watermarks on all my "proper" photos (i.e. not the drunken mobile phone snaps) for a long time now, but they can easily be cropped out. EXIF data can easily be stripped, and steganography seems to be a bit of a dead-end in this case - if $random_person/$random_company on the other side of the globe starts using your images without your permission, how the hell are you going to know about it let alone run their copy of the image through your software...?
I guess the only realistic answer might be a watermark that covers the entire image...
Dell Mini 9
Not specifically the focus of this article, but I owned one of these machines and utterly loved it. The only real flaw was that SSD writes got quite laggy now and then, and replacement drives were quite expensive... Brilliant little machine.
Ended up getting rid of it because I wanted a bigger screen and a less cramped keyboard.
Re: Moral of the story...
Agreed 100%; I'd agree 200%, if that wasn't such bad mathematics... Even routers supplied by ISPs are usually supplied with encryption already applied these days. That might not stop someone trying to defeat that encryption (they won't have to do much work - can you spell "database of default passwords"?) but it certainly would have stopped Google's accidental drive-by slurping. And if you're using a public or unencrypted WiFi hotspot then you should be well aware of the security risks involved - even *Windows* warns you about unencrypted networks, ffs...
The fine is somewhat toothless, but I have utterly no sympathy for anyone who had sensitive data collected.
Your provider gives you a WiFi router that doesn't let you add encryption? Man, I'd hate to sign up with those guys... Even Virgin's SuperSh*teHub - the biggest pile of dog turd masquerading as a router I've ever seen - lets you add WPA2 encryption.
You're comparing accidental data collection to rape? Really? That's the stupidest thing I've ever read, and I read YouTube comments.
I urge you all to downvote this retard as hard as you can.
This is actually remarkably easy. It's a little fiddly to pair the WiiMote with the PC, depending on the manufacturer of your bluetooth chip and your OS, but it does work. I forget the name of the software involved, but it's perfectly possible to make the WiiMote emulate a mouse and remap the buttons to various keys/functions; pairing it to the PC doesn't involve any extra software at all, though it is pretty useless unless you can map it to functions...
Though you do need to use the "sensor" bar in order to achieve this. This is all from several-year-old memory, but the info is quite easy to find with everyone's favourite data-snaffling search engine. IIRC, the "sensor" bar doesn't actually have sensors but has IR sources for receivers in the WiiMote. The cables only have two pins - ground and +5v - so pick up a sacrificial one on the cheap, find the pinout on the web somewhere and hack a suitable power source together (if you're using it with an HTPC, don't forget about those handy-dandy Molex plugs with both +12v and +5v...). I've read reports of people using lit candles to simulate these IR sources, but YMMV - I'm not about to stick lit candles anywhere near my PC or TV to test that...
Alternatively just leave your sensor bar plugged into the Wii and have it on standby all the time - IIRC the sensor bar still gets power when the console is in standby.
@mdava Re: Who are they kidding?
Have a downvote, purely for the use of "Xbox" as a verb. "for the kids to Xbox on"...?! Seriously? Do people speak like that or are such abominations constrained to text-based exchanges? Ye gods...
Prime time viewing
Comfy chair? Check. F5 on standby? Check. This comment thread is sure to be interesting to watch; all I need now is some popcorn...
Just do me a favour and try not to celebrate and rejoice in the fact that another human being has died, regardless of what you think of that person...
SaaS, IaaS and ... PaaS?
Do people really use acronyms like this daily? I look at this stuff - including any mention of "cloud" - and all I see is "BUZZWORD! BUZZWORD! BUZZWORD! BUZZWORD!". Yawn.
And if the people I work with (and have worked with over the years) are anything to go by, I agree with the posters above me: truly self-service IT is a dream that will probably never be realised. If there's one thing I've learned it's that idiot-proofing something is just asking society/Darwin/$deity to produce a better idiot.
Was going to post this as an AC, but screw it - downvote me all you want...
I've long ago given up on modern consoles; most of the games out there are turgid dross and I've played them all - in one incarnation or another - many many times before. Generic Modern WarDeathKill #46 vs. Yet Another GunKillShootDeathGame #12? Thanks, but Doom, Quake, Half Life, Halo, Medal of Honor, Thief, Splinter Cell, etc, beat you to it (and probably did a much better job of entertaining me). Selling my Xbox was the best thing I've done with it.
When it comes to PC gaming, always-on has been creeping in through the back door for a long time now. We as consumers had the choice to vote with our wallets a long time ago and we failed to take it. If people hadn't bought games with this requirement, publishers would not have seen these games as a success and would have dropped the idea. We didn't and they haven't, so here we are. And you can't ignore the fact that, even in the supposedly bullet-proof world of console gaming, piracy is a real problem and publishers/devs *will* want to protect themselves from it.
At the end of the day I'm interested in playing the game. I couldn't really give a crap about whether it needs me to be always-online or not, I just want to play the game. I have a fast and reliable internet connection at home - and the day you manage to prise my router from my hands is the day they bury me - and I have a fast and reliable connection on the move; it simply isn't a problem for me. I caved and bought SimCity - despite knowing about the launch-day problems and online requirement - because I wanted to play the game. StarCraft 2 needs me to be online for battle.net, but I don't really care because I enjoy playing it (I'll admit that having to log in to my battle.net account *every* *damn* *time* is a pain in the balls however).
It might be a problem if I wanted to play the game in 3/4 years' time and I discover that the publisher has dropped support for it, but the chances are that I'll be on Windows 93.6 or whatever by then and I'll have to get third-party hacks and patches just to install it - in the same way that I have to for many older games that I still play today. And all the people crowing the oh-so-popular opinions about EA and SimCity seem to forget that EA are still supporting Spore to this day, despite the fact that it was one of the biggest flops in PC gaming history.
If anything, this opens the door for indie devs with more creative ideas, and that is no bad thing. Since Minecraft rocked my world two and a half years ago, I've been buying more indie games and fewer "triple A" releases, and I've been having a lot more fun for it. Most of those don't have always-on requirements and are fun & engaging in ways that many "big budget" games couldn't possibly hope to be. It took me six months to beat FTL and I still can't beat it on "Normal"; despite the game kicking me in the balls at every opportunity I still come back to it. I've played Super Hexagon for a total of around 6 hours now and the longest I've lasted in that game is 48 seconds - another game that relishes kicking you in the family jewels (repeatedly, while wearing steel toe-capped boots, and shoving hot pokers in your face while it does so). And if Minecraft tracked hours played the figure would utterly horrify me; I poured at least 12 hours into it over the weekend alone and I consider that a relatively Minecraft-free weekend...
Re: Gartner are full of it
Indeed, I get the point. But a slowing of sales doesn't equal the destruction of the PC market outright. And I'd be happy to be the poster-boy for the "You don't always need the latest and greatest in order to get by" crowd; my dual-core chip was already two years out of date when I bought it, and I only retired it recently after over 4 years of excellent service. (Well, not really retired, more like re-purposed in HTPC form, but that's irrelevant).
But I would never dream of trying to use my Nexus 7 or any tablet to do something as relatively un-resource-taxing (that was a horrible phrase, I admit) as managing the workflow for my digital SLR camera. Or building 3D models. Or editing/rendering HD video footage.
For the average consumer who sees a computer as a portal to internet or Facebook, sure - in fact I'd argue that tablets have *already* killed the PC in that market sector. But for the person who does more than just consume content - or even indeed for the average Reg reader - a tablet simply does not compare to a full PC.
I would welcome a return to the age where owning a computer means you don't just point it at Facebook all the time. And that's not me trying to be elitist, it's a veiled protest at the dumbing-down that PCs have experienced over the years. These are complex beasts; you *should* have to know at least a little bit about what goes on under the bonnet, you *should* know what all the basic parts are, you *should* know how to fix common problems (or at least know how to research the problem yourself), you *should* know how to upgrade it, you *should* know what an OS is and how to wipe/re-install it, etc, etc...
Re: Gartner are full of it
Uh... None of that proves that tablets will kill PCs though?
I also have a tablet and when I want to just browse stuff that's what I go to; I also plan to get a keyboard dock for it and it can then double-up as a remote SSH client - maybe even write blog posts etc on it...
But none of this means that I'm about to sell the big shiny quad-core beast sat under my desk at home. Nor does it mean that PCs are going to go away altogether. Hence, "Gartner are full of it".
Gartner are full of it
I was about to reply to Eadon's post, but then I realised who posted it; I'll just neatly side-step that for now...
Tablets will never kill PCs. PCs might change and evolve to the point where we no longer recognise them as being the same beasts they are today - see the rise of Mini ITX, NUC, (to a lesser degree) Raspberry Pi, et al - but there will always be room for a higher-powered device with a separate keyboard, mouse & display. The day I relinquish control of my PC is the day you prise the mouse from my cold dead fingers.
Tablets are simply tools for a different job: when I'm editing video footage or creating 3D models I have no desire to reach up and try to touch my screen; when I'm sitting on the sofa idly flicking through some inane website (probably involving cats) then I have no need to boot up the powerhouse PC.
This, in a somewhat roundabout way, reminds me of some comments made by Pixar many years ago regarding PC graphics performance... A GPU manufacturer (think it was Nvidia) said at the time that their latest product brought them close to real-time "Pixar level" rendering. A Pixar bod then responded with the exact technical detail of the hardware involved in rendering Toy Story, closing with the disparaging insinuation that there was no way in hell that an AGP port could handle the bandwidth needed (yes, I said AGP - that's how old this story is).
Re: Red Dwarf
That was no accident! That was first-degree toastercide!
The dream of HTCPCP come true!
Finally, a use for the Hypertext Coffee Pot Control Protocol! (http://en.wikipedia.org/wiki/Hyper_Text_Coffee_Pot_Control_Protocol)
Substituting with wife/husband/colleague/office skivvy/gullible "mate"/minion is no comparison to a proper, 100% compliant implementation of HTCPCP.
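For the uninitiated, a compliant pot would speak something like this over the wire - a minimal sketch of a BREW request as RFC 2324 describes it (the BREW method, coffee:// scheme and Accept-Additions header are all from the RFC; the host and pot names are invented):

```python
def brew_request(host, pot, additions=()):
    """Build a sketch of an HTCPCP BREW request per RFC 2324.

    The body is the "coffee-message-body": either "start" or "stop".
    """
    lines = [
        f"BREW coffee://{host}/{pot} HTCPCP/1.0",
        "Content-Type: message/coffeepot",
    ]
    if additions:
        lines.append("Accept-Additions: " + "; ".join(additions))
    lines.append("")       # blank line terminates the headers
    lines.append("start")  # tell the pot to start brewing
    return "\r\n".join(lines)

req = brew_request("kitchen.local", "pot-0", ["Cream", "Sugar"])
print(req.splitlines()[0])  # BREW coffee://kitchen.local/pot-0 HTCPCP/1.0
```

A pot that's out of coffee should, of course, respond with 418 I'm a teapot.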
Please, no, not again... Aren't we done with this? Why do these idiots persist? More importantly, why do government agencies pander to these idiots instead of going by the established scientific opinion?
If mobile phones Do Bad Things, then we'd have all started dying off 7 or more decades ago when we started spewing non-ionising electromagnetic radiation from massive transmitters on a large/national scale. The frequency makes little difference (other than how far through any given material the signal will penetrate); what matters is transmission power, and you're not likely to have a megawatt, or even kilowatt, transmitter clamped to your face for hours on end any time soon.
And if you think living under/near mobile phone masts is dangerous then see my first point: TV and/or radio transmitters would have killed us all long ago.
Re: Bonkers? Yes... Overpriced? Most definitely
I'm sorry, did the lack of "JOKE ALERT" icon confuse you? No need to get so butthurt.
I'm not disagreeing with having a powerful machine, and I agree: spec up a decent system now and it will last you for years to come - the only thing you'll be looking to upgrade on a machine like this in 3-4 years' time is the graphics card (to support future versions of DX). My point was more that this article didn't go "bonkers" so much as "ridiculously overpriced and sheer overkill". A 1200W PSU, FFS? A rig like that would struggle to tax a good quality ~600W PSU. Sacrificing a few points in a benchmark would net you significant cost savings.
If they wanted to go "bonkers" then at least include a complete custom watercooling loop - including graphics cards and chipset blocks - and overclock the s*** out of everything. Hell, if we're *really* going bonkers then why not submerge the whole thing in mineral oil?!
And yes, don't get that awful-looking Inwin "case" they featured; get something from Antec, Coolermaster, Fractal Define, etc...
@Boothy (re: overclocking)
The moment I got the machine built and installed Windows, my old Dual Core E5200 rig was overclocked from 2.5GHz to 3.75GHz - instant 1.25GHz speed boost. It took about half an hour of tinkering and has never missed a beat in the 4 years it's been overclocked.
Re: Bonkers? Yes... Overpriced? Most definitely
If Apple made a "gaming" PC:
-Runs games from 5 years ago and calls them "bleeding edge"
-Uses proprietary hardware that you can only get from Apple or Apple-approved dealers
-Is woefully underpowered compared to spending the equivalent amount of money on a PC
-Includes a stupidly high resolution screen, then upscales/interpolates everything to run at effectively half that resolution rather than displaying at the panel's true native resolution
Oh wait. That already exists; it's called an Apple computer...
Re: PC gaming is for sadists
Horses for courses, my good chum. Give me a gamepad and I'll probably be dead before I fully turn around; give me a mouse and keyboard and I'll headshot you from half a map away.
And FWIW, the issue with SimCity isn't DRM. Always-online DRM has been around for a little while and is increasingly common; I think that's something we're just going to have to learn to swallow (console owners, too). The issue with SimCity is that EA are more or less lying about the online requirement. But that's not a problem with PC gaming, that's a problem with EA.
How exactly do your three screens make you better at, say, UT3 or StarCraft 2 than I am? Being able to "see more" doesn't give you an inherent advantage over anyone else.
No, you do not need this sort of spec, even as a "professional" gamer; rigs such as those described in this article are all about showing off, pure and simple. Yes, you need a powerful machine to run modern games well (high framerates, better quality visuals, fast monitor response times, no visual artifacts, etc) - one that would make even "next gen" consoles in 5 years' time weep into their cornflakes - but none of that makes you any good. Good peripherals can only take you so far; there are only so many DPI or adjustable weights you need on your mouse before you get into the realms of just being silly.
Being good at FPS games means good reaction times and knowing the maps, and being good at RTS games means better strategies and quicker actions per minute/second.
Incidentally the reason that many "professional" gamers have such high end kit is that it was given to them for free by their sponsor (or it was a competition prize).
Triple screen? It's all about 6-screen EyeFinity now (the only reason you'd actually need the monstrous graphics power this machine has), do try to keep up; *everyone* knows that having six monitors gives you the biggest e-peen around and really proves just how much of a "hardcore gamer" you are.
Also, make sure you get slim bezels (just to make the cost that little bit more eye-watering), put your lower three displays in portrait orientation and put the top ones in landscape; can't have any nasty bezels in the centre of your FOV now can we? Don't forget a solid concrete desk to support the weight and a custom screen mount, all for an extra couple of grand...
Bonkers? Yes... Overpriced? Most definitely
Bet it must be nice when someone sends you the hardware instead of having to pay for it yourself... In the real world, it's possible to spec up a "bonkers" gaming PC for a fraction of the price of the stuff listed here. You don't even mention overclocking; what's the point of such a powerful machine if you're not even going to attempt to get the best you can out of it? That chip will easily hit 4-4.2GHz.
And your RAM is seriously overpriced; remind me not to go shopping in the same places that you do.
Re: And even after
"a filthy, shameful liason and Unity was the deformed bastard offspring of that unholy union"
And people think that it's the operating system that isn't user friendly; in this thread it's many of the Linux users that aren't user-friendly...
If you have any interest at all in the subject...
...get up to Bletchley Park for the day. Seriously, and make sure you take one of the tours (especially if the older fellow who actually worked there during the war is still there). It was quite possibly the best day trip I've ever made. Plus they've got the National Museum of Computing up there too, so you can coo over all the old hard drive platters which come up to your waist - not to mention the Colossus rebuild, of course...
And as FartingHippo mentioned up there, that was the hardest part for me to get my head around - the fact that we could pretty much decrypt all the enemy's transmissions but still had to make the enemy think that their systems were secure. Put yourself in that situation: you know the enemy is going to attack your forces, or even your civilians, but any action you take might mean that you can never again intercept any enemy communications.
The rest of us outside the Apple world who are interested in streaming content from one device to another will continue to use UPnP and/or DLNA, which already works perfectly well. Doesn't mean you're restricted to streaming from your phone to TV either; for example, I could use my phone to tell my NAS box to stream something to my HTPC (not that I'd want to, since the point of having an HTPC and a NAS on the same network is to move the storage away from the HTPC, but still allow the HTPC to access it... I'm just trying to illustrate my point here...)
Software developers also don't have to shell out $200 for a copy of the Miracast spec just to be able to do this.
For what it's worth, Sony also tried that "shared/multiple screen gaming" thing with the PS3 and the PSP/Vita, and they're going to carry on pushing it with the PS4 + Vita... Hasn't really worked out too well for them so far...
A spare $800?!?!
Always wanted a robot? Well....
Get a cheap remote controlled car - you know, one of those £20 jobbies you see in "gadget" shops
Snip the wires for the steering and drive motors
Hook those motors up to an h-bridge (or multiple h-bridges)
Get a cheap ultrasonic sensor (literally a few quid on ebay)
Hook the whole thing up to a cheap Picaxe microcontroller
Total cost? Around £60-£70. If your motors are sufficiently low power, you might even be able to skip the h-bridges altogether.
See <a href="http://blog.makezine.com/2011/09/22/how-to-drifting-robot-car-video/">here</a> for an example (indeed the video that inspired this post).
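The control logic is simple enough to sketch out. This is Python rather than the Picaxe's BASIC dialect (so treat it as pseudocode for the real firmware), and the pin states and threshold are invented, but it shows the standard two-input h-bridge truth table plus a bare-bones obstacle-avoidance decision driven by the ultrasonic reading:

```python
# A typical h-bridge (e.g. an L293/L298) takes two logic inputs per
# motor: 1/0 drives forward, 0/1 drives reverse, 0/0 lets it coast.
H_BRIDGE = {
    "forward": (1, 0),
    "reverse": (0, 1),
    "coast":   (0, 0),
    "brake":   (1, 1),
}

def drive(distance_cm, threshold_cm=20):
    """Pick per-motor h-bridge states from an ultrasonic reading:
    spin away from an obstacle if it's too close, else go straight."""
    if distance_cm < threshold_cm:
        return {"left": H_BRIDGE["reverse"], "right": H_BRIDGE["forward"]}
    return {"left": H_BRIDGE["forward"], "right": H_BRIDGE["forward"]}

print(drive(50))   # clear road: both motors forward
print(drive(10))   # obstacle: counter-rotate to turn away
```

On the real thing you'd loop this forever: ping the sensor, feed the reading to drive(), write the resulting pin states out to the h-bridge.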
Can't really do much worse...
...than ol' Georgey boy did with The Phantom Menace.
Given the crushing disappointment of EP1, I have no expectations one way or the other. Let's just see what the film is like, shall we?
[@AC, 11:24] Re: The end of linux
I don't care what it is as long as he gives me the number for his dealer.
@MyBackDoor (re: Steam Linux)
Please do one, troll.
Having Steam running natively on Linux with at least a significant percentage of its games library being compatible with Linux can only be a good thing. You can complain all you want about privacy or DRM, etc, but the simple response to that is not to use it; no-one is forcing it down your throat at gunpoint. The fact is that there are a large number of people out there (myself included) who would quite happily switch to Linux permanently if the support for games were much better. And don't tell me to go get a console either; I don't need or want outdated hardware, thanksverymuch.
Not to mention all the work that Valve are doing with regards to performance improvement on Linux for OpenGL and drivers... Just sayin'...
My only gripe with it is that Steam Linux is currently x86 only, and I suspect that won't change any time soon.
Tegra 4 "Wayne" gives me a very big nerdboner... Sod putting it in tablets or phones though, I want to see desktop SBC computers built around it.
The Sony phone does look rather purdy, but I'd be a bit worried about this "freezing" of background applications; what if I *want* background apps running, so they can provide notifications? Bet the screen looks damn fine though, even if you really don't get the benefit of 1080p on a screen so small...
NBC Commentary on Tim Berners-Lee
That had to be the most eye-opening part of the entire article for me; I hadn't heard about the complete and utter botched job that NBC did with the opening ceremony...
Even with Win8, Facebook's inevitable share nose-dive, iOS Maps, Surface and Apple v Samsung all taking place, telling viewers to google for Tim Berners-Lee has to be the most spectacular episode of idiocy from last year.
I was already pretty dubious about taking the article seriously when it's supposedly about stocking fillers, yet there's nothing in this list at anything I would call a stocking-filler price. I stopped reading altogether when you featured an iPhone case which supposedly reduces mobile "radiation" yet increases signal strength. If you're going to put out such an obvious marketing puff piece based on stuff you've been handed/loaned, next time please put the complete and utter bull***t on the first page so I won't have to waste any of my life on it again.
Putting aside the claims of "radiation" from mobile phones causing harm, you cannot at the same time reduce "radiation" and increase signal strength - THEY ARE THE SAME THING. If you block/attenuate the signal coming out of/into the phone, your signal strength and quality will drop.
"Weird science? Snake oil modelling? Take your pick, it’s going to be hard for a user to determine if the improved signal strength is noticeable or if Pong’s redirection of TRP (total radiated power) is going to save your life or not"
No, it won't be hard for any user with an ounce of common sense. They'll realise that if you block something you cannot amplify or improve it at the same time, and they'll stay the hell away from a company foisting quackery on them at an inflated price; they'll get an A.N. Other brand silicone case/skin for their phone and pay £5-£15 for it. Which, incidentally, will also stop the phone sliding around and protect it from falls, without the exorbitant price tag that this thing commands.
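The physics is easy to sanity-check: a passive case can only multiply the signal by some factor of 1 or less, and antenna reciprocity means the same factor applies in both directions. A back-of-the-envelope sketch (the 50% attenuation figure is invented purely for illustration):

```python
import math

def to_db(power_factor):
    """Express a linear power ratio in decibels."""
    return 10 * math.log10(power_factor)

# A passive case can only attenuate (factor <= 1), and by antenna
# reciprocity the same factor applies to transmit and receive alike.
attenuation = 0.5                           # invented: case absorbs half the power

radiated_towards_head = 1.0 * attenuation   # "radiation": reduced...
signal_from_mast = 1.0 * attenuation        # ...but so is your signal

assert radiated_towards_head == signal_from_mast
print(f"{to_db(attenuation):.1f} dB on both paths")  # -3.0 dB on both
```

You can't have one without the other; a case that genuinely cut the "radiation" in half would cost you exactly 3dB of signal too.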
There is no depth of fail low enough to describe the depths that TheReg has plumbed with this "article".
Re: I wish I had a bloody oyster card...
Spelling fail: of course, "on Cardiff" should be "in Cardiff".... Why can't I edit my post like every other comment/forum software, Reg!
I wish I had a bloody oyster card...
I travel on public transport - buses - an awful lot on Cardiff, and the Cardiff Bus "contactless" payment system is laughable. No topping up with a credit or debit card at all, just cash; to make it worse there's a max cash topup value of £20 per transaction - unless you visit the office which always has queues out of the doors and closes at normal office hours, so you can't even go there when you leave work.
I'm not asking for much: I don't need weekly CSVs of my journeys or a 300-500ms transaction time, I just want to be able to add credit to the damn thing with my bank card on a webpage - I don't even care if it's not a phone-friendly site!
I'd take the Oyster card - or NFC/RFID-enabled debit card - over this piece of crap any day of the week. Automatically adding credit? Bleedin' luxury...
Re: No RPi please
No no no no no no!!!
Fundamentals of computer science! How much longer must I bang this drum!
Re: What's the point?
Agreed. I see nothing "vapourware" in the Gertboard or any other commercially-available GPIO expander/interface kits. "Difficult to get hold of" or "in short supply" does not equal "vapourware". Many, including myself, said that the Pi itself was vapourware, yet here we are with over half a million sold and one of the first batch of 10,000 units on my desk.
Besides, using "homebrewed" as a pejorative really misses the point of having the GPIO in the first place: being able to wire it up to something you built yourself and control it with code that you wrote yourself - doesn't that neatly cover the term "homebrewed"?
Re: I always thought the point of the Rpi
Have a pint right back, for being one of the few people here who finally gets it.
Like many others, I think you're missing my point. Granted, the point about HDMI monitors stands - a point I believe I made - but it doesn't mean that you need a 27" Apple Cinema Display monster on every desk either. Most monitors sold in the last 5 years or so support DVI inputs, and digital DVI is pin-compatible with HDMI; it just needs a £2 converter, and that's exactly how I use my Pi.
The Pi isn't necessarily going to be useful for those schools who can already afford, or already have, "ICT" labs full of reasonably-spec'ed kit. It is however going to be an option for schools that can't afford to spend a fortune on updating their old shoddy equipment in order to be able to actually teach this wonderful new IT curriculum. Let's do some maths, based on current retail prices, just for the hell of it...
Cheapest DVI monitor I can find on Novatech: £75 ea., or £2250 for 30 of them
Cheap USB keyboard/mouse combo (also Novatech): £9.98 ea., £299.40 for 30x
30x cheap (~£5) SD Cards: £150
30x cases (~£12): £360
30x Pis (~£25): £750
30x HDMI to DVI adaptors: £60
Cheapest desktop I can find on Novatech: £239.99 ea., £7199.70 for 30x
So that's £3869.40 for the Pi-based solution, or £9449.70 for the traditional desktops plus the same monitors (the desktops at least won't need separate cases, SD cards or KB + mouse). Of course these are retail prices and schools can get serious educational discounts, but my point stands.
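For anyone who wants to check my working, the sums are trivial to script (unit prices as quoted above, 30 of everything; the desktop figure includes the same monitors, since a desktop needs a display too):

```python
# Re-running the lab costings at the retail prices quoted above.
pi_parts = {
    "DVI monitors":        75.00,
    "keyboard/mouse sets":  9.98,
    "SD cards":             5.00,
    "cases":               12.00,
    "Pis":                 25.00,
    "HDMI-DVI adaptors":    2.00,
}
pupils = 30

pi_total = sum(price * pupils for price in pi_parts.values())
desktop_total = 239.99 * pupils + 75.00 * pupils  # desktops + the same monitors

print(f"Pi lab:      £{pi_total:.2f}")       # £3869.40
print(f"Desktop lab: £{desktop_total:.2f}")  # £9449.70
```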
It isn't for everyone and it was never going to be for everyone. But it could be a seriously big help for those - individuals, schools, businesses, etc - who can't afford to shell out for a full desktop machine. I think people - especially those reading a tech news site - need to remember that not everyone is as privileged as they are/were when it comes to technology. Not necessarily everyone had a good school with great IT education or had parents who could afford to buy gadgets/computers for them.
Taking things a little bit too literally, perhaps? Or did you somehow think that I intended to slight those whose occupation is web technologies or web programming?
No, I am not suggesting that we teach our 10-year-olds how to write an operating system. Nor am I saying that we should ignore web technologies. What I am saying is that the skills being taught need to be more fundamental: memory management, algorithms, object-oriented approaches, etc. Of course I am also not suggesting that we hand the virtual "shotgun pointed at your feet" that is C++ or assembly language to a class of schoolchildren. Web programming is indeed an easy place to start, and it can be easily translated into what people see every day: Facebook, Google, etc. What I am saying, however, is that they need to understand the basic fundamentals of hardware and software. Once you learn fundamental programming skills, you can apply them to almost any programming language you choose. Hardware is a bit more tricky, but pretty much any "computer" these days still has the basic building blocks (CPU, GPU, memory controller, buses, I/O, etc).
I am saying that we should be teaching broad fundamental skills, and not focussing specifically on narrow areas - such as web programming or using MS Office. Sure, they can be a good starting point - much the same way that Scratch is a very good starting point - but that shouldn't be the end of the story.
No, no and thrice no!
We do not need them learning "web programming", we need them learning how the systems and languages that support the web and all the associated infrastructure actually work. Teaching them web programming is no better than teaching them Excel; we need them to be able to write their own operating systems, write their own web-based languages, etc...
Plus financial success isn't the only reward for learning to do something; believe it or not, some people, even those who left school over a decade ago (or several decades ago), actually enjoy learning something new just to learn something new.
Some good points, but not that many...
One of the main aims of the foundation was to change the way that "IT" is taught in schools, so that it is more focussed on computer science fundamentals than how to do spreadsheets in Excel or godawful WordArt in PowerPoint. You could argue that the changed stance on IT education by the government has already gone some way to achieving this. That's great, especially if schools can already meet this without having to shell out for new kit. The Pi itself addresses the problem of not having almost disposably-cheap computer hardware which children can knock around with and experiment on - as opposed to an in-excess-of-£300 desktop PC which their parents don't want them screwing up - so it doesn't have to be solely restricted to being used in schools.
And I completely disagree with the article's point that it would involve faffing about with plugging/unplugging cables at the start/end of each lesson, or that using them in schools would be impractical. These things don't have to be removable; just mount them under the desk or to the back of the monitor using VESA mounts. Problems with SD cards can be easily solved with network booting; once you've got the basic stuff booted (GPU blob, etc), the rest of the OS can in theory be booted from a clean network image that can't be modified. Out of all the crappy old SD/MicroSD cards I've got knocking around, I have so far failed to find one which cannot boot the basic stuff (my day-to-day tinkering distro is on a USB drive; only the boot partition remains on the SD card). Give each child/user an allocated amount of space for their /home/$user share, stored on the network and not locally, and make everything else read-only. Of course this is a little more involved than plugging in a few Windows boxes, but any IT tech worth their salt should be able to do enough research to set this up; they shouldn't be in the job if they can't.
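For the curious, a minimal sketch of what that setup might look like - the server address, export paths and options here are hypothetical placeholders, not a tested recipe, so adjust for your own network and distro:

```
# /boot/cmdline.txt on each Pi's SD card - the SD card only bootstraps the
# firmware/kernel; the root filesystem is mounted read-only over NFS.
# (192.168.0.10 and /exports/pi-root are placeholders for your own server.)
console=ttyAMA0,115200 root=/dev/nfs nfsroot=192.168.0.10:/exports/pi-root,ro ip=dhcp rootwait

# /etc/fstab inside the shared image - everything read-only except each
# user's /home, which lives on the network and is writable.
192.168.0.10:/exports/home  /home  nfs  defaults,rw  0  0
```

Since every Pi mounts the same read-only root image, a corrupted or fiddled-with machine is fixed by a reboot, and pupils' work survives in their network /home regardless of which desk they sit at.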
Plus I fail to see how it would be more expensive to kit out a room full of Pis compared to the cost of full-blown desktop computers, as many commenters (not just those here) like to claim. You cannot tell me that a desktop computer - even a low-powered one - can be bought for less than the cost of a Pi (£25) + case (~£12) + small SD card (~£5-£10), even with educational discounts. Mice and keyboards are dirt cheap, even more so for those that buy in bulk. Granted, the HDMI-only monitor connection will be a little tricky with the spare old monitors you've got knocking about, but you can't get a $35 computer without compromising somewhere.
And, by the way, using the term "geek children" or "geek dads" (or even "geek mums") doesn't exactly do anything to help anyone's cause here. As if kids needed another label which can be used to single them out as a target for bullying... Is there really anything wrong with wanting to know more about the technology which is so fundamental to our society? Just because someone might be interested in technology or programming doesn't make them a geek; it just makes them a person that's interested in technology or programming.
...and just slap a real Linux distro on it. I'm especially interested in the Samsung ARM Chromebook for exactly that reason. Can't see that a touchscreen would do me any favours when I have a perfectly good pointing device; my only complaint is that hardly anyone uses the StinkPad-style nipple pointers - so much better than a trackpad...
Re: Shut up and take my money
Pretty sure that a DB5 is used in the film, and Bond does drive it, although it's not the car used in the chase scene.
Shut up and take my money
If it was good enough to use as a stand-in for the real thing then I want one.
Incidentally, I'm not sure where I heard/read this, but I remember hearing that they wanted to trash a proper DB5 in Casino Royale but Aston Martin wouldn't let them because they're too rare. Hence the use of a more modern DB9 (or a DB9 variant) in the scene where Bond+car do a few cartwheels...
Re: Worth it to me. (@Steve)
Might I suggest that if your experience is predominantly Mac machines then you're probably not in the best position to comment on the difference in repair time between Macs and Windows boxes.
If someone asked me to repair their unbootable Mac it might take me a day or two to get it booting again. If I had to repair an unbootable Windows 7 PC, it would probably take me less than an hour to get it booting.
That doesn't mean that Windows PCs are better than Macs because they're easier to repair, that just means that I know more about Windows than I do about OS X.
"a linux steam box is great in principle but how much effort do nvidia and ati put in to optimise their drivers for linux, how many games arent suitable for linux or are 'games for windows'."
The work that Valve have done so far for Linux has been for their own games using the Source game engine and running under OpenGL. And AMD, Nvidia and Intel have actually been putting quite a bit of effort into optimising their Linux drivers, apparently:
"Finally, the Valve Linux team explained that they have been working closely with Nvidia, AMD and Intel to boost graphics performance for their respective hardware under Linux. 'They have all been great to work with,' the team claimed - Torvalds' experiences to the contrary - 'and have been very committed to having engineers on-site working with our engineers, carefully analysing the data we see. We have had very rapid turnaround on any bugs we find and it has been invaluable to have people who understand the game, the renderer, the driver, and the hardware working alongside us when attacking these performance issues.'"
See full article: http://www.bit-tech.net/news/gaming/2012/08/02/valve-linux-performance/1
I've been saying it ever since they announced their intentions to "get into" hardware - they're up to something... FWIW, the "Big Picture" UI for Steam was actually announced a while back when the UI was last overhauled. It wasn't called the same thing then - if it had a name at all - but the principle was the same: a 10-foot UI designed to be used with a joypad or game controller.
Despite what this article says, I still think they're heading towards a Linux-based Valvebox of some kind. Perhaps it'll just be an optimised and Steam-specific Linux distro to start with, but they're definitely up to something on the hardware front.
Back in the real world however, I'll be giving this new UI a shot when I get my HTPC finished properly.
My skin is crawling
I am simultaneously intrigued and freaked out by this prospect. From a technological point of view, it's cool as hell. From a personal point of view, it gives me the ****ing creeps because I'm a big girlyman when it comes to insects. And these buggers hiss; they HISS, ffs...
Can we nuke all the cockroaches please? Contrary to popular belief, a sufficiently large dose of radiation will kill them stone dead...
Re: Actually a genius idea
If it were a worldwide release, that is... But no, this goes back to the old days of bending the rest of the world over a table and forcefully rogering us.
All they're doing, as others have pointed out, is giving pirates yet another medium from which to obtain a DRM-free digital copy (that is, assuming they can break the DRM employed by this system, which is likely a correct assumption).