89 posts • joined 7 Sep 2007
Re: DARPA: The Better To Murder You With, My Dear
Hmmm, I can see why you think that, but then stop for a moment and think about why things like this do get developed.
If a single round can be used to pick off a strategic target with little chance of missing, then that one round may be all that's needed to bring about a cessation of conflict, rather than destroying an entire building, block or town to eliminate a potential threat. A round like this actually has more potential to save lives from collateral damage than to cause mass murder, which is a sobering thought.
And the other thing to consider is that think-tanks like DARPA usually produce many, many failures before they ever come out with something usable. These failures may then go on to become the basis for some rather tasty peacetime technology, with a purpose far removed from conflict or killing. The battlefield is the mother of innovation, and we have so many technologies developed for such a theatre. Superglue comes to mind, developed as a surgical adhesive to be used in an emergency in the field. There's a reason that the best things it sticks together are usually fingers...
So sure, war and conflict are ugly. But never forget that out of the crap, comes the good. Make sure you see both sides before decrying 'pointless' technological developments!
Re: MIT boffins moot tsunami-proof floating nuke power plants
"let's take the heaviest objects ever built by man, and try and make them float!
You mean supertankers? You'll find they already do float, on account of them being ships and all.
And a floating nuke plant doesn't have to have all the vulnerable infrastructure above the waterline. Makes more sense to have the reactor vessel and support gear underslung and in the water already, in case things DO go pear-shaped. Plus you've got more space on top for helipads, accommodation, etc.
Not too difficult when you think about it for more than 30 seconds ;O)
And yet, it's the 'gimmicky crap' that's the stuff that usually showcases the latest from the R&D boffins. While it may not be overly astounding to begin with, if enough interest is shown in these early designs, they'll get redeveloped, refined, and generally improved upon. This is, funnily enough, called progress.
And if no-one likes it? Well, that's fine, because not everything from an R&D department is going to be a solid gold winner. Designs don't perform as well as expected, aesthetics are not quite what people want, software can be buggy. It IS possible to make the mediocre better through PR (And that's my subtle reference to Apple for you, people) but if it doesn't prove popular, it'll vanish soon enough.
And it's surprising really, how much stuff from failed products gets reused in the next big development.
The SD format does make sense from a 'one-size-fits-all' perspective, insofar as manufacturers of appliances and devices that would require such a card to operate only need to build in the cheaper host electronics, with the main system being easily added in later. You COULD solder it all on at creation, but then it may very well be cheaper to buy the finished card units in bulk and simply insert them later on, after loading in your own custom firmware. Economies of scale and the like.
Plus, prototyping would be easier, knowing you can just add the brains to your creation later, as long as you have the relevant mechanical interface.
Mine's the one with the industrial sponge with the gluey battery, thanks.
And what about the...
Granted, my phone is a gently aging Nokia N900 (and could probably do with a battery change anytime soon), but given the propensity for data-hungry connections over 3G, and almost anything over WiFi, to consume battery power like a starving dog in a butcher's shop, isn't this generally a bad thing?
Mobile devices are, by nature, limited in how long they can run due to the big chemical block sitting in the back. Having multiple radio emitters firing up and connecting all at the same time can't be doing wonders for the usage time figures. Pretty serious thing, when your battery is glued and screwed into your block of shiny fruit-plastic.
With the right resources, it's a good plan. But until we have usage time on battery into the 'days' figure, it's not going to do much more than sell a few more phone chargers, so people don't run out of juice halfway through the day.
"SteamOS combines the rock-solid architecture of Linux with a gaming experience, as well as other oxymorons and contradictions. As soon as we find working drivers for proper hardware, it might actually become something people could possibly want, as soon as the Linux zealot community stop their fervent dribbling and putting off the normal people."
Here we go again...
First off, it appears that whatever happens, the fanbois will only ever praise something if Apple does it first, which in this case, is the 64-bit thing. Should Samsung have done it first, they would have derided it as unnecessary. Now that Apple HAVE done it first, everyone else will now 'merely be copying them'.
He's been dead a long enough time, but it appears Old St. Steve is still spinning fast enough in his grave to keep the RDF fully powered up for another 5-6 years.
So... 64-bit CPUs in a phone. Practical? Who knows. Nice to have? Probably. Essential to the modern world? Not really, no. When all the world+dog seems to do on these things is text, call, peruse Farcebook and the like, and listen to music... very few people will ever see any real-world benefits to this. That said, progress happens in odd ways. Just because it's no good to us right now, doesn't mean we won't be seeing things taking advantage of it later on. I'm just not convinced we're at a point where we need full desktop power in a battery limited device.
About the only real benefit to anyone is the manufacturer of these chips. If they can shift more of their designs to 64-bit, and drop the older architectures, they'll have less variants to keep track of, and can focus more on improvements and the like.
"No, it's for when I drop my Smartphone into a mains powered dock and start running a full desktop OS. With all 8 cores going gangbusters on whatever tasks I'm performing on my full desktop OS"
Well, sure. This does seem to be the way things are headed and all. But one thing to consider here is heat. The more cores you have burning power, the hotter things will get. The phone/tablet casing can only sink away so much before things start to get out of tolerance, so the device itself will never quite reach proper desktop levels of grunt. Not unless you're happy to have a phone/tablet with a mounting point for a large external heat sink, which would require thermal paste and a large smooth surface to conduct the heat through.
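The heat argument above can be put on a back-of-the-envelope footing. A minimal sketch, where every figure (ambient temperature, per-core power draw, and the case-to-ambient thermal resistance) is a made-up illustrative assumption, not a measured value for any real handset:

```python
# Crude steady-state estimate for a passively cooled handset:
# case temperature = ambient + total power * thermal resistance.
# All numbers below are illustrative assumptions only.
ambient_c = 25.0   # assumed room temperature, degrees C
theta_ca = 10.0    # assumed case-to-ambient thermal resistance, degrees C per watt

def case_temp(cores, watts_per_core):
    """Steady-state case temperature for a given core count and per-core draw."""
    return ambient_c + cores * watts_per_core * theta_ca

for cores, w in [(2, 1.0), (4, 1.5), (8, 2.0)]:
    print(f"{cores} cores at {w} W each -> ~{case_temp(cores, w):.0f} C case temperature")
```

Even with charitable guesses, eight cores going gangbusters blow well past what a sealed handheld enclosure can shed, which is why sustained desktop-grade grunt would need the dock (or that external heatsink) to do the sinking.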
I'd say we're due for a halfway house scenario, myself. Your device has your apps and files on it, and the base station has all the extra grunt required to go the full desktop monty, as well as all the peripherals and the like. To which end, we'll still need desktops of a sort, but at least you can hot-desk easily.
I'm looking forward to all the reports of trouser-fires from a full blown 8-core unit forced into 100% CPU utilisation via a nice piece of rogue software... hehehe.
Re: Data Management 101
Copying rather than moving is fine, but in this case the drives themselves went to crap, so either method would have ended up with the same result.
The main difference here is that it's easy to retrieve data with existing equipment or resurrect an old HDD with a board transplant, but with an SSD you're rather more buggered. You MAY somehow be able to read the flash if it's only the controller that's farked, with careful placement of probes and an external controller. Good luck replacing any BGA chips though. Always back up an SSD to an HDD, otherwise things are going to be very, very expensive, very quickly...
I'll stick to my good old spinning rust for now, I think...
Re: You mean like this web site does....
C'mon Gaz, with urbane wits like those of, ooh... Eadon, amanfrommars1 and BarryShitpeas... you NEED proper security so that no-one can steal their login info, hijack the accounts, and post complete and utter shi...
I see your point. Carry on!
Well, at least now people get another option to bitch about, instead of just having Imperial and Metric to choose from. I get the feeling this'll go up there with the whole MiB/KiB thing... technically correct, but no-one will really use it in day-to-day life, as we have things that work fine already.
Also, who came up with this... Boltzmann, or one of his many brains?
Re: What's a watch?
I still wear a watch, mostly because it was a mandatory work thing (railway) but nowadays, more because it beats having to yoink my phone out of my pocket each time I want to see what time of day it is. Never knock a device simply because something else incorporates the same functionality, unless it replaces it in such a way as to be unobtrusive.
Re: never use post codes for navigation !
Pete, you seem to forget that not all of us mere mortals have need of a GPS system to find somewhere. Occasionally, we like to use a thing called a 'map', and 'plan' how we're going to get there BEFOREHAND, thus committing the route to memory. And even with a postcode, Google Maps will put you in the right area, provided you have a screen capable of displaying more than an inch of the map at a time.
This prevents us from having to keep checking the little annoying talking box of misdirection when it leads us on a merry random trip via roads that may or may not actually exist. Just saying ;O)
Oh, and I almost forgot... eventually his head will be big enough to support colonisation attempts, which would begin with pressurised geodesic domes supporting a variety of plant and animal life.
This will be known as the Eadon Project.
Not quite... you see, as his head gets bigger from all the accumulated ego, it emits the previously mentioned repulsive force. That alone would push small moons out of orbit, but it also has an influence on the Earth itself. Give it another 150-200 ego-stroking posts and he'll actually start to hover gently as he begins to escape gravity itself.
Eventually, in an orbit between the Earth and Moon will be the Eadonsphere, a self-righteous repulsive mass that constantly shrieks about everything that doesn't fit into his world view. Of course, we'll be spared the noise, as there isn't any air up there to transmit the frothing loony-rants back to us.
Re: Eadon's GeeK QuiZ
Because your head gets bigger with every post, and the repulsive force you emit is slowly pushing it out of orbit.
"It was once suggested that Marvin, the paranoid android, actually thought up the El Reg website as an attempt to rid himself of Existential Nihilistic thoughts, he failed and we got left with the result..."
Which explains where we got Eadon from, I guess? Positive proof that bad memes can take on a life of their own, and wreak havoc in the real world!
Re: The Shard
Nah, illuminate the glass part at the top in bright red, project a Brotherhood of Nod symbol on the side... Instant obelisk!
If you look at the waterblock, it's covering three individual components. Unless I'm mistaken, the USB/Network control chip is the one covered by the far right end of the block. So he's covered all the things which require some form of active cooling, I think. Plus the fact that a block that would cover the CPU would be far too small for the inlet/outlet barbs.
Kudos for the effort, looks pretty spiffy. I wonder if he'll end up mass producing them?
Re: Die Hard
Well, to be fair, it CAN'T happen with an iPad. You barely get access to the bloody thing itself, let alone any other devices it could connect to. Plus of course, we'd have to wait for Apple to release drivers for the plane you happen to be in, and that could take months.
Unless you jailbreak it, which is what you'll be needing to do yourself when the nice men with tasers appear after the unscheduled emergency landing...
Just like the rest of the product line...
...it's flashy, over-priced, features rather odd design choices and is really of no use to anyone who needs something practical. Given that the original design was to use the central space to house Jobs' ego, I think they could easily shrink it now, and rejig the whole thing to resemble the excellence of their products.
I'm thinking a large cube on stilts, with a big hole in the top for ventilation...
I'm sure by the time we've burned our way through the current waste stockpile, we'll have developed better alternatives, including thorium burners. I'd like to see this line of research given a massive boost, as this is certainly our best source of power so far.
I'm curious though... does the total amount of waste take into consideration any contaminated materials as well, or just the used fuel? And how do we get at the existing stuff that's buried in various sealed sites? I'm guessing this is more for fresh and un-buried stuff.
Still, hats off to the brains behind it all!
Just how big IS the frog they're trying to levitate, huh?
Have they tried turning it off and on again?
I'd taken my coat off, but I'm putting it back on now...
It's not an original concept, Fluorinert was developed for this purpose already, but this seems to be much better in terms of heat transfer.
Wonder when I'll see it in CustomPC?
Re: In Redmond Microsoft fire the photo copiers up.
But it'll still be 1/5th the price of the Apple version, with 3x the capabilities ;O)
What people are failing to see here is that Mac owners are used to paying silly amounts for things and that in all likelihood quite a few Macolytes complained that it wasn't expensive enough...
So now all that's happened is that Microsoft have brought their pricing in line with Apple Standards, and hey presto! All is well again.
It might not be good for the operational aspect of the server, but I bet all the copper traces were as shiny as a newly-cleaned penny...
BANG! And the data is GONE!
For lunch today, I had...
While any security breach is a Bad Thing (tm), this isn't exactly a mission-critical app, is it? The fact that a load of people might not be able to post inane crap about their house/cat/toast/celebrity-du-jour is hardly going to signify the fall of the Internets. And if your company relies on it for anything... well, why?
Quite frankly I'd be happier if they managed to completely annihilate the damn 'service' in its entirety.
Re: "...fosters drinking..."
Where REAL Brits ignore it and drink quality REAL ales.
At least here in Guildford, we do. I believe the common rabble seem to prefer Stella...
If that is the case, then surely you'd expect to see that pattern in the same place on multiple exposures? I'd be surprised if kit like that wasn't thoroughly tested and calibrated before actual use, which would show up any defects in the CCD/CMOS long before actual research is done.
Not every object ever made uses the traditional Apple beta testing method of 'Let the consumers find the bugs, then we release the new, fixed version' :O)
This is... I want to say hilarious, but I'm not so sure. It's funny though, and I've got a better appreciation for that good old rough'n'ready Imperial measurement system. Sure, we need proper units for accurate measurements and all, but when all's said and done, the bloke on the street is happy with his 99p WH Smith ruler (cm and inches) and set of kitchen weights (which still generally come in both metric and imperial, happily).
I quite like the irony in the definition of the metre, though. What a rather odd number for the time interval; you'd expect it to be in multiples of 10 for SI, no? ;O)
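For anyone curious, that "odd number" drops straight out of the definition: the metre is fixed as the distance light travels in 1/299,792,458 of a second, which makes the speed of light exact by construction. A quick sketch:

```python
# The SI metre is defined via the speed of light: c is fixed at exactly
# 299,792,458 m/s, so one metre is the distance light covers in
# 1/299,792,458 of a second -- hence the decidedly non-round interval.
c = 299_792_458           # m/s, exact by definition
interval = 1 / c          # the "odd" time interval, in seconds
one_metre = c * interval  # distance light travels in that interval
print(one_metre)          # ~1.0, up to float rounding
```

The interval isn't a tidy power of ten because the definition was chosen to keep the redefined metre the same length as the old artefact metre, not to make the constant pretty.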
Is it me, or is this just a 27" iPad on a landscape stand, only without the touchscreen?
From my own experience, the only real time you ever see the back of a monitor is in the shop. So why bother with all the gimmicks of ultra thin and whatnot when the bit you'll be looking at is at the front anyway? Thin bezels, sure. This just looks like they ran out of ideas on how to make it better, so they made the edges thinner. Genius!
Apple... Not Much Different
ARM in macs?
Hmm, this'll mean that all those who eagerly awaited the arrival of Steam on the mac will now be faced with the prospect that the CPU may not be up to scratch anymore when playing the newest games. ARM-style CPUs tend to be geared towards low power drain and efficiency, not overall grunt, which puts them in a weak position for any games other than typical app store stuff.
It does look as though, if they move away from Intel, buying the ailing AMD would prove to be the best route to obtain the relevant licences and facilities, as well as the expertise in such areas, without having to bumble through the pains of a cold-start. And given the AMD APU design, it might just work.
Still scary though, as I'd rather see AMD make a proper comeback, not get absorbed into the horrible pretentious ego-blob of Apple.
Re: Not a fan
> I detest the Android model
Maybe he has an irrational fear of choice? Would certainly explain it.
@ Lee Dowling
Abso-bloody-lutely! You've pretty much found the only reason that prevents me from replacing all my apparently 'inferior' storage (according to the SSD zealots, that is) with solid state.
Capacity sucks. It's THAT which needs improvements, not minor increments on IOPS or overall transfer speeds. We need a proper reason to shift to the newer medium, and it's not going to happen until we have devices of equal or greater capacity than our existing drives for no more than 1.5x the cost. I keep hearing about the huge speed boosts and the like that SSD users apparently gain, but really... I'm a gamer mostly. My Steam drive alone is around 3/4 full, and that's a 1TB unit. Speed is meaningless if you barely have enough room for 1/3rd of your stuff to begin with.
So yes. Come on, you manufacturers of such arcane electrickery, get your bloody acts together and start making devices with SIZE; we already have the speed.
Sheepsung? Oh, come on. You can do better than that... Troll or not, at least put some thought into it!
Given the already apparently fragile state of US power grids, I'd wager that the Chinese would rather go for a target that isn't already prone to falling over en masse by itself on occasion.
However, it does go to show that industry still seems to be lagging somewhat on security implementation, even if the resources are out there to do so. Maybe the US could divert more of its grotesquely huge war budget to securing its online presence? Cyberspace could be the new Wild West; give 'em a chance to feel all pioneering and brave again! ;O)
Hmm, so if I'm reading this correctly... it actually seems to me that using the UEFI kit means that you can easily bypass all the much touted security in a linux install, as most won't run with SecureBoot (?). So what will the FOSS crowd do now?
Oddly, blaming MS for this feels a bit off. Given that it's designed to make a system more secure, why would anyone want to disable it? I'm not sure why the linux crews can't actually bite the bullet on this one and use the functionality present, as surely that would make everything even more secure.
My guess is that as it's a Microsoft requested feature, it MUST be evil. Right?
How is this innovation, exactly?
Ahh, the new Apple 'iDecide' channel-hopping mediabox. Now with NEW patented One-Button remote, for all your lazy-arse needs!
Hmm, not much of an issue... A Bic lighter, the end of an old biro, and presto! Custom screwdriver. Or Dremel it, sure. Oh, and for the wag who mentioned ultra-sonic welding... well, for THAT, I have a sonic screwdriver...
Re: So what
Gah, re-reading the post makes it sound like I'm in favour of Apple... not the intended result. Where's the damn edit function, El Reg?
Re: So what
If rounded corners make a copy, then I guess my old N900 is a copy of the iPhone too... black, rounded corners, touchscreen... hell, it even has almost the same CPU (Cortex A8 with a slightly different GPU). The fact that it and the Samsung phones run a completely different OS isn't enough of a barrier, I would posit.
I'd better get the file out, and make the corners flat again...
Time travelling entangled particles, eh?
You know, I've got the strangest feeling I've read this before...
Let's see... 100k a pop, to a German arms manufacturer, several thousand mines... Isn't this just a sneaky way to financially prop up the failing euro?
Nope, still can't find this X28 flare, even If I reverse the day and month like you said... you SURE it's on the 2003th April, year 11?
Re: Coin shrinking
Is blowing all the etched circuitry with the rather effective EM pulse part of the plan? Or maybe losing all the deposited traces and components as the die underneath shrinks, but they don't?
I gotta say, I did wonder at first, but I do believe you've used the wrong icon. The joke one is the one to the left... ;)
Crowdsourcing... one of those slightly obnoxious terms used to describe asking a group of people what their opinions are. A bit like focus groups, without any real focus, or ANY FOSS project team. However, the main thing I've been wondering is this...
If a CS group speaks English, is it considered open-crowd-sourcing, and if they speak only Cornish or Welsh, is it considered closed-crowd-sourcing?
All I can hear...
... is a load of freetards moaning about some restrictions in some hardware they probably wouldn't even buy in the first place. I mean, come on... would you lot REALLY buy a W8 tablet JUST to put linux on there? Would it even be worth it? Christ, you lot moan about MicroSoft enough as is without committing the irony of buying something with their software on, even if it's to make some kind of hipster point about changing the OS.
Let's face it, does anyone really buy up-to-date machines to run linux? I was under the impression that it works best on older hardware anyway. You know, the kind without all the shite you lot are moaning about? Do yourselves and the rest of the FOSS community a favour, and cease with the whining. Maybe even go develop some alternative open-sauce hardware that can run everything you want, all for free. Maybe an Arduino or something. But please, stop with the dribbly diatribe about things that in all likelihood will not, and never will, affect you while you're on the open-everything bandwagon. It's old, it's been said a million times, and has accomplished precisely sod-all.
Yes, I use Windows. I'm a gamer, and I like my stuff to work without endless faffing. Yes, I've tried linux and found it too faffy. No, I don't care what you might wish to tell me about it. And please see the other comments about how the goddamned bootloader thing isn't mandatory. You lot seem to miss that quite often.
Right, it's 01:02, and I'm tired. Cue the inevitable downvotes from those who realise I've had a go at their evangelistic OS viewpoint, and goodnight!