2484 posts • joined 10 Jun 2009
The people lost their money the instant they turned it into bitcoin
No, they converted one currency into another. Bitcoins are still out there and still have an exchange value. Less than it once was ... but currency fluctuations are nothing new. One pound sterling was once worth exactly the same as one gold sovereign. Until the day came when the UK went off the gold standard.
Their real mistake was allowing MtGOX to look after their bitcoins, rather than holding them in their own digital wallets. Someone broke into the bank and stole their gold, sorry, bitcoins. Now someone else has all the bitcoins, which are worth a fair bit to the thief even at today's exchange rate. MtGOX is doubtless legally liable, but I don't suppose it's got enough assets left to make it worth suing.
Bring back the write-protect switch!
It won't solve all the problems. But it will solve a lot of them. If there's no way for a bad guy to change the thing's non-volatile settings, it'll mean that power-cycling the thing restores it to whatever state you stored in it by using the write-protect switch. Note: it must be true hardware protection, that's completely impossible to alter by any sort of software exploit.
Advertisers, watch out!
You know, blocking adverts isn't the worst a plug-in could do to you, not by a long way.
How about downloading every advert to the bit-bucket and then generating an auto-click on it, with the resulting page also sent to the bit-bucket? Or even heuristically locating your sign-up page and automatically filling it with garbage and submitting?
Then the site that makes money out of serving ads gets extra revenue, and the advertiser spends money serving bits into buckets with no human eyeballs involved. Those of us with unlimited high-bandwidth broadband probably wouldn't notice any overhead. Eventually the advertisers will notice that the effectiveness of "push" advertising is approaching zero.
Annoy us too much and someone will actually write that plug-in. (Or maybe they have, and I've just not yet been annoyed sufficiently to go and find it? Adblock Plus will do for now.)
My personal attitude is that I'm a buyer, not a sellee. If I want something I'll use Google and suchlike to find out where I can obtain it, what consumers think of it, etc. So instead of pushing adverts that we ignore one way or another, how about spending the money on making your product (a) easier to locate when we look for it, and (b) better?
Sorry to nit-pick, but over-clocked means operated at an un-blessed clock rate above the manufacturer's specification (and crossing one's fingers). In this case Intel has re-engineered or simply revalidated its own chip for a higher clock rate than previously used. They do the same thing with their CPUs from time to time.
Should also last longer than SSD
Depends on how it's being used. In a write-mostly and intensely-accessed environment an SSD will "wear out" in less than the several years that a mechanical drive can be relied on. In one where reads are more frequent or where there is 16 hours/day near-idle, there may be 10, 15, more years of write traffic needed to wear out the SSD, and in that case I'd expect the MTBF of an SSD to be higher than an HD. Expect, because SSDs haven't existed for long enough to be certain about their long-term ageing characteristics.
Would be nice to know what it costs.
(Obvious comparison: 512GB SSDs. It can't win on performance, so it has to be significantly cheaper.)
Re: Am i the only one
You shouldn't be able to hear any rotational noise at all - if it's whining, it's dying!
What you hear much more on datacentre disks than on home ones is head-seek noise. There's a tradeoff between minimising acoustic noise and minimising head repositioning time. In a datacentre, noise usually doesn't matter (it's dominated by lots and lots of fans moving air around, often making it hard to be heard without shouting).
Also in Microsoft's favour, they don't sell hardware (mouses and suchlike aside). If your hardware prevents you from moving off XP to Windows 7, it's probably not Microsoft's fault, but that of whatever company is refusing to write modern drivers for its older hardware.
Can't think of anything we're being forced to throw away with XP's demise that was made in 2007 or after.
Re: 2007 hardware obsolete?
If it's not welded or glued shut, assembled by highly trained octopi, or otherwise artificially rendered impossible to upgrade, you can get a whole new lease of life out of an old laptop by removing the hard disk and installing a solid-state disk. The speed of the CPU is frequently irrelevant, whereas reducing the hard disk seek time to effectively zero can make a 5-year-old laptop feel faster than most new ones without an SSD.
Same for desktops used for running Office and suchlike, by the way.
Re: Core temperature
It's not necessarily a problem. An actively cooled CPU can dissipate ~100W emanating from a square centimeter of silicon. So if the chips you want to stack generate 1W each, you can stack them 100 deep before the problem's much harder than a CPU. (Somewhat harder because you need thermally conductive glue between the layers, and thermal stresses must not destroy the assembly or the individual chips).
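A back-of-envelope check of the argument above (the 100W/cm² cooling budget and 1W per die are the assumed figures, not measured ones):

```python
# Rough check of the stacking argument: if active cooling can sink
# ~100 W from a square centimetre of silicon (assumed), and each
# stacked die dissipates 1 W (assumed), the whole stack stays within
# a single CPU's thermal budget up to this depth:
cooling_budget_w = 100.0   # W per cm^2, assumed
per_die_w = 1.0            # W per stacked die, assumed

max_layers = int(cooling_budget_w // per_die_w)
print(max_layers)  # 100
```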
Re: Obviously not
There's at least one problem class where all-local RAM helps. Big sparse matrix calculations, as often encountered in engineering modelling. Wonder what HP *does* charge for maximum RAM? We once got a quote for an HP system that could support 1TB RAM, but the price for that configuration was so exorbitant that our scientists went for two commodity HPC servers maxed out with 0.5TB RAM each ... and quite a lot of change.
Re: Linux is a fractured mess
In what way is the sudden arrival of Gnome 3 on an experienced Gnome 2 user's desktop, different to the sudden arrival of Windows 8.0 on an experienced Windows XP user's desktop?
I'll tell you the difference. There's a way out of Gnome 3 if you don't like it. Indeed, there are several different ways out, including choosing not to upgrade at all (apart from security patches) for the next five years at least.
The reason that the Gnome programmers were vilified when they shipped "3" wasn't that half of us thought it was crap. It was that they'd decided to write it as an upgrade (like Microsoft call Windows 8 + TIFKAM an upgrade). Meaning they had denied us the right to install Gnome 2 alongside Gnome 3 on the same system, just like Microsoft. Luckily they only control Gnome, not Linux.
Why is it that there are so many folks who still think like Soviet State Planners in the 1980s, that there is One True Way, and that it will inevitably succeed? Oh yes, it's the brainwashing. The CCCP was very good at brainwashing. So is Microsoft. They should note, it didn't do the CCCP much good. The trouble is that you're more likely to believe your own propaganda than the rest of us are.
Linux folks know that "world domination" (Linus) was a joke. But on the other hand "First they ignore you, then they laugh at you, then they fight you, then you win." (Gandhi). We've reached stage 3.... we've won inside the DVD players and the cars ... we've won the tablets in a Googly sort of way ....
Re: Amazing what a bit of competition can do
Define "real work".
What do you think is used to make a movie? To design a new drug (or car, or airliner)? To build a million-user web-site?
Chances are high that the creative stuff, without which none of the other w**kers in the organisation would have jobs, is done partly using Linux and partly using Macs, with Windows in third place and there only because (a) some customers(*) and (b) the abovementioned real w**kers insist on it.
(*) customers are always right even when they are wrong. Unless you work for Microsoft.
Am I reading it right? Having taken away a menu that you accessed with a left-click, they've now given back a menu except you have to access it with a right-click?
I thought this was well-understood?
Recent geological history reveals several very rapid thaws followed by much slower re-cooling.
I thought the mechanism was well-understood: runaway global warming caused by methane released from methane hydrates in permafrost (and/or ocean floors).
The warning to the human race is obvious. Cause a small amount of global warming and it could become a runaway process. There are VAST amounts of methane trapped in permafrost in the Canadian and Russian tundra. Thaw the ice around the edges of that zone and the methane escapes, which causes more global warming, which causes more thawing ... a positive feedback loop.
When you know avalanches have happened before without human intervention, perhaps it's still best to avoid going off-piste?
Re: 50? What? Dell?
What's the electric car slogan: "range anxiety"? What would the laptop equivalent be?
Worrying that its power runs out before it's banished the extra-dimensional horror that's got its tentacles around your neck?
A toothbrush isn't "wireless power", it's just a transformer with the primary in the "charger" and the secondary in the toothbrush.
I can't help thinking, why bother? (In the case of the toothbrush, the principal reason is to make it all but impossible to create an electrical circuit from the mains through a fault and thence through the mouth, heart and other hand of an id10t(*) to earth.)
(*) or victim. or autodarwinator.
RAID1, odd number of disks?
You can do RAID1 on an odd number of disks just as long as your operating system or controller lets you split the disks into partitions (most simply, two equal-size partitions per disk). You then make a RAID0 of RAID1s, should you want to view the whole assemblage as a single volume.
It's actually a slightly enhanced RAID10 that never(?) acquired its own number.
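One way to sketch that layout (device and partition names here are purely illustrative):

```python
# Three disks, two equal partitions each: pair partition 1 of each
# disk with partition 2 of the next disk round-robin, so the two
# halves of every RAID1 mirror sit on different spindles.
disks = ["sda", "sdb", "sdc"]          # illustrative device names
n = len(disks)

mirrors = [(f"{disks[i]}1", f"{disks[(i + 1) % n]}2") for i in range(n)]
print(mirrors)
# [('sda1', 'sdb2'), ('sdb1', 'sdc2'), ('sdc1', 'sda2')]
# A RAID0 stripe across these three mirrors gives a single volume
# with the capacity of 1.5 disks -- the odd-disk-count RAID10
# described above.
```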
Depends what you mean by a "drive" and how much space you've got inside. Look up "SATA port multiplier", and how Backblaze make their 146TB storage pods (mostly) out of commodity hardware.
Re: I have my doubts..
Well, to grab a linux command: rsync.
The first rsync will take many hours, maybe days. Just carry on using it while that completes.
Once done, stop modifying it, and repeat the rsync command. The second pass will copy only the data that has changed since the first pass, so it will quite possibly take only a few minutes.
The same approach doubtless exists under other names. Back in the days of tape, they were called full and incremental backups.
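The full-then-incremental idea can be sketched in a few lines of Python. This is a toy stand-in for what rsync does, comparing only size and modification time; real rsync adds checksums, in-place deltas and much more:

```python
import os
import shutil

def incremental_copy(src: str, dst: str) -> int:
    """Copy only files that are new or changed since the last pass.

    A toy illustration of full-then-incremental backup: the first
    call copies everything, later calls copy only what changed.
    """
    copied = 0
    for root, _dirs, files in os.walk(src):
        rel = os.path.relpath(root, src)
        target = os.path.join(dst, rel)
        os.makedirs(target, exist_ok=True)
        for name in files:
            s = os.path.join(root, name)
            d = os.path.join(target, name)
            # Skip files whose size and mtime are unchanged
            # (copy2 below preserves the source mtime).
            if os.path.exists(d):
                ss, ds = os.stat(s), os.stat(d)
                if ss.st_size == ds.st_size and ss.st_mtime <= ds.st_mtime:
                    continue
            shutil.copy2(s, d)
            copied += 1
    return copied
```

The first call over a big tree is the slow "full" pass; each later call is the quick "incremental" one.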
Which is why Donald Knuth created TeX in the first place
Which has been followed by LyX and kin, offering WYSISYM (what you see is what you MEAN) rather than WYSIWYG.
Actually I think the first WYSIWYM I ever saw was something whose name I have forgotten, running on an Acorn Archimedes. Shame it didn't catch on with the rest of the world.
Standalone scanners needed
A rootkit cannot hide, while it is just data on a disk. In other words, trying to detect a rootkitted O/S using the same rootkitted OS is hopeless. You need to boot a standalone scanner (preferably off CD or DVD because they're not writeable after being checksum-protected and mastered).
Of course, this means some complete downtime for the infected system.
Why does Windows go out of its way to make this form of security difficult or impossible?
Re: I can see the point
I normally put up with it, unless the perfumed one makes any comments about me sneezing my germs in her vicinity, in which case I've been known to tell her what I *really* think of her perfume and her manners!
Re: "banning cheese next, followed closely by nuts."
"May contain nuts" means that it might contain a stray nut or fragment from another production line in the same factory. The allergic consumer knows whether that could mean rapid death. With a less serious allergy he'll probably chance "may" whereas plain "contains nuts" is a definite no-no.
The only really silly one is seeing "may contain nuts" on packets of nuts! (Though thinking about it, perhaps they mean "may contain other sorts of nuts"?)
Glad my allergies lie elsewhere, and I can eat all the nuts I like.
Re: So How do they stop nature recreating the same experiments...
the densest matter we have in the story here is the pesky lawyers
Not dense. Just highly successful parasites. Ever met a poor lawyer?
What you have to do is explain why, if the doomsday preconditions are correct, it hasn't happened already. Because Nature is bombarding the Earth with cosmic rays millions of times more energetic than anything we can make in our experiments. The Earth is a large target. The Sun is a much bigger one. Both are still here despite having been bombarded for the last century at least. If you accept that the last century is not special in cosmological terms, one can assume the same bombardment over the entire multi-billion-year life of the Sun. Either way, if strangelets could destroy us, it would already have happened.
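The energy gap is easy to put rough numbers on (figures from memory, so treat as order-of-magnitude; these are lab-frame energies, and the centre-of-mass comparison is less dramatic but still favours the cosmic ray):

```python
# LHC collisions reach ~13 TeV; the most energetic cosmic ray ever
# observed (the "Oh-My-God" particle, 1991) carried ~3e20 eV.
lhc_ev = 13e12       # ~13 TeV
cosmic_ev = 3e20     # highest observed cosmic-ray energy

print(cosmic_ev / lhc_ev)   # ~2.3e7: tens of millions of times more
```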
Re: New Wheeze
You may not understand, that under the hood, XP is now really obsolete
The Linux kernel that first supported XFCE was obsolete very many years ago. (Many would say the same of XFCE.) But if XFCE floats your boat, you can still run it atop the latest kernel. Why couldn't Microsoft let us run the XP user interface atop a Windows 8.1 kernel?
Could it be that their programmers are so crap they've got no proper software modularity and the GUI is all tangled up with the kernel and with the applications? Or is it to railroad as many folks as possible into giving as much money as possible to Microsoft? Or do they just enjoy playing god (of the capricious variety from the Greek pantheon)?
Re: What the hell did they expect?
Linux offers CHOICE!
Personally, I use Gnome 2, Mate, or Cinnamon, which offer a conventional start menu. You add frequently-used apps to your task-bar or desktop. I find the menu paradigm perfect for locating one from the many programs that I use infrequently.
But you don't have to be like me. You can install your window manager of choice, and select which window manager you want to use at login time.
I'd have had no complaint with Windows 8 if it had come with an option when I logged in to see the familiar Windows 7 (or even better, XP) manager. But it doesn't, and I hate it.
When it gets to the end of this process, a blood-chilling grinding noise emanates from the EX4
Very odd. I've always recommended WD "Red" drives as the quietest server-grade drives I've encountered, and possibly the quietest outright. I just can't imagine them making such a noise, not even with four of them working on it together.
the bank would probably detect that and inform the police
If you don't admit to your true name, yes. But it is not against the law to go by any name you choose if there is no intent to deceive. So what happens should you go to a bank and say that you'd like to open a bank account in a fairly common name by which you wish to be known to strangers, while showing them your passport in your legal name as required by law? (I'm assuming it would be illegal for a bank to sell on the fact that you have an account "Sigourney Whatever known as Susan Smith")
Companies do this all the time: XYZ ltd. trading as "Whatever we want to trade as today".
I was once told that 90% of Chinese share just eight family names. The speculation was that this was a case of Darwinian selection, in the world's longest-running (almost) continuous civilisation. Those with unusual names were easier for the empire to trace and tax to the full. So they were poorer. So their mortality rate in the next famine was greater.
Perhaps we should all change our names to John Smith or suchlike, so that we may engage in internet commerce without having our easily identified "anonymous" data trafficked and tracked? How long before someone can prove that "John Smith" gets offered different (better?) prices than Sigourney Efidom for one's first purchase from a random etailer?
Is it still possible to open a bank account under a pseudonym, and to require and expect the bank to keep one's legal and legally required name strictly confidential?
Re: Fortran, indeed
Whilst I'd never write anything completely new in Fortran one of it's big advantages is that there are masses of very-well debugged programs/routines that are readily available to use & modify.
What's less-well known is that the Fortran language gives a compiler greater scope for generating optimal number-crunching code, because arrays are fundamental entities in the language, not just pointers to data. This was true even with Fortran-77, but subsequent iterations of the language picked up that ball and really ran with it. (Whole-array arithmetic without any explicit loops, slicing, WHERE statements, ....)
And the advantage grows, as computers can no longer have faster cores, but can have more and more of them, with SIMD instructions adding to the fun.
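For readers who don't speak Fortran, NumPy borrows the same whole-array style, which gives a feel for what a Fortran compiler gets to vectorise (the values here are just for illustration):

```python
import numpy as np

a = np.array([1.0, -2.0, 3.0, -4.0])
b = np.array([10.0, 20.0, 30.0, 40.0])

c = a * b + 1.0                # whole-array arithmetic: c = a*b + 1.0
first_two = b[:2]              # slicing: b(1:2) in Fortran
d = np.where(a > 0, a, 0.0)    # roughly a WHERE statement

print(d)   # [1. 0. 3. 0.]
```

No explicit loops anywhere, so the library (or compiler) is free to use SIMD and multiple cores under the hood.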
Implied do loops
Hmmm. What's wrong with a clear syntax that avoids several extra lines of code and lets you access data files that aren't in the natural order for the program, all in one concise line? e.g.
READ( 10,*) (A(I), B(I), I=N, 1, -1)
where you've got N data pairs and you want them in two arrays in the opposite order to how some other program stored them?
A similar concept, properly generalised, was fairly recently added to Python. ("List Comprehensions")
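Since list comprehensions came up, here's a rough Python analogue of the implied DO loop above (the sample data and names are invented for illustration):

```python
# N (a, b) pairs in a whitespace-separated stream, to be stored in
# reverse order -- what READ(10,*) (A(I), B(I), I=N, 1, -1) does:
# the first pair read lands in A(N), so A(1) ends up holding the last.
stream = "1 10  2 20  3 30".split()   # pairs as the other program wrote them
n = 3

pairs = [(float(stream[2 * i]), float(stream[2 * i + 1])) for i in range(n)]
a = [p[0] for p in reversed(pairs)]   # a[0] is A(1): the last pair read
b = [p[1] for p in reversed(pairs)]

print(a)   # [3.0, 2.0, 1.0]
```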
If there's no space for a two-box solution, what about a two- or more-disk solution? Most BIOSes allow one to press F10 or similar to select the boot device. All but the smallest desktop cases can accommodate two 3.5-inch or four 2.5-inch drives. Most motherboards support at least 4 x SATA; many support six or more.
(I deliberately don't suggest external USB drives even though USB3 is fast enough and most motherboards boot USB these days. My experience suggests that if a drive tests as low quality, it gets sold in a USB box. It's a good way to experiment with different multi-boot configurations, though. )
Industry Standard "vulture" drop test
Since there doesn't appear to be any industry standard, may I suggest that "The Register" creates one. Something like:
Vulture drop test grade 1: survives being dropped six feet onto concrete ten times starting in specified orientations. (Panasonic "Toughbook" territory).
Vulture drop test grade 2: survives the same from six feet onto vinyl flooring.
And invite manufacturers to submit devices for certification, if they dare!
Re: "none of them have had any security problems"
tell that to Iranian Nuclear scientists.
Just the point I was making to someone who thought a PC not connected to a network was secure "by definition".
I offered to make it secure by removing its CD drive and filling its USB slots and Ethernet jacks with epoxy glue (as used to be done at certain MoD sites) but he declined. He needed to get data in and out of it, and wouldn't see that it would soon become a "Typhoid Mary" spreading USB-based malware.
Re: What about the legend that is IEEE-488 (GPIB)?
You're maligning it. It was faster than USB (USB-1, that is).
True, the connector did tend to be the tail wagging the dog. The same problem recurs with a SCART connector and cable on a modern Digibox.
Re: Showing my age ..
For added points, name the other signals on a full D25 modem cable. For Guru-hood, work out why the full 25-pin modem won't talk to the full 25-pin modem connector on the mainframe.
Re: Anderson connectors
Along those lines, does the 1000-amp-plus 12V car battery connector have a name? The modern one that goes around the post and tightens with a wrench? (It works). Or the ancient Lucas twelve-clawed one that was supposed to push on and tighten with a thumbwheel on top, which would corrode itself into an unfortunate combination of immovability and high resistance within a year of fitting a new battery?
The ultimate evil connector....
No-one has yet mentioned the ultimate in evil connectors, which is not only current but hell-bent on conquering the EU housing market.
GU10 lightbulb connector.
For which you need a plastic suction cup to manipulate the bulb into place, or risk having glass splinters embedded in your fingertips.
Re: F---ing SCSI connectors
And for proper wake-in-the-night-sweating nightmares: SCSI
You're obviously too young to remember the connector on a Digital Massbus(TM) disk cable. SCSI was a sweet dream by comparison.
The aforesaid cable was about 40mm in diameter. I think one of them once featured in a Star Trek episode, strangling a crewman by telekinesis.
Re: Missing are
What about a vampire tap kit for an old thicknet Ethernet?
Re: #3 - terminal blocks
I raise you a well-cooked mouse that had inserted its head into the fan on an (old, hot) Opteron server.
I never could convince myself that there was any hole in the chassis large enough for that mouse to squeeze through. I do hope that the poor wee squeaker's neck was broken by the fan, otherwise it was a horrible death.
Re: Mumbai Multiway
Having electricity isn't the amazing thing. Not dying by it is!
Strip and twist
Would you believe 100Mbit server networking down a stripped and twisted cable? (and does it work with newfangled Gbit? I've never tried).
Well, what would you do at 4am, when you discover that you have to connect the server to a switch 13m away from the rack and you have only 10m cables and shorter?
It was working the next morning and the strip-twisted cable assembly was swapped as soon as possible the next day.
Re: Lost the plot
And the Quark chip runs finger-burningly hot.
Presumably it is engineered to do so. As were Atoms before. And any chip well-designed for passive cooling (because you need a fairly large delta-T before convection gets going).
I remember an old Athlon system I once serviced. The heatsink fan had failed and I sizzled my finger on the heatsink (ie over 100C - heaven knows what the chip temperature was). Nevertheless it was "working perfectly". (The perceived problem was a failed CD drive). And it carried on working perfectly until it became obsolete a couple of years later.
Going back even further, I saw a power supply that had become overloaded because of a fault elsewhere, but which only failed when a rectifier diode melted its soldered connections and dropped off the circuit board. It was still a working diode.
CMOS silicon is very tolerant of high temperatures. The CMOS switching speed drops in inverse proportion to the temperature in degrees absolute, so the Tmax for a chip is usually the temperature above which the manufacturer will not guarantee correct function. Running too hot is akin to overclocking (which is why overclockers are into radical heatsink designs). Tmax is certainly not the temperature at which the chip will be destroyed in seconds. It gets much hotter when being soldered in place. Operational life is shortened by high temperature operation, but chips will function for decades and become obsolete long before ... dropping life expectancy from 30 years to 15 years rarely matters.
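The 30-to-15-year figure matches the usual rule of thumb (my assumption here, not gospel) that semiconductor wear-out life roughly halves for every 10°C rise:

```python
# Rule of thumb (Arrhenius-flavoured, assumed): life halves per 10 C.
base_life_years = 30.0
extra_celsius = 10.0     # run the chip 10 C hotter than before

life = base_life_years / 2 ** (extra_celsius / 10.0)
print(life)   # 15.0
```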
Re: Lost the plot
NUC isn't intended for embedded at all. If for anything, it's for the Mac Mini market. See my post above re GA-C1037UN-EU for details of a PC board that is easy to run off a vehicle 12V supply (or a 12V power brick) with far more interface options than an NUC. The board itself has a standard ATX power connector, the Pico-PSU range is the other half of the solution, and the C1037 CPU is 17W TDP and fanless.
Re: re: "I work in precisely this sort of market "
Just for completeness, one should perhaps add that it is possible to obtain a full PC for not a huge amount more. I recently purchased a Gigabyte GA-C1037UN-EU ITX board. Add a Pico-PSU, DDR3 RAM and a disk device (a USB stick will boot, for embedded use) and you're away. The board is fanless (17 watt TDP), and to my surprise even comes with two COM ports (one with a D9 on the back panel), two Gbit Ethernets, a PCI slot and an LPT header. Also 3 x SATA (one of them 6Gb/s, for SSD) and e-SATA on the back panel. Cost around £120 (for board, Pico-PSU and RAM).
Agreed, it's rather more power-hungry and expensive than boards being discussed here. But it's also a LOT more powerful. I wanted mine for a silent always-on home server cum internet browser, but I immediately thought that if I wanted a computer to run off 12V in an automobile or boat, it would be a perfect starting point.
And I've discovered that I'm no longer using my core-i5 desktop for anything except gaming these days. This little beast is always on so no boot-up wait, and feels plenty fast enough for everything else.
Re: moore's law
you went and fucked it all up there right at the end. Moore's law is demonstrably bollocks and promotes a world view that is.... unhelpful, as acknowledged by the great man himself.
Sorry, but you are completely wrong. From a physicist's or engineer's perspective, it's a scaling law that predicted (back in the 1980s) that there was absolutely nothing fundamental in the way of going from the earliest CMOS computers with a few tens of thousands of transistors running at a few MHz up to today's billion-transistor chips clocked at a few GHz. It also predicted where the law would inevitably fail (i.e. run out of predictive power). That's where we are today: the transistors have to be made out of discrete atoms, and the thickness of a gate is now as thin (as few atoms) as it can be while remaining an insulator.
Back in the days of bipolar transistors, a bit was represented as a flow of current and engineers faced what they called a "smoking hairy golfball" problem. You had to put the components sufficiently far apart so you could keep them cool, which restricted the clock speed because of the speed of light. Shrink it too much and you can't stop it catching fire (aided by the fact that bipolar transistors suffer thermal runaway).
CMOS, on the other hand, scales so that energy is dissipated only when a logic element changes state, and the heat generated per unit area of active electronics is a constant as you shrink the transistors, shrink the operating voltage, and scale up the clock speed, all by the same factor.
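That constant-power-density property is classic Dennard scaling: per-gate dynamic power goes as C·V²·f, and a quick check shows the density staying put. (The scaling factor k and the unit starting values below are illustrative, not from any particular process.)

```python
# Dennard-style scaling check: shrink linear dimensions and voltage
# by k, raise the clock by 1/k. Per-gate power C*V^2*f then falls as
# k^2, and so does gate area, leaving power per unit area unchanged.
k = 0.7   # roughly one process generation, illustrative

C, V, f, area = 1.0, 1.0, 1.0, 1.0              # normalised gate values
C2, V2, f2, area2 = C * k, V * k, f / k, area * k * k

density_before = (C * V**2 * f) / area
density_after = (C2 * V2**2 * f2) / area2
print(density_after / density_before)   # ~1.0: constant power density
```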
Agreed, Moore's law is now historical and no longer predictive. (the entire context of this thread is history!) We've now pretty much reached the physical limits for the smallness of a transistor, and any future improvements in CPU performance will have to come from using the billion or so transistors on a chip more intelligently.
Re: Surely it can be changed ?
but wasn't the invention of the programmable computer an invention that just maybe has a smidjin more of an impact of the modern world?
Definitely, although it was invented by Konrad Zuse in Germany before the war! Colossus came second, or maybe third if you believe the Yanks. Or even fourth, if you accept Babbage's mechanical engine as the first programmable computer.
Also don't under-estimate the input of genius physicists: http://en.wikipedia.org/wiki/History_of_the_transistor. If we still had to use thermionic valves there might be less than a thousand computers in the world. If we still had to use bipolar transistors there might be less than a million. Oddly, the FET was discovered first, though CMOS arrived quite late and was the enabling technology for Moore's law.