2418 posts • joined 10 Jun 2009
Wouldn't it be better to combine SLC and TLC? Write data to SLC first and then have the controller move it to TLC once the data has stayed unchanged for a sensible length of time. Unlike RAM cacheing, no loss if the power suddenly vanishes, no big(gish) backup battery needed to prevent that.
Would it be possible to bake a combined SLC + TLC chip or are the processes different?
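Something like the toy sketch below, in controller pseudocode — the block numbering, the dict-based "flash" and the settle time are all invented for illustration, but it shows the write-to-SLC-then-migrate-to-TLC idea:

```python
import time

# Toy model of an SLC write-cache in front of TLC storage. Block sizes,
# settle time and the dict-based "flash" are made up for illustration.
SETTLE_SECONDS = 60.0        # how long data must sit unchanged before migration

slc = {}                     # block_id -> (data, last_write_time)
tlc = {}                     # block_id -> data

def write(block_id, data):
    """All incoming writes land in fast SLC first."""
    slc[block_id] = (data, time.time())

def migrate():
    """Controller background task: move blocks that have gone quiet to TLC."""
    now = time.time()
    for block_id in list(slc):
        data, written = slc[block_id]
        if now - written >= SETTLE_SECONDS:
            tlc[block_id] = data     # durable either way - both are flash,
            del slc[block_id]        # so a sudden power cut loses nothing

def read(block_id):
    """Reads check the SLC cache before falling back to TLC."""
    if block_id in slc:
        return slc[block_id][0]
    return tlc.get(block_id)
```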
How long before the black boxes get pwned, and all our data becomes available to precisely the people we don't want to have it? (Don't want to have it even more than we don't want the government to have it, that is).
I voted against Labour last time to kill ID cards. Who do I vote for next time? What's UKIP's policy on spying on the populace?
Re: my neighbour? you sure about that?
I'd be sorely tempted to send him a big empty box, signed-for, with his own address as the address of the sender.
Common but unofficial
The difference is what happens when the signed-for item disappears after "delivery". Today, the Royal Mail is liable if it's not the recipient's signature. Only up to 100x a first-class stamp but that's often enough. In future the dishonest neighbour (or a bad-egg postie) will sign for it, and the Royal Mail will claim that they carried out all their obligations in full and tough luck.
Which is very short-sighted of them. Businesses (Amazon for example) will cease using Royal Mail altogether, as soon as the non-delivery rate soars. As for eBay, it's probably the end of people who aren't full-time traders selling anything there, for lack of any trustable delivery mechanism.
In my dreams, eBay would take over the Royal Mail and run it sensibly.
You think that's accidental?
More likely they've decided what they are going to do and the consultation is a complete sham. This is their secondary way of making sure that they don't get a non-ignorable number of responses from those who might disagree with them. The most obvious one is not publicising the so-called consultation in any effective way. Good thing the Register has blown their cunning plan in time to write to newspapers and MPs.
Remember "Beware of the Leopard"? (in the HHGTG, not an Apple blog)
The security side of this is going to be ... interesting?
Re: Apple and the bleeding obvious
> As a general rule, shoppers really aren't interested in the back side of things.
Except for jeans.
Chicken and egg.
> No, all tablets are shaped like they are because that's the size and shape screens are made.
Are you sure? From what I know of the technology there would be no great difficulty in making TFT screens oval or triangular or any other shape without concavities. The electronics to drive such an odd array would be more complex, but hardly impossible. If there were any massive demand. There isn't. Do you think an oval phone would score any cool points over a rectangular one?
Pockets are rectangular. The larger things that go in pockets, like wallets and phones, are rectangular. A round one would be wasting space in the corners, and would have certain other drawbacks, like rotating in the pocket so you couldn't immediately know top from bottom.
Re: Change the battery on a Samsung
the question may be whether you can dismantle, replace battery and reassemble it using a normal techie's toolkit and a bit of Googling, or whether it's designed to be tamper-proof or unrepairable (which in my book ought to be illegal except when there's a strong health and safety reason).
I don't actually know.
Yes. It's another sign of a sick society. The same sickness that has engulfed our banks and financial services. It's managed to corrupt our legislature. If it gets to corrupt our judiciary, we're doomed. I fear that it already has done so in the USA.
It's been established that you cannot patent or copyright the external shape of a car exhaust pipe. That's because only one shape can fit underneath any particular model of car, and therefore there is no freedom to do anything different.
The same, surely, for a pocketable tablet. The overall design is dictated by the pocket.
We may have been spared such lawsuits back when the clam-shell mobile first appeared on the scene, because there was such obvious prior art. The Star Trek communicator. They couldn't make a working one, but the overall design was still obvious.
Patent lawsuit procedure
Rather than summary judgement, I'd say that for every pound, dollar etc that the plaintiff spends on lawyers, he should be obliged to lend exactly the same amount to the defendant for payment of the defendant's legal costs. If he wins, repayment of the loan is added to other damages. If he loses, then he loses the right to repayment of the loan as well as any other costs awarded against him.
The result would be that no company (typically a small company) would feel obliged to settle out of court for lack of sufficient finance to match the plaintiff's legal muscle. Patent trolls would probably disappear.
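To make the arithmetic concrete, here's a toy sketch of how the matched-funding rule would play out. The figures and the function are made up purely for illustration:

```python
# Toy illustration of the matched-funding rule. Amounts are invented;
# "damages" stands in for whatever the court would award anyway.
def outcome(plaintiff_legal_spend, plaintiff_wins, damages=0.0):
    loan = plaintiff_legal_spend          # lent pound-for-pound to the defendant
    if plaintiff_wins:
        # Repayment of the loan is added to the damages the defendant owes.
        return {"defendant_pays": damages + loan}
    # Plaintiff loses: the loan is written off, so the plaintiff has in
    # effect paid for both sides' lawyers.
    return {"defendant_pays": 0.0,
            "plaintiff_out_of_pocket": 2 * plaintiff_legal_spend}

print(outcome(2_000_000, plaintiff_wins=False))
```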
It wouldn't have made any difference to this case and there's no reason why it should. Samsung and Apple both have deep enough pockets to afford armies of lawyers out of their petty cash.
A quick Google for "Internet Wireless Thermostat" reveals a range of products starting at 120 quid. I would hope that a techie is capable of replacing his existing thermostat without electrocuting himself or creating a fire hazard.
(I'm also assuming it's still legal to do so. It's legal to replace a cracked 13A socket without getting an Electrician in, so I guess a thermostat is the same).
Ever heard of patent royalties?
Re: Only use Epson if...
With HP, keep the printer in standby even if you won't be using it again for weeks. HPs wake up to squirt a tiny amount of ink through the nozzles every 24 hours, to make sure that the heads don't block up. It really is a tiny amount of ink - you won't notice unless the printer is left unused for years.
Re: prints black when colour out?
It's actually true, black ink plus a little yellow ink looks blacker than just black ink! Easy to make your own mind up - print a paragraph both ways and compare.
Re: You arent mentioning...
That's if you do a full install. (And I don't have experience of the low-end models to know if there's any alternative.) For the high-end Officejets, you can choose which bits you install, and I agree it's best to go minimalist.
Funny, I'd say cheap laser printers, and most especially cheap colour laser printers, are for mugs.
My reasons for saying this are the high-end HP Officejet K550, 5400 and 8000 printers. I don't yet have enough data to be able to say whether the latest HP Officejet 8100 continues the fine tradition of printing fast, well, for tens of thousands of pages, and at a price per page lower than a cheap mono laser, let alone colour.
NB the running cost of all printers in this survey. An extra 2p/page on just 5000 pages is a hundred quid. After that many pages you'd automatically have been better off spending more on the printer and less on the ink. (HP OJ Pro 8600 about 120 quid).
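The back-of-envelope sum, with the per-page figures treated as rough assumptions rather than quoted prices:

```python
# Back-of-envelope check of the claim above. The per-page costs are rough
# assumptions (about 4p/page for a cheap laser, 2p/page for the OJ Pro 8600),
# not quoted prices.
def total_cost(printer_price_gbp, cost_per_page_gbp, pages):
    return printer_price_gbp + cost_per_page_gbp * pages

print(total_cost(60.0, 0.04, 5000))    # cheap laser:  260 quid after 5000 pages
print(total_cost(120.0, 0.02, 5000))   # OJ Pro 8600:  220 quid after 5000 pages
```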
Re: "can do 50 full drive writes a day for five years"
Treat warranties from new companies with more than a pinch of salt. They have nothing to lose by lying, sorry, being over-confident.
If the product doesn't wear out prematurely they get to stay in business and their happy customers come back for more. If they get more warranty returns than they can afford three or five years hence, they file for bankruptcy. Either way everyone has had a living for two or three years (and the fat cats at the top might well retire for life, if their living was a seven-figure salary).
BTW isn't SLC good for a million cycles these days?
Just as long as there isn't anywhere that water or air can get trapped until it's exerting nearly an atmosphere of pressure, followed by a tiny pop or crack that's nevertheless quite fatal.
The best places to dry out wet electronics are an airing cupboard, or a machine room with air-con. In both cases, leave it for a good few days.
Rule Zero: get the battery out of the device as soon as possible after it gets wet.
Water won't do much harm to an un-energised device even if it's wet for days. Electrolysis, on the other hand, can corrode it to death within minutes, sometimes less.
The manufacturer doesn't know if the motor will ignite *reliably* under those conditions. One successful test-fire won't say much about reliability. It might be a 10% lucky shot.
Methinks you need to fire at least four. Maybe the manufacturer will supply the motors for free if you return the data?
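The rough numbers behind that, assuming for the sake of argument a true ignition reliability of 10%:

```python
# If the motor's true ignition reliability were only 10%, how often would
# a run of n test-fires all succeed? Figures are illustrative only.
def p_all_succeed(true_reliability, n_firings):
    return true_reliability ** n_firings

for n in (1, 2, 4):
    print(n, p_all_succeed(0.10, n))
# one lucky success happens 1 time in 10; four in a row, only 1 time in 10,000
```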
Re: Good for them
How many nuclear submarines are already dissolving at the bottom of the ocean? Some with a full complement of ICBM warheads? Quite a few that are known to have been lost, and probably rather more that still haven't been disclosed.
Actually there's no nuclear explosion risk, and probably very little radiation risk. There's next to no circulation between the deep ocean and the surface, and a helluva lot of water to dilute the radioactives in. I doubt that this Chinese project adds significantly to that risk, even should the worst happen. Hasn't the worst already happened at least twice, at Chernobyl and in Japan? With less actual harm than the normal operation of coal-fired power stations, even ignoring their CO2 output?
The oceans are salty because they contain most of the sodium that's been released from rock over three billion years of plate tectonics. They're naturally mildly radioactive, for the same reason with respect to Uranium.
LaTeX + LyX is vastly superior to any word processor. It's a WYSIWYM editor: What You See Is What You Mean. You get to see a rendition of the markup as you type and edit, but it's an approximation to the better rendition you'll get when you print it. You don't have to type markup language control sequences. It beats Word hands down, where inserting another paragraph (or even another word) on page 2 can wreak havoc with the layout of any subsequent page. The longer the document, the more mathematics, or the more inserts, the worse Word gets.
LyX uses LaTeX behind the scenes.
LyX is free and available for Linux and Windows alike. Try it.
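For anyone who's never seen what LyX is writing behind the scenes, a minimal LaTeX document looks something like this — you state the structure and the meaning, and TeX works out the layout:

```latex
\documentclass{article}
\begin{document}
\section{Why WYSIWYM}
Adding a paragraph here never wrecks the layout of later pages:
\LaTeX{} reflows everything, renumbers the sections and equations,
and typesets the mathematics properly.
\begin{equation}
  e^{i\pi} + 1 = 0
\end{equation}
\end{document}
```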
Re: If only a quality, user friendly Linux distro was available...
Photoshop - GIMP
Acrobat reader - Evince. PDF creation - print to PDF via CUPS (or on Windows, PrimoPDF) from your content creation program of choice.
Office - OpenOffice or LibreOffice
The alternative products are similarly capable, and the ones I've mentioned above are all available to run on Windows as well, for zero cost, so you don't even have to go to Linux.
Oh, but the interface is different. So it's OK if Microsoft completely changes the interface from Office 2003 to 2007 and again for 2010, but you rule out any other software that's not bug-for-bug compatible with your favoured expensive product? If you were arguing the value of the UI you already know well I'd tend to agree, but when it's ripped out from under you at the next upgrade, why not just say no and go to Open-source alternatives?
I know I'm wasting my bytes, though. For some folks it's better the devil I know, than the angel I don't.
Do you know Acrobat can create pdf files that Acrobat Reader can't print, but which Evince has no difficulty with? (Both on the same Windows system).
Re: Apples with Apples?
I don't believe tablets are replacing PCs. They're being added as well as PCs. PCs for work that requires serious inputting. Tablets for leisure (and some work) that is almost all output. Also because the tablet is a new device class, there's a huge sales surge going on at present. Just like there was once a huge sales surge for the now moribund netbook format. It lasted until everyone who wanted one had got one. In the fairly near future, everyone who wants an iPad will have got one. Microsoft will probably arrive in competition just as the market is saturated.
Re: "Lets face it, it is rather retro kernel design"
Monolithic kernels may be said to have passed the test of time ... at least if they're put together as well and as flexibly as Linux is. Point me at some other kernel architecture that works half as well. Yes, I'm aware of all the academic arguments in favour of microkernels. On paper, they are quite convincing, but I won't be convinced until I see one working well, across a range of workloads and system types, in the real world.
Personally I think Linux has a lot in common with microkernels. Its software architecture is well modularised. New subsystems are easily integrated and existing ones re-engineered; it's just that the binding is done at kernel build time, not at runtime. It's a bit like the C++ versus scripting language argument. C++ is less easy to develop in, but more efficient. A monolithic kernel is likewise less easy to develop for, but more efficient in production. A kernel is somewhere that efficiency DOES matter.
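A toy sketch (plain Python, nothing to do with real kernel code) of the binding difference being argued about — a direct call bound when the program is put together, versus a message dispatched through a broker at runtime:

```python
# Toy contrast of the two binding styles. Names are invented for illustration.
def fs_read_direct(path):
    return f"data from {path}"              # "monolithic": one plain function call

class Broker:                                # "microkernel-ish": lookup plus a hop
    def __init__(self):
        self.servers = {}
    def register(self, service, handler):
        self.servers[service] = handler
    def send(self, service, message):
        return self.servers[service](message)

broker = Broker()
broker.register("fs.read", lambda path: f"data from {path}")

print(fs_read_direct("/etc/hostname"))          # cheapest possible path
print(broker.send("fs.read", "/etc/hostname"))  # same result, extra indirection
```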
I have a big problem with neophiles. They think that "old" automatically means bad, without any actual comparison of the relative merits of the old and new products. They don't like "tried and tested and nearly unbreakable".
They are also happy to disregard the vast amount of man-hours that are wasted when a company like Microsoft replaces (say) the XP UI with the Windows 7 UI, and the Office 2003 UI with the Office 2007 UI. Sure, it may be only a couple of hours of lost productivity per user, but multiply that by maybe a billion users. Personally I think it's much higher: there's no accounting for the cost of the mistakes that are made while someone is thinking about the bloody new interface rather than the work he's trying to accomplish within it. Somewhere out there, I'm sure that the change to Windows 7 has been the triggering event that destroyed marriages, killed companies, and caused deaths (by heart attack, probably).
The right way to go is incremental improvement. Slip in the new features in a completely non-intrusive way, so that if you don't yet need the new stuff you never notice that it's arrived. That's what the Linux kernel has been doing very successfully for at least the last decade. (Unlike the Gnome developers ... sorry!)
And almost as soon as we get used to Windows 7, Microsoft decides to Metro-ize us. That's a good neologism, by the way. To Metroize. To pull the rug out from underneath a billion users, in a misguided and doomed attempt to increase corporate revenue. To FUBAR by deliberation rather than by accident.
Re: Some plusses:
You could always buy 8cm mini-CDR and even mini-DVD-RW disks. Most drives can hack them. Smaller capacity, of course.
Slower? Hardly. DVD at 16x is 22MB/s. Most USB thumb-drives are slower, at least when writing. The Kingston DataTraveler HyperX 16GB is 16MB/s write, 25MB/s read, and that's a premium model. Cheap ones are often around 5MB/s.
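A quick sanity check on those figures — the rates are the ones quoted above, the 4.3GB payload (a full single-layer DVD's worth) is an assumption:

```python
# Time to write 4.3GB at the rates quoted above.
def write_minutes(size_gb, rate_mb_per_s):
    return size_gb * 1000 / rate_mb_per_s / 60

print(write_minutes(4.3, 22))   # DVD at 16x:          ~3.3 minutes
print(write_minutes(4.3, 16))   # premium USB stick:   ~4.5 minutes
print(write_minutes(4.3, 5))    # cheap USB stick:     ~14 minutes
```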
Thinner makes it intrinsically more fragile (wrt getting bent). Something I'm sure the manufacturers have thought long and hard about. Convince the punters to pay more for a less durable product. Yes. Oh YES.
If you want a random client to upgrade you to demigod, also carry around a USB to SATA adapter and a stand-alone Linux distro with ddrescue on it. Then when you hear about someone who has lost data on a disk drive that's making clicking noises, ddrescue it. It doesn't always work, some drives die too quickly, but you sure gain a believer when it does work!
DVD-W still has a niche
One writeable DVD costs less than 20p and holds nearly 5GB. Cheap enough to give away freely and/or stick in the post. 8GB USB sticks are at least twenty times more expensive.
Yes, you can send 4.6GB by network these days, and without pain on a LAN. However, at DSL upload speed? OK, it might complete overnight and it probably will beat the post, but that's not very convenient. Especially if there's more than one person you need to send a copy to.
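For a sense of "overnight" — the uplink rates below are assumptions for illustration, not measurements:

```python
# How long 4.6GB takes to upload at an assumed uplink rate.
def upload_hours(size_gb, uplink_mbit_per_s):
    return size_gb * 8e9 / (uplink_mbit_per_s * 1e6) / 3600

print(upload_hours(4.6, 1.0))    # ~10 hours on a 1 Mbit/s ADSL uplink
print(upload_hours(4.6, 10.0))   # ~1 hour on a 10 Mbit/s uplink
```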
Despite this, I don't object to computers lacking a DVDRW drive. I've got a dinky little USB DVDRW drive that powers itself off a single USB socket. If you don't have access to one, there's something wrong with you or your employer.
Re: So precisely do we benefit from discovering higgs?
I'm sure that there were cavemen asking the same question about discovering fire (or rather how it could be moved from one place to another and kept going).
Re: "99% of software is crap"
Should I ask about the 1% of crap that isn't crap?
Re: War of the Worlds deserves a place in history
Cybermen don't qualify. They're not aliens, they're technologically-created zombies.
Daleks really should be on the list, though. Any Dr. Who saves-Earth-from-Daleks sequence beats "Independence Day" on every front, including plot intelligence and plot believability.
Re: What the f*ck does this have to do with the IT industry
99% of software is crap
99% of Hollywood alien invasion movies are crap.
Re: What about Chocky?
And there's the comedy version - Gremlins
Re: What about Chocky?
Triffids weren't aliens. We made them.
It's the everyday world that's complicated! The standard model is really quite simple, but obviously not complete. There may be an even more simple underlying theory that so far we have hardly glimpsed.
Lasers and some forgotten alternatives
We rely on Lasers for optical disk devices and for data-communications. The science of Lasers is definitely simple quantum physics. If someone had experimentally discovered a lasing medium in the 19th century, quantum theory would have had to follow along rapidly. As it was, Einstein got the theory right decades before anyone made a laser.
You can have fun imagining a future where computers still run on purely classical vacuum tube technology. (Yes, micron-scale vacuum tubes are possible, as is integrated circuitry containing millions of them! ) Or, you could try having the Babylonians or Romans discover pneumatic computers (clock speeds of 100kHz, logic element size a few mm - Rolls-Royce did actually once build one to embed in the hot end of a jet engine). If Babbage had known about pneumatics, today's world would have been quite utterly different.
There is absolutely no way to determine if the universe is really real, or is just a perfect simulation of its physical laws and an initial state running on a computer within a universe with completely different physical laws. This is pretty much by definition. The perfect virtuality hypothesis also has zero predictive value, so we apply Occam's razor to it.
Note "Perfect". The most dangerous thing physicists could do is to find the bugs in an *imperfect* virtuality, and then tickle them. (There's a variant which says this has already happened many times over).
There's a scarier possibility, that it's our brains and sensoria that are being simulated by distant descendants of real beings much like ourselves. The simulation is running in their university department of pre-digital history. Sometime soon a grad student is going to realize that the simulation has progressed past the dawn of the information age, and is therefore pointless, so he'll stop the run.
(Ever had the feeling that your life has suffered a subtle continuity error, usually simultaneous with the desire never to drink so much again? Now you know why. Both the continuity error and the getting drunk. One's the bug, the other's the fix).
Re: Be careful what you wish for
Science advances in two ways. One is a prediction from a theory, later confirmed by experiment. The Higgs boson is in this class. The other is an observation of something not predicted by any theory, or which contradicts the generally accepted theory.
So assuming there are no more predictions that everyone wants to confirm, CERN should start looking for the unexpected and currently inexplicable. (I think there are also many more tentative theories making predictions that CERN will in due course test).
Eventually, if physics needs particles at higher energies than CERN can provide, a lot of new technologies will have to be developed. A usefully larger circular accelerator would be impossibly large and impossibly expensive. It'll have to be a linear accelerator with operational parameters way beyond anything we know how to build today.
The truth is that most jobs are boring, unless you can really enjoy the everyday detail. Here are some others. Car salesman. Estate agent. Hairdresser. Accountant. Garbage operative supervisor. Bus driver. Solicitor. Hotel manager. Starting to get the picture? Prefer any of those to IT? (the whole job, not just the over-inflated salary that some of them command).
The thing that's sick with society is celebrity culture, the whole idea that everyone should ape the glamorous, the rich and famous, the fashionable. Mostly what it breeds is dissatisfaction, unhappiness, low self-esteem, and a failure to realise that the reward of helping other people is not solely that it gets you paid at the end of the month.
I've found a job that lets me spend a good part of my day solving puzzles (something I enjoy). It could be a lot worse. Also it's my job ... not my life.
Re: So, basically a land hurricane?
More likely, a squall line http://en.wikipedia.org/wiki/Squall_line
Hurricanes cannot form over land. They are driven by hot moist air rising from an ocean surface. They also take several days to get going, so you get at least twelve hours' notice that a hurricane is headed your way. Usually longer.
Squall lines give very little advance notice. I've heard tell of a transition from a hot summer day to roofs being blown off an hour later.
But how do you know?
How do you know that by overclocking your system, you haven't created conditions that cause, say 34387.00*1.01 to compute as 79231.48, that sum being the decimal representation of something that Intel knows is on the critical timing path of the FPU?
Next thing you know, all the grade 3 techs have been paid over twice their usual salary rather than the scheduled 1% pay rise, and the FD wants to see you NOW!
But even if you're just number-crunching a model that you know will unconditionally iterate to correctness (in the mathematical sense), you still can't be sure. Maybe the FP error was in the calculation of the residual error, and the iteration is terminated before the answer is right? Let's hope your Ph.D. doesn't depend on that result.
Or maybe it's not an FP error, it's in one of those rarely-used instructions that only OS kernels ever use (which may be where MS is coming from, though I have my doubts). Consequences: corrupt database? corrupt filesystem? deadlocked system? security compromise?
I won't overclock a CPU for work, period. (For fun, OK.) Intel knows which logic paths are timing-critical in their billion-transistor chip. I'm sure that if a significant fraction of the dies tested OK at 5% faster on the critical paths, Intel would sell them specified for running 5% faster. Fact: they don't. Because Intel knows this chip doesn't work 100% reliably above that speed. Maybe you'll never crash into the invisible wall, but logically it must be there.
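If you must run important numbers on hardware you don't fully trust, one cheap defence is to compute critical results twice by different routes and compare — a sketch only (function names and figures invented), and it catches intermittent flakiness rather than errors that repeat identically:

```python
import math

# Compute a critical result twice, by two algebraically equivalent routes,
# and refuse to carry on if they disagree beyond rounding.
def new_salary(salary, rise_percent):
    return salary * (1.0 + rise_percent / 100.0)

def new_salary_alt(salary, rise_percent):
    return salary + salary * rise_percent / 100.0

def checked(salary, rise_percent):
    a, b = new_salary(salary, rise_percent), new_salary_alt(salary, rise_percent)
    if not math.isclose(a, b, rel_tol=1e-12):
        raise RuntimeError(f"arithmetic disagreement: {a} vs {b}")
    return a

print(checked(34387.00, 1.0))   # 34730.87 - or an exception if something is off
```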
Re: The real problem
have you ever tried running Win 98 SE under VMware on state of the art hardware?
The OS was / is crap, but it sure boots impressively fast!
OT - skyscrapers
You might be surprised to know that most tall buildings have only one support, right in the middle. Everything else is cantilevered off this core. These days, the core has to be designed to be proof against airliners colliding with it and large (I don't know how large) explosions.
Re: Bullpat @JDX
I don't believe that Intel or any other CPU manufacturer would knowingly ship CPUs where getting the right result from any particular operation was, by design and testing, only probable rather than certain.
Of course, there's a thermodynamically large set of states and they cannot test all of them. They do, however, have access to the CPU simulator, and the ability to probe the actual signals at the surface of the die to validate and calibrate it. They therefore know which transitions are speed-limiting, and can design their tests to exercise these in particular. If they don't sell a faster version of a particular die, it is fairly likely that they *know* that for this chip and at that speed, at the maximum operating temperature and worst in-spec chip power supply, there is at least one instruction sequence that is very likely to fail.
Overclocking a game is one thing. Overclocking a financial, scientific or engineering model is quite another. Don't. It's more important that the results are right and the system reliable, than getting an extra few percent of speed.
Re: Laptops more reliable than desktops?
A laptop has a built-in UPS (the battery and charger) rather than crashing if the mains supply glitches. A laptop often has a slower CPU than a desktop, and a slower cooler hard drive. These may tip the balance, depending on what exactly is being measured.
The desktops last longer, but that's often because the laptop's RAM can't be upgraded enough to make it worth keeping, or because it's too expensive or too much hard work to replace its keyboard after something gets spilled into it. Laptop displays are also harder to fix (desktop: throw away the monitor and plug in another one). And of course, desktops don't get dropped onto a hard surface nearly so often.
Re: And as for a Unix Server
I've seen that sort of reliability from desktop PCs crunching numbers. No unscheduled downtime other than those caused by the electricity supply, up until the day that it was decided that a newer system would make better use of the electricity. They weren't even required to be quite that reliable, they just were!
Running Linux, of course. And I'm sure that your IBM's disk subsystem was taking a much greater pounding.
Overclocked vs. Flat-out
An overclocked CPU is a CPU running outside its specification. It's been tested by the manufacturer (who knows the weakest spots w.r.t. timing) at a particular speed and may well have been found wanting at a higher speed. It's blindingly obvious that an overclocked CPU may not be working 100% correctly, and can only be recommended to someone who cares neither about correct results nor about reliability. A gamer, maybe.
Flat-out, on the other hand, should not reduce reliability. With modern CPUs there is a feedback loop to slow down the CPU when the chip temperature limit is reached. I work in an environment where desktop PCs are crunching numbers 24x7 most days of the year, and our desktop systems don't seem noticeably unreliable. By far the commonest failure is a PSU fail, followed by a hard disk fail. Failed CPUs are as rare as hen's teeth and failed MoBos only slightly commoner. In the old (Athlon) days when a CPU didn't slow down and could actually overheat until the heat crashed it, failed CPUs were also as rare as hen's teeth. I'd vacuum the heatsink, replace the fan, and the system would happily reboot and last as long as any other. Too high a temperature slows down the logic gates in the CPU, until it's the equivalent of a CPU that's overclocked one notch too far, and crashes.
Oh yes, and always run memtest overnight on a machine that's randomly unreliable. Low-incidence errors on RAM will do that. It's why servers (and serious engineering workstations) have ECC RAM. If memtest crashes rather than reporting errors, suspect your power supply first (you may or may not see the problem with a DVM).
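memtest runs from boot media, but the basic idea is easy to sketch in userspace — far less thorough, since the OS, caches and the allocator all get in the way, and slow in pure Python, but it shows the write-pattern-then-verify principle:

```python
import array

# Crude userspace soak test - nowhere near as thorough as memtest86, which
# runs bare-metal. Sizes are deliberately small.
def soak(megabytes=64, passes=2):
    words = megabytes * 1024 * 1024 // 8
    buf = array.array('Q', [0]) * words                  # 64-bit words
    errors = 0
    for p in range(passes):
        pattern = 0xAAAAAAAAAAAAAAAA if p % 2 == 0 else 0x5555555555555555
        for i in range(words):
            buf[i] = pattern                              # write the pattern
        for i in range(words):
            if buf[i] != pattern:                         # then verify it
                errors += 1
    return errors

if __name__ == "__main__":
    print("mismatches:", soak())
```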