Is that chart inflation-adjusted?
If not, it is telling a very different story!
Need a thumb-sideways icon.
As long as Android stays open-source (GPL) then the world still has access to source code for mobile-phone hardware. So if someone decides that they want to port Linux onto hardware on which Android runs, they'll have access to working copylefted code. Some parts of it will be reusable. Other bits won't be usable except as documentation, but at least it's documentation that's been tested and debugged on the hardware, rather than a figment of a document-writer or translator's imagination.
I once connected hardware to VMS, given the source code for interfacing it to MS/DOS. None of the code was portable, but the working source was nevertheless a great help compared to the (hopeless) English-language hardware documentation (translated from Japanese?).
And I think the jury is still out on whether one O/S (Linux) really can scale across everything from a mobile phone to a datacenter cluster. The phone environment is one where you pay (in battery life or weight) for inefficiency in the software, and maybe the linux kernel represents too much of a handicap. Or maybe not. Time will tell.
I never use Firefox without both Flashblock and Adblock Plus. Adblock makes sure that the buggest(*) source of potential exploits (bogus adverts served into other people's legitimate pages) never makes it onto my system. Flashblock means that all Flash animations get replaced by a logo, which I click if I want to view them. That takes about half a second. Most of the time I don't want to see them, just read the text, and it probably saves those half-seconds ten times over in bandwidth-wait.
(*) a typo, but I like it.
At home I have a hi-fi FM receiver and I can also plug my DAB receiver into the same amplifier. There's a good enough DAB signal that I don't get any drop-outs or burbling, and since it's a digital medium I believe that what's going to the amp is a good rendition of what's transmitted.
On speech I can't really tell much difference.
On classical music, I most certainly can. DAB introduces highly objectionable, non-harmonic distortions. When the source is a solo instrument or a small ensemble, it is actually quite painful to the musical ear.
The problem is partly that all lossy digital encoding introduces tones that are not harmonically related to the music, but mostly that the DAB codec was obsolete many years ago. Interference on FM, in contrast, adds extraneous noises that one's mental listening process is very capable of filtering out. Over-compression by the broadcasters (or insufficient bandwidth) may be another contributory factor.
The only reason I haven't thrown away my DAB radio is that it's great for listening to BBC World service, compared to Medium Wave. They don't transmit World Service on FM. Sigh.
DAB is a technological mis-step, like the twelve-inch video disk or ECL logic. Time to start again.
It needs a decent codec. Preferably, one that can be upgraded across the airwaves, should a Mark 3(?) codec become preferable at a future date.
It needs much better error recovery, especially for use in moving vehicles. My suggestion would be: as well as the standard broadcast multiplex, have a highly compressed version of the broadcasts available on a second multiplex with a five-second delay (and maybe a third one delayed even more). In a moving car, play with a delay. If the primary data stream drops out, splice in the low-resolution secondary or tertiary broadcast. You'll hear some transient distortion rather than a total drop-out. Just like FM!
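The fallback idea above can be sketched in a few lines. This is only an illustration of the splicing logic, assuming a simple frame-per-timestamp model; the names and structures here are made up for the sketch and are not part of any DAB specification.

```python
# Sketch of the suggested receiver strategy: play with a delay, and when a
# frame of the primary stream is missing, splice in the matching frame from
# a delayed low-bitrate secondary (or tertiary) multiplex.
# The frame-per-timestamp model and all names are illustrative assumptions.
def choose_frame(primary, secondary, tertiary, t):
    """Return the best audio frame available for playback time t, or None."""
    for stream in (primary, secondary, tertiary):
        frame = stream.get(t)
        if frame is not None:
            return frame     # highest-quality stream that survived at time t
    return None              # total drop-out on every multiplex

# Frame 1 of the primary stream was lost in a tunnel:
primary = {0: "hi-res-0", 2: "hi-res-2"}
secondary = {0: "lo-res-0", 1: "lo-res-1", 2: "lo-res-2"}
print(choose_frame(primary, secondary, {}, 1))
```

The listener hears a brief quality dip at frame 1 rather than silence, which is exactly the graceful degradation FM gives for free.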
And it needs battery-powered portables that don't eat batteries any faster than an FM radio of the same audio wattage.
If the next version of DAB can't do better than FM for car radios, with respect to both error-tolerance and broadcast coverage, just don't bother. Kill DAB and keep FM. Frankly, if you can't sell it to the auto industry for all new factory-fitted radios, then return to the drawing board.
It would almost certainly be grounds for damages to perform a destructive shut-down that destroyed the user's data. It might even be illegal in criminal law. However, merely refusing to boot while leaving the user's data accessible to third-party tools (or a paid-for licensed Windows) is probably OK (IANAL).
Maybe compare shredding someone's tyres when they park where they shouldn't, to applying a wheel clamp and demanding a license, sorry parking, fee? Incidentally, the key to whether wheel clamping is legal is whether you entered into a contract with the owner of the parked-on land permitting the enforcement. The presence or absence of clear notices stating the rules determines whether you did. So it's quite a good comparison. You certainly couldn't have missed Microsoft's notices.
I know nothing about the (ancient?) ICL case, but unless Microsoft have done something very nasty to NTFS in the RC, there are plenty of third-party tools (including some freeware) that are capable of mounting a disk containing the RC and user data, and copying the latter onto your choice of another drive or a network server.
I'd be surprised, even dismayed, if the courts took the view that Microsoft was denying the user access to his data, in light of this, and the numerous disclaimers the customer agreed to at install time, and the numerous advance warnings that the customer will have ignored before the system finally won't boot.
Does "removed" in the ICL case mean physically removed? I can see why the courts came down hard on a company that was effectively "kidnapping" the user's data and holding it to ransom! I think this must be my first post when I'm unreservedly on Microsoft's side.
When you spend a few hours at 35,000 feet, you have volunteered yourself for a significant extra dose of cosmic rays by putting yourself above most of the atmosphere. Against this, the dose you receive from scanners on the ground pales into insignificance. One source quoted one hour at altitude equal to forty scans.
In context, several hours at altitude every working day does not give rise to a detectable increase in cancer deaths amongst flight crews. One can calculate theoretical numbers of deaths caused, but there is a much higher number of cancers not caused by radiation.
How does knowing that some flunky is looking at your genitals do you any harm? Especially since he doesn't know whose genitals he is looking at.
And probably, he isn't looking. This is the sort of thing that computers do better than humans. Unless you are carrying something that you shouldn't be, chances are good that no human actually looks at the picture of your genitals at all.
And in answer to the original question, it reduces embarrassment, which might reduce the chance of a passenger suffering a heart attack in the departure lounge by some immeasurably small fraction.
100 hours flying time = 0.4 millisieverts = 400 uSv, so one hour equals 4 uSv, which is forty times the scanner dose. Source: http://news.bbc.co.uk/1/hi/health/557340.stm Therefore, if you are worried about the dose from the scanner, you won't be flying at all because what nature throws at you at 35,000 feet is 40 times worse. Per hour.
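The sum checks out; here it is laid out explicitly. The 0.1 uSv per scan is the figure implied by the quoted 40:1 ratio, stated here as an assumption rather than a measured value.

```python
# Dose arithmetic from the BBC figure above. The scanner dose of 0.1 uSv is
# the value implied by the quoted 40:1 ratio, assumed here for illustration.
mSv_per_100_flight_hours = 0.4
uSv_per_flight_hour = mSv_per_100_flight_hours * 1000 / 100  # 400 uSv / 100 h
uSv_per_scan = 0.1

print(uSv_per_flight_hour)                   # uSv per hour at altitude
print(uSv_per_flight_hour / uSv_per_scan)    # scanner doses per flight hour
```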
Incidentally, there are good reasons to think that at natural-background doses, the body has effective repair mechanisms, and so the cancer rate should not be affected by such doses. There are even a few hints that abnormally low exposure to radiation might even be damaging. The effects are so small that the arguments cannot be resolved by statistics, even of large populations.
Anyway, one can reasonably argue that flying, with or without scanners, causes a fair number of cancers each year. But also, a statistically undetectably small number hidden amongst all the other cancers. The excess can't even be detected in flight crews. And don't forget, you are free not to fly.
To me, the point is that the terrorists on 9/11 got through security weaknesses that were then wide open. Principally, that the pilots did not know of or believe in suicide terrorists, and opened their flight deck door when "forced" to do so by threats to the cabin staff and passengers. Second, that the terrorists were able to take box cutters in their hand baggage. There were others (such as insufficient or failed vetting of trainee pilots).
Those weaknesses are now closed.
Millimetric waves don't penetrate the womb. It is believed that they don't pose any risk to the mother, either.
Anyone getting into an aeroplane is about to volunteer themselves for a significant extra dosage of cosmic rays compared to someone remaining at ground level. If a well-informed pregnant woman is worried about being scanned, she would not be flying in the first place, because the cosmic radiation "risk" is well-known and measurable. (And small compared to the terrorist risk).
I'm normally strongly against government efforts to acquire information about us, to scare us and control us. But I really can't see any harm in them finding out what I look like naked in silhouette, even if they do decide to store the image for posterity. Look back at my posts about ID cards, for example (which they say will help stop terrorists, but which are of course utterly useless for that purpose, or almost any other).
As for the rights of the many and of the few - well, to a large extent democracy is a system whereby the majority can decide on laws that are imposed on the minority. Certainly, I agree that there are some rights, such as the right not to be tortured, or the right to life, that are fundamental and should not be up for grabs by politicians and lawmakers. I do not agree that the "right" to not have one's naked silhouette viewed by security staff is one of these fundamental rights.
Airports are a special case because a bomber can do so much harm with so little explosive. Compare the death toll for the Madrid rail bombings. Ten bombs on four rush-hour trains claimed under 200 lives, despite being suitcase-sized rather than underwear-sized. That's a large part of why it's necessary to scan at airports, and why it's at present impractical for other transport. We'll have to take our chances on the tube.
Suicide bombers smuggling bombs in body cavities? Maybe we'll soon have to allow X-ray imaging of travellers as well. How does the X-ray dose necessary to detect a bomb in a body cavity compare to the extra cosmic-ray dose which we all get from being in a plane at 35,000 feet or above? Flight staff suffer that cosmic-ray dose for several hours every working day, and aren't obviously any more prone to getting cancer than the general public. Compared to which, X-raying the passengers might be acceptable to me on safety grounds (and again I'd not object on any other).
It's surely a pretty fundamental human right not to be murdered.
If there were an airline that required me to strip naked in front of security staff before boarding, I might actually regard that airline as preferable to the others. A trade-off between a little embarrassment, and a greatly reduced chance of there being a suicide bomber on board.
The choice should be between the scanner, and stripping off in front of security staff of the same sex. The human rights of a hundred-plus passengers who don't want to die should override the human rights of one passenger who refuses security measures which the hundred-plus accept.
But on the other side of this argument, we have MI5 and MI6 and GCHQ (to name a few agencies that are supposedly on our side), and numerous corresponding entities in other countries that aren't.
They're specialists in turning information into knowledge (or "intelligence"), which might not bother you, if you have nothing to hide. They have truly huge computer resources. They'll have AIs before the rest of us know that such AIs exist. But they're fallible. The first post on this thread, where the Google executive gets labelled as a paedo because someone of the same name uses the same library, is what you have to fear even if you have nothing to hide.
And if instead you get tagged as a potential suicide bomber as a consequence of a data-mining inference error, the first you know about it might be a bullet in your head. Please remember Jean Charles de Menezes before you respond. (Yes I do know that time was not a computer error).
Google wants to "do no evil", but they'd do well to remember what the road to hell is paved with.
The point of x64 is that a single process (or application) can utilize more than 2Gbytes of virtual address space. In certain types of application it is possible that >2Gb of VA space can be mapped onto less than 1Gb of physical RAM without the system paging itself into catatonia. So it can occasionally make sense to run x64 on a system with 1Gb RAM, and it almost always makes sense if the system has 4Gb.
Another reason is if you are developing 64-bit applications on a smaller box. They never allocate >2Gb in your development environment, but let them loose on the big iron with a heavier load or model, and then they will.
A third reason is if you run VMware player and *ever* want to boot a 64-bit guest O/S. Incidentally, VMware itself takes advantage of certain VM support available only in x64 mode, and allegedly runs faster on x64. (I've not tested this assertion).
4Gb systems are only one step up from the sensible default these days, and are probably set to become standard pretty soon.
An anecdote. A rich man was once driving his Rolls-Royce through rural France when he hit an enormous pot-hole and horrible noises started coming from the car. At the next village garage, the mechanic diagnosed a failed rear axle and contacted Rolls-Royce. Their response was that they would be flying out a mechanic with the necessary spare parts, and the car should be fixed by midday the next day.
When the man returned from his travels, he did not receive an invoice. After some time, being an honest man, he contacted Rolls-Royce about the missing bill. Their reply was short.
"We have no record of the rear axle of a Rolls-Royce Silver Shadow ever having failed."
I suspect that Microsoft likewise prefers not to know of any un-fixed security-critical issues in their systems.
Vista was shite. I won't repeat my reasons for saying that. But to re-deploy your car analogy, a car can be complete shite even if the engine under the bonnet is a good one. That may apply to Vista, if the engine (kernel) of Windows 7 is substantially the same. Microsoft have done a substantial re-design on the bodywork, and they've tuned the engine a lot better. They've now arrived at a package that won't cause every purchaser to tell his friends that it was a big mistake and he should have bought an Apple.
I'd still have preferred it if they'd kept the older model's bodywork, just replaced the engine and incrementally improved the other details. But as with cars, maybe that wasn't possible. As for Vista, I think even Microsoft would now prefer to relegate it to the history books as fast as possible. Even if it's true that Windows 7 is Vista SP2 (in much the same way as XP is NT SP 12 or thereabouts).
Anyone who has ever gone swimming in the sea really shouldn't worry about slight imperfections in the pee recycling system when showering. Think about all those fish ... and that's before we add our own effluents.
And last time there was a drought in London, they told those of us living in West London not to reduce flushing our toilets, because they needed the "water" in East London ....
I'll drink to it, but not sure if I'd want to drink from it.
The number one problem: as with Windows Vista / 7, they've thrown out the user interface that Office users grew up with on all versions up to 2007: a menu bar (with incremental "improvements" at various upgrades, but never intrusive enough that one couldn't adjust as one went). It was insane not to offer an Office 2003 user-interface compatibility mode.
It's also significant that very many Word users know nothing about computers and (dare I say it) are not always particularly motivated. They're secretaries or call-centre staff, or all sorts of people who hunt-and-peck at a keyboard who would prefer not to have to.
As others have said, the best "upgrade" may well be OpenOffice: more familiarity for these users, at zero cost.
Paris, scrutinizing a ribbon bar. Ooh-err.
> a computer would be magical indeed if its GPU was good enough to exclude the need for a CPU.
The future will probably be CPU/GPU integrated on one chip.
The question will be whether it's the CPU core that does the heavy-duty number-crunching, or the GPU core. Further down that road, whether the two merge into each other, with a bunch of processing pipelines that can be dynamically grabbed by whatever needs them. AMD's recent "modular CPU" architecture is rather begging that question.
AMD owns ATI. I can see why Intel might want to hold this trend back, if for whatever reason they can't buy NVidia. (Could NVidia/VIA be what Intel's paranoia is focussed on? )
More likely, nobody who knew that there were patches that needed to be applied, or perhaps nobody who appreciated that the consequences of cutting this particular corner would be certain catastrophic failure. If someone had appreciated the need, I'm sure IBM could have supplied a man to do the work (for a price).
They'll be able to catch up on 12 hours downtime over Xmas. Worst case they pay people to work on Xmas day, though I'd guess that one of the other bank holidays will suffice. That bit of the story makes me think that they let go the man who did the capacity monitoring and planning as well, or deliberately cut everything too close to the bone to postpone the inevitable during the run up to the complete failure of RBS as a bank. Fred "the shred", remember?
Idiots hate people who are more intelligent than themselves. Most bosses are idiots. This goes a long way to explain not just this story, but the entire banking crisis of 2008-9.
Yes, she should be treated in the same way as her proposed laws would have the rest of us treated.
She should also be fingerprinted and DNA-sampled, and that information stored in the police computer for the next N years (even if she's never charged with anything). See how she likes that. Especially after organised crime makes off with the contents of the ID, fingerprint and DNA databases, and DNA synthesizers become mainstream criminal technology ... all within the next decade, I expect.
No-one can claim exclusive rights to Nexus, it's an English word. From the online dictionary:
1. a means of connection between members of a group or things in a series; link; bond
2. a connected group or series
But maybe a bit hubristic for Google, implicitly claiming to be the first connected group or series? I'd have awarded that honour to Racal, later Vodafone, for inventing the mobile.
They don't normally sue, until they've exhausted all other avenues for obtaining license compliance. Most such violations are unintentional, and the offending party is persuaded to release their source code once they have the violation drawn to their attention and have taken legal advice.
Methinks the organisations named are being particularly difficult, either out of pigheadedness, or because their firmware contains something that they want to hide (Spyware? Customer-monitoring-ware? Closed-source DRM? ) that they've linked into a GPL-derived work.
She thinks she's joking?
As soon as her finger is the key to anything of value, there will be people out there performing amputations. It's already happened in South Africa, where a businessman had his finger hacked off when the carjackers found out that his Merc was started by fingerprint, not by key.
(He was probably lucky it didn't start by retina scan)
DNA is far worse than you realize. Your DNA may well turn up on the knife sticking out of the victim.
1. Because you handled it in the shop months earlier
2. Because the murderer has decided to muddy the waters by wiping the knife's handle around the rim of a used pint glass purloined from a pub where you were drinking
3. Because in 20 years' time the national DNA database will have "leaked" into the hands of organised crime, and DNA synthesizers will be available to the same criminals
4. Like 3, except you are not a random DNA sample, but someone that they have it in for. A witness, maybe, so no alibi, and of course you would be claiming some other dude did it ....
Fingerprints are even easier to plant: all you need is a fingerprint or a copy thereof, plus some amateur photography, electronics, and DIY kit. Photo to etched circuit board to silicone rubber on the fingertips of some rubber gloves in under an hour.
Nothing to hide, nothing to fear ... NOT.
It depends how much of the design is just replication of old, fully-debugged stuff. If this is a bunch of P3-Celeron cores connected together with logic that's already well-tried on multi-core CPUs, it may be quite easy - little more than a matter of "joining the dots" with an interconnect layer. The more novel logic is needed on the chip, the more expensive it gets to develop.
The thing I'm wondering is why they don't or can't integrate some RAM alongside each core, because bandwidth between cores and RAM will be a bottleneck. Anyone know? Maybe the appropriate silicon process for RAM is incompatible with the process for CPU cores?
Are we sure that paedophiles haven't already written a stealth child porn distribution virus? After all, they are the ones who will benefit if half the world's PCs' web caches become polluted, and the "it wasn't me, it was a virus" defence is thereby bolstered?
I seem to recall reading in the early days of computer viruses, that a Bulgarian dissident wrote an evil capitalist propaganda distribution virus, in the hope that the secret police would then implode in self-incrimination and doubt. Which subsequently happened, though for different reasons.
Let's see: a telly is on maybe 4 hours/day and in standby 20 hours/day. They're saying it has to use 50 watts less while it's on, saving 200Wh/day.
Now consider the standby consumption of a telly. And a DVD recorder. And a satellite tuner. And a games console. And a PC. And a Printer. Alarm clock radio. Audio system. Digital radio. Microwave oven. Duplicate systems used even fewer hours/day in bedrooms, guest rooms, etc.
My home power meter shows that standby power consumption is usually over 1 watt and often over 3, per device. Let's say 10W for the telly stack. 10 x 20 hours = 200Wh, again.
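The two back-of-envelope figures come out identical, which is the whole point. Here they are side by side; the hours and wattages are the illustrative values from the text, not measurements.

```python
# Daily energy saved by the mandated reduction in "on" power, versus daily
# energy wasted on standby. All figures are the illustrative values from
# the text above, not measurements.
on_hours, standby_hours = 4, 20
mandated_on_saving_W = 50    # the "50 watts less while it's on"
standby_stack_W = 10         # assumed total for the telly stack on standby

saving_from_mandate_Wh = mandated_on_saving_W * on_hours   # Wh per day
waste_on_standby_Wh = standby_stack_W * standby_hours      # Wh per day
print(saving_from_mandate_Wh, waste_on_standby_Wh)         # → 200 200
```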
This is complete and utter waste. The TV's remote control does standby for several years on a couple of AAA cells! They should have mandated that no device is allowed to consume more than a milliwatt when in stand-by, and not bothered about what it uses when it's on. Electronically, that would mean disconnecting the PSU completely from the mains in stand-by, and running micro-power electronics off a rechargeable battery or large capacitor, which would recharge every time "on" was commanded. Easy to do that with an optically-coupled triac.
Once California had forced the manufacturers' hands into producing hardware that can stand-by on microwatts instead of watts, it would not be long before that same technology became universal, saving vast amounts more power all over the world. Cheaper, then, to make just the one "greenest" version, rather than continuing to produce a different version to waste more electricity.
That should be: rogue ATMs which *are* skimmers. ATMs operated by organised crime.
I worked this risk out very many years ago, and never use an ATM that's not installed in the wall of a bank. Who knows what modifications have been made to the innards of the ones in service stations and clubs?
Irving is an obnoxious nutter, but he has the same rights to privacy and freedom of speech as the rest of us.
It's sometimes justifiable to break a law for the greater good, especially if you are willing to publicise having done so and take the legal consequences as a form of protest. Does this apply here? Doesn't look like it. If they'd found a smoking-gun e-mail which made it clear that he doesn't actually believe what he's preaching, that would be different. Having not found any such, they should have quietly gone away.
"First they came for the communists, and I did not speak out—because I was not a communist;
Then they came for the socialists, and I did not speak out—because I was not a socialist;
Then they came for the trade unionists, and I did not speak out—because I was not a trade unionist;
Then they came for the Jews, and I did not speak out—because I was not a Jew;
Then they came for me—and there was no one left to speak out for me."
Today, "they" - western governments - have designs to record all our e-mails and phone calls. If these "hacktivists" are not roundly condemned for their actions regardless of the nature of their target, it plays right into these governments' hands. It'll be your e-mails next, and then other hard fought-for freedoms. And that is the slippery slope to the sort of government that only an Irving could want.
It really doesn't matter as long as it's ever so slightly bigger than zero.
If a spammer sends 150 million spams and gets just one idiot who responds with details of a bank account from which he steals £1000, that's a win for him. Especially if the bandwidth consumed by the spam was paid for by someone else, which is the case if it's sent out by compromised PCs on someone else's net. Ditto if he gets 30 orders for blue chalk-and-rat-poison pills at £33 profit on each.
I've read a theory, that this is why most spam is so lame. They don't WANT to snare any moderately intelligent people who might then create significant "heat" for them. They don't want to be sufficiently plausible to a man on the Clapham omnibus, that truly capable geeks with a hero complex set out to lure them into the arms of law enforcement. They want to ensnare only drooling idiots, who won't have a clue what to do next after getting themselves robbed or conned. This is why, for example, spammers hardly ever put their spam through a spelling and grammar check program.
It occurs to me that there is a solution for the neighbours trying to sleep, but being disturbed by vocalisations, that would cost a lot less than soundproofing.
Get a white noise source. 50dB of white noise will sound like silence after a few minutes, but will mask other quieter noises. It's sometimes used in open-plan offices to reduce the disturbance caused by conversations at other desks. Another example is the seventy-plus dB of fans in our server room - doesn't sound noisy when you are working in there, until you discover that the only way to have a conversation is to shout.
Those "natural noises" CDs (breaking waves, windswept plains, babbling brooks, etc) work on a similar principle - some people find them more relaxing than white noise. (The brain treats white noise as silence, and complete silence can itself be disturbing, so add the texture of a natural soundscape).
Wish I'd known this back in my Barratt box days!
What else can I say?
I once lived in a Barratt box. The wall between my bedroom and my next-door neighbours' was made of paper, or so it seemed. It was most embarrassing to hear, while trying to fall asleep, every intimate detail of their failing marriage.
These days I live in a Victorian flat with a proper double brick wall between me and nextdoor. It works.
Someone buy that couple a few boxes of high-grade acoustic tiles and a double-glazed window for their bedroom!
The problems get less if you can bring in more power from further afield. For this we need a higher-capacity electricity grid spanning Europe, North Africa, and the Middle East. Thinking about wind alone, it's unlikely to be calm everywhere simultaneously. Then throw in tidal power (available only on coastlines, with its own periodicities different from the weather's) and solar power (flat calm in Southern Europe usually means peak sun, though it often means fog in the North, where winter sun is in any case scarce).
Pumped storage can also mean compressed air in large salt-dome caverns. Other forms of energy storage include hydrogen (from electrolysed water) and flow batteries. If the energy one is storing is fully renewable, then inefficiency matters only in economic terms.
I can't help thinking Wind power is ultimately a distraction. We ought to be building that grid, and investing in solar power generation in Southern Europe and the Sahara. In the UK, we need to build the biggest version of the Severn barrage - 15% of our electricity requirements from the tide.
It's not just NVidia crashing Intel's party, it's VIA as well - the latest VIA Nano 3000 CPUs look very promising on paper.
Is it Intel, or just imperfect journalism, saying that the two-core Atom is for desktops (i.e. is there something preventing its use in notebooks and netbooks)?
They could make installation far quicker and easier AND save some cost on packaging, if they re-engineered this SSD as a PCI card. (The bus connection would be for power supply only. Data via SATA. Alternatively, leave the SATA power connector alone and make the bus "connection" mechanical only.)
Of course they'd still need to sell the disk-like variant for notebook systems.
I wonder which is the best mid-life kicker for 1Gb RAM XP systems with older (slow-ish) 40Gb and 80Gb HDDs - this drive, or upgrade the RAM to 3Gb? "Reg", it would be nice for all of us if you benchmarked this. The SSD has the advantage of being able to outlive the system you upgrade with it.