2369 posts • joined 10 Jun 2009
Oh, we know the nearby stars that will go supernova within a cosmic eyeblink. Betelgeuse (one of the nearest) will go sometime in the next 100 kyears or so. Other giants further away look even more unstable.
But a human lifetime is a tiny fraction of a cosmic eyeblink, and the exact timing of a supernova is probably not predictable to an accuracy of years (not until it passes the point of no return). The giant bloated star gets less and less stable, contracts, finds there's still something left to fuse, gets hotter, expands, wobbles on the edge like that for hundreds of kyears. Betelgeuse has contracted 15% since we've been able to observe its size (about a century). Maybe, just maybe, it really has run out of fuel this time!
(Betelgeuse is just far enough away that it won't do serious damage to the Earth when it blows. Only just far enough. Halve the distance and there would be Consequences. Halve it again and we'd be staring at a near-future extinction-level event)
(And spare a thought that if there is a $DEITY out there, maybe It is working really hard at making sure that there are no gamma-ray bursters popping in our direction in any "nearby" galaxies. If that is the case I'll thank It should I ever meet it, and I quite understand why It doesn't have any time for our smaller concerns! )
The galaxy is reasonably transparent in infra-red, it's just visible observations that would be blocked. (And of course it's almost completely transparent to neutrinos and gravitational waves ... read the article! )
And if you really want to boggle, google hypernova or pair-instability supernova. These are rarer events. It's hypothesized that a pair-instability supernova is such an efficient converter of many suns' worth of mass into energy that it wouldn't leave any black-hole remnant.
Buy HP stock?
We techies understand this better than most. If all goes according to plan, HP is a must-own investment.
Except, Memristor tech is about the only bright spot for what's otherwise an uninspiring behemoth of a company. What are the chances of HP coming to grief before it can capitalize on its Memristor know-how, or some eejit in HP management selling the immature technology for about a thousandth of what it will be worth long-term? Six years is a long time on the stockmarket.
(In my dreams, HP never bought Compaq, never divested its instruments division, maybe did divest its printers division, and it's the old HP now developing Memristor tech).
Re: Nice for the enterprise but what about for toys?
You mean other than 3TB USB3 hard disks?
I doubt they'll be devices in the first instance. I think with Memristor tech, the primary MRAM will be tightly integrated with the CPU and NIC, built onto the motherboard if not into the CPU assembly for bandwidth reasons.
(And we're going to need better than gigabit networking to the desktop! )
predicting the outcome of 5 more years of bleeding edge technological development seems fanciful.
Ever heard of Moore's law? Not quite up with the Higgs Boson, but still one of the most startlingly correct predictions of future progress.
Once CMOS and VLSI fabrication were working, Moore's law became inevitable. It was based purely on the physical scaling laws. Prior to CMOS, engineers talked about the "smoking hairy golfball" problem: a CPU with the sort of performance we take for granted today, would have to be the shape of a golfball because of the speed of light, hairy with wires connecting to it, and smoking because there would be no way to get many kilowatts of heat out of it.
But CMOS scales at constant power per area of chip, so Moore's law worked until the minimum size of a channel became limited by the discrete size of atoms. We're pretty much at that limit now, with Flash and Intel's FinFETs.
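That scaling law is easy to put in numbers. A quick sketch of the doubling arithmetic (the 1971 Intel 4004 starting point and the two-year doubling period are the usual textbook figures, not from the post above):

```python
# Moore's law as plain arithmetic: transistor counts per chip double
# roughly every two years. Starting figure: Intel 4004 (1971), ~2300
# transistors -- standard textbook values, used here for illustration.
def transistors(year, base_year=1971, base_count=2300, doubling_years=2.0):
    """Projected transistor count per chip under a simple doubling law."""
    return base_count * 2 ** ((year - base_year) / doubling_years)

print(f"1971: {transistors(1971):,.0f}")
print(f"2013: {transistors(2013):,.0f}")   # a few billion -- the right
                                           # order of magnitude for the era
```

Forty-two years of doubling is a factor of about two million, which is why the "smoking hairy golfball" never had to be built.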
Going back even further, you might want to search out the lecture given by Richard Feynman, "There's Plenty of Room at the Bottom".
Re: Biiiiiiig Changes
I suspect it's possible to store enough power in a supercapacitor to allow for the wiping of (say) the 16GB of MRAM that's playing the role of DRAM and wants to be wiped on power failure for security reasons. Certainly enough to secure-wipe a much smaller area dedicated to storing crypto-keys. OTOH I also wonder whether such a wipe can ever be proof against theft of the hardware and application of serious data-recovery hardware. (Really secure sites never let used hard disks off site. They smash them to small pieces, then dissolve the pieces in Ferric Chloride! )
Incidentally, if you worry about someone with physical access to the hardware, it's possible to recover a surprisingly large amount of information from DRAM minutes after power is removed. A data thief can usefully remove DIMMs from a system and plug them into a data-copier. (And then test a mere few billion possible bit-strings as crypto keys for stuff in the hard drive, with a significant percentage chance of success).
Re: "showed off a wafer"
Probably it's some sort of failure. During R&D the yields of working product are often low, to say nothing of the experimental designs that don't work at all.
Re: O RLY?
The difference is that this one is working in the labs and there aren't any obvious reasons why it can't go on to development and production. If we didn't already have flash and DRAM it would already be on the market. Because it's got to compete with established technologies, it'll have to stay in the labs until it's a demonstrably BETTER product for at least one value of better. During which time we'll all gain confidence that it doesn't suffer from any unexpected premature ageing processes.
Things could still get delayed. It's even possible that there's an iceberg out there. But my expectation is that this will be the biggest breakthrough since CMOS.
There's less reason to customize a PC as systems become more and more integrated (and Intel's on-board graphics become sufficiently capable for a greater range of tasks). This plays into the hands of Dell and suchlike who have economies of scale, and against companies like RM who build exactly what you want.
Not sure how you define a UK system builder, but rackservers.com is UK-based and also sells custom desktop PCs.
Re: we don't test for smell
10 Scared skunk
11 Thioacetone monomer [ http://pipeline.corante.com/archives/2009/06/11/things_i_wont_work_with_thioacetone.php ] (A must-read. Would you believe that less smells worse? )
Re: I hope this means their quality won't drop even further...
It may be too early to say SSDs are more reliable. That they less often die a week after purchase is true, but that just reflects no moving parts. The other side of the coin is that an SSD may be far more likely than a hard disk to go from AOK to brick "just like that" with no warning of any sort. I'll have a better idea 3 years hence, because ....
We're now specifying SSDs for desktop systems where the user stores nothing locally. However, for storing multiple Terabytes, there's no alternative to using an HD (or an array of HDs).
Re: Quality drop
That has NEVER happened to me with a Maxtor or a WD.
Wouldn't have bothered to say "batch problem" yet again if you hadn't mentioned that defunct manufacturer Maxtor. Theirs were the only drives that I am convinced were truly defective by design, not just one batch. I suffered multiple failures on multiple sizes and models of drive purchased over a 2-3 year spread and decided never to buy another of their drives if I had any choice. That their finances collapsed and they were bought up for a pittance shortly afterwards confirmed my opinion of them.
And now I find someone who thinks Maxtor was wonderful. Well I never.
Re: I hope this means their quality won't drop even further...
Funny: I've seen exactly the same comment about Seagate drives. And HGST and Toshiba drives.
What's really going on is bad luck and (especially) bad batch problems. EVERY manufacturer will occasionally and inevitably ship a batch of product containing one of a batch of faulty components. If you buy a batch of drives, or a batch of PCs containing drives from the same batch, then if you get two or more premature failures in the batch, you should assume that the whole lot are likely going the same way. In particular, complain in writing during the warranty period, requesting that all the implicated drives are swapped. The vendor will almost certainly refuse, but should your fears be realized you can later prove that the merchandise was not of saleable quality, and that you pointed this out BEFORE the warranty expiry but were given the brush-off.
No. It can't sound better. The codecs used in DAB introduce non-harmonic distortion into an audio signal. For music, this is ghastly. Far worse than hiss and crackle, far worse than mere harmonic distortion.
why don't they just plan to retire dab in the long term and replace it with internet streaming?
Coverage and bandwidth. Do you think 4G will ever be available on unclassified roads in deepest Dorset or the Grampians, five miles from the nearest hamlet? (And even if road coverage were to reach 100%, what about farmers and walkers miles off any road). Do you think there will be enough bandwidth to support any encoding that doesn't introduce so much non-harmonic distortion as to turn music from a source of joy into a source of pain?
Re: DAB Bashing
Perhaps someone will come up with a converter box that rebroadcasts a single channel on an FM frequency, over an area of a few tens of square metres.
But it'll sound even more crap than a pure DAB radio. And it'll do nothing to address DAB's other weaknesses. It won't get you a signal that doesn't keep dropping out on the move (especially in rural parts), and it won't give you the battery life of an FM radio if you want to listen somewhere that a mains or automotive electricity supply is not available.
Re: Fine without the possessive
Well, what would normally be meant is "the users who I have primary responsibility for supporting" as opposed to some other group of users. In exactly the same way, a manager is likely to say "my staff" meaning the part of the organisation for which he has primary managerial responsibility.
Why does it annoy you so much?
Re: Great expectations...
But why use the PI to do it?
Because - see "androgynous cupboard" below - it's cheap enough that if you break it, you don't need to cry. A new one won't break the bank. That's particularly important if you are interested in tinkering with hardware.
But even if "all" you have done is accidentally nuked the hard disk of a PC while trying to repartition and install Linux alongside Windows, there is lots of work to get it back to how it was and maybe lots of data you will never get back. Lots of expense also, if you didn't appreciate the importance of making some factory restore DVDs on the day you unpacked it (assuming you did get to unpack it yourself). If it wasn't strictly speaking your very own PC, but a shared resource, people will be blaming you. Not a nice place to be, especially not for a kid.
Re: Mind your language
Personally I'd say that teaching any language that needs a compiler as a first language is misguided and verging on cruel. Any scripting language is better simply because there's so much less of a learning curve to climb. No edit-compile-run infrastructure. No debugging of address / overwrite errors. If you're unsure of something, just try it in immediate mode at the command line and see what happens. Python is particularly good because it is also a very well-structured language (c.f. BBC Basic or, gag, Perl).
AFTER a kid has mastered Python is the time to explain that compilers generate code that runs maybe three times faster if you have a problem for which that actually matters, and what you have to give up in order to gain that speed. As computers get faster and open-source Python libraries of other people's numerical codes expand, such problems get rarer.
If a kid can't "get it" even in Python, a career in something other than programming beckons!
Re: About the cost.
You also need a case, a decent display with HDMI input, a power supply, some SD cards, a USB hub, a keyboard, a mouse, a network cable long enough to get to the ADSL modem, and a fair amount of desk space.
Don't need a case (or can improvise with cardboard or grown-out-of Lego). Keyboard and mouse can probably be scrounged for free or liberated from a council tip. Display maybe likewise, or jack it into any half-modern TV. SD card, you can probably scrounge an old small one from anyone who does photography, or source it for pennies on eBay. Desk? No, floor or lap will do if nothing else available. Next to the router, if needs must and you really can't get hold of a long enough network cable.
The rest of the argument is a little bit stronger. Low-level programming is always going to be a minority interest. But now it's no longer restricted to a minority of a minority - wannabe hardware hackers no longer need parents who can buy a PC and unusual interfaces, and another one, two or more when your kid's attempts at homebrew electronics let the magic smoke out.
If you have a kid with hacker nature, the kid will be like a duck in water and you won't have to explain anything much, just answer the questions and encourage him. If your kid doesn't have hacker nature (probably 19/20 kids), trying to teach, explain to or encourage her will just put her off programming for life.
Re: How friggin awesome is evolution!
It can't be evolution - if the poison kills you, then you can't breed in a protection against it, therefore this is 100% proof that evolution does not exist.
I should think that how this evolves is something like this.
To start with, the mice attacked scorpions the same way mongooses attack cobras. Very carefully. They're faster and smarter than their prey, so mostly they get to eat it rather than die. The less nimble ones get dead more often, and the species gets nimbler. Note that all hot-blooded species have a fundamental advantage over cold-blooded ones in the morning, before the cold-blooded ones have a chance to re-warm their bodies from the overnight chill.
Sometimes when battling or eating the prey, a small amount of venom gets into their bloodstream. Enough to harm but be survivable. Genetic variability means that for the higher doses, some survive and some don't. Gradually, genetic tolerance of the venom gets bred in. When a sting ceases to be lethal 100% of the time, risk-taking mice gain an evolutionary advantage over the original extremely risk-averse mice.
And so by slow and gradual evolution, you arrive at mice that laugh at scorpion venom.
I'd guess that the final stage is that mice which feel less pain when stung have a slight advantage over ones which do. Pain is distracting, so they more often get caught by larger predators if they suffer pain? That would create an evolutionary bias in favour of "hard" mice.
One other thing: ability to acquire tolerance of venoms by an individual is near-universal in mammals, including humans. It's the standard immune system response. The immune activity degrades a foreign protein before it can fatally degrade the host's proteins, and is then primed to mount the same response faster and better next time. Immune responses are passed (or primed) from mother to child via breast-milk. It's an area we don't yet fully understand. There may be epigenetic factors at work, and Lamarckian evolution may not be completely discredited when it comes to inherited immune response. Just a thought.
One way to think parallel
One way to think in parallel is a spreadsheet!
Behind the scenes, a spreadsheet keeps a list (graph) of which cells depend on which other cells. Provided the dependencies are processed in the right order, you can recalculate each cell in parallel with many others. LibreOffice has the use of GPGPUs for calculation as a future goal.
Of course, most actual spreadsheets are not large enough to keep more than a single core significantly busy, but one might "code" for parallel execution using a program design methodology that looks a lot like a spreadsheet and its underlying, automatically generated, dependency graph.
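A minimal sketch of that idea: given the dependency graph, peel off successive "waves" of cells whose inputs are already computed. Every cell within a wave could safely be recalculated in parallel. (The cell names and formulas here are invented for the example.)

```python
# Sketch of how a spreadsheet engine might schedule recalculation:
# keep the graph of which cells read which others, then repeatedly
# collect the cells with no unmet dependencies into a "wave".
# All cells in one wave are independent and could run in parallel.

# deps[cell] = set of cells that cell reads from
deps = {
    "A1": set(), "A2": set(),   # raw inputs
    "B1": {"A1", "A2"},         # e.g. =A1+A2
    "B2": {"A1"},               # e.g. =A1*2
    "C1": {"B1", "B2"},         # e.g. =B1-B2
}

def parallel_waves(deps):
    remaining = {cell: set(d) for cell, d in deps.items()}
    waves = []
    while remaining:
        ready = [c for c, d in remaining.items() if not d]
        if not ready:
            raise ValueError("circular reference")  # what a spreadsheet flags
        waves.append(sorted(ready))
        for c in ready:
            del remaining[c]
        for d in remaining.values():
            d.difference_update(ready)
    return waves

print(parallel_waves(deps))   # [['A1', 'A2'], ['B1', 'B2'], ['C1']]
```

This is just Kahn's topological sort with the "ready" set emitted in batches; the programmer declares dependencies and the machine, not the human, decides what runs concurrently.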
More generally, the problem is software. At present we do parallel coding mostly using sequential languages and the spawning of multiple execution threads. Other approaches are needed, with the computer doing the parallelisation, not the human programmer.
The irony is that nature worked it out a very long time ago. A brain is a parallel processing system par excellence, 10^11 processors operating asynchronously at something like 20Hz. But it's possible for an evolved system to be beyond the understanding of that system, and so far even insect brains seem to be beyond our ken.
And Embedded deep in VMS
Somewhere in the system timer interrupt code, back in the days when you got the source of the entire VMS operating system on Microfiche with every new release ...
; Does anybody really know what time it is
; Does anybody really care
; If so I can't imagine why
; We've all got time enough to die
(which is of course a quote of a lyric by, IIRC, Chicago).
Re: Old school assembly language programmer
Hardly. Old-school assembler newbie who didn't know what a macro or a pseudo-op was.
And the macro or assembler would generate the sequence of XORs if that was the most efficient way to do it on your CPU. If the hardware vendor hadn't written it for you, any self-respecting assembler programmer would have written it for himself.
I once saw an enormous macro to implement
SETCONST register, value [,scratch_register]
which avoided loading a constant from RAM for about a thousand commonly and less commonly used constants on an early instance of what would later become known as a RISC architecture.
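For illustration only, here's the shape of the decision such a macro makes, sketched in Python with invented mnemonics and field widths (no real instruction set implied): pick the cheapest sequence that synthesizes the constant in a register instead of loading it from RAM.

```python
# Hypothetical SETCONST-style expansion. Mnemonics (LI, SHL, ORI) and the
# 16-bit immediate width are made up for the example -- the point is the
# decision tree, not any particular architecture.
def setconst(reg, value, imm_bits=16):
    imm_max = (1 << imm_bits) - 1
    if value == 0:
        return [f"XOR {reg}, {reg}, {reg}"]        # the classic zeroing idiom
    if value <= imm_max:
        return [f"LI  {reg}, {value}"]             # fits one immediate field
    hi, lo = value >> imm_bits, value & imm_max
    if lo == 0:
        return [f"LI  {reg}, {hi}",                # shifted immediate
                f"SHL {reg}, {reg}, {imm_bits}"]
    return [f"LI  {reg}, {hi}",                    # general case: two
            f"SHL {reg}, {reg}, {imm_bits}",       # immediates glued
            f"ORI {reg}, {reg}, {lo}"]             # together

for v in (0, 42, 1 << 16, 0xDEADBEEF):
    print(f"{v:#x}:", "; ".join(setconst("r1", v)))
```

Even the worst case here is three register-only instructions, which on many early RISC-ish machines beat a load from memory.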
Dead vendor squatting?
How about finding a vendor ID for a vendor that long ago went out of business, and "squatting" that with openly allocated product IDs that the defunct vendor never used (and in light of its corporate death, never will).
Not perfect, but maybe the right way to force the issue?
the US "is not monitoring and will not monitor"
Probably literally true and 100% false at the same time!
Ie, they've sub-contracted it to some outfit that is non-US. GCHQ? Some wholly-owned tax-haven-based subsidiary of a Maryland holding company about which it is very difficult to obtain further ownership information (i.e. 100% indirectly owned by the NSA)?
Re: Reality distortion fields
Well, Microsoft started the trend of taking the superb (XP UI) and crappifying it (Vista/Win7, Win 8 ). Recent Apple OSX releases haven't been quite up to the standard of previous ones, so maybe they *are* following?
The great leader
... has taken note that should he ever decide to invade Australia, he should do so in Winter.
Re: Gravitational waves?
An orbital death spiral is exactly what will happen to every orbiting system in the universe, given enough time. "Enough" is a lot (hint: many powers of ten times the current 15-billion-year age of the universe). Gravitational wave emission becomes humanly measurable only where nature throws stellar or greater masses around in really tight orbits at significant fractions of the speed of light.
You aren't upset about the electromagnetic death spiral that will overtake orbiting electrically charged bodies at a rate many orders of magnitude faster, are you?
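For scale, the standard Peters (1964) inspiral formula for a circular two-body orbit bears out the "many powers of ten" claim when you plug in the Earth-Sun system (back-of-envelope, SI constants):

```python
# Gravitational-wave inspiral time for a circular orbit, from the
# Peters (1964) formula: t = 5 c^5 a^4 / (256 G^3 m1 m2 (m1+m2)).
# Earth-Sun values, all SI.
G  = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
c  = 2.998e8          # speed of light, m/s
a  = 1.496e11         # Earth-Sun distance, m
m1 = 1.989e30         # Sun, kg
m2 = 5.972e24         # Earth, kg

t_seconds = 5 * c**5 * a**4 / (256 * G**3 * m1 * m2 * (m1 + m2))
t_years = t_seconds / 3.156e7   # seconds per year
print(f"{t_years:.1e} years")   # ~1e23 years
```

Around 10^23 years before gravitational radiation alone brings the Earth down, against a universe roughly 1.4e10 years old: thirteen powers of ten of headroom, just as the post says.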
Re: Impressive work
It always seemed obvious that a GW detector near a big particle accelerator should be picking up faint but repetitive pulses when the accelerator was running.
The key number to bear in mind is the relative strength of the gravitational force and the electromagnetic one. Electromagnetism is about 10^42 times stronger. This is one of the most staggering numbers in physics!
In fact, you can deduce this from thinking about the everyday world. You can dangle a few kilogrammes on a fine thread. Attracting in one direction, the entire Earth. Balancing in the other direction, electrostatic forces between the atoms in the thread. Which are themselves mostly self-cancelled within the atoms (electron charge cancelling proton charge): the attraction between atoms is the result of small asymmetries of charge distribution, which is still sufficient to hold molecules and crystals together against the pull of an entire planet.
The same number is the reason you aren't going to observe gravitational waves for anything less than extreme cosmological situations where there's a star or a galaxy's worth of mass moving close to the speed of light.
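The 10^42 figure drops straight out of Coulomb's law against Newton's law for a pair of electrons; the r² terms cancel, so the ratio needs no assumed distance:

```python
# Ratio of electrostatic repulsion to gravitational attraction between
# two electrons. Both forces fall off as 1/r^2, so r cancels and the
# ratio is a pure number. Standard constant values, SI units.
k   = 8.988e9         # Coulomb constant, N m^2 C^-2
G   = 6.674e-11       # gravitational constant, N m^2 kg^-2
e   = 1.602e-19       # elementary charge, C
m_e = 9.109e-31       # electron mass, kg

ratio = (k * e**2) / (G * m_e**2)
print(f"{ratio:.1e}")   # ~4e42
```

About 4 x 10^42, which is the "about 10^42" quoted above. (Use protons instead of electrons and it drops to ~10^36; the electron pair is the usual textbook comparison.)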
Re: Limits of doubt
Not detected, therefore limited. Not enough so at this time to cast doubt on GR. Enough so to cast doubt on some models of galaxy formation (in particular, on the formation of multiple large black holes in their cores and subsequent merger dynamics of these hypothesized large black holes).
A thought I had was that the cosmic era of large black hole mergers may be over (i.e. black holes in galaxy centres interact strongly enough that they merge into one within, say, a billion years of the galaxy forming). Since all nearby galaxies formed longer ago than that, mergers won't be happening any more, bar rare events such as colliding galaxies.
One might even be able to apply the Anthropic principle to this (would a universe with frequent mergers of large black holes in our cosmological neighbourhood be compatible with mammalian life on a planet's surface? I'm thinking about cosmologically nearby Gamma-ray bursts. )
Re: This is why the UK Governments PV subsidy is stupid
It's quite possible to transmit electricity 1500 miles, with modern UHV DC technology. That's the distance from the UK to the Sahara desert.
It's also less bad than that 1500 mile figure suggests, because in practice you'd tend to displace Spanish electricity to France and French electricity to the UK.
The biggest problem is the (in)stability of North African states. Huge solar farms in the Sahara would be a massive capital investment, and we just don't have confidence that the natives wouldn't hold us to ransom or just scrag it. Morocco is probably the best bet, but also probably not good enough. Especially not South of the Atlas mountains.
However, I am surprised there aren't more Spanish olive farmers grubbing up their trees and planting solar panels. Not enough grid capacity? Hello EU, where are you when we need you?
Could we cede the Olive business to the Moroccans or is it too dry there?
Re: Windows has been essentially based on a VMS core
I think the truth is "inherited a lot of design properties".
But not from the VMS codebase, just from paying the same architect.
Anyway, it's the subsequent history that matters. Microsoft consistently put marketing ahead of security. NT 4 blew huge holes in the VMS security model. W2K blew some more. By the time they realised security slightly mattered (around XP SP2 time), it was so completely F*cked that mentioning VMS ancestry just made one feel like crying.
Re: Bewildering is right
Indeed, and if they stuck the Win7 GUI on top of the vastly improved OS that lies underneath TIFKAM and called it Windows 7.1 it would sell like wildfire.
And if they used the XP GUI and called it XP 2.0, it might even go like wildfire
Re: Ashamed to say
On the ones I use: control/alt/delete, sign out, hit CR, power button bottom right of screen. That's three-finger salute on keyboard, mouse click, keyboard, mouse again. BLEUGH.
Or you can cheat and just press the soft-power button on the system unit.
Nuke the perps from orbit?
If the USG spent a bit less on exterminating terrorists and a bit more on exterminating slime like this, the NSA might get better publicity. (And I do mean exterminate. How many many-years of human enterprise do these sub-humans waste in order to make a few bucks? I rest my case.)
Pace of a Continental Shelf on a work to rule?
Does this mean if we wait a million years there will be a sudden phonequake and Windows Phone will overnight become awesome?
Re: reading what the press release said
I thought HGVs were speed-restricted to 56mph. I wish the bloody things could do 70mph, then you wouldn't get a bunch of cars in the outside lane of the motorway while an HGV at 56mph overtakes an HGV doing 55mph! (And then the gradient changes, and the one doing 55 speeds up to 56 ....)
Re: Stupid design
Are you willing to try tipping your 3.5 inch "book" drive onto its side while it is actively transferring data? Repeatedly? Active is worse than just spinning. A friend lost 2TB this way.
As for weight at the bottom, that won't help if something snags the cables. Indeed, even a flat USB drive on a table-top can be vulnerable to being pulled off the table by its cables if they get tangled with a vacuum cleaner or played with by a pet or a small child. But vertical is far, far more vulnerable.
Another triumph of marketing over simple mechanical safety. Give it a nudge or snarl its cables, and it'll fall over. What happens when an active spinning rust drive falls about six inches into its stable (flat) orientation?
Kiss your data goodbye.
Vertical is for paper books and cereal cartons, which aren't damaged by taking a tumble.
Re: @Pirate Dave RE: Always a PC
Where would a company that had fully committed to Gnome have been when the developers chucked away the Gnome 2 interface because they were bored with it?
This is a good example for FOSS not against it!
Firstly, in the short term there's absolutely no reason to change what you've got on any particular near-future drop-dead date. Those of us running RHEL5, RHEL6 or the Centos or Scientific Linux free derivatives still have a fully maintained Gnome 2 environment, with guaranteed support for five years after RHEL7 ships.
Secondly, within six months of Gnome 3 hitting the decks, the horde of disgruntled Gnome users had fixed the problem in two ways. They forked Gnome 2 into a new project called MATE - the reactionary route. And they developed Cinnamon, a new UI overlay on Gnome 3 that was far less unfriendly to Gnome 2 fans - the progressive route. I'm happy to move to Cinnamon if / when my platform of choice (Scientific Linux) moves to Gnome 3. I've tried MATE and it works. I've stopped grousing about Gnome devs flouting OSS conventions (ie you do NOT forcibly tear up your user's old way of working, you DO fork a new project and find a maintainer for the old one), because it's gone from a huge annoyance to an irrelevance in under a year.
Thirdly, there were and are other alternatives. KDE. XFCE. Many others. Compare Microsoft's one and only one UI, that they tear up at a whim every few years. (Win 7 was a tear-up, Win 8 a shredder).
Finally, you have the source code. If the above hadn't happened because you were a tiny minority, you could still have maintained your chosen interface for ever, or paid someone to do that, provided your pockets were deep enough. You can't do that with Windows XP. Microsoft has the secret sauce and intends to burn it.
Install a Kill Switch?
One could (maybe) pre-empt them by going to a lawyer and swearing an affidavit to the effect that one had NOT received any surveillance requests, and intends to repeat this process periodically unless it becomes illegal to do so. Then post the affidavit in a public place (if that's not automatic for sworn documents).
If you do receive a surveillance request it becomes illegal to swear that affidavit (i.e. perjury).
If they order you to commit a crime ... ISTR the fifth amendment guarantees one's right not to be forced to incriminate oneself, and making an untrue sworn statement is most definitely criminal.
If they make it illegal to tell the truth on oath ... the entire legal system and rule of law collapses?
I think this is to copyleft, as nuclear weapons are to bullets?
Completely agree. 15" Laptop screens should be 1680 for bog-standard-cheap, 1920 for executive / professional. Think how little a (larger) monitor or TV costs. There's no excuse.
Re: I would love to have a UHD or 4K monitor @RandomHandle
But you couldn't resolve your pixels, because it was a cathode-ray shadow-mask colour tube. Say 0.25mm phosphor-dot spacing and 20 inches across a good one: 20in x 25.4mm/in x 4 dots/mm / 2 = about 1000 pixels. That divide by two is there because one pixel was a triangle of R,G,B dots. Yes, you got some degree of super-resolution on information encoded as luminance (a good match to your retina), so QXGA wasn't completely wasted. You can argue for /1.5 or even /1, but the display was nowhere near as spatially clear as a 1920x1080 TFT. Analogue, not digital.
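Making that back-of-envelope sum explicit, using the 0.25mm dot pitch and 20-inch width assumed above:

```python
# The shadow-mask arithmetic from the comment above: phosphor dots across
# the usable width of the tube, divided by two because one pixel was a
# triangle of R,G,B dots (the comment's own fudge factor).
width_in  = 20           # usable width of a good tube, inches
dot_pitch = 0.25         # phosphor-dot spacing, mm

dots   = width_in * 25.4 / dot_pitch   # dots across the screen
pixels = dots / 2                      # the /2 triad fudge
print(f"{dots:.0f} dots -> ~{pixels:.0f} resolvable pixels across")
```

About 2000 dots and roughly 1000 genuinely resolvable pixels across, which is why driving such a tube at QXGA (2048 across) was mostly wishful thinking.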
OTOH colour quality, for reproducing photographs, peaked with the last of the Iiyama/ Sony/ Philips 25 inch vacuum-tube monitors and has declined since. On the plus side it no longer takes three people to manhandle a high-end monitor into place, or a metre-deep desk to support it and a keyboard.
And good big tube monitors cost a fortune back then, so a fair comparison is probably one of the newest 2560 or even 4K ones. These days you get 1920x1080 for well under £200.
Re: I would love to have a UHD or 4K monitor
Actually monitors grew to 1920x1200, and then technological convergence with TVs took away 120 of our pixels :-( :-( :-(
Re: Jobs in Security
Maybe you should have applied to MI5 or GCHQ?
(Or maybe you did, and can't talk about it).
Re: In the words of the great Dr. House: Morons ...
Interesting idea. How would you know that you were hiring a self-proclaimed brilliant hacker who never got caught, as opposed to a con-man with just enough technical ability to sound convincing, or an active black-hat trying to play you? You want references? Slight problem. The only references worth having are people who'll put your new recruit in jail as soon as you lead them to him.
And anyway, if he never got caught, how come he's willing to work for hire at all? If he's so very good, he's also retired on his ill-gotten gains.
BTW why would you want him *in* the corporate environment? It's his job to sit on the outside, being paid to tell you when he's able to exploit your systems, rather than exploiting them. It's *your* job to liaise.
A very old dilemma
So old it's proverbial: "a poacher turned gamekeeper". Or from even further back in time, "Quis custodiet ipsos custodes?"
Persistence is the key issue
I've no idea what the lawyers will do with this case. Probably, make a mess.
However, isn't the key point how long Google retains any sort of memory of what it "learned" by scanning my e-mails? (By which I mean purely statistical analysis thereof. Not forwarding them to the NSA, which is a separate issue, nor storing the actual e-mails after I delete them, which might actually be illegal under EU law).
I've no objection to Google delivering targeted ads based on a statistical profile that is forgotten over a week or two. If they continue to profile me cumulatively over months or (gods forbid) years, it starts to become highly intrusive. Most *people* can't remember in detail what I was doing this time last year, and that's including myself in "most people".
As for the ads, I never see them anyway. Because some advertisers insist on delivering visually intrusive graphics that make me feel nauseous, I block all adverts, and also all flash content.
Re: A split personality release
Glad to see it's not just me. I'm positively allergic to any UI which makes things wobble, warp, or otherwise animate without VERY good reason. Even the applications bar on an iMac, which warps larger when you poke it with the mouse, makes me feel nauseous. (Not because it's ugly, just because my low-level visual processing doesn't get on with it).
This is also the main reason I run Adblock-plus and Flashblock: to keep moving things that I don't want to visually process, off my screen. Their content (advertising of greater or lesser relevance) is secondary.
And possibly the reason I still haven't bought a smartphone at all.