* Posts by Peter Gathercole

2953 posts • joined 15 Jun 2007

Police hit delete on DNA profiles

Peter Gathercole Silver badge

Re: What about

Quote "The Bill orders the destruction of DNA material in most cases where a person is not charged *or convicted* of a crime."

Does that cover it?

Microsoft's IE9: Don't believe the hype

Peter Gathercole Silver badge
Thumb Down

@Doug 11

Microsoft want to make IE a lever to sell Windows. As such, they need to differentiate it from Firefox, Chrome, Safari and Opera.

Unfortunately, with previous versions of IE, it appeared that their model was to make it significantly non-standards compliant but push it to become a de-facto standard, so people who wanted a 'full' internet experience would be inconvenienced by the other browsers. Add a dependency between the browser and new versions of Windows (i.e. can't run IE9 on WinXP), and hey presto, you've made it so people with totally usable PCs need to upgrade to the new OS to use the 'features', adding to MS's coffers in the process.

They need to learn that this will no longer happen and indeed may backfire, so unless they embrace the standards, they will rapidly lose that lever, and will have to rely on other tactics in order to push Windows. I believe that they think that HTML5, including the H.264 codec protected by patents, will be the enabler to bind people to Windows for another round of PC purchases.

Peter Gathercole Silver badge

Re: WINE ?

Yes, but they will not be running IE9 as AFAIK, there is no Vista or Win7 persona for Wine yet. What I've read appears to suggest that MS have made sufficient changes in the last two versions of Windows to require significant work from the Wine community to make it compatible.

Adobe Flash: 20m phones flip Steve Jobs the bird

Peter Gathercole Silver badge

It means exactly what it says

using the graphics hardware for operations that you would otherwise have to use the CPU for. This includes, but is not limited to, scaling, texture fill, pixmap copying and shading, and may go up as far as complex rendering such as hidden object removal, surface mapping, and shadow and light-source calculation. Of course, much of this is unnecessary for Flashplayer, and having a hardware H.264 decoder built into the graphics hardware is probably what MPEG-LA and its members (including Apple) are talking about.

What Adobe may be doing is adding support for certain mobile graphics chipsets directly in Flash. They can do this without the graphics hardware vendors' direct support if the graphics API is well enough documented. Of course, documenting such things in sufficient detail is entirely foreign to some companies.

In the PC world, much of this is not really necessary because you have abstraction layers like DirectX or OpenGL with hardware vendor support, and can have hardware-accelerated graphics while only writing to one or two well-known APIs. As far as I am aware, there is no such abstraction for any of the mobile platforms, but I would be happy for someone to correct me if that is not the case.

It's official: Nokia bets on Microsoft for smartphones

Peter Gathercole Silver badge
Thumb Up

@James Hughes 1

I like to think that the moderator team enjoy us playing word games with them sometimes. I really was trying to be humorous myself, but maybe I was too subtle for some people.

If I manage to elicit a direct response from one of them once in a blue moon, then that makes all the comments I write worthwhile.

Peter Gathercole Silver badge

My word. What a reaction!

Sarah. While I don't disagree with your statement, your reaction seems a bit extreme.

I know that you never claim to be without any bias, but you normally leave it to us to be quite so loud. Does this post infringe house rules 7 or 9? I would hate to see the moderator banned from the forum!

Ah. Maybe you left the screen lock off when you went for coffee or something, and it was not Sarah who wrote this!

'Race against time' to find LOST TREES from the MOON

Peter Gathercole Silver badge

How do you tell?

These are tree seeds from Earth, taken into space and brought back. They may have been subject to vacuum and cold, but that is probably much the same as tree seeds in a seedbank.

The only way they could be identified would be if they had done an FF4 and become stretchy, stone-like or invisible, or could burst into flame, or had been labelled and/or otherwise recorded.

This is such a non-story.

Nokia Digital Radio Headset DAB

Peter Gathercole Silver badge

You beat me to it

I was going to say exactly the same. Sadly, my iPod died, so my Rovi is currently sitting on a shelf unloved while I listen to music and FM radio on my Android phone.

I'll take issue with the review, however. Quote - "Indeed, the more robust signal you tend to get on the move also makes up for any step down in sound quality" - Where the hell did you test this?

Whenever I try to listen on the move, the dropout rate and the bubbling mud problem from varying signal strength makes DAB impossible to listen to, especially for spoken word programmes. Try listening to a comedy show when you keep missing the punch-lines. I know I live out in the sticks, but I also have this problem travelling down the M5 within 10 miles of a major city with a correctly installed DAB radio specifically made for a car.

With FM, you get a hiss or maybe some distorted output, but you can follow a conversation. Try doing that when you completely lose the channel for seconds at a time.

Don't get me wrong. I'm all for Digital Radio, and have five in total in the house and car, but until the transmitter power levels increase, I'll be listening to FM on the move.

BTW swisstoni: the delay is caused by the time it takes to digitise the audio at source and decode it in your receiver. It's never going to go away regardless of what anybody tries to do. You could try buying a more expensive receiver, but there is no guarantee that it will have a faster processor in it, and it would only address the decoding side. I'm afraid you will just have to learn to live with it.

Subdued RM says government cuts challenge its business

Peter Gathercole Silver badge

Never convinced about RM

Many, many years ago I justified and built a computer resource for teaching what was called Computer Appreciation around BBC Micros, Econet and Acorn file servers in a UK Polytechnic. I managed to get so much for the budget, including colour monitors, robot arms, digitizer tablets, touch screens, printers, plotters, speech recognition and synthesizers, teletext decoders, video cameras, mice, lightpens, and a basic CAD system (remember, at the time, an Apple ][ with a Bitstick was considered good). We also had representative word processor, spreadsheet and database products and a good Basic (natch) as well as Acornsoft Pascal. This was to teach people from all learning disciplines, not just computing students, what computers could do in the early 1980s, before DOS/Windows PCs had achieved dominance and before people had home computers.

The main Computer Unit poured scorn on our efforts, because they wanted to put in RM 480Z's backed up by a 380Z as a file server. Equipping the lab with what they suggested would have blown the whole budget on the computer hardware without being able to afford any of the peripherals, demonstrating that their systems were overpriced. And I don't believe that it could have been used for half of what we actually used it for.

I was very sad to see Acorn lose out in the education sector, especially when RM moved to MS-DOS based systems. The 80186-based Nimbus 1s were neither cheap, standard, nor IBM PC compatible.

I just looked at their web site, and saw a device for mobile data capture, priced "from £3.75". I thought "Wow, that looks interesting and cheap. Maybe RM have learned". I then looked further. £3.75 apparently buys you an extension cable (basically a 3m M-F 15 pin serial cable from what I can see).

The actual device itself costs £170, plus £19 for the software (you would have thought they could have included that, bearing in mind that it is essential to use with a computer), and all of the more interesting additional sensors cost £50-80. Suddenly I'm not interested.

I'm glad I'm not working in education any more. The thought of herding rooms full of PCs would fill me with dread.

SCO: 'Someone wants to buy our software biz!'

Peter Gathercole Silver badge


SCO keep telling the bankruptcy court that if they are allowed to continue their claim against IBM, and win, they will have enough money to pay what they owe.

Unless someone convinces said court that the claims are fatuous, and will never be more than a way of paying lawyers, then the court *HAS* to listen, because they have a duty to recover as much value as possible for the creditors.

And although there have been rulings on the admissibility of the evidence and who owns the copyright, the original case is still outstanding. There may still be a small chance that there is a case against IBM because not all of the documents are in the public domain, but even if there is, it is unlikely that SCO would actually be able to collect any of the money.

I keep feeling that the case against IBM has to reach court and be finally ruled on, so that this whole mess can be consigned to the history books.

What worries me is that Novell were the ones who objected the last time UnXis tried to buy the assets, but Novell is not the company it was. Microsoft had a finger in the pie when Attachmate bought Novell. If Microsoft had some way of suggesting to Attachmate that MS would pay more for the IP they bought in the deal if Attachmate/Novell would not object this time round, the whole circus could start anew (shudder).

Microsoft finally says adios to Autorun

Peter Gathercole Silver badge

It was used very often

to kick off a software installer. Even people like PCW used to use it for their cover-mounted CDs and DVDs.

I've installed a recent HP printer, and it used autorun (the installation instructions did document how to run it without autorun, but it was phrased like "If the installer program doesn't automatically start, open the CD, and .....").

My significant other (worded to attempt not to upset the Moderatorix) has some craft software that needs the CD inserted explicitly in the D: drive (and heaven forbid if your CD is not the D: drive), and the instructions for this expect autorun to work, and do not contain an alternative. I keep explaining this, and she keeps telling me that her computer is broken because the software does not start. Grrrrrrrr.

I think too many of the people commenting here are in the Windows support business, where they are in control of any software installation, and do not talk to home and SOHO businesses where simplicity and hand-holding is essential for people who just use computers as tools.

I can't be so old that this has passed out of memory, can I?

Council dives into crematorium-heated pool

Peter Gathercole Silver badge


That's exactly what I think. Contrary to belief in spontaneous human combustion, the human body does not burn well, and you need additional fuel to perform the act. That's why they build funeral pyres.

Seems perfectly sensible to me to recover the heat, and use it.

DEC founder Ken Olsen is dead

Peter Gathercole Silver badge

What I liked

was that the Program Counter was basically register 8, and that many of the jump, load immediate and return instructions were just special cases of other load and store instructions that you would use on other registers.

At one time, I used to be able to disassemble PDP-11 machine code without the book. It really was just such a regular instruction set that it was easy.

It used to be interesting to see just how C mapped into PDP-11 machine code. Often, like the case Keris quotes, a simple instruction like i++; would map into a single instruction.
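That regularity is easy to demonstrate. Here is a hypothetical mini-decoder for a few PDP-11 single-operand instructions, register mode only; the octal encodings are from my memory of the old processor handbook, so treat this as an illustration rather than a reference:

```python
# Register names 0-7; R6 and R7 double as the stack pointer and program counter.
REGS = ["R0", "R1", "R2", "R3", "R4", "R5", "SP", "PC"]

# A few single-operand opcodes (top 10 bits of the word, in octal).
OPS = {0o50: "CLR", 0o51: "COM", 0o52: "INC", 0o53: "DEC"}

def decode(word):
    """Decode a 16-bit PDP-11 single-operand instruction.

    The low 6 bits are a uniform mode/register field shared by the
    whole instruction set; only register mode (mode 0) is handled here.
    """
    op, mode, reg = word >> 6, (word >> 3) & 7, word & 7
    if op in OPS and mode == 0:
        return f"{OPS[op]} {REGS[reg]}"
    return None

print(decode(0o005201))  # INC R1
```

That `INC R1` is one plausible encoding of C's `i++;` when the variable happens to live in a register, which is the kind of direct mapping Keris was describing.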

Another innovation in reasonably priced computers that I believe was championed (although not invented) by DEC was the segmented address space, implemented in such a way as to be non-intrusive to the programmer, while also allowing different processes to have their own virtual address space independent of the physical memory addressing. It was this feature more than anything else that allowed multi-user computers to be created on which a user could screw up their own program without affecting the OS or other users' programs.

Peter Gathercole Silver badge

Although it was known as a PDP-10

they were labelled as DECsystem-10 (and the follow-up, DECSYSTEM-20), running TOPS-10 and TOPS-20 as the OS. Real DEC sysprogs called them things like KI and KL systems, after their processor types.

IIRC, they were a bit quirky, having a 36-bit word length, but introduced the concept of a cluster with a fast interconnect. They had a thing called the CI bus, which was like an extended MASSBUS that allowed you to connect systems together, as well as to Hierarchical Storage Controllers (HSCs) which provided shared disk between the systems.

This was adapted to become the BI bus for VAXen, which paved the way for VAXcluster.

First fondleslab found in 1970s kids TV sci-fi gem

Peter Gathercole Silver badge


I suspect that my post above will be the final, conclusive proof that I am indeed a nerdy geek, even after all these years. Almost all of the post came from memory, with me referring to Wikipedia only as a sanity check. After that feat of memory, it's a shame I can't remember what I saw on telly yesterday.

My coat is the Parka with the Boxtree Complete Gerry Anderson Episode Guide in the inside pocket.

Peter Gathercole Silver badge

Pedantic mode on

Sky 1 was the flying part of Skydiver 1 (there was more than one Skydiver - there are references to Skydiver 3). The Sky part was the plane, and that only had one crew member, the pilot, who was always the Captain.

The women on the -diver part did also wear string vests, but had substantial under-garments underneath. Not nearly as interesting as the Moonbase uniforms.

BTW, the televised version (at least the more recent ITV4 screenings) of UFO Episode 1 had several scenes cut when compared to the DVD, which included the conversion of Gabrielle Drake's duty jumpsuit to the mini-skirt recreational garb (while she was wearing it, mind).

IMHO, UFO was originally targeted as a series for adults, not children, and some of the episodes were originally never broadcast in day/early evening slots, but later in the evening. It was only the association of Gerry Anderson with puppet shows that made the ITV companies play it in the same slots as Stingray/Thunderbirds/Captain Scarlet (normally just before World of Sport on a Saturday morning, or at least that is when LWT and Southern Television played them).

This adult target explains why it was a much darker series than previous Anderson shows, and contained several episodes that I regard as not suitable for children at all. Definitely set me on the edge of my seat at the age of 9.

Peter Gathercole Silver badge


Seriously, a clipboard with buttons! No. I'm sure it was meant to be electronic.

Peter Gathercole Silver badge

That's uncanny, although form-follows-function makes it inevitable

I used to watch the Tomorrow People as one of my favourite shows.

They did not need 3G or any cellular technology, because they had an alien mentor from a more advanced planet (the Trig) who provided access to a limited amount of advanced technology to augment the still-developing telepathic abilities of the group. They regularly used remote data gathering and communication devices, so this proto-iPad probably did not need 3G, and may have used a near-field technology, as it was only used in their lab.

In addition, they had an advanced AI called TIM who coordinated all of the technology, although it was not portrayed as techno-magic, and there were definite limits to what they could do.

On a related note, wasn't Captain Kirk in ST-TOS forever using an electronic device, often handed to him by Yeoman Rand (gotta love those uniforms)? I know it used a stylus, so it was probably more like a Newton than an iPad, but still.

Super-thin materials could POWER our WORLD

Peter Gathercole Silver badge

Yes, planar

Graphene is a case of a planar crystalline structure, I admit (I did mention planar form), but it is a special case, being just carbon with no other elements present. Looking up molybdenum disulphide (MoS2) and bismuth telluride (Bi2Te3), both appear to me to have structures that do not have all of the atoms in the molecule arranged in a plane, especially when in crystalline form.

Of course, it may be that the boffins have found another way to form the crystals so that all of the atoms line up in a plane.

The α-BN form of boron nitride looks interesting, though.

Peter Gathercole Silver badge


Quote: "Oxford and Dublin boffins have unlocked a doorway leading to more than 150 super-thin exotic nanosheet materials just one atom thick" and "boron nitride (BN), molybdenum disulphide (MoS2), and bismuth telluride (Bi2Te3)"

As these are not pure elements that are being talked about, surely they should be one *molecule* thick, not one *atom* thick, even if they are arranged in a totally planar fashion.

You would think that the "boffins" would know better. Or maybe they are really just "scientists"!

Superphones: A security nightmare waiting to happen

Peter Gathercole Silver badge

He's also ignored Handspring and Palm phones too.

Palm Treos were able to surf the Internet years before either the iPhone or BlackBerrys were available. And although there was not an app store as such, it was possible to download free and paid applications directly to the phone.

3rd party add-ons also gave WiFi connectivity, although some of the non-phone Palm devices had it built in. And there were media and productivity apps available, and they were touch-screen devices.

I really wish there had been a Palm TX with a phone built in. That would have been an interesting device, doing much of what the iPhone became famous for.

Microsoft reinfects Chrome with closed video codec

Peter Gathercole Silver badge

@Surprised, I'm not

The decoder license is only part of the picture. Look at the MPEG-LA license with regard to using the coder part for commercial content. There is a fee payment due on each item encoded for commercial use, and IIRC, it's not pennies.

I'm fairly certain that things like TV-on-demand, adverts, commercial presentations, pornography and even free videos on an ad-supported site (think Facebook and YouTube) can be considered as a commercial use, so there is much money to be extracted from content providers. In order for the providers to consider H.264 over a free codec, they have to see almost a clean sheet of browsers supporting it at no cost to the end users. Otherwise they will not see it as an expense worth paying.

It is interesting that WinXP is being avoided. Probably trying to provide more leverage to get end users to fork out for another Windows license.

I would love to see a system for recycling transferable XP Retail licenses from scrapped systems, rather than them disappearing into the smelter with the system case. Anybody any ideas about setting something like this up before they all disappear?

Sky loses pub footy case

Peter Gathercole Silver badge


It all depends on the satellite. There are several different satellite systems in geosynchronous orbit, including Astra, Eurobird, Hotbird and Eutelsat, amongst many others. Not all of their transmission footprints are the same: some are better in southern Europe, and some in the north/north west. For example, most of the working Astra 1 and 2 satellites cover most of northern Europe, but not southern Italy or Greece. Chances are the pub was using one of the other satellites which cover southern Europe, some of which are not encrypted and can be picked up with the right dish, even though the signal is not intended for that region.

It really is an issue with the content provider and the broadcasting company, as has been pointed out in other comments. Sky cannot prevent someone buying equipment to point at an out-of-region marginal satellite, even if it does overlap with their paid-for service.

Apple clips publishers' wings

Peter Gathercole Silver badge

Unacceptable control

At what point will iFans finally understand that what Apple are doing is unethical? We must be close to, or past, that tipping point, surely.

Virgin Media kills 20Mb broadband service

Peter Gathercole Silver badge

Clarity required in the article

Please remember that Virgin Media offer broadband services via ADSL as well as their cable infrastructure. This report is just for cable customers.

Scotland bans smut. What smut? Won't say

Peter Gathercole Silver badge

Consenting adults

Is it the consent, or the adult part that you think protects you from prosecution?

God forbid that you engage in sex with a consenting partner aged 17 years and 11 months, and take some pictures, because although the act is legal, the pictures aren't!

The cost of not deduping

Peter Gathercole Silver badge

There is a trust issue in the examples in the article

"electronic copies of their HR contract and pension scheme guide"

If there were just a single copy of this type of information, then employees would have trouble making sure that what they agreed to when they started a job was the same as the current single copy. You would end up needing an audit trail for changes to your single copy, with its own complexity and storage issues. People keep their own copies as a point-in-time reference, to guard against changes in the primary copy.

It's funny. Lotus Notes, which was originally sold as a document management system, was selling this "store a pointer, not the document" idea 15 or more years ago. This aspect of its use has fallen by the wayside, but I'm sure that it is still in there somewhere. Mind you, using Notes as the glue for a mobile workforce (as IBM does) requires multiple copies to be stored on the hard disks of all of the laptops anyway, so the de-duplication benefits are negated.

Another thing is that you don't want to de-dupe your backup, at least not completely. You must have multiple copies on different media, because media fails. Enterprise backup tools such as TSM work using an incremental-forever model, meaning that by default, only one copy of each version of a file is backed up, but then has specific techniques to force more than one copy of a file to be kept on separate media.

I know I am a born sceptic, but I must admit to being unconvinced by the block-level de-duplication that is being pushed by vendors. Maybe I have just not seen the right studies, or maybe I'm already de-duplicating a lot of what I do using techniques like incremental-forever backups and single-system image deployment.
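For what it's worth, the core idea behind the block-level de-duplication the vendors push is simple enough to sketch. This is a hypothetical illustration of my own (fixed-size blocks hashed with SHA-256); real products use variable-size chunking and far cleverer indexing:

```python
import hashlib

def dedupe_ratio(data: bytes, block_size: int = 4096) -> float:
    """Split data into fixed-size blocks and report the fraction of
    storage saved by keeping only one copy of each distinct block
    (0.0 means nothing would be saved)."""
    seen = set()
    total = 0
    for offset in range(0, len(data), block_size):
        block = data[offset:offset + block_size]
        seen.add(hashlib.sha256(block).hexdigest())
        total += 1
    return 1 - len(seen) / total if total else 0.0

# Two copies of the same 8KB "document": half the stored blocks are duplicates.
block_a = bytes([1]) * 4096
block_b = bytes([2]) * 4096
doc = block_a + block_b
print(dedupe_ratio(doc + doc))  # 0.5
```

The catch, as with the HR-contract example, is that the ratio only looks good when your data really does repeat at block granularity.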

Maybe I'm just a neo-luddite. Who knows.

India's cheap-as-chips delayed by cash spat

Peter Gathercole Silver badge

£20 does seem a bit optimistic

but if you can build a basic mobile phone, with rechargeable battery, screen and keyboard, and sell it (presumably with some profit) in the UK for under £30 (as seen at http://www.reghardware.com/2010/12/03/ten_essential_cheap_voice_phones/), then why should you think it impossible? Also look at the 7in Android epad and apad devices that are selling on eBay, presumably at a profit to the supply chain, for less than £80 at the moment.

I know that the phone companies probably make a bit of a loss on the phones, hoping to recoup it from the phone charges, but I cannot believe they subsidise it by a significant amount.

I doubt that the Indian government is going to insist on a device capable of running Windows 7 with 3D high performance graphics, a multi-megapixel display and a terabyte hard disk, and they will probably drive the profit element down as far as possible. If they are assembled in India, they may also be able to fudge or hide the labour costs, in the same way that they subsidise the railways.

What do you actually need for basic web surfing, email and a bit of text processing? Probably a <500MHz ARM, 256MB memory, 2GB backing store, a basic keyboard and mouse (assuming you are not going to use a touch screen) and a display of 640x480 or so. If the web surfing is intended to be government information (voting, census, tax etc) then they can control the web content and thus the requirements of the display.

So impossible at £30, maybe not.

Bot attacks Linux and Mac but can't lock down its booty

Peter Gathercole Silver badge

...and more

There are many more places than just .bashrc (assuming you're using bash, of course; I prefer the AT&T software toolbox ksh myself). Both KDE and Gnome (and most other X11 window managers as well) have user startup directories and rc files that allow attacks on systems accessed with a GUI, and you would, of course, have the normal PATH and LD_LIBRARY_PATH attack vectors that could be used to subvert commands that people use all the time, and there are many more.
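To illustrate the PATH vector mentioned above, here is a hypothetical check of my own devising for search-path directories the current user can write to; anything dropped into one of them shadows a same-named command that appears later in the search order:

```python
import os

def writable_path_dirs(path_env=None):
    """Return the directories on PATH that the current user can write to.

    A malicious binary placed in one of these (say, a fake 'ls' or 'sudo')
    would be found before the real command further down the PATH.
    """
    dirs = (path_env if path_env is not None
            else os.environ.get("PATH", "")).split(os.pathsep)
    return [d for d in dirs
            if d and os.path.isdir(d) and os.access(d, os.W_OK)]

# Check the live environment; an empty list is what you want to see.
print(writable_path_dirs())
```

The same scan applied to LD_LIBRARY_PATH would flag directories where a planted shared object could be picked up ahead of the system libraries.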

Linux is not immune from attack; it's just that an attack needs to do more things to really pwn it. For instance, if a user has iptables configured to control inbound and outbound traffic on a Linux system (assuming that the user does not run everything as root), you would have to trick the user into sudo-ing a command, or otherwise obtain escalated privileges, to alter the configuration or turn it off, unlike most Windows systems.

There is no such thing as a totally secure OS, it's just more difficult to mess with Linux.

The OSX statistics in the article are a surprise, however.

Ubuntu - yes, Ubuntu - poised for mobile melee

Peter Gathercole Silver badge

Flash on Linux

I know that this is not a help forum, but I recently found out something revealing wrt flash in Firefox on Linux.

I was puzzled by the fact that on the same hardware, flash appeared to run much faster on a new Ubuntu install than on a system that had been upgraded. The same was true on a new install using a previous home filesystem.

I found out that there appears to be a Firefox quirk left over from a previous way of installing Flash, which ended up installing a shared object called libflashplayer.so in the ~/.mozilla/plugins directory, which overrides the version of Flash installed system-wide. This meant that even though I had Flash 10.something installed, the properties shown in a Flash window showed 9.something. I even found that renaming the file to libflashplayer.so.save in the plugins directory still caused it to be picked up.

This screwed up BBC iPlayer and many other sites that checked the version of Flash installed. It had puzzled me for a long time.

Deleting the file completely suddenly made flash work sooooo much better. My daily use system is a Thinkpad T30 2GHz Mobile Pentium 4 running Hardy Heron at the moment (I'm still having problems with KMS, suspend and ATI 7500 mobility graphics adapter on Lucid, and I only want to use LTS releases), and even this is able to make a passable attempt at most YouTube videos now.

I have another dual core Pentium E2200 system, which I think is clocked at 2.2GHz running Lucid Lynx, and that manages flash fullscreen without problems after similar treatment.

I think that everybody who experiences slow flash in Firefox may want to check whether they have something similar.
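For anyone who wants to automate that check, here is a small sketch of the idea (function name and approach are my own; the filename match is deliberately loose so it also catches renamed copies like libflashplayer.so.save, which I found were still picked up):

```python
import os

def stale_flash_plugins(home=None):
    """Look for per-user libflashplayer.so files in ~/.mozilla/plugins
    that would override a system-wide Flash install.

    Returns the paths found, so the user can decide what to delete.
    """
    home = home or os.path.expanduser("~")
    plugin_dir = os.path.join(home, ".mozilla", "plugins")
    if not os.path.isdir(plugin_dir):
        return []
    return [os.path.join(plugin_dir, name)
            for name in os.listdir(plugin_dir)
            if name.startswith("libflashplayer.so")]

for path in stale_flash_plugins():
    print("possible stale plugin:", path)
```

An empty result means the system-wide Flash is the one Firefox will actually load.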

No court order against PlayStation hackers for now

Peter Gathercole Silver badge

Re: Withdrawal of service

Yes, but Sony would almost certainly have put a "terms and conditions can change, see the web site at ..." clause in the agreement, making it the customer's responsibility to make sure that they were still in compliance.

Peter Gathercole Silver badge

Tom 7

It is quite clear in the Sky agreement. You do indeed own the box once you have passed the 1 year initial agreement. But without a Sky subscription, no matter which box you have, all you can do with it is watch free-to-air channels as they are broadcast in standard definition (with the possible exception of the HD BBC channels).

I recently had my Sky subscription dropped because my bank had cancelled the direct debit (long story), and I could not see Sky 1, Sky 2, Living, Sky Movies or even Dave (which is available free-to-air), or any of the kids' or documentary channels. Can't remember if the BBC HD channels worked. Just what you would get if you took the Sky card out (although the message on the encrypted channels was a bit more polite). What's even worse, I could not see anything stored on the hard disk, even if it was from a free-to-air channel.

This makes the Sky HD box useless as a recorder without a subscription, even for FTA channels.

What is more interesting, as part of any 'upgrade' the Sky installer will probably want to take YOUR old box away. I'm not sure if this is in the upgrade agreement or not (I've not done one, I got my Sky HD box off ebay). This means that if you later want Sky Multiroom, you end up paying for another Sky box to replace the one you previously had!

I'm wondering whether we could challenge whether Sky have the right to restrict the recording function for FTA channels on Sky+ boxes without a Sky subscription? Anybody any thoughts?

Police reject Labour MP's call for Bristol-wide DNA test

Peter Gathercole Silver badge
Dead Vulture

"...or happened to be female"

Presumably, if they already have a sample to compare with, they must already know that it was a man. It is quite simple to tell whether a suitable DNA sample comes from either a man or a woman by the presence or absence of a Y chromosome.

Flame warning - Of course, if there was already a national DNA register......

Seagate sees big drive capacity jump coming

Peter Gathercole Silver badge

Longevity of data

is really what worries me. We've had a relatively golden age for the last 15 years or so, where any media that you could write to is probably still readable now. I certainly can still read CDs that I burned in the '90s, although that probably depends on whether they have been left in direct sunlight.

I recently had a requirement to read some 'spinning rust' from systems I ran in the same time frame (one was a 1.1GB Quantum disk, another was a ~860MB Seagate disk), and the data was still there, still readable.

I have, however, found that older media, particularly floppies (5.25 and 3.5) from the same period or older are very much more a problem, with an almost 50% failure rate of those that I tried (including a whole set of precious and irreplaceable BBC micro disks).

I worry about how long an archived MLC or even SLC flash drive will remain readable after being put on the shelf. I have already had various flash cards fail on me, which is not a problem for cards used for transient information (holding photos until they are loaded onto a computer, or copies of ripped music or video that are also held elsewhere). Throw them away, buy another one, and reload the data if applicable.

But this would be more of a problem if flash was being used as the primary repository of the information.

I'm not sure that disk is the correct solution either (especially as old style SCSI is pretty much dead, and EIDE interfaces disappear from new systems, replaced by SATA, and SAS and other serial technologies), but I predict that it will be more useful for archive storage than Flash memory. I'm purposefully ignoring tape, as this is now far too expensive for ordinary people, even though the remaining tape and drive manufacturers have a roadmap for data longevity.

Looks like we are fated to continue to re-write our important data forever as we move away from media with any significant lifetime. I think we will look back to days when books were on paper, photographs were on film, and music was on vinyl, all with a lifetime measured in decades, with some nostalgia.

Intel touts 'Sandy Bridge' video chops

Peter Gathercole Silver badge


I know that you wouldn't necessarily put one of these in every system, I was wondering if the encryption keys are in the Sandy Bridge processor, and the keys were mandated by the content provider's encryption, how would you use another processor/graphics card combo.

It's probably not actually going to happen, I was seeing whether anybody would bite to start a discussion.

Peter Gathercole Silver badge

Intel Insider?

So the processor will run a service to allow streaming of video content without the OS being involved? Because this is what the article appears to say!

I think that this is more likely to be media encryption keys locked in the processor, so those nasty Open Source people can't hack them to allow the content to be 'stolen' while it passes through the OS layers. This would enable the media to be encrypted all the way from the server on the Internet to the graphics hardware, and thence on to the display device. Sounds like Intel and the content providers got tired of waiting for the TCG to deliver bare-metal-to-application system trust, so have bypassed the whole OS stack, and large parts of the system hardware.

This does pose the question what if you want to use better graphics hardware than Intel provide? Still, I'm just speculating here.

Who will rid me of these obsolete PCs?

Peter Gathercole Silver badge

Unexpected results

I'll try to dig out a reference, but in the last year, one of the UK magazines (or it might have been the UK PC World online magazine) did some testing and found that putting overspec'd power supplies in systems actually reduced the power consumption. So, if you had a system requiring 450W, putting in an 800W power supply resulted in less power used than a 500W power supply in the same system. They published the measured consumption figures, and these showed a considerable difference.

It was reasoned that a power supply is most efficient towards the middle of its rated capacity, and efficiency falls off as you reach the limit. In addition, the power supply is more likely to continue to cope even as it ages.
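The effect can be sketched with a toy model. The efficiency curve below is entirely made up for illustration (real curves vary by unit and are published in reviews), but it shows why the same 450W DC load can draw less from the wall through an over-specified supply:

```python
# Illustrative only: a made-up efficiency curve that peaks around
# 50% load, used to show why an over-specified PSU can draw less at
# the wall for the same DC load. Real curves differ per unit.
def efficiency(load_fraction):
    # crude parabola: ~0.90 peak at 50% load, falling off either side
    return 0.90 - 0.35 * (load_fraction - 0.5) ** 2

dc_load = 450.0  # watts demanded by the components

for rating in (500.0, 800.0):
    frac = dc_load / rating
    wall_draw = dc_load / efficiency(frac)
    print(f"{rating:.0f}W PSU at {frac:.0%} load -> {wall_draw:.1f}W from the wall")
```

With this (hypothetical) curve, the 800W supply runs nearer its sweet spot and so wastes less as heat than the 500W unit driven close to its limit.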

I have measured that my 24x7 firewall, currently an AMD K6-II (remember those?) clocked at 550MHz, only consumes about 85W according to an in-line consumption meter, so older kit really does consume less and can draw less than a 100W filament light bulb (and my 2GHz P4 T30 Thinkpad only uses about 45W, even when charging at the same time as it is running). My kids' recent gaming rigs draw more like 500W, though.

Don't think I would like to use the K6 system as my workstation, however.

Peter Gathercole Silver badge

I can only comment on the UK

and this inverse exponential is how I was told to run the residual value of my asset register by two different accountants.

Peter Gathercole Silver badge

Only in an LLU area

If I were to switch to SKY broadband, even though I would qualify due to my package, it is not available where I live. I can buy the paid service from SKY, but this is delivered using BT Wholesale just like every other provider in the area.

Peter Gathercole Silver badge

Windows license

You also have very restrictive conditions on the Windows license. Unless you can pass on all the documentation and original media, the Windows EULA does not allow you to transfer the Windows license. And what good to lower-income households are computers without Windows (yes, I am a Linux advocate, but I am realistic enough to know that most people currently don't want Linux, unfortunately).

I'm sure that this is often conveniently overlooked, but any organization involved in re-deploying old systems will not risk crossing Microsoft and their lawyers, and will avoid most old kit unless they are putting Linux on it.

Small biz calls for end date on enhanced 17.5% VAT

Peter Gathercole Silver badge

@JimC, it's not quite on profit, but Value Add

You've pretty much repeated what I said, but at no point is it actually related to any profit (in the tax sense) you might make. VAT is related to sales and purchases, which I do admit are INDIRECTLY related to profits. In any case, your VAT-registered entity is not paying, its customers are, and you are acting as the tax collector.

If you have ever filled in a VAT form, the calculation goes (this is simplified, because I am not considering cross-border VAT) as follows:

Net VAT=VAT charged on sales minus VAT paid on purchases

They also want to know your gross sales for the period and the value of any purchases, but these do not take any part in the calculation, they're just there to allow some form of sanity and fraud checking.

It's that simple, and if you think it through, you as an entity registered for VAT are not paying any VAT at all on your purchases, as you offset it against the VAT you charge (it's offset at the point of paying HMRC, not actually paid and claimed back). Your customers (who might also be VAT registered and passing it on) are paying it to you, and this recurses down until you reach someone who is NOT VAT registered, and they end up actually paying without any way of offsetting it.
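As a sanity check, the netting-off described above can be worked through with hypothetical quarterly figures (the amounts are invented, and the 17.5% rate is the one under discussion):

```python
# Hypothetical quarterly VAT return for a small registered business.
# All figures are illustrative only.
VAT_RATE = 0.175  # the enhanced standard rate discussed in the article

sales_net = 40_000.00      # net value of sales in the quarter
purchases_net = 15_000.00  # net value of purchases in the quarter

vat_charged = sales_net * VAT_RATE    # collected from customers
vat_paid = purchases_net * VAT_RATE   # paid out to suppliers

# The business hands over only the difference: the VAT on purchases
# is offset rather than paid and reclaimed separately.
net_vat_due = vat_charged - vat_paid

print(f"VAT charged on sales: £{vat_charged:.2f}")
print(f"VAT paid on purchases: £{vat_paid:.2f}")
print(f"Net VAT due to HMRC:  £{net_vat_due:.2f}")
```

The business itself ends up out of pocket by nothing; the £7,000 charged came from its customers, and £2,625 of it simply never leaves the suppliers' chain.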

When I was involved in running a company, I regarded it as a financially neutral operation, although I did begrudge the paperwork. The only thing that was not neutral is that if you paid your VAT quarterly, you could stuff the money in an interest generating account until you needed it to pay it to HMRC.

The point I was trying to make is that you don't actually have to be a very big business to have to be registered for VAT, and if you are, it is neutral to you, although not to your customers. £50,000 might appear to be a lot, but if you have four staff being paid full time at the national minimum wage, then you must be a good way towards the threshold just to be able to pay them. Any form of shop or pub almost certainly has more than £50,000 annual turnover, as this equates to less than £1,000 per week.

As a result, your competitors are probably also registered for VAT, and in the same boat. It's only if you are competing against someone VERY small (under £50,000 PA turnover) that you will be at a disadvantage.

Peter Gathercole Silver badge

Only affects very small businesses

probably with a turnover of £50,000 or less. Over that amount, businesses have to register for VAT, and charge it on their services, and are allowed to claim back the VAT they pay on any business purchases against what they collect (yes, businesses acting as tax collectors for the government).

The only thing it will do, however, is make their prices higher, as they will have to increase the amount of VAT they charge, but this will also be the same for their competitors. It will also force them to update the processes they use to work out the VAT. This should not be too hard, as they have had practice at changing the VAT rate for the last two years.

The FSB do a good job on the behalf of small businesses, but this time I believe that they are saying something just so that they are not silent on the matter.

Garmin tells iPhone users where to go

Peter Gathercole Silver badge

May be free

but if I could have got updates (which are not available any more, the Treo having been dropped as a supported device), I would prefer to use the TomTom Navigator 6 that I had running on my Palm Treo 650 with an external bluetooth GPS device over Google Navigator on my Samsung Galaxy (the Treo really was a smartphone in its time).

The problem is that the Google app does not provide enough information with regard to speed, time to destination and distance to destination. All I appear to get is time to destination and distance to the next change in navigation (i.e. junction), and even this appears quite arbitrary when in the country.

For instance, on the A396, there is a tight left turn in Exebridge which is counted as a change in navigation, so I can tell how far it is to that even though it is the same road, rather than when I reach the end of the A396. Not clever. And I still find it annoying that it calls the road things like the "A three thousand three hundred and ninety one", rather than a-three-three-nine-one. Reading the street names is clever, though.

I also miss setting the journey up in advance, rather than getting in the car and waiting for it to work out where it is before entering the destination. One time I was in the outskirts of a city, and had to detour due to a closed road, and it did not re-plan the route until I had managed to nearly get to my destination.

Never mind, hopefully Google will update it sometime to make it more usable.

Ford cars get draconian parental controls

Peter Gathercole Silver badge

Ford Popular

The follow on vehicle was a 650cc Ford Angular - sorry, Anglia, immortalised by being one of the first Police PANDA cars.

This could not go much above 60 either, and radios were really a luxury add-on, as was sound-proofing of the engine compartment.

My first car was a second or third hand top-of-the-range 1976 Vauxhall shove-it - sorry, Chevette GLS which had semi-alloy wheels (steel rims, alloy centres), wide(r) tyres, velour seats, sound-proofing, body styling trim and (shock) a heated rear window all as standard, but no radio. A decent stereo radio-cassette was one of the first things I fitted, though, even though I had to dismantle half of the dashboard to get it in and drill a hole for the aerial in the wing.

It would do 80 downhill with a following wind, though!

Novell's Microsoft patent sale referred to regulators

Peter Gathercole Silver badge

...keep to yourself

The reason why consortia get involved with this type of thing is to prevent exactly what the OSI are trying to do.

By definition, a consortium cannot be a monopoly (which implies only ONE controlling interest), so the monopoly legislation in western countries cannot apply.

It is possible that you might be able to prove a cartel, but not at this stage of the proceedings, as cartels are normally challenged at the point where they fix prices or control access to a resource.

I first noticed this type of thing with the cross-licensing of IP in the TCPA (now the TCG), done to prevent monopoly regulators from looking too closely at the end-to-end DRM in the Trusted Platform, which threatens FOSS on the very computing systems we use.

I suspect that you could probably quote MPEG-LA and H.264 as another example of a consortium controlling a technology while avoiding allegations of monopoly.

UK.gov relaxes patent application process

Peter Gathercole Silver badge

I initially thought this,

but I changed my mind when I considered what the article actually said.

It is not eliminating the need to provide searches, which would be disastrous, but allowing the application to the EPO to automatically use the searches that were provided for the initial UK patent application, without needing to re-submit those searches to the EPO.

This will reduce the paperwork, and thus the cost to the applicant, without seriously reducing the protection. This would appear to me to be quite sensible.

The only downside I can see is that the UK searches will not have been against the EPO records, but I guess that it would still be necessary to perform those.

Ubuntu Wayland: Shuttleworth's post-Mac makeover

Peter Gathercole Silver badge

@ricegf2 - Posts after my own heart

I could not agree more with what you are saying.

Some people in this comment trail have been saying that the names of the UNIX/Linux filesystems are cryptic. This is not the case, as they all have meaning, although like all things UNIX, the meaning may have been lost a little in the abbreviation. I will attempt to shed some light on this, although this will look more like an essay than a comment. Please bear with me.

Starting with Bell Labs. UNIX distributions up to Version/Edition 7 circa 1976-1982.

/ or root was the top-level filesystem, and originally had enough of the system to allow it to boot (so /bin contained all of the binaries (bin - binaries, geddit) necessary to get the system up to the point where it could mount the other filesystems). It included the directories /lib and /etc, which I will mention in more detail later.

/usr was a filesystem that originally contained all of the files users would use in addition to what was in /, including /usr/bin, which contained binaries for programs used by users. On very early UNIX systems, user home directories were normally present under this directory.

/tmp is exactly what it says it is, a world writeable space for temporary files that will be cleaned up (normally) automatically, often at system boot.

/users was a filesystem used by convention adopted by some Universities as an alternative for holding the home directories of the users.

/lib and /usr/lib were directories used to store library files. The convention was very much like /bin and /usr/bin, with /lib used for libraries required to boot the system, and /usr/lib for other libraries. Remember that at this time, all binaries were compiled statically, as there were no dynamic libraries or run-time linking/binding.

/etc quite literally stands for ETCetera, a location for other files, often configuration and system wide files (like passwd, wtmp, gettydefs etc. (geddit?)) that did not merit their own filesystem. With all configuration files, there was normally a hierarchy, where a program would use environment variables as the first location for options, then files stored in the users home directory, and then the system-wide config files stored in the relevant etc directory (more on this below).
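That configuration precedence (environment variable first, then a dot-file in the home directory, then the system-wide file under the relevant etc) can be sketched like this; the tool name and variable name are hypothetical, not taken from any real program:

```python
import os
from pathlib import Path

def find_config(tool: str, env_var: str):
    """Return the first configuration source found, following the
    classic UNIX precedence described above: environment variable,
    then a dot-file in the user's home directory, then the
    system-wide file in /etc. Names are illustrative only."""
    if env_var in os.environ:
        return ("environment", os.environ[env_var])
    home_rc = Path.home() / f".{tool}rc"
    if home_rc.exists():
        return ("home", str(home_rc))
    system_rc = Path("/etc") / f"{tool}rc"
    if system_rc.exists():
        return ("system", str(system_rc))
    return ("defaults", None)

# Usage sketch: an env var set by the user wins over any file.
os.environ["EXAMPLETOOL_OPTS"] = "--verbose"
print(find_config("exampletool", "EXAMPLETOOL_OPTS"))
```

A program following this hierarchy lets a user override the site-wide defaults without touching anything the administrator owns.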

/dev was a directory that contained the device entries (UNIX always treats devices as files, and this is where all devices were referenced). Most files in this directory are what are referred to as "special files", and are used to access devices through their device driver code (indexed with Major and Minor device numbers) using an extended form of the normal UNIX filesystem semantics.

/mnt was a generic mount point used as a convenient point to mount other filesystems. It was normally empty on early UNIXes.

When BSD (the add-on tape to Version 6/7, and also the complete Interdata32 and VAX releases) came along (around 1978-1980), the following filesystems were normally added.

/u01, /u02 ..... Directories to allow the home directories of users to be spread across several filesystems and ultimately disk spindles (this was by convention).

/usr/tmp A directory sometimes overmounted with a filesystem used as an alternative to /tmp for many user related applications (e.g. vi).

I think that /sbin and /usr/sbin (System BINaries, I believe) also appeared around this time, as locations for utilities that were only needed by system administrators, and thus could be excluded by the path and directory permissions from non-privileged users.

Things remained like this until UNIX became more networked with the appearance of network capable UNIXes, particularly SunOS. When diskless workstations arrived around 1983, the filesystems got shaken up a bit.

/ and /usr became read-only (at least on diskless systems)

/var was introduced to hold VARiable data (a meaningful name again), and had much of the configuration data from the normal locations in /etc moved into places like /var/etc, with symlinks (introduced in BSD with the BSD Fast Filesystem) allowing the files to be referenced from their normal location. /usr/tmp became a link to /var/tmp.

/home was introduced and caught on in most UNIX implementation as the place where all home directories would be located.

/export was used as a location to hold system-specific filesystems to be mounted over the network (read on to find out what this means)

/usr/share was also introduced to hold read-only non-executable files, mainly documentation.

About this time the following were also adopted by convention.

/opt started appearing as a location for OPTional software, often acquired as source and compiled locally.

/usr/local and /local often became the location of locally written software.

In most cases for /var, /opt and /usr/local, it was normal to duplicate the bin, etc and lib convention of locating binaries, system-wide (as opposed to user-local) configuration files and libraries, so for example a tool in /opt/bin normally had its system-wide configuration files stored in /opt/etc, and any specific library files in /opt/lib. Consistent and simple.
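The repeated bin/etc/lib pattern under each prefix can be shown with a trivial sketch (the prefixes are the conventional ones named above):

```python
# Sketch of the per-prefix convention: each software area repeats
# the same bin/etc/lib layout. Purely illustrative.
import os.path

PREFIXES = ["/", "/usr", "/usr/local", "/opt"]

for prefix in PREFIXES:
    print("binaries:  ", os.path.join(prefix, "bin"))
    print("config:    ", os.path.join(prefix, "etc"))
    print("libraries: ", os.path.join(prefix, "lib"))
```

Once you know the convention for one prefix, you know where to look in all of them.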

The benefit of re-organising the filesystems into read-only and read-write filesystems was so that a diskless environment could be set up with most of the system related filesystems (/ and /usr in particular) stored on a server, and mounted (normally with NFS) by any diskless client of the right architecture in the environment. Different architecture systems could be served in a heterogeneous environment by having / and /usr for each architecture served from different directories on the server, which could be a different architecture from the clients (like Sun3 and Sparc servers).

/var also became mounted across the network, but each diskless system had their own copy, stored in /export/var on the server, so that things like system names, network settings and the like could be kept distinct for each system.

/usr/share was naturally shared read-only across all of the systems, even of different architectures, as it did not contain binaries.

This meant that you effectively had a single system image for all similar systems in the environment. This enabled system administrators to roll out updates by building new copies of / and /usr on the server, and tweaking the mount points to upgrade the entire environment at the next reboot. Adding a system meant setting up the var directory for the system below /exports, adding the bootp information, connecting it to the network, and powering it up.

And by holding the users home directories in mountable directories, it enabled a user's home directory to be available on all systems in the environment. Sun really meant it when they said "The Network IS the Computer". Every system effectively became the same as far as the users were concerned, so there was no such thing as a Personal Computer or Workstation. They could log on on any system, and as an extension, could remotely log on across the network to special servers that may have had expensive licensed software or particular devices or resources (like faster processors or more memory), using X11 to bring the session back to the workstation they were using, and have their environment present on those systems as well.

As you can see, this was how it was pretty much before Windows even existed.

Linux adopted much of this, but the Linux newcomers, often having grown up with Windows before switching to Linux, have seriously muddied the water. Unfortunately, many of them have not learned the UNIX way of doing things, so have never understood it, and have seriously broken some of the concepts. They don't understand why / and /usr were read-only, so ended up putting configuration files in /etc, rather than in /var with symlinks. They have introduced things like .kde, .kde2, .gnome, and .gnome2 as additional places for config data. And putting the RPM and deb databases in /usr/lib was just plain stupid, as it makes it no longer possible to make /usr read-only. They have mostly made default installations use a single huge root filesystem encompassing /usr, /var and /tmp (mostly because of the limited partitioning available on DOS/Windows partitioned disks). They have even stuck some system-wide configuration files away from the accepted UNIX locations.

So I'm afraid that from a UNIX user's perspective, although many of the Linux people attempt to do the 'right thing', they are working from what was a working model, broken by their Linux peers. Still, it's better than Windows, and is still fixable with the right level of knowledge.

I could go on. I've not mentioned /proc, /devfs, /usbfs or any of the udev or dbus special filesystems, or how /mnt has changed and /media has appeared, nor have I considered multiple users, user and group permissions, NIS, and mount permissions on remote filesystems, but it's time to call it a day. I hope it enlightened some of you.

I have written this from memory, based on personal experience of Bell Labs. UNIX V6/7 with BSD 2.3 and 2.6 add-on tapes, BSD 4.1, 4.2 and 4.3, AT&T SVR2, 3 and 4, SunOS 2, 3, 4 and 5 (Solaris). Digital/Tru64 UNIX, IBM AIX and various Linux's (mainly RedHat, and Ubuntu), along with many other UNIX and Linux variants, mostly forgotten. I may have mixed some things up, and different commercial vendors introduced some things in different ways and at different times, but I believe that it is broadly correct, IMHO.

Peter Gathercole Silver badge

Re. "X was a horrible project"

I agree that X was designed for a different environment than personal computers running a GUI on the same system, but to brand it a "horrible project" just goes too far.

Because of its origins (in academia), it would be fair to say that X10 and X11, particularly the client side, were among the first "Open Source" projects (along with the original UNIX contributed software products - many of which pre-date GNU). As such, they helped define the model that enabled other open source initiatives to get off the ground. But the project suffered teething problems like all new methods, particularly when it got orphaned as the original academic projects fell by the wayside.

What happened with XFree86 and X.org was messy, but ultimately necessary to wrest control back from a number of diverging proprietary implementations by the major UNIX vendors (X11 never did form part of the UNIX standards). I don't fully understand your comment about reducing bloat, unless you mean modularising the graphic display support so you only have to load what you need, rather than building specific binaries for each display type, but that is just a matter of the number of display types that needed to be supported. X11R5 and X11R6 were actually lightweight, even by the standards of X.org.

But I have said this before, and I will say it again. If you don't understand what X11 is actually capable of, then you run the risk of throwing the baby out with the bath water. It would be perfectly possible to keep X11 as the underlying display method, and replace GNOME as a window manager (much as Compiz does, and does quite well). This is one of its major strengths, and would keep us die-hard X11 proponents happy. If you use one of the local communication methods (particularly shared memory) you need not necessarily have a huge communication or memory overhead, especially if you expose something like OpenGL at the client/server interface. The overhead is higher than managing the display as a single monolithic entity, but I don't believe that any of the major platforms do that. There is always an abstraction between the display and the various components.

Having tried Unity and 10.10 netbook on my EeePC 701, surely one of the targeted systems (small display, slow processor) for several weeks, I eventually decided that it was COMPLETELY UNUSABLE at this level of system. The rotating icons on the side of the screen were too slow, and the one you needed was never visible leading to incredible frustration as you scrolled through the available options, trying to decode what the icons actually mean while they fly up and down the screen. It appeared very difficult to customise, and I begrudged the screen space it occupied. My frustration knew virtually no bounds, and it's lucky that the 701 did not fly across the room (note to self - check out anger management courses) on several occasions.

I reverted to GNOME (by re-installing the normal desktop distro), and my 701 is now usable again, and indeed quick enough to be used for most purposes including watching video.

I know I am set in my ways, but I can do almost everything soooo much faster in the old way. I fail to see that adding gloss at the cost of reduced usability and speed helps anybody apart from the people easily dazzled by bling. To put this in context, I also find the interface on my PalmOS Treo much easier to live with than Android on my most recent phone.

I'll crawl back under my rock now, but if Unity becomes the main interface for Ubuntu, I will be switching to the inevitable Gnubuntu distribution, or even away from Ubuntu completely.

Electric forcefield space sailing-ship tech gets EU funding

Peter Gathercole Silver badge

I can't see what prevents

the wires just folding up in front of the payload. I read that the whole thing spins, so centrifugal force will keep the wires extended, but unless the force is very even, surely the pseudo-disk will start to precess as soon as the force becomes uneven (such as when tacking), and as the wires will not be rigid (at 25 microns, they could not be), they will just get wrapped up.

I suppose that you could say that the electric field is what is being 'struck', not the wires themselves, but I think that the small push would be transmitted back to the closest wire deforming it away from the disk.

In addition, spinning the construct would be an interesting exercise, as you would have to take into account conservation of angular momentum, and spin relatively fast when starting to deploy it and slow down as it extends outward, again, because the wires are so thin they cannot be rigid. And twisting it to tack...

The mathematics is beyond me (at least, without getting the text books out), so this is just a gut feeling.

Google revives ‘network computer’ with dual-OS assault on MS

Peter Gathercole Silver badge

@poohbear. Before that, even.

You really need to look at X terminals from the likes of NCD and Tektronix (and Digital, HP and IBM as well) in the late '80s and early '90s. These were really thin clients using X11 as the display model.

AT&T Blit terminals (5620, 630 and 730/740) may also fit the bill from about 1983. You might also argue that Sun diskless workstations (circa 1982/3) were actually thin clients, but that may be taking things a bit far.
