* Posts by Peter Gathercole

2924 posts • joined 15 Jun 2007

'Lenny': Debian for the masses?

Peter Gathercole Silver badge

To anybody who is still reading.

I'm not going to make another post on this thread after this one.

I don't think that Alexander and I are actually commenting on the same thing. I did not say that I don't use MS software. I said I try to avoid it where I can. But... I currently have 7 systems running XP, 2 running Windows 2000, and a couple running older versions of Windows. Hardly MS free.

This is mainly because they came with Windows. The rest of my family use Windows, except my daughter, who uses MacOS. I am their technical support department, so I get to see lots of Windows problems, with networking, printing, Office software and many other obscure problems, including the normal gamut of viruses and trojans. Linux definitely wins here.

I have not used Vista, I admit, but I would say that in this case 'better' is subjective. I read C very well, and I have looked at genetic UNIX source (AT&T derived), BSD and Linux. I know much of the philosophy behind UNIX development, having worked inside AT&T and IBM. Generally speaking the code is very, very good in all cases. I have not seen the Windows code, but from what I have heard and seen, some of the Windows code is not actually understood by Microsoft (one rumour I have heard is that IBM still provide support for some of the OS/2 derived code in the UI). Obviously, this is information by proxy, but what I've heard can't all be wrong.

Of course, Vista is supposed to be a significant re-write, and if the reported resource use is as bad as it sounds (even now), then there is something quite wrong in Vista. Do you think that Windows 7 would be getting as much exposure as it is if Microsoft had not finally recognised that at least the perception of Vista was flawed, even if they do not think the OS itself is?

My personal thoughts about Windows are that the design itself is flawed, in its security, its resource use, and also in the way that users use it. Trying to make Vista secure broke applications left, right and centre, because apps expected to be able to write to strange parts of the filetree.

25+ years ago, Sun came up with a model for using networked computers where a system was never personal. If you used an NFS-connected diskless workstation, or even a shared server, your environment moved with you. The systems could all be near identical, and you could log on to whichever one you wanted, and use it as if it was your home workstation. The UNIX security model needed almost no tweaking to make it work; even the split between users and administrators worked. All that was needed was a little segregation of system data into read-only, read-mostly, and read-write data. There was even an application deployment method that allowed you to install the software just on the servers, and have it used on the network workstations. There was even a model that allowed heterogeneous systems in the same environment. This was a sys-admin's dream.

Microsoft in the intervening 25 years have not managed to come up with a model that works nearly as well. A Windows desktop system is still a Personal Computer in 2009, even with roaming profiles, SharePoint, Active Directory, and all of the other technology they have rolled out. This is because the basic design is flawed, and there is no point in building on cracked foundations. This makes it basically unsuitable for business, even though much effort has been put in to try to make it so. Just go to a desktop Windows system, log in, and see all of the junk left behind in the copies of all of the profiles of users who have previously used the system. And application deployment? After installing MS Office on every desktop, even with scripted installs, one would have thought that someone would have realised that something could be done better.

And what is being rolled out now? Windows 'Mainframes' accessed via Windows Terminal Server, or Citrix XenApp. Hardly progress. It's almost exactly like IBM's VM/CMS environment (in concept, I'm not suggesting that 3270 terminals ran a GUI).

The world has moved on, and I know that the UNIX NFS model is now dated, particularly the network security. But replace UNIX with Linux, NFS with Kerberised NFS 4, or GPFS, or even CIFS, and the model still works. Linux plugs straight in to this environment, and is bringing in new developments.

It is this type of design that I think is superior. It needs to evolve, and be pushed forward, and I think that groups like X.org are making this happen (the UI has needed a re-work for quite some time); with the render extensions and integrated GL in the X server, it is happening. Look at Compiz Fusion: there is scope for UIs to be as pretty and as functional as anything Microsoft or Apple can push out. And guess what. Much of this is being done for free, often by people who code in their working life and contribute in their own time, so can produce good code. Debian is a good distro, and is proved to be so by being selected as the base for so many other distros.

This is turning into an essay, so I'll shut up now. To the moderator, sorry I had to put you through all of this. If it is too much, I'll not be too annoyed if you choose not to post it.

Peter Gathercole Silver badge

@Alexander again again

You have not read my post properly, nor have you answered my challenge. OK, the NHS refused to accept OO documents. You have not said why. Reading between the lines, and from experience, it is probably because although the documents looked OK on the screen in OO, they ended up being formatted poorly in Word. This is a fixable problem that also happens Word-to-Word if you end up with a different printer driver. Tell me honestly that you have never changed the destination printer in Word and suddenly found that the layout of the text has changed. If you haven't, then you have been very lucky. Try adding the Microsoft TrueType fonts to OO and see whether it is better.

And with regard to training, I think that you have reinforced my argument. I said, and I quote, "If your users will only accept MS Office, then OpenOffice will never do". As I said before, I was commenting on home users, and application agnostic users (agnostic in the sense that they are not tied in to an application). Your users were obviously tied.

And again, on the installation. I was proposing having Linux installed, and then explained why it will not happen all the time Microsoft are dominant.

I will admit to being averse to buying Microsoft products myself. And I admit that I do not like Microsoft's anti-competitive practices, as they are morally wrong, and may be illegal. But I do not deny them a position in the market place. Again, I quote from one of my previous posts: "If you want to run Windows, I'll let you. No big."

I question your comments about Microsoft's products being better than their contemporaries. OS/2 was definitely better than Win3.1, and at least as good as NT3.5. DR-DOS was better than MS-DOS 4. AmiPro/WordPro and WordPerfect were better than Word. Netscape was better than IE. Linux is better than XP/Vista in almost all respects with the exception of application availability.

In almost every case, Microsoft was able to kill their competitors' products by means other than technical merit, normally by threats, but also by loss-making pricing cross-subsidised from other products.

I was not talking SME. I have said this several times, and I say it again: I'm talking SOHO, to use the acronym. SMEs have other requirements which definitely make them non-application agnostic.

And I cannot see how ITIL has any bearing on this discussion, either for home users, or even for SME or Bluechip customers. The required procedures with associated documentation when correctly produced will be ITIL compliant regardless of the underlying OS. You could argue that the product documentation needs to be referred to, but Linux has documentation.

And what has six standard deviations from the norm on a distribution curve ("six sigmas") got to do with anything talked about in this comment thread? Do you really know what you are talking about, or are you just using buzzwords?

As I said, I use Ubuntu because I don't want to spend all my time fiddling. I state that it is possible, because I do it. It's no more difficult than XP.

It is not technical merit that makes system builders decide not to install Linux on new systems. It is the fear that Microsoft will remove them from the OEM list, meaning that they have to buy Windows at list price (£100+) rather than OEM price. This would make the cost of installing Linux very expensive in collateral fallout.

Microsoft can now give XP away, because they have recovered all development costs. They are not competing with themselves, as small systems can't run the bloat of Vista. What they have done, however, is a U-turn, because XP was dead in their eyes, and they were trying very hard to make sure it died before UMPCs came along.

And I would calm down if I were you. If your spelling is an indicator, you are stressed.

Peter Gathercole Silver badge

@Rex Alfie Lee

Don't go on at the community. Rant at the wireless card manufacturers. If they did their work for Linux as well as they do for Windows (and I don't believe that any version of Windows up to and including XP had ANY wireless drivers at all), then every card would work under Linux.

It is not the Microsoft software that makes the wireless cards work; it's that CD that rattles around in the box when you buy it. Most modern Linux distros will, when installed off the generic install CD, support many of the common wireless chipsets. Not all, I grant, but definitely the Intel Centrino set, the Prism 2 set, and some of the RTL chipsets.
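For anyone wanting to check what their card actually is before blaming the distro, something like the following usually tells the story (a sketch; exact module names vary by chipset, and USB dongles appear in lsusb rather than lspci):

```shell
# Identify the wireless chipset on the PCI bus
lspci | grep -iE 'network|wireless'

# Check whether a kernel module has bound to it (iwl*, rtl*, ath* and so on)
lsmod | grep -iE 'iwl|rtl|prism|ath'

# List the wireless interfaces the kernel has recognised
iw dev
```

If the first command shows the card but no interface appears, the gap is a missing or out-of-tree driver, not a fault in Linux itself.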

I must admit that I struggled with Ubuntu 8.04 on EeePC 701s. I don't know what they did, but you have to install a modded driver to get it to work. There is a thread on the Asus Community forums that tells you where to get it.

Peter Gathercole Silver badge

@Alexander again

I obviously can't comment on your pharmacy install, but I would expect that the reason why it failed was either the 'it does not look like Word and Excel' argument, or the 'It doesn't run all of the VB code I need' argument.

The first is a matter of training. If your users will only accept MS Office, then OpenOffice will never do. But I would be interested to see how the same users react to MS Office 2007, with its new look.

The 'it doesn't run all of the VB code' is an example of a user locked into MS products, not the application agnostic users I was talking about. I have this issue in my work, where my work issued laptop has Lotus Symphony (a repackaged OpenOffice), but I need to be able to run a heavily scripted Excel application. So I can understand it, but it is not what I was talking about.

I have other personal experiences of this. My 14-year-old son was recently asked to do a PowerPoint presentation for a piece of school work. I suggested he use OpenOffice (installed on pretty much every system in our house), but was told that it had to be done in PowerPoint, because that is what the teacher had demanded. This is an example of the MS monopoly becoming institutionally reinforced, for no good reason. And this is because of the special licensing agreement Microsoft have agreed with most education authorities (== almost free in most cases).

I capitulated, and bought a copy of MS Office Home and Student 2007, which I installed on his gaming rig (see, I run Windows in my home network). It made me feel dirty, as well as a bit bitter about spending money that was not strictly necessary, especially to Microsoft.

But for the average user buying a system for home use, chances are that they will not buy MS Office anyway. Even at £50 (which is what I paid for Home and Student), this is a significant part of the £300-£500 they pay for the complete system. More likely, they will end up with some variant of MS Works. And even this will provide more function than they will need. I don't see what this can do that cannot be done in OpenOffice, and I use both OpenOffice and MS Office (and also several versions of Works).

Please understand this, I am not, nor have I ever in this comment trail, been talking about corporate or business users. But even if I were, I believe that OpenOffice can hack it as long as there is no other application tie-in. If you disagree that strongly, please give an example of what a user needs that cannot be done in OO. Merely being different does not count, however.

It is interesting you mention TVersity. This appears to be a freeware (for personal use) application, a model that is not that different from the one under which most Open Source software is published, although a little more restrictive. Yet you have difficulty with Linux and OpenOffice? I detect a degree of hypocrisy here. And it is a stated aim of the TVersity developers that they intend to support MacOS and Linux eventually. See the General FAQ. And for similar functionality, maybe Elisa may be a better fit on Linux, as it supports DLNA, and is in the Ubuntu repository.

I really don't know what points you are trying to make. If you want to run Windows, I'll let you. No big. But I really feel that Windows is not essential, and a Linux like Ubuntu can currently fit the same space for many or most home users. I know this, because I am doing it. I know exactly how hard it is, and how much of my 1337 Linux haxor skills I have used, and I believe that an ordinary user could do it as well. Can you say the same?

Peter Gathercole Silver badge

@Alexander and William Henderson.

About the only thing I had to fiddle with for my latest Ubuntu re-install was the DVD codecs. And even that was not difficult, and it was only not installed initially because it is patent-encumbered. It picked up my wireless card, prompted me for my initial key, picked up address, gateway and nameserver from DHCP, already had Firefox installed, which then installed flash and other bits as required just as it would have on Windows, and even discovered the printers in a similar manner to Windows. OpenOffice was configured out-of-the-box. Sound worked, as did USB devices, scanners, and cameras. Without a single additional CD.
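For the record, the codec fiddling amounted to a couple of package installs. This is a sketch for an Ubuntu 8.x-era system, and assumes the Medibuntu repository has already been enabled; the package names are from that era:

```shell
# Refresh the package lists first
sudo apt-get update

# Patent-encumbered DVD decryption library (from Medibuntu at the time)
sudo apt-get install libdvdcss2

# Flash, MP3 support, and the Microsoft core fonts in one go
sudo apt-get install ubuntu-restricted-extras
```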

This was significantly easier than the last XP install I did from generic media, where I had to license and activate the OS, and install a whole load of drivers and apps from multiple CDs (including the wireless drivers CD). All with loads of re-boots in the process.

But most users don't do this. They unpack their system from its cardboard boxes, stick it on a desk, and turn it on. This could be achieved by vendors pre-installing ANY Linux distro just as well as they do with Windows. But at the moment, this does not happen. For a brief instant, it looked as if this may have happened in the UMPC and OLPC spaces, but Microsoft crashed those parties by changing their self-imposed rules, and extending the life of an OS they had been desperate to kill a few months earlier.

I state again that I believe that for an application agnostic user (i.e. one who does not need Word, Excel, Internet Explorer, PowerDVD et al., but can use OpenOffice, Firefox, and Totem), Ubuntu installed out-of-the-box will satisfy them, and can be easier. I would state that a gamer does not fall into that category, as they want a particular OS-locked application: their games.

For streaming multimedia, flash just works, and Silverlight (and WMP-based delivery methods such as those deployed by Sky) fall into the category of artificial barriers raised by the vendors. Only Windows will ever satisfy that, even if Moonlight takes off (BTW, try the ITV streaming service, which installs Moonlight and then fails to work anyway). Artificial barriers.

For media serving applications, you should try a packaged MythTV or use the Medibuntu repository, just as you would probably use Windows Media Center rather than a generic desktop Windows install, instead of using a generic user-targeted Linux distro like Ubuntu or Debian. I don't know whether this would surprise you, but TiVo boxes run Linux under the covers.

OpenOffice does have its own, internationally agreed document formats. It also understands .doc as well as several other proprietary formats. No prop needed there.

Oh, and by the way, William Henderson, I understand rootkits, and have followed many of the ways they are deployed, and most of them require very specific vectors to get into a system, which probably will not be present on an end user's Linux workstation. I know it's possible to compromise a Linux system, but it really is not as easy as a Windows one. And the information to fix these is not obscure, just rarely needed. I welcome the day when the copious tools and web pages to protect and disinfect Windows systems become unnecessary. They are only out there for Windows because the exploits are there. Supply and demand.

And anyway, there are virus and trojan scanners for Linux. Clam, F-Secure, Kaspersky, Trend and Grisoft all have products you can install/buy, just like Windows.
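As a minimal example with ClamAV, the open-source one of those (package names here are the Debian/Ubuntu ones):

```shell
# Install the ClamAV scanner from the distro repository
sudo apt-get install clamav

# Refresh the virus signature database
sudo freshclam

# Recursively scan a home directory, printing only infected files
clamscan -r --infected /home
```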

Linux is not out there in the pre-installed space because Microsoft threaten to withdraw OEM status for Microsoft products from vendors who ship systems with Linux installed. This is not hearsay, but a matter of historical record (see recorded cases about Netscape, and the Lotus and WordPerfect office suites). This is enough to put a system builder out of business in the current climate, at least until Linux is widely accepted. But this sounds like a circular argument to me.

So Linux is not being judged on merit, it is hamstrung and hampered by those who would lose out if it really became successful. This is why there has to be a unified front from those who know, and why Microsoft should be picked up on all of their anti-competitive practices.

OK, I'm off my soap-box for now, I'll wipe the froth from my mouth, and will pick up my coat as I leave.

Peter Gathercole Silver badge

Good god, are we really trying to stop Linux!

This comment trail is painful to read, and infuriating.

I am a self-confessed geek, and have been for 40+ years, and also a 30+ year UNIX user (and a Linux user since it has been around). I've supported UNIX systems at a source code level, and have worked in a major vendor's UNIX support centre as one of the senior techies.

I have NOT got the spare time to fiddle with Linux to get it working on my day-to-day system. I do NOT want Windows. So I use a major distro, expecting it to do most of the hard stuff, and this has become Ubuntu. It's quick to install, supports pretty much everything on mainstream PCs, and provides the necessary apps for close on 100% of users who are application agnostic (not everyone, I know).

But I can tweak it if I want. It's close enough to Debian to allow me to put most .deb packages on. And I can compile the stuff up if I need it (e.g. the airprime module to speed up 3G USB dongles). If you want to use the bleeding edge Nvidia drivers, you can, but if you don't, what is in the repository will suffice. But I won't deny other distros the right to exist, or people's right to use them. Nor will I deny the rights of people to use Windows when they know no better.

It's important for Lenny to be produced, because it feeds through to other distros, but it is not the OS for the masses, and the Debian core team probably know that. Ubuntu could be, but the jury is still out, waiting for mainstream app and game support. Normal users want stability, confidence that the system works, the apps they need, and maybe a good update and patching process.

Also, it would not be the first time that a Microsoft product deliberately offered poor support or artificial barriers to alternative technologies. I used to use the Microsoft tools Virtual Desktop (on a work provided laptop, please note) which used to lose windows when switching between desktops. But only the Firefox ones!

To all of the Linux proponents out there, get with this message. We need a DOMINANT Linux distro for the masses. Stop squabbling in public, it's ugly, and makes people turn away.

Royal Navy warships lose email in virus infection

Peter Gathercole Silver badge
Thumb Up


Never mind. You can probably find everything in Ebenezer's War Surplus Supplies Emporium, including a couple of ruggedised PCs.

Left hand down a bit!

Lenovo shares suspended

Peter Gathercole Silver badge
Thumb Down


I agree. The old IBM branded Thinkpads, built in places like Greenock were rock solid.

I pick up my old T23, and still think it is a good piece of kit. My current personal workhorse is a T30 (on which I am typing this), which shows where things started going wrong, but is still quite nice. I have just been provided with a T60 by my employer, which I am beginning to hate. I mean, why put all the sockets on the side rather than the back?

Once Lenovo started adding their own ideas, they lost sight of what business laptops were all about, and just became another manufacturer.

Oracle tripped up by 'leap second'

Peter Gathercole Silver badge
Paris Hilton

time and reference

I'm sure that the reason why leap seconds are necessary is that the universe is only perfect if treated holistically.

I don't believe that any scientist or mathematician is arrogant enough to claim that they can take into account all of the significant gravitational objects that could affect either the Earth's orbit or its rotation. I'm fairly certain that a large asteroid strike, volcanic eruption or earthquake could affect the rotation of the Earth, and the solar wind pressure may perturb the orbit enough to introduce a measurable variance in it. And that's not to mention Andromeda.

So no, the Earth's orbit is not mathematically perfect.

And anyway, who told God that the rotation and orbit of the Earth should follow some nice fixed relationship? It's all a coincidence.

Is Paris enough of a heavenly body to shake the world?

Windows for Warships™ reaches Royal Navy frigates

Peter Gathercole Silver badge

Open vs. Closed

For goodness sake. It will be something like 10 years before these ships will have another major refit, so will they be stuck with WforW for all of this time? I'm sure the Type 45s are expected to have a 30 year+ lifetime.

With Windows, the MoD are beholden to Microsoft, and have to negotiate extended support at whatever price MS want to charge (how long will Win2K have been out of support when the next refit comes?).

At least with Linux, as a supplier, you can fork the code, and take complete ownership of the product. The length of support becomes how long you are prepared to pay people to be familiar with it, and the security becomes as good as the people you employ. If they are good, you don't even have to rely on the community for fixes. I would have to check, but I believe that provided you keep a product in-house (or should that be onboard-ship), you would not have to release any modified source.

I suppose that one good thing is that if they are using commodity hardware, then it would be possible to drop in the latest Dell server into the rack, provided that you can get the Win2K drivers for the SATA disk and display adapters. Or maybe retrofit another OS.

Viacom to remove Time Warner's Spongebob Squarepants

Peter Gathercole Silver badge
IT Angle

Pay vs. Free TV

At the risk of starting a major flamewar, I think that some of the people commenting about reducing what they pay for TV should look at who actually pays to have new material created.

There are basically four models for paying for content.

1. Public funding

2. Subscription charging

3. Advertising

4. Charity/Sponsorship

Please note that there is a significant overlap between the last two (company sponsored programming). And I am aware that there is a divide between the content creators and the carriers.

In the UK, we are fortunate to have a publicly funded broadcaster who has money to commission good content (category 1). This means that some of the ratings chasing necessary to attract the diminishing advertising revenue is not necessary, and gives the programme makers some leeway. This is what pays for much of the BBC's output, and a significant part of what appears on the free and pay channels Discovery and the UKTV channels carried on Sky, Virgin and Freeview/Freesat.

But the rest is a mishmash, mostly categories 2 and 3. If costs go up (as they do), then either advertising revenue has to increase, or subscription charges go up. Ditto loss of advertising.

But consider this. Let's say that the ratings on cable/satellite go down. The deals with the advertisers are based on numbers of views of the adverts, so advertising revenues go down. So subscription fees must go up (especially if the ratings reductions are a result of lost subscribers). This is also what must happen if the subscriptions become more specific, allowing customers to select their exact package. The alternative is that the amount of newly commissioned material goes down, and all we get are repeats, which are cheap.

I'm sure the model is subject to tweaking, but if you want a good selection of new programming, which must include subjectively good and bad material (everyone has their own preferences; I don't like SpongeBob), then you must expect to pay for material you don't like.

Like many things, it is a compromise, a bit like democracy. Some of you just want something for nothing.

FSF throws sueball at Cisco

Peter Gathercole Silver badge

@Russel Jackson

I'm no lawyer, but I believe that GPL3 removes the requirement to publish the source for "system libraries", which include all OS libraries and the runtime libraries for languages and tools like Python and Ruby. This is how I interpret the information explained at http://www.gnu.org/licenses/quick-guide-gplv3.html. The test appears to be whether the library could reasonably be expected to be freely available.

In the case of a port to new hardware, this test may fail, and may require all source to be published.

Of course, it is likely that the Linksys products were published under GPL2.

Peter Gathercole Silver badge


I'm not sure that I understand what the infringement is either.

As I understand it, Cisco can only have infringed the GPL if they have modified any of the GPL'd code, and put it in a product without making the changes they made to the source available. If this is the case, all they need to do is to provide, on request, the altered code to whoever asks for it. They do not even have to put it on a publicly accessible web site.

There is certainly no block on USING, say the GCC compiler, libraries and other tools to create a commercial product which is NOT covered by the GPL, provided that no modified code licensed under the GPL is included in the product. If this were not the case, then any software developed using GPL'd tools would have to be published under the GPL, a situation that is explained at some length in the information about the GNU Public License.

Linking against the libraries is allowed, but modifying the libraries, compiler or tools to produce the product without making the source changes available is not. I would assume that the process of porting to a new architecture which required library or compiler changes would require those changes to be made available.

I would expect that if the libraries were modified as part of the porting process to produce the product, then these changes would have to be published, although it may be a moot point as to whether using a modified compiler counts as included code. It would certainly be against the spirit of the GPL, but it may not break the license in the strictest sense. This one is probably for the lawyers.

Just my two-pennies worth, which btw. can be found in the right pocket.

Netbook SSD usage to fall under 10% in 2009

Peter Gathercole Silver badge

EeePC 701

I'm not typical. I know what I am doing, and I use my eeePC as an adjunct to my existing systems. I got tired of Xandros after the 2nd re-install (I'm sure there is something wrong with the UnionFS implementation on it), and put Ubuntu 8.04 on it. It works fine, and it is just sooo much easier to get out when I need to look something up on Wikipedia than my full sized laptop. I've also used it as a roving WiFi tool when sorting out problems with larger networks.

I even use it to vnc to my other systems, although the small screen size is a bit of a bind. I also had to install a flash-blocker in firefox, because multiple flash adverts can really sap the life out of the processor.

I generally carry around 8GB SD cards with ripped film and music on them. It's amazing how many will fit in your pocket. For this type of content, it is not really a problem swapping them around. I intend to keep using it as long as it works.

Broadcasters and ISPs cosy up for iPlayer on Freeview

Peter Gathercole Silver badge


I'm sure that the old license contained a phrase about "owning equipment capable of receiving a TV signal", but that predated broadcasts over the Internet.

I got involved in an Email discussion with Castle Communications (who issue and enforce the TV license) about computer based freeview adapters, and was shown the earlier quoted text, but it had not registered that this clause gave you a way out. I found out that you can use mobile (i.e. not mains powered) TV receivers legally anywhere as long as you have a TV license at home, but as soon as you plug it into the wall, it is illegal unless the location is covered by a license.

Remember, it is up to them to prove that you are using TV receiving equipment, and they do not have a statutory-right-of-entry to your house.

Mine has the TV license in the inside pocket, in case I am challenged when using my laptop (please note, ON BATTERY POWER) to watch TV.

HP breaks Japanese excessive packaging record

Peter Gathercole Silver badge
Thumb Up

Maybe not as good but...

I took delivery some years ago of some IBM kit, in a big box on a pallet.

We opened all of the smaller boxes that we recognized, and found that we were left with one moderately sizable one (about CRT monitor size). We opened it, wondering what extra goodies we had been sent, only to find it was completely empty. Apparently, it was to fill the box to keep the other stuff safe. This was apparently normal practice, and IBM had a range of empty boxes, all with IBM part numbers, to serve this function.

DVLA under scrutiny over penalty notice dating game

Peter Gathercole Silver badge

Government department?


The DVLA is actually an "executive agency", not a government department (the clue is in the name, the Driver and Vehicle Licensing Agency). This makes it responsible to the Department of Transport, but it is actually at arm's length from direct ministerial control. The DoT and Parliament make policy and statute, and the management structure of the executive determines how this policy will be carried out (at least in theory).

This makes it interesting in the case of some of the security leaks from executive agencies, as in theory, the management of the agency should resign as being responsible, not the ministers.


Porn breath tests for PCs heralds 'stop and scan'

Peter Gathercole Silver badge


HFS+ may be proprietary, but code to look at HFS+, as well as pretty much any other non-encrypted filesystem, exists for Linux.
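As a sketch of how little is involved, the in-kernel hfsplus driver will mount an unencrypted Mac volume directly (the device path and mount point here are assumptions for illustration; write support is limited, so read-only is the safe choice):

```shell
# Load the HFS+ driver, which ships with mainline kernels
sudo modprobe hfsplus

# Mount the Mac partition read-only for inspection
sudo mkdir -p /mnt/mac
sudo mount -t hfsplus -o ro /dev/sdb2 /mnt/mac
```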

Europe takes aim at BBC licence fee

Peter Gathercole Silver badge


I subscribed to Sky HD last year, and was appalled by the LACK of HD material on the Sky channels. Very little of it was actually HD; much was just up-scaled standard definition (is the Simpsons in HD actually better than SD?). And unlike the BBC HD service (which is FTA, even on Sky HD), I pay a £10 a month premium, as well as having paid over-the-odds for being an early adopter (HD box £300, installation etc.). So Sky are not doing the HD stuff out of the goodness of their hearts.

Compare this to the BBC HD output, and you will see that the BBC are actually commissioning much more real HD content than Sky. And the BBC have developed an HD delivery platform in FreeSat (and also HD FreeView) for which other people can build the kit (Sky and Virgin do not actually *make* their kit; it is subcontracted and then branded).

The real HD benefits on Sky HD appear to be for Sport (which I do not want), Movies (not Sky produced), and the HD documentary channels (a good part of which actually show content part financed by the BBC). I am actually thinking of DROPPING the HD subscription, although the number of Sky channels is improving at the moment.

My Sky subscription currently costs me £576 per year, which buys about 8 hours a week (at an estimate) of new Sky-produced material; the rest is access to many other channels not produced by Sky (much of it carrying BBC material - look who owns the UK* channels). My, the BBC licence fee actually looks really good value for money, especially now that FreeSat is available as well as FreeView.

The reason Sky want the BBC shut down is so that they will have a near-monopoly of TV delivery (also the reason why they are so antagonistic towards Virgin buying into ITV), so that any content worth watching has to be paid for, and paid to BSkyB. It's all down to money and profit.

I agree that the licence fee is effectively a tax on owning TV receiving equipment, but why not just think of it as a subscription to whichever part of the BBC you use? It's still good value for money, even if you only use the radio, or the news, or the FreeSat or FreeView delivery platform.

Asus to phase out sub-10in Eee PCs, says CEO

Peter Gathercole Silver badge

Xandros not up to scratch

Not sure what happened, but on my 701 I managed to fill the root filesystem, and there was no way I could free up space. Somehow, whatever I wrote on the root fs got copied onto the unionfs copy. It got to the point where the system no longer booted. I tried booting from a USB flash drive (which worked), but was still unable to free any space at all.

This was the second time I had seen Xandros screw up, and I am a long-term (10 years) Linux power user (and even longer with UNIX), and could not fix it. Xandros appears to have a really strange startup process that I just could not get to grips with. I decided to put Hardy onto the system, and once the drivers were sorted, it works fine.

I like the idea of unionfs, but I just could not fathom what was going on. As far as I was aware, the read-only copy was supposed never to change, so that you could restore the system to its just-installed condition.
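For anyone who has not met it, the idea looks roughly like this. A sketch only: the branch paths are illustrative, and Xandros's actual layout may well differ.

```shell
# unionfs overlays a writable branch (/rw) on a read-only one (/ro).
# Writes are meant to land only in /rw while /ro stays pristine,
# so emptying /rw should restore the just-installed state.
mount -t unionfs -o dirs=/rw=rw:/ro=ro unionfs /union
```

Deletions of /ro files via /union are recorded as "whiteout" entries in the writable branch, which is why the read-only image can, in theory, always be recovered intact.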

Xandros is probably fine for the simple mode, but I would not choose it as a general purpose Linux distro.

Windows 7 early promise: Passes the Vista test

Peter Gathercole Silver badge

With regard to Vista

OK, we all know that Vista Premium or above works OK when installed from scratch on a contemporary PC. I would hope that any hardware vendor expecting to sell a system with Vista would spec the hardware properly to give a good user experience.

Where I have a problem is with Microsoft trying very hard to quash support for the perfectly usable 3-4 year old systems that still run XP very well, but do not have the oomph for Vista. Are Microsoft in league with the hardware manufacturers?

I tend not to dump systems that still work, so I have three systems running XP on AMD and Intel processors between 1 and 2 GHz, using 256-512MB of memory and modest AGP graphics cards. The kids use them for their homework. So I buy a new HP printer, and find that I cannot load the drivers on an XP system any more, even though the printer used to ship with an XP driver disk (I actually have two of the same printer). The reason? Microsoft have forced other vendors to withdraw drivers that were designed for XP, using clauses in their Windows licensing agreement. HP even said as much. They do not carry the drivers on their support website, and provided a helpful sheet of paper with the printer saying that I should keep the XP driver disk safe (the one from my earlier purchase), as HP would not be able to provide the driver after a certain date.

Not sure how the climbdown over XP sales for UMPCs has affected this.

Also, how long are other software vendors going to be able to produce AV, firewall, and other required pieces of software that rely on XP libraries and DLLs? I'm sure that when Microsoft actually decide to kill off XP for good, they will attempt to make all the necessary utility software writers dump XP as well, using similar clauses.

I understand that Microsoft may not want to offer full support for older OS releases, as they have a business to run, and providing support forever is just not profitable (though I bet the US military get 10 years of support from product withdrawal). But trying to get other businesses to drop support just to make users replace usable computers is just wrong.

Generally, my kids want Windows (one of my sons was very upset the other week because I do not have MS Office to allow him to continue some school work at home), even though I am a committed Linux advocate. Microsoft are just using their dominant position unfairly wherever they can. I know I can get a student edition of Office, but that is still £85 or so. OpenOffice is free, so why don't schools use it? Because schools can buy one copy of Office and then use it on as many systems as they want (and on teachers' own PCs as well) without extra payment, making it effectively free. What chance have other software vendors got?

Anyway, rant off.

Peter Gathercole Silver badge

@AC re. NT and USB

I think that you will find that NT predates Win95 by some years (even NT4 was about the same time as Win95, so predates OSR2).

Do you remember USB on Win95 OSR2? Yes, USB was in the OS, but you had to load 'drivers' for each device (it did not understand device families, so the USB ID of each device needed to be added), so plugging in a new memory stick or printer required you to put the driver disk in before you could use it.

I added USB support using vendor-supplied OS extensions to NT4 on both Compaq and Dell PCs. It was not complex, and worked at least as well as in Win95 OSR2.

Shuttleworth on Ubuntu: It ain't about the money

Peter Gathercole Silver badge

Free OS does not prevent paid-for software

Just because the OS is free does not mean that a software house cannot charge for its wares that run on that OS.

The various public licences have clauses stating that you cannot incorporate free code into charged-for software, but they normally also allow you to compile against a system's libraries, and to use its command sets in your software.

So, if you have an idea which can result in a software package, it is perfectly possible to charge for that software. You don't have to contribute it back to the open-source community unless you have taken GPL or similar code and incorporated it into your package (and some of the licences even allow a degree of that). But you can use GCC, Perl, PHP etc. to create the software, including the libraries. I have wanted an HMRC-certified small-business payroll package for Linux for ages, and would be prepared to pay the same for a Linux package as I would for a Windows one.

It is only where there is an already-available good free package that you would have difficulty selling your software. If you can sell a good DVD creation or audio editing package on Windows, you can do the same on Linux. Where is the difference? It is only people who do not understand the licensing model who think everything on Linux has to be provided free. Has the availability of Audacity on Windows stopped people selling audio editors on Windows? No.

I would agree that the proliferation of packaging tools is a problem, but you should really treat each distro as a separate OS, at least until a common packaging tool is agreed on. Until that time I will continue making the suggestion that Ubuntu is as good a candidate for the dominant distro as any.

Servers take dive in IBM's third quarter

Peter Gathercole Silver badge

@AC - WTF?

In case you had not noticed, consolidation in the services and hardware business has been going on for the last 30 years. Ever wondered where Tandem, Compaq, Digital Equipment Corp., Pyramid, Data General, ICL, Amdahl, Sequent, Nixdorf, NCR, Honeywell or MIPS went? (I could go on; the hardware market only really has three or four major players in the non-PC space.) All of these have been subsumed by larger companies.

If you think that this has made it easier for startups, then you have a distorted view of the hardware market. Study what has happened to Transmeta or Inmos, who were new companies built around innovative products.

What now happens is that a good startup produces a good product, fails to get the capital to exploit it properly, and promptly gets bought by one of the larger players.

Just wait for IBM, AT&T, and Microsoft to start stifling new startups by leveraging their patent portfolio. It will be nearly impossible for innovators to even get started.

All that the financial mayhem will do is shake the market out even more.

The same will happen in the software market. Just how many companies have been bought by Microsoft, Oracle, IBM, Sage, Computer Associates (spit), Google, Symantec et al. because they were competitors or had products that were genuinely new?

I agree that nobody is producing good software for SMEs at the moment, but that will not change, it will just get more difficult to start. There are just too few people prepared to venture money at the moment.

I will state an interest here. I have been an IBMer for seven years (although I missed the blue blood transfusion - I kept to my UNIX roots), and currently get a lot of my work on IBM kit. IBM are not perfect: their software is patchy, rushed to market, and the quality has gone down in recent years. But you are not going to get government agencies, blue-chip companies or major utilities buying software from a startup. Their buying policies will not allow purchases unless a company has a history and a good credit rating. Sad, but true.

All that rationalisation will do is remove choice that will not be replaced. Besides, IBM is reporting GOOD figures (growth is growth), and is extremely unlikely to go under, as they are not exposed to the credit market, being cash-rich! Far more likely is that Sun will go. IBM or Microsoft could buy Sun out with the change in their pockets, especially if Sun's stock crashes.

I would hate the hardware market to be IBM, HP and the PC manufacturers. We need competition, and currently only large players can compete.

Renault looks to wee-hued windows to cut car power draw

Peter Gathercole Silver badge
IT Angle

@Horse loving AC

You miss out the rest stops, maintenance (vet and farrier) bills, expensive garaging, the requirement for fuel even when not in use, and basic stubbornness whenever its advanced autopilot decides that it does not like fluttering plastic bags, flashing lights or even drain covers.

Also, are you going to invent the waste collection services? Because I believe that before the advent of the car, major cities were inundated with piles of horse sh*t all over the place.

Thunderbird 3 release has wings clipped

Peter Gathercole Silver badge

Thunderbird 3, wings?

Come on. Everybody knows that Thunderbird 3, being a space ship (and orange!), had no wings, just three atomic powered motors.

It's got Fanderson written on the back. Thanks.

Kentucky commandeers world's most popular gambling sites

Peter Gathercole Silver badge

.com madness

The issue here is that the agency responsible for the .com top-level domain, as well as the registrars for its subdomains, are within US legal jurisdiction. This allows a US judge to issue binding rulings.

There ain't no way this could happen to a .uk or a .ru site.

I move that we make the US adhere to the rules that the rest of the world works by: give them a .us domain (does it exist already? must check), and make .com a worldwide domain under the control of the UN or some such organisation.

Oh, and by the way, make it so .co.uk is actually limited to registered UK companies (.co == companies, gettit), and have a .pers.uk, or some other non-business-oriented domain, for non-corporate entities (are you listening, Nominet?).

Still, probably too late now, especially as all of the root DNS servers are under US control as well.

Anybody fancy setting up a new independent set of breakaway domains for a new Internet? I'm sure it could be done as long as you don't need them to be registered with ICANN. I guess the main problem would be getting the IP addresses for your new root DNS servers. Ho hum. Maybe when IPv6 becomes widespread.

Oh, it's (ironically) the US flight-jacket-style coat at the back on the right. Yes, the one with the torn pockets.

Sun's solar wind hits 50-year low

Peter Gathercole Silver badge

@Terrence Bayrock

Completely agree with your last sentence. I've been saying the same for ages.

Electric Mini spied in Munich

Peter Gathercole Silver badge

Manufacturing costs

I admit I have not studied the overall manufacturing costs, but I'm sure I have read in a reliable source that the supply on this planet of high-energy light metals, particularly lithium, is severely limited. And because of their properties (they are very reactive), they tend to be difficult to mine and refine, all of which takes energy.

It is OK to use current prices at today's demand level, but if we are to have a battery-powered transport system, I suspect that the demand for these metals, particularly lithium, will easily outstrip supply. Prices will jump, and the whole technology may become too expensive.

Currently, it is wasteful that battery manufacturers are making disposable lithium AA-sized batteries (just look at the supermarket battery section). Because of its chemistry, lithium is likely to become much more important across the whole range of manufacturing. We must introduce a lithium recovery programme in the waste stream, and educate users not to throw these batteries in the domestic waste.

BTW, current LiPoly batteries have a duty cycle of about 1000 discharge-charge cycles. This means that keeping a daily-driven car on the road will require the batteries to be replaced after 3 years, with progressively poorer performance towards the end of that time, like your laptop (quoted charging efficiency for LiPoly batteries is 99.8%, and 0.998^1000 = 0.135, i.e. 13.5% of original capacity at the end of the battery's life). Good news for the car manufacturers, bad news for residual car values. How does that factor into the overall cost of ownership of an electric vehicle?
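The arithmetic behind that capacity figure, for anyone who wants to check it:

```shell
# Remaining capacity after 1000 cycles, losing 0.2% per cycle
awk 'BEGIN { printf "%.1f%%\n", 0.998^1000 * 100 }'   # prints 13.5%
```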

Of course, we could just wait until someone invents a better battery technology, but a chemical battery is unlikely to provide much more energy than is possible using lithium without actually being dangerous (it's chemistry, stupid). I wish Shipstones and micro-pile fusion generators were not science fiction.

HP loads PC with nonexistent web browser

Peter Gathercole Silver badge

Sound familiar to UNIXers?

Hey, guess what. HP have just re-invented the chrooted environment.

We've been doing this in UNIX land for 30+ years.
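The classic recipe, for anyone who has not seen it. A sketch only: the paths are illustrative, it needs root, and a real jail also needs the shell's shared libraries copied in.

```shell
# Build a minimal tree and confine a shell to it.
mkdir -p /srv/jail/bin /srv/jail/lib
cp /bin/sh /srv/jail/bin/       # plus whatever 'ldd /bin/sh' lists
sudo chroot /srv/jail /bin/sh   # this shell now sees /srv/jail as '/'
```

Everything the confined process can reach lives under the jail directory, which is exactly the sandboxing trick being re-invented here.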

Nothing is really new nowadays. Especially my coat!

BOFH: Lock and reload

Peter Gathercole Silver badge


I was only wondering...

And yes, I have managed routers and terminal servers, but not for about 15 years! And even then, it was mostly telnet and latterly SSH. This is why I asked.

I probably have as much RS232 terminal experience as anybody, having been in tech support for 25+ years, and having been the SME on terminals wherever I was for much of that time.

I even wrote a complete (and I mean complete: every function coded as per the DEC documentation) VT52 emulation and a TEK4010/4014 emulation for the Beeb (I did not do a VT100, as commercial products like Computer Concepts' Communicator and Acorn's Termulator came along that did a reasonable, although not perfect, job).

Yes: Newbury, DEC VTxxx (40 through 420, including ReGIS graphics), Wyse 50 and 60, HP2394, IBM 3151 (spit, bloody compatibility [or lack thereof] cartridges), 3152, 3153, Beehives, TTY43, IBM Selectric golfball typewriter conversions, Tektronix 4010-4125, AT&T 5412, 5620 BLIT and 630s, plus a huge number of compatible and near-compatible terminals that I cannot even remember. I could go through the BSD termcap source and tick off having used or seen a significant number of the terminals listed therein!

And... I am still intrigued as to whether people still use that RS232 port on the back of the router.

Mine has the 1980 copy of the DEC Terminal Handbook falling apart in the pocket.

Peter Gathercole Silver badge

Does anybody still use...

...RS232 terminals, or even TTY emulators, to manage routers? Surely SNMP, bespoke applications and web-based access rule nowadays. Can you even buy asynchronous terminals any more?

Oh well, must be the recycled stuff from the bin that they managed to re-use and sell back again.

I need my coat, it's raining here!

Open source release takes Linux rootkits mainstream

Peter Gathercole Silver badge

Virtualization vulnerability

The extra features that AMD and Intel have added to the processor include a new privilege mode, some extra instructions to enter this mode, and some instructions that will only run in this mode.

It is like the normal/supervisor mode that most CPUs for multi-user OSes have implemented for over 40 years (think IBM 370, DEC PDP/11). It just adds an extra level ABOVE the operating system, a super-supervisor mode, which should normally be occupied by a hypervisor (which treats OS images the same way an OS treats processes). The concept is quite simple to visualise if you think of the virtualised OS images, or LPARs, or whatever you want to call them, as processes, and the hypervisor as the OS.

There is supposed to be guarded access to this space, both at the memory level and at the program level. An OS is supposed to be able to request a service from the hypervisor, which can then vet the request and action it (or not). The OS making the request should NEVER be able to inject code into this space, nor should it be able to write into the memory in this space.

The sort of things that can be performed in this space include memory mapping to the OS memory space, scheduling of OS images, and inter-OS communication (used for virtual network and storage devices). Often, the hypervisor can 'look into' the memory space of a virtualized OS, and can monitor all traffic being sent between OS images. Scary, really.

If it is the case that code in an OS, or even worse an unprivileged process within an OS, can compromise this divide, then there must be a serious design flaw in the CPU architecture, or possibly a problem in the default state after IPL. This, in my view, should be a reason to avoid using this technology until it is fixed.

BTW, the earliest example of a hardware-based virtualisation system I came across was probably Amdahl's mainframe Multiple Domain Feature (MDF), which I first used in about 1985, although there were rumours of IBM doing a hardware version of its normally software-based VM earlier than this. The System/370 Advanced Function architecture had hardware assists to allow VM to work better, including memory keying. IBM's software VM system first appeared in 1972.

Nothing is ever really new nowadays.

Peter Gathercole Silver badge


Actually, running a command using runas does not give the same level of access as running in an admin account. I don't know the full details, or how it differs, but I have had two occasions (acting as the de facto Windows sysadmin for my family at home) when a command would not run correctly under runas, but did when the same admin user was actually logged in.

I think it has something to do with the inheritance of the privilege by processes forked from the top-level process, but it gave me a lot of grief until I noticed it.

In addition, things like auto-installers for devices probably won't work like this until you reverse-engineer the autostart process on the device's install disk. I would hope that the service that notices new hardware does not run with administrative rights, otherwise you could 0wn any Windows box with a rooted device-install CD. Not good.

@Tom: OK, you MD5sum the kernel, and all the kernel modules, and all the runtime-bound libraries, and all the commands that you might expect root to run. Where do you store the expected sums? On the system? Off the system? And what about the md5sum binary itself? Is that inviolate? Things like Tripwire and AIX's TCB have been doing this for years, but honestly, once you start thinking about it you end up with recursive arguments, unless you have some trusted runtime environment like that proposed by Nigel.
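The mechanism itself is trivial to demonstrate; the recursion problem is deciding where the sum list and the checksum binary themselves live. A self-contained sketch using throwaway files:

```shell
# Record a checksum, verify it, then show that tampering is caught.
tmp=$(mktemp -d)
echo "trusted content" > "$tmp/binary"
md5sum "$tmp/binary" > "$tmp/sums.md5"   # the reference list an attacker must not reach
md5sum -c "$tmp/sums.md5"                # reports OK while the file is unchanged
echo "rootkit payload" > "$tmp/binary"
md5sum -c "$tmp/sums.md5"                # now reports FAILED and exits non-zero
```

The whole scheme stands or falls on the reference list (and `md5sum` itself) being stored somewhere the attacker cannot rewrite them, which is exactly the recursive argument above.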

To all Windows users: have you actually checked how much of your C drive is actually writable by ordinary users? Do you even know how to check on XP Home, or what the ACLs mean? Wherever I have seen Windows locked down hard, I have also found things like Word not being able to exit, because it thinks it has to write some information to a template or somewhere else it doesn't have access to. I can't quote specific examples, but it's happened to me.

You might be surprised to find out how easy it is as even a normal user to change .dll files that are used by other commands.

I'm sure later versions of Windows are better, but why has it taken so long?

Peter Gathercole Silver badge

@Gordon Fecyk

I'm not saying it can't happen. It is possible to engineer root access to a Linux or UNIX system that is managed remotely as long as there is a single vulnerability, even a non-root one. BUT using root as infrequently as possible, rather than having admin rights all the time (the typical Windows user), must be more secure, even if only by degree.

All the time you have the concept of escalated privilege to perform some function, you have the possibility of it being abused. This will NEVER completely go away, regardless of the OS, until computers are so locked down that you cannot change anything. So make it so that you use the escalated privilege as little as possible. No version of Windows I have come across has taken this line, with the result that too many people HAVE to run as admin for their apps to even work correctly.

So even though this development is worrying, I am still slightly smug, but cautiously so, and with some respect for the skill of the people writing the rootkits. They are MUCH cleverer than most of us who merely comment on the effects of their work. Pity about the script kiddies, however. But you cannot control information the way you can a physical object. Once it's out in the wild, it's out, whether by design (open source) or by accident (leak). The vector makes no real difference.

Acorn alumni to toast tech pioneer's 30th anniversary

Peter Gathercole Silver badge


Not sure about your comment about the paged ROM area being faster.

The Beeb did use fast (for its time) RAM, but I'm fairly certain that the ROM area was slowed down to 1MHz, while the processor clock was 2MHz (although this may only have been the early systems, with the OS and BASIC in EPROM). The speed was required for the RAM because the CPU and display ULA had interleaved access to the memory, so that both the display hardware and the processor could access the memory at (their) full speed without slowing down the other.

Where the memory was improved was by bank-switching the language ROM area (8000-BFFF hex) and OS ROM area (C000-FFFF) with the display. Acorn did this with the BBC B+ and Master 128 with the shadow screen, but it introduced compatibility problems with programs that did not use the OS routines for writing to the display. I think Acorn copied the idea from either a Solidisk or a Watford hardware add-on for the original Beeb.

But these systems never really reached the same level of popularity as the original BBC B. Probably its time had just come. I still think there needs to be an education system as accessible as the Beeb for our schools. PCs just do not engage the same degree of enthusiasm in kids or teachers.

Peter Gathercole Silver badge

One of my favorite subjects


Fast, flexible, fantastic.

I was blown away by an 8-bit micro doing 3D hidden-line-removal wireframe graphics close enough to real time to be usable for a game (Elite).

Not only a good teaching machine, but well made, with a consistent OS, and brilliantly documented. Modular, vectored OS calls, overlaid sideways ROMs. The way all home PCs should have been made.

The only real criticism was that it had too little memory. When using modes 0-2, 20K of the 32K was used for the screen, with 3.5K used for various stacks, buffers and character maps when using the cassette filing system, and an extra 2.75K used if using the Acorn disc filing system (DFS). Woe betide you if you also had Econet (NFS, though not the Sun offering), which took another 1K. That left you with about 5K for your program. You soon learned to turn off the filing systems that were not in use.

And if you used ADFS, you lost even more. On a normal Beeb, you only really did this if you were running an Econet II fileserver, and you needed a 6502 second processor for that (yes, the Beeb could be networked even before it was popular to do so).

Terms to trigger nostalgia: PAGE, RAMTOP, OSCLI, VDU, Fred, Jim, Sheila, OS 0.9E, OS 1.2, BREAK, Escape, Tube, 1MHz bus, Ferranti ULA, Teletext graphics, Attacker, Snapper, Panic Attack, VIEW, and...

"A plastic flower in a Grecian Urn, Goodbye Peter, now press RETURN"

'Nuff said.

CERT: Linux servers under 'Phalanx' attack

Peter Gathercole Silver badge


One thing that needs reinforcing is that unless you have a security hole that lets a non-privileged user across the security divide, it is just NOT POSSIBLE to install a rootkit on a properly run Linux system. Rootkits, by their very nature, need to alter or add code to the kernel, libraries or modules used to run the system. And this needs root permissions. This is why it is very important to do AS LITTLE AS POSSIBLE as root on a UNIX-like OS.

There are a number of well-known ways to try to subvert a user currently logged on as root, but a reasonably savvy sysadmin should be able to avoid them (you know: don't browse the internet as root, check that your path does not allow commands from the current directory, make sure there are no world-writable executable script files, don't read mail as root, keep firm control of the permissions on the directories in root's path; all the simple stuff).

If a rootkit cannot be installed, it cannot compromise your system, nor can it get access to any SSH keys other than those of the user it is running as.

Please note that I am not saying Linux is totally secure, there have been, and will be in the future, code and design defects which could allow a system to be compromised. I firmly believe, however, that the open source model allows such things to be identified and fixed much more rapidly than a closed source model. Couple this with an effective notification and patch delivery system, and Linux just is more secure.

Contrast this with Windows, where many people by default use an account with admin privileges, or with the security notifications turned off. Asking for trouble, as far as I can tell.

But the amazing thing is that the UNIX/Linux security and source model is decades old (I've been using UNIX for 30 years, and the Bell Labs UNIX V6 and V7 code, and all the BSD code, was available for inspection and modification by the academic community and others for at least that long). And Microsoft (who, in fact, have been UNIX licensees for at least 24 years; they did the original Xenix ports to various architectures before spinning off the original SCO) just don't seem able to learn.

Peter Gathercole Silver badge

There are things that can be done

Once someone can get access to a system, the whole security picture changes.

There is no system in existence that will prevent apparently authorised users from doing some damage, but the degree of damage is what matters. Where Linux benefits is from a strong divide between normal and privileged access. Sure, if your private key AND ITS PASSPHRASE are compromised, then someone can access your system as you. But that is just your non-privileged account, isn't it?

Of course, if lazy admins directly access root using SSH, or have passphraseless SSH keys, or have sudo rules that let them cross the security divide without further confirmation, or store both private and public keys on their boundary systems, or use the same private key throughout their whole environment, then these fools deserve to get their systems compromised.

So here are the rules.

- Use a non-privileged account for initial access to any system

- su or sudo to obtain root access, but use additional authentication steps

- Don't use passphraseless SSH keys, unless you tie down what can be run (see SSH documentation).

- If possible, use hardware based authentication to secure private keys

- Guard your private keys like your life depends on them

- Don't store private keys on systems that do not need them

- Make sure that the permissions on your .ssh directory only allow your ID to see the private keys (I know ssh does some checks, but 0600 is best on files, and 0700 is best on directories)

- Use different keys in different parts of your organisation

- Consider using passwords with SSH (with strong change and strength rules) rather than SSH keys for very critical systems (really)

- Be careful about storing your private keys on shared Windows systems, or systems that have remote users with administrator access (consider portaputty and store the keys on an encrypted USB key).

- If you are really paranoid, regularly change your private and public keys on all your access boxes (please note it is NOT enough just to change the passphrase!)

If you follow these rules, then even if one of your private keys is stolen, the amount of damage can be limited. As always, you can run a potentially secure system in a non-secure way. Security is only as strong as the weakest link, and that is often the sysadmins!
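A couple of those rules translate directly into commands and config. A sketch only: the script path and the key comment in the authorized_keys line are made-up examples, not real values.

```shell
# Tighten permissions on the key directory and the private key.
chmod 700 ~/.ssh
chmod 600 ~/.ssh/id_rsa

# Tying down a passphraseless key in ~/.ssh/authorized_keys, so it can
# only run one command and gets no terminal or forwarding:
# command="/usr/local/bin/backup.sh",no-pty,no-port-forwarding ssh-rsa AAAA... backup@host
```

The `command=` restriction is what makes an unattended (passphraseless) key tolerable: even if stolen, it can only trigger that one script.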

Oh, and by the way, if you want to see a file that is hidden by a hacked ls, try "echo *" or "find /etc -print". Or maybe use filename completion in the shell. This is UNIX (or close to it), so there is nearly always more than one way to do something.
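To illustrate that last point:

```shell
# Even if ls were trojaned to hide a file, glob expansion is done
# by the shell itself, and find walks the directory directly.
tmp=$(mktemp -d); cd "$tmp"
touch visible hidden-from-ls
echo *          # prints: hidden-from-ls visible
find . -print   # lists both files as well
```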

Mobile broadband: What's it for?

Peter Gathercole Silver badge

@John again

"BT are the only ISP who can serve my house".

You sure? Normally, if BT can serve an address they will also offer the service through BT Wholesale. This allows other ISPs to provide service even though you are using BT's "last mile" infrastructure. This is even the case if the ISP is not able to install equipment at the exchange "because of space or power restrictions".

I know that Virgin Media (spit, hold out Holy Cross for protection) used to offer no-minimum-period ADSL contracts, although having checked the current small print, it looks like they now have a 12-month minimum contract.

Peter Gathercole Silver badge
Thumb Down

but have noticed that...

... I sometimes forget to complete comments!


Peter Gathercole Silver badge

@John and others

I've been using 3 mobile broadband on Ubuntu (6.06 and 8.04) for about 6 months, and can confirm some of your problems, but they can be worked around.

When used with the supplied Windows software, 3 hard-codes the IP addresses of the DNS servers. If you can get the HSDPA modem set up as a managed network, hard-code the 3 DNS servers to be used when you start the managed network, and make sure the option to use the provider-supplied DNS servers is off. You can use Locations to set different DNS servers and default routes if you also use your system on other networks.
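In /etc/resolv.conf terms it amounts to no more than this (the addresses below are documentation placeholders, not 3's real servers):

```
# /etc/resolv.conf - pin the carrier's DNS servers by hand
nameserver 192.0.2.1
nameserver 192.0.2.2
```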

The throughput is probably being throttled by the USB TTY modules, which have an effective limit of about 60KB/s. Google for info on the hacked airprime module and the Huawei modem. This allows reads and writes in units of more than one character at a time, and so allows a higher peak rate. I'm using the ZTE modem, which is another kettle of fish: it requires re-compiling airprime to put the USB IDs in the code. Then all you need to do is put the right udev rules in place to load the airprime module rather than the USB TTY module.
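As a sketch of that last step (the vendor:product pair below is a placeholder, so read the real IDs for your stick from lsusb, and note this has not been tested against real hardware):

```shell
# Sketch only (needs root): make udev load the patched airprime driver
# for the ZTE stick instead of the generic USB TTY driver. Replace
# 19d2:0001 with the vendor:product pair 'lsusb' reports for your modem.
cat > /etc/udev/rules.d/99-zte-airprime.rules <<'EOF'
ACTION=="add", SUBSYSTEM=="usb", ATTRS{idVendor}=="19d2", ATTRS{idProduct}=="0001", RUN+="/sbin/modprobe airprime"
EOF
```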

I regularly get more than 100Kb/S, but have noticed that

BBC iPlayer upgrade prompts new ISP complaints

Peter Gathercole Silver badge

@Steven Raith about YouTube

Unfortunately, a lot of an older system's resources (CPU/memory) are taken up by the large animated looping Flash adverts that YouTube now carries (ironically from Crucial, or maybe that is by design!). My EeePC 701 used to play YouTube well, even at higher quality, but now stutters along. The newer releases of Firefox also appear to place a load on a system with this type of advert.

If you can find or make an embedded link for the video so that the adverts do not show, slower PCs can still work quite well. Alternatively, try full screen (I know this sounds silly). Or download the videos and watch them offline.

I am happily running Ubuntu Hardy Heron on my EeePC.

Lag log leaks - Home Office contractor loses entire prison population

Peter Gathercole Silver badge

Securing data is not genetic engineering...

... sorry, rocket science is too simple now.

Here are a number of measures which SHOULD be made compulsory wherever government-held information is used.

- Put a robust RFID chip as an integral part of each official USB Flash drive.

- Put shoplifter-type security (or even make it prevent operation of the turnstiles) on all exits in secure facilities.

- Do not use generic RFID tags; track specific tags (to stop someone identifying a secure USB device as the holder walks around a shopping centre).

- Have official USB flash drives tracked, and holders made responsible for their loss.

- Do not allow official flash drives to be held for extended periods.

- Have a specific process to allow tracked USB flash drives to be removed from secure sites.

- Change the USB ID on the official drives so that they do NOT appear as a generic storage device, so it becomes more difficult to read on ordinary PCs.

- Put the required driver on all systems required to use the official stick, and have it use automatic strong encryption as the data is accessed.

- Don't allow the specific driver to be installed on non-official PCs.

- Regularly rotate the keys on the specific driver and flash drives (this can be done with the flash drives by making holders regularly check the drives in).

- Clean all data from flash drives when they are checked in, to prevent people from using them as a backup mechanism.

- Ban the use of personal USB flash drives (or the use of phones or watches, or whatever else provides this type of function) from secure sites as part of policy.

- Disable the USB storage device handling drivers in all systems that can access private data to prevent non-tracked USB flash drives being used (I know this is difficult, but it should not be impossible, even if it means you have to put PS/2 keyboard and mouse ports back into PCs).

- Enforce the already existing GSI Security requirements for all government held data.
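The usb-storage lockdown in the list above can be done with a single modprobe override (a sketch only; anyone with root can still undo it, so treat it as one layer of the defence, not the whole of it):

```shell
# Sketch only (needs root): stop the kernel from loading the generic
# usb-storage driver, so untracked flash drives do nothing when plugged
# in. Mapping the install step to /bin/true means even an explicit
# 'modprobe usb-storage' silently loads nothing.
echo 'install usb-storage /bin/true' > /etc/modprobe.d/no-usb-storage.conf
```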

I'm not saying that this will make our data totally secure, but it would be a step in the right direction. It would prevent casual examination of misplaced devices. It would not stop a concerted attempt to steal data, but what would?

Very little of this is particularly complex or expensive, as most of the barrier security and procedures already exist in secure government locations.

BTW, this counts as Prior Art in the unlikely event that I am the first person to put all of these ideas together.

Scientists unravel galactic spaghetti monster

Peter Gathercole Silver badge

@vincent himpe

The being was the Great Green Arkleseizure, you blasphemer. All of you unbelievers bow down and wait for the coming of the Great White Handkerchief!

Acer Aspire 8920G 18.4in laptop

Peter Gathercole Silver badge

Who in the world...

... wants a 'laptop' weighing 4.1kg? Surely it defeats the object of being portable.

If watching Blu-ray is your primary requirement for a laptop then this may be the device for you, but I'd book the body-building course now.

Royal Navy plans world's first running-jump jet

Peter Gathercole Silver badge


Having recently watched the Sailor episode where the hapless Buccaneer pilot took about 10 attempts to land on the Audacious-class Ark Royal (the one scrapped in the early eighties), it is clear that deck landings are always fraught with problems.

I don't see why an F-35B would not be able to just go to full thrust, possibly bounce, and get back up to flying speed before running out of deck. The ski jump will not get in the way (at least in the CTOL design of CVF), because the aircraft will be landing on the angled part of the flight deck, and this will always have to be clear for a non-vertical landing.

UK.gov loses 29 million personal records

Peter Gathercole Silver badge


I take it you don't work in IT then.

If you do, then I hope you don't have a site disaster, because you will lose (really lose, not 'share') all of the data that should have been backed up OFFSITE.

It's all a matter of control and process rather than location.

National DNA database grows on the genes of the innocent

Peter Gathercole Silver badge

Can a DNA DB be kept safe?

Do you think that whatever agency keeps the DB can prevent leaks? I'm sure that insurance companies, amongst others, would love to be able to screen health insurance applications against illnesses with a genetic component.

Also, think what scandals a complete paternity map of the UK population might show! What would the tabloids pay for such information?

Windfall taxing big oil: how to make the gas crisis worse

Peter Gathercole Silver badge
Thumb Down


I agree about the high taxes that the energy companies pay, but when demand causes the price to rise, who actually ends up with the extra money?

Obviously, everyone who takes a percentage cut will take a slice (including the taxman), but has the energy become more expensive to produce as a result of the extra demand? Not if they offset their own fuel costs. Are the wage bills immediately higher? No. Are the extraction licence fees more expensive? Not immediately.

I conclude, then, that as a result of the high demand, the energy companies do get a windfall, as most of the extra money goes to them. This can only be a benefit to their bottom line. But should they be taxed higher? Probably not, as the taxman already gets a cut of the sale of the energy, and THEN takes a cut of the overall profits in corporation tax. So the governments win without an extra windfall tax.

It's really just governments trying to fill the gaping holes in their budgets caused by their policies, or trying to score votes.

Biting the hand that feeds IT © 1998–2019