* Posts by Peter Gathercole

2100 posts • joined 15 Jun 2007

Home Office launches £4m cyber security awareness scheme

Peter Gathercole
Silver badge

Re: @Martijn Otto @Khaptain

There is a way to make users like the ones you indicate safe, but it means locking down their computers so that they can't install software, and are completely removed from any decisions about installing patches.

Whilst it would appear that Microsoft and Apple may be moving to that mindset, it is gathering some opposition from computer users, especially those who understand how things work.

I'm sure that there are other organisations that would like there to be this level of control, especially if they can recruit the vendors into installing other software components as part of the patching process.

The problem is one of balance between on-line liberty and security (and I'm not specifying whose!)

2
0

SCO vs. IBM battle resumes over ownership of Unix

Peter Gathercole
Silver badge

Re: I think Apple owns Unix now anyway @lars

Oh. Yes. I forgot about 32V. That was in the same announcement.

BSD/Lite was, as far as I understand, BSD 4.4 with AT&T code removed/re-written. I think, although I am prepared to be corrected, that is the reason why it was called Lite.

UNIX does indeed contain code written at Berkeley. The obvious example is vi, although it would not surprise me if the paging code had something to do with BSD. As I understand it, there were relatively good relations between the Bell Labs people and the Computing Lab people at Berkeley.

The networking code probably did not, because AT&T took the Wollongong TCP/IP code and re-wrote most of it to use STREAMS/TLI.

But it does not matter how much code came back from BSD, because the BSD license is a very permissive one that does little to restrict what the code is used for, provided it is acknowledged.

It is other contributors (which will mostly be companies working with AT&T) that may be more problematic, but I guess it depends on the contractual relationship between them and AT&T. The best place to look is probably the copyright notices in the header files for each release.

0
0
Peter Gathercole
Silver badge
Boffin

Re: I think Apple owns Unix now anyway @lars

I would actually dispute that UNIX(tm) has ever been Open in the way we would think Linux or other GPL code is.

Yes, UNIX code source code has been available, but only under license. Versions (editions) 1-6 were available to academic users under a very permissive license, but one that prevented commercial use. At the time, Bell Labs/AT&T were prevented by a US anti-monopoly judgement from supplying commercial computers, and this included Operating Systems. At this time, there was a thriving pre-Open Source group of academic users who dabbled in the code, and shared their work with others. This was a really exciting time (I was there), and you often found 400' 1/2" tape reels being sent around (it was pre-networks) various Universities.

Version 7 tightened this up to prevent the source from being used as a teaching example. Version 7 and earlier code has, since 2002, been published under GPLv2, granted by Caldera (Hooray!). This is now "Open", but I don't know of anybody who is shipping a commercial V7 implementation (although an x86 port is freely available from a South African company called Nordier Associates).

Commercial use of UNIX post Version 7, from PWB to UnixWare was under a commercial license that did not contain any right to the source code. The same was true for all other-vendor UNIX systems. Source licenses were available, but under their own strict licensing conditions, and at a high cost (and often required the licensee to have an AT&T source licence as well!).

BSD code prior to BSD/Lite required the user to have an AT&T version 7 (or later) license. BSD/Lite or later does not contain any AT&T code (or at least nothing that AT&T were prepared to contest), so is available under the BSD license, but as I have stated before, cannot legally call itself UNIX.

Having got that out of the way, why was UNIX used as the basis for Open Systems?

Well, UNIX was always easy to port. This meant that there were several vendors (piggy-backing on various academic ports, like SUN and DEC) who could sell UNIX systems, meaning that application writers had something approaching a common base to target their code at, although differences had to be worked around. This was unique. There was no other large-system operating system around at the time that had this.

It became apparent that if there could be a standardised subset of UNIX (commands, APIs, libraries) that all vendors would support, then this could mean that application writers could possibly entertain a "write-once, compile once per vendor UNIX, and sell" strategy. This was first championed by AT&T (who by this time were allowed to sell computers and operating systems) with the System V Interface Definition (SVID), which was adopted by IEEE, with minor changes, as the various POSIX 1003 standards.

These standards are what gave UNIX the "Open" label. Anybody could write an OS that met these standards, whether based on genetic UNIX code or not. This has resulted in numerous interesting products and projects, one of which is GNU/Linux (POSIX compliant, but not certified against any later UNIX standard); others include QNX, BeOS and z/OS, which can be regarded as UNIX or UNIX-like, and some of which are truly open. Not all of these can be called UNIX, however.

I agree about the Linux kernel. The reason why this has remained as a single kernel is because Linus keeps an iron hand on the kernel source tree and official release numbers. It is perfectly possible for someone to take this tree, and modify it (and it has been done by several people including IBM and Google) under the GPL, but they can't get their modifications back in to the main tree without Linus' agreement. They could maintain their own version, however, as long as they abide by the GPL. AFAIK, they can even still call it Linux.

0
0
Peter Gathercole
Silver badge
Boffin

Re: This will only end when the case is ruled on @Vic

I think you're wrong. This is what I understand.

UNIX System Laboratories (USL) was set up as the home for UNIX as part of the SVR4 Unified UNIX program, and was joint-owned by a consortium of companies including AT&T. Part of the set-up was that all UNIX IP and code was not just licensed to USL, but the ownership was transferred from AT&T to USL. (I was offered a job by USL in the UK, and nearly took it, so I have an interest in this part of the history)

When USL was wound up it was bought by Novell, and the ownership of all of the IP for UNIX went to Novell. This included all branding, code, copyright and patent information.

In 1993 or 1994, Novell transferred the UNIX brand and verification suites to X/Open (now The Open Group), and licensed the use of the code and IP to SCO, although through a contractual quirk (SCO not having enough money at the time), the copyright (and I believe that this includes the right to use and license the code) remained with Novell.

SCO then sold itself to Caldera, which then renamed itself the SCO Group.

The SCO Group then tried to assert ownership of the code and failed. This was one of the SCO Group vs. Novell (or vice versa) cases that was ruled on in Novell's favour. In parallel, SCO had engaged in campaigns of FUD and lawsuits against Red Hat, IBM and their customers. These cases have never been concluded and are the ones that will not die, particularly the IBM one.

Novell was then mostly bought by Attachmate, although, and I quote from the Wikipedia article on Novell, "As part of the deal, 882 patents owned by Novell are planned to be sold to CPTN Holdings LLC, a consortium of companies led by Microsoft and including Apple, EMC, and Oracle."

I was never clear about whether this IP included any of UNIX, or if that remained with Novell. This is the bit I am uncertain about. If it went to CPTN Holdings, this is how it could be used, although looking at the agreement, CPTN's ownership of the IP is subject to GPL2 and the OIN licenses, which may offer some protection.

Confused? You will be, after this year's episode of SCO*

(* with apologies to the creators of Soap for the shameless paraphrase of their catch line)

Please, please! Whoever owns the UNIX copyright, publish the non-ancient code under an open license. There's no commercial reason not to any more.

1
0
Peter Gathercole
Silver badge
Boffin

Re: I think Apple owns Unix now anyway @lars

You are so wrong in your suggestion that there is no AT&T code in AIX. Also, you are wrong about people wanting to pay for UNIX branding. Look at the Open Group website, and see which UNIX variants have been put through the various UNIX test suites (which costs quite a lot of money). IBM, Sun (as was), HP and Apple have all paid the money, and achieved the certification.

IBM has an SVR2 source license and AIX was very clearly derived from AT&T SVR2 code. It was not written from the ground up. I've worked in IBM and had access to the source code, and I have seen parts of the code that are clearly related to AT&T UNIX, complete with the required AT&T copyright notices. This was a long time ago (early '90s), but they were there.

For Power systems, the current AIX can be traced back to AIX 3.1, released on the RISC System/6000 in 1990. AIX 3.1 itself was derived from the code that IBM had for the 6150 RT PC, and this was a direct port of SVR2, mainly by IBM but aided by INTERACTIVE Systems Corporation, which had also worked on PC/IX for IBM. Reports of the kernel (in places like Wikipedia) being written in PL/I or PL/8 refer to the VRM, not to the AIX kernel.

I admit that there has been a huge amount of code added in AIX over the years, but it is still a genetic UNIX. How much code is related? Maybe you should ask SCO. They've seen the AIX source.

The same is true for SunOS/Solaris. I was working for AT&T when SVR4 was released, and I can say with absolute certainty that SunOS 4.0.1 was the same source code base (again, I had access to the source code) as AT&T's SVR4.

Sun were one of the principal members of UNIX International and the Unified UNIX programme that attempted to standardize UNIX in the late 1980's with AT&T, ICL, Amdahl and various other vendors long gone. I still have the notes from the developer conference. Prior to this release, SunOS 3 and earlier was based on BSD 4.2, with enhancements added from 4.3.

I am not so clear about HP-UX, but I know that HP had a direct UNIX V7 port on a system I'm sure was an HP 500 in the early 1980s, although I can't find any references (it was pre-Internet). Wikipedia says HP-UX was derived from System III. HP (and in fact IBM and DEC) were in the Open Software Foundation that was set up in opposition to the Unified UNIX. They had their own UNIX called OSF/1, which had a common code base that was taken from DEC and IBM versions of UNIX. The tension between UI and OSF was known as "The UNIX wars".

Time moves on, and there is no feedback from the vendors back into the main tree, so of course the different versions diverge, but I am sure it is safe to say that all three of these are genetic UNIXes, and they all have achieved UNIX branding at various levels. They can all be called UNIX as per the branding rules, but in this day and age, this is not really important. UNIX as a unified OS (much to my regret) is largely a has-been.

My biggest fear is that without some form of standardization (like the Linux Standard Base, which is mostly ignored) Linux will go the same way.

0
0
Peter Gathercole
Silver badge
Boffin

Re: I think Apple owns Unix now anyway @peredur

There are nuances to this. Note that I said "UNIX(tm)" not UNIX-like.

Want to know the difference?

There is a set of verification tests, owned by The Open Group (http://www.unix.org/), which tests a system for UNIX compliance. There have been several UNIX standards over the years, starting with SVID, through POSIX 1003.X, UNIX 93, UNIX 95, UNIX 98 and most recently UNIX 03.

UNIX(tm) is a registered trade mark. Use of this mark to describe an operating system is restricted to those that have passed one or more of the test suites maintained by The Open Group.

OSX Mountain Lion has passed the UNIX 03 test suite. As has Solaris 10 and 11, HPUX 11i, and AIX 5.3 and 6.1. All of these operating systems can call themselves UNIX.

There are absolutely no Linux distributions that have passed any of the UNIX test suites, so legally, no Linux system can be called UNIX.

Two other quirks. There are no BSD systems that have been tested, so strictly, BSD is not UNIX (although there may be historical justification for BSD 4.4 and earlier). But z/OS V2R1 (and some earlier versions) has been tested and passed against UNIX 95, so bizarrely, z/OS 2.1, an operating system that has little or no UNIX code in it, can be called UNIX!

Now I don't know how many OSX systems have shipped in total compared to Solaris, HPUX and AIX systems, but in terms of new systems installed, I would hazard a guess that Apple are now shipping more OSX boxes than the other vendors are of their own brand of UNIX. And you can't count Linux.

This is why I said what I did.

1
0
Peter Gathercole
Silver badge

Re: The code allegedly ported was written by IBM in the first place

@__________

If you are talking about JFS, then the original implementation was on AIX 3.1, but it was re-implemented (not ported) for OS/2, and it was this OS/2 code that was then ported to Linux. So you are probably right, but not in the way you think.

0
0
Peter Gathercole
Silver badge

Re: I think Apple owns Unix now anyway @AC

Would love you to justify this. Apple may now ship more UNIX(tm) systems than anybody else, but they own nothing of the UNIX IP.

OSX is a UNIX-derived system, having taken the Mach kernel, married it with bits of BSD (which is not branded), and then got UNIX 03 branding. This means that it passes the UNIX test suite, not that it has any UNIX IP in it.

3
0
Peter Gathercole
Silver badge

This will only end when the case is ruled on

I said a couple of years ago that this may come back. Until it is finally ruled on and closed, beyond all hope of an appeal, it will keep coming back. This is both because the claim is big enough to keep creditors and lawyers interested, and because it is a vector to attack Linux as a platform.

Mind you, the landscape has changed. I never fully understood where Novell's IP went to when SuSE got bought. If it is the case that it ended up with a shell company that is controlled by parties who have an interest in derailing Android, Chrome, Tizen and all of the other Linux-related platforms, then consolidating SCO's claim with the ex-Novell IP could prove more than an annoyance.

It all hinges around UNIX code that was allegedly incorporated into the Linux source tree by IBM as part of AIX code that was ported to Linux (I know that JFS was one thing quoted), but IIRC the case was never proved, as SCO could not or would not point out the code in question. There were also arguments about derivative works, but they were never closed either.

As with the MS patent list, I feel that it would be in the best interests of all of the parties interested in Linux to make sure that any code that could be cited was rewritten and expunged from the Linux code tree. At least this would protect future Linux products, and turn this into a chase for money, rather than a FUD attack on Linux.

In one bizarre slant on this, it may actually prolong the life of Genetic UNIX (directly descended from the Bell Labs code), as it keeps it in view. I would love to see the SVR4/UnixWare source opened up as a result of any real settlement of this case, but I think that this is unlikely.

6
0

PC makers REALLY need Windows 8.1 to walk on water - but guess what?

Peter Gathercole
Silver badge

Re: My solution @John H Woods

If such a high pixel density is required, why have I never had migraines up until now?

I completely dispute that it is necessary to have such high resolutions.

In my view, as long as there are enough pixels, it's screen size that is important. And don't go on about 'colour saturation', 'jagged fonts', 'graphics intensive work', and 'multiple windows'. They're just excuses to justify the cost of such displays.

The only reason for higher definitions is to get more on the screen, and once the character height drops to below 2mm, it becomes unusable without a magnifying glass, regardless of how many pixels are used to display it.

5
0

Nuke plants to rely on PDP-11 code UNTIL 2050!

Peter Gathercole
Silver badge

Re: there are alternatives

It depends on the model, but many of the UNIBUS PDP-11s were built out of TTL (even the CPU and FPU). This means that it should still be possible to source and fit almost any of the silicon parts, although I suspect that the most difficult parts to source would be the memory chips.

If they were F11 or J11 systems, you would have to rely on existing parts.

But I suspect that with the state of current chip baking technology and the simplicity of the chips back then, it may be possible to create a pin compatible memory chip using an FPGA relatively easily if it were really necessary.

Hmmm. What a project. Keep PDP-11 alive using FPGAs!

2
0
Peter Gathercole
Silver badge

Re: Pah

I was going to say exactly the same.

And you've missed out UNIX, which was definitely multi-user on the PDP-11.

I think the hack was confusing the PDP-11 with RSX-11M Plus, which really is the ancestor of VMS.

I was the primary technical support for RSX-11M on a SYSTIME 5000E (actually a PDP11/34e with 22-bit addressing and 2MB of memory - a strange beast) between 1982 and 1986. We had 12 terminals working relatively well on a system that on paper was little more powerful than a PC/AT.

I think I can still do PDP-11 assembler. At one time, I used to be able to decode PDP-11 machine code in my head, although this was mainly because the instruction set was extremely regular. I still would recommend people looking at the instruction set to see how to design one. It's a classic.

11
0

First look: iOS 7 for iPad

Peter Gathercole
Silver badge
Joke

It's a style thing

Like all things that look like fashion (and I count iThings in this category), changes do not have to make sense.

0
0

Samsung Galaxy Note 8: Proof the pen is mightier?

Peter Gathercole
Silver badge

Having been a Palm user

I'm seriously thinking about taking a Note 2 as an upgrade on my phone next month. The downside is the size.

There are any number of situations when using a finger is just not accurate enough (such things as free-form document mark-up and notes, sketching and handwriting recognition). I still find Graffiti easier to use than swipe, which I just can't seem to use accurately on my current phone, and doing something that feels like writing is easier with a stylus than a finger.

1
0

Top500: Supercomputing sea change incomplete, unpredictable

Peter Gathercole
Silver badge

Did you see the number of cores on Tianhe-2

It says over 3 million, and draws 17MW of power.

I guess that what this says is that if you throw in enough hardware, even with the law of diminishing returns, you can have the #1 supercomputer.

0
0

Girls, beer and C++: How to choose the right Comp-Sci degree for you

Peter Gathercole
Silver badge

Re: "This weird new software was Unix" @Sandra

As you might expect, AT&T used UNIX a lot.

I actually worked for an outreach of AT&T that was doing work on the 5ESS telephone exchange, and not only was UNIX used in various parts of the exchange (the AM ran UNIX/RT on a duplexed 3B20D when I was working with it), but UNIX was also the development environment for all the code.

In my time, they were also using Amdahl mainframes running R&D UNIX from AT&T Indian Hill as an emulation environment (EE) for the exchange, as, believe it or not, the cost of emulating the exchange on a mainframe was less than having a full exchange as a test-bed.

After I left, they switched to Sun 3 (because the SM used 680x0 processors) and Sun 4 kit for the main working environment. Just before I left, I was playing around with gluing all the systems together with AT&T RFS, which allowed you to do some really neat tricks.

On the subject of Indian Hill (Chicago), pre-TCP/IP and SMTP, the UUCP hub IHLPA, which used to be a go-to for routing mail to systems that you did not have a direct path to was run from this site by AT&T. I don't know when it was decommissioned, but not that long ago (a couple of years) I came across a reference to it in a sendmail configuration that took me by surprise.

0
0
Peter Gathercole
Silver badge

"This weird new software was Unix"

Hmm. 1984.

You were a bit behind me.

I was introduced to UNIX at Durham University between 1978 and 1981 (V6 and V7 on a PDP11/34e, - and yes, there was one girl on the course in my year), and got a job needing UNIX in 1982. I admit that it was at a college (Newcastle Poly.), but I am still using what I learned, 35 years later, as a techie (not jumped to management, teaching [dallied with this for a year], or horror of horrors recruitment! [dig intended, in a light-hearted way]). Very glad I chose what I did as a career, and I'm one of the few people in my sphere who actually like their work, even after such a long time.

Strangely, I dug my copy of the Lions UNIX V6 commentary out yesterday to check the way that Ancient UNIX did something that IBM, in their wisdom, chose not to document for AIX. Not sure whether it is still relevant, but it was a real nostalgia trip.

Just hoping there will still be a need for deep UNIX skills for the next 13 years to get me to retirement age. I don't want or intend to retire until I have to!

1
0

Microsoft Office 365 on iPhone NOW: No, we're not making this up

Peter Gathercole
Silver badge

@Stephen Channell

Sorry, you are wrong about Word being on Mac first. I agree that Excel was on Mac first, but Word first appeared on Xenix (Multi-Tool Word), and was ported to DOS, UNIX and Mac.

I used Word 2.0 for DOS back in 1984 or so, and I hated it then, and I still struggle now.

0
0

REVEALED: The gizmo leaker Snowden used to smuggle out NSA files

Peter Gathercole
Silver badge

Re: “Systems administrators.." "..low level, typically have the highest access to systems and data"

Many organisations ban removable writeable media unless the need is justified. There are almost always cases where it's just too difficult to do certain jobs without removable media.

If the sysadmins can make a reasonable case for it, it is likely that it will be allowed, albeit with some additional controls (encrypt the data, use traceable drives etc).

These controls are mainly there to make sure that there is no inadvertent loss of data, or if it is lost, that it can be traced to the careless person. It does not really stop such a device being deliberately used to remove data. To achieve this, you really need to physically disable drives and ports (epoxy glue or break them), have locked PC cases, and make it mandatory that two people are involved with any process that adds or removes hardware. I have a very nice microSD USB card reader, and I'm sure I could hide a 32GB microSD card about my person so that it would not be found except by a really intrusive search.

Completely disabling USB is difficult, as you would also have to deal with the ports being used for your keyboard and mouse. It can be done in a driver using whitelists of USB manufacturer and device IDs, but even this is vulnerable to a sysadmin with the correct degree of privilege.
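
As a very rough illustration of the whitelisting idea (not how any particular product does it), a monitoring script could enumerate attached USB devices and flag anything whose vendor/product ID pair is not on an approved list. This is only a sketch: it assumes Python with the pyusb library and a libusb backend installed, and the IDs shown are made up.

import usb.core  # pyusb

# Hypothetical whitelist of (vendor ID, product ID) pairs for approved devices
APPROVED = {
    (0x046d, 0xc077),  # example: an approved mouse
    (0x04f2, 0x0760),  # example: an approved keyboard
}

def check_attached_devices():
    # usb.core.find(find_all=True) yields every device currently attached
    for dev in usb.core.find(find_all=True):
        ident = (dev.idVendor, dev.idProduct)
        if ident not in APPROVED:
            print("Unapproved USB device: vendor=%04x product=%04x" % ident)

if __name__ == "__main__":
    check_attached_devices()

This only reports after the event; a real control would have to sit in the driver stack, and, as noted above, is still defeasible by a sufficiently privileged sysadmin.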

I'm surprised that he didn't trigger alarms, though. The financial world often seems to have better controls than defence and security related organisations, and when I worked as a UNIX sysadmin in a UK bank, I was always aware that there were people metaphorically looking over my shoulder watching what I was doing (there was no direct root access on production systems, everything was done using tools like Unix Privilege Manager, which logs the input and output of any command securely off the system). It was a pain in the neck to use, but effective. Even so, it was possible to disguise what was being done, and take sessions out-of-band of the controls, if you knew enough about what you were doing. And at some point, someone has to know the root password.

2
0

I told you I'd be back: Arnie set for another career revival

Peter Gathercole
Silver badge

Re: Hollywood producers clearly have way, way too much money.

I thought his first film was "Hercules in New York", although I believe his voice was dubbed out.

1
0

PRISM snitch claims NSA hacked Chinese targets since 2009

Peter Gathercole
Silver badge

Re: Another conspiracy theory for you... @YARR

It is not the case that anybody can commit code to an open source project. Open source projects do not run like open-access wikis.

Most projects are moderated, so any change has to be agreed by the moderator. For example, I challenge you to get a fix into the Linux kernel without having to convince Linus that it is worthwhile.

1
0
Peter Gathercole
Silver badge

Re: Conspiracy theoriest right all along @ AC 10:48

I have to agree with both sides on this, but I tend to support Eadon's point of view.

It is indeed only a possibility that the inspection would be done, but it can be done, and as all projects store their code in publicly available source code control systems (Git, Subversion, CVS or the like), it should be possible to work out when bits of code made it into the source tree. This is not a glib assertion, but a real possibility. Couple this with the fact that in order to have changes accepted to the primary code-tree of most OSS projects, any rogue must convince the moderators of the project to trust them in the first place.
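
For what it's worth, working out when a given fragment entered a Git-managed tree is straightforward with the "pickaxe" search. A minimal sketch in Python, run from a clone of the project; the search string is just a placeholder, not a real symbol from any project:

import subprocess

FRAGMENT = "some_suspect_function"  # hypothetical code fragment to trace

# git log -S lists commits that added or removed the given string;
# --reverse puts the oldest commit (the one that introduced it) first
result = subprocess.run(
    ["git", "log", "-S", FRAGMENT, "--reverse", "--oneline"],
    capture_output=True, text=True, check=True,
)
print(result.stdout)

The same kind of search works, more laboriously, against Subversion or CVS history.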

The mere fact that there are these controls will dissuade some rogues from attempting it, although it is always possible for a skilled programmer to code something that looks innocuous to a cursory inspection but does something other than its stated purpose.

I'm sure that if a back-door were found in, for example, the Linux kernel, there would immediately be a rush of people and organisations who would commit serious effort to auditing the code, and anything found would be expunged very quickly, and the rogue exposed.

Contrast this to closed source: even if it were proved that such a back door existed in a product, any audit would be at the vendor's discretion, and if they are complicit in the back-door, you haven't a chance in hell of doing anything about it.

It really annoys me when someone knocks back the "you have the code, so go fix it yourself" statements. OK, I know that not everybody has the skills, and often the statement is made in a harsh way, but at the end of the day, there is no compulsion on the code maintainers to do anything when there is a perceived deficiency. Often they are working in their own time and at their own expense.

What is being pointed out by the "fix it yourself" statement is that maybe, just maybe, users should take some responsibility and contribute in some way (time, money, equipment etc.) to a project, rather than just whingeing. Too many users of Free software feel that the fact they are using it entitles them to some special access to the maintainers, almost as if they had bought it!

With the current state of Free Software, any free support you get will absolutely always be of greater value than the money you paid for it, even if it does not fix the problem!

3
0

Microsoft botnet smackdown 'caused collateral damage, failed to kill target'

Peter Gathercole
Silver badge

Re: Not so sure @khaptain

The article you reference does not indicate a Linux fault, but suggests that the servers may have been compromised either through Apache (not part of GNU/Linux) or through poor administration. All that shows is that the weakest part of any system is the wet-ware.

2
0

MacBook Air now uses PCIe flash... but who'd Apple buy it from?

Peter Gathercole
Silver badge

Re: And resemble... look like..

Is that title a Short Circuit reference to when No. 5 is doing a tomato soup Rorschach test maybe?

0
0

Students outraged: Computer refuses to do any work for entire week

Peter Gathercole
Silver badge

Re: The real problem is...

50% would realistically be the upper limit, at least for HE leading to a degree.

A few years ago, I happened to note two stories on Radio 4 on the same day. One was that 50% of young people achieved 2 A-C grade A-Levels, and the other was that the government wanted 50% of young people to go on to University.

The way I looked at it in a tongue-in-cheek way, was that they could eliminate University altogether, and just award degrees to those who achieved 2 good A-levels, as that would be what was necessary to get both figures to agree!

It just reinforced my belief that having a degree structure has to be elitist in order to both work and be useful.

0
0
Peter Gathercole
Silver badge

Re: The real problem is... @Tom 38

You've described something not too dissimilar to the old grant system.

University places were restricted; A-Levels gave a good idea of who was capable (results fitted the normal distribution curve, rather than being skewed to the higher grades), so Universities could offer places to students with good A-Level results.

For students who got offers, the family income was assessed (taking into account a number of factors, including whether other children from the family were at Uni. already), and a grant depending on income was awarded. Students from poor families got a 100% grant, those from rich families got little or no grant, with a sliding scale for those in between. The grant was arranged such that any student taking the piss and not trying to pass the course could have been made to pay it back.

Thus we had a quite fair system, with those students who needed it most getting complete support. I got about 50% grant, and my parents made the rest up (my grant paid for the accommodation in full, and I got the rest in installments from my Dad). Of course, all students got their tuition fees paid automatically.

Because the number of places was restricted and there was competition, a University place was valued by most students, rather than being taken as a right. I feel that most of my friends 35 years ago felt privileged to be at University, and few of them wasted it.

And at the end, because there were fewer degrees awarded, they were valued by employers.

Politicians still believe this is the case, even though pumping too many mediocre students through the system has devalued a degree to the point where 3 years in the job market rather than a degree will result in a higher-paid job. This makes a mockery of degree students being able to pay back their loan and pay more tax.

The crass stupidity of assuming that if you increase the number of degrees awarded, the country will end up with a more valuable workforce, rather than the reality of just having people with meaningless degrees not relevant to the work they end up doing, is one of the classic errors of a left-wing political mindset.

Kids are not equal, and never will be, even if you try to say they are (as in the failed Comprehensive system). The best educational leveller IMHO was the Grammar School system for kids with the right academic ability regardless of background, complemented with streamed secondary schools to act as a catch-net for late achievers, followed by a means-tested grant system to pay for higher education consisting of University for academic achievers and good vocational courses in technical and art colleges, coupled with apprenticeship schemes. With this, you end up with a balanced workforce. This is what was put in place in the 1950s and 60s, and probably produced the best education system ever seen in this country.

I admit that it singles out people who have the chance for higher education early, but you have to do this to get the best from young people. No system is perfect, but I believe that the current system is so broken as to not be fit for purpose.

11
0

All major UK ISPs prepping network-level porn 'n' violence filters

Peter Gathercole
Silver badge

Re: That's enough @Dave 15

Two points.

Porn on the Internet is hugely different from porn in H&E, Fiesta, Club or Playboy. Legal magazines were not able to show explicit sex, and the pornography laws were so poorly defined that mainstream magazines kept well to the safer side of what was acceptable. Moving images add a lot to the impact of porn, and many of the free sites do nothing other than asking that you don't enter if you are under the legal age in your jurisdiction. With magazine still photos, and story pages, you had to use imagination, which was (in my day) often just guesswork for virgin teenagers. Nothing is left to the imagination on the Internet.

Secondly, it is easy to find porn, even with quite innocent words, and very easy indeed if you really do want to find it. I grant that it has become less frequent for Google to return such results than it used to be, but just think how many ordinary words have alternate meanings.

There was a time when a site could 'seed' its pages with lists of completely innocent words, normally in white-on-white and in very small characters, just to try to get random hits. Many years ago, I remember my daughter searching for "medieval castles" and getting hits from some quite unpleasant sites. I think that problem has largely gone away now, though.

I do not agree with restricting the Internet by default, but I can see how horrifying it can be to some people.

1
0
Peter Gathercole
Silver badge

Re: @Miek @D&C

It would be pretty easy for the ISP to put a block on all TCP and UDP access to port 53 to nameservers other than their own. Or, to go one stage further, only allow a whitelist of known ports out. There are lots of things they can do to make your life miserable.
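
As an aside, it is easy enough to check whether your ISP currently lets port 53 traffic out to third-party resolvers. A minimal sketch, assuming plain UDP DNS and using 8.8.8.8 purely as an example of a non-ISP nameserver:

import socket
import struct

def make_dns_query(name):
    # Header: ID, flags (recursion desired), QDCOUNT=1, AN/NS/ARCOUNT=0
    header = struct.pack(">HHHHHH", 0x1234, 0x0100, 1, 0, 0, 0)
    # Question: length-prefixed labels, zero terminator, QTYPE=A, QCLASS=IN
    labels = b"".join(bytes([len(p)]) + p.encode("ascii") for p in name.split("."))
    return header + labels + b"\x00" + struct.pack(">HH", 1, 1)

def port53_reachable(server, timeout=3.0):
    query = make_dns_query("example.com")
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
        s.settimeout(timeout)
        try:
            s.sendto(query, (server, 53))
            s.recvfrom(512)   # any well-formed reply will do
            return True
        except socket.timeout:
            return False

print("third-party DNS reachable:", port53_reachable("8.8.8.8"))

A timeout does not prove deliberate blocking (it could just be a flaky connection), and a transparent DNS proxy would still answer here, so treat the result as indicative only.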

My ISP says I have to use their ADSL router in their Ts&Cs. I don't, because I don't trust their customised firmware to not snoop, UPnP or backdoor my network (and I've a firewall there anyway).

Not having reasonably unfiltered access would be a real deal breaker for me.

4
0
Peter Gathercole
Silver badge

Re: Fortunately....... @Jame Jones

You've missed one of the next steps, that of locking down the OS such that end-users (all end-users) are unable to change these settings 'because ordinary users don't understand enough about computers to make sensible decisions'.

Sorry to bring Microsoft into this argument, but creating an OS that encourages users to use admin or admin-enabled accounts should have been (and was, by those in the know) regarded as a stupid move way back when.

I used to set up the WinXP machines that my kids used when they were younger so that they were not using admin accounts. Caused some problems with some games, but prevented the computers from being fiddled about with.

Now that they all have their own machines, they have admin accounts, but regard their machines much more carefully.

0
0

Pen+tablet bandwagon finally rolling, Nvidia leaps aboard

Peter Gathercole
Silver badge

Resistive touch screens

As Palm used for years. I still am disappointed by the poor results of using a 'capacitive' stylus on any of my Android devices. I remain sceptical about this 'new' technology.

My daughter has a Wacom tablet, and this technology is excellent, because it uses active components and picks up power from the screen itself through inductive coupling (I took the pen apart to fix a problem).

Older styluses had AAAA batteries in them, so were more bulky.

5
0

Fedora's Schrödinger's Cat Linux gives coders claws for thought

Peter Gathercole
Silver badge
Unhappy

Re: Don't mean to nitpick, but...

Much as it pains me to say it, I think that the days of UNIX being the yard-stick for OpenSystems are long gone.

With only AIX (and maybe Solaris if Oracle are still interested enough) receiving anything like new features, and every other Genetic UNIX reaching legacy status or worse, we are going to have to accept that Linux now rules the roost of how non-windows systems should look, however badly it does this. I know that there are flavours of BSD still out there, and OSX can still be called UNIX, but I cannot see there being any new UNIX customers.

What we can now look forward to is a decade of Linux distros that are sufficiently different so that they cannot be treated as a single platform, which will pose a significant barrier to supplanting UNIX in existing environments. And if they don't get their act together with something like LSB, they could lose it completely.

I'm just hoping that ARM servers get enough traction with a dominant distro to appear in the server space and shake things up a bit. Otherwise we will be looking at Windows on Intel for customers who need vendor support, and a plethora of 'proprietary' (I use the term very loosely to mean different distros) Linux boxes for less critical systems.

0
0

BOFH: Go on, beancounter, type DROP TABLE asset;

Peter Gathercole
Silver badge

Did Simon forget

the off-site DR copy? Or does he manage that 'facility' (i.e. his garage) for a fee as well?

0
0

Windows 8.1 Start button SPOTTED in the wild

Peter Gathercole
Silver badge

Re: Bunch of nancies

Ahh. But it's a POS that most people recognise and understand.

Oh, by the way. Using the Windows key and the first few letters does not present you with a menu of what's installed in the way that the old start menu did.

4
0

'Secret Pentagon papers' show China hacked into Patriot missile system

Peter Gathercole
Silver badge

Re: hacking required

Generally speaking, export variants of Western weapon systems are downgraded, such as having less powerful engines, or not having the latest avionics and weapon system capabilities.

This has even affected the UK. I understand that some of the VTOL technology in the F35B is so secret that the design details cannot leave the US. Which is strange, as we gave most of it to the Yanks in the first place!

The list of technology designs being reported as stolen is a bit strange however, because F/A-18, Patriot and Blackhawk must all be regarded as mature technologies now, and aren't particularly bleeding edge.

0
0
Peter Gathercole
Silver badge

Probably stored in 'the cloud' to save money.

0
0

Qualcomm app 'extends battery life' by analysing fandroids' privates

Peter Gathercole
Silver badge

Don't think it will help me

I turn off WiFi, Data Services and GPS when I'm not using them. Or, to put it another way, I only turn them on when I need them. Doesn't everyone?

12
1

Review: Samsung Series 5 Ultra Touch Ultrabook

Peter Gathercole
Silver badge

Re: Can someone please enlighten me @AC

I don't doubt that you think that you can tell, especially if you use a magnifying glass. After all, the marketing people say it is better!

Done any direct comparisons without being told which res is which? I would use the term 'blind comparisons', but I don't think it is appropriate. I would suspect that you would flag a 1680x1050 with a shiny screen as being clearer than a 1920x1080 with a matt screen.

You must face the fact that at some point there is a cut-off where more does not really mean better. I just think we have already passed that point. And if having such high resolutions drives up the price or power consumption, then I would dispute that it does no harm.

0
0
Peter Gathercole
Silver badge

Re: Can someone please enlighten me @David

Umm. Text DPI? Everything non-textual?

I'm assuming that you mean that you are shrinking the page down, keeping the relationship between the text and graphic sizes the same. Yes. But at the point where it really makes a difference, the text will be too small to read. 13.3" diagonal is really not that big.

Also 'lovely and sharp' is subjective. I seriously doubt that you can really tell much of a difference between what this ultrabook can do, and what full HD provides on the same sized screen.

I would also seriously doubt that a graphic designer (like my daughter!) would use this sized screen for their primary workstation. They really use large 26"+ screens running at HD+ resolutions. I say again that 13.3" is too small for that type of work. It might be what they visit customers with, but I bet they plug it in to a bigger screen whenever they can.

BTW. I do command line on laptops all the time. I'm a UNIX sysadmin. I had an IBM T60 laptop with a 1400x1050 resolution 15.1" screen, and I could quite easily shrink down the text such that it was clear but too small to use without a magnifying glass.

0
0
Peter Gathercole
Silver badge

Re: Can someone please enlighten me @James 51

With smaller text fonts (like I said)? The only way to get more information on a fixed sized screen is to make the words smaller.

Whilst I admit that you can do this with higher resolution, you end up trying to read characters that are 2mm or less in height. It just doesn't work beyond a certain point.

Example. HD on 13.3" at 16x9 equates to approx. 165 vertical pixels per inch. Nowhere near a Retina display, but assuming text at 10 pixels in vertical height, that would make each line of text about 1/16th of an inch in height (about 1.5mm). Now I'm not sure I really want to be reading characters that small on a screen at just less than arm's length. It's just too small.
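
For anyone who wants to check the arithmetic, here it is as a short calculation (the 10-pixel glyph height is just the assumption used above):

import math

diagonal_in = 13.3                  # screen diagonal, inches
width_px, height_px = 1920, 1080    # "full HD"
glyph_px = 10                       # assumed glyph height in pixels

aspect = width_px / height_px                       # 16:9
height_in = diagonal_in / math.sqrt(aspect ** 2 + 1)  # panel height ~6.5"
ppi = height_px / height_in                         # vertical pixels per inch
glyph_mm = glyph_px / ppi * 25.4

print("vertical ppi: %.1f, 10 px glyph: %.1f mm" % (ppi, glyph_mm))
# prints roughly 165.6 ppi and a 1.5 mm glyph, matching the figures above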

So it is not the resolution that determines how much you can fit on a screen and still use it, it is the size of the characters. I don't dispute that at smaller character sizes, higher resolution means clearer text, but again, it is a matter of degree.

I also dispute image quality for video. You're being sold a lemon. I really don't believe that you can see pixels that small on a moving image, and even if you can, it's a mobile device, not your primary entertainment device.

0
0
Peter Gathercole
Silver badge

Can someone please enlighten me

What do you use full-HD resolution in a laptop with a 13.3" diagonal screen for?

Do you run your text with ludicrously small fonts? Is the text so much clearer? Does colour saturation matter so much for a device that, unlike a TV or large monitor, is likely to be used in non-optimum light conditions?

I know I am a bit of a Luddite, but it does strike me that it is merely 'my number is bigger than your number', because honestly, I cannot really believe that it makes a huge difference. But then I started using terminals with a fixed 7x9 pixel character cell where you could still see gaps between the scan lines, so I may not be qualified to judge!

3
2

Forget tax bills, here's how Google is really taking us all for a ride

Peter Gathercole
Silver badge

Re: A free €100bn @Thomas 6

Except that you (probably) buy your apples at Tesco or ASDA, who funnel the profit out of your local area. OK, you have some minimum-wage jobs in your locality, but the major benefits do not stick.

I would much rather have locally owned and run shops, but unfortunately, I also need to keep my expenditure under control, and big businesses make it hard for them to compete on price.

3
0

Fedora cooks up new Linux for Raspberry Pi

Peter Gathercole
Silver badge
Boffin

@Jake

My journey was V6->V7->SVR2->SVR3 and onwards, so I was mostly isolated from BSD. I did have a BSD 2.3 or 2.6 distribution for my non-I&D PDP11 running V7, for Ingres, and it did have vi on it, but it would not compile in 56K, even using the experimental overlay loader that was also on the tape (this used one segment register in the PDP11 to switch different 8K pages into the process text address space, to allow you to have more than 56K of memory in an executable, and it required a new system call to be added to the kernel so that the dispatcher routine could request that the correct page was mapped into the process before actually calling the code).

I only really came across vi when I moved on to SVR2 (I had used an Ultrix machine before, but not too much). Up until that point, I had been using ed almost exclusively (although I did also use an extended ed editor called em, tagged as Editor for Mortals, which I believe came from Queen Mary College in the UK [hint - watch you don't mistype the "e" as an "r", very annoying]).

For some time, I worked for what is now part of Alcatel-Lucent (then AT&T and Philips Telecommunications), and became the terminal 'expert' in their UK system support team, so was intimately acquainted with terminfo (it was SVR2&3, after all), and to a lesser extent termcap (some of the AT&T exptools packages used termcap, even though terminfo was available on almost all systems). I looked after many different terminal types, including AT&T 4425, 5620 and 630, HP2932, adm3 and adm5, Wyse 30 and 50s, almost all DEC terminals from 52s to 420s and compatibles, and even an IBM 3151 (yeugh). I missed out on the days when you had to encode time delays in the various commands, however.

While there, I also had a source-code license for Gosling Emacs, which had its own (buggy) termcap.

The youngsters of today really don't know what it used to be like. I still get really annoyed when I see people hard-coding ANSI escape sequences into programs rather than using termcap or terminfo, or even Curses. It's just wrong!
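
By way of illustration, here is a minimal sketch using Python's curses module, which sits on top of the same terminfo database, so nothing terminal-specific is hard-coded anywhere in the program:

import curses

def main(stdscr):
    # curses looks up the terminal's capabilities via terminfo,
    # so no escape sequences appear in this program at all
    curses.curs_set(0)
    stdscr.clear()
    max_y, max_x = stdscr.getmaxyx()
    msg = "Hello from a terminal-independent program"
    stdscr.addstr(max_y // 2, max(0, (max_x - len(msg)) // 2), msg, curses.A_BOLD)
    stdscr.refresh()
    stdscr.getkey()

if __name__ == "__main__":
    # wrapper() sets up and restores the terminal even if an exception is raised
    curses.wrapper(main)

The same program works on anything with a sane terminfo entry, which is precisely the point of the database.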

0
0
Peter Gathercole
Silver badge

Re: joe vs. vi

It does not show your age, as vi preceded WordStar by at least half a decade.

What it does indicate is that you learned computing on some piddling little microcomputer, rather than a mini or a mainframe running UNIX.

1
0
Peter Gathercole
Silver badge

Re: VIM @FrankAlphaXII

If you want to get the best out of Emacs without learning all of the meta key combinations, you need to learn the Electric modes. Once you get the hang of them, Emacs can be a doddle.

The reason why vi (pronounced vee-eye, not vie or six, according to the yellow book) is a little hard to use is because it dates back to a time when the only keys that you could guarantee were on a terminal keyboard were the alphabet and number keys, a limited amount of punctuation, an ESC key and a control function. As long as the terminal had a program-addressable cursor, a small number of other features (and really not too many of those), and a termcap definition (yes, termcap in the original BSD, not terminfo), vi would work.

There were some terminals that were too broken, however. I remember comments in the original BSD termcap about some Beehive terminals, and an Ann Arbor Ambassador, that were deemed just too brain-dead to be able to write a meaningful termcap entry.

2
0
Peter Gathercole
Silver badge
Boffin

@Jake

Come on Jake.

Hardcore UNIX users use ed!

0
1

Woolwich beheading sparks call to REVIVE UK Snoopers' Charter

Peter Gathercole
Silver badge

Re: Right. @Peter Gathercole

"Alternatively just maybe knowing what internet sites these two had visited which had caused them to be radicalised could provide a list of possible other fanatics in the UK"

I will concede that this could be useful information, but it is likely that this will be obtained after the fact, as I'm sure that all their possessions are now evidence. I would be very surprised if the sites they were reading weren't already known. What you consider subversive information may be perfectly acceptable to other people in the world. As you said, what the security services would like to be able to do is identify everybody who is reading those sites, not necessarily the sites themselves. But this may still finger people who are just curious about such rhetoric.

"If they had posted their intentions prior to just maybe they could have been tracked down via IP information?"

Well. Did they? There is an "if" in your statement. I think that you would probably be surprised by how many people post such statements without any intention to actually carry anything like that out. I have said many times that I would like to drop a bomb on a certain campus in Redmond (there, I've done it again), but I will never really carry that out. If the police reacted to every casual threat that was tweeted, mailed or blogged, they would be very busy indeed.

My comment about porn was to try to show how little people understand our laws. I Am Not A Lawyer, and I certainly don't think I know everything that is illegal (like photocopying the Queen's currency, selling creosote to individuals who are not in the fencing trade, or allowing ragwort to grow in your garden - all of these are against the law).

My list was intended to be wide but not so wide it would not cover everybody.

You've still missed the point that if they are allowed to do this without proper supervision, at some point they will do it in a way that is likely to be objectionable to everybody.

3
0
Peter Gathercole
Silver badge

Re: Other soldiers @MJI

I ought to point out that I was referring to UK military bases in the UK.

0
0
Peter Gathercole
Silver badge

Re: Other soldiers @MJI

Most military bases have unarmed civilian security (I kid you not). There will be armed military guards somewhere on the base at any time, but every soldier checks their weapon into the armoury when they are off duty.

I believe that there have to be specific orders for weapons and ammunition to be issued for use off-base, and that would not have happened (in the British Army) for an incident like this. Even if it were protecting a fellow soldier, off base it is the Police's responsibility. British soldiers are in every way professionals.

5
0
Peter Gathercole
Silver badge

Re: Right.

@Titus

I deliberately made the list as wide as possible so that most of the readers would fall into at least one category.

I know I have latent conspiracy theory tendencies, so may be slightly paranoid about these things. The point I am trying to make is that if they use something like the list I presented as the initial trigger for monitoring, they may well end up seeing other things that you do that are less acceptable. I am pretty clean (in fact I only fall into one of my own list categories - I'll let you guess which), but will definitely be on their known list (for good reasons only, I hope).

I do nothing that *I* feel is worthy of their attention for bad reasons, but that does not mean I am happy for them to monitor my Internet traffic. I think that taking the defence of "I do nothing wrong so I have nothing to fear" ignores the fact that you don't know what *they* think is wrong, and there is nobody to challenge their view.

(BTW. If one of the three that you do is porn, then I suggest that you restrict yourself to sites that certify that all their models/actors/actresses/participants are over 18, because if you have images - photographs or other types - of people engaged in sexual acts who are or appear to be (in the eyes of the investigator) under 18 cached on your computer, even as a thumbnail in your browser's image cache, you almost certainly are guilty of infringing the Coroners and Justice Act 2009, sections 62-68).

6
1
Peter Gathercole
Silver badge

Re: How?

Chances are that if you have an IP6 address, then you are probably much more identifiable than if you stick with IP4. This is because it is less likely that an IP6 address will be re-allocated. You will either have it forever, or at least for a good long time.

But even if you are using a temporary IP4 address with NAT, your ISP will probably be able to identify the account holder and probably the physical location of the point where it touches their infrastructure, just as long as they take account of timezones and DST correctly!

Although I don't agree with it, the presumption is that they could profile a person who was becoming a risk by reading their blogs, forum posts, browsing history, email, IM and SMS messages and even purchasing history (how did you buy your machete), and once identified, single them out for even greater surveillance. Once under surveillance, they can be caught before doing any damage.

But this effectively means that they will need to watch all people who match certain criteria, including many who aren't, and never will be, a threat to society. It's a really difficult problem which will always upset some people on one or other side of the argument.

My view is that as soon as government agencies have the ability to look at what people are doing without sufficient safeguards, they then will eventually abuse that ability, and look for things that have not been sanctioned by this legislation. Anything. Being a member of a particular political party or religious group. Or an anonymous blogger about personal freedoms. Or an infrequent copyright infringer. Or harbouring anti-AGW thoughts. Or being upset with your local MP. Or a consumer of legal on-line porn. Or an objector to HS2. Anything.

Is everybody who supports this charter sure they are squeaky-clean?

6
0

Forums