Re: @Oliver Mayes @Corinne
In my experience, change control is never applied to requirements. It would be good if it could, but generally, it's not, especially on something with a duration measured in a few weeks.
That's really a low blow. For all the hassle they get, the constant travelling, the long debates that run into the evening/night, and the exposure they have to their constituents and their problems, most MPs are not in it for the money, and many of them care passionately about their constituents.
I don't know whether you follow your MP, but if you did, you would probably be surprised by how many days they don't get back home in the evening, or how readily they are prepared to talk to any of their constituents.
Rather than constantly being in hotels, they are allowed to have an expensed second residence. If their permanent residence is in their home constituency, then this second residence will be in London or the home counties. If they have been parachuted into a constituency, then it may be there (although I would like all MPs to actually live in their constituency).
Because they are often out of their constituency, they are normally allowed to run an expensed office with some staff there. Often, MPs top-up the running of their constituency office out of their own pocket, or have family members working for more hours than they are paid.
And like almost any other employed person, they are allowed to claim justifiable travel expenses and ad-hoc accommodation costs when away from any of their residences.
So yes, they do claim high expenses, because they do things that need paying for. And, yes, sometimes the rules have been abused. But probably not too much now (cases in the media nowadays are mostly historical).
They do not join Parliament to make money, at least not while they are an MP. Mostly, people do it because they want to make a difference, and precious few manage this against the political machine. If they become well known, they may make money afterwards by taking directorships or going on the public-speaking circuit, but I suspect that many MPs, after they leave office, either move into local government, find normal jobs or retire. Only a few make the really big bucks. Most just grow grey and disillusioned.
I have a machine that I purchased 12 years ago, and it's still running XP!
Let me check. It's the same machine, although I've had to replace the motherboard/processor/memory twice, the disk more than once, the graphics card and the power supply. I also replaced the DVD drive with a DVD/CD combo.
It's still the same machine because the case, floppy disk drive and CD burner are original. I think it has one of the original keyboards attached to it at the moment as well!
In case you ask, it is running a retail copy of XP home, which allows me to change the machine as much as I like!
In reality, most machines purchased in the last 8 years will probably have been skipped a long time ago, because very few people are prepared to do the hardware surgery necessary to keep older systems capable of running XP with SP3 installed.
So we're really not talking about systems as old as 13 years, we're talking about machines that could be less than 5. And some businesses with volume licenses may well have still been building XP systems more recently than this.
My last 'work' laptop was delivered to me new in 2010 with an XP build. It's just been replaced, and I opted to have Linux on it. Yaaaay. I am now officially a Microsoft free worker, having a work Linux desktop and laptop (it's complex, I work for a vendor at an end-customer site), and use Linux exclusively at home.
I can sympathise, but some of the problem is the carrier.
I had a Sony Xperia Neo on a contract from Orange in the UK. It was running 2.3.4.
Sony published an ICS upgrade for the phone, but Orange did not bother to repackage it. One thing the carriers don't tell you is that not only is your phone locked to their network unless you unlock it, but the phone you have is often actually a service-provider-specific model (check the last few digits of the long model name, and look it up), and cannot take the generic updates for the model.
This effectively means that the same phone may have later updates that you can't use.
I know I could have put Cyanogenmod on the phone, but why should I risk the functioning of the phone merely because the service provider chooses not to publish a usable and available update?
When I got my newer phone, I passed the Neo to my daughter, who stuck her Orange pre-pay SIM in and is very happy, even though it is running Gingerbread. Her previous phone (which had previously been mine, a Samsung Galaxy Apollo running 2.2) was passed down to my youngest child, who uses it with an 8GB micro-SD card as a music player and FM radio.
I think that phone service providers should be forced by law to offer to revert a phone to the vendor's generic software once they decide to stop passing on updates from the phone manufacturer.
While I agree with what you said, I think you missed what I was suggesting. I was suggesting that industry should skill up their coders so that they are capable of writing efficient code. This would be of benefit to many of us older people, as we came from such an environment.
Education does as industry wants. If there was a serious need to have people trained in writing assembler, within 5 years, the education system would be falling over itself with suitable courses (it takes that long to develop a syllabus and get it accepted). Vocational training could be even quicker as long as there were the trainers able to teach (although this is debatable).
The only reason that Java, C#, .Net and Python are the programming languages of choice in education is that they think this is what industry needs!
There is another alternative to building bigger and bigger data centres.
Rather than look at the power footprint of the hardware, why not start looking at the power footprint of the software?
Looking at what people are doing on systems nowadays, how much more productive are people with, say, office productivity suites today versus what they had 15 or 20 years ago, when systems had a fraction of the computing power and consumed a fraction of the energy? (You only have to look at a 15-year-old PC to spot that the power supply could only deliver around 100W; look at a modern PC, and you will find that 300-500W power supplies are the norm now.)
I know that there are new applications that people use that do need high footprint software (anything to do with high quality media is a prime example), but for many tasks, both on a commercial and a personal basis, modern software is big, bloated, and power hungry.
The power economies available from ARM and Intel's Haswell show that considerable savings are possible, but these have largely been soaked up by software with higher requirements. Reducing the memory footprint and CPU cycles required to run the systems means that each system will be able to run more work in the same power budget.
I'm not saying that all workloads can have their power significantly reduced, (Big Data and HPC workloads will always be memory and/or processor intensive), but much of VDI and running simple data processing workloads, and running web sites are hugely inefficient because of the way they have evolved and the tools used to write them.
So my view is: dump the RAD tools and languages that require tens of megabytes to run "Hello, World", move back to the development of light-weight applications on stripped-down OSs, coded by skilled coders who are tasked with writing efficient code, and then run more work on systems within the existing power footprint.
The cost balance will move from quick to develop but expensive to run, to expensive to develop but cheaper to run, but that equation will shift as power gets more expensive. It will have to happen eventually once computers reach the limit of what can be achieved in the available power budget, but why not start now before the crisis hits us?
That device creates a semi-focused EM pulse that would knock out all of the cars (and pretty much any sensitive electronics as well) within a certain area, probably including the police vehicle itself. It's a very blunt weapon. It would be good on a battlefield (which is where it would be effective if used by a non-technologically-augmented infantry soldier, especially against smart soldiers and exoskeletons).
I would love to see the compensation claim against the police from a couple of hundred drivers for their cars, in-car entertainment systems, phones, watches, and a myriad of other devices, especially if the device was operated in a built-up area.
Automatic transmissions are still the exception in most of Europe.
I have a compromised right arm. I'm not disabled, but the biceps take no part in moving my lower arm since I ruptured the tendons at the lower end (in case you are wondering, it is no reflection on the NHS that it was not fixed; there were practical reasons why I did not have it done, including the risk of nerve damage to my right hand and calcification of the elbow).
As a result, I generally have cars with power-assisted steering now. I can drive a car without it, but driving a car designed to have power assistance without the power is completely different from driving one designed without it. I had a Rover SD1, and even with the car moving, the wheel took two hands to turn if the engine was not running (once, after a breakdown, I was towed using a solid bar that required me to steer and, to some extent, brake, but there was no power assistance for either - it was not pleasant). I think I could have driven it if the engine ran but the steering pump was not working, but it would have been difficult.
Similarly, if the brake servo craps out on you while you are driving, don't expect the car to have the same stopping distance that it has with it running. The power assistance is there for a reason.
I agree with your statement about cars having to be drivable without any power assistance, but that does not make any statement about how comparatively safe they are in that condition.
I believe that it's a generational thing.
When I went to University in the late 1970's (when computers were still seen as rooms filled with metal cabinets and blinking lights), not only was Computer Science a rising subject, but it attracted bright people.
I will admit that at the time, it was regarded as a very niche subject, having just about broken free of being a sub-genre of Mathematics, and the people were, how can I put it, um, different, or maybe eclectic, but some of the brightest people I have known were working with computers.
It needed a new and different mind-set that required you to look at problems in unusual and in some cases completely bizarre ways (the canned solutions had not been developed yet). You needed to be a little weird to be attracted to the subject, and there was no promise of high salaries. It suited future geeks like me.
I was lucky enough to have the right skills at the right time, and I rode the wave through the '80s and '90s, being one of the people who advanced rapidly because there was a skill vacuum which led to salaries and responsibilities rising faster than my peers in other jobs. At this time, the high wages and apparent skill shortages meant that Computer Science and related disciplines looked very attractive to new students, which led to a mushrooming of the number and type of courses and students studying them.
But it also led people to come to the subject as a way of earning a living, rather than because they were really interested in it. A true Computer Scientist will think about computers outside of work. Someone using the discipline to earn a living will normally switch off as soon as they leave work. There are too many people for whom computing is only a job, and this damages the field as a whole.
There has also been a backlash. Many people outside of IT do not understand why there is a legacy of relatively high remuneration. It is still the case that skilled computing jobs can command high salaries, and this is often resented by other people. Many organisations are attempting to align their IT staff down to clerical grades, not understanding that this will prevent them from recruiting the best and brightest. It also makes older people like me very jaded, because I see the lower levels of the profession full of grunts who do not, and in many cases, cannot fathom what it is they are doing beyond following procedures. This reinforces the belief that all jobs in IT are over-paid.
I believe that the same thing has happened in Climate Research, but instead of it having taken 40 years, it's happened in about 15. The older people who really know what it's all about are leaving the high-profile Climate Science roles, and their place is being taken by people who see the subject as en vogue and sexy, but do not bring the required levels of in-depth knowledge. The big difference is that it is not money, but reputation and influence, that is driving the desire to join the field. And instead of being over-paid, the current crop are seen as having too much influence.
You're one of the people who removed the wire clips from the Centronics-type SCSI-1 connectors (they weren't Centronics connectors, that was for parallel printers), aren't you?
With those buggers clipped in, it was often impossible to get the cables out, especially if there was no space on either side of the plug to unclip them!
The reports state that about 7,500 people worldwide are being transferred to Lenovo. Obviously, some of these people will be involved in manufacturing and sales, but there is plenty of scope for the x86 iDataPlex and NeXtScale engineers and architects to be among them.
"Two-thirds of homes already have a satellite or cable box through which they could pay the BBC sub."
OK, that covers 1 of the 8 televisions in my house. Do the other 7 become useless? The people who come up with this guff assume that there is only one TV in the house. I wish they'd leave the 1970s and move into the 1980s, when more than one TV per house became common.
Oh, maybe they assume that they are all modern TVs, and have CAM modules?
Well, it's possible that the ones in my house with Freeview built in may have them; I've never needed to check the LCD TVs I bought the kids. But there are at least three in the house that use external STBs that definitely don't.
If you go down to Tesco and buy one of their £17 STBs for Freeview, they definitely don't. And I suspect that a significant part of the older members of the population, plus a huge number of older TV's that have been re-purposed to entertain the kids or sit in the kitchen will have a cheap STB rather than something that can use a CAM.
So. Are we all going to get some financial support to replace all this with new kit?
And how are you going to make broadcast radio conditional? A lot of radio listening is done in the car or on mobile battery-powered radios that already exist.
The CHRP platform formed the basis of all RS/6000, pSeries and Power systems from the second generation 43P (the 7043 models, not the original 7248 which was a PReP model) right up to the current day.
Although modern Power systems use PCIe rather than PCI or PCI-X, they are still under the covers CHRP platforms, although they are not categorised as such any more, because it is not important. I'm sure CHRP has evolved, but it is still CHRP.
If I look at one of the Power7 systems running AIX that I help look after, I can see "devices.chrp.base.rte" along with 25 other support packages that mention chrp. And I can tell that this is not for legacy systems, because amongst them is "devices.chrp.IBM.HFI.rte", which is the support package for the HFI interconnect that does not appear on any IBM Power server other than the 9125-F2C Power 775 HPC system.
So CHRP is alive and well, but only in IBM supplied systems.
It is possible that the Power8 systems will not be CHRP, because the fundamental GX++ Power bus is no longer used as the primary system bus, and has been replaced by the PCI Express 3.0 based Coherent Attached Processor Interface (CAPI). Whether CHRP will be extended to include CAPI or replaced, I do not know.
If they turn out to be a real bust, and don't get customer engagement, IBM is going to be looking very much weaker with a significantly reduced hardware portfolio.
I wonder when they are going to drop the "Machines" part of "International Business Machines" because they no longer make enough hardware.
Strange you mentioned printers. Remember that Lexmark used to be IBM's small and medium end printer business before it was spun off.
What amazed me at the time was that IBM spun off Lexmark, and then almost immediately introduced new ranges of laser printers that directly competed with the Lexmark product range!
Ditto Palm devices.
You've forgotten to take into account the hosting costs, which include space, power and bandwidth costs, and may also include a rental on the hardware for their servers and rental of an office space.
They also appear to host developer events, which are unlikely to be free to arrange.
Taking this into account, I do wonder how they managed to clock up a $20,000 power bill. How many servers are they running?
Mind you, the picture at the foot of their home page makes it look like their test servers are in someone's garage!
While your history is correct, it seems a bit random except that it mentions both IBM and Fujitsu.
In reality, Amdahl ceased to exist in the 1990s when it was fully subsumed into Fujitsu, which coincided with Fujitsu effectively exiting the plug-compatible mainframe market when they did not develop a zSeries-compatible system.
I keep coming across ex-Amdahl (and ICL) employees in the UK who wound up in Fujitsu's services arm working on whatever they can to stay employed until retirement.
What is more interesting is whether the Flex and NeXtScale lines will go as well, because that will impact IBM's presence in the midrange AIX, IBM i, and HPC markets, leaving it in the mainframe and niche server market place. What else does STG actually sell apart from consultancy? My presumption is that it tended to be sold as a hardware/software/consultancy package a lot of the time. My guess is that it will quietly disappear, with the remaining work split between the Software and GTS divisions.
Yes, I was wondering where the consumer representation was on this advisory board.
The only people on it are those who are likely to financially gain, and not those who will lose.
My household has Sky on one telly, and limited IPTV on two others (through consoles and BluRay players), and the other 5 rely on terrestrial TV.
Living in the sticks, where LTE and Fibre services have not yet reached, and where broadband is currently limited to ADSL 2+ Annex M, and even 3G and DAB services are very patchy, it is unlikely that IPTV for the whole household is realistic.
I believe that was just an example.
An encrypted file may not sound like white noise. The encryption method may introduce patterns, and may not generate a white noise type distribution. I'm sure I could come up with some (admittedly poor, but I only spent 10 seconds on it) method of using integer encoding of an exponential of the bytes in a data stream to generate significantly non-random files.
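To make that 10-second idea concrete, here's a toy Python sketch of my own (the constant 48 and everything else about it is an arbitrary invention, and as promised it is a deliberately poor encoding) showing how an "integer encoding of an exponential of the bytes" turns even a perfectly flat input into something very far from white noise:

```python
import math
from collections import Counter

def exp_encode(data: bytes) -> bytes:
    # Toy "integer encoding of an exponential of the bytes": each input
    # byte b becomes int(e^(b/48)), clamped to a byte. Deliberately poor.
    return bytes(min(255, int(math.exp(b / 48))) for b in data)

# Feed it a perfectly flat byte distribution...
uniform = bytes(range(256)) * 16
encoded = exp_encode(uniform)

# ...and the output is anything but white noise: low input bytes all
# collapse onto a handful of small output values, so a simple frequency
# count flags the file as highly non-random.
freq = Counter(encoded)
print(len(Counter(uniform)), "distinct input bytes ->",
      len(freq), "distinct output bytes")
```

Run a byte histogram over the result and it looks nothing like the output of a decent cipher, which was the point: "encoded" does not have to mean "statistically random".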
And how do you 'play' a data file? All audio data has to be encoded in some form or other, even if it is just successive 8-bit sampled voltage values from a microphone.
Have you ever played around with SoX and got the encoding wrong? Sometimes not even music sounds like music. Try playing an MP3 as a raw WAV.
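For anyone who hasn't tried it, here's a small Python sketch (standard library only; the 440Hz tone and 8000 samples/sec rate are arbitrary choices of mine) showing that "audio" is just raw bytes plus an agreed set of conventions written into a header:

```python
import io
import math
import wave

# One second of a 440Hz tone as raw 8-bit unsigned PCM samples: on its
# own, this is just a featureless run of bytes.
raw = bytes(128 + int(100 * math.sin(2 * math.pi * 440 * n / 8000))
            for n in range(8000))

# Only the agreed conventions (mono, 8-bit samples, 8000 samples/sec)
# recorded in the WAV header turn those bytes into playable audio.
buf = io.BytesIO()
with wave.open(buf, "wb") as w:
    w.setnchannels(1)
    w.setsampwidth(1)
    w.setframerate(8000)
    w.writeframes(raw)
wav_bytes = buf.getvalue()

# The header is pure convention: a "RIFF" tag, then "WAVE", then the
# format details, then the same raw samples, untouched.
print(wav_bytes[:4], wav_bytes[8:12], wav_bytes.endswith(raw))
```

Strip the header off (or lie about the sample rate) and the very same bytes become noise or chipmunks, which is exactly what happens when you get the encoding wrong in SoX.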
The assumption in this sub-thread is that you can recognise that some file or device is actually encrypted. In reality, a file of seemingly jumbled data without a recognisable format could be anything. There does not have to be any implicit recognisable format in a data file. Some files contain headers or some fingerprint that point to the format of the file for convenience, but that is by convention, not any fundamental property of the data.
As long as you know how to process the data (be it background noise from the LHC, some new audio or video encoding, or a valuable secret), there is no need to put hints into the file to help other people. All that is needed is that you and anybody else using the data knows how to process it. It then becomes a matter of inspired guess work with some maths and statistics for anybody else to access the data.
For some background in arbitrary pre-shared secret codes, look back at this previous story. Follow up stories suggested that the message was read only when the pre-shared secret was identified.
In reality, there is no practical difference between encoding and encrypting. Encrypting just has a more complex encoding method.
If you think about it in a lexical manner, en-coding means applying a code to a data set, and at a fundamental level, a code and a crypt (as in encrypt, not that room under a church) are different names for the same thing.
Now sit back and wait for someone to offer a reason why a code and a crypt are different.
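While we wait, here's a sketch of why I say they're the same thing: a repeating-key XOR in Python (a deliberately weak toy, not a real cipher) that you could equally well describe as an "encoding with a parameter" or an "encryption with a key":

```python
from itertools import cycle

def xor_code(data: bytes, key: bytes) -> bytes:
    # Repeating-key XOR: a reversible byte mapping parameterised by a
    # pre-shared secret. Call the secret a "codebook" or a "key" as you
    # prefer; the mechanics are identical (and, as a cipher, very weak).
    return bytes(b ^ k for b, k in zip(data, cycle(key)))

msg = b"attack at dawn"
secret = b"crypt"

scrambled = xor_code(msg, secret)
# Applying the same "encoding" again with the same secret decodes it.
print(xor_code(scrambled, secret) == msg)
```

Whether you file that under coding or cryptography says more about your vocabulary than about the transformation itself.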
That's a very interesting question that I've wanted answering for a while.
If a file is of a format that the investigating authorities don't recognise, how do you prove to their satisfaction that it is not some new form of encryption that they don't know about?
Suspect: "Officer, I was investigating patterns in files of captured entropy data for use in random number generators"
Police: "Don't believe you, sonny. Tell us how to decrypt it, or go to jail!"
"If you know someone struggling with XP on an aged box, do 'em a favour and help them"
I always do, although it puts pressure on my free time. Mind you, in the Unity world, with Ubuntu getting rapidly more resource hungry, I'm having difficulty working out what is now a suitable distro to recommend. Currently, I'm suggesting Mint Debian with Mate, but that is now too large to fit on the smaller netbooks.
I don't want to go down to the lowest levels of Puppy and CrunchBang, even though these may be suitable for the lower-spec machines. I'd like something with regular updates, but not as heavy as the newer Ubuntu releases.
Maybe I ought to look again at Lubuntu or Xubuntu but even these are getting significantly bigger in their later releases because of the underlying code base. I must admit that I've lost faith in Canonical keeping Ubuntu as a mainstream Linux distribution as opposed to a boutique OS based on Linux.
Linux Mint or Win7 was not the choice I was setting. You are lucky enough to be able to afford a Win7 system for her, which makes it a style/fashion choice, unless there are Windows only packages that she needs to run (as opposed to being what she is used to).
If that was not an option, what would she and you have done? You didn't answer that question.
Would Win7 have been important enough to you (collectively) to drop a rent or mortgage payment, or not eat for a month? That is the question.
... that many home users will not/cannot replace what they perceive as a perfectly usable system, merely because the OS is out of support.
The only thing that will make them do anything at all is if their critical websites, like their banking, shopping and on-line media sites, stop working because they cannot update their browser. As long as updated versions of Firefox or Chrome are available, they will stay on XP regardless of other problems.
For many, many people, £300 (or more if they have more than one to replace, like the kids' systems) for a new computer is enough of a hurdle for them to take the risk. After all, what would you do if you could not afford a new system, and had the choice of either stopping using computers completely, or continuing with XP with a higher risk of being exploited? This is what a lot of people miss, especially IT sellers and even readers of this site (where many of us probably have above-average incomes).
When the browsers on XP can no longer hack it, there may be a real opportunity for Linux to extend the life of otherwise potentially useless Windows XP systems. What we need is someone like Which? or other consumer publications to publish a review of a suitable distro for 2GHz P4/early Core 2 grade systems with modest memory and graphics capability, as an alternative to sending a computer to the dump. I think that there would be some people who would consider this when given it as an alternative, especially if it is a relatively painless install aimed at novice-level technical experience (yes, it can be done).
I think that the corporates who are/were expecting the end of XP support to be a lever for more PC sales will be disappointed in the home market.
To tell you the truth, I absolutely hate browsers on different devices automatically syncing up. I have different browsing habits at work than I do at home, and can't stand it when my tablet or phone decides it wants to open the dozen or so tabs that I last used at work, especially when some of them are behind login pages, and thus will fail to load.
I mean, I would actually like to go back to the days when I opened the browser, it actually went to my home page, and stayed there until I went somewhere else. Anything I'm interested in, I'll bookmark, and open when I want.
One thing that worries me is that syncing across browsers also means that something is linking the different devices up without my involvement. Mozilla/Firefox must be doing some serious profiling to work out that my Linux desktop at work is somehow linked to my personal phone, but it does.
It's scary when I look up a route in Google Maps, and then have Navigation on my phone pick it up. Convenient, maybe, but I would prefer to have the control over that level of integration myself.
I know that if I dig through the settings enough, and regularly clear the cookies, I can probably get what I want, but the defaults currently do not suit me. Maybe there should be some privacy profile that gets filled in the first time you use a particular browser.
It's probably a generational thing. I don't want my life plastered across the Internet, because I grew up expecting a degree of anonymity in life, whereas more recent generations appear to not value that at all, and often court the whole world to know where they are and what they are doing.
OK. Let's assume that you can compromise the X server by adding this compromised font to a user-writeable directory and forcing a font rehash.
To do this you need to run commands as that user anyway! Again, if I can do that as a user with some privilege, I can bypass the X server exploit because I'm already running commands as the user!
Please keep up.
Actually I can think of a way of doing this without needing to run commands as the user, but I would need a valid copy of the xauth magic cookie. But if you've got a copy of the magic cookie, you can do all of the key stroke and screen capture you talk about anyway, so again the exploit is superfluous.
Yes, it's terrible. Back before the Morris worm, nobody thought to code anything with the level of care that we apply now, because nobody thought these things were possible; they'd never been done before. I'm sure that there were huge numbers of buffer over-run and similar exploits scattered around every single OS of that era.
Every aspect of computing, be it OS design or the primitive networking that was available back then, must be looked at as primitive by today's standards.
As always, isn't hindsight a wonderful thing?
It's interesting that the arbitrary-code-execution buffer overrun problems only work if the compiler stores local variables on the stack, the stack grows downwards rather than upwards, and there are no stack frame barriers imposed by the OS or language runtime. It also helps the actual stack-smashing arbitrary-code-execution exploit if the calling conventions and layout of the stack frames and return addresses (which are architecture and OS dependent) are known in advance.
This means that gcc-compiled binaries running on Intel platforms using the conventional Linux calling conventions can be easily targeted, but otherwise you need to know the target before you start. Of course, corrupting the stack will have unpredictable results regardless of the architecture, but most of these will be denial-of-service type problems rather than arbitrary code execution. Still a concern, but rather less so.
I'm putting my old-fogey hat on here, because there is a lot of history behind X.
When X was first deployed in the mid-1980s and early 1990s, it used to be that you did not have a graphical login. The sequence was nearly always that you logged in using a text-based login mechanism, and then you started X up with the startx or xinit command.
What this meant was that the X-server was run as you as a user, rather than as root, a privileged user. Indeed, as I write this, I've just switched across to an AIX 7.1 system that does run an X server (it's the enterprise management system for an AIX cluster), and I can see that the /usr/bin/X11/X binary does not even have the set-uid-on-execution bit set, so on a traditional UNIX system using code derived from the MIT X11 code, the X server certainly does not need to be run as root when started using the old methods (I've just tested this as well, and startx still works, and still starts up MWM on an AIX system. How quaint!)
Whilst this does not alter the fact that an oversight in the code could cause the problem documented here, it would mean that any exposure will not have root access.
At some point, some bright-spark decided that a graphical login process was a good idea, so they started X (as root) before the user logged in. I would need to check, but I'm fairly certain that the original xdm (X Display Manager) actually re-spawned the X server as the user when you logged in on the console of a system, but it is certain that the CDE dtlogin, and GDM and whatever KDE uses for graphical logins keeps the server running as root, with the correct X cookies in the xauth file to allow the user's X clients to connect.
This is a broken concept. It was originally intended that the X server process should be running as a non-privileged user. It would have been pretty simple to start the X server as a non-root placeholder user, rather than root, but I guess that nobody thought of it as a problem.
I'm not sure whether there are any of the X extensions (like DRM) which need to be able to talk directly to the hardware as a privileged user, but the original intention was that the X server would not be privileged.
But anyway, who uses BDF fonts any more? They're pretty obsolete, and to exploit this, you would have to put a compromised font in the correct place and then re-start the X server. As the default locations for the fonts are in a directory that an ordinary user does not have access to, as is the initial starting configuration for the X server, it would require root access to put the font files in the directories in the first place, which rather negates the value of the exploit! If I already had root access on a system, there are a whole load more back-doors that I could add without going through this rigmarole.
I suppose it may be possible to place the font file in the user's own file space, then trigger a font path change and/or font rehash with the server running. It would be interesting to see whether this would actually crash the X server. I might give it a try.
"Without educational licensing, yeah, I'd question it greatly."
And there you have it in a nutshell. Microsoft using their OS and software licensed at a loss in order to reinforce their market dominance.
If it were anything other than something reducing Government costs, this would be branded as an illegal subsidy by anti-competition bodies.
To be fair, they weren't the only people doing this. Part of the design of the BBC Micro was that almost everything you needed could be installed in the Beeb itself, including the power supply, so all you needed was a bare drive (Viglen used TEAC drives, or at least mine was one), a ribbon cable with the right connectors for the data, a four-wire cable with the correct connectors for power, and something to put it in.
I had one of the 'luxury' drives that was 80 tracks, double sided, and even had the switch to double-step the head motor so that it could read 40 track disks!
When I looked, I could not understand how Viglen managed to make a profit on these devices, because the bare drives were no cheaper at retail prices than the cased ones.
Another thing. Viglen was not an Alan Sugar brand back then. He bought it in 1994.
"perhaps they should have thought of that before becoming parents"
Don't take this the wrong way, but do you think people foresaw these problems 10 or 15 years ago, when those who are currently looking after tweens and teenagers made their decision to become parents? (And I think that the nature of sex often forces parenthood onto people unplanned, especially if a lack of good sources of information meant they never got decent contraception advice.)
Nobody really knows what it is like trying to look after children before they have them. Don't you remember the increasingly hollow and worried feeling as the birth of your first child approached? I know I was petrified!
I'm sure that good parenting classes aimed at new parents 10 years ago did not even mention the perceived hazards that the internet now poses. For goodness' sake, most households would not even have had internet-capable computers before the dawn of this century, let alone devices carried in their pockets that could access it.
Things change, as do responsibilities, and the world of the Internet and what it can enable far outstrips what most non-IT literate people realise, both good and bad! This is why they want someone else to take the responsibility of protecting their children. They just don't know how and cannot understand the process to get the required knowledge.
I don't disagree, especially about parental responsibility, but it's becoming increasingly impossible to install filters on all the devices, unless you only have a small number of internet-capable devices in the house. This is especially true if parents choose to buy smartphones or tablets for their children which are allowed to connect to the Internet.
It won't be long before all TVs and other devices contain some form of internet connectivity, and trying to put parental filters on those could prove a challenge for a technically able person, let alone the average Joe Bloggs. I have well over 30 internet-capable devices in my house, and I do not know how to impose filters on Xboxes, Wiis and PS3s, or even my daughter's Mac.
More boundary protection (making the routers act more like a firewall as long as you could select the degree of protection) would help, but that would not be significantly different from the ISP filtering in their network, especially if they maintain the block-list.
Even if you do put some form of parental filter on the individual systems, you are at the mercy of the organisation maintaining the block-list as to what is allowed through, just as much as if the ISP does the filtering. I fail to see any real distinction.
I don't believe in filters as a substitute for responsible parenting. Our household has been connected for over a decade, with wireless, and with computers that the kids use exclusively (i.e. I don't) for much of that time. For the last 5 years or so, everyone in the house has had their own system that they control (except my wife, who wants someone else to fix hers when it is apparently broken).
What I do have is a firewall that logs all the URLs that are visited. I told my kids when they were younger that I was not going to put any filters, blocks or parental controls on what came into the house. But I did say that I could see most of what they were doing if I had cause to, although I would not under normal circumstances. As far as I can tell (and I have looked for signs of them using proxy or anonymising services) they have never attempted to hide what they are doing.
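For anyone curious how little is needed for this, here is a sketch of the sort of thing I mean, assuming a Squid-style proxy log in its default native format (the log path and the client IP are illustrative examples, not my actual setup):

```shell
# In Squid's native access.log, field 3 is the client address and
# field 7 is the requested URL. List the distinct URLs one machine
# on the LAN has visited.
awk '$3 == "192.168.1.23" {print $7}' /var/log/squid/access.log | sort -u
```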
We (my wife and I) also have an open policy that if there is anything they are worried about, be it viruses, health issues or inappropriate material, they can always talk to us to discuss it without any recriminations. And of course, they can talk to each other about similar issues. It has not always worked: I believe that my oldest son was the recipient of non-physical bullying, which he said nothing to us about at the time. But we try.
I hope that my kids are well adjusted, and have acquired a knowledge of where to draw the line about what is appropriate.
That is my attitude, and my responsibility. I know that there are others out there who welcome the additional controls. That is their decision, and I accept that there are valid reasons why they may want that. And having a filtered internet feed does have a place for people who cannot ensure that their systems are suitably protected. It's just another (justified, in their eyes) brick in the wall. It really is the case that even quite knowledgeable people can't be totally sure that the systems in their house are protected to the degree that they would like. Computers are just too complicated for anybody but the most technically able to protect, especially the 'sheeple' you are talking about.
This means that I agree that parents need to take responsibility. But I'm not going to suggest that kids should only use computers under adult supervision, at least not once they reach an age where the parents would trust them to be out on their own, for example. That way leads to young people who will go to extraordinary lengths to get out from what they will see as over-controlling parents. Trust is important.
Your arguments risk descending into the realms of wrapping kids up in cotton wool, which results not in well-adjusted members of society but in grown-ups who do not want to take their own decisions. I've seen the results of over-protective parenting, and it often leads to behaviour as bad as or worse than that of kids given free rein.
It's a complex and difficult area that will always have winners and losers, fans and critics, whatever is done. There is no winning solution, just a choice between less-bad ones.
You've missed the point.
There may be some parents who want to have a filtered connection, but would like sites specifically set up for teenage sexual-health issues to be allowed, because giving a reliable source of good advice is much better than learning in the playground/behind the bike shed (or wherever teenagers hang out now).
From the article, it is these sites that have been incorrectly blocked, so parents with that mindset would not just turn the filters off because it would allow much worse through.
Quite often, sources of good information are publicised in doctors' surgeries, libraries, and sex education classes at school. That is how the sites get known. Whether the blocks are spotted depends entirely on whether they are applied silently, or whether a banner message appears along the lines of "You've been spotted trying to access a filthy site. Desist, or tremble in your shoes while we tell the account owner!"
Fortunately for me, the last minor in the house turns 18 in January, so I will just turn the block off when I get told how, not that I was overly worried in the first place.
Good point. I should have remembered the exact quote better, especially considering how much of a wordsmith Douglas was. But still 126 ly is nothing bearing in mind that the diameter is 100,000-120,000 ly.
The distance from Earth to the centre of the galaxy is about 27,000 ly, so Ursa Minor Beta at 126 ly from Earth is just next door.
Betelgeuse, which is often quoted as being close, is ~643 ly away, which is considerably further.
I'd leave the peanuts unless you intend to travel by matter-transference beam. I'd grab the towel myself.
What I've never understood is why, if the Earth is in the unfashionable western spiral arm of the galaxy, Ursa Minor Beta (β UMi, or Kochab), which is a mere 126 light-years distant and thus in the same arm, is the third hippest place in the Universe, and contains the second hippest place. The hippest place may also be there (Zaphod Beeblebrox's left cranium) if he happens to be visiting the entrance lobby of the Hitchhiker's Guide to the Galaxy offices.
"When you are tired of Ursa Minor Beta you are tired of life." (Playbeing magazine).
And, ironically in the UK, I hear most of the DAB commercials on .... digital radio stations! Makes me laugh.
...it's just that it will be relegated to only carry local radio. It's only the national stations that will be forced to change.
So the FM radios that people have will not become useless. They will still be usable, but only to listen to local stations, which will continue to broadcast on FM.
Does not make me want the switch to happen any time soon. DAB reception is dire on my journey to and from work when I do most of my radio listening.
...offering a Win7 update at low cost to existing XP customers. Oh no, they're hoping that those customers will fork out for new machines, and count as new Windows 8 sales!
Unfortunately, unless MS do this, many XP users will keep it until they can no longer log onto their online banking, and then there may be scope for persuading some of them to use something like Linux Mint (note: I'm in the process of defecting from Ubuntu to Mint Debian Edition at the moment; trying to resist the whims of Canonical [Unity and Mir] has finally persuaded me to jump).
The X-axis is the 'wrong' way round, with the latest quarters on the left.
This is not what I expected when I first looked at it.
Except that the 360 does not use x86-compatible processors, so it is not strictly a PC in the "IBM compatible" sense.
In order to run Xbox 360 games, they would have had to include some processor emulation or run-time translation of the instructions. This is what Transmeta did for their Crusoe processors, and we can see how successful that was.
Looking at the staff writers remaining, I think that your last statement is probably true.
One wonders whether this includes any single person contracting companies that many IT contractors work through.
If it does, then the figure is mightily misleading, because it will not indicate any change in anything other than how it is being counted.
...whether the PoS tills run an embedded version of Windows, or one of the full-fat versions?
Ah. I'd forgotten the difference between SAS and SATA. I work so much with SAS that the restrictions in SATA compatibility flew past me while I wasn't thinking.
I'm making a (possibly erroneous) assumption that this thing is put together using industry standards, but...
If this is two disks, with a 2 port SAS expander built in, then plugging it into a laptop will show 2 drives under Linux. They should just work.
What won't work is any fancy Acronis drive-imaging software. But boot from a live CD, attach your old drive via USB, and then use gparted to partition and copy the data around. The only problem you may have is writing the boot record, but that should be relatively easy with GRUB.
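Roughly, the live-CD dance I mean (a sketch only: the device names /dev/sda and /dev/sdb are examples, so check yours with lsblk before touching anything):

```shell
# After copying the partitions across with gparted, reinstall
# GRUB so the new disk will boot on its own.
lsblk                                    # identify the old and new disks first
sudo mount /dev/sda1 /mnt                # mount the freshly copied root partition
sudo grub-install --boot-directory=/mnt/boot /dev/sda   # rewrite the boot record
```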
Anybody fancy giving me one to test this assumption?
Well, I guess spinning rust and tape.
So not so clear cut at all.
I'm still dubious about the longevity of data stored on flash RAM, especially if the flash is stored 'cold', i.e. without power. Until this is proven, I would be wary of using it for information that legally has to be kept for years, which is the traditional domain of archive and long-term backup.
And that's not to mention the security implications of having the data ultimately stored out of your control ('binding' contracts are only as good as the people who wrote them, and nothing beats having physical security surrounding your data). If a cloud provider goes bust, or is taken over by another company whose modus operandi is not acceptable to you, how do you extract and export the terabytes of information they've been holding for you, move it somewhere else, and ensure that they've destroyed all copies of the data?
The kit is not 30 years old. The design is.
OK, the design probably needs to be updated, but the way this is written suggests that the exchange is still running on kit bought when the IBM PC/AT was the benchmark PC!