On modern Linuxes
the first account set up is an 'admin' account, but by default this gives it very little additional access to the system. What it does do, however, is add the user to the "admin" group, which is configured so that they can use sudo when required to run commands with enhanced privileges. Thus in normal day-to-day use the system is safe, and you only have to worry about the things that fire up the request for the password.
If you set up additional accounts without adding them to the "admin" group, they will not even be able to run sudo or use any of the additional commands that need sudo access to run (like package managers, for example). This makes those user accounts safe even from users who click "yes" to everything. Their personal information is still vulnerable, of course, but they will not be able to touch any of the system files or directories.
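For anyone who wants to check this on their own box, here's a minimal Python sketch (standard library only) of the group-membership test that sudo's configuration typically keys off. The group names and the username are assumptions; distributions vary between "admin" and "sudo".

```python
# Minimal sketch: does a user belong to a group that sudoers grants
# privileges to? "admin"/"sudo" and the user "alice" are assumptions.
import grp
import pwd

def can_sudo(username, admin_groups=("admin", "sudo")):
    """True if the user's primary or supplementary groups include an admin group."""
    primary = grp.getgrgid(pwd.getpwnam(username).pw_gid).gr_name
    supplementary = [g.gr_name for g in grp.getgrall() if username in g.gr_mem]
    return any(g in admin_groups for g in [primary] + supplementary)

print(can_sudo("alice"))  # hypothetical user name
```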
I thought that OSX was the same, but if there are application directories that can be written to by one of these accounts without needing to use sudo, then its security is significantly weaker than I thought. I will thus nod to everybody who has been saying that OSX is no better than Windows, admitting that I was not totally correct, but point out that it is still better than the all-or-nothing situation in the pre-Vista Windows world.
On Windows 95 and 98,
there was effectively only a single user, with some slight trickery to allow some applications to store their defaults in different places for different 'users'.
All users were effectively administrator accounts, and as FAT16 and FAT32 filesystems did not have any form of per-user security, the entire system disk was vulnerable to infection from any account logged onto the system.
As a sideline, this last point is exactly why you should never do a WinNT, 2000 or XP install using FAT32 as the filesystem for the system disk, as this negates almost all of the security that segregated privileges provide.
On a side note, on XP and Windows 7 (I've not done a Vista install), the administrator password you are asked to set during install belongs to a hidden account that can only be used when the system is brought up in system recovery mode (or similar). This is intended for when the system will not start, or when users forget their own passwords.
By default when using the MS XP install process, the first named user account that is set up will be an administrator account unless changed. If you set up more than one user account during the install, the subsequent ones will not have administrator rights by default, but this can be changed.
But there is another point here. Many 'canned' Windows installs (for example, from system recovery disks) do not use the normal XP installation process, so even users who have restored their system will not have seen this setup process. Only those wearing hair shirts and doing everything from the lowest common denominator (MS install disks and vendor driver disks) will have seen these accounts being set up. But those of us who have done it this way KNOW that Windows installs are FAR, FAR more painful than some of the other OS offerings out there.
@AC 14:40 - Wrong.
That is the admin account for system recovery. Can't use that to log in when the system is booted normally.
The install process gives the first user account set up admin rights. Subsequent ones will normally be ordinary users unless specifically changed. I always create my own admin account as the first account, and then create additional ordinary accounts for each of the kids for day-to-day use. I never give the kids the password for the admin account I created, and I normally install any programs that need admin rights myself.
For those awkward programs that have to have admin rights in order to run, I also create a second admin account, which I then fix in the Registry so that it can't be used to log in interactively, and tell the kids to use "Runas" with this account for any applications that won't work from their ordinary accounts.
It's not perfect, because you can run anything with Runas as long as you can find it on the disk. But it meant that I was able to keep one of our shared machines virus-free for years (I also have an external firewall to block direct malicious traffic).
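As a rough illustration of the wrapper idea, a hedged Python sketch (Windows only); the account name "LocalAppAdmin" and the program path are made up, and Runas prompts for the password on the console:

```python
# Launch an awkward program under the locked-down admin account via Runas.
# Runas prompts for the password itself; nothing is stored here.
import subprocess

subprocess.run(["runas", "/user:LocalAppAdmin", r"C:\Games\awkward.exe"])
```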
I think some of this must have stuck in the kids' minds, because now that they are older and have their own systems that they control completely, they often keep using this model, and generally have fewer problems than their peers.
Lax network management
For goodness sake, at least segregate your DHCP space.
Allocate two IP subnets, trusted and untrusted. Register all of the MAC addresses of your trusted devices and give out addresses in the trusted range. Any unknown or foreign MAC addresses get given addresses in the other range. Allocate different DNS server addresses and default routes to each subnet. Use short leases to make sure that someone using a fixed IP address will be spotted (by duplicate IP addresses) as soon as the addresses cycle round.
Control routing between the two subnets so that untrusted devices get no access to internal servers, and only minimal access to the Internet and to devices such as printers. There: it does not matter what gets brought in, because it is unlikely to do any damage. And you do not even need to invest in a large network infrastructure, as most switches will multinet quite happily.
Of course, if you are paranoid, you could just not give out any DNS server address to unknown devices, or you could have something like Wireshark alert you whenever a source address appears in your untrusted range.
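To make the allocation policy concrete, here's a toy Python sketch of the MAC-based split described above. The subnets and the trusted MAC list are purely illustrative; a real deployment would express this in the DHCP server's own configuration rather than in code:

```python
# Known MACs get a lease from the trusted pool, everything else from the
# untrusted one. Addresses and TRUSTED_MACS are illustrative only.
from ipaddress import ip_network

TRUSTED_MACS = {"00:11:22:33:44:55", "66:77:88:99:aa:bb"}
TRUSTED_POOL = ip_network("192.168.1.0/24").hosts()
UNTRUSTED_POOL = ip_network("192.168.99.0/24").hosts()

def allocate(mac):
    pool = TRUSTED_POOL if mac.lower() in TRUSTED_MACS else UNTRUSTED_POOL
    return next(pool)  # hand out the next address from that range

print(allocate("00:11:22:33:44:55"))  # -> 192.168.1.1 (trusted)
print(allocate("de:ad:be:ef:00:01"))  # -> 192.168.99.1 (untrusted)
```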
In extreme security environments, lock network ports down at the switch to only a single device per port by MAC address, with the port being disabled if another device is attached. As soon as a user plugs something else in and locks the port, they either have to call the help desk (giving you a chance to rap them over the knuckles), or suffer the port not working forever.
I know that this can be defeated with LAA MACs, but if all you are trying to do is prevent users from attaching smartphones, printers and the like, those devices use fixed MAC addresses anyway. Most basic users would not know how to change the MAC address on their PC either.
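For what it's worth, the locally-administered (LAA) bit is trivial to test for if you ever want to flag suspicious addresses; a small Python sketch, with the example addresses made up:

```python
# The locally-administered bit is bit 0x02 of the first octet of a MAC.
# Burned-in vendor addresses have it clear; a spoofed LAA address sets it.
def is_locally_administered(mac):
    first_octet = int(mac.split(":")[0], 16)
    return bool(first_octet & 0x02)

print(is_locally_administered("00:1a:2b:3c:4d:5e"))  # False: burned-in
print(is_locally_administered("02:00:00:00:00:01"))  # True: locally set
```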
This is not far-fetched. I've seen all of the above deployed at real customers, and most large organizations do something along these lines by default.
Meanwhile, back in the real world...
ARM chips are being deployed more and more widely as people realise that they really don't have a current need for 64-bit processors for much of what they do. 32-bit with address extensions will do very nicely.
Just wait for ChromeOS and a decent server distro of Linux for ARM, and Intel will see all sorts of customers defect. They just don't see that it's largely about power consumption, and their track record in reducing power is not good.
@Sir Runcible Spoon
But the BT HomeHub router is on the local network, and so a judicious bit of logging code in the router allows such things to be captured. Remember, a router may do much more than routing, especially if you (or in this case BT) have control of the firmware. I'm sorry for the icon, but I'm not the one being stupid here.
@Don - Depends on which flavour of UNIX
IBM introduced dynamic driver load/unload, shared libraries by default, virtual Kernel address space (associated with never having to sysgen a system again), along with journaling filesystems and many other features, in 1990.
Shared libraries were around in SunOS before then, although the norm was still statically linked libraries for several years.
I think that your description of X11 applications is completely wrong for everything except Java graphical programs (but that is a Java problem).
The concept of Drag-and-Drop in X-Windows (and it was probably X10 at the time) was shown to me on a Torch TripleX running X.desktop (although I'm sure it was also called LookingGlass and possibly OpenTop) in the middle of the 80's, along with desktop icons and walking menus. I concede that MacOS had these concepts before then, but they were not foreign to UNIX even before Windows.
The standard X-Windows model for GUI programs was indeed to use toolkits and widgets (effectively library code) for drawing things like buttons, text boxes and pixmaps, and this does mean that the application has to keep some sort of track of what is going on in its own graphical space, but the server is what keeps track of where the cursor is. X-Windows is built around call-backs and managed data objects, which means that the X server (the thing that controls the keyboard, mouse and screen) always has a degree of separation from client programs (really to allow X-Windows to run across a network, something that Windows still does not do well), but it can only marginally be called Object Oriented.
This separation allows a client to be completely ignorant of the position of the cursor and of which parts of a window are obscured by another window. Each click, key press and other event is tagged with the current cursor position by the server, and when a part of a window is uncovered, the server sends one (or sometimes many) Expose events, saying exactly which part of the window needs to be re-drawn. And if the server is configured with BackingStore, the server itself can fill in the missing bits without bothering the client. This was designed to make it run efficiently with a network between the server and client.
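To make the event model concrete, here is a minimal sketch using the third-party python-xlib bindings. It is an illustration of the model rather than production code: the server tags each button press with the pointer position, and Expose events carry the exact rectangle that needs redrawing.

```python
# Minimal X11 client: the server, not the client, tracks the pointer and
# tells us which rectangle of our window needs redrawing after exposure.
from Xlib import X, display

d = display.Display()
screen = d.screen()
win = screen.root.create_window(
    0, 0, 320, 240, 1, screen.root_depth,
    background_pixel=screen.white_pixel,
    event_mask=X.ExposureMask | X.ButtonPressMask,
)
win.map()
d.flush()

while True:
    e = d.next_event()
    if e.type == X.Expose:
        # The server says exactly which part of the window to re-draw.
        print("redraw rectangle", e.x, e.y, e.width, e.height)
    elif e.type == X.ButtonPress:
        # Every event arrives tagged with the current cursor position.
        print("click at", e.event_x, e.event_y)
```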
In addition, things like window decorations (frames, resizing options, window control buttons) are all handled by a component separate from both the client applications and the X server. This is the Window Manager, which is what allows you to rapidly change the look and feel of the GUI. It works by encapsulating an application window (X11 defines a window hierarchy, with the root window at the top, application windows in the middle, and individual graphic contexts at the bottom handling widgets within application windows), allowing keyboard and button events to be acted upon before they are given to the client. This is also an OO-type feature.
I don't think that Windows integrated COM into the presentation manager until the late '90s, probably with Windows 2000, although it was available to applications before then, and all windowing applications needed to manage their own inter-application communication.
HP VUE and then CDE did provide something like COM, and this was before Windows 95, but coding for CDE was difficult, and the old X11 models still worked, so they were still used.
There are not many people now who actually code at the X11 level. Almost all applications are now written with toolkits or SDKs (like Motif, Qt and GTK+), which hide almost all of the complexity of how X11 works.
My EeePC 701
is currently acting as an Internet router allowing my home network to use a 3G USB dongle while I change ISPs.
I thought it would be a bit difficult to set up, but it took about 15 minutes. I already had Ubuntu 10.10 on it, though, and it is normally used as a portable network capable media player when I don't want to watch what the wife has on the main telly.
Greenland is an autonomous country, has its own parliament, and is not part of the EU. Thus I was not counting it as part of Denmark the country.
As a result, I would be surprised if the ban on Marmite applies.
Point taken, though.
I agree with your sentiments, but I would dispute that major financial institutions use UNIX primarily because it's safer. In actual fact, all of the financial institutions I've worked at (and there have been several) shield even their more secure OS's from untrusted traffic with layer upon layer of additional protection (firewalls, port filters, content-level filters etc.), and often run their internal networks in segregated segments for security purposes.
Mainly they use UNIX because it has scaled better in the past, has been easier to port applications between different vendor platforms running UNIX, and has better Enterprise RAS features and vendor support than most other popular platforms.
With very large Intel systems, virtual machine support, and major vendors differentiating their Intel platforms with RAS enhancements, these advantages are being eroded over time.
@AC (both of them)
My views presented here are deliberately contentious, to try to get people to think. I am a little undecided about which voting system has the fewest drawbacks, but I am reasonably certain that it is not FPTP or AV.
I agree with the post about mathematics (and other) teaching. I am not disputing that governments take a big part of everybody's earnings (although your 50% is an average; many people at the lowest end of the earnings spectrum only end up paying VAT - unless you are including employers' NIC, which is an expense to the business of employing people, and not really a tax on income unless you look at it with a truly pedantic eye). I used to run my own company with its own payroll, so I have seen all of these aspects of tax.
But unless you go to a full PR system with national candidate lists, which breaks local accountability (something I value strongly), there will always be some quantization errors in the representation of the people. This is true of AV, STV and MMP. And, given current party preferences, it will nearly always produce minority governments.
I know that some successful countries actually work with minority governments, but for every one that does, I could probably point to at least one where historically they haven't worked. In a minority government, charismatic leaders are the key, and in the cynical political atmosphere of the UK, I don't believe we trust any individual enough, especially after Tony Blair and Margaret Thatcher.
Another point I want to make is that people don't actually want democracy. What they want is what they personally believe is right. Whenever such people do not get their way, many of them are prepared to blame the system rather than accept that they are out of step with the 'will of the people'. And with a complex voting system, the least well educated and those that don't-give-a-damn-about-how-it-works will always feel this way as long as they don't understand the process.
I was also pointing out that while the share of seats in the UK does not often track the percentage of the vote (and this has always been a sore point for the Liberal Democrats and, before them, the Liberal Party - I've been politically aware for that long), it does not actually matter that much to the overall policies. Bills are presented, mostly (though not exclusively) by the incumbent party, debated, amended and eventually have to achieve a majority in a vote to become policy or law. Be more afraid of secondary legislation, which only has to be voted on by a select committee, than of poor representation in the House of Commons. That is truly unrepresentative.
I accept that between elections there is little that constituents can do to 'sack' their member of parliament, but votes in the House of Commons are by majority, so any minority government has to convince other parties, one way or another, to support them. I would be happy to see free votes more often, allowing MPs to vote according to their conscience and their constituents' wishes rather than along party lines, but sometimes it is necessary to enforce the party line. So I assert that a party with only 36% of the vote imposing its policies is actually not significantly different from the voting alliances seen in most countries with some form of PR and no overall majority, although I accept that it is the incumbent party that gets to present the most bills.
I would also point out that if you really want an accountable government, you ought to make voting compulsory for all eligible voters, because even 51% of a 65% turnout is only about 33% of the electorate - slightly less than the number who didn't vote! Is this truly representative? Or do you contend that people who don't vote don't deserve to be represented?
Anyway, electoral systems aside, somebody ought to take the "how-it-works" bat to Sarkozy, to beat some real knowledge into him.
@AC At the risk of reopening the AV debate
According to Electoral Reform Society estimates (there are no direct results, because the votes were not cast in the appropriate way), comparing First Past the Post, Alternative Vote, Alternative Vote+ and Single Transferable Vote, we get the following:
FPTP: Cons 306, Lab 258, LibDem 56, Others 28
AV: Cons 281, Lab 262, LibDem 79, Others 28
AV+: Cons 275, Lab 234, LibDem 110, Others 31
STV: Cons 246, Lab 207, LibDem 162, Others 35
Hmmm. Obviously, with all of these alternatives there is no better split. We still have to have a coalition, and one where *any* two of the three major parties would have a majority. <sarcasm>Sooooo much better.</sarcasm>
And with full PR (figures from the BBC), it would have been Cons 36%, Lab 29%, LibDem 23%, Others 12%, which is no better.
I prefer to know that the government was formed by the party with the biggest share of everybody who bothered to vote, at least in a system where a single party normally gets to form the government.
And I don't think that having a system where, in order to get a vote on what biscuits are served at the tea break, you have to engage in horse-trading with parties with whom you may radically disagree - as much of Europe has - is really that much better; it tends to lead to weak compromise government.
Of course, we could have a full referendum on everything, televised and using modern IT for instant voting, but I doubt that many people could stomach sitting down and listening to the boring and mundane arguments that form most debates in parliament, even if they understood them (try watching the BBC Parliament channel for an evening - and that is the interesting, selected highlights!)
Just face it. There is no perfect system, and any system will lead to some people being upset some of the time. Significant numbers of people are *always* in a minority in any democratic system.
Anyway, it's all a moot point. Regulating the Internet is like holding liquid mercury in your hand. It's almost impossible.
No it doesn't
PLT devices have discovery protocols (what looks like a periodic broadcast) so they can see each other. Chances are they also use UPnP and are probably visible to the HomeHub. That's the beauty^H^H^H^H^H^H danger of UPnP.
Even if they do not use UPnP, BT can probably make a reasonable guess about whether such devices are on the net by sampling packets on the net and looking at the first three octets of the MAC address, which identify the vendor of the device.
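As a sketch of that vendor guess: the first three octets are the OUI, assigned to a single manufacturer. The table entry below is a stand-in; real OUI data comes from the IEEE registry.

```python
# Guess the maker of a device from the OUI (first three octets) of its MAC.
OUI_TABLE = {"00:1f:84": "Example PLT vendor"}  # illustrative entry only

def vendor_of(mac):
    oui = ":".join(mac.lower().split(":")[:3])
    return OUI_TABLE.get(oui, "unknown")

print(vendor_of("00:1F:84:12:34:56"))  # -> Example PLT vendor
```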
My PLTs are Intellon based, and come with a (Windows) utility that allows you to set the encryption key. Not only does the utility find the devices, it can also tell you how fast they are operating, so there must be some other magic under the covers. I have a Linux utility in source form, so I'll have a look at how it works.
Still, I have a Linux-based firewall (really, separate from any of the comms kit - Smoothwall, since you ask) between my ADSL router and the rest of my network (yes, yes, I know there is a risk that the PLT signal escapes onto the wider electricity network, but that's why I set my own key), but it means that my ISP cannot probe my network.
You're aiming your criticisms at the wrong people. Don't blame the Linux community for not fixing the deficiencies of the hardware manufacturers and system integrators.
I know it is changing, but up until recently, in order to get dual head support for a graphics adaptor in Windows, you relied on the adaptor manufacturer to provide a Windows driver CD. If you were lucky or had an integrated graphics adaptor in a laptop, then the system integrator would get it working in their pre-installed image.
It all works out of the box. But consider this: try taking the same laptop and installing Windows from scratch. I can guarantee that it will not be so easy, and in my experience it can actually be *MUCH* more difficult.
Now I know that this does not fix the issues with Linux (all of which can probably be fixed - I've had dual-head support working in Linux in the past), but rather than pointing your finger at people who give their best-effort support, often in their spare time, pour your scorn on the adaptor manufacturers and the people who supplied you with the hardware. Make it clear that you want a pre-installed Linux that works out of the box. Give them the same degree of vitriol that you put into forums such as these.
Will it make any difference? Probably not, considering that Microsoft can choose who to give their substantial Windows discounts to, and have proved that they are prepared to financially disadvantage suppliers who ship systems with Linux installed. But at least try.
And don't just blame Linux or its user community.
So. Complain to Adobe and Steam, not to the Linux community.
But I'm not sure whether many people would actually be comfortable paying for a full CS5 license to put on a free Linux system. I would be worried that casual readers would assume that "On_Linux" == "Free", which is really not the case.
Unfortunately, we are in a chicken-and-egg situation. Adobe and other commercial software writers will not put their applications on Linux (particularly a single distribution) until there are enough people willing to buy it to make the port and their support infrastructure economically viable. Conversely, people will not consider Linux until there are sufficient applications that they need. And so it goes on.
My hope was that something like Ubuntu (and I'm using this as an example rather than RHEL or SLES because of the procurement costs involved in tying the availability of the distro to a support contract, which partially nullifies the free aspect) would manage to reach a critical mass that would encourage applications to be ported. Sadly, this is not happening, in my view partly because of FUD, but also partly because Ubuntu appears to have taken a sharp turn (Unity et al.) which has destabilized even the hard-line advocates.
Looking at the alternatives: the other Debian and Ubuntu spin-offs do not have a large enough organization backing them; SLES's future is a little uncertain due to the transfer of ownership of Novell; Fedora is not a suitable OS for commercial organizations without a lot of support effort because of the speed of change, and is completely unsuitable for non-technical home users for the same reason; RHEL costs money; and CentOS is probably too enterprise-oriented for home users.
To tell the truth
Lexmark ink-jet printers are a pain in the ass for Linux.
But that is mainly because they use a proprietary control language that nobody in the Open Source world really wants to support, because the printers IMHO are a complete waste of ink (literally - you use about a third of a cartridge just cleaning the head every time you turn one on). If they are not used, nobody bothers to write the driver.
Lexmark do produce drivers, but only in source form, and do not bother to build them even for the common Linux distributions. I built them for Ubuntu 8.04 (Hardy) and would rather ditch the printer than go through that again!
Lexmark laser printers are not so bad, because they use either PostScript or HP's PCL.
As an example of well-supported printers, I have some HP all-in-one printers on my network, hung off a NAS via USB, and they are HUGELY easier to set up from a Linux system than from a Windows box (the NAS does not run Windows, so cannot hand out the driver). If you try to use the HP install CD, it assumes that the printer is directly attached to the system, and if it is not, it refuses to load the driver.
Ubuntu 6.06, 8.04, 10.04 do not need a CD at all.
I'm confused about the queueing. A home user with a printer attached to their PC needs little or no queueing. If you run a Windows desktop as a print server, how does it differ? The only thing I can think of is remote job status, and to tell you the truth, if you use Windows (my experience is mostly XP) like that, the information you get back is of little or no use anyway. And even queueing locally on XP, I keep getting prints stuck at the top of the Windows print queue that block the entire queue until you cancel them (holding them does not work).
lpd - if that is what you are using, although CUPS is much more common on current Linux distributions - gives you the same degree of control when set up correctly; it's just not dressed up as much.
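For anyone who wants to poke at that control programmatically, a hedged sketch using the third-party pycups bindings; the queue name and the file are invented:

```python
# List queues with their status, submit a job, then cancel it - the CUPS
# equivalent of clearing a stuck print job. "hp_aio" is a made-up queue.
import cups

conn = cups.Connection()
for name, attrs in conn.getPrinters().items():
    print(name, "-", attrs["printer-state-message"])

job_id = conn.printFile("hp_aio", "/tmp/test.pdf", "test print", {})
conn.cancelJob(job_id)
```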
Oh, and I think that you'll find that not even UNIX had lpd in 1972 (a print spooler called lpd appeared on the V7 addendum tape, and the documentation says that it came from PWB [look it up], but what you probably mean is the BSD lpd with network support that appeared slightly later, as TCP/IP developed). The System V print system in the 1980s moved this on, and CUPS is much more than that.
I don't have FreeviewHD
and I get HD through Sky, but at the time SkyHD was 1080i, so being the financially challenged person I am, I bought a perfectly acceptable not-a-known-brand TV that would do 1080i but not 1080p. This was several years ago, so I guess you could say that I was an early adopter.
The TV still works fine, but I would appreciate advance warning if the Beeb do the same on Sky, because the TV won't handle it!
Mind you, my SkyHD box (a Thomson) is also a bit long in the tooth, and I've had to fix it twice already (power supply and hard disk), so I doubt it would cope with 1080p anyway.
I also suggest that anybody who buys a TV in the sub-£250 price range check the display panel resolution of their HD Ready TV (some of which, I'm sure, actually carry the "Full-HD Ready" logo). Chances are the panel does not have a resolution of 1920x1080 anyway, so it is a moot point.
I hate manufacturers being able to force new purchases on their customers.
It's to make it look like a Chris Foss book cover from Panther and Granada books in the '70s. Things like Children of the Lens, many of the Asimov books (the Gods Themselves would be appropriate, I think), and many other iconic images.
I understand that read-only media is a potential solution, but you then have to worry about updates, as even an OS on R/O media may contain bugs that lead to information leakage or access problems while the system is running.
If you look at most Live disks, you normally have a degree of persistent storage, because the Live CD is normally overlaid by a UnionFS, often stored on a USB memory device. This allows users to keep information after the system is shut down. If you have persistent storage, especially if it allows browser tools or extensions to be installed, then the system is still vulnerable.
And you also assume that you don't need to install printer, network card or display drivers. I don't know how often you use a Live CD, but whenever I have, I have found it a seriously disappointing experience: slow, and missing support for anything slightly out of the ordinary (like the non-free Radeon and Nvidia drivers needed to accelerate display performance, or a lot of wireless cards).
Using Virtual Machines only works if you use fixed boot images (otherwise you are just exporting the problem into the virtual machine), and if you are talking about server farms, only in a large environment with some trusted support to maintain the infrastructure. It does not help home users, and would be seen as just another level of complexity to configure. And my point about persistent storage above is still relevant.
I have thought all of these things through, and with the current user expectation of control over their own PCs, none of them is really workable.
If we could have a highly trusted read-only image that did not contain any bugs and also had everything a user might ever want, then you could propose such a solution, but this is a Utopian view (and you know that Utopia means either "good place" or, more likely, "no place").
Google, with ChromeOS, are trying this, but we need more work on exposing 3D graphics acceleration, abstracted sound and other device layers in the browser to make it acceptable for even modest gamers. I am not going to hold my breath for a port of Crysis or BioShock to ChromeOS.
@Artic fox - But it is
*intrinsically* more robust than normal Windows instances up to and including Win XP, especially where Windows users have been encouraged to make their normal accounts administrator accounts (as on many, many pre-installed Windows PCs). That is a fact. People who deny this don't actually understand privilege separation.
But this story is about a social engineering issue, where users are being tricked into running something with enhanced privileges. It is not an unseen, unknown back door into the OS, but very visible and reliant on user interaction, and as long as an OS has the ability to run something with enhanced privileges, it can affect absolutely any operating system.
Let me ask you something. If asked in a pop-up to install something that suggests it will fix a problem (especially if it comes up because of a cross-site scripting problem when accessing a bank or some other trusted organization's site), do you think that your grandparents (or, if you are unfortunate enough to have lost them, your parents) can *sensibly* differentiate between what is really safe and what is not? I know that I am worried that my 82-year-old father, who is a regular Internet user, cannot differentiate between 'good' messages from Microsoft Windows Update and 'bad' ones, even though my two brothers and I drum it into him at every opportunity. And I still have to disinfect my two youngest (teenage) children's systems sometimes, even though they are old enough to understand the dangers.
Current OS's are just not fit for purpose when given to non-technical users.
Whilst this may be a new instance, it's nothing to be surprised at. Any OS that allows you to obtain escalated privileges to do something legitimate can be compromised like this, including all variants of *nix platforms.
What makes it more important is that Mac users, who have been lulled into a false sense of security by too many unfounded claims that OSX is immune to malware, will suddenly have to become much more aware of what they are doing.
In some respects, although I would suffer like everyone else, I think that sudo, UAC on Windows and whatever they call the equivalent on OSX (I know it's sudo under the covers), which make it easy to do things with escalated privileges, should be removed. Raising the hoops you must jump through before you can do destructive things on a system would force you to really think about what you are doing, rather than just clicking "Yes" or typing a password. But the hooks they use are built into all modern OS's, and even if they weren't supplied with the OS, they would still be there. And SELinux and Role Based Access Control (RBAC) only change the problem, they don't solve it.
Of course, this would make computers difficult for ordinary users to manage, so it will never happen. And if someone did propose a locked-down OS, everybody would be screaming from the rooftops about too much vendor control over the OS.
I came to the conclusion some time back that all PC OS's are too complex to trust ordinary users to look after properly, but have not got to the next step of trying to solve the problem. This issue shows that even OS's with good security features are not safe if users do not understand what they are doing.
Maybe Google ChromeOS is the way to go: a locked-down OS with a configurable application layer on top (I just wish it was not in a browser). But I'm sure you will still see personal information being stolen, botnet clients and anonymizer proxies on this platform once the crackers start looking.
An alternative is to use a DVD or, I believe, Blu-ray disk produced with THX certification (like any recent Star Wars disk). These contain a setup tool called THX Optimizer.
On the menu screen, move the highlighted area to the THX logo and press enter or play. You should find several setup tests for resolution, contrast, colour balance, sound and a couple of other things. To get the colour right you really need a blue filter of a particular Pantone colour, but you can get an idea without it.
Bad examples - Goodmans and Morphy Richards
You can add Bush, Grundig and Ferguson.
These were all recognised brands in their own right back in the 1950s, '60s and '70s. They were not badge-only brands, and did their own product development. When the companies fell into hardship, as they all did because they tried to maintain a UK manufacturing base when making things in the Far East became much cheaper, the brand names were bought by companies that specialised in generic goods made in Japan, Hong Kong, Taiwan, Korea and more recently China, who put the name on the product and presented it to patriotic UK buyers as if it were still made in the UK. Nothing illegal, but you can bet the "Made in Wherever" was in as small a print as they could manage while not infringing Trading Standards.
This was to capitalise on the "Buy British" campaign that tried to keep wealth in the UK during this miserable period of austerity.
If you look at Bush, it's been traded between companies for years, and is now effectively an Argos brand name.
I used to enjoy trying to spot how many different brands the same product could be found under. It made shopping with my parents much more bearable.
Can't even pop down to Dixons on the high street. It's not there, having gone web-only half a decade ago.
All of the Dixons stores they wanted to keep were re-branded Currys.digital.
I always avoided Saisho as a brand, and - strange as it may sound - I was a committed Praktica camera user, so never considered (not that I would have anyway) a Miranda camera.
I think you are confusing concept and implementation.
It is quite clear that pretty much all of what you quote has been available in many OS's, with UNIX as just one of them. For example, the segregation of authority has been in almost all multi-user OS's since computer time began, and most implementations follow, at least in part, the Multics model. Ditto 'home' directories (and I can quote from direct experience VAX/VMS, RSX-11M, RSTS, PrimeOS, MTS, Harris VOS, AEGIS and Domain/OS [the earlier versions were UNIX-like, rather than actual UNIX], probably several I can't remember, and - of course - the UNIXes from Bell Labs Version/Edition 6 onward, and Linux).
In recent history there has also been a common model of GUI, so I would not want to call whether GNOME/KDE was influenced by Win95, whether it was parallel evolution, or whether there is a common work-alike ancestor (for example, whether both of them took anything from the Apollo/VUE/Motif/CDE developments). I certainly would not want to say that the Windows 7 desktop was a copy of KDE any more than that KDE was a copy of the Win95 desktop.
Many people who look at such things believe that the genetic UNIX kernel (i.e. derived from AT&T code) has a number of fundamental deficiencies that can only be fixed by replacement. Some, though not all, have been fixed in BSD derivatives and Linux, although both have other issues.
If you look at it, Rob Pike, Ken Thompson and Dennis Ritchie, all movers and shakers in original UNIX, have moved on and developed (with others) Plan 9 as a replacement for UNIX.
Microsoft believed that they could develop a new OS from the ground up which would be better than anything then available. They did this by working with IBM and hiring people from other OS families (including Dave Cutler, who was a VMS architect). Unfortunately, whether because of a short-sighted view of what it would be used for, or the requirement of backward compatibility with older OS's, Windows NT (New Technology) and its derivatives ended up where we are today.
If you look at the privilege separation, the NT filesystem, the multi-tasking ability and the integrated GUI compared with other OS's of the time (and Linux was not one of these), WinNT should have been quite a capable OS, but something important got lost along the way.
I blame the application model of being able to write wherever you want on the filesystem (required for non-NT Windows application compatibility, and carried over to NT through poor education and practices among application writers) as the primary problem. If they had put in a DOS/Windows 3.1/95/ME compatibility mode that used something like chroot to isolate such applications from the main OS; enforced system directories for system libraries, utilities and DLLs that could never be written to by non-privileged applications; used specific user IDs (not Administrator) and directories as placeholders for applications; and made sure that nobody EVER installed NT using a FAT32 filesystem for C:, then I think that NT would have developed into a potent multi-tasking, multi-user OS. That did not happen, and it's not where we are now.
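For what it's worth, the chroot-style isolation I mean takes only a few lines on a POSIX system. This is a sketch assuming a pre-built '/sandbox' tree and root privileges, not a claim about how NT could actually have implemented it:

```python
# Confine a legacy application to a prepared jail directory. The paths
# are hypothetical, and this must run as root on a POSIX system.
import os

os.chroot("/sandbox")                      # everything outside vanishes
os.chdir("/")                              # ensure cwd is inside the jail
os.execv("/legacy_app", ["/legacy_app"])   # run the app, confined
```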
BTW I am, and have been for over 30 years, a strong UNIX and latterly Linux advocate, but I do not let that blind me to the merits of other OS's, even if they are only potential merits.
@Robert Long 1
It depends on what kind of earth-like life you are thinking of. If you think of bacteria or other single-celled organisms, then gravity is probably not too much of a problem. If you think of higher lifeforms, then gravity will be a big factor.
How could it not be!
Can I press the button now, Teach.
Oooooh, that's nice.
I can't remember what Dropbox state as their business continuity model, but if they offer any form of backup at all, then they must have some means of reading the data to replicate it or copy it to backup media. Even if they offer encryption, unless it is done client-side (i.e. on your system) before the data is sent over the wire, someone would have the opportunity to capture whatever is needed to prime the encryption.
Let's face it: if you use somebody else's service to store your data, do you ever have anything other than their assurance of the security of that data? The only thing you can be sure of is what you do yourself, so either don't trust them with sensitive data, or encrypt it, just as everyone else is saying.
It's a no-brainer, really.
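To show how little work client-side encryption is, a minimal Python sketch with the third-party 'cryptography' package; the file names are illustrative:

```python
# Encrypt before upload, so the provider only ever sees ciphertext.
from cryptography.fernet import Fernet

key = Fernet.generate_key()   # keep this safe and local, NOT in the cloud
f = Fernet(key)

with open("secrets.txt", "rb") as fh:
    token = f.encrypt(fh.read())

with open("secrets.txt.enc", "wb") as fh:
    fh.write(token)           # this is the only file the service sees
```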
Buying and collecting
You can't have it both ways. Either you pay a higher price for someone to hold stock locally, or you must be prepared to wait until it can be delivered from a central warehouse.
It is only in the very rare situation that you will find a local retailer with products in stock who is prepared to sell at the same price as the large chains.
I must admit that having lost our local Currys during the sharp intake of breath DRG had when the credit crunch started, and also having lost our Woolworths, I am starting to look very favourably on our local independent electrical retailer, just to try to keep them in business and local. Yes, they're more expensive, but so is petrol, and my time is also precious. They will deliver exactly when I want, and also offer far more help and advice with fitting, often for free, than I got after paying for an installation service for my last cooker from one of the nationals.
Yes and no. At the point of consumption it has little to no fat, so it qualifies as a low-fat food, and it definitely will not contain cholesterol (one of the reasons to eliminate saturated fat). But it is seriously bad for everyone except the dentist!
Actually, conventional food science says that sugar is only converted into fat if there is a surplus of energy foods in the diet. If there are not enough other energy sources, it will be used to power the body. Still, it's not likely that someone consuming peppermint patties is short of calories in the rest of what they eat!
It always worries me
when a company reporting poor part-year results offers to increase the dividend return to its investors.
It strikes me as a callous offer to try to keep share prices up, mortgaging future profits to make the company look better now.
But then there is so much in the shareholder-value-is-paramount led business model that I disagree with.
Stop the world. I want to get off!
"around 20 years"
I don't know where that came from. The filing date on this patent is 10th December 2003, although it is a continuation of other patents, the earliest being 07/926,333, filed in August 1992 (from the link in the article to the U.S. Patent Office site).
IANA(IP)L, but I would assume that it is the filing date of the actual patent, not any precursor patent applications that have been abandoned, that is important.
Mind you, if there is a precursor patent application that was abandoned, and the technique is similar, does the abandoned application not count as a clear and unambiguous case of prior art?
I know it was not shipped with the laptop, but IBM and latterly Lenovo have complete service guides for all Thinkpad models on the Internet, freely available, and easy to find.
Want to know the part number for the screw that holds the power socket into the case of your 1998-vintage T20? It's all there. And the strip-down guides don't just cover the easy-to-open doors and hatches; they cover removing every component that has an identified part number, right down to bare plastic cases, ribbon cables and screws.
That's one of the main reasons why I choose them for my workhorse laptops. The other reasons are easy availability of spare parts and general robustness.
I'm fairly sure that I heard in a TV documentary about Hershey's that the reason US chocolate tastes so bad (or "unique", as they put it) is that in earlier days it was shipped around the US by rail in unrefrigerated boxcars in high temperatures, so by the time it got to its destination it was seriously past its best.
People in small towns across the US, being told that chocolate was the height of confectionery in Europe, ate it anyway and got used to the taste.
When distribution got better, people complained that it did not taste the same, and so it was re-formulated to deliberately taste the way it does.
This could be an apocryphal story, but I believed it at the time.
What I found hilarious was a declaration on the packaging of some Peppermint Patties from the US stating that they are a low-fat food, which is true (there is only a very thin layer of dark 'chocolate' on them), but as they are about 60% refined sugar, the statement really is designed to mislead the US population about their suitability as part of a diet.
If this is how you see it
then it will become the Amstrad Emailer of the future.
Using 'app' or 'apps' as a contraction of 'application' is as old as the hills, or at least as old as the PC (probably older than that, but I have no references).
I'm fairly certain it was in common use in the 1980s. It's difficult to find references because the Internet as we know it didn't exist back then, and most things were documented on paper. Maybe someone could trawl the Usenet archives for the earliest reference, or go through the Personal Computer World archives to find the earliest example.
I think that Macross (especially Macross Plus) is a better example in anime, but it is quite a common theme.
I sometimes wonder where all these missiles are stored, especially in the transforming robot/planes that must have so much more gubbins under the skins, but if it were realistic, it would not be so visual.
The Mote in God's Eye
Although it is one of my all-time favourite books, it's a bit long, especially as it is so fast-moving, with everyone almost constantly in a state of disciplined panic. I'm sure it could be cut down, but it would lose much of the background that is essential to the story.
I'd love to see the first contact filmed, with MacArthur tearing through the light sail and wedging the Crazy Eddie probe into the hangar. It's a terrific bit of writing, and it would translate to the screen really well, IMHO.
I was really surprised that Oath of Fealty never made it onto the lists. It would film well, with a small principal cast and much that could be shot without a huge CGI budget. It could also be marketed to the mass market as a near-future story.
@Andrew Garrard re: Ender's Game
is one of the few books that I have had to read cover-to-cover in a single session. But the tactics that made Ender so different would be almost impossible to convey on screen; it would be just too confusing for most people, especially in the zero-gravity encounters where there is no 'up'.
If I remember correctly, there was only a single child death in Battle School (plus a documented history of a few more that are mentioned, but not detailed), and it was not in one of the set battles but in a bullying incident that would not be too difficult to portray. In the battle room, participants are 'frozen' by immobilization suits and guns not that different from laser-zone guns, rather than killed.
All of the 'real' battle scenes, except the flashbacks to the first war, were deliberately stylised so that they appeared to the children like tactical exercises. That would be easy-peasy to film.
A major UK bank I worked at had some very good people (since made redundant) designing and building a customer-facing environment in accordance with their quite rigorous security standards. One of the steps to getting it approved for use was a pen test tasked to one of the organizations regarded as good at such things (if you think of the first name that springs to mind, you've probably guessed who they are).
Halfway through the morning, a message got back to the admins from the pen testers that went along the lines of: "Could you please open up some of the firewalls and server ports to allow us to actually see some of the systems? We're having difficulty getting anything to respond to our probes."
You can guess what the answer to that was!
I agree with the AC
If the files are corrupt, it is probably something other than the media or transport layer of the network that is at fault. If you are using TCP/IP, you should get an error-free stream of packets, because the TCP layer does packet checksumming and retransmission. I suppose it is just possible that a corrupted packet could have the same checksum as the original, as the 16-bit checksum on TCP packets is not especially robust, but I do not know how likely that is.
Anybody any idea?
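Partly answering my own question: TCP uses the RFC 1071 ones'-complement checksum, and a Python sketch shows how easily two compensating corruptions cancel out (the payloads are contrived):

```python
# RFC 1071 ones'-complement checksum over 16-bit words, as used by TCP.
def inet_checksum(data: bytes) -> int:
    if len(data) % 2:
        data += b"\x00"                       # pad odd-length data
    total = sum(int.from_bytes(data[i:i + 2], "big")
                for i in range(0, len(data), 2))
    while total >> 16:                        # fold carries back in
        total = (total & 0xFFFF) + (total >> 16)
    return ~total & 0xFFFF

good = bytes([0x12, 0x34, 0x56, 0x78])
bad = bytes([0x12, 0x35, 0x56, 0x77])        # +1 in one word, -1 in another
print(inet_checksum(good) == inet_checksum(bad))  # True: corruption hidden
```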
Different potentials on the earth in a ring.
I don't know exactly how the rings were arranged in the building (I was a mere student, not one of the staff), and I admit that there could have been a wiring fault, but this is what we measured. It may also be that the neutrals of the different phases were not tied together correctly. The hall comprised a number of different buildings up the street, so it may have had separate single-phase installations at some time in the past, which might explain it not being a proper three-phase installation.
This was 30 years ago, and the building was a lot older than that. I have no idea how much current that situation could have driven; I didn't want to put a load across it to measure it. It may have just been an earth-leakage problem in some equipment attached to one of the rings. It was close to the kitchen.
Each phase could also have had a different earth, with no common connection through a conductor. The ground itself is a poor conductor, so it is perfectly possible to get local variations in earth potential.
what the TV licensing people have to say about this.
If you use a USB dongle for Freeview with a laptop, you are covered by your home TV license as long as the laptop is running on battery. As soon as you plug it into the mains, you have to be covered by a valid license for the premises you are in.
The important thing appears to be that the receiving device is battery powered.
This thing looks like it is battery powered, so you can plug your laptop into the mains with impunity! I'm sure that there will be debate about this, but this is how I read it.
I also wonder how leakage into adjacent properties not covered by a license will be seen by the TV licensing people.
I really doubt that many houses are on more than one phase. It can cause all sorts of earthing problems, some quite dangerous.
I worked at a REC (Regional Electricity Company) for a while, and it was explained to me that in most streets you end up with the phases being alternated down a street, so you are rarely on the same phase as either of your neighbours.
A single phase is quite capable of delivering the 60-100A that is more than most dwellings need. You can tell by finding the electricity meter in your house: if you have a single meter with a single (normally red, and quite thick) wire connected on the input side, you are on a single phase.
I have PLT set up in my house, and we have it working through all three floors, through several breaker boxes which are only connected together at the main live feed into the house. Some parts get better speeds than others, however. Most of mine are eBuyer special 85Mb/s Turbo-mode devices, about as cheap as I could buy.
When setting up some PA kit at my university hall of residence, we had a persistent buzz we could not get rid of. We traced it down to having different phases and earths on each side of the dining hall, and we measured more than 100V AC between the earth pins of sockets on opposite sides of the hall. That scared me quite a lot.
@AC re Caves of Steel
Not the radio version, the 1964 TV adaptation by Terry Nation, and starring Peter Cushing. Read the article on Wikipedia. Only a minute or so of clips remain.
@Alien Doctor 1.1 re. chemical warfare attack
I did say "clean". I would be interested (in an academic sense - I have no dirty-sock fetish) in finding out what made your socks such foul things. If you wore them for days on end, or paddled through mud in them, or maybe didn't wash your feet, then I can understand it. If your feet got wet, then maybe the choice of sandals was wrong in the first place.
Maybe I'm lucky, but my socks come off at the end of the day only a little more smelly than they were at the beginning, and even less so if I have been wearing sandals.