1925 posts • joined 15 Jun 2007
It goes back to beyond the golden age, and pre-dates the term Science Fiction. I seem to remember Isaac Asimov commenting, in the foreword to one of his short stories, on the argument over which of the two terms to use when Astounding Stories was being published (the argument is even older than Isaac (RIP), but he was representing the view of Hugo Gernsback, the founding editor).
Haven't heard that term in a long time!
It's not even mid-engined!
I was going to mention transputers in my last post
but I decided that it was long enough already!
This is completely wasted on ~100% of commercial software
In that part of the software market, it's all about rapid application development, and sod the efficiency. They rely on Moore's Law to make sure that by the time their software hits customer systems, the computers are powerful enough to cope.
So MIC processors will be completely wasted on commercial boxes, which is where the majority of the systems will be sold.
Even if someone (extremely cleverly) produces an IDE that can generate parallel code to make good use of many-cores, much of the workload that is done is not suited to run in a parallel manner anyway.
Apologies in advance to those who do, but most new programmers nowadays are never taught about registers, how caches work, or the actual instruction set that machines use, and I'm sure that there are a lot of people reading this, even on this site, who do not really understand what a coherent cache actually is.
I work with people who are trying to make certain large computer models more parallel, and they are very aware that communication and memory bandwidth is the key. Code that is already parallel tops out at a much smaller number of cores than the current systems that they have available can provide. And the next generation system, which will have still more cores, may not actually run their code much faster than the current one.
But even these people, many of whom have dedicated their working lives to making large computational models work on Top500 supercomputers, don't really want to have to worry about this level. They rely on the compilers and runtimes to make sensible decisions about how variables are stored, arguments are passed, and inter-thread communication is handled.
And when these decisions are wrong, things get complex. We found recently that a particular vendor-optimised matrix-multiplication routine stomped all over carefully written code by generating threads for all cores in the system, ignoring the fact that all the cores were already occupied running the code's own threads. We ended up with each lock-stepped thread generating many times more threads during the matmul than there were cores, completely trashing the cache and causing multiple thread context switches. It actually slowed the code down compared to running the non-threaded version of the same routine.
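The failure mode is easy to reproduce in miniature. Here's a sketch (core count and thread counts are invented for illustration, and the "library" is a stand-in, not any real vendor BLAS) of how one application thread per core, each calling a routine that spawns its own pool, multiplies up the runnable thread count:

```python
import threading
from concurrent.futures import ThreadPoolExecutor

CORES = 8  # hypothetical core count for the illustration

def library_matmul(inner_threads: int) -> None:
    # Stand-in for a vendor routine that spawns its own worker pool,
    # oblivious to the threads the caller is already running.
    workers = [threading.Thread(target=lambda: None)
               for _ in range(inner_threads)]
    for t in workers:
        t.start()
    for t in workers:
        t.join()

# The application already runs one lock-stepped thread per core...
with ThreadPoolExecutor(max_workers=CORES) as pool:
    list(pool.map(lambda _: library_matmul(CORES), range(CORES)))

# ...so at peak the machine juggles CORES * CORES runnable threads on
# CORES cores: cache thrashing and context switches, as described above.
peak_threads = CORES * CORES
print(peak_threads)  # 64
```

The usual workaround, where the library honours it, is to pin it to one thread per call (e.g. an environment variable like OMP_NUM_THREADS=1 for OpenMP-based builds) before the parallel section starts.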
It will be a whole new ball game, even for these people who do understand it, if they have to start thinking still more about localization of memory; and if they will have difficulty, the average commercial programmer writing in Java or C# won't have a clue!
I would like to know
where the references to back up this claim are!
what the advantage of a MIPS processor over ARM is.
ARMs are already cheap as chips, low power, and easy to license. Several Chinese companies are already making SoC implementations with graphics assists on the silicon, including Rockchip, who seem to produce millions of the things to go in chipod- and apad-type devices.
They need to at least recover their complete court costs in a timely manner. Otherwise, Lockheed et al. and their proxies will just tie SpaceX up in court until their budget is exhausted.
This is the problem with the US (and increasingly European) legal systems.
Anyway, I'm hoping that they successfully defend their reputation.
BBC iPlayer - another brick in the wall.
I'm cross, not because AIR is going, but because it confirms the trend that is making Linux a less suitable OS for ordinary users.
BBC iPlayer was one of the few platforms for content delivery with content expiry that actually worked reasonably well.
The reason why this is important revolves around the perfectly understandable attitude of the content owners wanting to protect their content, and thus their existence.
Like it or not, free content is not the way that the world is going, and the large production companies investing millions in current TV series and films will not license their content for delivery channels unless those channels at least make it difficult to capture and re-distribute it. And strictly speaking, get_iplayer accesses the content in a manner against the terms and conditions for iPlayer.
This means some form of DRM. Without a trusted DRM mechanism, you won't get _legal_ streams or downloads of new content playable on Linux. Without big-name current media, those enlightened ordinary users who try to use Linux will give up. So goodbye to Linux as a creditable Windows alternative.
One of the fears that the content owners have of Open Source platforms (and this includes Open DRM and content delivery platforms, not just the OS) is that someone can take the source and hack it to allow data capture. They will never trust it, so unless AIR remains closed-source (which is perfectly allowable under GPL/LGPL provided it is written correctly), it will become untrustworthy, at least to the content owners.
Whether a closed solution is actually any more secure is an interesting question, but that is a matter of perception and contract law (if you provide some software for a fee, and it fails to do what it is meant to, leading to a financial loss, then it does not matter what the License Agreement says, there may well be legal redress against the provider).
Open source makes no promises, has no contract, and thus has no legal redress.
Sadly, despite efforts from people like Red Hat and Canonical, I think Desktop Linux has now missed the boat. It is clear that the world is moving on to tablet- and mobile-based devices which include some form of content delivery and control system built in from the very beginning. These may be Linux/UNIX based, but they aren't what I call a general purpose Linux device, which is what I want.
but you forgot it is not an infinite resolution camera! They use "image enhancement" to sharpen the image. That's the magic!
I keep asking why, when matching fingerprints, the computer shows each record on the screen. Just think how much faster it would be if it didn't have to do that, and say, just did a relational database search on a hash of the loci!
Only topped by the real-time IR satellite images, down to a resolution of about 5cm, that appear in Behind Enemy Lines. I'll also swear that the first missile fired at the F/A-18 is in the air for nearly two minutes, whilst following highly evasive manoeuvres.
Maybe I'm showing my age, but I used card punch time clocks (which normally are referred to as "time clocks") in one of my early jobs.
Might I suggest that you watch the Warner Brothers cartoons of Ralph E. Wolf and Sam Sheepdog. They always clock in at the beginning of the cartoon, and out at the end. That's a time clock.
1. They might have access to leaked phone number lists, or they may have a copy of a Directory Enquiries CD set from BT, or they might just make them up!
2. They probably don't. It's just a line dangled to make them appear more plausible. Alternatively, they may have some leaked information from BT or your ISP, because it is certain that at a known time, those organisations know which IP address is allocated equipment on which phone line.
3. Windows is ubiquitous. For home systems, chances are that at least 90% of homes with a computer have a Windows variant rather than a Mac, Linux or other system. And even those with Linux probably have Windows installed somewhere as a dual boot. The Reg. readership are not typical. My house has all three (Windows 2000, XP and 7, OSX, and Linux), as well as an AIX box.
I suppose that there will be an increasing number of houses that have broadband just for their TV, gaming console, iPad or Android pad. I wonder how the ISPs will cope with supporting such customers? At the moment they all appear to be geared around having a Windows box around.
An understanding of evolution was not essential to the creation of the smallpox vaccine. This was developed by observation, hypothesis, prediction, experimentation and conclusion, exactly as the Scientific Method dictates.
Your example of a flu vaccine is not a good one, either. Most flu outbreaks are of known strains, of which there are many. Each vaccine developed is a mix (normally of three strains), and is only effective against a small number of these strains (sometimes more than the three target strains), and it is the job of the vaccine producers to make an informed guess about which will be the main threats each year. They then prime the production process (the vaccines are grown in chicken eggs) for that year's vaccine. This process takes weeks to months to get the number of doses for a large population. If they select the wrong strains, the vaccine could fail to protect at all.
What gets the medical profession worried is new mutated strains of 'flu, for which they don't yet have a vaccine. It is necessary to isolate the virus in order to culture it to produce the vaccine. By the time a vaccine for a new variant is produced, it may be that a sizeable part of the world population has already been exposed, reducing the value of the vaccine.
And why do you think that you can trust what a half-life means? And how do you know what radioactive decay is? And how do you know how much of the original sample remains? And how do you know you can trust the mass spectrometer? And... and... and ad nauseam.
Until you think about it, most people regard experimentally confirmed hypotheses as truths. Unfortunately, science does not really deal in truths, but in not-yet-disproved hypotheses. This is a fair point if you accept the scientific method, but becomes hard to justify to someone who won't acknowledge it.
You just have to try arguing this with one of these people who are good at it to understand what it is like. They effectively argue that you have to justify the entirety of known science in order to trust it, and most people get too cross after a while to argue effectively. I just refused to continue once I realised what their tactic was.
Creationists do not dispute extinctions. They just don't believe the time scales over which they happened.
I've whiled away many hours arguing about ID and creationism with some otherwise completely rational people, and the most skilled of them have convincing-sounding answers to almost every question you could ask!
Firstly, they argue that the dating techniques are not accurate, as nobody understands all of the hypotheses that they are based on; you have to take it on 'faith' that the whole chain of scientific proof is true, and thus their single faith belief (in the Bible) is more trustworthy than the many beliefs that previous hypotheses were correct.
Then they will ask: if dating cannot be relied upon, how do we know that the Earth is older than 6,000 years? (I don't know where 10,000 years came from; my friends were certain it was only 6,000.)
Then they will argue flood.
Then they will argue 'test of faith' of the believers.
The most recent discussions I had with one of them even allowed for micro-evolution (change of colour, eating habits etc) as a result of environment.
It's all highly amusing, and I still count several of them as friends. But that does not stop me thinking that, at least in their beliefs, they are a bit crazy. But it livens up a beer or five!
Ahhh beery crazy discussions!
This is Apple
with big pockets. I would imagine that Dell, HP, IBM or any of the white-box manufacturers would have been quite happy to flash different bootstrap code from normal to allow OSX to boot, considering the number of servers they would sell. They would probably also still support them, if asked.
This is the "Rules of Engagement"
The Royal Navy are quite capable of preventing a lot of the dhows and speedboats from causing bother to the tankers. After all, even a 30mm cannon can do serious damage to an unarmoured wooden ship, and helicopters can react very rapidly over quite large distances.
Unfortunately, the Rules of Engagement state that they have to have a reason for stopping or boarding the dhows, and also that there has to be evidence of hostile action before the RN can fire on ships in the Indian Ocean and Arabian Gulf.
Besides the pirates, there is a large amount of quite legitimate sea travel in these seas, so the standard tactic of the pirates is to look as innocent as possible until they are within a few hundred metres of their target, and then move fast. Once on board, they have hostages to hold the navies of the world to account to prevent any action.
Because the military lawyers advise against possible harm to civilians, especially the hostage crew once a ship is taken, it is almost impossible for anybody to take it back without collateral damage, no matter how well trained or armed they are. This is compounded by the unprecedented access the media has to publicise what has happened, and focus the World's scrutiny.
This is not just an RN problem, but one that affects all countries' naval ships in the area.
I think that all of the examples you quoted show Apple refining other people's ideas.
The iPhone, slick though it is, is just a smart phone, and people like Compaq/HP and Palm were selling smart phones with touch screens long before Apple.
The iPad is a touch screen tablet. There were many of these before the iPad, but again the iPad is very well executed.
iMac - Pretty for its time, but in no way was it the first system-in-a-box with a keyboard and mouse. I could point out several CP/M systems from the early '80s with similar form-factors, and the classic Macintosh pre-dated the iMac.
Stylus free touchpads. Goodness. How long have Synaptics been around?
Development and distribution. I know of several websites that will allow purchase and delivery of applications direct to a device, even to smart phones. I'm not too sure about a development environment; I don't know whether this is actually integrated into iTunes, because I do not write such apps.
BTW, you missed out the iPod, or maybe you realised that that was not the first in its field either.
Apple are great at industrial and ergonomic design. No doubt about that. But innovative?
Now, for innovative, try thinking Nintendo with the Gameboy and Wii. It is possible to be first in the field.
This is all fine
as long as the only data you keep is in a form understood by the cloud. I must admit that I have only Google Docs to go on (and I don't use that much), but it appears to me that if you want to keep some data that does not fit with the applications supported, you will struggle.
Of course, as I have often said, I am not a typical user any more, and many people only use data of defined types: 'music', 'pictures', 'video', 'documents' (embracing email, letters, the odd spreadsheet). But as long as there is no generic data container (think file), I will not be able to work totally in the cloud, and probably won't at all (damn, wrong already - I've just remembered that I'm using gmail a lot now).
Computers are a generic tool to me. I may use one any time for a purpose I have not yet thought about. I'm regularly throwing gigabytes of data around my home network, and have not got sufficient bandwidth to do that over the 'net.
All of this hype about the 'Cloud' is currently just a wet dream of the people who want to tie-and-charge consumers (I won't say customers) into their money-generating machines. It may change to a benevolent, altruistic model, but I'm not holding my breath.
that ACS:Law £200,000 fine was against a limited company. If that was the case, then UK law says that he *personally* is not liable for the company's losses unless he was a director, and then only if he was negligently running the company (and although he was a con artist, this does not amount to negligence in UK corporate law).
This article says that he has been declared *personally* bankrupt, so the two things are not necessarily linked.
When it comes to personal property, as long as the money used to buy it was extracted from the company in a legal manner, then there ain't much that can be done to link the company losses against him personally. That is what a limited company is all about.
Of course, he could have been stupid, and set it up as a partnership (trading, not legal - although who with is a moot point) or as a sole trader, at which point he would be liable. But he wouldn't be that stu..... Oh, wait. Maybe he would.
Unfortunately, this would be SOOO insecure, as the answer-back string is triggered remotely.
As can (believe it or not) the programmable function keys of a VT220. I'm sure that I spent some time, twenty years or so ago, writing a program that would set a PFK (on the shifted function keys, IIRC) and then trigger it.
All you needed was write access to the device, and you could make the current user apparently run anything you wanted them to! Similar techniques worked for the HP2392 as well.
This was with UNIX, not VMS, so I'm not sure that this was possible there unless you already were a privileged user (could you do it through Phone, I wonder?).
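For anyone curious, the trigger itself is trivial, which is the whole problem. A minimal sketch (the device path is illustrative, and the key-programming escape sequences themselves are not reproduced here):

```python
# Sending ENQ (0x05) to a terminal makes it transmit its answer-back
# string as if the user had typed it; combined with a programmable
# answer-back or function key, that amounts to command injection.
ENQ = b"\x05"

def trigger_answerback(tty_path: str) -> None:
    # tty_path is illustrative, e.g. "/dev/pts/3"; all the attacker
    # needs is write access to someone else's terminal device.
    with open(tty_path, "wb") as tty:
        tty.write(ENQ)  # the terminal "types" its answer-back in reply
```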
You're assuming a certain type of game.
I would guess that Nintendo are trying for another Wii moment, with completely new types of game with more interaction than you can get from a traditional controller.
But this is the manufacturers talking
They are not interested in the netbook they sold yesterday. That's history. They are looking at the one they may not sell tomorrow.
I'm still happy with what my EeePC701 can do running Ubuntu. I'm just a bit worried about where I can get a replacement battery when it dies!
@Joel 1: He might have been in the audience!
@JEDIDIAH - So has Android...
... but you have to jump through hoops to find them!
top, ps and kill all exist and can be used (at least top and ps) if you can get a shell on the phone. Kill depends on how you get the session.
But then you can also run "Advanced Task Killer", which is in the Market, and 2.2 onward has an enhanced Task Manager.
"mirror the mainboard"
I presume that you mean that the PCI cards appear on the 'wrong' end of the board, and also that the case opens on the 'wrong' side. Chances are these were systems with BTX (as opposed to ATX-type) motherboards, that were supposed to mark a new integration of board and case design to allow better cooling. It was an Intel specification. Gateway and Dell produced several systems using them.
Absolute bugger to try and find a replacement, because nobody makes them any more.
At one of my contracts
I spent a lot of time gathering data about systems that needed OS upgrades in a company with a large (more than a thousand system-images) heterogeneous estate. I created it in a relational, normalised manner that allowed complex queries.
When it was decided that the task was too big for one person to actually do all of the upgrade work (duh! hundreds of systems!), I was told to hand my data over to an administrator to manage it, and was relegated to just a technical resource performing some of the upgrade work. The first thing the administrator did was to dump my data into an Excel spreadsheet "so everybody could use it", after which the management of it went to pot. Because of numerous data-loss errors, they eventually surrounded it by scripts to effectively serialise access, not trusting Excel's multi-user protection features (this was some years ago, so things may have got better).
I had actually asked for the data to be stored in a multi-user RDBMS (it is a large organisation which employs a dedicated DBA team, so there were plenty of databases around), but I was told that there was not a suitable system available for management tasks, and to do the best I could with what was available. I did not feel appreciated at the time.
I find it incredibly ironic that an organisation that has bought into databases, spending millions on Oracle and other DB licences to manage customer data, cannot see the benefit of using such tools for its own management purposes.
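For what it's worth, the sort of normalised store I mean needs nothing exotic; even SQLite would have done. A toy sketch (schema, hostnames and OS levels are all invented for illustration):

```python
import sqlite3

db = sqlite3.connect(":memory:")
db.executescript("""
CREATE TABLE system  (id INTEGER PRIMARY KEY, hostname TEXT, os TEXT);
CREATE TABLE upgrade (system_id INTEGER REFERENCES system(id),
                      target_os TEXT, done INTEGER DEFAULT 0);
""")
db.executemany("INSERT INTO system VALUES (?, ?, ?)",
               [(1, "hosta", "AIX 5.2"), (2, "hostb", "AIX 5.3")])
db.execute("INSERT INTO upgrade (system_id, target_os) VALUES (1, 'AIX 5.3')")

# The kind of query a flat spreadsheet makes painful: outstanding
# upgrades joined back to their host records, with proper concurrency.
rows = db.execute("""
    SELECT s.hostname, u.target_os
    FROM system s JOIN upgrade u ON u.system_id = s.id
    WHERE u.done = 0
""").fetchall()
print(rows)  # [('hosta', 'AIX 5.3')]
```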
Ho hum. I can't see myself working there again! Everything has now been moved to India.
Not just Apple
almost all the world's consumer electronics, tat and anything else that has fallen in price dramatically over the last 20 years.
Even the stuff made in Korea and Taiwan often contain significant numbers of components sourced in China!
Answered my own question. On 28th September this year, users of Mendip have to re-tune our boxes again(!), presumably to have the channels shuffled down to lower frequencies. Can't find the exact details, but www.digitaluk.co.uk says that this needs to be done.
"But that 790MHz band butts right up against the top DTT band, known as Channel 60*"
I know there is an asterisk against this (but I can't find the note that should go with it), but channel 60 is *not* the top of the TV band. TV channels go from 21 to 68, and on the Mendip transmitter we currently have C61, C62 and C67 carrying terrestrial digital television. If we lose all channels above 60, we will lose all BBC channels, and many of the extra ITV and Channel 4 channels.
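The arithmetic here is straightforward: UK UHF TV channels are 8MHz wide, with channel 21's lower edge at 470MHz, so channel 60's top edge lands exactly on 790MHz. A quick sketch:

```python
def channel_edges_mhz(ch: int) -> tuple:
    # UK UHF DTT: 8 MHz channels, channel 21 lower edge at 470 MHz
    lower = 470 + (ch - 21) * 8
    return lower, lower + 8

print(channel_edges_mhz(60))  # (782, 790): butts against the 790 MHz band
print(channel_edges_mhz(61))  # (790, 798): the first channel to go
print(channel_edges_mhz(68))  # (846, 854): the actual top of the band
```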
Pointing the aerial the other way will give me Welsh television from Wenvoe. I don't understand Welsh (living in England), so this is no help for the channels I might lose.
Also, I currently have a TV in several rooms in the house. I only have Sky on one of these. Are they going to pay for additional satellite boxes, together with a multi-channel LNB and all of the wiring in the house to maintain my current TV service?
Also, what do they mean by "and 5.7 million users who've plugged them into their own televisions"? Do they think that anybody who has *not* paid for a TV installer to plug it in is incompetent? I think the Electronics part of my degree probably means I understand more about transmission lines, aerials and the like than a TV installer who has probably done a one-day course in how to read a compass and point the aerial in the correct direction!
Admittedly, to do the job properly now there is no analog TV signal, you really need a signal strength meter to get the best signal, but there are ways and means if you don't have one.
that there will be no way of using this from Linux!
And I hope you will be able to un-register devices from accounts for when, say, a young person leaves to set up their own dwelling.
Even if this is a positive spin on DRM, I don't trust the industry.
OK, UK Census.
But how about -
National Government: DVLA, HMRC, DWP, IPS (passports), MOD, GCHQ
Local Government departments: Electoral Roll, Council Tax, Benefits system
Health system: All your health records.
Commercial: Your Bank, Utility companies, anybody who holds your bank details, your telecom provider.
Other: Basically, any personal data covered by the Data Protection Act, which makes it an offence for a data holder not to take all relevant precautions to keep the data secure.
Now, what were you saying about critical data and the requirement for strict network control?
I've worked in UK banks' IT departments where the network control was much more severe than in UK government agencies, with serious risk of disciplinary procedures, sacking, and even a report to the police for prosecution under the Data Protection legislation for anybody who did not follow the rules about connection policy. This included things like PDAs, USB memory keys, and anything that could possibly be a communication device.
Now where I am currently working, I'm not even allowed to plug a non-approved keyboard into their systems!
Lax network management
For goodness sake, at least segregate your DHCP space.
Allocate two IP subnets, trusted and untrusted. Register all of the MAC addresses of your trusted devices and give out addresses in the trusted range. Any unknown or foreign MAC addresses get given addresses in the other range. Allocate different DNS server addresses and default routes to each subnet. Use short leases to make sure that someone using a fixed IP address will be spotted (by duplicate IP addresses) as soon as the addresses cycle round.
Control routing between the two subnets so that untrusted devices get no access to internal servers, and minimal access to the Internet and to such devices as printers. That way, it does not matter what gets brought in; it is unlikely to do any damage. And you do not even need to invest in a large network infrastructure, as most switches will multinet quite happily.
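The allocation logic itself is only a few lines. A toy sketch (the subnets and the trusted MAC list are invented for illustration; this is a model of the policy, not a real DHCP server):

```python
import ipaddress

TRUSTED_MACS = {"00:1a:2b:3c:4d:5e"}  # registered, known-good devices
trusted_pool = ipaddress.ip_network("192.168.10.0/24")
untrusted_pool = ipaddress.ip_network("192.168.99.0/24")

def allocate(mac: str, next_free: dict):
    # Known MACs land in the trusted subnet; everything else is fenced
    # off in the untrusted one, with its own DNS and default route.
    pool = trusted_pool if mac.lower() in TRUSTED_MACS else untrusted_pool
    host = next_free.setdefault(pool, 10)  # start handing out at .10
    next_free[pool] = host + 1
    return pool.network_address + host

state = {}
known = allocate("00:1A:2B:3C:4D:5E", state)
stranger = allocate("de:ad:be:ef:00:01", state)
print(known, stranger)  # 192.168.10.10 192.168.99.10
```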
Of course, if you are paranoid, you could just not give out any DNS address to unknown devices, or you could have something like Wireshark alerting whenever you get a source address in your untrusted address range.
In extreme security environments, lock network ports down at the switch to only a single device per port by MAC address, with the port being disabled if another device is attached. As soon as a user plugs something else in and locks the port, they either have to call the help desk (giving you a chance to rap them over the knuckles), or suffer the port not working forever.
I know that this can be defeated with LAA MACs, but if all you are trying to do is prevent users from attaching smart phones, printers and the like, these devices use fixed MAC addresses anyway. Most basic users would not know how to change the MAC address on their PC either.
This is not far-fetched. I've seen all of the above deployed at real customer sites, and most large organizations do something along these lines by default.
This is aimed
at the large swathes of rural counties where even 500K/s would be welcome. You know, the ones where all they have is dial-up and, if they are prepared to pay for it, satellite broadband.
This will make my wife happy (but not me or the kids), because she wants to move to an even more remote part of Somerset than we are currently in, but realises that lack of broadband will always be a show-stopper. I just have to remind her that 2Mb/s is far from Super-Fast, even if the Government says it is.
Oooh. New icons!
It's still the UNIX security model, it's just that the default user almost certainly has a particular group in their groupset, and the directory in question has group read-write-execute on it.
It's been possible to do such things as this since the year dot, or at least UNIX V7 circa 1978.
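The mechanics are just ordinary mode bits. A minimal sketch (temporary directory, nothing OSX-specific about it):

```python
import os
import stat
import tempfile

d = tempfile.mkdtemp()
os.chmod(d, 0o775)  # rwxrwxr-x: owner AND group members may write here

mode = stat.S_IMODE(os.stat(d).st_mode)
group_can_write = bool(mode & stat.S_IWGRP)
print(group_can_write)  # True: any member of the group can drop files in
```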
On modern Linuxes
the first account set up is an 'admin' account, but by default this gives them very little additional access to the system. What it does, however, is add them into the "admin" group, which is set up so that they can use sudo when required to run commands with enhanced privileges. Thus in normal day-to-day use, the system is safe, and you can just worry about the things that fire up the request for the password.
If you set up additional accounts without adding them to the "admin" group, they will not even be able to run sudo or use any of the additional commands that need sudo access to run (like package managers, for example). This makes those user accounts safe even from users who click "yes" to everything. Their personal information is still vulnerable, of course, but they will not be able to touch any of the system files or directories.
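On a typical Linux box you can see this split directly in the group database. A small sketch (the gating group name varies by distribution — "admin", "sudo" or "wheel" — so those names are assumptions here):

```python
import grp  # Unix-only stdlib module for the group database

def in_group(user: str, group_name: str) -> bool:
    # True only if the user is a supplementary member of the group;
    # the sudoers policy on many distros keys off exactly this.
    try:
        return user in grp.getgrnam(group_name).gr_mem
    except KeyError:
        return False  # the group does not exist on this system

# A user outside the admin group simply is not offered sudo rights.
print(in_group("nobody", "no_such_group_zz9"))  # False
```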
I thought that OSX was the same, but if there are application directories that can be written to by one of these accounts without needing to use sudo, then its security is significantly weaker than I thought. I will thus nod to everybody who has been saying that OSX is no better than Windows, admitting that I was not totally correct, but point out that it is still better than the all-or-nothing situation in the pre-Vista Windows world.
On Windows 95 and 98,
there was effectively only a single user, with some slight trickery to allow some applications to store their defaults in different places for different 'users'.
All users were effectively administrator accounts, and as FAT16 and FAT32 filesystems did not have any form of security-by-user, the entirety of the system disk was vulnerable to infection from any account logged onto the system.
As a sideline, this last point is exactly why you should never do a WinNT, 2000 or XP install using FAT32 as the filesystem for the system disk, as this negates almost all of the security that segregated privileges provide.
On a side note, on XP and Windows 7 (I've not done a Vista install), the administrator account whose password you are asked to set during install is indeed a hidden account that can only be used when the system is brought up in system recovery mode (or similar). It is intended to be used when the system will not start, or when users forget their own passwords.
By default, when using the MS XP install process, the first named user account that is set up will be an administrator account unless changed. If you set up more than one user account during the install, the subsequent ones will not have administrator rights by default, but this can be changed.
But there is another point here. Many 'canned' Windows installs (for example, from system recovery disks) will not use the normal XP installation process, so even those users who have restored their system will not have seen this setup process. Only those wearing hair-shirts, and doing everything from lowest common denominator (MS install disks and vendor driver disks) will have seen these accounts being set up. But those of us who have done it this way KNOW that Windows installs are FAR, FAR more painful than some of the other OS offerings out there.
@AC 14:40 - Wrong.
That is the admin account for system recovery. Can't use that to log in when the system is booted normally.
The install process gives the first user account set up admin rights. Subsequent ones will normally be ordinary users unless specifically changed. I always create my own admin account as the first account, and then create additional ordinary accounts for each of the kids for day-to-day use. I never give the kids the password for the admin account I created. I normally install any programs that need admin rights.
For those awkward programs that have to have admin rights in order to run, I also create a second admin account, which I then fix in the Registry so that you can't log in using it, and tell the kids to use "Runas" with this account for any applications that won't work from their ordinary accounts.
It's not perfect, because you can really run anything with Runas as long as you can find it on the disk. But it meant that I was able to have one of our shared machines virus free for years (also have external firewall to block direct malicious traffic).
I think some of this must have stuck in the kids' minds, because now they are older and have their own systems that they control completely, they often keep using this model, and generally have fewer problems than their peers.
Meanwhile, back in the real world...
ARMs are currently being deployed more and more widely as people realise that they really don't have a current need for 64-bit processors for much of what they do. 32-bit plus address extension will do very nicely.
Just wait for ChromeOS and a decent server distro of Linux for ARM, and Intel will see all sorts of customers defect. They just don't see that it's largely about power consumption, and their track record in reducing power is not good.
@Sir Runcible Spoon
But the BT HomeHub router is on the local network, and so a judicious bit of logging code in the router allows such things to be captured. Remember, a router may do much more than routing, especially if you (or in this case BT) have control of the firmware. I'm sorry for the icon, but I'm not the one being stupid here.
No it doesn't
PLT devices have discovery protocols (by what looks like a periodic broadcast) so they can see each other. Chances are they also use UPnP and are probably visible to the HomeHub. That's the beauty^H^H^H^H^H^H danger of UPnP.
Even if they do not use UPnP, BT can probably make a reasonable guess about whether such devices are on the net by sampling the packets on the net, and looking at the first three octets of each MAC address (the OUI), which identify the vendor of the device.
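As a sketch of that vendor check: the OUI is just the first three octets of the MAC, looked up against a vendor table. The prefix table below is illustrative, not a real OUI database, and the example addresses are made up:

```python
# Guess a device vendor from the OUI (first three octets) of its MAC address.
# KNOWN_OUIS is a hypothetical table, not real IEEE registry data.
KNOWN_OUIS = {
    "00:1E:C0": "Intellon-based PLT adaptor",   # illustrative entry only
}

def oui(mac: str) -> str:
    """Return the first three octets of a MAC address, normalised to upper case."""
    return ":".join(mac.upper().split(":")[:3])

def guess_vendor(mac: str) -> str:
    return KNOWN_OUIS.get(oui(mac), "unknown vendor")

print(guess_vendor("00:1e:c0:12:34:56"))  # Intellon-based PLT adaptor
print(guess_vendor("aa:bb:cc:dd:ee:ff"))  # unknown vendor
```

In practice an ISP would do the lookup against the full IEEE OUI registry rather than a hand-rolled table, but the principle is the same.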
My PLTs are Intellon based, and come with a (Windows) utility that allows you to set the encryption key. Not only does the utility find the devices, but also can tell you how fast they are operating, so there must also be some other magic under the covers. I have a Linux utility in source, so I'll have a look at how it works.
Still, I have a Linux based firewall (really, separate from any of the comms kit - Smoothwall as you ask) between my ADSL router and the rest of my network (yes, yes, I know that there is a risk that the PLT escapes onto the wider electricity network, but that's why I set my own key), but it means that my ISP cannot probe my network.
@Don - Depends on which flavour of UNIX
IBM introduced dynamic driver load/unload, shared libraries by default, virtual Kernel address space (associated with never having to sysgen a system again), along with journaling filesystems and many other features, in 1990.
Shared libraries were around in SunOS before then, although statically linked libraries remained the norm for several years.
I think that your description of X11 applications is completely wrong for everything except Java graphical programs (but that is a Java problem).
The concept of Drag-and-Drop in X-Windows (and it was probably X10 at the time) was shown to me on a Torch TripleX running X.desktop (although I'm sure it was also called LookingGlass and possibly OpenTop) in the middle of the 80's, along with desktop icons and walking menus. I concede that MacOS had these concepts before then, but they were not foreign to UNIX even before Windows.
The standard X-Windows model for GUI-type programs was indeed to use toolkits and widgets (effectively library code) for drawing things like buttons, text boxes and pixmaps, and this does mean that the application has to keep some sort of track of what is going on in its own graphical space, but the server is what keeps track of where the cursor is. X-Windows is built around callbacks and managed data objects, which means that the X server (the thing that controls the keyboard, mouse and screen) always has a degree of separation from client programs (really to allow X-Windows to run across a network, something that Windows still does not do well), but it can only marginally be called Object Oriented.
This separation allows a client to be completely ignorant about the position of the cursor and about which parts of a window are obscured by another window. Each click, key press and other event is tagged with the current cursor position by the server, and when part of a window is uncovered, the server sends one (or sometimes many) Expose events, saying exactly which part of the window needs to be re-drawn. And if the server is configured with BackingStore, the server itself can fill in the missing bits without bothering the client. This was designed to make it run efficiently with a network between the server and client.
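To make the separation concrete, here is a toy model in Python, not real Xlib: the class and field names are simplifications, but the division of labour (server owns the cursor and tags every event; client just reacts to what it is told) mirrors the real protocol:

```python
# Toy model of the X11 split: the "server" owns cursor position and
# visibility, tags every event it delivers; the "client" only reacts.

class Event:
    def __init__(self, kind, x, y, region=None):
        self.kind = kind          # e.g. "ButtonPress" or "Expose"
        self.x, self.y = x, y     # cursor position, filled in by the server
        self.region = region      # damaged rectangle, for Expose events

class Server:
    """Tracks the cursor so the client never has to ask where it is."""
    def __init__(self):
        self.cursor = (0, 0)

    def button_press(self):
        return Event("ButtonPress", *self.cursor)

    def uncover(self, rect):
        # A window was raised: tell the client exactly which area to redraw.
        return Event("Expose", *self.cursor, region=rect)

def client_handle(ev):
    if ev.kind == "Expose":
        return f"redraw {ev.region}"
    return f"click at ({ev.x}, {ev.y})"

srv = Server()
srv.cursor = (120, 45)
print(client_handle(srv.button_press()))           # click at (120, 45)
print(client_handle(srv.uncover((0, 0, 64, 64))))  # redraw (0, 0, 64, 64)
```

The real thing involves XNextEvent, event masks and rather more bookkeeping, but the shape of the conversation is the same.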
In addition, things like window decorations (frames, resizing options, window control buttons) are all handled by a component separate from both the client applications and the X server. This is the Window Manager, which is what allows you to rapidly change the look and feel of the GUI. It works by encapsulating an application window (X11 defines a window hierarchy, with the root window at the top, application windows in the middle, and individual graphic contexts at the bottom handling widgets within application windows), allowing keyboard and button events to be acted upon before they are given to the client. This is also an OO-type feature.
I don't think that Windows integrated COM into the presentation manager until the late 90's probably with Windows 2000, although it was available to applications, and all windowing applications needed to manage their own
HP VUE and then CDE did provide something like COM, and this was before Windows95, but coding for CDE was difficult, and the old X11 models still worked, so were still used.
There are not many people now who actually code at the X11 level. Almost all applications are now written with toolkits or SDKs (like Motif, Qt and GTK+), which hide almost all of the complexity of how X11 works.
My EeePC 701
is currently acting as an internet router, allowing my home network to use a 3G USB dongle while I change ISPs.
I thought it would be a bit difficult to set up, but it took about 15 minutes. I already had Ubuntu 10.10 on it, though, and it is normally used as a portable network capable media player when I don't want to watch what the wife has on the main telly.
Greenland is an autonomous country, has its own parliament, and is not part of the EU. Thus I was not counting it as part of Denmark the country.
As a result, I would be surprised if the ban on Marmite applies.
Point taken, though.
Polar Bears in Denmark? Maybe in zoos!
Have I missed something, or is your grasp of geography a bit weak?
I agree with your sentiments, but I would dispute that major financial institutions use UNIX primarily because it's safer. In actual fact, all of the financial institutions I've worked at (and there have been several) shield even their more secure OS's from untrusted traffic with layer upon layer of additional protection (firewalls, port filters, content-level filters etc.), and often run their internal networks in segregated segments for security purposes.
Mainly they use UNIX because it has scaled better in the past, has been easier to port applications between different vendor platforms running UNIX, and has better Enterprise RAS features and vendor support than most other popular platforms.
With very large Intel systems, virtual machine support, and major vendors differentiating their Intel platforms with RAS enhancements, these advantages are being eroded over time.
@AC (both of them)
My views presented here are deliberately contentious, to try to get people to think. I am a little undecided about what is the voting system with fewest drawbacks, but I am reasonably certain that it is not FPTP or AV.
I agree about the post about mathematics (and other) teaching. I am not disputing that governments take a big part of everybody's earnings (although your 50% is an average; many people at the lowest end of the earnings spectrum only end up paying VAT - unless you are including employers' NIC, which is an expense to the business of employing people, not really a tax on the income unless you look at it with a really pedantic eye). I used to run my own company with its own payroll, so I can see all of these aspects of tax.
But unless you go to a full PR system with national candidate lists, which breaks local accountability (something I value strongly), then there will always be some quantization errors in the representation of the people. This is true with AV, STV and MMP. And when considered with current party preferences, will nearly always end up with minority governments.
I know that some successful countries actually work with governments that are in a minority, but for every one that does, I could probably point to at least one where historically they haven't always. In a minority government, charismatic leaders are the key, and in the cynical political atmosphere in the UK, I don't believe that we trust any individual enough, especially after Tony Blair and Margaret Thatcher.
Another point I want to make is that people don't actually want democracy. What they want is what they personally believe is right. Whenever such people do not get their way, many of them are prepared to blame the system, rather than accept that they are out of step with the 'will of the people'. And with a complex voting system, the least well educated and those that don't-give-a-damn-about-how-it-works will always feel this as long as they don't understand the process.
I was also pointing out that whilst the share of seats in the UK does not often track the percentage of the vote (and this has always been a sore point to Liberal Democrats and before them the Liberal Party in the UK - I've been politically aware for that long), it does not actually matter that much to the over-all policies. Bills are presented, mostly (though not exclusively) by the incumbent party, debated, amended and eventually have to achieve a majority of a vote to become a policy or law. Be more afraid of secondary legislation that only has to be voted on by a select committee rather than poor representation in the House of Commons. This is truly unrepresentative.
I accept that between elections, there is little that can be done by constituents to 'sack' their member of parliament, but votes in the House of Commons are by majority, so any minority government has to convince other parties, one way or another, to support them. I would be happy seeing free votes more often, allowing MPs to vote according to their conscience and constituency wishes rather than along party lines, but sometimes it is necessary to enforce the party lines. So I assert that a party with only 36% of the vote imposing their policies is actually not significantly different from voting alliances seen in most countries with some form of PR without overall majority governments, although I accept that it is the incumbent party that gets to present the most bills.
I would also point out that if you really want an accountable government, you really ought to make voting compulsory for all eligible voters, because even 51% of a 65% turnout is only about 33% of the electorate, less than the 35% who didn't vote at all! Is this truly representative? Or do you contend that people who don't vote don't deserve to be represented?
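A quick check of those percentages (illustrative turnout and vote-share figures from the post, not official statistics):

```python
# What share of the whole electorate actually voted for a "majority" government?
turnout = 0.65        # 65% of eligible voters turned out
winning_share = 0.51  # winning side's share of votes cast
electorate_share = turnout * winning_share

print(f"{electorate_share:.0%}")  # 33% - about a third of the electorate
print(f"{1 - turnout:.0%}")       # 35% - those who didn't vote at all
```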
Anyway, besides electoral systems, somebody ought to take the "how-it-works" bat to Sarkozy, to beat some real knowledge into him.
@AC At the risk of reopening the AV debate
According to Electoral Reform Society estimates (there are no direct results, because the votes were not cast in the appropriate way), comparing First Past the Post, Alternative Vote, Alternative Vote+ and Single Transferable Vote, we get the following:
FPTP: Cons 306, Lab 258, LibDem 56, Others 28
AV: Cons 281, Lab 262, LibDem 79, Others 28
AV+: Cons 275, Lab 234, LibDem 110, Others 31
STV: Cons 246, Lab 207, LibDem 162, Others 35
Hmmm. Obviously with all of these alternatives, there is no better split. We still have to have a coalition, and one where *any* two of the three major parties would have a majority. <sarcasm>Sooooo much better. </sarcasm>
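For anyone hazy on how AV actually reallocates preferences, here is a minimal instant-runoff count. The ballots and parties are made up purely for illustration, not real election data:

```python
# Minimal instant-runoff (AV) count: each ballot ranks candidates in order;
# the weakest candidate is eliminated each round and those ballots fall
# through to their next surviving preference.
from collections import Counter

def av_count(ballots):
    ballots = [list(b) for b in ballots]          # work on copies
    while True:
        tally = Counter(b[0] for b in ballots if b)
        total = sum(tally.values())
        leader, votes = tally.most_common(1)[0]
        if votes * 2 > total:                     # strict majority reached
            return leader, tally
        loser = min(tally, key=tally.get)
        for b in ballots:
            while loser in b:                     # strike eliminated candidate
                b.remove(loser)

# 9 voters: Con leads on first preferences (4 v 3 v 2), but once LD is
# eliminated its second preferences push Lab past the post.
ballots = [["Con", "LD"]] * 4 + [["Lab", "LD"]] * 3 + [["LD", "Lab"]] * 2
winner, final = av_count(ballots)
print(winner)   # Lab
```

Which is exactly why AV and FPTP can crown different winners from the same ballot papers: FPTP on these ballots would have picked Con.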
And with full PR (figures from the BBC), it would have been Cons 36%, Lab 29%, LibDem 23%, Other 12%, which is no better.
I prefer to know that the government was formed by the party with the biggest share of everybody who bothered to vote, at least in a system where a single party normally gets to form the government.
And I don't think that a system where, in order to get a vote on what biscuits are served at the tea break, you have to engage in horse-trading with parties you may radically disagree with (as much of Europe has) is really that much better; it tends to lead to weak compromise government.
Of course, we could have a full referendum for everything, televised and using modern IT for instant voting, but I doubt that many people could stomach sitting down and listening to the boring and mundane arguments that form most debates in parliament even if they understood them (try watching the BBC Parliament channel for an evening, and that is the interesting selected highlights!)
Just face it. There is no perfect system, and any system will lead to some people being upset some of the time. Significant numbers of people are *always* in a minority in any democratic system.
Anyway, it's all a moot point. Regulating the Internet is like holding liquid mercury in your hand. It's almost impossible.
You're aiming your criticisms at the wrong people. Don't blame the Linux community for not fixing the deficiencies of the hardware manufacturers and system integrators.
I know it is changing, but up until recently, in order to get dual head support for a graphics adaptor in Windows, you relied on the adaptor manufacturer to provide a Windows driver CD. If you were lucky or had an integrated graphics adaptor in a laptop, then the system integrator would get it working in their pre-installed image.
It all works out of the box. But consider this. Try taking the same laptop, and installing Windows from scratch. I can guarantee that it will not be so easy now, and in my experience, can actually be *MUCH* more difficult.
Now I know that this does not fix the issues with Linux (all of which can probably be done, I've got dual-head support working in Linux in the past), but rather than pointing your finger at people who give their best effort support often in their spare time, pour your scorn out at the adaptor manufacturers and the people who supplied you with their hardware. Make it clear that you want a pre-installed Linux that works out of the box. Give them the same degree of vitriol that you put into forums such as these.
Will it make any difference? Probably not, considering the fact the Microsoft can choose who to give their substantial discounts for Windows to, and have proved that they are prepared to financially disadvantage suppliers who ship systems with Linux installed. But at least try.
And don't just blame Linux or its user community.
So. Complain to Adobe and Steam, not to the Linux community.
But I'm not sure whether many people would actually be comfortable paying for a full CS5 license to put on a free Linux system. I would be worried that casual readers would assume that "On_Linux" == "Free", which is really not the case.
Unfortunately, we are in a chicken-and-egg situation. Adobe and other commercial software writers will not put their applications on Linux (particularly a single distribution) until there are enough people willing to buy it to make the port and their support infrastructure economically viable. Conversely, people will not consider Linux until there are sufficient applications that they need. And so it goes on.
My hope was that something like Ubuntu (and I'm especially using this as an example rather than RHEL and SEL because of the procurement costs involved in wrapping up the availability of the distro with a support contract, partially nullifying the Free aspect) would manage to reach a critical mass that would encourage the applications to be ported. Sadly, this is not happening, in my view partly because of FUD, but also partly because Ubuntu appears to have taken a sharp turn (Unity et al.) which has destabilized even the hard-line advocates.
Looking at the alternatives: the other Debian and Ubuntu spin-offs do not have a large enough organization backing them; SEL's future is a little uncertain due to the transfer of ownership of Novell; Fedora is not a suitable OS for commercial organizations without a lot of support effort because of the speed of change, and is completely unsuitable for any non-technical home users for the same reason; RHEL costs money; and CentOS is probably too enterprise-oriented for home users.