1816 posts • joined 15 Jun 2007
Should have used the Joke icon. Plus, I think you mean "netbook" rather than "notebook". Maybe spell-checker error.
Re: lack of innovation @Kebablog
Ah, but why did DEC become a target for a Compaq buyout?
They had been in decline for some time, but licensing problems with Intel (Intel infringed on some DEC patents IIRC, and DEC somehow suffered as a result - I never really understood how, but they did) and problems further developing Alpha meant that DEC's share price dipped, and Compaq made an offer the shareholders could not refuse.
Compaq probably over-reached themselves, and that, coupled with a loss of market share, meant that they became weak in turn.
@Paul Crawford - I understand your concerns about who pays
but as the entirety of UK Government expenditure comes from taxes or sales of national assets, everything that the government does, including many things that are directly for "national security" are at our expense.
I totally agree that "national security" is hugely overused without the correct justification. I suspect that this is because some MPs are prepared to rubber-stamp anything that mentions the term without asking whether it is being correctly used.
Of course, if you do a global substitution to replace "national" with "Government", you may get a different picture.
It's good to see
that what has been standard practice in the Mainframe/Midrange platform market is finally becoming a reality in other architectures.
I'm an IBM pSeries and Power person, and we have been able to do remote IPL, console, and configuration/management for years. OK, you paid a premium for some of the features, but many of the base capabilities have been built in for well over a decade, and now include IVM/PowerVM. The RS/6000 F40 had a service processor when announced in 1996.
It does help that most of the system admin can be done from a command prompt, though.
One of the customers I worked at had the majority of their critical servers in lights-out, mainly unattended sites scattered around the country from before the Millennium.
Legislation of its age
IIRC, the original intent of the 1984 Data Protection Act was mainly to enable people about whom information was kept to make sure that it was correct, and to find out what it was being used for, rather than anything else.
It may seem difficult to believe these days, but the idea of data-mining was so far off the agenda as to be unimportant, at least outside of the Security Services. In 1984, computer systems were rarely networked, and datasets were stored in isolation from each other. Client-Server applications were still relatively rare, and discovering new aspects of people's lives by joining datasets together was so difficult as to be nearly impossible.
I remember looking at the requirements to be able to change all copies of incorrect data with a degree of horror, as I had no practical way of re-writing data on system backup tapes.
Fortunately, the only personal records I was responsible for were the login details of users of the computers I administered. The Data Protection officer for the Polytechnic where I was working judged that, with a small amount of change, the login details (held without any identifying information about the user other than their name and the course they were on, which was implied by the naming convention) were exempt from registration. We did, however, go through the exercise of filling in the forms to document that this had been checked.
I was immensely grateful for this.
The only thing I can assume is that the reason you're not running Linux on the laptops is because you haven't really tried, even though you say you run it elsewhere.
I've put Ubuntu on a huge variety of laptops and netbooks from Asus, Lenovo, HP, Samsung, Dell etc, and it just works. No additional drivers, no command line tweaking, all sound, video and network devices at least working. Maybe not the accelerated graphics, and maybe not Bluetooth, but enough to use. Certainly better than a raw XP install from MS media, where almost nothing works without the vendors driver disk.
Re: Simple Explanation - MS MURDERED THE NETBOOK
You missed the bit where MS had decided that XP was at end-of-marketing, and not only did they extend XP, they actually introduced a new version specifically for the reduced memory in netbooks. This was either a shrewd marketing move or a cynical U-turn, depending on your point of view.
It would have been interesting to see whether a more mainstream Linux variant would have made them any more successful.
I'm still using my early eeePC 701 with Lucid on it on a regular basis, and would like to find another to replace my current firewall system.
Re: bad example @Robert Helpmann??
It's not quite the same. In a typing pool, they would often type from dictation, either via a dictation machine or through the phone system (or, in the really old-fashioned office, from a secretary's shorthand). The typists needed to be able to correct grammatical errors, spell correctly, and know how to format a letter.
Data entry is normally repetitive, very rarely free text prose, and extremely boring. And it's slowly being replaced by OCR and mechanical form reading, or direct entry over the Internet anyway.
Re: Ah, another patent encumbered format @Frank Bough
That's probably true now, although as standards progress, it means that you keep having to update your adapter (or phone or tablet) every time a new codec becomes 'standard'.
Re: Ah, another patent encumbered format @JDX
But it is not just the software. In your comments, you're assuming that the people who write an alternative implementation have ripped off your code.
This is always about software patents, not the code itself. OK, you write a nice implementation, your code should be protected, I totally agree. But the algorithm used should be open, so that someone can provide an alternative implementation. If you can prove that they copied your code, then I would support you suing them through every court in the land. But if they wrote their own, through their own efforts.....
It's a serious dilemma, I admit, but as long as we have ambitions to produce a truly free operating system suitable for everybody, we will have these problems.
I could (although with some reluctance) support going to a model where the OS is free, but the licensed codecs you need have a reasonable cost associated with them (in line with the H.264 charges that seem entirely reasonable). That is how it was in the early Windows days (remember Windows when it could not play media out-of-the-box and you had to buy software to play music and video), although too many people just copied MP3 and DVD packages on Windows.
If we go to this model, it should be clear that this is the case to users of all operating systems, and maybe other OS vendors should be prevented from providing the software as part of their OS offerings. But this has not exactly been a totally successful strategy in the Browser rulings.
And as long as the alternative implementations abide by the rules on the use of LGPL toolchains, this should not fall foul of any open systems licensing, either.
Re: Ah, another patent encumbered format @ Spaniel
I got it wrong. It's 100,000 units, not 10,000.
This is a quote from the MPEG-LA H.264 License terms summary.
For (a) (2) branded encoder and decoder products sold on an OEM basis for incorporation into personal computers as part of a personal computer operating system, a Legal Entity may pay for its customers as follows (beginning January 1, 2005): 0 - 100,000 units/year = no royalty (available to one Legal Entity in an affiliated group); US $0.20 per unit after first 100,000 units/year; above 5 million units/year, royalty = US $0.10 per unit. The maximum annual royalty (“cap”) for an Enterprise (commonly controlled Legal Entities) is $3.5 million per year in 2005-2006, $4.25 million per year in 2007-08, $5 million per year in 2009-10, and $6.5 million per year in 2011-15.
All rights to this text belong to MPEG-LA (just a disclaimer to avoid any copyright issues)
So, 20 cents for every shipped copy between 100,000 and 5,000,000, and 10 cents per copy after that, up to the annual cap. That's quite acceptable if you are incorporating it into a product costing $20, but not so good if you want to include it in a popular free Linux distribution. I wonder whether the fact that you are not 'selling' Linux is enough to get out of the "sold on an OEM basis" part of the clause?
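The tier arithmetic can be sketched as a quick calculator. This is a sketch only: it reads the $0.10 rate as applying marginally to units above 5 million (the summary is ambiguous), works in cents to avoid float noise, and uses the 2009-10 cap figure purely for illustration.

```python
def h264_oem_royalty_cents(units, cap_cents=500_000_000):
    """Rough annual OEM royalty in US cents for `units` shipped in one
    year, following the tier reading above. The cap varies by year;
    the default here is the 2009-10 figure ($5M), for illustration."""
    if units <= 100_000:
        return 0                                          # first 100,000 units free
    cents = 20 * (min(units, 5_000_000) - 100_000)        # $0.20 tier
    if units > 5_000_000:
        cents += 10 * (units - 5_000_000)                 # $0.10 tier above 5M
    return min(cents, cap_cents)                          # annual cap

print(h264_oem_royalty_cents(1_000_000) / 100)    # 180000.0 dollars
print(h264_oem_royalty_cents(10_000_000) / 100)   # 1480000.0 dollars
```

Even a modest million-unit distribution run lands at $180,000 a year under this reading, which illustrates why a zero-revenue distro cannot simply ship the codec.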
Re: Ah, another patent encumbered format
I'm perfectly happy with software being paid for running on Open Source platforms, but patenting the codecs such that you can't legally provide them as part of a free (as in free beer) OS puts huge amounts of leverage against projects who want to provide a free OS.
The problem is that if you stay within the law, and don't ship what may become the de-facto standard for video, then something like Linux will always be seen as not for general consumption.
Alternatively, if you ship the codecs as part of a distribution regardless, so that the experience to the end user is good, then MPEG-LA can then demand a payment from you. You have no revenue stream because you are providing the software for free, and cannot pay unless you are Mark Shuttleworth (who paid for an H.264 distribution license for Ubuntu, and got heavily criticised for it).
The problem clauses in the H.264 license are the volume clause, which says something like: the distributor has to pay a licence fee per copy deployed if they ship more than 10,000 copies; and the one that says you have to pay a fee if you use the encoder to produce commercial videos.
Bearing in mind how viral Linux distributions can be, how do you measure how many times one has been deployed? I download one install image, use it to install thousands of systems, and offer re-distribution from my web site. How is that measured? And who should pay?
And what qualifies as commercial? If one of my kids records the neighbour's cat doing something comical and uploads it to YouTube, and Google attaches adverts, is the video for commercial purposes? Should I pay for the encode? Should Google, even though they may not have encoded it?
Licensing like this has been a legal minefield for Open Software since the days of MPEG Audio Layer 3 (aka MP3) and GIF. My point is that it would be so much better if the codecs (or even just the algorithms) were available under a permissive license.
Ah, another patent encumbered format
Let's hope that MPEG-LA are more generous about the licenses, although I would be surprised if they were.
And to pre-empt people who say that H.264 was freely available, I suggest that you look at the commercial encoding and decoder volume distribution clauses in the license agreement.
I had these discussions with my Palm Treo 650
I kept being phoned up by my Service Provider saying I could upgrade my phone.
At the time (pre-iPhone), my answer was another question "What can I upgrade to?"
When all they could offer was a Windows Mobile phone, I normally answered "And that is an upgrade?" Eventually, I went Android, although an odd quirk was that because I didn't take an upgrade when I was entitled to it, the discount I was eventually offered was less than it would have been if I had upgraded promptly. Bizarre!
I was very happy with my Treo, and it is still my fall-back phone. Now, over 6 years later and still on its original battery, it lasts longer on one charge than my Sony Xperia.
Re: 7 chaos emeralds in the first title?
Was it not "The Eggman" in the Japanese version, rather than Dr Robotnik?
Sonic also spawned at least two cartoon series on TV. Not quite the 360 degree marketing that Nintendo had with Pokemon, but still quite high market penetration.
I can still remember the nasal "I'm waaaaiting" from the cartoon series, which was akin to Sonic crossing his arms and tapping a foot if you didn't move in the games.
Can't believe nobody has mentioned Spinball yet!
This was a really strange blend of Sonic and pinball, and yet it worked quite well.
When I got my oldest son a Dreamcast (picked it up at 00:00 on launch day), together with a copy of Sonic Adventure, I remember watching in dizzy, sickened awe at the speed of the traversal of the 3-D ramps and loops with the background swooping and rolling around. I found it unbelievable that it was able to render the backgrounds as fast as it did.
Whilst Sonic was undoubtedly a triumph, Sega had a stable of exceptional other titles that over the years included Alex Kidd, Ecco the Dolphin, Panzer Dragoon (particularly Saga), NiGHTS into Dreams, and my personal favourite, Burning Rangers (the absolute heyday of the Saturn IMHO).
Re: you forgot!
I'm sure I still have the 1976 "Electronics Tomorrow" special edition of the magazine Electronics Today International (the one that also had pre-release articles about Star Wars), containing a picture of a PET 2001 prototype that was curvy.
When the production PETs came out, I thought the steel case and chiclet keys were just plain ugly, although that did not stop a group of us on the college staff-student consultative committee from trying to get one bought for the college. Unfortunately (for us), the council voted for a mini-bus instead. In hindsight, that was probably the best choice, but it did not seem so at the time.
I could not believe how many attempts at passing the stats the psychology students were allowed before they failed at the university I was at. IIRC, they had at least six chances, and I only had two at the maths supporting my computing/electronics course. And their stats were really simple! I had covered the same material in A-Level maths.
Re: It's whether the degree is *hard* or *soft* @boltar
It's not that one is better than the other. I suspect that from a purist's point of view, the most elegant, sophisticated and efficient code is written by old-school computer science graduates - you know, those who actually understood the reason for doing things, not just the learn-by-rote of current teaching methods.
But I would also accept that the code that most resembles what is required for a particular problem may be produced by people who don't have a computer science degree.
The thing is that someone who has been taught computer science probably has a relatively poor understanding of problems that are not directly related to computers. So if you are programming a system for another discipline, someone with outside skills who has cross-trained to get relevant programming skills may not turn out the best code, but may have a better understanding of the requirements, especially if they have applicable knowledge of the problem. This is not a hard-and-fast rule, there will be exceptions, but computing is a terrifically introverted area of work.
A previous poster pointed out that the best technical writers are not computer scientists, and I agree. Writing good documentation is a totally different skill from writing good code. Someone with a basic technical understanding, access to the code writers to ask technical questions, and good writing skills will in almost all circumstances turn out better documentation than the code writers themselves. At least the spelling and grammar will be correct!
I checked last week
and if anybody is likely to have problems, it is me.
My closest English transmitter is Mendip (channels 49-58), but as that is about 40 miles away, I currently need a multi-tap amplifier to make sure that all the TVs in the house get an acceptable signal.
Within a few degrees of arc of the direct line to Mendip, and at a distance of no more than 600 metres, there are two cell basestations run by operators who won slots in the 4G auction. So there is a great chance that if these basestations start operating in the 800MHz band, my TV aerial and amplifier combination will pick up a strong 4G signal, with little chance of using a directional antenna to alleviate the problem. And if a filter attenuates the TV signal too much, it will probably degrade the signal to the point where it is no longer viable.
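For anyone checking their own transmitter, the proximity is easy to put numbers on, assuming the standard European 8 MHz UHF channel raster where channel 21 is centred on 474 MHz (a back-of-envelope sketch, not an official channel plan):

```python
def uhf_centre_mhz(channel):
    """Centre frequency in MHz of a UHF TV channel on the standard
    European 8 MHz raster (channel 21 = 474 MHz, so 306 + 8*N)."""
    return 306 + 8 * channel

# Mendip's channels here span 49-58; the 4G 800 MHz downlink starts at
# roughly 791 MHz, only about 21 MHz above channel 58's centre.
for ch in (49, 58):
    print(f"channel {ch}: {uhf_centre_mhz(ch)} MHz")
```

With channel 58 centred at 770 MHz, an aerial and amplifier built for the top of the UHF band will happily pass a 4G carrier at 791 MHz unless a filter is fitted.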
I'm trying to get information from the operators (found using http://www.sitefinder.ofcom.org.uk/search), but so far they have not answered my queries. With several TVs in the house, I want to find out the impact as soon as possible.
I just hope that I don't have to wait until too late to find out.
Re: cooling the water before sending it to the super
Even that's not really possible. They use an evaporative cooling system as much as they can (they really are into providing the perception that they use as little power as possible - which could be understood if only you knew who they are). The only time that this cannot be done is when the outside temperature is too high, and this is likely to be the time that heating anything is least required.
I got my information directly from the building power and cooling engineer/manager, and if he can't work out a way of getting something useful back, I'm certain it can't be easy.
Re: Paying to keep it running.
The heat that comes out of these systems is regarded as low-grade. What this means is that the temperature differential is not high enough to make it particularly practical to use.
Roadrunner is air-cooled (see the picture: spot the hot-cold aisles and the absence of water-cooled rear doors), so the heat will be picked up by the air handlers.
I've worked with two generations of IBM water-cooled supers, and the output temperature of the water is around 25 degrees centigrade (slightly hotter for the newer system). This is colder than the ambient temperature of the halls (it's a power-conscious organisation that is experimenting with running the machine rooms hotter than you would normally expect, to save power). That makes it lukewarm at best, and certainly not hot enough even to heat the water you wash your hands with. The cooling works by chilling the water before sending it to the super (input temperature around 13 degrees centigrade), so the cooling was actually at the wrong end.
Of course, you could use heat pumps to concentrate the heat, like ground-source water heaters, but there is a law of diminishing returns operating. If it takes more electrical power to concentrate the heat than would be needed to directly generate the same amount of heat, there is no gain.
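The break-even point is simple arithmetic: a heat pump delivers roughly COP times its electrical input as heat, so it only beats direct resistive heating (COP of about 1) while the COP stays above 1. A toy illustration, not a model of any real installation:

```python
def electricity_saved_kw(heat_needed_kw, cop):
    """Electrical power (kW) saved by delivering `heat_needed_kw` of
    heat with a heat pump of the given COP, compared with direct
    resistive heating (~1 kW of heat per kW of electricity)."""
    return heat_needed_kw - heat_needed_kw / cop

print(electricity_saved_kw(10, 3.0))   # a decent pump: saves roughly 6.7 kW
print(electricity_saved_kw(10, 1.0))   # COP of 1: no better than a heater
```

Low-grade heat makes the economics worse still, because a pump lifting a small temperature differential to something useful tends to run at a poor effective COP for the job.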
"Think of all the copper"
is not that appropriate as a title for a picture showing fibre cables! Those are fibre Infiniband cables, with electro-optical transceivers (like SFPs) at both ends. The colour is a dead give-away.
Undoubtedly there will be quite a lot of copper in the system, but not in the picture.
It's a simple balance...
...availability vs. cost.
There have been highly available and even un-interruptible services around, but you have to pay for them, and they don't come cheap. Nor do the staff to set them up and run them.
Microsoft probably have an HA offering now, but I expect that even they will charge a premium.
Segregating the function, so that you can put your information distribution systems on simple, small, cheap, and redundant servers in front of your actual service machines can help with the appearance of a service being available (as well as increasing the security), but if you truly want high availability, it's going to cost.
"shoot the drone out of the sky"
He's obviously not seen birdshot fired from a shotgun in action. If you can hit a clay pigeon or a bird at 20 metres (well within the range of most shotguns), you should be able to do enough damage to bring one of these 'copters down.
And if they tried to retrieve it, that would be trespass.
Re: Kebabfart M5-32 is not the only one with 32TB RAM
I would dispute that seti@home is an HPC workload. It is a distributed workload, partitioned into units that can be worked on in isolation from each other.
There is a huge difference between a distributed workload and a proper HPC workload, and people like weather agencies, atomic research institutions etc. would be only too happy to explain.
Proper HPC needs a huge interconnect, so that a single model can be broken down into multiple threads spanning many systems, all passing data between each other.
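The difference shows up even in a toy model. Below, a 1-D diffusion problem is split across two pretend "nodes", and each must receive its neighbour's edge cell before every step; plain assignments stand in for the MPI sends and receives that a real interconnect would carry. This is a sketch of the pattern, not of any real code:

```python
def step(u):
    """One explicit diffusion step; the two end cells stay fixed."""
    v = u[:]
    for i in range(1, len(u) - 1):
        v[i] = u[i] + 0.25 * (u[i-1] - 2*u[i] + u[i+1])
    return v

full = [0.0, 0.0, 0.0, 1.0, 1.0, 0.0, 0.0, 0.0]
reference = step(step(full))                 # single-node run, two steps

# Each "node" owns half the domain plus one halo cell from its neighbour.
left, right = full[:5], full[3:]
for _ in range(2):
    left, right = step(left), step(right)
    # The "interconnect": swap edge values so the next step sees fresh data.
    left[-1], right[0] = right[1], left[-2]

assert left[:-1] + right[1:] == reference
print("distributed result matches the single-node run")
```

A seti@home-style work unit never needs that per-step exchange, which is exactly why it can run on millions of unconnected PCs while a weather model cannot.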
BTW, IBM P7 795s are quite cool, but P7 775s are even cooler (literally, water cooled cooler!). I know, because I work with a couple of clusters worth.
Re: As far as my neighbours are concerned
It's not impossible. I can claim both in my experience (it's even on my CV)!
(Historically, UTS and AT&T R&D UNIX on Amdahl mainframes, AIX/370 on IBM mainframes, and two generations of IBM HPC where I currently work!)
Currently, I don't think there is any mainframe proper sold with UNIX, although Linux would not be a problem. I don't count HP Integrity Superdome or the IBM 795s as mainframes.
But having said that, I still get asked about PCs. My stock answer is "PCs, horrible little systems. I can't stand them!"
@Ross K - Don't be dense!
Each direction of my daily commute takes me 75-90 minutes to travel 42 miles.
It's not traffic that slows me down. There is not a single stretch of dual carriageway or better, and there are towns and villages with speed restrictions, tractors and other farm machinery, caravans, sheep, cyclists, and even the odd tourist doing 25 on a national-speed-limit country road because it's pretty, oblivious to the people behind who cannot pass because they cannot see far enough ahead to do so safely.
Just because you may be able to jump on a motorway and burn along at 80 does not mean that everybody can!
My lifestyle is my choice, I admit, and I put up with the drive because it's actually a nice place to live with many other benefits, but sometimes it does get too much.
Re: modern technology @dz-015
Why? Because some of them may actually think of working in the field, and they cannot make a decision about whether they would be able to until they have relevant knowledge. It's truly shocking how little almost all kids know about how computers work when they leave school.
I'm not saying that there is no value to iPads, but that there are better ways to obtain the skills. In their day, BBC micros could do representative actions for almost the whole spectrum of contemporary computing skills (I know, I built and ran a lab of them in the early '80s at Newcastle Polytechnic that was used to teach computer appreciation), as well as learning to program. I used it to teach structured programming in BBC Basic and Pascal, assembler programming, word processing, spreadsheets, graphics (including basic design using a digitiser, WIMP and touch screens), robotics, basic networking (putting an oscilloscope on the Econet cable was a real eye opener for the students), and many more things than I can remember.
Tablets can do some of these, but I think that as a representative computing device, they are poor. It really does depend on whether they are the ONLY devices available in the schools.
Photo, music and video editing can be done, but would be better on a machine with more memory and disk, for anything except the smallest project.
For an art and design tool, something that had the accuracy of a Wacom digitiser is essential.
And about cooking a steak: you don't need to know how to farm, just like you don't need to know how to fabricate a CPU or memory chip, but you do need to know how to use the cooker, pans and utensils in order to perform the creative culinary part of cooking. Using an iPad is like knowing what seasoning to use.
Re: modern technology @dz-015
The amount of 'learning' necessary to get an iPad working is minimal. The one good thing about what Apple have done is to make it so any fool can use one with little to no training. And even if they did not have one provided by the school, a large majority of kids will learn transferable skills from their own devices. If there is any benefit, it will be in having a standardized device for distributing learning material, but there are probably other, more practical and better-value devices for this.
Owning a tablet myself, I can understand that a media consumption device may be useful, but I actually don't find it very useful for my work, because of the difficulty of getting information on and off it under the (very sensible) security policies of where I work. Schools would be no different, especially with a device as locked down as an iPad, unless the restrictions were relaxed somehow. So unless you categorize learning as another type of consuming media (maybe it is), I find that the overall value of providing iPads is poor compared to other uses for the money.
Over the years, I've seen technological aids used in teaching: slide and film projectors, television, video recorders, audio-tape language courses, micro-film and fiche based interactive courses, and finally various generations of computer-based training. But do they work better than a good teacher and appropriate books? I'm not sure, and I think back to the most memorable years of school, when a sometimes boring subject was brought to life by a capable and enthusiastic teacher with nothing more than a blackboard and text books.
Give the kids an appropriate understanding of how they (computing devices) work, together with the correct amount of other real life skills (reading, writing, basic maths, contemporary history, nutrition), and that will be a much better use of time and resource.
I am still waiting for the delivery of the promise of natural language recognition combined with Artificial Intelligence (always 5 years away for the last 30 or so) that will make interacting with your information system like interacting with another person. When we get this, all this crap about learning how to use a computing device will become obsolete, and we can go back to learning useful knowledge rather than teaching the current in-vogue fad!
"Specialist Technology College"
My youngest is doing A levels at what is described as a "Specialist Technology College".
I got a text last Thursday asking him to attend a special session in the Media centre.
When he came home, he had been told that because of a "computer failure", all of this year's assessed media work would have to be redone, because it had all been lost. His timetable for this week was suspended, and he was expected to spend the whole week just catching up.
Whatever "computer" they were talking about was not the only failure.
Icon speaks for itself.
Re: Congratulations PC makers!
Back in the Mid '90s (around the time of the Pentium 90), I looked after some older IBM PS/2 Model 80s, which had 25MHz 386DX processors (they really were cutting edge at one time). I also had access to OS/2 running on various systems.
IIRC, there was a dancing animals (birds and monkeys) video shipped with OS/2 Warp that I managed to run on the PS/2s (AIX PS/2 with xanim ported, again IIRC). It was pretty low resolution, and the extension was .avi, although I don't know what the codec was (do I still have an OS/2 Warp install CD to find out - I must check), and these systems did not have sound cards, but they were running video. If they could do compressed video, I'm sure they would have been able to do MP3 audio.
Re: Death of the sensible UI
I say go back to troff, tbl, eqn, pic and ms or mm macros, edited using emacs.
No, seriously. I mean it. Take that straitjacket off me! I'm a retro-technologist, not mad!
Re: Can you imagine the stress...
Actually, on the couple of occasions I've seen someone do a recursive delete of / on a UNIX box (rm being the UNIX/Linux equivalent), it has not caused the system to reboot. What happens (at least on AIX) is that as soon as you hit /usr/lib and /lib, and wipe out some of the shared libraries, the system becomes largely unresponsive, but does not reboot. You end up not being able to log in or issue remote commands, and anything that tries to exec a binary that is not already running fails with exec errors. This largely happens unnoticed, unless you happen to have an open session. The system just seems to die, but still responds to icmp echo requests because of IP offload to the network adaptors.
Mind you, the next step of physically rebooting the system fails, with the system stopping before it's able to even start init. Again, on AIX, IIRC, it stops with something like 553 on the LCD display (which shows how long it has been since I saw this, as LCD codes on modern Power systems normally display 4 digits now), which normally means that it can't mount the /usr filesystem, but in this case means that it can't even run the mount command. I expect something similar but specific to the flavour of *NIX on other systems.
Time to reach for that system backup that you took. Or a pint to help you consider your options!
Re: Guess that includes me then @Connor
I started reading your comment, and had to check that I was not the author. I've taken exactly the same route, right up to accepting Unity on Precise (12.04).
My laptop is old. It's not that powerful, but it is mine and it works, and I would prefer not to have to replace it at the moment. It worked fine with Gnome 2 on Lucid, although Compiz worked better on Hardy (KMS implemented in the kernel in Lucid and other distros broke certain ATI drivers).
It runs Unity 2D, but the experience is awful, both because it is not the full-fat version, and because the performance is crap. Same KMS issue as Lucid preventing composite rendering, probably.
So, I've put Gnome 3 with Cinnamon on (just install the package, and select at logon). I would prefer for it to be offered as a choice during install, but I can stay with Ubuntu and remain current enough without having to change the way I work. So far, I've managed to keep it sufficiently like I want it.
I'm not sure that Unity will ever allow me to work the way I want to as it handles windows in a completely foreign way to what I want to do, but I guess that time will tell.
Re: Allow me @Phil
Bell Labs PDP11 UNIX V7 and earlier did not have any support for overlays. I worked with them on RSX-11M, so I understand what you are talking about.
As far as I am aware, there was some prototype overlay code in the later BSD PDP11 releases, but it would only work on a machine with 22-bit addressing and separate Instruction and Data (I&D) space (the 11/70, 11/44 and later systems). Before this, the standard trick used for large software programs (and I saw this done for the BSD release of Ingres) was to split a large program into several executables, and use some proto-IPC interface to spread the function around. IIRC, Ingres from a BSD 2.3 tape used named pipes. Shared memory and message queues were all in the future, but I believe there was a primitive semaphore implementation in UNIX V7. I'd have to look to find out.
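The split-executable trick still works today. Here is a minimal modern sketch of the named-pipe pattern on a POSIX system: one forked process plays one of the split "executables", reading a request from the FIFO and leaving its answer in a file. The query string and file names are invented for illustration; this is the general pattern, not the actual Ingres code.

```python
import os, tempfile

# A named pipe (FIFO) standing in for the proto-IPC described above.
workdir = tempfile.mkdtemp()
fifo_path = os.path.join(workdir, "request.fifo")
result_path = os.path.join(workdir, "result.txt")
os.mkfifo(fifo_path)                           # POSIX named pipe

pid = os.fork()
if pid == 0:
    # Child: the back-end "executable" - read a request, process it,
    # and leave the answer where the front end can find it.
    with open(fifo_path) as fifo:              # blocks until a writer opens
        request = fifo.read()
    with open(result_path, "w") as out:
        out.write(request.upper())             # trivial stand-in "processing"
    os._exit(0)

# Parent: the front end - send the request, wait for the worker.
with open(fifo_path, "w") as fifo:             # blocks until the reader opens
    fifo.write("select name from parts")
os.waitpid(pid, 0)

with open(result_path) as f:
    print(f.read())                            # SELECT NAME FROM PARTS
```

The blocking open on each end of the FIFO is what gives the two processes their rendezvous, which is why named pipes were a workable substitute before shared memory and message queues existed.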
There was some work done by Keele University in the UK to produce an overlay loader for UNIX V7 on PDP11s, which I managed to get working on my Systime 5000E (a strange beast, being a PDP11/34E [normally 18-bit addressing, no I&D], but with 22-bit addressing added by Systime). I used it with some success, but I never managed to get vi working on my small machine. It was all good fun, as was debugging the Calgary device buffer modifications to maximise the number of device drivers you could compile into the kernel on this 22-bit non-I&D machine. Out of the box, the mods assumed that if you had 22-bit addressing, you had to have separate I&D spaces, because no DEC PDP11 had one without the other.
Fun times, long gone.
Re: I'll delve into my archives ...
I would have thought that the Berkeley code would be re-distributable. The Berkeley software license was pretty permissive from the word go.
I'd love to know about the UnixTSS myself. Not because I have any (I obeyed the rules and always left it behind when I changed jobs), but I would love to see some of it again, especially the STREAMS and RFS code.
I just wish I had taken copies of the Bell Labs V6 PDP11 distribution, and the BSD 2.1 and 2.3 tapes I worked with in the very early '80s. I know that V7 and later PDP-11 BSD tape images are available, but by that time, they were already getting difficult to work with on PDP11 systems without separate I&D.
And it does not work on Linux, and did not on Android devices last time I looked.
Re: Missile Command anyone?
I used to be pretty good at Missile Command. Was certainly on the High-Score table most times anywhere I played, and often at the top.
One day I came to my favourite machine (with the smoothest track ball), and there was a stranger playing. I watched him clock the machine (twice, IIRC), have cities stacked up across the screen, and then get bored after about 45 minutes and walk away before he was wiped out (in fact, before he even started losing significant numbers of cities). You would not believe how erratic the intelligent mines became, and yet he could hit them. I think he must have maxed out the difficulty levels, and the machine started using more and more lurid colour combinations to put him off.
I never saw him again, and I lost all interest in playing, knowing that I could NEVER be that good. In fact, that was pretty much the end of me spending time in Arcades.
IBM Model M
People are talking about Mechanical and Model M in the same comments. As a complete fan of the IBM Model M, I was actually disappointed to find, when I tried to clean some Tizer or Irn Bru out of one of mine (that child of mine will never be fully forgiven), that once you get through the deep hex-head screws and plastic welded lugs, what sits under the buckling-spring mechanism is still a membrane keyboard, just with the aforementioned spring and plastic rocker sitting on top.
So no microswitches (in fact, I'm not really sure any keyboard used microswitches), although keyboards from the late 1970s and 1980s had discrete push-to-make release-to-break key switches soldered directly onto PCBs. My Issue 3 BBC micro ended up with more solder on the back of the keyboard PCB than metal track, because the repeated strain on the soldered pins would lift the PCB track from the board, and break it.
I remember Newbury Data RS232 terminals from my time at University having the same problem. You would often find one with the 'return' key not working, which everybody avoided, but which could still be used with Ctrl-M instead!
And another point
How does the Windows market share get measured? Is there a chance that it measures the number of systems sold with Windows, rather than the number that actually run Windows?
I know that most large businesses can buy Intel systems without a Windows license, but does anybody have any idea of how many actually do, rather than just junking any pre-installed Windows installation and reformatting the disk?
Blimey. On my monitor, it's difficult to see the difference between the bright red and bright pink on the market-share charts. Never mind, neither looks particularly important.
Re: I wonder if you can hack the cable...
A cable virus! Load it into the cable and it back-hacks the iDevice.
I believe that there was some concern about FireWire some time back, and some speculation that Lightning may be vulnerable in the same way through DMA. Anybody remember whether these fears were proved groundless?
Mind you, as the software would have to be loaded from the iDevice in the first place, you would need to get it past Apple's App police.
Not sure whether you were commenting to me, but I have known users who worked entirely from inside Emacs.
Before the advent of WIMP, the multi-window, multi-buffer and electric modes for Emacs allowed users to run a shell (using Emacs as the command editor), read their mail and news groups (there were mail and news clients written in Lisp), compile and debug their code, and even run NetHack from inside an Emacs window on a serial terminal that had a termcap definition (you know, something like a VT100/220 compatible, I won't call it dumb because termcap defined a dumb terminal as one that could not do cursor addressing).
The extensibility of Emacs was legendary, something that has surely been forgotten over the years.
You young whipper-snappers just don't know how easy you have it grumble grumble....
But the integrated editor is horrible
Made me laugh out loud! Embarrassing when at work.
Re: Discounting the cheap boxes
My goodness. At this rate with the keyboard shortcuts, we'll be able to pitch a comeback for Emacs!
Re: pretty stupid robots eh
This used to be a 'lost' Hitch-Hikers Guide to the Galaxy episode linking the first to the second radio series. It was only broadcast twice originally, and then disappeared from the airwaves as it was neither in series 1 nor series 2. I recorded it.
It's since made it into the CD collections fortunately.
What I like is when Marvin is left to delay a Frogstar class D. Can you guess with what weapons? Something pretty devastating, surely? No, nothing.
IIRC, his last comment as the Hitch-Hikers offices collapse around him after being destroyed by a neutron-ram is "What a depressingly stupid robot"
Re: Sigh @everyone who replied
The C4 signal from Wenvoe is quite bad, and C4 appears at a different position (FV channel 8 between BBC3 and BBC4?), and not all FV boxes allow you to manually re-number channels.
I'd spotted the rescan myself after I'd posted. I must admit that rescanning is getting depressingly frequent, especially on my older FV boxes that don't do it automatically.
The splitter boxes are intended for multi-occupancy buildings (which I suppose my house is at the moment) and are quite expensive. If it were really that simple, you would not be able to buy 8 port LNBs.
And even if I did become eligible for AT800, and they agreed to fund an intelligent splitter and/or install the wiring, the disruption of laying cables over the whole of the house would be horrible.
I'm just hoping that I will not be affected. Only time will tell.
Re: if you don't own the software..
You've already given them that right. It's in the Windows EULA.
Re: How does this work for upgrades... what's the definition of a computer...
For Windows, MS calculates hashes of information about a number of different components in a system (processor, memory, network card, display adapter, disks and controllers, BIOS signature and many others), and actually stores this on their own systems as well as in hidden, protected files that not even an Admin user can change. When you change components, the checking process tries to work out how much of the system has changed, and either allows the change or deems it a different computer and asks for re-authentication. It's been like this since XP. It allows you to change processors, disks and display adapters with relative impunity.
Unfortunately, now that PCs have heavily integrated motherboards, most of the components Windows checks are actually on the mobo. This means that changing the motherboard is almost certainly going to trip the 'it's a new computer' check, and has done for much of the last decade. The Microsoft Licensing Centre has been fairly good about this in the past if you've cared to explain, and issued new authentication strings on request, but I suspect that is likely to change.
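To make the idea concrete, here's a toy model of that check. To be clear, this is my own sketch under stated assumptions — Microsoft's real algorithm, the component weights and the change threshold are not public, and the component identifiers and `max_changes` value here are invented for illustration:

```python
# Toy model of hardware-hash activation: hash each component's ID,
# compare against the stored fingerprint, and tolerate a limited
# number of changed components before declaring a "new computer".
import hashlib

def component_hash(component_id: str) -> str:
    """Short hash of one component's identifier (CPU string, MAC, etc.)."""
    return hashlib.sha256(component_id.encode()).hexdigest()[:8]

def fingerprint(components: dict) -> dict:
    """Map each component name to a hash of its identifier."""
    return {name: component_hash(cid) for name, cid in components.items()}

def same_machine(stored: dict, current: dict, max_changes: int = 2) -> bool:
    """Deem it the same PC if no more than max_changes components differ."""
    changed = sum(1 for name in stored if stored[name] != current.get(name))
    return changed <= max_changes

# Swapping a disk and a display adapter stays within tolerance...
old = fingerprint({"cpu": "GenuineIntel-1234", "nic": "00:11:22:33:44:55",
                   "disk": "WD-WXYZ", "bios": "AMI v1.02", "gpu": "GTX-460"})
new = fingerprint({"cpu": "GenuineIntel-1234", "nic": "00:11:22:33:44:55",
                   "disk": "SSD-ABCD", "bios": "AMI v1.02", "gpu": "GTX-560"})
print(same_machine(old, new))  # -> True
```

A motherboard swap is what breaks this model: the NIC, BIOS and often the disk controller identifiers all change at once, so the changed-component count blows straight past the threshold in one go.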
I suspect that Office will plug into that process, bearing in mind Windows already does in the Genuine Windows (dis-)Advantage tool, and non-365 Office only runs on Windows.