It's entropy-mental, my dear Watson
This problem is almost certainly to do with the entropy gathering in the random number generator of the system being cracked. It is normal to stir in entropy data gathered from the hardware (like minor variations in temperature, or bus noise, or maybe small power fluctuations) to the random number seed process to avoid deterministic sequences of random numbers. If you can guess what a sequence of random numbers will be during key generation (SSL uses transient session keys to encrypt parts of the rest of the key exchange and checking process), then the value of the key is severely reduced.
Exactly how this entropy data is gathered is normally specific to the OS and device, so I suspect that the same technique would have to be significantly varied for each different device you want to attack. If you have a device with good isolation from external variations, then this technique will be virtually useless (although gathering entropy data becomes more difficult). And if you have a really good hardware-based random number generator that has its own entropy gathering mechanism, then this will be impossible.
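Exactly what gets stirred in varies by OS and hardware, but the principle is that several weak sources are hashed together so that no single one determines the seed. A minimal sketch in Python follows; the timing-jitter loop here is only a stand-in for real hardware noise (temperature, bus noise, power fluctuations), not any actual OS's collector:

```python
import hashlib
import os
import time

def gather_jitter_entropy(samples=32):
    """Collect timing jitter as a crude stand-in for hardware noise."""
    pool = bytearray()
    for _ in range(samples):
        start = time.perf_counter_ns()
        # A trivial busy loop; its exact duration varies with system state.
        for _ in range(1000):
            pass
        pool += (time.perf_counter_ns() - start).to_bytes(8, "little")
    return bytes(pool)

def seed_material():
    """Mix OS randomness with jitter samples via a hash, so that no
    single weak source determines the resulting seed."""
    h = hashlib.sha256()
    h.update(os.urandom(32))
    h.update(gather_jitter_entropy())
    return h.digest()
```

The attack described would amount to making `gather_jitter_entropy` predictable from outside; if the hash input can be guessed, so can the session keys derived from it.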
Mind you, I don't fancy being around if they start using focused beams of radiation to affect systems within Data Centers.
Tinfoil hats anyone!
But how about...
... a pad that can work independently, but can be dropped into a cradle with keyboard, mouse and screens attached, which works a lot like a PC.
I think that you can already see some of this happening in the netbook space. I was surprised at how well my EeePC 701 worked with a normal keyboard, mouse and screen attached. Looked like a normal computer!
If this is a loitering weapon (unlike a cruise, which hits a geographically static target with high accuracy), what the hell are they using to recognize the target?
For goodness sake, we can't get facial recognition working for passports in controlled conditions in airports, so how is it going to work in a mobile platform with limited weight and power constraints, for a moving target in a poor environment?
More likely, it will need some form of marker like a radio beacon to be planted in the car or on the clothes of the target. This will still need someone to get up-close. Either this, or someone staring into a video screen in Florida or Nevada.
Even things like the current generation of guided smart bombs that can follow targets normally need a beacon or laser illumination (ever wondered why yank infantry keep pointing guns at a target, even though there's a C-130 gunship attacking it [seen clearly in the first Transformers movie, for example]). They're pointing lasers at the target, and the gunship or smart bomb homes in on the reflected coherent light.
For anybody who uses a separate partition for their home directory, it is likely that whatever changes they have already made for Hardy, Intrepid, Jaunty or Karmic will persist across the new OS (along with inappropriate settings for new versions of older software as well, unfortunately!)
I include Hardy, as I am going LTS to LTS releases on my main boxes, missing the intermediate versions.
but seriously. You just can't win. Make no changes, and you are being unambitious or not keeping up with the competition. Make changes, and it is all unnecessary. Thank god it is very easily configurable!
Missed the StrongARM(ed) reference
... which is/was the Digital Equipment Corp./Intel variant of the ARM chip (at least before XScale replaced it, which was then sold to Marvell).
I did not see an XScale pun in the article.
But I believe that when ARM Holdings were set up to guide the development of the architecture, the chip design was re-branded as the "Advanced RISC Machine", removing the Acorn name (at least this is what Wikipedia says here - http://en.wikipedia.org/wiki/ARM_architecture)
I really miss the thinking behind Acorn machines.
re: Doing nothing
I wish that there would be a change of emphasis from "Global Warming" to "Preserving Fossil Fuels".
I'm sure that nobody would deny that the amount of fossil fuels is finite, and it is quite clear that we are depleting the available reserves. We will run out, period. There can be no real denying this. What may be argued is whether it is 50 or 500 years away.
I am generally in the skeptic camp with regard to man made global warming. I believe that man's impact, though present and undeniable, is dwarfed by what Mother Earth can do on her own. I do, however, support whole-heartedly renewable fuels, because when we've used 100 million years worth of gas, oil and coal, it's not coming back in a hurry. We are treating the Earth like a big (zinc-)carbon battery, but we can't just buy another once it is exhausted as we would an Ever-Ready(tm).
I think that we should be preserving oil at least, because it is useful as a lubricant, not just as fuel. We need to balance our energy use with what enters the Earth's domain from the Sun. This is, ultimately, where all our energy comes from in one way or another.
I'm sure that many people would agree once they consider the arguments, and I believe that Governments should switch tack to this in order to persuade the populace to change their behavior.
If this is really what was going on, then it raises the battle between IT security and the crackers to a new level. Sounds like all use of work mail addresses on social network sites should be banned forthwith, as should access to said sites and personal Webmail accounts from work PCs. Makes you think that the security police were right (and painfully obnoxious) all along.
How long before saying who you work for on open sites becomes a workplace disciplinary offence!
Mind you, the presence of zero-day and un-patched known vulnerabilities in the OS and browsers does not help.
Not copied, sold by IBM.
Xenon and Cell are both mainly designed (and manufactured) by IBM. Just shows you how good the IBM Chip design and foundry businesses are at attracting high volume customers. And they also provide the CPU for the Wii, although this is more like a standard PowerPC.
BTW. The Xenon is not a Cell derivative. It has three modified PPE processors. The PPE is only part of the Cell, and the Xenon has no SPEs. Not really much of a comparison.
Check what a Bussard Ramscoop is. It projects a large magnetic field *ahead* of the ship to guide the hydrogen where it needs to go. Looking at the design of the ships, it would have to be a highly shaped field to avoid subjecting the saucer section to a possibly damaging strong magnetic field. And if the same basic design was used in the Intrepid class of ships (like NCC74656 Voyager), then it would have to be even more sophisticated, as the nacelles move (supposedly to make them more in line with the center of gravity of the ship).
As I said before, I don't believe that the design of the NCC1701 Enterprise in the original series was actually scrutinized by engineers or scientists. The function of various parts of the ship was assigned by fans, some of whom ended up writing books. I would be interested to see how many of these works are actually regarded as canon, and have made it into the production bible for the various series.
In order to retrofit one of the later codecs, they would have to compromise the backward compatibility of Theora (which is one of the sacred cows for the Theora project), or else spend quite a lot of time merging the codecs to allow the reference Theora code to use both.
If someone (like Google themselves) were to just put the later codec into an ogg container, (which would be quite simple if the later codec is available in source form), then we could get an open 'standard' created before Theora complete their work. Thus Theora would be sidelined. This is why it would be bad for them.
Google may re-iterate the openness of Ogg Theora, and keep the later versions closed for their own use, or they may open up the latest version.
The problems that apparently prevent some people from accepting Ogg Theora are questions about the possibility of patent infringement, and the performance.
The first option may allow the question of patent infringement to be put to bed once and for all (unless the patents are challenged), but the performance issue would remain.
The second option may kill Theora off completely as an irrelevance.
Either way, it is likely to be bad news for the Theora project. This would be a real shame, as they have been working really hard to produce an acceptable open codec.
You took the words..
..right out of my mouth.
I'm being a trekker cynic here, but I don't believe that ST-TOS (The Original Series) EVER mentioned Bussard Ramjets in the series. This is real science grafted on to the story by fans who were keen to try to fill out the specifications of ships with something approaching real science (Bussard Ramjets being theoretical at the moment).
I actually doubt that this type of propulsion fits, anyway. Hydrogen ramjets are only suitable for sub-light propulsion, and I'm sure that in Ringworld Engineers or Protector or one of Larry Niven's other novels (only a tenuous Trek link, but one of his stories was adapted for the ST-TAS (The Animated Series)), the sciency bit at the end says that hydrogen ramjets will only take you to half the speed of light, assuming that you can eject the hydrogen nuclei at close to the speed of light.
If you look at pretty much any of the Enterprise schematics (NX-01 through 1701-E), I'm fairly certain that the impulse engines (sub-light propulsion) are on the back of the saucer, not in the nacelles. The nacelles are for generating the warp field, which has no current scientific grounding, and nothing to do with Bussard Ramjets.
But there is a strong consensus that the dish at the front of the engineering hull (or on the front of the saucer on the NX-01) is to do with shields that are used when traveling at speed to deflect dust (and presumably hydrogen atoms).
The problem of dust at speed has always been recognized in SCI-FI. EE 'Doc' Smith (one of the original Masters of Space Opera) used to equip his ships with armor several metres (I'm sure he used metres, strange really as he was an American) thick, and he made them tear-drop shape to reduce the drag.
Back in the fray.
I know I said I'd stop, but the latest comments about desktop managers, and the look and feel of some notable Open Source packages, got my attention again.
It is clear that some people here just are not prepared to give any ground to Open Source. The comments about look and feel from Colin Barfoot point out the fact that if there is a different look and feel, there would be complaints about difficulty of use, lack of common interface etc. If the look and feel is made similar to other packages, there are complaints about lack of innovation. There is no common ground that would satisfy these two positions.
Yes, the font handling is not as good as it could be in several places, but the comment about using Microsoft fonts is only true because so many web sites and documents use Microsoft specific fonts (such as Arial and Calibri, for example). If you do not have access to these fonts, because of some licensing conditions, then substitute fonts must be used, which will always look odd unless the metrics exactly match. If document writers and web designers kept to the common subset (and by this, I do not mean the Microsoft CORE font set, because this is licensed), then everything would work much better. This is another example of Embrace and Extend that Microsoft is so fond of. It is interesting that CORE fonts are available for Windows and OSX, but Microsoft have not granted a free-to-use license for the rest of the industry. I wonder why this could be!
I don't know how many people here have looked at the complexities of font design, but to make a font that uses the same spacing metrics, yet is sufficiently different from another licensed font to avoid copyright infringement, is very difficult. And Microsoft changing their standard fonts from one proprietary set to another proprietary set in the latest Office incarnations just reinforces the moving target argument.
The whole technology-led world is currently in such an inward-looking spiral, requiring change for change's sake and revenue generation, and going faster and faster, that it will eventually implode. I hope for all of your sakes that it comes apart gracefully, rather than in a nasty mess.
And for the record, I too have been working in the industry since before the IBM PC was launched, though slightly after Bill set up Microsoft. I've seen and used X10, X11, NeWS, Looking Glass, Sun's pre-X windowing environment, Motif, CDE, Aqua and virtually all versions of Windows. The difference is that I started with UNIX, and I am still earning my living in that environment, extending it to include Linux. This longevity in a single industry sector is something that few can claim.
I see your bullshit, and raise you a reasoned argument
I do not know anybody who welcomes change for change's sake, unless they have too little to keep them occupied!
I deal with real people, not organisations, and most of them accept change as a necessary evil. But many of them don't like it. Think analog TV switch-off, the arguments about DAB, the fact that they're forced to accept Direct Debit for their utility bills or effectively be fined for it, or having to go to two-weekly refuse collections. All of these things have benefits, but still generate resentment.
Just go to the pub, and listen to what people are talking about. I'm sure that you will find people complaining about change all over the place. I do.
And is there a limit to the change you would accept? How about switching the side of the road we drive on, forcing you to change your car as a result? How about a change in your working conditions asking you to do another three hours a week, or altering your pension provision, even if you are told it should have no net effect? How about your credit card raising the interest rate on your account?
Would you be happy to install tracking hardware in your car so that road tolls can be imposed, because the Government told you that it was going to reduce congestion and RTAs, and not just fill their coffers?
I see these as a difference only in degree from Microsoft imposing change requiring users to upgrade. Anyway, beer time.
@I think you're the first...
The reason why Office 2007 sells so much is because the older versions are no longer available. I'm sure that many people who buy it would prefer to buy 2003, but can't.
The fact that they don't think they have an alternative to MSO is also an issue.
I bought Office Home and Student, because my youngest son's teacher would not accept a presentation created in Impress. Not because it was worse, or would not play in Powerpoint, but just because it was not created in Powerpoint. I resent being forced to buy a product that I do not want merely because the education system has bought the MS line, and accepted their advantageous licensing position. The same argument spans business as well.
I am moral. I could have pirated it, but I didn't want to. Neither of my sons likes the fact that they have 2007, when the school only has 2003 (they could get it nearly free, but the school does not have the budget to pay for the technicians to do the upgrade). It's different. Difference generates hostility. Some people may think 2007 is the best thing since sliced bread, but many don't.
This is a fact, and if you don't believe it, I suggest that you talk to people who are not in the IT business as either producers or primary consumers (I mean IT departments of organisations). Poll your children's friends' parents, or your plumber, or even your accountant. I really don't think that the 'everyone' you talk about is as inclusive as you make out.
Of course, you could take the line that if someone does not have enough money to afford MSO, then their opinion is not worth taking, but that would be elitist, don't you think?
I posted about alpha, beta and release candidate for laughs, and judging from the thumbs up, several people understood.
But having just read the comments again, several of your arguments just do not add up. You've used the fact that MS Office sells in large volumes to justify its goodness, without taking into account the self-perpetuating dominance that Microsoft have on the market. Many copies are sold because of FUD or momentum, not because people make a reasoned comparison.
You've also effectively said that because you understand and were willing to learn the ribbon, that anybody should be able to, and if they do not, they are lazy or in some way intellectually challenged. That is far more insulting than anything I've said, including the alpha, beta, rc jibe.
I have said that change taxes some people, but I've experienced lots of people who just don't understand why this constant churn is necessary. They get bewildered by the huge range of options, menus and inconsistencies between packages and different versions of the same package. This is not because they are stupid, but because they use computers as tools rather than the computers being their profession. They want to learn something once, like riding a bicycle or driving a car, not have to relearn it every five years.
The term freetard is very derisive in the way you use it. Not everybody who uses Open Source software is to be scorned. There is much in Open Source that is good, and just because someone gives their work to the community in general is not a reason to sneer. Save your scorn for those people who steal other people's work by not paying for licenses. Of course, it may be that you are one of the people who feel threatened by other people giving away their work for free, but you should only be worried if the quality of the package you write is worse than that in the open. The answer to that is to either get better, or find another career.
I buy good software, and all of the music and other media that I consume. I use Open Source because it often does the job I need, and saves me money. I don't steal software, movies, music or books.
I find the poor grammar, and lack of correct capitalisation, in people's comments notable. I equate it to people having an unreasoned rant, and not taking the time to consider their use of English, which I extrapolate to mean that they have probably not considered the content of their comment either. This was the cause of my 'foaming at the mouth' comment.
You come across as dogmatic, condescending, and often arrogant. If you came across like this during a sales pitch, I would probably quietly show you the door, regardless of whether you tried to impress me with the quality of your designers, programmers, or the result. But I respect your point of view, even if I don't agree with it.
The fact that you can't take criticism, or reasoned argument without descending to insults (and you've done this more than anyone here) probably indicates some type of insecurity.
I'm leaving this particular set of comments now, because I don't think I have anything else to say without repeating myself.
You'd probably still complain, and many people don't really understand percentage chance.
What does an 80% chance of rain mean to you? 80% of the region getting rain, it raining for 80% of the day, or it raining on 8 days out of 10 that they forecast an 80% chance of rain?
I can tell you that it is the latter, but I suspect that you had to read it twice to understand it. If you asked the readership of the Daily Mail this question, you would get most of them saying they don't know, and many of the rest just guessing one of the three.
And the 2 days out of 10 when it doesn't rain will still be counted as the forecast being wrong according to the media, even though the stated chances were correct. The media is fickle that way.
And if you remember your O Level or GCSE maths, you have to allow for short term anomalies. If you remember, when you toss a coin, over a large number of tosses, you will get close to 50% heads and 50% tails, with some freak occurrences of it landing on its edge. But that would not mean that you could not get 5 or even 10 heads in a row. It's unlikely, but it will happen sometimes. This is why it is never safe to bet on averages on a roulette wheel or any other game of chance.
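If you don't believe the bit about streaks, a quick simulation makes the point. A few lines of Python (seeded, so anyone running it gets the same numbers) find the longest run of identical results in repeated batches of fair coin tosses:

```python
import random

def longest_run(n_tosses, rng):
    """Length of the longest run of identical results in n coin tosses."""
    longest = current = 1
    prev = rng.random() < 0.5
    for _ in range(n_tosses - 1):
        toss = rng.random() < 0.5
        if toss == prev:
            current += 1
        else:
            current = 1
        longest = max(longest, current)
        prev = toss
    return longest

rng = random.Random(42)  # fixed seed so the demo is repeatable
runs = [longest_run(1000, rng) for _ in range(100)]
print(min(runs), max(runs))
```

Run it and you'll find streaks well beyond 5 in a row turning up routinely, even though each toss is a perfectly fair 50/50.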
The Met Office already do the analysis of predicted vs. actual weather. This is what they do to refine the model(s) to try to make them more accurate. But as I understand it, snow is particularly difficult to forecast, because very minor changes in the boundaries between the air layers can cause either rain, hail, sleet or snow.
The UK is caught between three major weather systems. You have the Arctic, which is cold air that attempts to push south; Europe, which is fairly static, and at this time of year mostly cold; and the Atlantic, which is very turbulent but quite wet and warm. As a result you have a three-way battle, and I believe that it is one of the most difficult weather systems to predict in the world.
Precipitation is caused when warm moist air meets cold air in some way. The warm air moves up over the cold air, and as it gets higher and cooler, has to drop its moisture. Depending on the temperature gradient, the speed of air movement, and the turbulence at the air boundaries, the water droplets will coalesce and maybe freeze in different ways, leading to all of the possible outcomes.
Sometimes the forecast is easy, and sometimes it is not. Sometimes the warm air is kept south, leading to cold weather with no precipitation. Sometimes it just rolls straight off the Atlantic, leading to wet but warm weather spreading from west to east, and sometimes it diverts north, and then is carried back down south by the Arctic air, leading often to snow. And sometimes cold and warm air meet over the UK, and under these circumstances it is difficult to predict.
It is generally acknowledged that none of the current models give reliable results more than 10-14 days into the future. This means that the short term forecast will be based on modelling, and the seasonal forecast will be based on longer term cycles which can be identified by trend analysis of the past several years of actual weather rather than the air condition models. This makes a huge difference in the way that the forecasts should be used. You would not use the seasonal forecast to try to predict the weather for Christmas day, but you might use it to give an indication of what December in general may be like.
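The 10-14 day wall isn't the forecasters being lazy; it's sensitivity to initial conditions. The classic logistic-map toy shows the principle (and it is only an illustration of the principle, not a weather model): two starting states differing by one part in a million end up completely different after a few dozen steps.

```python
# Two nearly identical starting states diverge rapidly under a chaotic map.
# In miniature, this is why deterministic weather models lose skill after
# roughly two weeks: initial measurement errors grow until they dominate.
def logistic(x, r=3.9):
    return r * x * (1 - x)

a, b = 0.500000, 0.500001   # initial states differ by one part in a million
max_gap = 0.0
for step in range(80):
    a, b = logistic(a), logistic(b)
    if step >= 40:                  # after the error has had time to grow
        max_gap = max(max_gap, abs(a - b))

print(round(max_gap, 3))  # the microscopic difference is now macroscopic
```

Beyond that horizon, the only honest option is the statistical/trend approach the seasonal forecasts use.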
This is my schoolboy geography view of the weather, and no doubt somebody will pick holes in it. I do not claim to be a weather forecaster, but I believe what I've said is mostly correct.
I'm waiting for a maths or geography teacher to pull me up, so I've used the pedantic grammar alert icon!
@agreed vendor lock in
I think your comment about people pirating MS Office is a sad reflection on the morals of people not respecting the value of others' work rather than a comment on the MS Office vs. OOO debate.
People pirate it because they can (and because they do not understand the consequences), and they can then enjoy the benefits of both using MSO and not paying for it. If they were found out, and threatened with fines, or their computer stopping working (remember Microsoft have trusted-computing functionality in Vista and later), I feel that we would have a quite different set of arguments going on here. We really would have a higher penetration of OOO in the market. Microsoft have known exactly what is going on ever since they forced people to install Windows Genuine Advantage. I am certain that it passes all keys for licensed software back to Microsoft. There's nothing in the T&Cs that prevents them from doing this, and lots that says they will.
But MS is not actually interested in the individual with a pirated MSO installation. They will happily ignore the possible millions of copies of MSO installed on certain low end vendor OEM and retail copies of Windows, because they know that it is reinforcing their market dominance AND ensuring that the computer is still running Windows (and also the financial return on taking these people to court would be so small). But if they find a commercial organisation using pirate copies, they will roll out the lawyers in very short order.
If you think that money is not always the driver behind what people run on their computers, then I think you have a skewed view of the market. What people use is a trade off between what they can afford and what they need. In the commercial market, companies think they need MSO, so pay for it. But in the SOHO market, money is an issue. If FUD and compatibility are removed from the equation, they would probably choose free rather than something that costs considerable amounts of money even if it does look less polished.
I refer back to my Aston Martin vs. Ford comment in one of my previous posts as supporting evidence. I take it you are at the Aston, BMW or Merc end of the market.
@Chris Thomas Alpha
Your shrill voice repeating the same ill-informed arguments for rolling over and submitting to Big Business as it owns you is getting tiresome.
I can only hope that the Beta version of Chris Thomas is better, and by the time they get round to the release candidate, the rough edges will have been knocked off, and it will be fit for purpose.
Ordinary users (in the world)...
... do not like change. Change taxes their thinking processes, and even now, many 2007 users I know struggle with what they see as unnecessary complexity. For them, it's not intuitive. What they want is something constant, like the indicator being on a stalk on the left-hand side of the steering wheel, or the menu items being in fixed places around the window.
And I doubt that many employers think that facebook is an improvement (on no facebook) if their employees spend all day on it rather than working! Bit like posting comments to the Register, I guess.
As you get older, I suspect that you too will start wishing that change would slow down. It's a sad reflection on aging and society.
And please stop frothing at the mouth. It does not endear you to anybody.
Don't fully agree!
I respect your arguments, but think that they only apply to part of the market.
If you are a large organisation, and can justify large outlay for beautiful on-screen presentation, then go ahead. Gloss sells here, just as much as in the fashion world, but so does the FUD about moving away from MS Office. But if beauty was always more important than utility, we would all be driving around in Aston Martins, Jaguars or Mercedes rather than the Fords, Vauxhalls and VWs that we do.
But for any number of small organisations, where every penny counts towards making any profit, and the most complex document they produce is an invoice or cost benefit spreadsheet, then beauty on the screen is a luxury they can ill afford. The important thing is the end document, and this will depend on the skill of the person, not the package they are using.
Productivity is an issue, but the numerous changes in interface between different versions of MS Office (especially 2007) cannot be counted as a productivity enhancement in anybody's eyes. I've listened to too many people turn the air blue when they can't find where something is in MS Office 2007.
I understand your point about Photoshop and The GIMP, as this is a product aimed at professional Graphics Designers who appreciate good design, but the majority of MS Word users ARE NOT professional document writers. They just need something to put words on paper. Many of them would probably still be comfortable with correctable typewriters and pre-printed stationery if computers themselves were not so cheap.
I have a cautionary tale with regard to Photoshop in answer to your diversion away from Office packages. My daughter used PS under an educational license on her Mac. Now she is no longer a student, she should re-license her copy, and has found that it will cost her more money to do this than she earns in a month (she is struggling to find reasonably paid work in the field), and more than her Mac cost new! But the GIMP is free. Her choice is not to pay bills for a month, continue using her installed copy against the license conditions while possibly saving up, switch to the GIMP, or not do any computer enhanced graphics. Some choice!
My beef is not with functionality, or interoperability, but with the crass way that Microsoft (and others) lock their customer base in and abuse them with unnecessary updates and other money grabs. This is where Open Office has a place, even if only to remind Microsoft that they are not the only player in town. Do you think that the "Home and Student" edition of MS Office would exist AT ALL if Open Office was not there? Its mere presence affects the market in a beneficial way for end users.
My point was that 3151s were not good terminals. You would not get vi working on a 3151 with a 'bad' compatibility cartridge and the out-of-the-box termcap/terminfo entries on AIX (or any other UNIX variant).
Sorry, my post wandered from the initial thread.
I have always found it to be a real benefit thinking about the content first, and then making sure it is pretty afterwards. Using a text editor is ideal for this. This is not a UNIX bigot's point of view, it's from long experience of writing technical documents using both ways of working.
I have seen too many supposedly good technical writers spend more time fiddling with the format rather than thinking about what they were writing, and then turning in hurried and poorly thought out technical documents just in time for their deadline.
I believe that WYSIWYG was the worst thing to happen to office productivity. Let a text formatter work out how to fit the paragraphs and pages together. They are generally better at it than you and I (at least in the technical arena), and as long as you can tweak it to remove the worst of the uglies, the documents will not look any worse (and may look much better!). And don't talk about style guides. Word's habit of keeping the style when cutting and pasting has led to more font/paragraph inconsistencies in documents I have been given than I can count.
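For a toy illustration of what I mean by letting the formatter do the work, here is Python's textwrap doing the line-fitting. The author supplies only content; the tool decides where the lines break (a trivial stand-in for a real formatter like Troff or TeX, obviously):

```python
import textwrap

# Content first: the author writes one logical paragraph, with no thought
# given to line lengths or layout.
raw = ("I believe that WYSIWYG was the worst thing to happen to office "
       "productivity. Let a text formatter work out how to fit the "
       "paragraphs and pages together.")

# Formatting second: the formatter fits the text to the target width.
print(textwrap.fill(raw, width=40))
```

Change `width=40` to any other column and the same content reflows perfectly, which is the whole point: the text and its presentation are kept separate.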
The only time WYSIWYG is useful is if you are after the full DTP experience for full page layout, like magazine articles or advertising, and you would not be using Office or Open Office for this, unless you are forced to, or are a masochist.
But then, I am from the Troff/MM/MS macros era. My documents may lack some of the niceties (although with tbl, pic, grap, and eqn it's a close call), but they will be consistent from beginning to end, and I can concentrate on making the content correct. This is far more important in my line of work.
BTW. How do you know that the text was not written using vi and cut and pasted! And I would not call this comment window I'm typing in anything more than a simple text editor, more like notepad, wordpad, EDT/EVE or any number of simple text editors than Word or Oowriter.
I think your chronology is wrong. UUCP was probably the earliest, certainly a long time before FTP and the general takeup of home computers able to talk to Fidonet et al.
UUCP over modems was a common communication method back when the modems cost (currency)1000's, and only companies would buy them. The earliest instances of Usenet was UUCP based (UUCPnet?), and worked via emails sent through a connected network of systems mainly using modems. One system would host a group, and would receive posts via email, and distribute them likewise. Everything else evolved from there.
Basic UUCP functionality was included in BTL UNIX Version 7 (UUCP itself dates from around 1976), although it may have had some support in V6 (I've lost my documentation). It's probably in my coat, wherever that is.
My god! One that still works!
These terminals had the most unreliable video system I have EVER seen. The brightness is the thing that fails most, leading to having to peer at the screen with all the lights off. Then there is the flyback suppression that led to ugly left-to-right, bottom-to-top diagonal lines. The power switch breaks, and the clips/screws that hold the mainboard to the case appear to come undone. The built-in tilt foot breaks, and the tilt and swivel base (if fitted, it was an extra purchase) would fall off whenever you picked up the terminal.
And this is just the hardware! The 3151 used IBM-specific terminal codes (i.e. not compliant with ANSI X3.64, Wyse 50/60 or any other terminal I came across). Whilst they worked, there were some really ugly features, like not being able to turn the bold/underscore/flash attributes on or off independently. IBM addressed this with 'compatibility cartridges', which definitely did NOT do what they said on the can. The cartridge for AIX compatibility was supposed to work with AIX (surprise), but in reality, because there were multiple versions, most of which were broken in different ways, it was useless. You had to tweak the termcap/terminfo entries to get them working at all.
And don't get me started on the stupid cables that were the official way to plug them into a PC/RT 6150 or RS/6000: 10-pin MODU or RJ45 to 25-pin D-shell, a straight-through 25-25 pin serial cable, and then a serial interposer (wired differently from a standard null-modem) that stuck out of the back of the terminal just begging to be broken. (Later RS's used 9-pin D-shells, a major step forward.)
The only good feature was, as jake said, that they came with Model M keyboards (a real class act), but this was spoilt by having a stupid RJ11 connector on the end of the cable that meant that the keyboard could only be used on 3151s without hacking them around. I also agree with vi (pronounced vee eye not vye or 6 [think about it!]), but would say that if it suits you, Emacs still works very well on non-graphical terminals.
Thankfully, the 3152 and 3153 were better, but they were still worse than DEC/Wyse/HP and any number of small company alternatives.
My favourite was a small company set up by some ex-Wyse engineers, called Falco. These were amazing terminals, with good keyboards, readable screens, dual serial ports with separate terminal sessions on each, good ANSI X3.64/VT220 emulation (amongst a host of others), and to cap it all, good Tektronix 4014 emulation. And they were cheap! They were the perfect complement to System V systems, allowing all of the AT&T goodies such as S, graph, sag and the tek backend to diTroff to work. The only things that were better, I found, were the Blits (5620/630/730), but these were in a different price league from any other terminal.
Oh well. It's mainly all boring history now, as my work colleagues keep reminding me. Where's the Boring Old Fart icon! I guess a beer will have to do.
Don't compare apples and oranges
You cannot compare the two. Although CP/M was a pig to use in hindsight (who remembers PIP?), it was very like its peers, and possibly a bit better (certainly from version 2.2 onward). It was so good for its time, in fact, that Seattle Computer copied it to create their DOS ('DOS' is too frequently used to be meaningful without context), which was noticed by Microsoft, and the rest is history.
Of course, CP/M was a rip-off of previous systems like Digital Equipment Corp.'s RT-11, which itself was a derivative of one of their PDP-8 OS's.
But in those days, an OS was really an application launcher, pure and simple. A lot of people I knew actually did not use the OS at all, but just created (or got a friend to create) a bootable disk that automatically launched whatever application it was they needed. So you had a WordStar disk, and a VisiCalc disk, an MS Basic disk etc. Once you were in the app. you never left it until you saved your file, and turned the computer off.
Why not just 2 hard disks? One with Win7, the other with WinXP. Do serial checks, one with the XP disk and one with the Win7. Repeat, and see whether anything changes.
Cheaper on hardware, but possibly more time consuming.
Remember that what is stored on the database is just a hash of the DNA, not a complete assay. Nobody is going to be able to clone anything from the database, and identifying health problems is unlikely.
Why not Ogg Theora then?
Surely this should now be a no-brainer.
This is no concession...
Notice the "not-for-profit" caveats. This means that privately produced YouTube videos are fine, but you can't use it for fee-based video services (think Sky AnytimePC, or even, under some circumstances, BBC iPlayer or 4OD), and you certainly would not be able to use it for promotional material from commercial organisations.
This makes it unsuitable for a universal codec.
Sounds like a minor concession to line up future license fee revenue streams to me!
Likening the current Linux kernel to V4-V7 Bell Labs. UNIX on a PDP/11 is like saying that your Ford Mondeo is a re-implemented Model T.
It's true, it has four wheels, a motor, and used a steering wheel, brakes and a gearbox, but there ain't no compatible parts!
Now I'll defend V6 or V7 as being brilliant for their time until the cows come home, but don't suggest that Linux is 'just a re-implementation'. Even if you did, SVR3.2 with TLI or BSD4.4 would be a better reference than anything that ran on a PDP/11 (think communication subsystems).
And anyway, why should Android not be multi-tasking? (OK, I'm blurring the distinction between timesharing and multi-tasking.) Both the timeslicing and the privilege protection are just as useful in a mobile device as on a multi-user computer. After all, the inability of the iPhone and iPad to multi-task is one of their biggest criticisms, and an ineffective security model was seen as Windows' biggest problem. Multiple tabs in Chrome on ChromeOS will probably be implemented as threads which will need to run in parallel anyway.
Whilst BeOS, OS-9 and VxWorks people may think their OS's have significant advantages over a Linux kernel, Linux is not such a bad place to start. The APIs are well understood, the code is open, and you can comfortably remove the overhead you don't need (I remind people that vanilla V6 UNIX on a PDP/11 without separate I&D space HAD to fit in less than 56K of memory!).
Not free.... unless
... you are prepared to run it under Wine. You would be breaking the license conditions, and I believe that IE6 is the last version that you will get to run comfortably, but it works (I dual boot the Laptop I am doing this on, and have a valid Windows COA for it).
D-to-A conversion takes time. You cannot get away from this, and the delay varies inversely with the power of the microprocessor in the decoder. Exactly the same happens with digital TV. Try tuning TVs in different rooms to BBC One, one on digital and one on analogue terrestrial (quick, before it disappears!). And for Sky or Freesat, it's worse still because of the longer transmission path. Nobody has complained about this yet.
I know all about the problems with the time signal, I use it myself (on FM of course), but you ought to realise that there are propagation delays in all transmission systems. When I was involved with radio clocks, there was a map that used to be published detailing the NPL radio clock delays to the extreme edges of the country. This must have been upset when the service moved from Rugby to Cumbria. 200 miles will lead to a measurable delay in the millisecond range. Ignorable if you are setting your watch, but not zero.
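As a sanity check on that "millisecond range" claim, here is a throwaway calculation (free-space propagation speed, nothing to do with the original NPL map):

```python
# Radio propagation delay over roughly 200 miles at the speed of light
SPEED_OF_LIGHT_KM_S = 299_792.458   # km per second, in free space
distance_km = 200 * 1.609           # approx 200 miles in kilometres

delay_ms = distance_km / SPEED_OF_LIGHT_KM_S * 1000
print(round(delay_ms, 2))           # comes out at about 1.07 ms
```

So about a millisecond edge to edge, exactly as remembered: measurable, but ignorable for setting a watch.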
It could be apples and oranges here of course. If you have a decent aerial on a radio that does not move, especially if it is in some form of HiFi, FM quality is predictable and generally quite good. But even in this case, the stereo decoding introduces hiss (try hitting the mono button on your tuner when listening in a quiet environment, and seeing how the hiss disappears). And you must remember that many of the commercial FM stations use dynamic range compression and dead space elimination techniques to boost the quiet parts of the music and make hiss less noticeable. Try listening to Radio 3 or Classic FM if you want to make comparisons.
In a car, there is all sorts of interference, especially when the car is moving. There are drop outs as you move, especially in built-up or hilly areas, and cars are not good environments for electrically sensitive equipment (yours may be well shielded and suppressed, but you cannot control the rusty 20 year old Fiesta that pulls up next to you at the lights!)
In some cases, DAB can eliminate this interference. If you get a good enough data stream, other interference becomes irrelevant (it's digital!), and transmission and decoding hiss disappears. But more often than not, the same interference that degrades the FM signal will also damage the digital signal, and when the DAB receiver does not get enough of the digital stream, it either burbles or just drops out for a couple of seconds, whereas FM may still be listenable.
My guess is that a lot of the people who say DAB is good and FM is bad listen to FM in the car, and DAB in the home, whereas a lot of the people who say that DAB coverage is bad are probably trying to use it on the move, or just in areas of crap reception, where FM degrades more gracefully.
Ah. The same section of the M5 and North Devon link road that I can't get reception on then.
But for me, it extends north all the way up the A396 until I get nearly to the coast, when presumably, I start picking up the Welsh multiplexes.
This is subjective, but I would say that a high bit rate DAB station when the signal is good sounds better than FM. But the problem is that only a small number of the stations actually broadcast a high enough bit rate, and you cannot depend on a good signal.
In general, I would prefer to listen to hissy, uninterrupted FM in the car than to a DAB station that keeps dropping out for seconds at a time. But at its (infrequent) best, DAB can sound superb. I listened to a carol service on Classic FM (on DAB - 160kbps) in a quiet environment through decent headphones recently, which was simply breathtaking in its clarity, dynamic range and lack of noise or digital artifacts. Very rare, but a good indication of what is possible.
Blackspots. I'll say!
I have a 50 mile drive to work. I can get Radio 4 (which surely must be one of the stronger stations) on DAB for the first 5 miles, and the last 2. At one other point during the drive, I may just about be able to get a signal good enough to recognise the broadcaster, but not what they are talking about.
This is on a properly fitted, car specific DAB radio. And about 13 miles of the journey which is DAB dead is on the M5.
I'm sticking to FM, even though I want to listen to Planet Rock and BBC7, and I invite Peter Mandledroid down here anytime to see whether he would find DAB acceptable as an FM replacement (I would even refrain from haranguing him about much of the dross he says in public during the process)!
That's not what DPI is for. Any fool can identify BitTorrent traffic by looking at the port numbers and the first few bytes of each packet, but DPI should be able to categorise what is being carried in the BitTorrent stream. This should mean that they can work out that Ubuntu 9.10 is OK, but Avatar is not.
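To show just how little "looking at the first few bytes" involves: the BitTorrent peer handshake starts with the length byte 0x13 (19) followed by the literal string "BitTorrent protocol". A toy check on a raw payload (a sketch, nothing like a real DPI engine) is one line:

```python
def looks_like_bittorrent(payload: bytes) -> bool:
    """Crude protocol identification: the BitTorrent peer handshake
    begins with 0x13 followed by "BitTorrent protocol"."""
    return payload.startswith(b"\x13BitTorrent protocol")

# A peer handshake matches; an ordinary HTTP request does not.
print(looks_like_bittorrent(b"\x13BitTorrent protocol" + b"\x00" * 8))  # True
print(looks_like_bittorrent(b"GET / HTTP/1.1\r\n"))                     # False
```

Working out what the torrent actually *contains* is the hard part, which is the point being made above.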
But I'm not sure you need to use DPI for this (at least with torrents), as all you need to do is join the leechers and grab the first few blocks to work out what is in the torrent. Of course, if this was known to be being done, you could build an image that is copyrighted material sandwiched between something that is not, but that just makes it an arms race between the community and the ISPs. And before anybody starts talking about encrypted torrents, remember that to be usable, the leecher has to be able to decrypt what they have downloaded, or else it is not worth doing.
But I am worried by "40 per cent of Virgin Media customers will be monitored for illegal music sharing, but those involved won't be told". I would suggest that there is a "yet" to be added at the end.
BEEB and Teletext
It went much further. Acorn made the Teletext adapter for the BEEB that allowed you to open teletext pages as files on a file system from BBC Basic. One would open a page by specifying the page number as the filename, exactly as you would a file on disk, and then read a record that corresponded to the entire screen, and decode the information inside your program. I am sure that the adapter also cached some of the pages so you could get fast access.
Was fun to play with, but I could not really see a real application for it. I guess it was really an early example of a "Screen Scraper".
I could not get the hang of the locking graphics modes for Mode 7, which allowed you to specify separated and contiguous graphics characters and colours. The person who thought this up (for Prestel and Ceefax, before the BEEB came along) must have had a seriously deranged way of looking at things. But it was a hardware mode, implemented by the display hardware (an SAA5050 chip), and gave much clearer text than the all-points-addressable modes (the character cell was something like 15x10 compared to 8x8 in the graphics modes), meaning that text was very clear even on cheap televisions, the screen only used 1K of memory, and all 8 colours, plus flashing colours, could be used.
Was a clever way of maximising the usable memory in a machine that looked under-provisioned for memory even when it was launched. Ah, the memories.
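The memory saving is easy to show with back-of-the-envelope arithmetic (the Mode 2 figures are from memory, so treat them as approximate):

```python
# Mode 7 (teletext): one byte per character cell, 40 columns x 25 rows
mode7_bytes = 40 * 25            # 1000 bytes, just under 1K

# For comparison, the BBC's Mode 2 bitmap (160x256 pixels, 16 colours,
# two pixels packed per byte) needed twenty times as much screen memory
mode2_bytes = 160 * 256 // 2     # 20480 bytes = 20K

print(mode7_bytes, mode2_bytes)  # 1000 20480
```

On a machine with 16K or 32K of RAM in total, giving up 20K of it for a bitmap screen hurt; 1K did not.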
Lamborghinis maybe, but you could probably trademark Lamgorhinis yourself.
The monopoly on 360/370 family mainframes is, as Henry Wertz points out, not enforced by IBM. The plug compatible manufacturers simply left the market. It was as though everybody but Ford stopped making cars.
I've heard it expressed that the 370 was the first open computer platform. IBM published the 370 Principles of Operation (POO), which documented the complete workings of the CPU, channel structure and I/O processors, and that enabled the whole PCM market, exploited by Amdahl, Fujitsu and Hitachi, to name just a few.
Acer's core business?
Problem is Acer is not in the OS business. What would they put on it? Android? Chrome OS, Windows Mobile? Or maybe whatever Microsoft is touting as a tablet OS at the moment?
The Windows solutions won't give it the same WOW as the iPad (at least to fanbois), and Android or Chrome OS will probably need some development by Acer to make it attractive to customers.
I would actually like to see a device like the iPad running WebOS, but I doubt that Palm have the spare cash to develop such a thing. Tablet, pen/touch integration, Multi-tasking, a long history of developer-friendly application environments. I believe that it would be more usable than what we have been told the iPad will deliver. Still, we can dream.
Not so! I used it only last month, but it was for a nostalgia trip!
But the real question here is when is an obsoleted protocol/service no longer required. I'm sure a lot of people would like to see ftp deprecated, but it's not going to happen for a while yet.
You've not been reading the Reg properly. HTML5 is far from done and dusted, as it relies on the underlying codecs being in the browser, and there are arguments about H.264 and Ogg Theora.
BTW, H.264 is not open source. Even though it's freely available, it is patented, which is something quite different.
It's relative. If you have a road-warrior desktop replacement which is over three years old, three hours would be excellent! If you have a current netbook with all of the power saving features enabled, it's not. Remember that the eeePC 701 (the original 4GB SSD one) was only quoted as having about 3.5 hours of battery life with the high capacity battery, and that was only three and a bit years ago!
My trusty thinkpad T30, which is about 6 years old, gets about 1hr 20min. When it was new, the handbook said that it should have been 3-3.5 hours. If I still got 3 hours I would regard it as excellent, but as I only use it on batteries infrequently (part of the problem), it's no issue (and being a Thinkpad, I could always get a Chinese replacement battery for about 30 quid if it was).
Rocks chucked at a planet...
...by things called "mass drivers", have featured in sci-fi like "The Moon is a Harsh Mistress", Babylon 5, and even the anime series Gundam (the original, not the numerous follow-ups).