The lawyers will have to look at the wording of this judgment carefully. The way I quickly read it, any technology that facilitates the location and downloading of copyrighted material could fall foul of the way it is worded.
If the critical part was the filtering, then any service that allows a partially wildcarded search including words such as mp3, avi, mpg may be covered by this.
I'm thinking search engines in particular, but even things as mundane as an FTP server could be seen to be covered by "providing direct infringers with a product that enables infringement".
You would have to ask whether ARPA and UC Berkeley are also liable for laying the groundwork for the Internet as a whole!
..because the rights include the sheet music, the right to perform and the recordings.
I *think* that it was the sheet music and the right to perform the songs published by Northern Songs that was sold (and if I remember my history, it was NOT any of the Beatles who made the decision to sell; it was Allen Klein and the board of Apple Corps who were in control. The members of the Beatles had a 50% share in Apple Corps, but crucially the casting vote in any split decision went to the Chair of the Board.)
I believe that Sir Paul has since bought at least some of these rights back, even though they had always retained the right to perform the songs themselves.
The ownership of the recordings always remained with EMI, and was one of the cornerstones of their profitability. It is interesting to note the rash of newly mastered copies and compilations that appear to be hitting the shelves in the run-up to the copyright on the recordings expiring. Is this one of the things that is bringing EMI down?
And Rik, why the obfuscation about their names? What's wrong with Paul McCartney, John Lennon, George Harrison, and Richard Starkey?
Here I am!
Long filenames on UNIX appeared in the Berkeley Fast Filesystem in BSD 4.2 around 1983. In AT&T releases up to SVR3, you still had the original limit of 14 characters overall, including dots or any other characters (UNIX does not have, and never had, the concept of a three-character extension).
Around 1987, when SVR4 (and soon after, OSF/1) appeared, pretty much all UNIX vendors had either dropped, or had plans to drop, the original Version 7-derived version of UFS for one based on the BSD FFS.
So yes. UNIX had it before Win95 AND OS/2.
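For anyone wondering where the 14 came from: a Version 7 directory was just an array of fixed 16-byte records, 2 bytes of inode number followed by a 14-byte NUL-padded name field. A minimal sketch in Python (with made-up data) of that layout:

    import struct

    # V7 directory entry: 16-bit inode number + 14-byte NUL-padded name.
    V7_DIRENT = struct.Struct("<H14s")

    # Packing silently truncates the name to 14 bytes - the whole "limit".
    raw = V7_DIRENT.pack(27, b"a_very_long_filename")
    ino, name = V7_DIRENT.unpack(raw)
    print(ino, name.rstrip(b"\0").decode())  # -> 27 a_very_long_fi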
Damn. Trying to be too clever for my own good. Of course I meant Theora!
Actually, I was not agreeing... at least not directly. As I read the license, H.264 is a cash-cow for the alliance, and is in no way free like Ogg/Theora. End users are unlikely to have to pay according to the terms and conditions of the license, but content providers and codec suppliers will be, without any doubt. In addition, it can be a throttle on the acceptance of free software.
As I tried to point out, this puts it on a collision course with the Open Source purists, meaning that people like Red Hat and SuSE are unlikely to include support for it in their distros, because if they ship (or even make available) more than 100,000 copies, they become liable for considerable license fees, with no way of recouping these from their end users. This *may* be OK for the larger distributions (although I doubt it myself), but it puts an onus on any distribution supplier to track the use of their distribution.
Given the viral nature of Linux distribution (download it, burn it, give copies to your friends, distribute it via torrent etc.) it becomes impossible for any distro supplier to do this. Maybe you could track new systems appearing on the 'net through some spyware, but can you imagine the furore that would result!
What would happen is what happens now for MP3 and DVD: the distros are shipped without H.264 support, with easily available instructions on how to add it from repositories outside the control of the distro owner. This will be a serious barrier to the adoption of Linux by people who just expect their computers to work out of the box. This is why Canonical have bitten the bullet and paid for a license: they want to be able to ship 'it just works' versions of Linux.
So this is a sticking point, and will just reinforce the notion that Linux can never be mainstream.
I defend my FUD comment, because putting a notional Sword of Damocles over the head of anybody who uses Theora is just that, notional - unless you have explicit evidence of patent infringement, of course. As I pointed out, there cannot be any proof that any piece of software does not infringe someone else's patent, and this applies to H.264 as much as Theora. The only difference is that the MPEG LA consortium that owns the H.264 patents has a larger set of resources to fight any action, with money and additional patents to enter into cross-licensing agreements with anybody prepared to take them on.
Too many people concentrate on the fact that Flash is used to deliver video, and forget that it is actually a multimedia scripting language that does so much more than just video. It was never intended to deliver video; that is just an interesting sideline that became possible when systems became fast enough to do it.
I don't like Flash even knowing this, but HTML5 and H.264 will not do the interactive applications (I've come across Flash newsletters and estate agent brochures) and games that Flash delivers. I've even come across a comic book delivered as a Flash download.
The only thing I have seen that comes close is Silverlight, and I think that even the harshest critic of Flash will probably not dump it in favor of Microsoft's proprietary alternative (and yes, I have tried Moonlight, which always seems 1-2 major releases behind), and I have not seen any statement about Silverlight being supported on iPod/iPad.
1. Is there a guarantee that the licensing of H.264 will be renewed using the existing Ts&Cs with known changes? Well, yes, but the charges are clear: an increase of 10% at each five-year renewal is almost guaranteed, people like the BBC and Sky are *currently* liable for license charges for videos over 12 minutes long, and it is quite possible that Canonical will be liable for license charges if and when they ship over 100,000 copies of Ubuntu in a single year (which is why they have entered into an agreement).
The same is true for a rival to Apple who may ship over 100,000 media players using H.264 in a year.
2. There is a HUGE difference between *knowing* that there is a future patent/licensing claim, as is the case with H.264, and suspecting that there may be one but not yet knowing, as in the case of Vorbis (I don't think that the Ogg container is likely to be patent encumbered; at least nobody has been talking about it yet).
If you play this FUD card, then you must acknowledge that any piece of shiny new software is a potential patent infringement, because that is the way that the patent system works. It is not possible to know every nuance of every patent still in force, and absolute proof of lack of infringement is not possible even if you pay megabucks in patent searches. It is still possible that someone may claim that H.264 infringes on a pre-existing patent.
This shows the essential weakness of the patent system, that it is impossible to prove a negative (it is always easier to prove that something has happened, as opposed to that it hasn't, and never will).
The FSF do have a war-chest for defending Open Source projects published under the GPL, although it is the case that Vorbis is dual licensed under LGPL and the BSD License. I'm sure that if there was a challenge to Vorbis, this would grow, especially if commercial organizations start using Vorbis more than they currently do.
WEP yes. WPA, probably no.
WEP can be cracked if you gather enough packets (but 90 seconds when you are in range is probably not enough time, even if you engage in aggressive packet injection).
WPA/PSK: you have to gather enough packets during the initial key setup using the fixed key. This window is very short. Once the keys start changing dynamically, you have very little hope regardless of how many packets you snarf, because by the time you have enough, the key you are trying to crack has changed. And if you are using WPA/TKIP with a RADIUS server, for example, you do not even have the initial window of opportunity.
I realize that what I say here is simplistic, and there are known attacks for both PSK and TKIP, but in general they take tens of minutes, and I don't think that the Google cars or bikes were traveling that slowly.
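One illustration of why that handshake window matters so much: for WPA-PSK, the pairwise master key is derived from the passphrase and SSID with 4096 rounds of PBKDF2-HMAC-SHA1, so even with a captured handshake, every dictionary guess has to repeat that work. A sketch in Python (the passphrase and SSID are made up; a real attack would also need the nonces and MICs from the captured 4-way handshake):

    import hashlib

    def wpa_pmk(passphrase: str, ssid: str) -> bytes:
        # 4096 PBKDF2 iterations per candidate passphrase - the attacker's cost.
        return hashlib.pbkdf2_hmac("sha1", passphrase.encode(),
                                   ssid.encode(), 4096, 32)

    print(wpa_pmk("not my real passphrase", "MyHomeAP").hex())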
It is the case that there is an implied trust in anybody to whom you give your information, whatever it is.
I think that what Mark said is quite true, although a bit strongly worded (this is subjective, as many people, especially when they are young, use the f word freely in everyday speech without any specific meaning). I mean, why would you trust someone who you don't know with any sensitive data?
Anybody who believes that a third party does not have the ability to abuse their data is clearly too naive to be allowed to use such systems.
And in my mind, this extends to people like Google and other cloud storage organizations. Do you really trust them with all of your documents and emails about *everything*? What recourse do you have if your sensitive or commercially valuable IP leaks, whether intentionally or by accident? If they could be sued for loss of reputation, or damages through commercial loss, I'll bet that they would all close down, or become subscription services bound both ways by contract, very quickly.
You sure you don't mean Saito or possibly Major Kusanagi with Sniper software rather than Batou? He's more like heavy infantry, where size is more important than accuracy.
When I switched from Red Hat (original, not RHEL) to Ubuntu as my main distro, it was exactly this dilemma that bothered me. I was not looking for an Enterprise release specifically, merely one that I would not have to upgrade every 6-9 months, but which would remain current enough that I could still get packages to compile.
Fedora became too volatile, and (I'm afraid), I was not in the market for paid support, which made RHEL unattractive to me.
I selected Ubuntu (then fairly new, I jumped on at Dapper), and have mainly stayed on LTS releases although I did put Jaunty on a netbook.
My experiences are that Enterprise or LTS releases have good and bad points.
If you remain too far behind the curve, it actually becomes quite difficult to add compile-from-source applications, but you do get good availability and stability. As my day-to-day system is a laptop into which I plug all sorts of miscellaneous hardware to try to get it working (I was ahead of the releases for WiFi, 3G broadband, TV adapters, HomePlug adapters), I needed to be able to take what was currently being worked on, and try it. This became impossible if you fell too far behind the mainstream.
If, however, you follow the curve too closely, then for a period of time after an initial upgrade, you may have stability and functionality issues. Like many users, I had quite a challenging time when PulseAudio became the preferred sound system.
My answer is to remain on the previous LTS release until the new one is 3-6 months old. This allows you to remain fairly current, but avoid many of the teething troubles. I'm looking at Lucid on one of my systems, but will not switch from Hardy yet.
Of course, many enterprise systems will be installed for specific applications, rather than for general purpose use. For these, upgrade the system in line with the application. Once you have it stable, leave it for as long as you can (security updates excepted), and only consider an OS upgrade if the application requires it, or the OS drops out of support.
To all of the people who are complaining about major applications changing with upgrades: what the heck! Just install your favorite *in addition* to the new one. They are likely to still be in the repository in most cases, and work as before, unless the package owner upgrades it significantly (I still rue the day that xmms became xmms2).
OK, so you need to authenticate. Possibly we should give everybody a physical means of identifying themselves (after all, numbers and ID strings can be copied), and mandate a way of electronically reading these securely on someone's own PC.
So you are now supporting ID cards, with card readers, attached to PCs with trusted and supported operating systems with DRM built in - say Windows, either Vista or Windows 7. This is what the industry advisers who will be engaged by any government will say. A win for Microsoft and the PC makers, don't you think?
What happens to people who can't or won't invest to do this? There will not be sufficient demand for polling stations, so would you install PCs in Post Offices or libraries (oops, none of these left), or possibly pubs (rapidly going the same way in small villages)? Will we have an underclass of people who can't vote because they live in the country and have limited transport options?
I really don't think this is what you want.
R not 'created' in 1996, more like copied
R is an open-source re-implementation of the tool S, which was part of AT&T's software toolchest long before 1996 (Wikipedia says 1975), and was originally developed by Bell Labs.
This is acknowledged in the documentation for R.
I was using S in 1988, and it was not new then. I was interested in it because it has a number of similarities with APL (A Programming Language) that is often cited as the first interactive computing environment.
How much of a problem?
Although I am a long-term technical specialist, I'm (almost proudly) mostly web ignorant (I can use it well, but don't expect me to be able to write any HTML or XML without a tool - Hey, I'm a core UNIX specialist, not a web designer!)
Looking at it with the eye of a novice then (hand-grenade time, this is flame bait), what is the problem with IE6?
I know, I've read that it does not conform to standards, its HTML implementation is poor, and web designers generally bitch at it all the time, but from a user's perspective, pages appear when asked, and it works in places where Firefox, Chrome, and Safari still don't (I run Windows 2000 Pro in VirtualBox, and IE6 under Wine, and also have Windows 2000 Pro in a little-used partition on my laptop for those awkward sites that insist on IE as a browser, especially when they need Silverlight or WMP as a backend for media - must update to XP to allow IE7, it is licensed for that).
I don't want to buy Windows 7 for my laptop, which is still working, so I don't need to change it. And I'm sure that many people who are mostly infrequent users feel the same. Ubuntu does 98% of what I need on a 2GHz Pentium 4 mobile.
I would actually like to see web sites that are strictly bound to one browser, or which are so PC unfriendly by splattering large flash animations across their pages, or which require a 1280x1024 minimum screen to display, blacklisted by the community for a few days. As a Linux and Firefox/Chromium user on laptops with 1024x768 screens, this would be a far better protest as far as I am concerned.
What I am also appalled at is the attitude of Giorgio Sardo, who obviously has an agenda of selling Windows and, indirectly, new PCs. Microsoft should be forced by legislation to provide modern browsers on their legacy OSs for as long as there is still a sizable number (say, greater than 25% of the whole) of such systems in common use!
If IE6 stops working, then I suspect that there will be an anguished cry from tens of thousands of users, and a sudden lack of space in the Electrical section of the local municipal dump, not a large number of people installing Firefox.
I used to keep all of my ripped audio in Ogg Vorbis on my laptop. That was until I started using a high capacity media player. The time (and space) required for transcoding from Vorbis to MP3 for large parts of my collection began to get annoying, and due to a strangeness in Amarok and the transcode plugin, I ended up with both Vorbis and MP3 copies on my laptop. This ended up confusing Amarok, so I have now switched to all new rips saved in MP3. I don't like it, but I value my time and disk space more than a principle.
Software algorithms are not industrial processes. They can be innovated by you in your office or bedroom, by Johnny when he is not in school, or by your wife if she has the skill. It's low cost, and doable by almost anybody.
Having software patents prevents you from doing this, because if you unknowingly infringe on a patent, you are not allowed to use your own innovation. Are you prepared to do the searches to make sure that that clever snippet of code that morphs your cursor when you move over an icon or window does not infringe any one of thousands of patents? And what if you want to show off your extreme cleverness to your friends - are you prepared to indemnify them against possible lawsuits?
It's not the same as the way of physically holding data on a CD, or the process of masking transistors onto silicon, or any one of the hardware related patents you hold up as examples, because as an ordinary user, you will not be in a position to produce an industrial process in the same way as you can write software.
Patents can and should protect and encourage innovation, but the whole system has been corrupted to allow large corporations to make sure that no-one else can innovate. It is possible to own a patent, and then to grant an irrevocable right of use without fee to anybody. This is what everybody is hoping that Google will do with the patents they have just acquired.
More and more merchant receipts show only 4 of the 16 digits. It's stupid to print them all.
What I want to know is...
If an ATM defaults to reading the mag stripe, where is the PIN stored? Is there a one-way hash algorithm in the ATM that reads a key from the card which, together with the PIN, can be used to generate a non-reversible cryptographic signature whose authority can be checked in the ATM?
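Purely as an illustration of the kind of scheme I mean (hypothetical - real ATM networks use DES-based PIN blocks and hardware security modules, not this), in Python:

    import hashlib, hmac

    def pin_signature(card_key: bytes, pin: str, challenge: bytes) -> bytes:
        # One-way: the signature can be verified, but the PIN cannot
        # be recovered from it.
        return hmac.new(card_key, pin.encode() + challenge,
                        hashlib.sha256).digest()

    # All values made up for illustration.
    print(pin_signature(b"\x13\x37" * 8, "1234", b"atm-nonce-001").hex())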
I would prefer to have different numbers for the same card for ATM and Merchant Services transactions. This would be much safer than using the same number everywhere.
Reason for not handling card
is not so they can't read it, it is so they cannot put it through a card skimmer. There are not that many people with eidetic memories (for goodness sake, I can't remember a new phone number for more than a few seconds).
I'm fairly certain that a high enough res. camera or two would be able to capture the name, dates, long card number, and the security code on the back even if the till operative did not handle the card. This is enough to use the card for Internet transactions.
The scam used to be to skim the card, send the details to a country that does not have UK chip and pin, clone the card, and use it to pay for goods in that country. And if you are also able to grab the PIN by shoulder surfing, you can use the cloned card to get cash out of a non-chip-and-pin ATM abroad as well.
Now that all you need is the visible information from the card for card-holder-not-present transactions, the whole system is open to abuse. This is the reason why we have Verified by Visa and the SecureWhatever-it-is for Mastercard for Internet transactions. But none of this is needed for card payments over the phone, so don't make them.
The insistence of banks that the PIN must be kept private should be communicated to retailers who put their merchant devices on fixed installations in plain sight (Tesco, I'm singling you out here, but I'm sure that most other supermarkets are also guilty of this). I'm certain that I could, with reasonable accuracy, observe the PINs of the two customers ahead of me in the queue on most occasions. This makes the whole system a joke.
You don't get an idea of the size of this monster until you get to the pictures of the touchscreen being used. I'm certain that because of the size alone, this system will never appear on my laptop replacement shortlist. I suppose I should have guessed, it having a 17" screen, but if it is so wide, why have they not made space for the missing keys?
They use this information themselves to check that you are still authorized to use ADSL, so it is no hardship for them to log it.
co.uk not clear.
Not sure I agree. Whilst computer programs are not patentable, it is still being argued whether a software technique can be regarded as an invention, and thus patented. UK and EU law appear to be at odds here.
Also, while you may have a standard, there is also nothing that says a standard is not encumbered by patents. I do not believe that H.264 is either a free or an open standard in the generally accepted meaning; it is just that the license fees have been waived until 2015. This should ring alarm bells for anybody with half a brain cell.
I'm fairly certain that most flash videos are H.264 encoded with a flash UI wrapped around them. This calls a decoder in the flash runtime. This is one of the reasons why the performance is so dire on non-windows platforms as Adobe show no real interest in anything other than the mainstream.
If we intend to keep the low power/cost end of the computing platforms alive (such as phones, pads and netbooks), we absolutely need the decoder part of a codec in the browser, not just language interpreters that allow you to run a decoder.
I've been playing around recently trying to use a different backend for flash video, specifically using mplayer with the correct modules for flash video. This works great (and much faster), until you hit a site that tries to query the version of flash in the browser plugin (like iPlayer), whereupon it falls down in a heap.
Can we tell where this is going yet?
This targeting of Ogg/Theora is the most blatant example of standards land-grab by patent that we have seen so far.
It would appear that Microsoft/Apple et al. are not deploying their patent IP to generate income at this time, but merely to stifle an alternative technology that may deprive them of an effective monopoly.
I say effective, because the cross-licensing that big IP holders engage in has the ability to deflect anti-monopoly legislation, because a consortium of co-operating companies is not deemed to be a monopoly under the current rules.
Of course, once they have this effective monopoly, they can then leverage it for revenue generation. We can only hope that Google is prepared to defend the codec that Theora is based on.
As pointed out, if Microsoft and Apple are successful, then it is a grim portent of what is to follow.
USB wireless keyboard
Plug in a normal keyboard, drop into the BIOS during startup, and turn on Legacy USB support. Then reboot to see whether Grub understands the keyboard.
Grub is a minimal OS where size is a real issue. It relies on the BIOS settings being right!
I know that this is a trivial change, but the default background for Lucid reminds me of the early days of colour television, when the tube would become magnetized, leading to unpleasant blotches of colour.
It's different, I admit, but not pretty by any stretch of the imagination.
Troll alert! David Lawrence
I hope that this was deliberate flamebait!
The only reason I've had to do something like this in the last 5 years on Dapper or Hardy is when I have tried to get some hardware working when the vendor has not done anything to make it work under Linux themselves.
Remember, when you install a new piece of hardware on Windows, you have this nice shiny round thing with the hardware (it's called a driver CD), that the vendor has put a lot of work into to make it work in Windows. If they bothered to do the same for Linux, you would never have to touch the kernel. Try getting an HP printer working in Windows without the install diskette. It's nearly impossible if it is a printer Windows does not already know about.
For joe average, who wants to write documents, or browse the web, or even plug a printer in (unless it's a Lexmark - spit), everything they need is likely to already be in the distribution, or at least in the repository.
And claiming that you have to reboot twice is rich if you are coming from a Windows environment. Just installing a system from Windows media and then doing the driver loads will require you to reboot your machine many, many more times on XP - and even on Windows 7 - than on a recent Linux (I just built a Windows 7 system over Christmas myself. Easier than XP by far - nearly as easy as the Lucid install I did on my workhorse laptop this morning!)
Go on, give it a try. Build a Windows and an Ubuntu system from scratch, and report back to the thread if you dare.
re. strictly for geeks...FAIL FAIL FAIL with a side order of FAIL
If there is one thing that UNIX-like OSs do *MUCH* better than MS, it is the consistent directory naming system.
Remember that Linux, like UNIX, is a multi-user OS, so all of *YOUR* files should be kept under *YOUR* home directory, not scattered across the *SHARED* part of the filespace. This is what makes it possible to have UNIX like systems share their userspace across a networked filesystem, compared with the absolute CRAP of roving profiles in networked Windows systems.
If you look under your home directory on Lucid (or most of the earlier Ubuntus), you will find a directory called Desktop, one called Documents, one called Music, one called Pictures, one called Videos and one called Downloads. They are all yours, and will never be interfered with by another user logging on using a different name. If you want more, just create them in your home directory with whatever names you like (I have a local bin and a local lib and a local tmp just for me, but then I am a dyed in the wool UNIX user).
This is one of the fundamental strengths, for as a non-privileged user, you only have write access to the files under your home directory, and a restricted number of temporary directories. So if (and when - even on Linux) you get exposed to some malicious code, only your files are at risk, not the system as a whole!
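You can see the point about write access for yourself; a trivial check in Python (assuming a stock desktop layout):

    import os

    # An unprivileged user can write under $HOME and the temporary
    # directories, but not to the shared parts of the tree.
    for path in (os.path.expanduser("~"), "/tmp", "/usr/bin", "/etc"):
        print(path, "writable" if os.access(path, os.W_OK) else "read-only")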
and wash the cars
No. They teach the children, run the libraries, clean the streets, collect refuse (unless this is outsourced), inspect the environment, enforce parking regulations, grit the roads, examine planning applications, man the help lines and front office services, manage the care home provision (and in some cases run them), run the leisure centres and swimming pools, see children safely across the road, maintain the street signage, run the electoral roll and elections, collect and manage council tax, and on top of all of that, they manage themselves and their presentation to the people.
And I'm sure I have missed a whole lot out.
I might have mixed up local and county council responsibilities, but the councils actually do a huge amount to maintain our society. Whether they do it well or not is another matter...
If they were able to make a value decision themselves about benefit vs. cost, then I would agree with Mark65.
But they are being told in no uncertain terms by a meddlesome government what they must do, with strict timescales and financial penalties. Their hands are tied, they generally do their best (which may not actually be very good, because councils are not in the Internet business), and probably spend more money than they need on failed work and private sector 'consultants'.
Councils can be run like a business, although one with a tied customer base. It is believed by government and their (paid) advisers that this is the way councils should be run, as private sector management *MUST* be better than public sector. The problem with this (as you have suggested) is that councils are not really commercial businesses, and will only lose customers if they actually move geographically.
Please look at the scale
What are you comparing a council to?
If you do like-for-like comparisons between a county council with maybe 30,000 workers serving around 400,000 households with 850,000 residents (these are the approximate figures for Norfolk) against a corporation with 30,000 employees and close on a million customers, I think that the figures may be surprising.
How much do you think a small bank, or possibly IBM UK, spends on its web site? I'm sure that the comparison would be very interesting. I would not be surprised if the councils spend less.
And look at the services they are being forced to supply (by government regulation) on the web, even if only to tell people their rights and entitlements. Housing, social services, schools, care provision, refuse collection, environmental health, roads, planning, enforcement of regulations, business rates, council tax, court services, local business development. And I'm sure I've missed many out. All the information has to be correct within guidelines.
It's a big, big problem that is quite beyond the experience of most people to comprehend (and probably most councils). This leads to the problem being treated as an elephant task, one bit at a time, which as we all know leads to inefficiencies.
Southwest One is an example where private-sector companies come in and manage to spend more money doing less than the councils ever did.
My god (if I had one), you're a better man than me!
Never sleeping or eating or going to the loo. You really never have an opportunity to log out!
I think you need to move your long-running tasks into a batch process that can run divorced from a GUI, and learn to set up your environment quickly.
I have met members of the "it takes too long to set everything up when I log in" brigade, and all I can say to them is to stop whining, and learn to automate their post-login setup process. I tend to use a personally enforced logout/login to keep the number of windows I have open manageable.
I think we have identified a REAL geek here! Obviously does not have a Significant Other who complains about the noise produced and electricity consumed (while using the tumble drier even when the sun is shining!)
I agree it does not NEED shutting down (see my comment on my firewall), but I pay huge amounts to the electricity company already.
What I am hoping for now is a power supply that draws less than a watt when in standby, a motherboard that will respond correctly to WOL requests, and an ACPI subsystem which allows Linux to suspend and resume correctly. I can start my RS/6000 44P remotely, so why have I not been able to get any of my Linux boxes to do the same?
I'll look at this again with Lucid, as I've recently replaced the motherboard in my workhorse system.
Maybe I should have checked, but...
...I was taking the article at face value, and it says that the patches were rolled into the release candidate. The release candidate is supposed to be what is released unless something seriously bad is found.
The reason why it is important is because there was bad press and lingering stories of woe for a couple of weeks after Karmic was released, and Canonical cannot really afford the same for Lucid.
This is newsworthy because we are two days away from GA, and this is not a bug in a beta, it is a bug found in the Release Candidate. Generally speaking, it is unusual to get late fixes in the RC, because of the alpha and beta programs.
One question, however. I know why you might want to keep a server on for months at a time (my firewall system gets restarted about 4 times a year when I have a loss of power), but why stay logged in for all that time? Every time you log out and in (at least on Dapper, Hardy and Jaunty, and various RedHat and Mandriva distros), the X server is re-started, recovering the lost memory.
This makes me feel uneasy
Surely, playing around at this level could well cause cancers and teratomas (some of the pictures of these just make me feel sick!).
After all, it is a breakdown of the controls on cellular division that cause both of these conditions.
OpenOffice is not written as a commercial product. The fact that it is so good is a tribute to the people who have contributed their time and other resources, and the companies who believe that they can make other sales as a result of being apparently altruistic (Old Sun comes to mind). They are entitled to give their effort away if they choose.
But just giving away software is not a business model, which makes it an unfair comparison. In this respect, I do actually agree with Microsoft. They have invested effort producing the software, they are entitled to get reward for their effort if people want to use it. It would be their right to give it away, but it is not a duty for them to allow anybody to use it. It is more an argument of value and worth.
Now I'm not saying that people should stop using OpenOffice, but just that they be aware that free at the point of use does not mean free to produce. I also disagree with the prices that they charge, but I agree about their right to charge something.
You could almost turn the tables, and claim that Microsoft's commercial product is being undermined by the supply of a free alternative. This is very similar to the argument that Mozilla used when arguing against Microsoft giving away Internet Explorer (OK, OO is not an integral part of any OS, but that was not what was initially argued).
Not that simple
As I remember it, the HP/Intel tie-up looked like a good fit, even though in retrospect I would say that Intel took HP for a ride.
At the time HP decided to jump to a joint-developed chip with Intel, it was engaged in an arms-race with IBM. Tim said that it was the Itanium that made IBM invest in Power, but in actual fact this investment pre-dated the Itanium by at least 5 years. IBM bumped development of Power, PowerPC, RS64, and Power2-7 at various times, but it has been an almost continuous process if we overlook the stumble that happened with the 64 bit PowerPC 620 processor.
The original RIOS-based IBM POWER system, the RISC System/6000, was launched in 1990, and had been under development for at least 5 years before that. The driver was to be an industry leader in the Open Systems marketplace, as IBM had at last recognized that there was money to be made.
When first announced, the RS/6000 model 530 killed everything on the market stone dead, it was so much faster. HP had PA-RISC running in their MPE/iX line at the time, but it was not a single microprocessor, being built from discrete logic. The RS/6000 caused a huge stir, both because it was so much faster, and also because IBM put significant marketing weight behind their new systems. Sun were immediately knocked off the top of the workstation market, and never really managed to get back up there, and DEC invested heavily to try to produce a really hot chip in the Alpha, which, as a result of the need for speed, was significantly flawed and never really delivered on its promise.
HP rushed systems to counter the RS/6000 based on the single-chip implementation of PA-RISC, and running HP/UX. These were the HP 9000 models 720, 730 and 750, and the race was then on between IBM and HP to see who could have the fastest system. This reached its peak in the late 1990s, when some models of RS/6000 had marketing lives of less than 6 months.
This was tremendously expensive, and HP, who did not have a big chip-fabrication division, valiantly struggled to keep up, but were ultimately doomed to fail.
The way I remember the Itanium being pitched was that Intel were going to take on the development of the PA-RISC single microprocessor replacement, keeping most of the instruction set, but putting in features that would allow the processor to also run x86 binaries, and enhancing the x86 architecture for 64 bit. Intel would get access to HP's IP for the PA-RISC (which included high clock rate silicon and cache IP), and would use their considerable chip making skill to drive the product forward. HP would get a class-leading processor to keep their workstations and servers going. At least that is what was said by Intel.
What actually appeared to happen was that they designed Itanium to be their own processor, with less emphasis on making it a PA-RISC replacement, and more on trying to make it an upgrade path for 32bit x86 servers. They delivered it late, and the product did not live up to their claims as either a PA-RISC replacement, or a 64 bit x86 migration path. Intel attempted to use some of the IP to produce high speed x86 processors, but botched it with the Pentium 4, which was ultimately a dead-end.
Because of the delay, the world in general, and HP in particular, started looking elsewhere. HP appeared to lose interest in the UNIX marketplace, allowing both their own products and the subsumed products from DEC/Compaq (and to a lesser extent, Tandem) to fall into the legacy category. They produced Itanium-based servers, but they were never up there with IBM, except in the very-large-system market. Only customer pressure has kept many of the OSs alive.
In the meantime, IBM has been left with the only non-Intel/AMD UNIX offering that was actively being developed, and as a result, has kept market share. Even though there has been no real competitive pressure, IBM has used the convergence of the AS/400 and RS/6000 lines, and to a lesser but significant extent the z series, to move the architecture forward. They have borrowed from other IBM systems (and their competitors) to introduce type 1 hypervisors, hosted application partitions, and pretty much unrivaled virtualization capabilities. The supported filesystems have scaled, and the support for other technologies such as SAN and SVN has gone hand-in-hand with other IBM products.
...and made sure it was really hot
I'm off to find a teacake, and some perspective.
Bit of a strange one today...
I'm struggling to work out whether the PFY was the cause of Simon's disappearance, or whether he did it on his own. I mean, if the PFY had actually attempted to dispatch the BOFH, how come Simon managed to claim the accident compensation (after all, you cannot make a claim for yourself if you are 'dead'), especially if you keep having your life support machine broken.
I feel that this story could have been much more (although maybe it is, and I am just commenting too early).
And why the requirement to stitch the PFY up with a potential fraud if you intend to let him back to work? It would make him more likely to try to get a 'promotion', especially if Simon is using that information for some form of blackmail.
Anyway, hoping that there are more details of this story to come.
What a climbdown
Matt Bryant, using an amazing piece of ass-covering, claimed that his original post was just troll bait, rather than a real comment.
Unfortunately, the more sensible members of the Register commenting community were able to see through this with apparent ease, identifying Matt as one of the trolls that he claims to be targeting.
Is there a course on writing ambiguous headlines?
Because this would be a prime candidate for a case study.
Completely accurate, but you have to actually read the article before you understand what it means!
Sounds like a way for early adopters of the iPhone who have resisted the urge to upgrade to recover some of the value by making a market for their second-hand devices.
I expect an upsurge of these early iPhones appearing on eBay.
@AC. My bad.
Intel only announced that Atom was running Android last week (I appear to have missed it - must have been because I had a busy week doing some real work).
But as to Atom and Windows, may I point you in the direction of what is happening in the NAS space, where Atom is already pushing ARM out. Almost all of the recent devices run Windows Media Server on Atom, possibly because this is of use to people with Windows PCs, but also because Atom is sufficiently low power that the ARM advantages are being eroded.
If Apple were to buy ARM and restrict advances in the architecture, do you really not think that the real winner would be Intel?
Not an immediate collapse
ARM do not produce processors. They license the technology, and guide its onward development.
The current generations of ARM processors made by people like Qualcomm and Marvell would still be produced under existing licenses that Apple could do little to change (unless the Qualcomm, Marvell etc. lawyers were asleep at the wheel).
What they could do would be to stifle innovation for new developments and licenses, keeping the best for themselves and allowing the rest of the world to struggle on with what they have already.
But I would hold up what happened to MIPS and Alpha (and to an extent PA-RISC) as examples of what happens when a company not in the primary business of designing processors has control of a processor architecture. And SPARC may be going the same way (do you really think that Oracle are interested in investing significant sums to progress SPARC beyond what we have already?).
I rue the day when Intel and AMD (lumped together because they produce code-compatible processors), and possibly IBM if they decide to continue developing Power, are the only game in town.
ATMEL are ARM licensees. They have a series of ARM7 and ARM9 based products.
Below this, they have the AVR Microcontrollers, which are much simpler beasts.
Do you think that there is much overlap? I don't think so.
So they have the fab, and they have what looks like their own range of micro-controllers. How long would it take to develop their 32-bit AVR to provide the ARM functionality? By which time, everybody else would have switched to Atom or whatever lower-power processor Intel has in the pipeline.
This is bad news
An independent ARM allowed the processor to become ubiquitous. If Apple buy and then restrict ARM technology, it gives Intel a clear playing field to clean up.
And with Atom comes Windows...
Google's going to have to port Android to Intel!
Is this a reference (gripping hand) to "The Moat Around Murcheson's Eye", aka "The Gripping Hand", by Larry Niven and Jerry Pournelle?
If so, well done that man! Excellent book!
Answer to your first question
On paper. The paper system still exists, and will continue to do so for those exceptions (like not having an Internet feed) that will continue to exist in the future.
Just because Internet filing is the preferred route does not mean that it will be the only one.
...cannot be used for this type of wake-up remotely, unless the system generating the WOL packet is either on the same physical (wired) network, or there is some form of MAC level routing set up.
By definition, WOL packets have to be at the MAC level (if the laptop is off, DHCP cannot allocate IP addresses), and these do not route through standard IP routers.
And this also means that the laptop cannot be on a wireless network, as WOL does not work over an 802.11abgn network.
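For anyone who has not seen one, a magic packet is trivially simple: six 0xFF bytes followed by the target MAC repeated sixteen times, sent as a broadcast precisely because the sleeping machine has no IP address to route to. A sketch in Python (the MAC address is made up):

    import socket

    def send_wol(mac: str, broadcast: str = "255.255.255.255") -> None:
        # 6 x 0xFF, then the MAC repeated 16 times: 102 bytes in total.
        payload = bytes.fromhex("ff" * 6 + mac.replace(":", "") * 16)
        with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
            s.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
            s.sendto(payload, (broadcast, 9))  # port 9 (discard) by convention

    send_wol("00:11:22:33:44:55")

The broadcast stays on the local subnet, which is exactly why it does not cross standard IP routers.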
My suspicion is that when the laptop is on, there is a VPN set up to allow the laptop to access and be accessed by the school systems, regardless of the network, routers and firewalls between the school and the laptop. This will be a software VPN, which relies on the OS, which means that the system has to be on.
Unless someone has produced a LAN card that is integrated in these laptops that does IP and VPN actually in the NIC when in standby mode. If they have, I suggest that these would be laptops to avoid, as who knows who would be able to snoop.
How long ago
...was your last Linux install?
I've put Hardy and Jaunty on lots of systems, and generally it just works.
Almost every wireless card I've used (and I've got many rattling around in drawers at home) is recognized without need of a vendor supplied installation disk.
The last problem I had was the hacked Atheros chipset in my EeePC 701 with Hardy (fixed by a specific module from the community), but by the time Jaunty came along, it worked without problems.
What impressed me recently was when I took my mule system, and replaced the motherboard, which resulted in different processor, support chipsets, graphics adapter, memory, network - well pretty much everything besides the wireless card (it's a deskside system some distance from the core of the home network) and the media peripherals.
The existing Hardy install (yes Ubuntu 8.04 - two years old, but kept up to date) barely batted an eyelid. It recognized the onboard Nvidia graphics (it previously had an ATI AGP card), asked to install the correct driver for it, and came up as if nothing had changed. It just coped with the fact that the support chips changed from a VIA set to an Nvidia nForce set, or that the processor changed from an AMD Athlon XP to a Pentium Dual Core.
The last time I did this with Windows XP, I had so many problems, mainly because the Windows 'you've changed your machine, are you still entitled to run Windows' checks caused me to have to call Microsoft to re-authorize the retail version of XP (which is allowed to be moved between systems as much as you want). And the specific IDE drivers for the original motherboard refused to let me access the optical drive to enable me to load the correct ones from the driver CD packaged with the new motherboard to fix the problem.
I've not used Vista, but I built a Windows 7 system last Christmas. I was genuinely impressed by how easy it was to install, and it is clearly a step change from XP. The install I did was on pretty much generic hardware, so I would hope that it would be quite easy.
But comparing the installs of Linux and Windows is largely bogus, because almost nobody outside of the technical community actually installs Windows on any system. They buy it pre-installed, and just use it until it becomes so cluttered and slow that they discard the whole system. To somebody who has never installed a system, it will always be a traumatic operation to partition their disk and install a completely foreign OS with no experience of building systems. This probably explains many of the 'tried it, found it so difficult that I just switched back to Windows' type of comments.
Many of these people would find a second or third install so trivial compared to the first that they would change their view.