59 posts • joined Saturday 27th October 2007 22:08 GMT
Re: progressively more disenchanted
Amen. How is any salesman, teacher, techie, OEM supposed to play for the team when no one knows what's going on? I love to search for data. I hate to search for applications. Is that so hard for Shuttleworth to understand? I have a monitor a mile wide. I have absolutely no use for an OS designed for a smartphone here.
"A leader with vision is better than one without."
Sometimes, but not always. I've heard of Hitler, Mussolini, Idi Amin, Gaddafi and some others who had plenty of vision but were just headed in the wrong direction. Shuttleworth seems intent on solving a problem that doesn't exist while ignoring the elephant in the room: the only thing blocking more widespread adoption of GNU/Linux is retail shelf-space. Canonical made great progress there before Unity came to be. The GUI is not the problem. Trying to stuff a desktop into a smartphone is.
Re: Mark, you keep punching that straw man...
The issue is not just about developers but OEMs and ordinary users. I don't know any who share Shuttleworth's idea that Apple and M$ are absolutely wonderful. In fact, many OEMs and retailers are despairing of selling Wintel and are seeking an exit. When the battle is almost won, Shuttleworth seems to be disbanding his army.
Where has he been the last decade? Even my toddler granddaughter can run XFCE4 on Debian. If she doesn't need Unity, who does? Shuttleworth should stick to supporting OEMs and not designing software. What's Unity as a fraction of GNU/Linux, 1%? Why does he feel so important? He's not and the world will move on without him. While millions of PCs are shipping from OEMs with Ubuntu, millions of others are shipping with other distros and the hobbyists still visit Distrowatch.
Net Applications Data Has a Bias to Business Usage
Net Applications shows California has 9.64% GNU/Linux share. If you ask for California - Sunnyvale you get 2.93%. The difference? Google's 10K employees switching to GNU/Linux a couple of years ago. We know there are whole school divisions and others using GNU/Linux but they count as nothing because they are not businesses connecting from business domains to business sites during office hours. There are more than 30 million people in California. Is that bias or what?
Dalvik is not a Java virtual machine
Despite Oracle's claims, Dalvik is its own virtual machine running its own language, not Java. The apps that run in Dalvik are usually written in Java language and cross-compiled/translated into Dalvik's language. Oracle has no patent on virtual machines so they have no basis to complain about Dalvik.
No licence is required to run your own virtual machine. Everyone does it and it is no business of Oracle's.
What's with all the angst about thin clients ?
I have been using small cheap thin clients for years and the performance is better than most desktop PCs: logins 5s, open big app window < 2s.
As for cost, what's the cost of 100 hard drives versus a few big ones on a server? Power consumption? Freight, space, whatever the measure, thin clients have better price/performance ratio.
It must be that you anguished guys are using that other OS. GNU/Linux rocks with thin clients largely because the files users need when they click something are already cached in RAM, which is orders of magnitude faster than a hard drive.
This VDI stuff where the OS and all the data is sloshed over the LAN to get any work done makes the network a bottleneck. If you use good old LTSP and such, the data and the application are together with no latency.
Thin Clients are Suitable for a Wide Range of Applications
TFA: "thin client solutions are not appropriate for a wide range of business activities"
Not so. Only full-screen video, with its large bandwidth requirements, is a no-no for thin clients, and even that can be done for a few users per server. For most other uses, bandwidth is actually lower for thin clients than for traditional PCs with files on the server: it is much less effort to move a little text and a few pictures over the LAN than a bunch of data files. Thin clients may increase the average or minimum load on a network but the peak loads will be much less.
We can also look at the functionality of the working parts of a PC to see the waste. Besides energy, look at the wasted expenditure on hard drives. If you have 100 PCs with 100 hard drives instead of 100 thin clients with far fewer drives on the server, the advantage is obvious. Same for CPUs. Why have 100 powerful CPUs idling on thick clients when you could have a low-powered CPU working reasonably hard on the thin client and a few powerful CPUs working hard on the server? Thick clients just make no sense. The presumption should be that all client machines will be thin unless there is a particular reason to go thick.
Moore's Law will allow us to push more processes into the server room, but nothing will ever recover the resources wasted on thick clients. You can break even on the cost of a changeover to thin clients in a year or two on energy alone, and the immediate returns in lower maintenance and longer life are so solid this technology should be the norm.
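As a rough illustration of the energy arithmetic (all the wattages, hours and rates here are assumptions for the sketch, not measurements; the 30 W thin-client figure is of the order quoted elsewhere in these comments):

```shell
# Back-of-envelope energy saving per seat, thin vs thick client.
# Assumed figures: 150 W thick client, 30 W thin client plus a
# 10 W per-seat share of the terminal server, 2000 h/year of use,
# 10 cents/kWh. All illustrative; plug in your own numbers.
THICK_W=150
THIN_W=30
SERVER_SHARE_W=10
HOURS_YEAR=2000
CENTS_PER_KWH=10

SAVED_W=$(( THICK_W - THIN_W - SERVER_SHARE_W ))        # 110 W per seat
SAVED_KWH=$(( SAVED_W * HOURS_YEAR / 1000 ))            # 220 kWh/year
SAVED_DOLLARS=$(( SAVED_KWH * CENTS_PER_KWH / 100 ))    # ~$22/seat/year
echo "Saved per seat: ${SAVED_KWH} kWh, roughly \$${SAVED_DOLLARS} a year"
```

Energy alone will not always pay for the changeover in a year or two; add the reduced maintenance and longer equipment life and the case gets much stronger.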
Where I work, the cost of old PCs is so low that we do not buy new thin clients but the performance increase obtained by using old PCs as thin clients of fast new servers is all the justification I need. My users boot and login twice as fast as they used to do with thick clients and I have almost no work to do to keep them running. It is a clear win.
Where I work we have a lot of good PCs 6-8 years old. We had one power supply fail out of 80 PCs. They are a bit slow running XP so we put GNU/Linux on them. We bought a few new PCs and put GNU/Linux on them. When we want performance we use the old machines as X clients of the new PCs so everyone gets a piece of large RAM, fast CPU and fast storage. They make excellent thin clients and we only need to upgrade a fraction of our PCs to stay current.
The "custom" of changing PCs frequently is incredibly wasteful but very profitable for Wintel.
Only M$ has billions taken from locked-in buyers and can afford tons of research into cute, useless features to amuse users. Only M$ has made an office suite a conduit for malware and a burden to open standards everywhere.
No Problem for Me
If you run on the old-fashioned thick client, you get what you deserve. I use a GNU/Linux terminal server because I like to share... My OpenOffice.org binaries are already in RAM when a user clicks on an icon and using the shared memory features of 'NIX OS, everyone gets to use the single copy so my window pops open in 2s or less. My login to a useful desktop is only 5s, unlike that other OS.
Eat your heart out, folks. This is the 21st century. Do your computing the right way and you can enjoy the benefits of modern hardware.
I work in places with lots of old systems and no record-keeping for licences. When that proprietary stuff needs to be re-installed, I just replace it with FLOSS equivalents. It simplifies my life greatly. If a certificate of authenticity is missing from a PC, I replace the OS with GNU/Linux as well. I am instituting backups/imaging/record-keeping so this may be less of a problem but FLOSS is much easier to manage. I only need one image of GNU/Linux that will run on all our PCs. I need four images of XP. When the time comes to kill XP, I can use the imaging system to deploy GNU/Linux in an evening. This proprietary stuff is too much work. I want to work for my employer, not some software vendor.
Virtualization is Big
It is a major tool for consolidation of servers and it is magical for thin clients, a very successful form of virtualization. Thin clients really save capital cost of equipment and power consumption because one server can run hundreds of $250 thin clients each using 30 watts or so. Where I worked last year, they could run all their services on one or two machines with virtualization. The saving in hardware and power consumption and space would have been huge. If you get improved security from virtualization it is a huge plus. We do not need more servers as much as we need more services. Virtualization does that very well.
Thin Client Terminal Servers
In this area, virtualization has been paying off handsomely for most workloads: point, click and gawk. I can reduce service calls on hundreds of PCs while concentrating on a few servers. The servers can be fire-breathing dragons with all the modern resources users want, while I do not have to run all over the building finding keys, fitting into schedules and such. For most workloads the server can carry a large load and still be responsive, and the end-user gets better performance than on the usual thick client with slow discs and per-user malware fighting.
The folks who want to cling to XP longer are going to love virtualization. They can protect the virtual machines using state of the art stuff and use XP indefinitely. The folks who migrate to GNU/Linux are going to be in for a treat. You can run many more users on a GNU/Linux terminal server than with that other OS.
Glad To See So Many Posts
It means Ubuntu had lots of installations.
Mine was pretty smooth except my virtual machine did not have a virtual monitor so X was stuck in 800x600 by default. Supplied an /etc/X11/xorg.conf and it is beautiful. No other problems. Writing a book with LyX at the moment. Very nice.
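For reference, the fix amounted to a hand-written /etc/X11/xorg.conf; a minimal sketch of the relevant section (the identifier and mode lines here are illustrative, adjust to the actual virtual hardware):

```
Section "Screen"
    Identifier "Default Screen"
    SubSection "Display"
        Modes "1280x1024" "1024x768"
    EndSubSection
EndSection
```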
Use Thin Clients
A lot of the advice given above applies to thick clients and servers but seems to ignore the huge benefits of thin clients: lower power consumption, footprint, noise and MAINTENANCE. Fanless thin clients will run until the screen resolution has evolved past their limit. They last several times longer than a thick client and avoid the need to walk around. They also cost less to buy because there is less material in them.
Once you are using thin clients you can concentrate on the servers and do whatever you want to keep them updated. Because one server can operate many thin clients this is a much lighter task. The entire system has much better performance because a server can specialize in some application and be tuned for it. You can cache all the files in RAM, for instance. You can use huge RAID or SSD to boost performance, too, at much less cost than optimizing the many clients.
Use thin clients. The savings are huge. I like GNU/Linux on thin clients and terminal servers to save on per-seat charges for licences.
GNU/Linux on Thin Clients
If it is five years old or older consider using it as a thin client. Replace all dead machines with new thin clients. Use simple X window system for all thin clients on the LAN. Replace all ten year old machines that have managed to survive with new thin clients because their hardware is becoming hard to drive.
Replace/repair/update all servers every few years. That is where the performance is kept.
With this recipe we get state-of-the-art performance at least cost and we do not have to do anything special because Wintel wants it. We may start using ARM thin clients this year.
Don't Even Try
If you approach bosses as an extortionist you should be fired. You can keep records of failure rates but until the boss's PC dies, he will not care.
A much better approach is to persuade bosses to convert to thin clients for half the cost of an upgrade of PC hardware thick clients. Then you can upgrade the software on a few servers and let the thin clients last a decade with fewer problems with bosses. It is much easier to persuade bosses to upgrade/consolidate a few servers every few years than hundreds or thousands of thick clients and the bean-counters will be impressed with reduced power consumption and maintenance cost.
This recipe is doubly cost-effective if you can run GNU/Linux on the thin clients and terminal servers, as there are no per-seat licensing costs. There may be a few apps or users that are difficult to move this way, but the overall system will be in much better shape operationally and financially using a mixture than using the expensive solution everywhere.
TFA has a few things wrong. GNU/Linux is in double digits whether people know it or not. M$ admitted to 7% in Steve's presentation to analysts, and M$ is not even counting GNU/Linux thin clients accessing M$'s Terminal Services. 10% of the world's PCs are thin clients and a lot of them run GNU/Linux, so the truth is closer to 10%. GNU/Linux passed MacOS around 2003 and has had a good rate of growth since. If the reader doubts this, answer this question: why did M$ subsidize XP to the tune of $2 billion if GNU/Linux was not breathing down their neck? No answer? I thought not. Not many ARM netbooks are running that other OS. GNU/Linux works on them and they are ramping up production. The usability issues are gone on a well-configured OEM installation; the eeePC showed that, and many others were snapped up by consumers. The current low attach rate of GNU/Linux is due to M$ being willing to forgo profit to keep the monopoly a bit longer. They will no longer be able to do that after Christmas. The armies of ARM netbooks will take a bite and M$ will not run "7" on an ARM netbook.
TFA completely ignores the fact that GNU/Linux is on fire in the BRIC countries where hundreds of millions will soon buy netbooks when the price drops just a little more. Once ARM production satisfies the need, no one will be laughing at GNU/Linux any longer. Some of the new ARM chips will give Intel a run on notebooks, desktops and servers in a year or two.
Change is happening. Claiming it is not happening may make some people feel better but it is a lie. The monopoly is ended. M$ does not have enough money to buy everyone off.
GNU/Linux does not need to succeed on the desktop but it will happen because people want inexpensive PCs etc. that just work. M$ has failed miserably to fill that role. People do not need DRM, phoning home, and malware. Those make money for M$ but rip off the end-user.
GNU/Linux Can Specialize on Everything
Really. We have 200,000 FLOSS projects. We are much bigger than M$. They have more salesmen than coders.
The world of IT is worth much more than M$. They bought out netbooks and it cost them $2 billion. Let them buy out ARM netbooks. That will cost them another $2 billion. As many billions as they have, they cannot afford to buy us all. M$ cannot provide free IT to the world and still be a ripoff monopoly. By the time "7" sorts itself out, M$ will be a normal corporation scrambling to survive.
2009 is the year of GNU/Linux on the desktop. By the end of 2010, M$ will be struggling for share on the desktop. Virtualized desktops can access web applications and do everything without all of M$'s baggage. Red Hat and IBM see that and will sell lots of GNU/Linux desktops to business. OEMs will sell many GNU/Linux desktops to emerging markets like the BRIC countries.
Ask yourself why M$ is bothering to slander GNU/Linux with major retailers. They are trying desperately to hold onto their last stronghold. Everyone who has seen a GNU/Linux netbook will know a lie when they hear it. The jig is up.
The region of the images involving the balloons contains a steganographic message for the Technological Evangelism department of M$ in each region. In the USA no evangelism was thought necessary. They are pretty thoroughly locked in. In Germany, the message was too large so they needed a larger region, hence the different image. Munich has finished buying M$'s licences/protection fees so the team was dealt a scathing reminder of the consequences of failure.
GNU/Linux is Bigger Than MacOS
TFA is very informative but puts illegal copies and MacOS ahead of GNU/Linux as competition. GNU/Linux has more than double MacOS's share of PCs. GNU/Linux does not have to improve to grow market share on low-priced PCs. Emerging markets and newcomers to the PC market are very price-sensitive and GNU/Linux is the clear winner on price. Do not forget ARM, where M$ is not a player.
In Russia and at a few OEMs, consumers are offered a clear price for GNU/Linux and can choose it. For cost-conscious businesses and schools moving to thin clients, GNU/Linux is already a good choice; thin clients are 10% of PCs in use, about 2% of production. GNU/Linux is attacking on several fronts.
The big takeaway from TFA is that consumers will not be able to understand the pricing scheme. People like simple choices. This is not simple. GNU/Linux is simple. It is the same price whether you run a mainframe or a netbook, $0.
Two Out of Three Isn't Bad
Larry got it mostly right. Thin clients are 10% of installed base of PCs and they save users a bundle while giving improved performance on GNU/Linux or UNIX systems. The Java thing did not pan out but we have everything we need: speed, power, low costs.
More than 80% of users of PCs could use thin clients very well. They work well in place of the desktop so about half the market for PCs could go thin eventually. I would use thin clients anywhere folks did not need mobile computing or video/heavy graphics. Browsing/editing takes care of most tasks in business, government and education.
Immunization is Like That
You get a tiny pain but long-term protection.
I have migrated many groups. The small ones are easy. You can hold their hands. The large ones are more difficult. You have to explain logins and issue passwords to everyone and you cannot meet most of them.
I like the approach of Extremadura. They swooped in on a weekend and the users had new desktops to work on Monday. Sweet. I will bet that caused consternation but the improved performance afterward makes it acceptable. One needs to give the end-users reasons for the change and show them immediate benefits. It really does help to issue new monitors/keyboards/mice at the time of migration. The end-user does not expect everything to be the same on a new system.
You can go to the opposite extreme like Munich where from conception to finishing the migration is taking 8 years. The equipment will need replacement about the time they finish... chuckle...
Somewhere in the middle is the best compromise between cost of migration and pain. However it is done, it will pay off sooner or later. I like to go with thin clients and GNU/Linux. Often the payoff is instant because the cost per seat is about half. One can either double the number of seats for the same money or buy lots of toys with the savings. It is all good. Many of my clients go from XP on 5-10 year old machines to GNU/Linux running on a quad-core 64bit server and really appreciate the improvement in performance. They do not appreciate switching to that other OS next release and slowing down...
Moore's Law Does Mean Prices Fall
If the number of transistors on a chip double every 18 months or so and the cost of producing a chip does not double, the cost per transistor drops. In a competitive market, the other guy can use this to drop his price per chip so you have to drop yours, unless you have a monopoly. Intel made sure they retained their monopoly, illegally.
Moore's Law permitted going to larger wafers and multi-core chips. More transistors per mm^2 increases the incentive to put more on a chip/wafer. It does lower costs/prices per chip.
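The cost argument can be sketched numerically (the chip cost and transistor count are made-up round numbers; only the ratios matter):

```shell
# Moore's Law sketch: transistor count doubles every ~18 months while
# the cost of a finished chip stays roughly flat, so cost per
# transistor halves each period. All figures illustrative.
CHIP_COST_MILLICENTS=1000000   # a $10 chip, in millicents
T_MILLIONS=100                 # 100 million transistors today
DOUBLINGS=4                    # ~6 years at 18 months per doubling

COST_NOW=$(( CHIP_COST_MILLICENTS / T_MILLIONS ))   # per million transistors
i=0
while [ $i -lt $DOUBLINGS ]; do
    T_MILLIONS=$(( T_MILLIONS * 2 ))
    i=$(( i + 1 ))
done
COST_LATER=$(( CHIP_COST_MILLICENTS / T_MILLIONS ))
echo "Cost per million transistors: ${COST_NOW} -> ${COST_LATER} millicents"
```

In a competitive market that 16x drop flows through to prices; under a monopoly it flows to margins instead.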
Did anyone else notice that when AMD temporarily created a monopoly in AMD64 chips the prices for their top processors were a bit high? That was a legitimate monopoly: they innovated. Intel innovates but also bribes folks to retain its monopoly. There's the rub. If Intel's products were better than AMD's, Intel would not have had to do that; no one would want to sell AMD chips.
Install GNU/Linux Once and You are Good
"cheaper alternative, in my experiance, has been to re-install Windoze every 6 months or so. It's surprising how just a Windoze re-install will improve the speed of a PC."
Yes, I have seen this too. However, if you install GNU/Linux you can run for years with better performance and no slowdowns. Last night I installed Debian Lenny GNU/Linux on an old PC. It was loaded with undetectable malware which was clogging the uplink with something: download volume = upload volume. Download speed went from a few kilobytes per second to more than 100 kilobytes per second and the desktop was very snappy. The owner should be able to run for years on that 1.4 GHz CPU.
A few months ago, I upgraded the motherboard and hard drives on my personal PC. I did not need to re-install, just copied all the files to the new storage systems. The OS was Debian Sarge GNU/Linux installed in 2004. To upgrade to Etch, all I needed to do was type apt-get update;apt-get dist-upgrade after switching to the new repository. Debian refreshes itself. No need to do much every 6 months except ask it to do so. Further, when you re-install that other OS, you need to re-install drivers and applications separately. With GNU/Linux, if all your stuff comes out of a repository, the package management software does it all.
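The release switch boils down to two steps; a sketch (the mirror URL is illustrative, and the actual apt-get commands are shown as comments since they need root on a real box):

```shell
# Step 1: point /etc/apt/sources.list at the new release.
# Simulated here on a sample line rather than the real file:
OLD_LINE='deb http://ftp.debian.org/debian sarge main'
NEW_LINE=$(echo "$OLD_LINE" | sed 's/sarge/etch/')
echo "$NEW_LINE"

# Step 2: let Debian refresh itself (run as root):
#   apt-get update && apt-get dist-upgrade
```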
If time is money, the time you save on re-installing and re-re-rebooting is worth far more than the cost of a new PC when you run GNU/Linux.
A liquid fuel and a fuel cell system can easily go 100 km. One can make methanol, for instance, from biomass pretty easily; I do it as a science experiment for kids (using a fume-hood, of course). Ethanol or any liquid fuel rich in hydrogen can work.
Lots have switched to GNU/Linux
Brian wrote: "I would like to see a major company take the Linux plunge and see the difference "
e.g. The French police saved 50 million euros.
e.g. IBM, Novell, several banks and lots of governments use GNU/Linux.
It is good that IT is an industry dedicated to doing things faster, better and cheaper. The bad and ugly are present but we can work around them. If you look hard, you can always find someone selling what you need instead of what the bad guys want you to buy. By making the right choices, we can cut our cost of operation by a factor of two or three and increase the useful lifetime of equipment and software. The bad and the ugly can only stay in business if we let them push us around.
I Say It's Past Time the World Dropped M$
It is closed-source software. Only M$ can fix what they made. They won't fix it in a timely fashion and there are too many vulnerabilities altogether. Depending on M$ to provide software for the world's computers is foolish.
It is time to go with FLOSS. At least with FLOSS we can fix anything that goes wrong and because the software is better designed there are fewer vulnerabilities.
"Microsoft Windows XP Professional SP2, Vista, and Server 2003 and 2008 does not properly assign activities to the (1) NetworkService and (2) LocalService accounts, which might allow context-dependent attackers to gain privileges by using one service process to capture a resource from a second service process that has a LocalSystem privilege-escalation ability, related to improper management of the SeImpersonatePrivilege user right, as originally reported for Internet Information Services (IIS), aka Token Kidnapping."
For Pity's sake. They cannot get the basics right and then bury stuff ten levels deep in a GUI. It's a house of cards. Get out from under it before it falls on you.
2009 is the Year of GNU/Linux
GNU/Linux is on a roll. There is news that the French have realized huge savings with it. The trolls on the GNU/Linux boards are being drowned out by the weight of rational users of GNU/Linux. M$ is laying folks off and selling more vapour-ware. The recession is forcing many to opt for lower cost after waiting out Vista. Thin clients are steadily advancing and GNU/Linux works well on them. The netbooks continue to be a bright spot. BRIC countries are hardly slowing down in their adoption of GNU/Linux. Large businesses and the retail buyers are about the only customers M$ can rely on these days and the retail customers are seeing some GNU/Linux on the shelves.
The recent study by IDC shows a huge shift in sentiment. A couple of years ago KACE did a similar study. These results are consistent with those and show that GNU/Linux has matured on desktop and server. With virtualization/thin computing GNU/Linux continues to shine. By the end of 2009 there will be more than 10% of desktops using GNU/Linux. Some parts of the world will be at 20%. At these levels of adoption, few on the planet will not know about GNU/Linux and everyone will be able to make a choice. The current regime where M$ pays OEMs to install that other OS is not sustainable. If M$ cannot stop the slide in 2010, they will have lots of downside. M$ is effectively cutting its prices now, but hiding the fact with NDAs. The SEC filings continue to tell the tale.
Now, instead of one Vista-Incapable suit there could be 100 million. That should finish M$.
What was the judge thinking? Hundreds of millions of copies sold and there is no class???
Good Reason to Use Free Software
Free Software is distributed under generous terms, including making copies and redistribution, so allegations of doing so are moot. These terms mean a low or no licence fee is usual, so it costs less, too.
Just bought 4 of these
By great good fortune, I had switched to 500 GB drives for an upgrade. Now to flash the firmware on the 1 TB drives ...
This looks like an ugly event in the history of hard drives and the timing is awful: consolidation in drives is imminent.
It Actually Saves Money and Increases Performance to be Greener
Any organization with multiple computer seats can be a lot greener by using thin clients. Thin clients are generally good for ten years of use, two or three times the life of a thick client with all its moving parts and high maintenance. That automatically cuts down negative environmental effects by a large factor and gives benefits of reduced maintenance, lower power consumption, less noise and no dust collection, smaller footprint, all big money savers and things that increase productivity.
On the server side, consolidation and virtualization will do more than any change in manufacturing technique.
For software, using FLOSS saves a lot of money and increases the lifetime of equipment because FLOSS is not in the conspiracy to force upgrades. Money saved on licences and longer lifetime of equipment can be invested in other green technology or put in the bank.
GNU/Linux with Apache is the dominant OS on servers on the web according to Netcraft
M$ may hold the majority of file servers because of embracing/extending/extinguishing CIFS but they have no real merit over NFS+CUPS in the 'NIX world.
The world is not enamored of Vista and Vista II, so the dominance on the LAN could end precipitously. Another pressure against M$ is server consolidation: one can consolidate better with a more efficient OS. Any 'NIX is more efficient than M$'s product because of shared memory. Virtualization can work with M$ to consolidate, but you can fit more processes into a given amount of RAM with GNU/Linux.
In a Downturn Free Software will Survive. Will That Other OS Survive M$'s Downfall?
Pity all the developers hired to make software that costs lots of cash. They will be out of a job when folks decide to switch to free software. Free Software has a growth rate like 50-100% per annum even in a downturn. That other stuff takes a nosedive. If Vista II flops, expect layoffs at M$...
IBM did it years ago. IEFBR14 could really make your computer run quickly: it consisted of a single instruction, a branch returning control to the operating system. That is provably the fastest possible programme, so any patent for accelerating programmes should be trumped by prior art.
Sheesh! We need judges who can code a line or two of software before they opine on the originality of code. We have known we can exchange storage for speed since the 1960s. We have known that we can expand loops to inline code or optimize the innermost loops or use different/faster instructions therein. There is nothing new under the sun in software, only combinations yet unused. A permutation is not an invention. Move along, please.
Windows and Linux
So true. I have read several reports that the typical IT department needs three times as many servers running that other OS as GNU/Linux. One thing these big chips with huge caches will do for GNU/Linux is allow more users to run on a single GNU/Linux terminal server. With dual sockets of quad-core chips, a fairly large school can run most of its desktops from a single server. That is performance. Six- and eight-core chips will be able to do that with a single socket, and we can use a second machine for backup for very little cost compared to a bigger cluster of servers.
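A rough way to size such a terminal server (the server RAM and OS reserve are assumptions; the ~80 MB per-user figure is of the order reported in the lab results in these comments):

```shell
# Rough terminal-server sizing: how many thin-client users fit in RAM?
# Assumed: 8 GB in the server, 1 GB reserved for the OS and cache,
# ~80 MB of server-side RAM per logged-in user. Illustrative only.
RAM_MB=8192
RESERVED_MB=1024
PER_USER_MB=80
USERS=$(( (RAM_MB - RESERVED_MB) / PER_USER_MB ))
echo "Roughly ${USERS} concurrent users"
```

With shared memory the real number is often better, since every user shares one copy of the big binaries.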
This year I ran a lab in which I was able to observe XP, Vista and GNU/Linux running on a variety of hardware:
1) Vista sucked on an AMD64 X2 5000 with 2 GB RAM
2) XP was OK on a P4 with 512 MB
3) GNU/Linux on thin clients with 64 MB was the best.
I should explain the last item. I ran 24 users on thin clients from an old Xeon server with a 2.4 GHz clock and 2 GB RAM: 80 MB per user on the server plus 64 MB on the client = 144 MB, and 100 MHz per user on the system. This means Vista is many times less efficient than GNU/Linux. Vista may be designed to maximize profits for M$ and Intel but it sucks bigtime for the customer/user.
A recent survey by KACE found that 11% of IT professionals were in the process of switching from M$ this year, more next year and a bunch after that. In two years the M$ monopoly will be down the drain, thanks to Vista and M$'s contempt for users. M$ could fool all of the people some of the time and some of the people all of the time, but they cannot fool all of the people all of the time.
M$ should invest its billions re-inventing itself and not the OS. If they do not they will be a dinosaur within five years. Once the monopoly is broken, they will have to compete on price/performance and Vista-like OS will not make the cut. It's time to change to a UNIX-like OS be it MacOS, openSolaris, FreeBSD, or GNU/Linux. I recommend GNU/Linux because it has been doing the job for ten years or more, has fantastic (and still improving) device support, is modular and configurable, and is lean when you need it to run on anything built in the last ten years or more.
In 2006, I built a complete IT system for a school using GNU/Linux. They were able to afford twice as many seats plus toys for the price of a system running M$. M$ makes no sense to anyone who cares about price/performance and who is not locked in. Emerging markets are not locked-in and M$ will be irrelevant soon. Get used to it, Mike.
Hiding the Price of the OS
This is an illegal trade practice where I live. It is illegal to bundle products like an OS in order to hide the price of it. This is widely used by M$ to avoid competition on price. Look on the web. You will rarely find the same hardware available with XP/Vista/GNU/Linux as options. The motivation is obvious. A monopolist cannot both compete on price/performance and rake in extreme profits.
Would you work for SF as a sysadmin?
The guy did his job and was terminated. This fuss about the network being locked up tight while still running means he did his job. If they had asked for a smooth turnover to his successor all this would have been avoided. If they had redundancy in the sysadmin position this would have been avoided. If they had required documentation of routine operational procedures, system tweaks, and passwords, this would have been avoided. Bean counters with tight budgets mess up systems, too.
I took over a system (not SF) from a guy who left no documentation and I had to hack into every machine to regain control. When I left there was a 60 page manual with all the details of how to run the system. If I had been suddenly dismissed there could have easily been a similar crisis for the next guy but that did not happen because reasonable employment practices were followed.
It looks to me like SF is a place sysadmins should avoid.
What about thin computing?
Why nothing on thin clients? There are many clients for each server, so that is where you find the big savings on power while keeping full-speed computing. 80% of tasks can run thin. Why does IT continue to accept M$'s FUD that thin is dumb? This is not the old days of serial links and 10BASE-T. We have huge RAM, faster storage on servers, and gigabit/s networking. I have done side-by-side tests, and users prefer thin, given the choice, because it is faster in many cases, especially with GNU/Linux and shared memory. We do not need M$ in IT. It holds us back from important innovations like thin clients.
GNU/Linux terminal servers love many processes, a bunch for each user. I regularly run 700 processes on a single core with 2 GB of RAM. Multiple cores are very useful on these machines, and they behave more like desktop machines than web servers.
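If you want to see what your own terminal server is carrying, a quick sketch (assuming a GNU/Linux box with a procps-style ps) counts total processes and breaks them down per user:

```shell
# Count total processes on a GNU/Linux terminal server,
# then show the five users running the most processes.
total=$(ps -e --no-headers | wc -l)
echo "total processes: ${total}"

# Per-user process counts, busiest first
ps -e -o user= | sort | uniq -c | sort -rn | head -5
```

On a loaded terminal server the per-user counts make it obvious how the load divides up among logged-in users.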
All of the terminal servers I build have a bottleneck somewhere. On my current machine it is the LAN, because I can't get my PHBs to give me a gigabit/s backbone. With gigabit/s you can easily handle 50 X clients on one NIC, and boards with three or four NICs are available. Multiple 64-bit cores push the bottleneck back into the CPU/memory subsystem. Two cores is limiting, very limiting. Four cores is great. Dual-socket/four-core is heavenly. When they talk about 8, 12, or 16 cores, I know the bottleneck will be in the CPU/memory area. It is clear they need to stop ramping up the cores and start doubling the cache instead. If you can keep entire processes in the cache, you win big-time. I think this can already be seen with Vista: it runs fairly decently on an Intel Celeron M with a large cache and is slow as molasses on a dual-core AMD64 with a smaller cache.
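The back-of-the-envelope arithmetic behind "50 X clients on one gigabit/s NIC" is simple; the even split below is my illustration, not a measured figure, since real X traffic is bursty:

```shell
# Per-client share of a gigabit link, assuming the link is
# divided evenly among all thin clients (illustrative only).
LINK_MBIT=1000   # 1 gigabit/s backbone
CLIENTS=50       # X clients sharing one NIC
PER_CLIENT=$((LINK_MBIT / CLIENTS))
echo "${PER_CLIENT} Mbit/s per client"   # prints "20 Mbit/s per client"
```

20 Mbit/s of sustained headroom per client is generous for ordinary X sessions, which is why the LAN only becomes the bottleneck when the backbone is slower than gigabit.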
With GNU/Linux adoption growing at 50% per annum and LTSP a great way to run GNU/Linux, AMD should look at a chip designed to run huge numbers of processes, with lots of cores and huge caches. Otherwise, we may have to go with Intel, as painful as that may be. Intel is a near-monopoly; if they destroy AMD we will pay dearly, but AMD has to give us a reliable option. They did with the Athlon and AMD64. Can they keep doing it?
M$ Keeps Admins Busy
The title should be "M$ Keeps Admins Busy". M$ hopes to keep them so busy they have no time to think, so the idea that they could live without that other OS never creeps in. That other OS is not easy to maintain. Patch it and you get lots of downtime for malware, breakage, and re-re-reboots. Don't patch it and you get more downtime for malware. With GNU/Linux, I patch when I want, and the system just keeps running. I feel like the Maytag repairman, with plenty of time to contemplate IT.
GNU/Linux is ready
I use GNU/Linux in schools. Kids and teachers use it all the time with no problems. What is not ready? The end-user does not install drivers; the end-user uses drives, printers, the monitor, mouse, and keyboard. Drivers for most common devices are included with the kernel. How cool is that? CUPS supports a massive number of printers. Check compatibility before you buy hardware and you will have no problems.
GNU/Linux has been ready for about a decade. All that is holding it up is people who resist change, and M$'s anti-competitive practices. Did you ever see Walmart advertising its GNU/Linux boxes on TV or in print? I thought not. I guess they did not want to offend M$. One of M$'s pet tactics is to require exclusivity. Walmart is tough with suppliers; they should be tough with OEMs and M$. The same goes for Dell. Try buying a GNU/Linux box from them by just visiting the site and finding one. They never offer GNU/Linux as a radio button on the choice of OS... How strange is that? Could it be that M$ forbids head-to-head competition?
Humans can read the Captchas for the bots... Set up bots to open accounts and route the captchas to a human, who can learn and improve his speed. Pretty soon you will have humans able to type captchas at 60 per minute. A network of such humans could open hundreds of thousands of accounts daily. So Google will have to go to plan B, which is... I have no clue.
HAHAHA! WHOOEEE! LOL ROFL
You mean to tell me people actually pay money for software like this when they can have GNU/Linux for free??? This is hilarious. I gave up being fooled by M$ many years ago. When will everyone else catch on that they are being had?