18 posts • joined Tuesday 15th September 2009 04:12 GMT
Being a committed Linux user, I actually prefer getting my software online, but through managed repositories.
However, I am *very* much against the so-called "Cloud," largely because I see far too many liabilities for far too few benefits. Adobe is demonstrating one of the cloud's big liabilities quite well by pulling this. I prefer paying for my software *once* and only *once*, not paying a software developer repeatedly to be able to use my computer how I like (aside from normal ISP costs, of course).
All that said, sometimes I can appreciate holding physical media in my hands, because then I know I'll generally depend only on myself for getting at its data. I'm still fine with getting my software online, so long as that software isn't being held hostage for a regular fee. (That, and none of the software I use is on the "cloud," and I'll make sure to keep it that way.)
As I am not a professional graphics user, I have never needed Photoshop. GIMP handles my needs very well.
Doesn't deserve the same respect as Dennis Ritchie.
He only did three things worth respecting. And there's a certain man who died a few days later who deserved far more recognition than Steve Jobs.
1. Jobs helped propagate the idea of the PC to the everyman (alongside IBM and Bill Gates).
2. Jobs got Pixar going.
3. Jobs pulled a few open source projects out of development limbo and is pretty much the only reason BSD hasn't completely lost its user base.
One big thing against Jobs: he was not a revolutionary, no matter how much people liked to paint him as one. Apple is just as bad as Microsoft with "not-invented-here" syndrome, and arguably worse because of the cult worship Jobs gets.
Also, I should note, especially regarding the early days of Apple: Steve Jobs wasn't even creating or designing the stuff Apple was selling. It took being forced out in 1985 and struggling to make NeXT a success for him to gain any sort of technical competence. It is Steve Wozniak we have to credit for Apple's products up to the Macintosh. And frankly, I never saw any sign that Jobs had even that much technical knowledge when he came back to Apple in 1997. He clearly wouldn't have had the know-how to recognize the potential of Unix back in 1985; more than likely he rode on the backs of competent engineers at NeXT. It was Apple's buyout of NeXT that brought him back, likely bringing those same engineers with him, and, believe it or not, it was partly thanks to Microsoft's 1997 investment that the Apple he returned to stayed afloat at all.
Not that I'm saying Apple riding on the power of Unix is bad; it's actually very good. My point is that I doubt Steve Jobs himself ever had the know-how to say what Unix was or what it represents. Evidence: while Unix definitely improved Mac OS X, it still seems wasted there, since Apple is big on hiding technical features from its users anyway, crippling much of the potential a Unix system could have. This is partly why I dislike Ubuntu as well, so don't think I'm just picking on Apple here.
Good accomplishments, but compared to Dennis Ritchie he was a nobody in the tech sector:
1. Ritchie invented C, a language that shaped virtually every language devised since, and one of the most powerful compiled languages in computing history. Even Apple's own Objective-C owes its existence to C.
2. Ritchie co-created Unix, an advancement without which computers would never be where they are today. Virtually every operating system in the world owes its design and theory to Unix. Even Windows. I might also add that OS X is Unix, and it was Unix that saved Apple, not Jobs.
3. Ritchie and company are responsible for a lot of the research and development that, indirectly, gave Steve Jobs any sort of career. Without the work at Bell Labs, we wouldn't have the transistor (and hence the microprocessor), Unix, C, C++, or any number of other genuine innovations without which modern computing just wouldn't be possible. Not all of it can be credited to Ritchie, but a heck of a lot of the software advancements can. To sum up: Ritchie is one of the people responsible for pulling operating systems out of inefficient batch setups and into real multi-user, time-sharing systems, and for giving us low-level compiled languages fit for almost any task, especially those needing raw power or close-to-the-metal programming without architecture-dependent code (in the form of assembly).
4. Ritchie really was a revolutionary. Whereas Steve Jobs was basically selling products people had already seen before, in shiny packages with fewer features, Ritchie and company were basically INVENTING the stuff Jobs would sell over a decade later. What takes him from inventor to revolutionary is that his creations, Unix and C along with his other research, actually did revolutionize the entire tech industry. Steve Jobs can't lay claim to a single innovation on that scale, not even the PC, which was invented years before Apple was even founded.
I dunno. I have seen how a lot of "hardcore" Ubuntu users behave, and yes, a lot of Ubuntu users seem to think not only that they are the only Linux in town, but that *all* open source and Linux development centers on "improving" Ubuntu. This is one of many reasons a lot of straight-up Debian users despise Ubuntu: arguably the Debian developers do most of the actual compatibility work, while most of what Ubuntu does is rebranding plus picking and configuring a very specific set of packages "good" for the desktop.
Though I'm saying this as a somewhat bitter current Gentoo and former Arch user.
You know, Apple fans tend to compare OS X and Windows as if they were the actual hardware people use. The "Mac" vs. "PC" ads have always been OS X vs. Windows Vista/7 and barely ever even touched on the actual Mac hardware vs. PC hardware, largely because by the time Apple launched those stupid ads Macs and PCs had virtually identical hardware.
Or just use UDF. It was designed for portable media and all the major operating systems use it. And Android itself could support it with a simple checkbox in the Linux kernel configuration software. It'd take Google 5 minutes to turn off vfat if they wanted.
vfat is not the only "universal" filesystem out there. In fact, vfat was never actually designed to be one, it just got there thanks to Windows market share.
We could easily move these devices to UDF, which actually was designed for what vfat is being used for right now, is supported by all major operating systems out of the box, and is part of the Linux kernel tree, meaning Android could EASILY support it with a simple checkbox switch. Likewise, vfat could easily be switched off by unchecking it in the same kernel configuration. Any Linux novice could do it when building Android or a custom Linux kernel from source.
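For the curious, those "checkboxes" boil down to two Kconfig symbols. A minimal sketch of the relevant fragment of a kernel `.config`, assuming a mainline (or Android) kernel tree; the same switches can be flipped interactively under "File systems" in `make menuconfig`:

```
# Kernel .config fragment: UDF on, vfat off.
CONFIG_UDF_FS=y
# CONFIG_VFAT_FS is not set
```

After changing either symbol, a `make olddefconfig` run resolves any dependent options before the kernel is rebuilt.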
To be honest, there's actually no reason to support a specific Windows filesystem to make a portable device or drive usable universally.
As for patents, I take Microsoft's patent claims with a very large pinch of salt. It claimed Linux itself violated 235 of its patents, but because it never actually said what the patents were, or even responded to those calling it out, it was likely just trolling and had no such patents, or at least none valid enough to take Linux developers to court over.
Missed the Point
You're completely missing the point of why people keep a separate partition for /home or any particular directory. It's not necessarily to protect you from a failing hard disk (though it can, if your /home is on a completely different disk or even a different computer).
It's primarily to protect your valuable data from whatever may come if you reinstall your operating system. It can save you a LOT of time and backup headaches, since you can just tell your OS's installer to reuse that partition instead of creating a new /home.
Other benefits include allowing you to use a single /home directory for multiple *nix installations, even sharing of your data with Windows in a dual-boot.
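To make this concrete, here's roughly what the layout looks like in /etc/fstab. The device names and filesystem are hypothetical examples; yours will differ:

```
# /etc/fstab fragment: root and /home on separate partitions.
# Reinstall the OS onto /dev/sda2, tell the installer to reuse
# /dev/sda3 as /home *without* formatting it, and your data survives.
/dev/sda2   /       ext4   defaults   0 1
/dev/sda3   /home   ext4   defaults   0 2
```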
Windows is not successful because it's well designed or does things "correctly" as Windows is neither of those things.
I agree this article overstates the problems with Ubuntu's automatic mode, especially given that it allows two means of manual partitioning on the same disk and lets you specify filesystem mount points at install time if you go manual.
I do think its automated partitioning leaves much to be desired. But if you've already got a /home you want to assign to a new Ubuntu install, you will NOT want to hand it the reins over what gets mounted where, since it'll completely ignore your intended /home anyway.
Sorry, but Windows offers precious few options for fixing major issues before "reinstall Windows" becomes the only real option. Sure, things improved VERY slightly from XP to Vista to 7. But I must emphasize VERY SLIGHTLY.
In all my years doing this I've found Windows boxes to be the single biggest pain in the ass to support, and it's all down to the registry. See, in a standards-compliant operating system (under POSIX, the FHS, and the SUS, which are the *real* operating system standards, not the de facto standard Microsoft offers that only applies to Windows), configuration is plain text, organized in simple directories anyone can read but not necessarily change. This, by the way, is also why Windows is so readily infected: it offers piss-poor file permission protections, so even an unprivileged yahoo can change system settings, and thus so can a virus.
See, with Windows, ALL the stuff important to the system working properly is stored in a binary database ONLY THAT INSTANCE OF WINDOWS CAN SENSIBLY ACCESS. If Windows breaks, good luck getting at its configuration to fix the problem. On a standards-compliant system you can boot a rescue disk and use a simple text editor to fix what is likely one errant value. Because the Windows registry is a binary database, you can't get at it with ordinary tools, only with registry editors, and the one installed on that copy of Windows is useless if that copy won't boot.
Sure, Microsoft added a "last known good configuration" option, but it hasn't done much good on the vast majority of Windows boxes I've had to fix. Safe mode is nice if there's a driver or virus running amok, but it still relies on the system WORKING well enough to boot to the desktop in the first place. And the recovery console they used to offer had very few useful tools, none of which, by the way, could access the registry.
The registry is far from robust or reliable. Sometimes... okay more frequently than "sometimes..." a COMPLETELY MINOR VALUE that's set wrong in the registry can cause a fatal chain reaction to a Windows installation.
Trust me, Windows is still far from the "reinstallation unneeded" state all the other operating systems have reached, and all because Microsoft doesn't have the first smegging clue how to design a quality operating system.
I haven't had to reinstall Linux in a long, long time. Why? Because if I break something I just start up a Live CD and change my configuration file to something "correct."
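The whole "repair" usually amounts to something like the following. A minimal sketch: in real life you'd `mount /dev/sdXn /mnt` from the live CD and edit a file under `/mnt/etc`; here a throwaway directory and a made-up `sshd_config` value stand in for the broken system, so the steps are safe to run anywhere:

```shell
# Stand-in for the broken system's root filesystem
# (in reality: mkdir -p /mnt && mount /dev/sdXn /mnt).
mkdir -p /tmp/brokenroot/etc
printf 'PermitRootLogin yes\n' > /tmp/brokenroot/etc/sshd_config

# The repair: plaintext config means one errant value can be
# flipped with any text editor, or a one-line sed like this.
sed -i 's/^PermitRootLogin yes$/PermitRootLogin no/' /tmp/brokenroot/etc/sshd_config
cat /tmp/brokenroot/etc/sshd_config
```

No registry editor, no special tooling: any rescue environment with a text editor can make the same fix.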
Not surprising, Apple's always been scum.
I am grateful to this day that I never liked Apple. It doesn't surprise me that Apple would not only come up with plans to shut down jailbroken phones, but also make up some bullshit about protecting its users in the process.
Anything to keep from looking like the real dishonest moneygrubbers they actually are, eh?
Thing is, Apple's NEVER really been the heroic underdog it tries to depict itself as. EVERY move it's made has been about locking down its products in a way that ensures control over its users for the most money.
Still, this isn't illegal. As awesome as this new law is, it doesn't actually stop a company from blocking jailbroken devices. It just means the company can't sue or press charges.
Okay, reading through this I think people are overreacting.
Yes, this is a vulnerability.
No, the fact that this requires local access pretty much makes fixing the problem a low priority: anyone with physical access doesn't need an exploit at all to compromise a system. THAT is why it went unfixed for five years; there are actual zero-day exploits far more worth a kernel developer's time, not that Linux has many of those.
Microsoft's problem is that it DOES have what would be considered high-priority exploits go unanswered for YEARS before it fixes them, if it ever does.
Okay, now, this likely has to do with X requiring suid root, something that has been actively addressed for some time in a lot of display drivers. The fix? KMS, Kernel Mode Setting, which moves display mode changes into the kernel so that using a display no longer requires root permissions SOMEWHERE in userspace.
Fortunately, almost all the open source video drivers in the kernel support KMS, or will in the near future. The trouble is with old or unmaintained drivers. And unfortunately nVidia has yet to put KMS support in its drivers. Nouveau does have KMS support, but lacks the features one would actually want, like production-quality 3D acceleration.
Thing is, even if X reaches a non-suid state within a year, we still have one issue: there WILL always be some program that requires suid root to work. Like sudo. Sudo uses suid root, but its configuration is designed for explicitly detailing who can do what and how they must authenticate. And some sudoers entries aren't configured to grant root permissions at all, just access as another user.
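For example, here's roughly what those two kinds of entries look like in /etc/sudoers (always edited via `visudo`). The usernames and the script path are made up for illustration:

```
# alice may run anything as root, after entering her own password.
alice   ALL=(ALL) ALL

# bob gets no root at all: he may only run one hypothetical script,
# and only as the unprivileged user "www".
bob     ALL=(www) NOPASSWD: /usr/local/bin/deploy
```

With the second entry, bob would invoke `sudo -u www /usr/local/bin/deploy` and nothing else; sudo's suid bit is there, but the policy never hands him root.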
All in all, this is not really news or an issue.
And the reason I find it annoying when we Linux users point out Windows bugs is simply that it shouldn't surprise us anymore when there's a huge gaping hole in Windows that Microsoft ignores for years.
This sounds familiar.
This sounds familiar. Didn't we see an attempt like this with Vista that failed miserably? And since Windows 7 offers next to nothing new over Vista aside from stability and speed improvements, why oh why would an IT department need this when, once again, XP works fine?
Or are they also panicking about businesses and governments switching over to Linux, which is becoming a more frequent occurrence every day?
This is Microsoft in panic mode, ladies and gentlemen. This is not a drill.
Microsoft rarely admits it when something it does screws stuff up. It blamed all of Vista's quality problems on the hardware manufacturers. Apparently the hardware in my computer was so crappy that Linux ran on it perfectly.
New name. Same search engine.
Is there a REAL difference between Bing and the old Windows Live Search? From everything I've read to this day, it does the exact same thing: it generates ad revenue for Microsoft more than actual search results. I'll keep using Google, since it won't shove "sponsored results" at me.
I remember when the BSD folks, along with the Windows folks, leaped on Linux for the setuid root exploit that was the fault of the compiler, not the actual code in the kernel source.
Hilarious now that a glaring exploit appears in the allegedly most secure operating system ever created (something I never really saw the BSD fanboys actually prove, by the way).
No Open Document Format?
I see this doesn't support the Open Document Format. Hello, non-compliance with standards.