26 posts • joined Friday 21st September 2007 14:38 GMT
Unfortunately, it's all too real
You may consider me a troll, but the fact is, rootkits are real. And anyone can compile a doctored binary on an open source OS. Privilege escalation is also a very real threat. While it is possible to give the /bin binaries first priority in the search path, by default the user's .bin folder gets priority. If Linux becomes mainstream on the desktop, these threats will become all too real. Its security architecture is simply not set up to handle modern threats.
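For what it's worth, the search-order claim above can be checked from a terminal. This is a hedged sketch: whether a per-user bin directory actually comes before the system directories depends entirely on how the distro's shell profile builds PATH, and many distros do not put it first.

```shell
# Print the command search path one directory per line, in the order
# the shell consults it; the first directory containing a command wins.
echo "$PATH" | tr ':' '\n'

# List every match for a command in PATH order (bash builtin). If a
# user-writable directory appears before /bin or /usr/bin, a doctored
# binary placed there would shadow the real one.
type -a ls
```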
The guy failed to take into account several things
One, Linux is a LOT more expensive to support and maintain than Windows. Each new revision requires new driver releases to go with it. And there are tons of things that don't "just work" with Linux that do work with Windows. Many XP users are using the same drivers they had 5 years ago, or even 9 years ago with Windows 2000. Within 6 months of any given Linux release, all the driver vendors will have to put out new drivers for that specific release. He honestly thinks Linux is going to be that much cheaper? How much is HP going to charge to support that cluster***** of an operating system that has wild dreams of adequacy?
Two, Linux is far more vulnerable to today's threats. Today's threats do not arrive from a remote hacker. They arrive in your inbox from your infected friend. You run it, it drops some modified binaries in your .bin folder (while otherwise appearing to be exactly what it claims to be), and it soon has your root password and full control over your system. Then it installs itself seamlessly with the operating system (it's open source; it can modify core OS files without breaking them), and you'll never know it's there. You'll never be able to get rid of it without a full nuke, either. Then it takes advantage of Linux's full Unix TCP/IP stack and does some very nasty things. Honestly, once the full threat of widespread desktop Linux becomes realized, I would not be surprised if many ISPs start outright banning it from their networks.
7.62x39 is a totally different caliber from the 7.62 NATO. The 7.62 NATO is 7.62x51, better known as the .308 round: a big, powerful full-size rifle round. The 7.62x39 is a smaller assault rifle round, comparable in power to the American .30-30. They're in a totally different power class and role. The 7.62x39 bullet isn't even technically the same caliber. While 7.62 normally designates a .30 caliber round, 7.62x39 is actually .311 caliber. The .308 is, of course, .308 caliber.
Norton really is a virus
Typical Norton bullcrap. I agree, Norton is a virus. It often does irreparable damage to your system when you uninstall it, too. I've had it permanently damage some Windows features.
Backups today are a joke
Honestly, backups these days are a joke. You don't like Vista's shadow copy? How can you honestly make it easier than opening up the entire drive or folder in the exact same format as the drive itself? Oh wait, it isn't pretty and doesn't have rainbow flags on it? I'm sorry.
1996: Tape backups. Ugh. Slow, and you couldn't use your computer for several hours. Set it overnight and leave it running, assuming you could actually sleep through the racket.
1998: The glory of CD burning came to computers. Sure it was 2x and took 36 minutes to burn a single disc, and if you so much as surfed the Internet it would break, but it was faster than tape backup and didn't require software for other computers to read the files.
2000 or so: Buffering technology which prevented coastering of CDs, and fast enough computers to handle the load. Also DMA Mode for optical drives. Drives that were fast and could burn a whole CD in 3 or 4 minutes.
2002: Jumpdrives which let you quickly copy data and take it to another computer. No software required.
2004: DVD burners start becoming affordable and mainstream. Now instead of 700 megabytes, you can burn 4.7 gigabytes for about the same price.
2009: DVD burners and jumpdrives are the primary backup media. For truly large things, external hard drives are available.
Re: Anonymous Coward
"It's all good, nothing to discard here! Every sentence is a pile of steaming bullcrap. Nice one. I would add that the only way to REALLY destroy the data is to melt the whole computer, powercord included. Never know which kind of personal data can remain stuck in all those wires."
And you accuse me of being a troll? Do YOU have a degree in Information Systems Security? I do. I've worked with all the software I mentioned. It's very real. I've recovered data from disks that were wiped above and beyond the recommended number of passes with government-approved wiping schemes. Don't believe me? Dig up a demo copy of FTK (they're out there) and try it. See how you feel about data security then when you get it ALL back. Fact is, you may be able to wipe a disk with software enough times so that you cannot recover the data, but on modern hard drives this would take several days and it is far faster and far safer to just destroy the drive. Do YOU want to risk it, particularly with the cheap prices of hard drives nowadays?
Wiping software honestly is horribly ineffective against some of the more advanced recovery programs such as FTK (Forensic ToolKit). I've seen disks wiped with DoD-certified wiping routines get overwritten 25 times and still get recovered by FTK. I don't imagine DBAN would do any better. Fact of the matter is, the only way to ensure it's not coming back is physical destruction. Even basic software like GetDataBack NTFS can often recover several layers down.
Consider this very-real scenario: you sell your computer or give it to someone. They suffer a data disaster. Either their partition gets corrupted, gets deleted, or gets reformatted, and they need their data back. So they buy some fairly inexpensive data recovery software to get their stuff back. This software recovers not only their own data, but YOUR data which was buried a few layers deep. Now they have access to all sorts of things about you that they shouldn't have.
For reasons above, I will never sell or give away a hard disk. My old hard disks go in storage in case I need to recover something off of them or use them in another build, and at the end of their useful life are physically destroyed through any of various means. If I am giving a system as a gift, I will purchase a virgin hard disk and configure it. It also gives me the peace of mind knowing that they have a reliable drive.
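A typical software-wipe pass on Linux looks like the sketch below (the filename is illustrative). Per the argument above, this is no guarantee on a modern drive: remapped sectors, caches, and forensic tools may still yield data, which is why physical destruction is recommended here.

```shell
# Create a throwaway file, overwrite it three times plus a final
# zeroing pass, then unlink it (GNU coreutils shred).
echo "secret data" > sensitive.dat
shred --iterations=3 --zero --remove sensitive.dat

# The directory entry is gone, but on journaling filesystems and SSDs
# the underlying blocks may survive -- hence the advice above.
ls sensitive.dat 2>/dev/null || echo "file removed"
```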
Re: Colin Millar
Thank you. I share your opinion completely. Firefox is feature-poor and clunky. It starts slower than IE and most other browsers, and the user interface can best be described as "poor". The vaunted tabbed browsing relies on a tiny button buried among similar-looking buttons to open a new tab. Or a keyboard shortcut (CTRL+T) that's inconvenient and uncomfortable to hit. And once you start adding features to it with addons, guess what? It slows down everything because a major portion of each addon must be loaded at all times and webpages are processed through the addons. And ANY one of these addons may compromise the security of your computer. Wow, that's a LOT better than IE where you only require Flash, Java, and Silverlight and you have more features than Firefox ever had.
As if there was another reason
For network administrators to ban it from their networks. Let's see: it doesn't respect network policy settings, and NOW it has a porn mode, which means it won't even record the sites people go to. You can't catch employees except with firewall logs (which in most cases don't show which account actually accessed the site), and there's no real way for network administrators to lock this feature out. Yep, I'd ban it.
As if I had another reason to say Linux has no freaking place on the home desktop. Let's see: GUI that doesn't use file extensions so that .doc could be an executable, check. Rootkits that can integrate themselves completely seamlessly, check. User bin directory that executes before system commands, check. Open source commands that anyone can make adulterated versions of, check. And a need to go root and re-enter your password often, check.
If our current "home" versions of Linux were, today, deployed on the majority of users' desktops, it would be a security disaster of epic proportions. It is too freaking easy to elevate privileges, and too easy to trick users into executing malware without enforcing file extensions that match file type. Malware writers would be all over it.
I'll tell you what I am
A veteran tech and network administrator. I have been working on computers for 12 years, and building machines for 8 years. I am a network administrator and have managed a network with 7 servers and 200 clients. You want shit hardware? Try 200 IBM Netvista 1.8 GHz computers from 2002 that IBM decided to use up old memory stock on: instead of using DDR, they used 128 megs of PC133. Some of them had IBM Deskstar "Deathstar" drives, but even those were more reliable than the Seagates. I pity the bastards who had to take over that network, what with around a 40% disk failure rate over the course of a year. I tried to source new machines for them, but the funds weren't there.
Yeah, they sucked. But I made them run, and they ran pretty damn well. I know my shit. I know when a system is running well, and I know what it needs to run well. Those systems should have had 256 megs of memory. They didn't, although with all the machines I cannibalized in a year's time there was probably enough left over. I should note that when you're running an organizational network, open source software is generally considered a security risk because it does not respect network security policies and is of questionable quality. Unless you're utilizing it in a standalone server role, it's generally avoided except for a few industry-accepted applications.
Point being, I'm well aware of what a machine SHOULD have, and well aware of what you can put in one for any given price. And I'm in tune with my systems well enough to know how they're performing. You dare call me an MS fanboy? Well, I can look at you and call you sheep. "MS BAAAHHHD!!!" That's about all I hear coming from the crowd. What do you propose we use? Linux? It's rotten to the core and totally unsuited for use as an end-user OS. The OS of the future isn't built around a mainframe OS that originated in the 1960s and has a totally flawed, proven-wrong security model. OSX? Enjoy paying money for service packs (what do you REALLY think Tiger, Leopard, etc. are?) and gaping security holes that Apple denies exist. I thought support was supposed to come with the OS purchase price? Look at reality here. If you buy junk hardware, don't blame the software.
Let me explain a little bit about how processors and memory work together. Newer, faster processors are capable of feeding more memory in a given time period, and require more and faster memory in order to run at their full performance. A 64-bit processor can also address twice the memory in a single operation as an equivalent 32-bit processor.
So, what do we actually have here? We have a situation where OEMs should have been increasing their memory to match their CPUs, but they did not. Even with XP, it's going to be choking the performance; it's a simple matter of mismatched hardware. It may seem blazing fast, but you're not taking full advantage of your processor. With multiple cores, we are also reaching the stage where having more than 4 gigabytes of memory can be beneficial in some rare instances, but in order to do this you need a 64-bit OS. A 64-bit OS on a 64-bit processor: what a novel idea! XP only lets you use 50% of your processor, and is only capable of addressing half as much memory per operation as your processor is capable of.
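Whether a given box is actually running a 64-bit OS on its 64-bit CPU can be checked in a couple of lines; a minimal sketch on Linux:

```shell
# Word size of the running OS: prints 64 on a 64-bit OS, 32 on a
# 32-bit one, regardless of what the CPU itself supports.
getconf LONG_BIT

# Hardware architecture as the kernel reports it, e.g. x86_64.
# A 64-bit architecture here with LONG_BIT=32 is exactly the
# mismatch described above.
uname -m
```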
Vista will run just fine on 1 gig of RAM as long as you're using Office products or very low-level gaming. It will run very nicely with no noticeable speed decrease, unlike XP where running with marginal memory results in massive performance degradation. For gaming, you should have 2 gigs. You may choose to go above 2 gigs, although I have yet to encounter a program that would benefit from it. I'm sure some games such as Crysis might, although nothing I use does.
As for the interface, well, that's personal preference. Yes, it's different. I find it to be more efficient to use, however, much like how Windows 2000 was more efficient than Windows 98. I consider myself a power user, though, and use all aspects of the system. Yes, it's different, but most things are simpler.
So really, in all practicality, what are you expecting? I first tested Vista on a 1.4 GHz Athlon Thunderbird with a gig of RAM. It ran just fine and performed about the same as XP. I can build a system for $550 or less that'll run Vista perfectly. Incidentally, that's what I have always considered bare-minimum to build ANY PC correctly, and it's about the same price as these budget POSes OEMs sell you, with some minor upgrades. Decide who your real enemy is here: Microsoft, who built an OS for modern systems, or OEMs, who are milking you for every last dollar and selling you stuff they know is inadequate, yet could do it right for only a few dollars more. They create systems that are less than the sum of their parts.
You know what the retarded thing is?
Vista runs JUST fine if you have a machine built right. I've been running Vista for nearly a year, and I've had far fewer problems with it than I had with Windows XP during its first 3 years. Do you know why? Because I built my machine properly.
How long was 512 megs of memory considered the standard, with 256 still being sold? The answer is TOO LONG. Processors increased in power ten-fold, memory performance increased several times over, yet the amount of memory OEMs were willing to package with their systems remained largely the same. If you followed the technology curve between 1996 and 2002, a gigabyte of RAM should have been the standard around 2004, and they should have been transitioning to 2 gigabytes right about then. Guess what? These scam artists were still under-equipping their systems with memory, choking off the processor and gutting the performance.
Then there's these jackass hardware vendors who won't support their own hardware. Nvidia didn't release a single driver from December 2007 to June 2008, and the December 2007 driver was grossly unstable. Statistically, Nvidia drivers alone were responsible for 25% of Vista crashes. The ONLY semi-stable Vista driver before June was the October driver. You see it time and time again, hardware vendors refusing to support their product and screwing YOU, the consumer.
There's no secret to the system I built. 2 gigabytes of memory isn't that expensive. A semi-decent video card isn't expensive either. But these jackasses are screwing you left and right. Nvidia should have had stable drivers along with the other hardware vendors, and the OEMs should have put more RAM in their systems. It's hard to blame Microsoft when vendors decided to stagnate and screw you. Fact is, Vista works. Vista is stable. Vista is fast. If you want to blame someone, blame Dell, HP, or whoever built your PC. They screwed you, not Microsoft. Put blame where blame is due, and don't go, "Oh, my system runs like crap with Vista! It's got the same memory in it the system I bought 5 years ago had!"
How about this
I'd rather a few specific major sites simply refuse to serve pages to out-of-date browsers. Try to search with Google, and instead of results it responds that your browser is out of date. Go to MSN.com and it informs you that you need to update. Of course, it won't happen unless someone provides them major incentive to refuse traffic.
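As a toy sketch of what such gating might look like server-side (the User-Agent string and the version-7 cutoff here are made up for illustration, not taken from any real site's policy):

```shell
# Pull the major version out of an MSIE User-Agent string and refuse
# to serve anything older than an assumed cutoff of version 7.
ua="Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1)"
ver=$(printf '%s' "$ua" | sed -n 's/.*MSIE \([0-9][0-9]*\)\..*/\1/p')

if [ -n "$ver" ] && [ "$ver" -lt 7 ]; then
    echo "Your browser is out of date -- please upgrade."
else
    echo "Serving page."
fi
```

A real deployment would do this in the web server or application layer, but the logic is the same: parse the version, compare, refuse or serve.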
They failed to change the network admin password after they fired him. Bad security practices.
Had the guy instead used a program on the server to remove shared printers, shred the system logs on startup, and reset the machine, he likely would have gotten away with it. Or had he used a clean system for the attack. It's practically impossible to get away with this sort of thing, though. There's always traces of some sort. The fact that all his home machines had been wiped was major evidence in itself.
How about this instead: parents who spend too much damn time on the phone and not enough time taking care of their kids are bad parents, and bad parents raise bad kids.
This is also called a Particle Projection Cannon. A laser ionizes a path through the air, followed immediately by an electrical charge of ionized particles. The PPC concept has been around for a very long time, although its use as a weapon remains largely in the realms of sci-fi (the Battletech / Mechwarrior universe, and Robotech).
Funny thing is
Seagate may have put out more drives, but historically there are a HELL of a lot more Western Digital and Maxtor (pre-acquisition) drives still working after 5 years. Seagate has a horrible reputation for drives that die immediately after the 3-year warranty expires. The average life of a drive is 6 years.
I work at a school with a lot of older computers, with multiple hard drive brands. I can honestly say I have not found a SINGLE working Seagate drive. The Western Digitals and Maxtors are pretty solid (Maxtors especially), but every single Seagate is dead, without fail. I don't even bother to test them anymore. I just throw them out.
Another miserable patent troll. This is why the whole damn patent system needs an overhaul. It needs a development clause: after filing for a patent, you should have only 2 years to bring the product to market. Otherwise the patent expires and becomes public domain. If you cannot develop it or have no intention to develop it, you have NO legitimate reason to possess a patent for it.
The problem with this. . .
"Winning exploits must target a previously unknown vulnerability; vulns that have already been reported to the affected software maker or a third party are not eligible."
That is horribly unfair, because Apple in particular fails to fix vulnerabilities even after they've been reported. This skews it horribly in Apple's favor. After all, what other company sits on a publicly disclosed security vulnerability for a year and STILL doesn't fix it?
One less. . .
One less yapping rat.
What really sunk 3Com. . .
Was Nvidia. Let's face it: were the Intel integrated NICs worth crap until just a few years ago? Not really. They were software-based and had poor performance. Nvidia released a GOOD integrated NIC, and that forced all the other vendors to step up and upgrade their own products. After that, there just was no real need for PCI NICs, except in servers, and often not even there. I predicted 3Com's demise several years ago, and I'm not surprised to see them go. I still have a few 3Com 3C905C-TX cards lying around. I picked them up 5-7 years ago for $25 each.
What will be the next company to go down the crapper? Well, Cisco is a long way off from folding, but their decline is already evident. Other companies like D-Link are starting to produce products that are, in many ways, just as good. Why pay twice as much for a Cisco managed switch, when you can get one just as good from a competitor? One thing you do not see them competing with yet, though, are the routers themselves. I see Cisco slowly retreating into a router-only company (except for a few high-end backbone products), and then eventually disappearing as the rest of the industry catches up and produces cheaper, more user-friendly products. It will probably take another decade or more, but their end will come. You just don't design products like that anymore, and Cisco is unwilling to change.
And this is why. . .
I won't let ANY Apple product run on a secure network. The Microsoft bugs pale in severity compared to the stuff Apple leaves in its product, and the difference is that Microsoft actually FIXES them. Apple buries them, then denies the problem exists.