So.... not changing the name to Windows smash-the-window-and-ransack-the-joint?
any other suggestions...
A remote-code execution vulnerability in Windows Defender – a flaw that can be exploited by malicious .rar files to run malware on PCs – has been traced back to an open-source archiving tool Microsoft adopted for its own use. The bug, CVE-2018-0986, was patched on Tuesday in the latest version of the Microsoft Malware …
It's been obvious for at least thirty years - Microsoft is the Monopoly and Marketing Giant....
......but has NEVER been much good at software. Here we have (yet another) example of M$ taking a perfectly good piece of someone else's software and totally screwing it up. The other examples, I hear you ask? How long have you got? Powerpoint, Lattice C, Multiplan, Avalanche, Internet Explorer (purchased/licensed from Spyglass), Visio........the list is endless.
The error here wasn't destroying perfectly good open source code, the error is the defender program itself. Anti virus isn't the solution. We should go the Mr Burns route and infect our systems with ALL the malware, then none of it will be able to fit through the door!
That's a little harsh...
Maybe there's a little group? IBM (points at Notes - just in general not necessarily security. I mean it may have security bugs if anyone cared and were prepared to expose themselves to the horror), Apple with OSX "security patches", Microsoft, Adobe (for everything), Sun (Java), Oracle (unbreakable....).
Maybe it would be easier to make a list of the Midas companies?
"... day Micro$hit makes a product that doesn't suck is the day it starts making vacuum cleaners ..."
Eyewateringly overpriced, overmarketed, shiny, horrible looking, not very good vacuum cleaners?
Too late, Dyson got that particular halfwitted market segment sewn up years ago.
This shitty code is in your medical devices, cars, industrial systems, phones and most devices in your homes. It's present on every website you visit.
Insecure by negligence and stupidity, it's everywhere in your life.
But hey - psychopaths are running the companies that make this stuff & they don't give a shit. They are cutting cost to get paid. You are not the 1% so fuck you.
Even Billy G and Ballmer had standards when it comes to patching software.
SatNad, since becoming CEO, has lowered the bar to record new lows. He's outsourcing the software testing to the users, especially to those useful idiots called 'Insiders'. That's irresponsible and Microsoft should be sued.
"This shitty code is in your medical devices, cars, industrial systems, phones and most devices in your homes."
Cobblers. Embedded systems (i.e. pretty much everything you're talking about here) programming is a world apart from desktop/cloud programming - when you know you can't always push out bugfixes to all your existing customers simply by sticking a new binary onto an update server, you do tend to spend far more time making sure the code you do send out the door is as bug free as you can possibly make it.
"But hey - psychopaths are running the companies that make this stuff"
No, they really aren't. At least not on the planet the rest of us are living on. Maybe on your world (you know, the one where your post might actually make any sense) things are different...
" Embedded systems (i.e. pretty much everything you're talking about here) programming is a world apart from desktop/cloud programming"
Remind me how secure is SCADA code again. And how many cars have been hacked via on-board systems. And how many medical devices have been hacked.
You are living in cloud cuckoo land mate. All the code written is shit, insecure and done to the lowest possible cost and quality.
All code, or just the code you choose to hate?
I'd agree with you if you said every piece of code ever written by anybody was a giant fuck-up and the internet needs deleting.
Maybe you should be a leader and start by wiping your PC?
Just on a side note, why do people still use RAR archives? I saw a firmware update for download just today that was packaged as a RAR. The majority of OSes have some support for opening ZIP built in, so why use an archive format that requires your end user to download an extra piece of software to open it?
RAR is a proprietary format as well, and you need to buy WinRAR to create a RAR archive, so surely even 7-Zip would be a better option as that is open source.
7z is still one of those 'exotic' archive file formats... it's similar to what Ogg or FLAC is for audio files.
I still remember the ARJ and ACE file formats.
You don't need WinRAR to open/decompress RAR files. If you want, you can always use the shareware WinRAR... or there are not-so-legal ways to overcome the restriction. ;)
There is also no shame paying Rarlabs for its excellent software.
Proprietary isn't always necessarily bad... depends on the spirit of the person or company owning the software.
Unsigned integer values? Even Microsoft's own development documentation recommends not using unsigned integers in Windows applications because of the unforeseen side-effects. It sounds like Microsoft isn't taking their own advice. And retrofitting existing code like that is just a bad idea.
"Even Microsoft's own development documentation recommends not using unsigned integers"
I can't decide whether to upvote this as top-shelf satire, or downvote it as a huge WTF?
I mean, yes, if your integers are unsigned, anyone can replace them with other integers and you won't be able to tell. On the other hand, integer signing has never been useful as a form of DRM, and can make it more difficult to update the integers if it turns out one requires patching.
The problem, as ever, is backward compatibility.
Computers were designed from the start to use integers without cryptographic signatures, so it is not possible for applications to detect whether an integer is signed or unsigned just by looking at it. A program must be compiled with foreknowledge about which integers to check for signing. Signing is a "cool hack" first used in the late 90s as an attempt to prevent piracy, pioneered first by Microsoft, quickly followed by most of the rest of the industry. Applications designed for unsigned integers will run fine on modern operating systems, but if signed integers are used by mistake, this can result in crashing, especially if the numbers involved are modern numbers that can be quite large. This is because cryptographic signing uses a "hack" that takes over the topmost bit, which may be flipped in some circumstances. This confuses older software.
Microsoft's hacking of the modern RAR program to force the use of outdated "unsigned" integers is an example of how the company has failed to move with the times. This dinosaur's days are limited.
I'm sorry, I can't tell if you're serious or if you're doing a slightly late April Fool's joke. I mean, WTF?
Signed vs unsigned integers is not referring to signing in a signature/DRM sense, but to whether to reserve one (top) bit of the bytes used for storing an integer as an indicator of its sign (1=negative, 0=positive).
The consequence of changing from signed to unsigned is going from (in an 8 bit system for clarity) an integer holding a value from -128 to +127 to an integer that holds from 0 to +255 and, unfortunately, that can have a huge impact. The binary value 10000010 is a signed value of -126, whereas if you switch the type to unsigned the same stored value now represents +130. In other words, changing the type without changing the underlying contents completely changes not only the value but also the scale, and the more bits involved, the greater the scale issue.
With a 64 bit system the impact is enormous and numbers no longer bear any relation to what they did.
Signing is *not* a "cool hack". Using unsigned integers is *not* outdated - it's just a different data type with no extra crypto built in at all. It is, quite literally, the same values being read either as (a) a number encompassing the largest range the bytes can store or (b) the largest range they can store when the topmost bit is reserved as the +/- indicator. Is it a 64 bit number, or is it a 63 bit number with a sign bit?
haha got caught out - so now we know M$FT grabbeth open source code, b0rks it a bit, and pusheth said code back into their own products...
No wonder why win10 sucks majorly.
Whose code is in it is anybody's guess.
As said in one MAD magazine's Shakespearan parody "In the sight of other men, take only your due. But, when alone, grabbeth what you can."
I had the disprivilege to work on that team years ago. This in no way surprises me at all. The team was very aggressive in dropping new definitions - every 4 hours on the fast track, 8 hours on regular. However, their methodology for delivering the new bits was all based on a scripted workflow with flag files and the like. It was a hugely delicate process. Any time one piece of it didn't complete on time or as expected, the whole thing came crashing down and had to be manhandled. It sucked.
This is far worse than that - any developer who makes that change without fully understanding the consequences is below beginner level. He or she showed a basic lack of knowledge about computing, and you wonder how such a developer could have been assigned to work on such software - one that runs with very elevated privileges that magnify even the smallest exploitable bug.
It is true Nadella now sees the Windows desktop as just a data hoarder and ads slinger, so I'm sure a lot of the software development was outsourced and moved to low-paid software sweat shops.
It's actually even worse than you describe.
What kind of product design process doesn't understand that changes need to be justified (for reasons including those you mention), and preferably they need to be properly considered before the shit hits the streets.
Worse still, what kind of commercial market-driven world is it where if a company designs+produces defective hardware and Bad Things happen as a result, the company management can (in many jurisdictions) be held responsible under Product Liability laws or similar. Sometimes defects result in a recall, very rarely does it result in prosecution and conviction, but at least there is a chance of consequences.
But if a company designs+produces defective software and Bad Things happen as a result, the company spinners just have to say "sorry mate, it's software, no warranty, not our responsibility, not our problem, tough shit. Now pay me and my channel partners etc for the upgrade that fixes it, or we send the boys from FACT round to see you, and you wouldn't want that to end up in the papers would you."
"Weird" doesn't even start to describe it.
The Open Source Initiative (OSI) provides an Open Source Definition (OSD) and the Unrar license does not qualify. The OSI provides a list of licenses that have been reviewed for their compliance with the OSD and the Unrar license is not listed. The Fedora Linux distribution project does have a written review of the license which starts off with:
"This license is BAD, and should not be used in anything in Fedora. It has use-restrictions that make it GPL-Incompatible and non-free."
So, Unrar is source-available but not Open Source. From a practical standpoint, Unrar probably does not get the same level of code review and community contributions because the license causes the code to fall outside the scope of the Open Source community.
It is understandable that a part of the malware scanner must run with LocalSystem rights to ensure it has read access to all the files, but once the file is read into memory the decompression and detection engines should be able to run unprivileged. Even if Microsoft addresses the unrar-related exploit, the fact that they perform any decompression at LocalSystem level indicates a much more serious and fundamentally flawed design. The OpenSSH service has a long history of avoiding full-privilege exploits because it is designed with privilege separation between the different parts of its code. Microsoft has claimed to be doing something similar for the Edge web browser. It is unbelievably incompetent of them not to do the same for an anti-malware service.
Any "security" service that is not written to withstand itself being under attack is providing security in name only. It seems like "Security Essentials" was designed just to get the approval of a rubber stamp that they provide malware protection. In terms of getting the approval of the security community, this product clearly violates modern best practices.
Essentially, since they process virtually any file they come into contact with, they expose a huge attack surface. Just imagine you had a bog standard Windows PC and someone sent you some .rar file. Since you don't have the software to unpack it, nothing would happen... unless you have turned on Windows Defender, which would choke on it, allowing remote code execution.
This is far from the first incident. Unpacking archives is something non trivial to do. If you need to write code to unpack dozens of obscure archive formats, you are likely to mess up at least some of them. Even if you want to test it, you're unlikely going to find a fuzzer for those obscure formats.
"Unpacking archives is something non trivial to do."
"you're unlikely going to find a fuzzer for those obscure formats."
"It's a common problem with "Antivirus" software"
Scuse me? That may be correct as written, but I'm pretty sure you know better than that.
What happened to "unit testing" and other such pre-historic techniques? Not to mention half-decent change control processes?
Especially in the cases where code that is necessarily handling untrusted input is intended to run in a system environment where it (by design, necessarily or not) has easy access to elevated privileges?
Were those things too much like hard work for the kind of people that have, for many years, brought us CVEs including the words "a specially crafted [whatever] file (e.g. web page or email message) can lead to unauthenticated remote code execution with elevated privileges"?
When will Micro-shaft learn to run an anti-virus scanner with limited permissions?? When?
Like the famous GCC (I know they don't use GCC) signed/unsigned comparison warning? The one that causes Open Source software maintainers (who shall remain nameless) to mindlessly change signed integers into unsigned? (Because unsigned overflow is well defined in C and C++, the compiler doesn't warn about it by default.)
I agree with statements above that there is no excuse for shipping these bugs, or for violating "it ain't broke don't fix it" by changing signed to unsigned without good reason... but I do understand the instinct. Since I first encountered K&R v1, I wondered why the hell the authors made strlen return a signed integer. I could only assume they were too lazy to type the (too-long) word "unsigned". Or was unsigned not in the first version of the language? - frankly I forget.
It just makes large parts of my brain hurt when an integer that can never represent a negative quantity (string or array length being the canonical examples) is declared signed. It's sad that "signed" is the default because I'd be so bold as to say that most of the integers I declare can never go negative.
Biting the hand that feeds IT © 1998–2019