Another Footgun Moment for Microsoft
Security researcher Stefan Kanthak claims the Microsoft Visual C++ Redistributable for Visual Studio 2017 executable installers (x86 and x64) were built with insecure tools from several years ago, creating a vulnerability that could allow privilege escalation. In other words, Redmond is distributing to developers executables …
Is that not the case for each and every single line of code released by MS ?
Seriously, I think the program should look in the application's directory for DLLs to load; that is how DLL overriding is supposed to work.
The fact that Windows devs are using an outdated software distribution model, where you have to download some executable file from the internet that nobody can trust, is the MAIN problem here! And the fact that desktops think it is wise to place all downloads into one folder is yet another problem in this case; on UNIX this is fine, most of the time, because pkgs, debs, or rpms are not executable.
The simple solution would be to distribute installers in zipped folders (so even IF the dumbcunt chooses to unzip into Downloads, the installer lives in a child directory).
You would think that, since Windows supports MSI, all software would use that format, but no ... dumbcunts.
PS: Ever wondered why some people distribute Windows software as an executable (for dumbcunts) and as a ZIP?
An install should be little more than an unzip and some very standard, declarative registry / symbolic-link settings. Other things should then be able to find the installed software. The disaster that is COM has a lot to do with it.
Also, why is the C++ runtime not just a standard part of Windows, given that that is what much of Windows is written in?
"The disaster that is COM has a lot to do with it."
Now, now, don't go blaming COM for this undead 'DLL Hell' problem. COM works when it's implemented properly, just not the way so many people have ABused it by polluting the registry with everything and its grandmother.
Instead, you can blame LAZY DEVELOPERS for thinking "an application" equals "slap together a bunch of shared libraries and shared components", VB-style, and then polluting the registry with the 'registration' of all of said shared components and libraries, with their unnecessarily large number of COM objects that you'll never use.
2 words: STATIC LINK. This avoids the entire problem. That includes static linking runtime into your DLLs. Then, actually using COM, or something like it, to talk to your own DLLs avoids having to rely on globals allocated using shared runtime - how about THAT! You don't need to use Micro-shaft's "class factory" nonsense either if you have your OWN object constructor within the DLL that knows what the 'new' operator is for... as a simple API function of your own design.
So, when some stupid downloaded game or "dancing gopher app" breaks the C/C++ run-time and/or MFC shared DLLs, you know that *MY* application will NOT be affected, because *I* went to the trouble of making sure there were NO dependencies on ANY of that! And the executable size is only about 200k more, if I do it right.
Seriously, Micro-shaft TOTALLY SCREWED the whole 'shared lib' concept a long time ago. You might as well just statically link. Your application will LOAD FASTER, it will RUN CLEANER, you won't be getting that midnight phone call because "Phil installed a game and now your application won't work and we need it yesterday".
And, you WON'T be relying on the "Micro-shaft's installer" for YOUR success!
LGPL can be satisfied by wrapping their library with your own API or DLL that encapsulates what they do. You make a copy with your name on it, open-source the library, wrap it with what you need, and ship YOUR binary, open-sourcing it as needed to comply with licensing. Not a problem, really; been there, done that, with jpeg as I recall. And your DLL, of course, is statically linked with the run-time. So no other dependencies [aka potential cock-ups].
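The "your OWN object constructor within the DLL" idea above can be sketched roughly like this - a minimal, hypothetical example (IWidget, CreateWidget and DestroyWidget are made-up names; in a real build the extern "C" functions would be the DLL's exports, and the DLL would be statically linked against its own copy of the runtime):

```cpp
#include <string>

// Hypothetical plug-in interface: the EXE only ever sees this
// abstract class, never the concrete type living inside the "DLL".
struct IWidget {
    virtual std::string name() const = 0;
    virtual ~IWidget() = default;
};

// DLL side: the concrete implementation, statically linked against
// its own runtime, so no shared C/C++ runtime DLL is ever needed.
namespace {
struct Widget : IWidget {
    std::string name() const override { return "widget"; }
};
}

// The DLL's own "class factory": plain C-linkage functions, so no COM
// registration and no Micro-shaft class factory machinery. The caller
// must destroy the object through DestroyWidget, not `delete`, because
// the allocation happened on the DLL's side of the fence.
extern "C" IWidget* CreateWidget() { return new Widget(); }
extern "C" void DestroyWidget(IWidget* w) { delete w; }
```

The EXE would resolve CreateWidget/DestroyWidget via GetProcAddress after a full-path LoadLibrary, and only ever touch the object through the abstract interface - so nothing relies on globals in a shared runtime.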
Actually, COM could be safer in this case, because the executable that implements the COM calls is usually stored in the registry with its full path, so I believe it may be immune to this kind of attack: the application loads it only through its CLSID.
It's LoadLibrary() and CreateProcess() calls that can be vulnerable when a full path is not used, and the proper calls to SetDllDirectory() and/or SetSearchPathMode() weren't made.
Once again, this can't be made the default, because many applications that rely on finding their DLLs in the same directory without using a full path would stop working.
AFAIK, DLL planting attacks may be performed in Linux too - it's just a matter of writable directories and shared objects load order.
2 words: STATIC LINK
The article mentions system DLLs. It's been a long time, but I think I remember that the order for DLL loading is something like: same directory as EXE; current working directory; windows/system; path
I also remember that LoadLibrary can take a full pathname and you can find the windows/system path easily. So a fix would be to specify the fully qualified pathname of the system DLL.
It's been a while though
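The "fully qualified pathname" fix amounts to never letting the loader walk that search order at all. A small sketch of the path-building part (portable string handling only; on Windows the directory would come from GetSystemDirectoryW and the result would be passed to LoadLibraryW - QualifyDllPath is a made-up helper name):

```cpp
#include <string>

// Join a directory and a DLL name into one fully qualified path, so a
// later LoadLibrary-style call never consults the search order
// (EXE dir, current dir, PATH, ...) at all. On Windows the directory
// would come from GetSystemDirectoryW; a literal stands in here so the
// logic is self-contained.
std::string QualifyDllPath(std::string systemDir, const std::string& dllName) {
    if (!systemDir.empty() && systemDir.back() != '\\')
        systemDir += '\\';              // ensure exactly one separator
    return systemDir + dllName;
}
```

For example, QualifyDllPath("C:\\Windows\\System32", "version.dll") yields a path the loader will use verbatim, with no chance of picking up a planted DLL from the Downloads folder.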
2 words: STATIC LINK
Static linking certainly has its place, but the more common the library, the more copies end up occupying RAM, maybe pushing working sets over a physically-addressed cache's size and impairing system performance...
Also, the more burdensome it becomes to patch some vulnerability (though admittedly static linking avoids having some patch break your application).
TL;DR: static and dynamic linking both have benefits and drawbacks; which one "wins" depends on circumstances.
Complex installers are needed for complex software. The executable part usually takes care of upgrading existing installations - i.e. installing or upgrading a database schema, transforming some settings or file structures, creating/updating/removing services/daemons, creating/removing web sites, etc. For some software, just copying some files and creating links is really not enough.
Even my debs/rpms have code which is executed in pre/post installation/removal steps to take care of specific needs.
Sure, you could put them in separate executables the user has to run before or after the setup is run, but often it just adds complexity to the setup procedure, and can cause more mistakes and support headaches.
COM has nothing to do with this bug, nor with installer complexity. It's just a language-agnostic RPC solution, and may be better than invoking a shell to run a script and parse its output - plus remembering to sanitize input to avoid vulnerabilities.
Microsoft decoupled the C/C++ runtime from Windows a long time ago, because Visual Studio development cycles are separate from Windows ones - you can also deploy an application without using the system-wide runtime, if you know how to deploy your application properly. It's just that many developers use the redistributable - which should be built with up-to-date tools, obviously.
"Complex installers are need for complex software."
too hard, smarter people, blah blah blah [remember _I_ wrote an installer >20 years ago and kept it up to date through every version of windows since '95].
You do NOT need a complex installer. You need something that a) finds an existing install, b) checks the version resource of anything it's about to install or remove and make sure that the thing you want to put there has the correct version (whatever that might be), and c) maybe add some registry entries or icons or whatever.
It's not that hard. Seriously.
Now when it comes to registration and all of that, COM DLLs already have a mechanism by which you can have the DLL register all of its stuff, and unregister it. There's an exported function for each of those things - DllRegisterServer and DllUnregisterServer, the same ones regsvr32 calls. So you just have to call that function for each DLL. No big deal.
For the application itself, I generally add a command line option like '-install' or '-uninstall' that will let you use the application itself to set up "all that stuff". So if, like VLC, you associate a bunch of file types with your application, you PROBABLY already have a function in the executable to do that. Great! Call that when you do '-install' and un-do that when you call '-uninstall'. How hard is THAT?
Point is, it's NOT hard, it's surprisingly EASY, and almost brain-dead SIMPLE. You do _NOT_ need "a complex installer", particularly if you're NOT installing a monolithic ".Not" runtime and C++ shared lib installer that requires 2 reboots and a prayer (and a sacrificial chicken) to get your system up and running.
Facepalm icon, because, *FACEPALM* [but your point reflects what Micro-shaft WANTS you to think, so you're forgiven for buying into their FUD and drinking their Kool-Aid because you didn't know better].
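The '-install' / '-uninstall' switch described above is a few lines of dispatch; here is a hedged sketch (RegisterApp, UnregisterApp and RunApp are hypothetical placeholders for whatever file-association and registry work the application already contains):

```cpp
#include <cstring>

// Placeholders for the application's existing setup/teardown logic,
// e.g. associating file types, adding icons (hypothetical here).
static int RegisterApp()   { /* add registry entries, icons, ... */ return 0; }
static int UnregisterApp() { /* undo everything RegisterApp did  */ return 0; }

// Dispatch on a single command-line switch so the application can act
// as its own (un)installer: `app -install` or `app -uninstall`.
int RunApp(int argc, const char** argv) {
    if (argc > 1 && std::strcmp(argv[1], "-install") == 0)
        return RegisterApp();
    if (argc > 1 && std::strcmp(argv[1], "-uninstall") == 0)
        return UnregisterApp();
    return 1;  // a normal run would start here; 1 marks "no switch given"
}
```

The point of the design is that the association code already lives in the EXE, so the "installer" part is just a different entry path into functions the application needs anyway.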
I too wrote an installer in the 1990s - it was in Turbo Pascal and was for DOS. It was even able to detect the video card type and install the proper BGI drivers. It could also run batch files back then. It had an editor which bundled the setup engine with the install data and compressed the application files. It would be fun to publish the code on GitHub.
It is clear you have never deployed complex applications which need a lot of setup work to be ready to run (and a lot of clean-up to be fully uninstalled). It could imply creating or altering a database schema, setting up a web server or services, installing and configuring additional software, performing requests to servers, and many other operations. They could need to query the existing systems to understand which configurations are needed. In one of our latest setups, the client setup has to ask the server for a client certificate, and install it. It happens at setup time because the required credentials were only asked for there, as the software had no UI, and anyway, once assigned, users shouldn't be able to mess with that.
Installing files and creating a few registry entries is simple enough, and there are even a few APIs to help you. That's the easy part. Creating good setups, which install applications properly and, when needed, remove them completely without leaving garbage behind, can be quite hard, especially for complex server applications and even some desktop ones, and is a programming effort in its own right. And it tells you which developers are great, and which ones aren't.
Free tools like Inno Setup are very good and don't use the Windows Installer engine, and, being so widely used, I'm quite sure they are far better and safer than any setup tool you wrote yourself.
I have a good knowledge of both InstallShield (both the old proprietary engine and the MSI one) and InstallAware - all of them allow you to execute specific tasks to perform complex setups. MSI even allows writing reusable components to deploy frameworks, runtimes, third-party applications, etc.
You seem to have little knowledge of why COM DLLs require registration, and how it works. Actually, COM DLLs export just the few needed functions, because everything else happens through the COM mechanism - they aren't pure C/C++ DLLs, especially since they are designed to be callable from any language supporting COM, using early or late binding. Each call may require a lot of work, especially if marshalling requires complex transformations, making COM calls slower or much slower depending on the execution path used.
Also, any setup code you put into the application may need Administrator rights to work. Setups are usually executed elevated, by a user with enough rights, possibly via remote administration tools. Windows has specific code to detect when a setup application is being run. Having that code inside the application is not good programming practice, and many users would still be blocked trying to use it.
"why is the C++ run time not just a standard part of windows" Because there is more than one set of libraries. M$ don't want to leave things static, where just any compiler could use their libraries, so they keep changing things, and they certainly do not want people using all the components, as some are restricted to their own use and "undocumented".
So, on your question of the runtime libraries being part of Windows: since Windows is more likely to be older than the compiler, chances are it is using a different set of libraries anyway.
Personally I prefer compilers that produce binaries that have any library components included in the project EXE, and bizarrely this actually makes for a smaller install base than M$'s supposedly reusable libraries.
Did you mean toolchains or something? It's not the compiler, rather the linker, and this is just what they mean when they say static linking above. IINM all of them do it, always optionally. (ok *maybe* there's a weird toolchain where a dynamic linker and a static linker are separate programs and the build you wanted determines the one that gets used, but I didn't ever hear of it, just there being "the linker" and that having a switch).
FWIW, RPM files can contain executable content in their %install section; see the RPM spec file reference. The thing is that installation scenarios are always going to require executable content of some kind, whether that's something relatively innocuous like creating new directories or something more involved like updating the Registry or driver database.
Re: using .ZIP or some other publicly-documented format. Yes, the problem with MSI files is that, being a proprietary format, they are woefully underdocumented, and very few tools exist that can read them (I know of one, lessmsi). Usually, you can use the msiexec command-line tool to force an administrative ('network') install, which will spit out the files, but that won't get you the registry settings or anything else.
FWIW, RPM files can contain executable content in their %install section; see the RPM spec file reference.
Yes, debs have the equivalent as well ... and why is that relevant to the subject ?
The point is, we do not want punters to execute stuff in Downloads, to avoid DLL overriding. The RPM/DEB/PKG/etc file is not executable, it is an installation package.
I agree about MSI, that format sucks, but it's better than plain executables; you will not be able to inspect the contents either, but at least an MSI is NOT executable.
I go for ZIP when available.
PS: Quite a few people here disagree with me, have I missed something obvious ? Do you guyz not use DLL overriding ?
you have to download some executable file from the internet that nobody can trust
Yes. Not only that, even if you download an executable that you trust (even one you made yourself), and there's a rogue DLL lying around in your Downloads directory - boom!
Windows supports MSI
Don't get me wrong, I like MSI in principle, but the implementation leaves much to be desired.
The main trouble with that is it's so damn slow. When selling your own software, you want it to install as fast as possible. My completed-installation count almost halved when I used MSI instead of .exe (I still made the .MSI download available to those who actually wanted it).
Also, it's a nightmare to do the simplest of things with WiX.
MSI does work like that, sorta, but it's not very flexible.
A mixture of requirements from people who don't "get" deployment, the fact that it wasn't originally designed for general use, Microsoft designing a package system without understanding package systems, and MSI being highly opinionated about everything leads to some very complicated and flaky MSIs. Also it looks hideous these days.
Burn, which is the affected part of WiX, was an attempt to fix this issue.
The main problem, and it's one that affects all software development to some degree, is that devs don't keep libraries they use up to date. Managers don't want to fund that work because they think it's not important.
"A website that delivers malware can, depending on your browser and its settings, automatically download . . "
A long time ago (in the previous millennium, actually), I was at work and got an internal message to check something at a URL. This was not spam, nor malware, nor a joke. It was work. So I fire up IE, click the link, go to the website and, somehow, notice that I had a new file in my Downloads folder (which I had open on my desktop).
I do a double-take, close IE, delete the file, and start Firefox to follow the same link. Firefox gave me a popup warning that the site was trying to force a download, and did I accept ? Did I not !
That was the day I realized that some unfathomable moron had thought it a good idea to include in browsers and HTML a function to download stuff to people's computers without either their knowledge or consent. I was speechless with rage. It is inexcusable.
From that point on I vowed to never again trust IE, and to only use it on merchant sites I know and trust when I can't use a more secure browser.
It was one hell of a wake-up call.
"It was one hell of a wake-up call."
a) never use a Micro-shaft browser
b) avoid using windows if at all possible
c) only surf the web while logged in as a non-privileged user, ideally not the same one you do your work as, NEVER 'root' or 'administrator' or similar.
d) *NEVER* preview e-mail in HTML. *NEVER*. In fact, don't use HTML for e-mail at ALL, and view as plain text ONLY. This destroys the effectiveness of scammer e-mails, since they often leverage HTML to do their phishing, click-baiting and tracking. Thunderbird lets you turn off HTML in e-mail.
e) *NEVER* "install this" to view the content
f) NEVER install a flash plugin (use HTML5 only for media)
g) don't read PDF files within the browser; use an external application NOT written by Adobe or Micro-shaft (a recent vulnerability on this one proves I'm right about this).
and so on.
These simple rules, or ones similar to them, will help to protect you against nearly all of the known exploits, and probably most of the zero-day exploits.
But you discovered the most important one on your own: *NEVER* use a Micro-shaft browser!
They're all versions of NT.
In the BEGINNING (not the real beginning of course, just as far back as I can be bothered to go) there was DOS, and the world saw that it was shit.
Microsoft said "Let there be GUI", and there was Windows 1.0, and it was rubbish.
Eventually there was Windows 95 for home use or small companies and Windows NT with advanced features like user account passwords for bigger organisations.
Eventually (again) Microsoft said "sod the two versions" let's make Windows XP, base it on NT and ship it to consumer and business markets. And it was so and has been ever since.
Windows ME just never happened, OK?
... Microsoft were capable of shipping a version of the C++ runtime that didn't crap a lot of READMEs, localisation files and other debris into the root directory of my system volume when I installed the bloody thing. Seriously, what sort of amateur-hour bullshit is this?
Or installers that try to create/use a C:\tmp dir, rather than using %TEMP%
If I remember correctly, some installers that 'crap' files in the root, actually check to see which drive has the most space free, and craps on that one. So you can end up finding runtime files etc on D:, E: etc. as well :-/
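For contrast, honouring the user's temp directory instead of hard-coding C:\tmp is nearly a one-liner. A portable sketch (on Windows you would normally just call GetTempPathW, which itself falls back through the TMP/TEMP environment variables; TempDir here is a made-up helper):

```cpp
#include <cstdlib>
#include <initializer_list>
#include <string>

// Resolve the temp directory the way a polite installer should:
// consult the environment (%TEMP%, then %TMP%) and only then fall
// back to a fixed default - never the other way round, and never
// "whichever drive has the most free space".
std::string TempDir() {
    for (const char* var : {"TEMP", "TMP"}) {
        if (const char* v = std::getenv(var); v && *v)
            return v;
    }
    return "C:\\Windows\\Temp";  // last-resort default, never C:\tmp
}
```

On a machine where neither variable is set, the fixed fallback is returned, which is still a sanctioned location rather than debris in the root of a random volume.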
Hah, we do the same as Microsoft.
Chicken and egg problem, you cannot make the runtime installers with something that itself needs the runtimes to be installed to run!
Now, I am the first to bash Microsoft, but in this case I don't entirely disagree - other than that I'd get rid of the runtime installers entirely and keep things clean with a single DLL, like mingw manages to do.
It looks like MS are trying to update their installers, such that they're backwards compatible through to XP.
One major problem there is that none of their current tools support XP any more. So, they'll need to patch in some XP support to something modern, or add in the required fix to something pretty old, which can then be used to generate "fixed" installers.
What's the bet that whoever could do the required patching has either moved on from MS in the years since, or moved up a few ranks in seniority and doesn't remember that crap any more anyway? ;)
I put my own installer on github a while back. There are others, too. FREE ones. Mine's so brain-dead simple to use it could make anyone who's ever used InstallShield or tried to program for MSI cry...
(too bad I suck at marketing or I would've pwned it 15 years ago)
He might be right about there being a bug (well, undesired behaviour; loading local DLLs isn't a new or unexpected thing) but he comes across as a shouty idiot.
If you want to be taken seriously learn to interact properly with actual human beings rather than acting like a FUCKING basement dwelling LOON.
Unless the fine article has missed the crucial bit, he also overstates the seriousness of the problem in real life.
There *is* a problem if you persuade someone to download the VC runtimes directly and then execute them from wherever your browser dropped them. That is likely to be a directory where your browser has dropped other stuff in the past and there could be rubbish there. However, hardly anyone does that.
In nearly every case, these runtimes are redistributed as part of a larger package and a bootstrapper kicks off the MSIs one by one. Typically, the packages are distributed on a CD-ROM or equivalent image and are therefore launched from a directory with contents that are entirely under the control of the setup author and entirely benign.
Firstly, it sounds like a bug where the installer is trusting its environment (the Downloads folder, or wherever it is run from), and using the (standard and well-documented) Windows way of searching for libraries - current folder, system folder, %PATH%. Presumably the fix (which according to the article was done in a subsequent version) would be to load the DLLs the installer uses only from the system path instead. Of course, you have to trust that what is in the system folders is legit, but if it isn't, you have bigger problems.
MS slipped up by not updating the version of the tools it uses to build the installer. That's a mistake on their part, but not exactly a mistake that nobody else would ever make.
Visual Studio is not part of Windows and because Microsoft believe in Chinese Walls, stuff that isn't controlled by the Windows team does not get distributed with Windows.
Each build of Windows is, of course, built with a particular version of Visual Studio, and so that version of the runtimes will be bundled. However, earlier versions probably won't be, and later versions clearly can't be (at least until Raymond Chen finishes his time machine).
C and C++ have long ignored the modern issue of software correctness. These are bugs that could be automatically discovered and guarded against, but they are put there by unwitting programmers. The C philosophy has always been some misguided notion of 'programmer freedom'. But this is really naive and stupid - bad hackers can use these bugs to attack systems, and what is more, C and C++ are the perfect hacker tools. Software correctness and security are two sides of the same coin.
It is time the industry moved to modern tools and languages.
Well, C++ has the problem of being so complex that nobody can understand it, with the result that it's easy to shoot yourself in the foot.
C, on the other hand, actually is simple enough that you can understand what you are doing. So with C you have a chance of writing correct software, if you are _really_ careful and have good experience in at least one assembler.
Considering the details of the PDF format, with its many weird complexities, there aren't many high-level languages that could deal with it in a sane way. After all, PDF allows you to wrap all kinds of data in all kinds of formats. A text can be encoded as lossless JPEG2000, wrapped in Base64, wrapped in "gzip". It was never designed to be parsed with a proper parser.
Perhaps to use an image.
C is like a room with a white floor and dark holes in it. Yes, the holes are dangerous, as you can fall into them, but with a bit of experience you can see them and take proper measures.
C++ is like the same room, but with a layer of carpet applied. The holes are still there, and you can still fall in, but seeing them is somewhat more difficult.
Other languages use different ways to solve the problem which could perhaps be seen as flooding the whole room with water (so you can swim) or filling the holes with concrete (so you can't fall in, but also not reach the walls of the holes). They all have their advantages and disadvantages.
"So with C you have a chance of writing correct software if you are _really_ careful and have good experience in at least one assembler."
You can write correct software if you can hold all the details in your head and then are really careful. However, there is a contradiction in your sentence: that complexity is actually brought about by C itself, because C gives little help in this area where a modern language would.
Programming is also not about machine code or assembler. Maybe it is helpful to know that for C, because C exposes these details and expects you to think in this way. But that is precisely what makes C an outdated language, and ultimately insecure.
Biting the hand that feeds IT © 1998–2019