* Posts by Elledan

23 posts • joined 3 Mar 2017

Rust in peace: Memory bugs in C and C++ code cause security issues so Microsoft is considering alternatives once again


Re: Don't be a tease

Speaking as someone who has actually done work on the Mozilla codebase (around version 3.6 or thereabouts...), I can say that the problems Mozilla was having had absolutely nothing to do with C++ and everything to do with the lack of a proper build system (a 60,000-line script in the root), no clue about segmentation (the build system tossed every header file in the entire source tree into a global namespace), zero documentation (aside from the odd inline comment) and, in essence, a burning dumpster fire that Mozilla pretended was a functioning codebase.

But sure, their problem was C++.


Re: Don't be a tease

Rust is totally recoloured C++. And since it uses inferred typing by default (explicit typing requires a lot more typing than in C++ and rarely appears in example code), it effectively uses weak typing.

If you want strong typing, use C++, or even better: Ada. That's a language which doesn't allow inferred typing. At all. Not even mixing a typedef with the original type it was derived from; that gets you a nice compile error.

That the Rust creators don't even know Ada exists says a lot about the language.


Don't be a tease

So if Microsoft switches fully to Rust, or Kotlin, or Go, or whatever the 'awesome language' flavour of the month happens to be, then what does that say about Linux and all those other open source OSes out there? What about LibreOffice and similarly massive desktop applications? Should Unreal Engine be rewritten from C++ as well?

Colour me sceptical, but this whole story feels like knee-jerking to me. Anyone who has actually looked at and used Rust notices that it's basically just recoloured C++, with an even more obtuse syntax, an almost fully inferred type system (you know, weak typing like in scripting languages) and, instead of OOP, the not-very-intuitive traits system (which some enterprising C++ devs are adding to C++ as we speak).

There was a joke going around that Mozilla invented Rust because they couldn't admit to their C++-fu being too weak and their codebase being unmaintainable, every poor development practice in the book having been poured into it since the 1990s, when it was still Netscape. Programming isn't magical, after all. It's still engineering, and no matter what materials and tools you pick, you still have to put in the hard work.

Out of Steam? Wine draining away? Ubuntu's 64-bit-only x86 decision is causing migraines



So while Windows not only continues 32-bit and 64-bit support but now adds 32- and 64-bit Linux (WSL) support on top, Linux (face it, at this point Ubuntu is 'Linux' for most people) is losing features?

I'm still no huge fan of Windows 10, but I really appreciate how I can still run old and new 32-bit Windows apps, even those using legacy APIs, and with WSL and the upcoming WSL 2 integrate the Linux world into my day-to-day workflow without jumping through hoops.

I wonder whether it's just a lack of resources on Canonical's side, or an actual difference in philosophy. If Ubuntu drops this support, does that mean that Linux Mint and Debian are also affected?

Nintendo Labo: After a day spent fiddling with flaps, you may be ready to, er, Lego


That warm, fuzzy feeling

Right from the moment Nintendo Labo was announced, I had the feeling it might just turn into something awesome. Sure, the functionality it brings isn't world-shattering. As the fine article rightly points out, Lego (and others) have been there already.

The biggest innovation Labo brings is that it's _cheap_. This lets the children who get a Labo kit start glancing at all the wonderful cardboard, rope and tape lying around the house and begin picturing their own creations, reusing a Labo kit or taking inspiration from it.

Compare this to Lego Mindstorms and kin, which, when it all comes down to it, are rather dull. The contraptions you can build with them are fairly limited; you're not going to build a grub-stomping robotic exoskeleton any time soon. Buying additional Lego kits is expensive, and as I have noticed over the past few years, Lego has become less about 'here's a pile of bricks, make something with it' and more of a 'here's a design, build it according to the instructions' kind of thing.

It's hard to beat the cost, flexibility and wide availability of cardboard here. Want to repair or extend something? Just grab that cardboard Amazon box before your dad throws it into the bin and scurry off with it like a prized possession, before turning it into another marvellous invention.

The parents will no doubt also appreciate the lack of constant pestering for $200+ Lego kits by their offspring :)

MH370 search ends – probably – without finding missing 777


Re: Silver lining

If I had been on the MH370 recovery mission, I would have looked on the map beforehand, I promise :)


Silver lining

Despite the tragic loss of an entire airplane along with its passengers and crew, MH370 has reignited a number of discussions about how to deal with possible future crashes like this, where airplanes are lost in uninhabited, hard-to-reach areas or, as in this case, in the middle of an ocean.

What will come out of this is hard to say, but it may be that airplanes will in the future be tracked live by satellites, or we will have better recovery methods. Having much improved maps of large parts of the Pacific Ocean courtesy of a few years of detailed scans cannot hurt either.

We saw previously with AF447 (lost over the Atlantic Ocean) how complicated recovery can be, even if the location of the airplane is roughly known. That search took two years (2009 - 2011), in a relatively well-defined search area, carried out by a number of search and recovery missions.

As AF447 was following (more or less) its scheduled flight path, it was fairly easy to deduce the rough location of the crash. As MH370 shows, with massive uncertainty about the actual flight path we are left to make educated guesses based on washed-up wreckage and ocean currents.

We will have to see what is deemed realistic in terms of technological options and available budget in the (cut-throat) passenger airline industry to prevent another MH370 mystery. Beyond that we will have to see when MH370's remains will be found. Hopefully before its FDR and CVR black boxes have to be written off completely.

Half of all Windows 10 users thought: BSOD it, let's get the latest build


The dark side of rolling releases

One thing which I have always liked about Windows was that you'd get the big new version (WinXP, Win7) and any issues with it would be ironed out with service packs and patches. The result would be a better version of the original. The current version of Windows 7 which I am using on my systems is a better version of the Windows 7 which I originally started using in 2009.

I also don't have to worry about a next big upgrade breaking anything or changing anything fundamental about the way the OS works (UI or API level).

During the time that I used Windows 10 on a work laptop (Lenovo ThinkPad T470p), I found myself struggling with an OS that was somewhat like the Windows I was used to, but also beset with issues (configuring IME input is either half-broken, or I have to RTFM), and the limited options in Windows Update meant that, other than deferring updates, I had to suffer through poorly timed automatic reboots.

A few rolling releases later, I had a Windows 10 that had had more features shovelled into it (admittedly, WSL is slick) but still felt mostly half-formed and incomplete. This isn't going to get any better either, as each year's update(s) has to be seen as a new OS release, according to Microsoft.

I'm not sure I could commit myself to the mayhem that comes with all this, other than maybe opting for the Enterprise version (in so far as it's available...) and hopping from LTS to LTS version as each support lifespan runs out.

This basically makes it very easy to just stick with boring ol' Windows 7 until 2020 (and beyond?) instead and miss out on all the exciting fun that Windows 10 brings.

This makes me sad, because Windows has been my primary OS since 98 SE.

Big bimmer bummer: Bavaria's BMW buggies battered by bad bugs


Crossing fingers

It's always fun to see a pile of CVEs for a system you actually did work on for a number of years. Curious to see whether the component I worked on was involved in any way or not.

It doesn't seem like BMW learned from the plain-text authentication issue its ConnectedDrive system had a number of years ago.

As for QNX, it's not a bad OS, just horribly proprietary, expensive (I got quoted 10,000 Euro per seat) and saddled with incredibly outdated tools and environments (GCC 4.4.2 with the Dinkumware STL on 6.5.x). Developing for it reminded me of using a Linux distro from more than ten years ago.

I'm wondering whether their Yocto-based Embedded Linux infotainment systems are similarly affected.

Have you updated your Electron app? We hope so. There was a bad code-injection bug in it


Desktop application framework?

Calling Electron a 'desktop application framework' is like calling Google Chrome an 'OS'. In the end, Electron is merely a fancy wrapper around an HTML/JS website. Like any browser, it needs regular security updates, lest it become the security risk its attack surface promises.

With Electron 'apps' being just websites which rely on remote APIs, allowing one to run arbitrary JavaScript in those browser instances seems like an exquisitely poorly considered idea.

Latest F-35 flight tests finish – and US stops accepting new jets


Re: This plane has been

I wouldn't put too much blame on the programming language used. We don't have details on the exact runtime and configuration used for the software, and there are ways to make C++ far less tolerant of human fsckups and provide some level of runtime recovery. Of course, Ada was designed from the ground up to brutally destroy any semblance of a 'bad idea' :)

That said, the way the project is managed is usually the first part of the problem. Add to that poor and shifting requirements, unrealistic deadlines, high turnover due to burn-out and a general sensation of 'rushing through things', all leading to demotivated developers.

Of course, Ada is the better choice for avionics and related fields, and I say that as primarily a C++ developer. Yet there are few technical solutions to incompetent management.

Just be glad they didn't join the Java hype, I guess, or switch to NodeJS half-way through :)

Reg man wraps head in 49-inch curved monitor


You get used to it.

Many years ago (~2000) I opted for a Matrox G550 card just to be able to run two CRT monitors side by side. Since then I have gone from dual-monitor setups to my current triple-monitor setup, with a total desktop resolution of 7,680 × 1,440 pixels.

Having a central monitor is really nice for one's main content/editor/document, with the two other monitors holding documentation, references, simulation outputs, IRC windows and so on. It's also great to have a single VHDL simulation's traces span all three monitors, to get that crucial overview when debugging.

The fun thing with using three monitors, and trying to use all three DisplayPort outputs on one's GPU (GTX 980 Ti), is discovering that three DP outputs are more of a gimmick: regularly one of the monitors will desync because the DP hardware cannot keep up. Switching one output to HDMI fixed that. A single large monitor would sidestep the issue, but I like having the separation between displays.

When doing drawing (drawing tablets), it's nice to have the tablet's surface not mapped to one massively wide monitor, but just to the center one.

That said, I have been tempted to try to play some games spanned across my entire desktop surface. Just because I can :)

Here's how we made a no-fuss RSS vulture app using trendy Electron


Call me old, but...

One of my current laptops, which I frequently use, dates back to 2007 (AMD X2), and it's zippy enough for just about any task. The only things the poor machine struggles with (even after doubling the RAM to 4 GB and adding a 7,200 RPM HDD) are anything involving lots of JavaScript and, of course, Java.

Running something like TweetDeck in a browser tab is a great way to bring Linux to its knees. Running an Electron 'app' next to the browser is a ridiculous proposition.

As a primarily C/C++ dev, I would have written an app like this in something like wxWidgets or Qt, kept it cross-platform, and had it use less than 20 MB of RAM.

Now get off my darn lawn.

Unlucky Linux boxes trampled by NPM code update, patch zapped


To be fair

To be fair to those who had to reformat their Linux systems courtesy of this bug: I remember well from my previous job how incredibly hectic the pace of JS dev is. It's at the low-paid end of the market, so there's no real time (or budget) to invest in careful testing and development.

Add to this the sheer bulk of the dependencies NPM downloads, the incompatibilities between NPM versions (and the modules they install) and the commonly seen attitude of 'it worked on their system, so it should work on mine'.

As a result, JS dev all too often results in a frantic mashing of keys, repeatedly removing the node_modules folder, changing NPM versions, repeating 'npm install' until hopefully things Simply Work Again (tm).

I cannot say that I would care to ever return to that world.

*Wakes up in Chrome's post-adblockalyptic landscape* Wow, hardly anything's changed!



There are moments when I browse the Intertubes without an adblocker. The last time was when I had received a new work laptop and was trying to just be productive on it (not browsing dodgy sites, promise!).

It took me about a week before I broke down and went the full NoScript and uBlock Origin route on that system as well. That dodgy sites have horrific ads (hello, The Pirate Bay!) is a given, but seeing software and hardware dev-related sites afflicted by the same ad-network-induced plague made the decision easy.

Add to the visual cancer and blood-pressure issues the fact that ad networks (via JS) are also the primary attack vector for malware and kin. Ad networks should be paying me to vet the safety of their ads, since they are seemingly both too disinterested and too incompetent to do so themselves.

F-35 flight tests are being delayed by onboard software snafus


Re: C++?

Ada is used for things where any glitch would be absolutely, completely, totally, beyond any shade of doubt inexcusable. Things like an airliner with hundreds of passengers falling out of the sky, an ICBM veering off course and circling back on its country of origin, or a satellite worth millions vaporising because the rocket launching it got confused.

A military jet going SNAFU or worse is deemed 'acceptable'. I would bet that the scope of the F-35 programme is also partially to blame, along with the allocated budget. Ada programmers (I pretend to be one occasionally as well) tend to be pretty rare, let alone ones who know the language inside out. Unless you're going to shell out the big bucks to get enough of them, you have to settle for the cheaper, more plentiful options, which means C or C++.

Before the Ada requirement was dropped (as noted), they'd simply have had to get enough Ada programmers together; now they can get cheaper devs, and likely cheaper managers as well. Speaking as a senior C++ dev and dabbling Ada dev, I can honestly say that C++ is just fine if you stick to well-defined requirements and follow a well-designed architecture during implementation. Ada mostly catches the 'oops' issues which tend to slip through the cracks in C/C++ dev (and other languages...).

That's a long-winded way of saying that it's not the language that's the problem, but primarily the management that has to turn a set of requirements into a functional design and implementation.


Re: I can understand....

C++ isn't always C++. The name primarily refers to the specification, but implementations can differ significantly. Look at managed C++ on .NET, for example, or the various runtimes which add safety features that make it more akin to Ada. Without more details it's hard to condemn them on this fact alone. Be grateful that they didn't pick Rust or JavaScript, at least :)

That said, I agree that using Ada would make sense here, mostly due to the strict definition of the underlying hardware in this application, which theoretically makes it fool-proof (until a better idiot is encountered, of course).

Crypto-jackers slip Coinhive mining code into YouTube site ads


Static HTML is the best

For years now I have been used to browsing with ads and JavaScript disabled, only (temporarily) whitelisting a URL when I absolutely need to use a feature on a site. Lots of websites, however, simply do not work without JS enabled, often displaying just a blank page or something equally useless.

With Meltdown and Spectre we saw JavaScript-based PoCs which can (potentially) steal private data, including passwords and the like. JS is also a common infection vector for malware, with crypto-coin mining the most recent variant to become popular.

The last time that I felt that I could browse safely around without having to install a lot of filters to remove all the (dangerous) crud from today's WWW was probably somewhere in the late 90s.

Did the advertising thing just go totally overboard? Is it time to treat JavaScript-based sites with the same disdain as we treated Flash?

Wondering where your JavaScript libs went? Spam-detection snafu exiled npm packages


The better way

One thing which amazes me about the JavaScript and Java ecosystems ('swamps'?) is that their idea of 'dependency management' involves pulling libraries and snippets from all across the internet. Whether it's NPM or Maven (Nexus), the issue remains the same. Good luck compiling that one-year-old project when some crucial dependencies no longer match or have gone AWOL.

Or hey, maybe you fancy compiling/building a project somewhere where pulling down half a GB of dependencies won't go over so well?

In the very comfortable world of C/C++ development, libraries usually take the form of a few MB (or kB) of source, plus a DLL/SO file if a library is particularly big. You get them from your package manager (Aptitude, Pacman, etc.), a repository curated by the same people responsible for your OS (or close enough, as with MSYS2).

Of course, the Java ecosystem seems to be currently moving to a dependency-less environment, minus one: the entire Spring universe. As soon as Java on Android is dead, they can probably just drop the 'Java' name and go with just 'Spring' :)

Oracle VP: 'We want the next decade to be Java first, Java always'


Design-by-committee languages suck

No unsigned integers (now awkwardly crammed back in), only big-endian support, everything is a reference and a potential null (argh!), the horrors of the JNI, no multiple inheritance, no pass-by-reference, no default parameters, no defining of custom operators... etc.

Speaking as a primarily C++ dev who also works in other languages (including Java, both on the web platform and on Android), I honestly cannot wait for the day that Java just goes away. It's such a horribly designed, flawed and broken-by-design language that it simply should not exist, let alone be praised in any way, shape or fashion. Somewhat like JavaScript, but I digress.

Here's hoping that Google will indeed be dropping Java soon as a first citizen on Android and we can observe the slow demise of a language-that-never-should-have-been :)

Java? Nah, I do JavaScript, man. Wise up, hipster, to the money


Re: Crippled C++

"Java IS simple."

The hours I have wasted debugging issues with the JVM's GC (on Android and servers), untangling multi-layered exceptions, fixing NullPointerExceptions (null types are a design flaw) and tracing code flow through a Java application built on interfaces and injection have made it clear to me that Java is not 'simple'.

If anything, Java lends itself to producing applications which are even less maintainable than if they were written in any other language. I have had more fun debugging C++ code which made heavy use of templates.


Crippled C++

After a number of years of programming Java, I have come to realise that Java is merely a regression in programming languages. It's essentially C++ with only classes and fewer types. Oh, and you only get pointers (the basic kind, no smart pointers), with no fancy pointer arithmetic to ease the pain.

I can see why COBOL is still doing well today: it's a well-designed language which went through literally decades of improvements to add to its versatility. Much like with JavaScript, however, I have no clue why Java is even popular, beyond it being pushed in an unrelenting fashion (oh hi, Oracle!) and having become part of CEO Buzzword Bingo.

But hey, C++ is just way too bloody complex (said whilst snickering).

Biting the hand that feeds IT © 1998–2019