Forth has been doing it the exact way he says is terrible for 40 years. When writing an application on an embedded Forth system, the application and the kernel are at the same level. There's no protection whatsoever. You're free to f**k up with your poorly written software in any way you want.
Forth has been used in countless space experiments on the shuttle and other space systems for decades. IIRC 10 of the 12 CPUs on the Philae lander and the Rosetta orbiter were Forth CPUs. It's also been used in most of the world's observatories (controlling radio telescopes) for years.
Forth is an amplifier. Badly written code shows up real fast as badly written code. However, you *can* write code right on the hardware and it can work just fine. It just takes discipline and good procedures and management. It can be done. It has been done.
All that said, he has a point. These walls between OS and application software are necessary, because software *is* buggy, and software does crash. OSes are buggy too. Part of the problem is simply down to the complexity of modern OS and application software. When a Swing library in a Java program is rendering its window on the screen and painting its buttons, putting text in a text box, etc., how many levels of abstraction are there between it and the graphics hardware? A thousand? Two thousand?
If we want more reliable software, we have to write simpler software.
Forth, which is still around and still used, takes all that away. It is simple enough that (as in my case) the entire workings of the Forth kernel can be understood and held in the head of one person (I should know; I wrote my own Forth system) and, by extension, the applications written in it, too.
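To give a flavour of that simplicity (this is just an illustrative sketch, not code from my system): a complete Forth definition needs no headers, no imports, and no runtime machinery beyond the kernel itself. A colon starts a definition, a semicolon ends it, and the stack comment in parentheses documents the inputs and outputs.

```forth
\ A complete, working definition: average two numbers on the stack.
: average ( a b -- avg )  + 2/ ;

\ Use it interactively, straight at the kernel:
10 20 average .   \ prints 15
```

That's the whole abstraction stack between you and the hardware: your word, the words it calls, and the inner interpreter. There is nothing else in memory to reason about.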
To be fair, the applications written in Forth are vastly simpler than those written on contemporary PCs. We tend to write on the metal, in deeply embedded or industrial control environments, where software can be much simpler, and the only code in memory is the code that *you* put there, because it is specifically needed for something that you understand. PCs have the entire kitchen sink in memory, and any one of those pieces could go wrong.