Let's stop acting like miserable old gits
People have been complaining about modern tools encouraging poor development practices ever since the first high-level languages and compilers appeared - probably even before then. But wasting endless developer time over trivial details is by far the greater problem.
If you criticise developers for every problem that seems obvious in hindsight, you create an environment where everyone wastes time worrying about hundreds of possible future problems that will probably never occur. The codebase gets overcomplicated dealing with all these non-issues, and maintenance gets harder as a result.
Personally, I was told a lot in the classroom, but it didn't mean a lot. I really learned through experience - i.e. by seeing the results of my own and other people's mistakes. The real value of the teaching was probably to help me recognise the true nature of those mistakes.
Where the same mistakes are still relevant with modern tools, the modern tools aren't at fault for 'hiding' them. In what sense are they supposed to be hidden? For instance, the problems with floating point in Java, C++ or Delphi are no different to, and no more hidden than, the problems with floating point in Fortran or whatever. But it's so easy to forget that we had to learn from our mistakes too.
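To make the point concrete, here's the classic floating-point surprise, sketched in Python for brevity - the behaviour is identical in Java, C++, Delphi or Fortran, because they all use the same IEEE 754 binary representation:

```python
import math

# 0.1 and 0.2 have no exact binary representation, so their sum
# isn't exactly 0.3 - in any language using IEEE 754 doubles.
a = 0.1 + 0.2
print(a)         # 0.30000000000000004
print(a == 0.3)  # False

# The error accumulates: ten lots of 0.1 don't sum to exactly 1.0.
total = sum(0.1 for _ in range(10))
print(total == 1.0)  # False

# The usual remedy is to compare with a tolerance, not with ==.
print(math.isclose(a, 0.3))  # True
```

Nothing here is hidden by the tool - the same mistake (comparing floats with `==`) bites in exactly the same way whatever the decade or the language.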
You think that's not the issue? You definitely remember being taught the problems with floating point in some classroom in the seventies, and you never once made that mistake?
Well, what about all those lessons you were taught all that time ago that turned out, with experience or simply as times changed, to be wrong, trivial or unimportant?
I remember a teacher making an endless fuss about flowcharts. No design was complete without a flowchart, and the flowchart had to describe every single detail of the code, because, after all, the design had to be directly and mechanically translatable into code.
You know flowcharts - every arrow is effectively a goto! And they are such low-level representations that, used as that teacher intended, they aren't design at all - they are just graphical code.
Basically, he was telling me never to hack together real testable code but rather to hack together untestable graphical code and expect it to magically work first time. In general, the best way to keep him happy was to write the real code behind his back and translate the working result into a flowchart. Don't tell me you never did the same!
But then he'd never worked on more than a few hundred lines of code at a time, as far as I can tell. He basically expected to work on one well-specified subroutine at a time, flowcharting no more than two or three loops, ready to code it in assembler. And he thought compiled high-level languages were a fad that would eventually be restricted to a few niches, insisting that there was no real gain since 'a single-line procedure call maps to a single jsr instruction anyway'.
The point being that he probably became a teacher because he was already a bit of a dinosaur even then, and was struggling to get development work.
So maybe you suddenly recognise the value of learning from experience, and of not obsessing over everything you were told in the classroom. A degree of scepticism is healthy, and you learn what is really important by doing the job and making an occasional mistake.
The world would be a strange place if the more experienced didn't know things that the less experienced don't. It's nothing to get angry about. Just be glad about it, in this age of change when the less experienced have an annoying habit of knowing things you don't.