This, and the next gazillion exploits, are the result of this simple recipe:
1) Take a human - any one will do. They all screw up.
2) Take a language. They all have their flaws, but pick one that doesn't seem to give a damn what you do with memory, like C++.
3) Blend and wait.
It's the 21st century now, and, as someone who was coding C++ back when there wasn't even a compiler for it, I have to wonder why it and other languages with similar flaws are still used so much. Sure, there's a small percentage of situations where bare-metal speed is worth it, but when you're writing software that will be deployed on a significant proportion of the devices in existence, using languages that make things hard to test and that so brilliantly hide the mistakes of us fallible humans seems positively stupid.
Can we stop now?
(I won't suggest another language. I've learned about five in the last year alone and I'm exhausted. Please agree amongst yourselves and I'll learn that one!)