Back when I were a lad, "IT" was mainframes and shadow IT was PCs and Sun workstations. Actually, my first job was to look after some Apollo workstations that had been bought _after_ a run-in between a bunch of engineers and the corporate IT dept (corporate IT insisted on getting an IBM mainframe that was unsuited to number-crunching, and then charged royally for processor time).
It's one of the eternals of "the business" - IT depts need absolute control to run systems efficiently (= cheaply, really), but because it costs a lot of money to establish that much control, it only happens once a technology is mature. So the empire of control is vulnerable to new technologies, especially from vendors who don't have a legacy money-spinner at risk of being killed by something newer and more nimble.
There's no point in whingeing about the situation; there is no solution, only mitigation. That means monitoring the new technologies as they appear, looking at how they might be valuable to the business, and making a plan to work with early adopters to find the best way of assimilating those technologies into the existing infrastructure. Resist the temptation to tell all the users to wait for the supposedly equivalent product from $INCUMBENT_BIG_VENDOR, for it will always be compromised to the advantage of their legacy products.
One of the biggest problems with "the thrill of the new" wasn't really mentioned in the article, namely divergence: the multiplicity of cloud storage offerings, for example, results in silos of information and wasted end-user hours as people manually try to keep track of what information is where. That is probably the best reason of all for IT departments to engage with early adopters: at the very least, get agreement that no more than two competing solutions will be evaluated. Those early adopters can be your friends, as they have tremendous energy and power to influence other users.