"4K" kinda probably not going to need that much data
I mean, yes, higher Ethernet speeds will most likely be needed, even if it just means that your datacenter can be packed more efficiently over time. If you can replace four servers that each have a 10G connection with a single server with one 40G connection, you can achieve much higher densities.
However "4K" is just another video resolution. Since the advent of block based video codecs the resulting bitrate does not grow linearly with the number of pixels you want to transmit. You can see that with typical TV transmissions. HDTV roughly requires 4 times as many pixels per second than SD, however while SD typically runs at 4-10 Mbps, HD rarely is done at more than 13 Mbps on TV. It's just that there is not that much more relevant information in the picture to be encoded. "4K" will likely continue the trend and I'm guessing it'll be transmitted at around 15-20 Mbps. Better codecs will help with that, too.
So in short, there are probably many good reasons for 400Gbps, but 4K probably won't really be among them.