Yes - we probably need to increase the local capacity... but it's also possible to have devices respond to local voltage, rather than just to grid-scale DFS (Demand Flexibility Service). And that could be a very easy way to improve the distribution of demand... as we move to more and more "delayed usage" type loads (hot water tanks, storage heaters, EV charging, home battery charging), those devices can actually drop their usage when local conditions require it.
Indeed some of them (specifically home batteries and EVs) can feed the local network, actively lowering the strain, and then recharge at a time when the local grid isn't as strained.
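To make that concrete, here's a minimal sketch of what "respond to local voltage" could look like for one of those delayed-usage devices. All the thresholds and power ratings are invented for the example (UK nominal is 230V, with a legal band of roughly 216-253V); a real controller would be tuned to the actual network limits:

```python
# Illustrative voltage-droop controller for a "delayed usage" device
# (home battery, EV charger, hot water tank). Thresholds are made up
# for the sketch, not taken from any real spec.

def power_setpoint(local_voltage_v, max_charge_w=7000, max_export_w=5000):
    """Map measured local voltage to a charge (+) / export (-) setpoint in watts."""
    if local_voltage_v >= 230:      # network is healthy: charge freely
        return max_charge_w
    if local_voltage_v >= 224:      # mild sag: taper the charging rate
        fraction = (local_voltage_v - 224) / (230 - 224)
        return int(max_charge_w * fraction)
    if local_voltage_v >= 218:      # deeper sag: stop drawing entirely
        return 0
    # serious local strain: export to support the network
    return -max_export_w

print(power_setpoint(235))   # healthy -> full 7kW charge
print(power_setpoint(227))   # sagging -> tapered to 3.5kW
print(power_setpoint(216))   # strained -> export 5kW
```

The nice property of a droop curve like this is that it needs no communication at all - every device reacts to the voltage it can measure at its own socket.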
That's clearly not going to "solve" all the local network issues, but our current domestic usage is really peaky, e.g. cooking dinner tends to be at about the same time for most households.
We can shape much of that usage with time-of-day tariffs - obviously E7 and E10 did that decades ago, and many ToD tariffs still follow the same pattern. But what you then end up with is everyone putting all their high-load devices on at exactly the same time, to take advantage of that cheaper rate.
For something like an EV - they spend ~23 hours a day parked. You don't really care when it gets charged, just that you keep it topped up enough to account for typical daily usage - or that it can be filled for a special journey. Similarly you don't really care when your water tank is heated, just that you have hot water in the tank when you want it.
So some sort of "avoided this load because of local conditions" credit would make the demand spikes at half time in sportsball much less of a drain on the grid: all the kettles go on, and high-power devices can simply go "you know what, voltage is dropping a bit, I'll pause for a few minutes". Some of them could even export for a higher credit.
The result is actually beneficial to the grid, with no significant effect on users.
My peak half hour power draw from the grid last year was 14.5kW, but my average was well under a kW.
Smoothing out that curve is entirely possible, but at the moment it's very beneficial to be fully lumpy - only 5% of my electricity for the year was used at "peak" rate... I actually don't care when peak/off peak is, I just schedule things to use the off peak whenever possible.
If I assume I can replace gas with a heat pump at a SCOP of 3 (very conservative), then my average electricity usage would still be under 1.5kW - but again it would be lumpy, and more seasonally lumpy than it is at the moment. January, for instance, would have averaged about 2.2kW.
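For anyone who wants to check that arithmetic: the gas figures below are my own assumptions (roughly typical UK household numbers, with heating concentrated in winter), not from the post, but they show how the averages land in the right ballpark:

```python
# Rough arithmetic behind the heat pump averages. The gas figures are
# assumed (roughly typical UK household numbers), not from the post.

baseline_avg_kw = 0.9        # "well under a kW" average electricity draw
annual_gas_kwh = 12000       # assumed annual gas for heating/hot water
scop = 3                     # conservative seasonal COP

extra_kwh = annual_gas_kwh / scop        # electricity replacing that gas
extra_avg_kw = extra_kwh / 8760          # spread over the year's hours
print(round(baseline_avg_kw + extra_avg_kw, 2))  # ~1.36 -> "under 1.5kW"

# January is lumpier: assume ~2800 kWh of that gas falls in January alone
jan_gas_kwh = 2800
jan_avg_kw = baseline_avg_kw + (jan_gas_kwh / scop) / (31 * 24)
print(round(jan_avg_kw, 1))                      # ~2.2kW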
There is no way you're telling me that the grid can't cope with a third of the country boiling a kettle at the same time.
UK electricity consumption is actually falling, despite increasing electrification:
2022 figures:
"Electricity demand reached a record low in 2022 of 320.7 TWh, down by 3.8 per cent from 2021. Electricity demand has declined year-on-year since 2015"
If you look a bit further back, that decline has actually been ongoing since 2005 (when we used 406 TWh - see Table 5.1).
So we know the grid can handle more than it currently does - by more than 25% - because it's already done it.
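The headroom figure follows directly from the two quoted numbers (406 TWh in 2005, 320.7 TWh in 2022):

```python
# Checking the "more than 25%" headroom claim from the quoted Table 5.1 figures.
peak_year_twh = 406.0    # 2005 demand
recent_twh = 320.7       # 2022 demand

headroom_pct = (peak_year_twh - recent_twh) / recent_twh * 100
print(round(headroom_pct, 1))  # ~26.6 -> the grid has already carried
                               # over 25% more than today's demand
```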
(Note that the effect of additional substations for new developments and towns isn't being taken into account here - those will spread the load even further, but I'm ignoring that.)