Re: Far too late to worry
"That's over 20 years of data that "they" have about us."
1984 was nearly 30 years ago.
However, the ability to collect, aggregate, and analyse such data in bulk was difficult for most of that time. It was only after about 2000 that Wireshark on a PC started to outperform the specialist network analysers. The latter were still dependent on filters to limit the amount of traffic they had to store and analyse.
I remember attending a seminar where a manufacturer unveiled their new, and very expensive, network monitor. It had only enough memory to capture a few seconds of traffic from one link - at a speed that wouldn't even suffice for home broadband these days. The salesman's solution was "You have to learn to use the filters". My reply was that if I knew which filters were relevant to the particular case - then I would already know what the problem was.
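That chicken-and-egg problem with capture filters can be shown with a toy sketch (the packet records and the `capture` helper here are my own invention for illustration, not any real analyser's API): anything the filter rejects is lost before you can look at it.

```python
# Toy model of a capture filter: packets are simplified to
# (src_ip, dst_ip, dst_port) tuples for illustration only.
packets = [
    ("10.0.0.5", "10.0.0.9", 80),   # web
    ("10.0.0.7", "10.0.0.9", 443),  # TLS
    ("10.0.0.5", "10.0.0.2", 53),   # DNS
]

def capture(packets, keep):
    """Apply a capture filter: anything it rejects is never stored,
    so it cannot be examined later."""
    return [p for p in packets if keep(p)]

# To write this filter you must already suspect port 80 is the problem -
# the DNS and TLS traffic is discarded before analysis even starts.
suspect_web = capture(packets, lambda p: p[2] == 80)
```

If the fault was actually in the DNS traffic, the filtered capture contains no evidence of it at all.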
By about 2010 the network traffic at node points in a company network was again outstripping the available monitors' ability to capture it without connection filtering.
About that time I looked at the big brute-force storage arrays that were becoming available to do mass collection. It still raised the question of how much computing power was going to be needed to analyse and correlate all the TCP/IP connections fast enough to keep up with the collection volumes.
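The core of that correlation job is grouping captured packets into connections. A minimal sketch, assuming simplified packet records of my own devising (real flow tools such as NetFlow exporters do something similar, keyed on the 5-tuple), shows why the state grows with traffic volume:

```python
from collections import defaultdict

# Hypothetical packet records for illustration:
# (src_ip, src_port, dst_ip, dst_port, proto, byte_count)
packets = [
    ("10.0.0.5", 50100, "192.0.2.1", 443, "tcp", 1500),
    ("10.0.0.5", 50100, "192.0.2.1", 443, "tcp", 900),
    ("10.0.0.7", 50200, "192.0.2.8", 80, "tcp", 400),
]

def correlate(packets):
    """Group packets into connections keyed by the TCP/IP 5-tuple.
    One dictionary entry per distinct connection: the working state
    grows with the number of concurrent flows, which is why keeping
    up with bulk collection demands so much memory and compute."""
    flows = defaultdict(lambda: {"packets": 0, "bytes": 0})
    for src, sport, dst, dport, proto, size in packets:
        key = (src, sport, dst, dport, proto)
        flows[key]["packets"] += 1
        flows[key]["bytes"] += size
    return dict(flows)

flows = correlate(packets)
```

Every packet touches the flow table, so the correlation cost scales with the capture rate as well as with the number of live connections.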
It is probably still a race between network link data collection and the growth in networked traffic over multiple delivery paths. The monitoring is probably losing.
The place to capture data is at the firewall, where an abstract of the traffic can be obtained that is already defined by the higher protocol layers. No doubt firewalls would need to be much bigger beasts to handle even that.
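The kind of abstract I mean might look like this sketch (the connection-log format and `make_abstract` helper are illustrative assumptions, not any real firewall's output): the higher-layer service name, not the raw packets, is what gets kept.

```python
from collections import Counter

# Hypothetical firewall connection-log entries for illustration:
# (client_ip, server_ip, service). The service field reflects the
# higher protocol layer the firewall has already classified.
connections = [
    ("10.0.0.5", "192.0.2.1", "https"),
    ("10.0.0.7", "192.0.2.1", "https"),
    ("10.0.0.5", "192.0.2.8", "dns"),
]

def make_abstract(connections):
    """Reduce the traffic to a per-service summary: no payload is
    retained, only counts defined by the higher protocol layers."""
    return Counter(service for _, _, service in connections)

summary = make_abstract(connections)
```

A summary like this is tiny compared with a raw capture, but producing it still means the firewall classifies and counts every connection in real time, which is the extra load I mean.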