Gee, you think?
The mind absolutely boggles ... We grokked this back in 1979.
Marketards & Manglement, in the current era, are a clusterfuck waiting to implode when it comes to computers & networking.
Innovations in mobile and cloud computing, social technology and the use of "big data" present an emerging risk to organisations' IT security, experts have warned. The European Network and Information Security Agency (ENISA), which is an EU advisory body, said that those technologies would increasingly provide the platform for …
@DAM: Look up the IBM 5100, from 1975. Xerox's NoteTaker followed in 1976. The particular machine I was discussing was GM Research's Micro Star, from 1979. The place where I worked in that timeframe (Bigger Blue) recognized the issues that such a tool would add to corporate, government and military security when combined with new-fangled electronically coupled modems.
@AC10:31: We had flight simulators running in 64K before the original IBM PC was released. "Big" is relative. Corporate data was corporate data even during the Industrial Revolution.
The more we forget history ...
The majority of definitions of Big Data do refer to the size aspect
What you've given there is an example of unstructured data, which is a different issue and can be seen as a subset of big data.
Personally I think the term "big data" should be avoided in a proper technical discussion; its definition is far too loose, and size is relative.
"The majority of definitions of Big Data do refer to the size aspect"
No, they usually refer to "large data sets", which usually means lots of pieces of data rather than large data. For instance, an exabyte file system holding one piece of data is not a big data problem: assuming serial data, you only need to process one file, and you can't split it into multiple workloads. That's a traditional data problem, easily solved by increasing throughput.

It's largely coincidental that "big data" tends to be large data; what usually matters is that you have information which can be processed in parallel, or many instances of data, and that is what leads to the "big data" problem. The challenge with big data is processing it in a reasonable time frame, without processing time exploding with data set size as it would using traditional analysis techniques.