True dat
I've long been of the mindset that all computing is just operations on data, whether it's stored in a database or in memory (indeed, in-memory databases are bringing us full circle).
Data was born around 20,000 years ago, around the time the last ice age was at its peak and Cro-Magnon man was appearing in Europe. Data was made possible both by those early humans' minds and by their ability to store facts outside their brains. Why the human mind? Because data doesn't exist outside the context of the mind…
Computing can be reduced to a number of formalisms. Saying "it's all about data" is no more insightful than saying "it's all arithmetic" or "it's all function construction and application" or "it's all compression" or "it's all moving heat around".
In fact, from that set, there are probably more interesting insights to be derived from, say, "it's all compression" than from "it's all data". In particular, the rather dubious analogy Whitehorn draws between tally sticks and writing on the one hand and structured and unstructured data on the other, while a decent bit of parlor discourse, seems too shallow to illuminate anything significant.
And of course the history of forms of data-keeping was hilariously abbreviated, when not outright wrong, but what can you expect from this sort of article?
"Try to think of a computer application that doesn’t manipulate data. It is a crucial test because it is impossible."
Muhammad Ali said "Impossible is not a fact. It's an opinion. Impossible is not a declaration. It's a dare. Impossible is potential. Impossible is temporary. Impossible is nothing."
The nul device driver.
The Windows System Idle Process.
"
Those example do manipulate data as if you look at them at the assembler level as those instructions do contain data (location for jumps if nothing else). I guess it depends on your definition of both computer application and data.
"
They may *contain* data, but do they *manipulate* data?
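For illustration, here's roughly the entire data path of a null device - a minimal sketch after the style of Linux's write_null handler in drivers/char/mem.c, simplified rather than verbatim kernel source:

    /* Write handler for a null device, in the style of Linux's
     * drivers/char/mem.c. The buffer is never read or copied; the
     * driver simply claims to have consumed all count bytes. */
    static ssize_t write_null(struct file *file, const char __user *buf,
                              size_t count, loff_t *ppos)
    {
        return count;   /* discard everything, report success */
    }

Whether "return the length you were handed" counts as manipulating data is precisely the definitional question.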
Actually, the real riposte is that there are some computer applications that don't manipulate very much data, and for which computing, rather than data manipulation, is the most important thing they're doing.
The earliest computers - excluding Colossus from consideration - were basically thought of as programmable scientific calculators, and that's how they were used. Some programs would take long lists of numbers as input (and thus clearly did deal in data to a significant extent); others wouldn't.
But in business applications, data was relevant before there were computers - companies kept boxes of punched cards for transactions and for customers. And dealing with data was painful before random-access storage (the disk drive) came along... and once it became available, it made sense to start thinking about how to make better use of its capabilities. Then various early types of database (hierarchical, CODASYL) came along before the relational database was invented.
Guess what? I'm sitting at home watching a documentary called "Sand Wars", which argues the world is running out of sand. All the construction using concrete and asphalt is the big reason. Apparently desert sand is too eroded to be used for this.
SAND?! We're running out of SAND?! I think we're screwed.
>Silicones are polymers that include any inert, synthetic compound made up of repeating units of siloxane, which is a functional group of two silicon atoms and one oxygen atom frequently combined with carbon and/or hydrogen.
Wikipedia is often full of shit (religion, politics, etc.) but it tends to get basic chemistry right. There's a reason the word silicon can be found in silicone.
Two things that should be added.
1) The first true Big Data system I can think of predated computers. The Dewey Decimal System (and similar schemata) analyzed books to classify their content in ways that would allow similar books to be grouped together on shelves. Not much, but still the first attempt at systematic, ideally repeatable, Big Data analysis.
2) ADP and EDP were two significant steps. Automated Data Processing, beginning with Hollerith's census machines, created the first machine-readable and machine-manipulable databases on punched cards. The databases were tabular and unidirectional - while you could use collators and sorters to select and order the stack, the final accounting-machine run went sequentially. Electronic Data Processing (i.e., what we now think of as computers) allowed databases to be accessed in any order. EDP also allowed the creation of databases accessible via indices, which was key to developing relational databases (see the sketch below).
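To make the sequential-versus-indexed distinction concrete, here's a toy sketch in C (the record layout and function names are made up for illustration; nothing here corresponds to real tabulating equipment). The first function is the accounting-machine pattern, one pass over the whole deck; the second is what an index buys you, direct access to any record:

    #include <stddef.h>

    /* A toy "punched card" record. */
    struct record {
        int  id;
        char name[32];
    };

    /* ADP-style: one sequential pass over the deck, O(n). */
    const struct record *find_sequential(const struct record *deck,
                                         size_t n, int id)
    {
        for (size_t i = 0; i < n; i++)
            if (deck[i].id == id)
                return &deck[i];
        return NULL;
    }

    /* EDP-style: an "index" (here just records kept sorted by id and
     * searched by bisection) reaches any record directly, O(log n) -
     * the random-access pattern that made relational databases
     * practical. */
    const struct record *find_indexed(const struct record *sorted,
                                      size_t n, int id)
    {
        size_t lo = 0, hi = n;
        while (lo < hi) {
            size_t mid = lo + (hi - lo) / 2;
            if (sorted[mid].id == id)
                return &sorted[mid];
            if (sorted[mid].id < id)
                lo = mid + 1;
            else
                hi = mid;
        }
        return NULL;
    }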
As always, there are long chains of ideas and innovations in this domain that lead to the present. Joanne Yates' highly readable study of business communications, Control through Communication, describes the major innovations in business DP in the US during the nineteenth century, for example. The transitions from pigeonhole desks to chapbooks to flat filing to vertical filing were hugely important, as were those from manual copying to spirit duplicators to carbon paper to xerography; and those from quills to fountain pens to typewriters. (And, of course, none of these transitions were abrupt; we still have spirit duplicators and mechanical pens, etc.)
Foucault's The Order of Things, though controversial, documents many of the epistemic changes in Early Modern and Modern Europe regarding the nature, manipulation, and organization of information. There are numerous competing views, of course.
If memory serves, the first episode of James Burke's Connections (and first chapter of the follow-on book) concerns the invention of the modern digital computer, hitting such now-well-known highlights as Jacquard's loom and player pianos. While it doesn't have the depth expected of an academic treatment, it's good fun and enlightening if you aren't familiar with all the bits he discusses.
And so on. I have shelves' worth of books that touch on the subject, and it's not one of my research areas. Well-trodden ground.
Yes. VisiCalc was the original "killer app" that got PCs (mainly the Apple ][) into small business.
For importance in legitimating PC use in business, its only real competitor is probably the IBM PC, which drove PC adoption by larger businesses thanks mostly to the IBM name, and secondarily to its ability to replace 3270 terminals on management desks with a more functional device (because it could both act as a 3270 and run spreadsheet and word-processing applications).
It's not clear that many people actually used PCs as 3270s (until much later, when TCP/IP stacks and TN3270 clients became common), particularly since third-party hardware was required until the 3270 PC came out, but it was one of the marketing points IBM emphasized when selling them to large businesses. It made the IBM PC seem "serious": you could use it to run your mainframe-based management apps.