Wow, I was only just reading MACH1 yesterday as I was on a nostalgia fest.
As soon as Google have this perfected I will have one, as long as it includes an ad-blocker
is an occasional column written at the crossroads where the arts, popular culture and technology intersect. Here, we look back at 2000AD's MACH 1 - the first secret agent with his own, in-body computer. In 1977, Pat Mills, the first Editor of 2000AD comic, created MACH 1, a strip telling the story of John Probe, a super- …
>(and it had rounded corners)
Not in Bob McCall's concept art it didn't. In the film it's hard to tell if the top corners were rounded (they sit against a desk of the same colour), but the lower corners, under what look like hard buttons, weren't rounded. Tch, silly Kubrick, not knowing IBM would sell their consumer division to Lenovo, and that Pan Am would go belly-up...
In one of Iain M. Banks's Culture novels, the habitat's AI requests that a message be passed on: "Sorry to trouble you, but you're closest... would you kindly inform the ambassador that he is speaking into his brooch?". A gentle dig at the Star Trek: TNG communicator, perhaps?
If you want the genesis of the fondleslab, read Niven and Pournelle's The Mote in God's Eye
Oh, please. While TMiGE is a good read (like most of N&P's efforts), it's hardly the first SF to feature handheld computers. Even the Stratemeyer Syndicate's Tom Swift Jr series of children's SF had them; Tom Swift and his Phantom Satellite (1954) includes the invention of a handheld computer called the "Little Idiot". The LI's capabilities are a bit vague (like most of the "science" and technology in the books), but it features voice recognition, among other things, so you could claim it anticipates Siri if you want to indulge in that sort of silliness.
For that matter, Niven's own 1967 short story "The Soft Weapon" features a handheld computer; it's not a slab, but that's because it does other things. It's sort of a Swiss Army Everything.
My point is that the general idea of handheld computing has long been floating around in SF, in many forms. Crediting one novel - particularly one from as late as the mid-70s - with it is purely arbitrary.
My favorite example of computer speech input in pop culture, though, has to be the Batcomputer from the 1960s Batman TV series. It used speech input, but for output it would spit out a punchcard, which Batman would read. In SF, speech recognition is often easy but speech synthesis hard.
"Heinlein had mobile phones in Space Cadet :) "
The neat bit is that he doesn't make a point of it being a mobile.
It's no big thing, so there's nothing to notice unless you're looking for it (it's the future), unless you understand (as he did) just how big a thing it would be to make it work. This was an era when computers ran on valves, and only on enormously important (or secret) projects, not running a telephone system.
Fahrenheit 451 is probably the most predictive bit of SF I've read (although the author claims the majority of his work to be fantasy).
Take, for example, how hard it was for Guy to get his wife to communicate with him while she had three screens on the go: a mixture of friends and family shouting at her (Facebook et al), redundant semi-interactive TV shows (take your pick of X Factor or WoW farming, I suppose), plus an inability to go more than a few feet without some sort of tech stimulus (an in-ear radio in the story, but the concept of people having withdrawal symptoms when offline is definitely there).
Of course he misses flat out in other respects, such as the extremely elongated advertising boards made easier to read while driving at high speed, and so on. I'll leave it up to you to decide whether the idea of oversimplified, dumbed-down government of a country by lies, half-truths and the hope that the warm fluff will keep the populace docile ever came to be...
I re-read Fahrenheit 451 about six months back... like you, I was struck by the video walls. The other thing that struck me was that in the novel bored teenagers would go to theme parks to smash glass, or else harm pedestrians just for shits and giggles (akin to 'happy slapping'). At the time I had custody of a disused factory in which we were storing old junk and furniture... kids broke in and smashed every piece of glass and mirror they found.
MARTIN: We could use the money to build a library of sci-fi, Asimov, Baxter, Clarke
BART SIMPSON: Hey, what about Bradbury?
MARTIN: I am aware of his work.
Obviously Bart was an aficionado of fiction in which youth run amok... he has been known to dress as a droog for Halloween.
My favourite is another of Ray Bradbury's, the circa-1950 short story "The Murderer". It's set in a near future where everyone is constantly in touch and constantly bothering each other with the minutiae of their lives. The story itself is an interview with a man in custody who's finally flipped and smashed up the talking technology so he can have some peace and quiet. Replace "wristwatch radio" with "phone" and it's a remarkably prescient piece of work.
Intel, back in the late 1960s and early 1970s when the first microprocessors were being created, thought its future lay in memory chips.
This might point the way forward for an aging population beset with memory problems. While upping brain-power is a fantasy for many folks, treatment or mitigation of afflictions that are currently intractable would seem to be an obvious direction for more immediate research. Intel might have been right, just ahead of the curve.
It's a fascinating idea. Using hardware to store memories that can't be held by human memory due to conditions like Alzheimer's... the next step would be using processors to replace damaged neural tissue, but we're hundreds of years away from trying something that complicated.
Hundreds of years? I think decades is far more likely. Hundreds of years back from today, the fastest anyone had ever travelled was the top speed of a horse, and the most complicated device of the day was probably a clock - which was rarer and more expensive than a spacecraft is today.
When you think of the enormous strides in science and technology we've made in just the last 50 years, I think it's quite reasonable to say that in another fifty years we will be able to create and implant artificial nerves and memory.
My grandfather was born before powered flight had been demonstrated, and died after we walked (er, not me *personally*, you understand) on the moon. The change of pace doesn't look like slowing down, unless it's swamped by spam...
p.s. I'd guess the cost of a clock was perhaps closer to that of a current car, not a spaceship - depending on complexity, of course - there will always be people who have to have the latest and best...
" The change of pace doesn't look like slowing down, unless it's swamped by spam..."
Sorry, but you can't go measuring progress as a linear, one-horse race: in about half a century we went from the Wright brothers to moon landings. But in the following half century, what new frontiers has aviation technology broken through? The race is more like an unknown number of horses, some appearing and disappearing at random, all heading at varying speeds (sometimes backwards) in various directions.
The challenge is not in predicting when we'll have our own flying cars, but whether they'll just be bypassed by some other technology that removes the need altogether.
New Scientist ran an article on rates of technological change about two years ago. The rate of technological change has slowed. Last great change in physics? Einstein. Biggest transport change? Railways. Biggest communications change? The telegraph. Everything else is improvement and convenience, not major change - i.e. going from a lifetime travel radius of 20 miles (barring the occasional war and army travel) to a peasant-affordable 300 miles in the same time. Comms going from weeks to cross continents to minutes. Radio merely improved bandwidth. In summary, the greatest window of technological change was around 1845 to 1880.
As for MACH 1, I've never seen it, but it is years later than the first such story I read, in a 1936 Boys' Own Annual IIRC. No school, just installation of an embedded computer into the character's brain, so no need for voice. It read and fed directly into the brain.
I was just wondering if perhaps people had reached faster speeds on boats rather than horses, but no.
Horses can reach approximately 50mph, and while a modern sailing vessel can exceed that, nothing more than a hundred years old would have been able to match the speed of a fast horse.
Interestingly this person has worked out the speeds of various ancient voyages (mainly around the med):
Can anyone think of a non-lethal way of moving faster than a horse using technology older than (say) a century?
"It's a fascinating idea. Using hardware to store memories that can't be held by human memory due to conditions like Alzheimer's...the next step would be using processors to replaced damaged neural tissue but we're hundreds of years away from trying something that complicated."
Human brains neither store nor transmit information the way computers, microphones, cameras or speakers do. Despite decades of work on implantable electronics (including the ethical issues - see Crichton's "The Terminal Man" for some of the problems), it appears they only started trying to read the output of the optic nerve last year.
The idea has great potential for both good and bad outcomes. But it's a long way from here (although perhaps not as far away as people might think).
Remember, you can train the brain... so the interface may not be as tricky as you think; the wetware is much more flexible than the hardware. For instance, with the BrainPort, a 2D array of electrodes is placed in the mouth. It doesn't take too long before blind people wearing one of these devices report a phenomenon very much like seeing.
My favorite book when I was a child was Tom Corbett: Stand by for Mars! I must have read that book 30 times before I was 10. Despite its being wrong on many aspects (canals and a breathable atmosphere on Mars), two things I do remember were at the start, when he has to give up his mobile phone before entering the Space Academy, and his complaining that he couldn't use a shave mask to shave with.
Of course we are still waiting for shave masks (they sound like a great idea), but the concept of a mobile phone was a good one for 1952. This may have had something to do with the great science writer Willy Ley providing the technical direction. Of course, mobile video phones were foreseen in Dick Tracy before then.
One more thing: despite Star Trek's good science predictions, they constantly failed to foresee the advances in computers. Even their best computers seem slightly old-fashioned compared to what we have today. For example I can never work out why the photon torpedoes are so dumb, often missing the target the size of a starship
" For example I can never work out why the photon torpedoes are so dumb, often missing the target the size of a starship"
Well, first of all, space is big. Really, really big. Compared to space, a starship is the equivalent of that speck of sand in your shoe. Sure, it feels like a boulder, but when you actually dig it out, it's tiny.
Second, starships are fast. Really, really fast. The clue's in the name: star - ship. Because of point 1, stars have a lot of space to move around in, and they're not really social to begin with, so they've drifted quite far apart. So a ship that travels between stars in any reasonable amount of time has to be able to really move.
'For example I can never work out why the photon torpedoes are so dumb, often missing the target the size of a starship'
Well, they don't travel at c, so if your scanners can pick one up it's not going to be too hard to avoid being hit by one unless the ships are right next to each other - which they always are, since ST combat, as with most telly & movie SF, is always done at human speed and at naval-style close quarters. Look at the BSG reboot - everything is up close & personal, since even the use of unguided relativistic weapons over large distances renders them pretty easy to dodge.
For an idea of the issues involved in near-relativistic combat, there's a superb sequence in Alastair Reynolds's 'Redemption Ark' where two ships travelling at near-c engage in pursuit combat (i.e. one flinging out stuff from the back for the other to crash into).
If you want some properly mental space weaponry, read 'Debatable Space' by Philip Palmer.
"For example I can never work out why the photon torpedoes are so dumb, often missing the target the size of a starship"
Because Gene Roddenberry was modeling Star Trek on naval warfare circa WWI, with battleships firing broadsides at each other, and maybe lobbing stupid, slow torpedoes at each other.
It took the first Battlestar Galactica series to "modernize" space warfare to WWII, with carriers harassing each other via their fighters. (I'd put Star Wars at early WWII, before the importance of the carrier was really known.)
Nobody that I'm aware of has done a movie with "modern" battle styles.
Never mind the photon torpedo guidance - the Star Trek computer's removable storage was crap too: according to an episode I watched not long ago, a perspex slab about the size of a pack of playing cards only held a couple of KB of info, judging by the way Spock was swapping them in and out of the computer every time he wanted to look up a different piece of info.
The one benefit of having the computer internally, as opposed to in a phone, is that it's a bit more difficult to lose or to otherwise take from the agent.
They'd also make ideal black-box recorders when the technology improves, potentially recording sights, sounds and even thoughts, and even having the system respond to those.
There's a nasty thought: give it another couple of decades and you could have a breed of suicide bomber wired to explode on a signal they don't even know about, with the explosives inside them. I'm not sure how they'd get around the explosion being muffled by the body itself, but if there is one thing mankind is good at, it's figuring out how to blow things up effectively.
An al-Qaeda suicide bomber was sent against the Saudi interior minister a few years ago. The bomb was... rectally implanted. In the event the bulk of the body did muffle the blast, as you say, but if it were surgically implanted at the front, with just a layer of skin over it, it might work better. The Reg just recently had an article about a female cocaine smuggler with fresh wounds in her breasts holding packets of cocaine. I suppose we can only hope that the salafists' revulsion towards women prevents them from using the bomb breast implant.
It is in IT though, a simply complex mind game adjustment for capture of SMARTR Quantum Networking Rail Roaders for Refreshing New Moves with Enlightening Ideas Confirmed Up and Running in this Virtual Reality in Space.
Where do all words we want to follow with fabulous deeds, come from? What is their ravenous attraction targetting?:-)
Marvin Minsky (who made the first head-mounted display, neural net and confocal microscope, is name-checked in 2001: A Space Odyssey, and was lauded by Asimov as well) co-wrote a novel with Harry Harrison (The Stainless Steel Rat, 'nuff said) about integrating a neural net with someone's brain, called The Turing Option. It is framed as a thriller, but the AI integrates itself with the protagonist's brain because he has suffered a traumatic bullet-related head injury.
It's alright. Might seem a tad dated now. There again, who am I to judge?
You know, a black box recorder for humans would be an extremely useful thing, from both a medical and crime investigation point of view.
Yes, and these aspects have been explored in any number of SF stories. See for example Doctorow's Down and Out in the Magic Kingdom, where brain recording is part of the "cure for death" that his future society enjoys. Criminal forensics come into play as well.
It'd be really handy for a totalitarian government, too.
Piezo-electric mesh, incorporated into all the major muscles, with conductive plastic 'wires' feeding to an energy storage device. I read that years ago in a sci-fi story that I can't remember much else about. I think the 'hero' had a small but powerful laser device in his middle finger, intended for one-off emergency use, since he had to have surgical repair to the tip of his finger after he used it.
Whatever you can think of, it's probably in a sci-fi story somewhere.
Anyone who has read Greg Egan's "Quarantine" knows what's what. Here is the start; I suppose one has to read it in a "hardboiled" voice:
Only the most paranoid clients phone me in my sleep.
Of course, nobody wants a sensitive call electronically decoded and flashed up on the screen of an ordinary videophone; even if the room isn't bugged, radio-frequency spillage from the unscrambled signal can be picked up a block away. Most people, though, are content with the usual solution: a neural modification enabling the brain to perform the decoding itself, passing the results directly to the visual and auditory centres. The mod I use, CypherClerk (NeuroComm, $5,999), also provides a virtual larynx option, for complete two-way security.
However. Even the brain leaks faint electric and magnetic fields. A superconducting detector planted on the scalp, no bigger than a flake of dandruff, can eavesdrop on the neural data flow involved in an act of ersatz perception, and translate it almost instantaneously into the corresponding images and sounds.
Hence The Night Switchboard (Axon, $17,999). The nano-machines which carry out this modification can take up to six weeks to map the user's idiosyncratic schemata — the rules by which meanings are encoded in neural connections — but once that's done, the intermediary language of the senses can be bypassed completely. What the caller wants you to know, you know, without any need to hallucinate a talking head spelling it out, and the electromagnetic signature at skull level is, for all practical purposes, inscrutable. The only catch is, in the conscious state, most people find it disorienting — and at worst traumatic — to have information crystallizing in their heads without the conventional preliminaries. So, you have to be asleep to take the call.
No dreams; I simply wake, knowing:
Yet another job to spy on someone's wife...
Quite a creepy short (by modern standards) novel of old-school sci-fi.
Doesn't have much beyond what we have now, except neuron-perfect digital copies of human brains and some mucking around with the philosophical concept of what data (and consciousness) actually IS.
Recommended. Creepy as all heck, but definitely recommended.
You're not reading it wrong, but then again, comparing it unfavourably to Rowling's sewer dredgings isn't reasonable, I contend. JKR's writing is childishly poor.
But if you set a very high bar, prepare to be disappointed by almost anything.
Just a suggestion, try Vurt by Jeff Noon.
Like most good SF writers he uses current knowledge and possible future directions for the basis of his technology and he has covered a good few of the suggestions here including:
Crystal memory stores to aid the brain's natural memory storage
In-built communications tech in the brain
Virtual reality interfaces activated by closing the eyes and seeing "ghost" images which provide functionality.
His books are OK, but sometimes the endings of the stories leave a bit to be desired. His approach to the science is pretty good, though, and I have an old memory that suggests I read an article about tests looking at how technology can be used to enhance/improve memory capture in the brain.
I love the ideas in his books, but his female characters give cardboard cutouts a bad name *cough* Ione Saldana *cough*, and his endings tend to be non sequiturs; massive lumps of deus ex machina or 'if-it-wasn't-for-you-meddling-earthmen-we-would-have-got-away-with-it'.
Wormhole railways are totally bonkers and the occasional highly detailed descriptions of locomotives make me wonder if the young Peter was a trainspotter, or still is.
I'm happy I discovered Iain M. Banks after Hamilton; the Culture series utterly nails the sheer power a post-singularity civilization could wield, and the utterly decadent lifestyle one could have within such a society.
Beer, because in the Culture you could wake up one morning and decide to spend 200 years becoming a master brewer, simply because you could.
My daughter was as good as deaf when she was born in 2000. Being implanted as a toddler means that she functions as a completely hearing human. She has a skull-mounted interface under her scalp which connects to an externally worn RISC-powered device. She can listen to an MP3 player wirelessly, transmitting straight to her inner ear.
This is now.
If I had been born with the same disability, I would be deaf.
I work with a profoundly deaf colleague who had cochlear implants a year or so ago. She has improved, but hearing is still hard work for her - as far as she can explain, she is severely limited in both frequency and volume discrimination. Which means that sometimes small sounds can drive her mad (as a bonus, because she has only one implant, she has little or no direction finding ability) and leaves us in the rather odd position of having to be quiet because a colleague is deaf...
>...and leaves us in the rather odd position of having to be quiet because a colleague is deaf...
I've heard that this is not uncommon. A retired doctor with hearing aids drinks in our beer garden, and he is more put off by background noise than those of us lucky enough to still have reasonable hearing. He is probably one of those who complain to Radio 4 about placing background music behind spoken-word content (screw DAB: roll on 'radio' over IP, and the BBC can easily output the raw spoken-word feed without music for those who want it)
I did read in New Scientist that Charles Babbage started to lose his hearing and became overly sensitive to street musicians and buskers... he campaigned against them, so in return they made a point of parading up and down outside his house.
I'm dyspraxic, and with it comes sensitivity to noise... I do appear to have a lower tolerance to sodcasting than my more mellow peers. FFS kiddies, either buy some headphones or get a ghettoblaster so we can all hear it. The landlord who made our local legendary sadly passed away a few years back... he was the sort to dunk phones in people's pints, and "tell all the lager drinkers to f^&k off and we'll all have ourselves a lovely little lock-in".
Well, sort of. It's not the brain per se but the ability to learn language that's crucial. Every child will learn to speak between the ages of one and five. During that window they learn (by imitation and emulation) the sounds and words that make up their language. This is an involuntary process and it's on a timer - if the process has not been kicked off by five or six years old, then it becomes harder and harder. This is why many deaf people who have been taught to speak sound odd to us. It's also where accents come from, because the sounds that you learn to use to make language become unconscious. This is why many continentals have such trouble with the English "th". They have to work to learn it as older children and most of them can't or won't. (Disclaimer: I speak four languages, have a German wife and live in the Netherlands, so this is not Euro-bashing - it's experience).
Cochlear implants have a huge advantage over hearing aids because they are more sensitive; they have a huge disadvantage because the "sounds" that they generate in the wearer's head are not sounds that we would readily recognise. If they are the first sounds that children hear then yes, they can form the basis of learning to speak. My daughter grew up in an environment where German, Dutch and English were spoken interchangeably, and so she speaks all three without accent, as does her (hearing) younger sister.
With adults who have never heard, results of implantation are almost always disappointing. Hearing adults deafened by age or misfortune are better at adapting because they know what to expect and they can adapt to it.
The fact is, this is a game-changer for the human race. When we were still in the diagnostic phase a German doctor said to me "From now on, no German child will ever need to learn sign language". The doctor who implanted my daughter describes the CI as "die einzige Sinnesprothese" - the only sensory prosthesis. During the last ten years I have seen serious effort being put in by the deaf community to turn back the tide (see http://en.wikipedia.org/wiki/Sound_and_Fury_(film)), and I have seen with my own eyes children who could have learned to speak being crippled by withholding implantation until they were five years old. The difficulty the child then has learning to speak is held up by defenders of sign language as proof that "these things don't work, see? Sign language is better".
My daughter attends a normal secondary school, although she is a year younger than the rest of her class. That's not attributable to the CI, but the opportunity that she has to complete a regular education is. As is the fact that she has a clear speaking voice even when she's not "plugged in". A relatively small investment by my medical insurer (40k Euros) has made the difference between a future taxpayer and a charity case. It's a no-brainer.
BTW sorry about the trumpet-blowing but I'm proud of my daughter and I'm not ashamed of that.
hamsterjam said: "During the last ten years I have seen serious effort being put in by the deaf community to turn back the tide, and I have seen with my own eyes children who could have learned to speak being crippled by withholding implantation until they were five years old."
It is beyond belief that the deaf parents who do this should be allowed to do so. Some deaf parents even use pre-implantation genetic diagnosis to ensure they have deaf children because it is their "culture". Whilst I have somewhat lower than average hearing, which is getting worse, and think that it is probably quite a useful characteristic in the modern world (just like not having a good sense of smell), I think that parents who would deliberately handicap a child for their own selfish reasons should be prevented from doing so by whatever legal means exist.
The kudos of first secret agent with own in-body (electronic) computer goes to Colonel Steve Austin from the novel "Cyborg" circa 1972. If you cannot get an old copy then the "Cobra" trilogy by Timothy Zahn (1985-88) is an excellent Mil-Spec reboot of the concept.
Could we do something similar with today's technology? Probably... The human brain is an adaptive and resilient organ, with several volunteers already sporting basic implants in their audio and visual areas without serious side effects. Micro-surgery has also come a long way in repairing peripheral nerve bundles, so no great leap of faith or technology is required to augment the relatively slow chemical reflexes of the human nervous system with a layer of electronic communication.
The complete ubiquity of tablets.
They even called them PADDs. According to the ST:TNG tech manual you could fly a starship with one, which sounds a whole lot like Remote Desktop. The description of the tricorder is also very similar to a modern tablet, as it was a communications device, sensor, camera and, frequently, a small bomb if you shorted out its battery.
The memory sticks were also just that - memory sticks - though I think their storage capacity, which was considered large in the late '80s (petabytes, I think), will be normal for a USB 5 thumbdrive of 2019.
What struck me as strange when I first saw it back in 1988 (yes, it first aired in 1987, but back then I didn't have access to satellite) was the total absence of 'personal wireless comms'.
No cell-phones of any sort, no walkie-talkies...
They used ship-wide PA to page people and had phone booths at different places around the ship.
Back then my father used an NMT-450 mobile (more like a luggable), and I walked around with a Motorola walkie-talkie/cell-phone cross of some sort.