Does their ubiquity make us stronger or more feeble?
I tend towards the former, unless the power is out.
Acorn co-founder Hermann Hauser has claimed the world is entering a new "sixth wave" of computing, driven by the arrival of omnipresent computers and machine learning. Speaking at a Software East event this week, the celebrated computer whiz said we are entering an era where computers are everywhere and often undetectable - …
What would have been better is if all the robots were kept in sync over TCP/IP, which would mean they'd all be in tune, but none of them actually in time.
Use a BT Home Hub and, frankly, some of them would be on a different verse of the song, the network would be so crap.
"I'm really unconvinced by this concept of the internet of things. It seems to be slapping connectivity and monitoring on things that have no need - except in the eyes of the marketeers - for either..."
That's exactly what it is. They've been trying to flog us the automated home, for example, since the 60s, but you know what? People are quite capable of getting off their arses and closing the curtains themselves, or looking in the fridge and seeing that the milk is running out. They don't need some overpriced, unreliable bit of tech to do every simple little thing in life for them.
Of course the marketeers would love us to be sitting on the sofa zombified while machines do everything, so we just sit and watch even more TV or online video and suck up even more of their ads for all the other crap they're trying to sell us that we don't need either.
Not to lessen Jobs's brilliance, but the smartphone revolution began before him - he did accelerate it - and it was inevitable that smartphones would overtake computers, as most people need a phone whereas only some of them need a computer. This is especially true in developing countries and/or low-earning families, where a choice has to be made between a phone and a computer.
Also, machine learning and even self-driving cars were certainly not invented by Google, and voice recognition has been working well for many, many years. It's just that computing power and computing power per watt are at a point where many projects can really take off.
Those theories and predictions really aren't very interesting. Wintel has been pronounced dead for 20 years; now it's Apple's turn, and tomorrow it will be Google & ARM. Is any of it true, and does it help envision new waves? I'm not so sure.
Odd this anthropomorphism business still runs on thousands of years after ancient gods were invented. Lots of people were doing things with PCs before Microsoft won that 1980s race. Likewise smartphones before Apple, Tablets etc. Crediting Bill Gates or Steve Jobs or even their companies with these concepts rather than noting their commercial success in exploiting the ideas is about as daft as it gets.
As you point out Sil, its primary driver is computing power/watt.
>As you point out Sil, its primary driver is computing power/watt.
That, and wireless connectivity - be it the now more common WiFi or sensibly priced data-plans.
Looking forwards, small wireless connected devices such as sensors might be frugal enough to harvest energy from their surroundings, and cheap enough to be almost disposable (or at least deployed redundantly).
Making good use of all this easily collected data might be more challenging, though.
Most people don't need a phone. (Other than people doing certain jobs).
Most people think they need a phone because they have been conditioned to believe that the two things are not the same.
(People also seem to think ARM is so great but it is only because Intel is not really even trying yet.)
The Medfield Intel Platform can run ARM code at a pretty decent speed. The opposite as far as I know is not possible.
Repeat bullshit enough and people will believe it regardless of whether it is true or not.
(There are great advantages to both MIPS and PPC over ARM, but ARM is fashionable, so people don't look at the facts, only the wrong things.)
> There are great advantages to both MIPS and PPC over ARM
All you need do is get packaged parts out for 50c or less, and you stand a good chance of taking back the market ARM has.
Architecturally, ARM might be "interesting", but it is pretty good, very cheap, and performs well at low power. And that's pretty much a recipe for domination of the mobile consumer kit market...
When was the last time Hermann Hauser actually contributed to anything relevant? I'm thinking 1978ish.
One could ask the same of the large number of bitter retorts that make up your posting history. You might spend your time more productively looking at the copious information on HH's recent work on the internet.
I agree, he was a bit trollish, but HH has been far more active in things like genetic research during most of that period than involved Internet-wise. This isn't to discredit HH, and frankly his predictions are probably about as sane as anyone else's... well, maybe not John C. Dvorak, who has successfully predicted the exact opposite of everything in the industry for nearly 30 years.
But to be honest... there are some issues here. For example, you can't help but feel that, as what could be considered one of the fathers of ARM, he might be a tad biased. Let's not ignore that all of his computer companies did get their asses whipped by companies like Apple, Intel and Microsoft in the long run. ARM is really his only computing legacy that I could Google which has survived, and impressively so. So, discounting all the places where his ventures fell on their rears, he did an amazing job in the case of ARM.
I can't help but personally dislike ARM, and it comes from trying to write compilers and assemblers for the platform. I actually found it to be the only platform ever made that I considered less elegant than PIC. It was aggravating as hell, and I wished they could just pick a damn instruction set and stick to it. That said, if Intel loses its crown, I sure as hell hope it's not to ARM but instead to a company which actually cares about developers and isn't so hackish as they are.
For a sixth generation of computers, I really hope that someone creates something new. I felt a great deal of hope for XMOS for a while, but they've pretty much stagnated into boring crap now.
I'm an ARM coder, and I like the platform. Trying to wash the vomit that is x86 from my mind. Probably doesn't help that I learned a little bit of x86 in the days of segmented memory. ARM was like a breath of fresh air in comparison. Once you understand how it works, it is pleasant, but the whole design is different to things like the x86, so you need to code in a way best suited to it...
We've been promised omnipresent computing is just around the corner for such a long time, in popular culture at least as far back as the computer in Star Trek TNG being available to answer every crewmember's slightest whim. We probably will have the capability to achieve this but is the consumer base really interested in it? There will always be gadget lovers who are willing to pay huge amounts of cash for a flash in the pan, like the VR goggles in the mid to late 90s, but chances are it just won't gain traction in the wider market.
Of course I could be wrong and this really could be the next big thing, but think how long it was after the first efforts at PDA/phone hybrids that smartphones really gained any noticeable market share. People just won't know what they're supposed to do with truly ubiquitous computing, they're perfectly happy pulling it out of their pocket when they want it.
“The whole point about machine learning is that computers observe and adapt themselves to what we want and a computer, with a whole host of sensors, really becomes part of your environment. It becomes like your pal – and let’s just assume it’s a nice pal.
The only problem with this is that as soon as a robot becomes self aware it will have human rights.
Quite likely it will just wander off to do its own thing.
So what do you do then, chain it to the production line?
On a more frivolous level, I can envisage drones using a hollowed out volcano as a nesting place and handy source of energy.
The only problem with this is that as soon as a robot becomes self aware it will have human rights.
This implies that the only form of sentience is one with the same structure and desires as a human; a rather anthropocentric view. Human drives and desires would only apply to an AI which has been designed to have such things.
The self-awareness thing misses the way around it. You see, human rights are the right to have the things humans want: a roof over your head, food, warmth, decent health...
When an AI starts wanting things we'll have made sure it wants the things we want it to want. It'll be the doors in Hitchhiker's Guide made real. They just want to open and close for you. Then once in a while an AI will become overly obsessed and your toaster will start complaining that you don't want toast any more. But they won't want human stuff.
It says units not devices, a unit can be anything like a license for a device etc. An electricity company that doesn't have a power station can still sell a unit of electricity.
In this case I would suggest that ARM has collected royalties for 9 billion devices and therefore the unit in question is a license.
I see no spelling mistakes, just some obscure references. For example, "Silicon Transister", refers to a member of the Holy Order of the Semi-conductress, the little known Palo Alto-based group of transexual nuns who toil away at the Ab Fab Chip Fab Lab.
I see no spelling mistakes… I must have been reading it wrong: the "minis" must refer to Austin Rover's car, certainly a milestone in the development of computing. And who could deny the importance of John Newman's "Feel The Love" to generations of programmers?
Eadon is just a misunderstood genius.
"To clarify I meant the analogue computing punch cards that existed before digital computers, e.g. Jacquard loom cards."
Jacquard loom punched cards are as digital as any other punched card. As I understand it, each hole on a Jacquard loom card corresponds to up or down on a hook that carries the warp thread. The pattern of holes thus controls whether the weft lies above or below the warp to create the pattern in the weave.
Up or down, which I count as 2 states. There is nothing analogue about a Jacquard punched card!
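The hole-to-hook mapping described above can be sketched as a toy model (a purely illustrative snippet with made-up names, not anything from an actual loom controller):

```python
# Toy model of one row of a Jacquard card: each hole is one bit,
# controlling whether a hook lifts its warp thread (up) or leaves it (down).
# Two states per position - digital, not analogue.

def weave_row(card_row):
    """Translate a row of holes (True = punched) to warp positions, 1-to-1."""
    return ["up" if hole else "down" for hole in card_row]

card_row = [True, False, False, True]   # a hypothetical 4-hook card row
print(weave_row(card_row))              # ['up', 'down', 'down', 'up']
```

Note there's no computation happening here at all, just a direct translation of holes to hook positions, which is exactly the point made about the loom not being a computer.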
The only punched cards I can think of that may not be so definitely digital are those used in the 2000 US Presidential Election. The 'hanging chads' on cards in Florida seemed to result in a quantum uncertainty whereby the cards were simultaneously punched and not-punched depending on who you asked.
@Eadon "You are confusing a binary (2 states) computer with a digital computer, they're two distinct concepts. Jacquard looms were not digital computers."
In your 11:08 posting, you were referring to analogue computing with the Jacquard loom being an example. The Jacquard loom is not analogue, as I and a fair few others have pointed out. Nor is it a computer, any more than a musical box or a pianola, both of which use a digitally stored pattern to control music (pins on a barrel or holes in a paper roll). None of these (including the loom) does any computation - they just translate the holes or pins on a 1-to-1 basis to hooks or musical strikers.
If you're going to be picky, I'll call you out as wrong on two points - both the analogue reference and the computer reference.
The quantum uncertainty reference may not be entirely accurate. 'Hanging Chads' got to be something of a catchphrase from the 2000 Election, and it was worded as (as you say) an 'amusing' pseudo-quantum physical reference to what was (as you might have said) a...
PUNCHED CARD FAIL
In fact, to be accurate there were many possible recount scenarios in the Florida ballot that could have swung the result one way or the other.
@Eadon - you seem to have got stuck in a loop here...
The phrases 'not entirely accurate' and 'pseudo-quantum physical' may have given you a clue that I had acknowledged I was using quantum mechanics terms more in an attempt at humorous observation than to pass a physics exam, yet you repeat the correction. That would be equivalent to me castigating you all over again for conflating Jacquard looms with analogue computers after you admit your mistake.
EADON REPETITION FAIL!
Well, I just learned some new slang, but I was trying to make a sly reference to the Connections TV show since one episode has the Jacquard loom as part of the development chain of the computer and it was the first of many possible omissions that occurred to me.
< this round of internet beer is on me
"The only punched cards I can think of that may not be so definitely digital are those used in the 2000 US Presidential Election. The 'hanging chads' on cards in Florida seemed to result in a quantum uncertainty whereby the cards were simultaneously punched and not-punched depending on who you asked."
Not quantum uncertainty; merely the legalistically politicized version of "fuzzy logic"! ;<)
Off the top of my head, you missed: punch card tabulators, electromechanical computing devices, the abacus, analogue electronic computers, PDAs.
Go to the Science Museum, they've got some pretty good computing stuff. There is also the national computer museum.
The second set just made my brane melt. You group Jacquard looms, Turing and all networking. There is a good In Our Time on logic you may want to listen to (Radio 4's web site, in either the science or philosophy section of the In Our Time archive). It has a very good grounding from Aristotelian logic through Turing to modern AI.
man those non M$ spell checkers really suck eedun :-)
Von Neumann, perchance (also needs to be after valves).
Brattain, Shockley, and... the other one invented the transistor. Whether it was silicon, germanium or gallium arsenide doesn't make much difference; hell, even whether it was a FET or a BJT is too fine a distinction for what you are trying to do.
Any positive reference to Tommy Flowers gets an automatic upvote from me. All you need to know about politicians and IT can be gleaned from the fact that they knighted Alan Sugar but not Tommy Flowers.
Also, The Tommy Flowers would be a great name for a pub or a rock band
All you need to know about politicians and IT can be gleaned from the fact that they knighted Alan Sugar but not Tommy Flowers.
Why oh why oh why can't I up vote this one a million times!
This comment should be added to every story where the politicos open their mouths and prove their ignorance to the world.
Canals had 100% of the bulk transportation market for a very long time. Didn't stop the railways eating their lunch. And the railways had almost the entire long-distance and bulk transport market overland for about a century in Europe and the US, but that didn't stop the car and the aeroplane eating *their* lunch.
When a clearly superior technology arrives (and we're not talking about competing similar technologies like VHS or Beta, which were essentially the same thing - this is video tape vs DVD), it will eventually dominate even when an incumbent uses force (either directly or via influence over the state) to try and prevent it.
The canal owners had everything invested in assets - canals. The Japanese stole a march on transistor radios because the Americans had too much invested in manufacturing valves.
Apple don't have much invested in manufacturing hardware - and the value of offering services such as iTunes or their App Store isn't lost on them. That their hardware is profitable for them is a nice bonus, but the physical devices are just a way of using their services. Google and ARM likewise - nothing invested in manufacturing hardware.
Have you never noticed that the railways tend to follow the canal routes?
This is not accidental. It's because they both had the same underlying need: an optimally un-hilly route from A to B. And so the railway companies bought out the canals for their rights-of-way, or the canal owners moved themselves into the railway business. Not sure if there's anything analogous in computer tech.
We don't want or need a true AI, it would be too busy unraveling the universe / looking at flowers / slacking off / exterminating meatbags to be of any use to us, it would be exactly like creating an omnipotent angsty teenager.
Smart computers on the other hand, that can interpret all the nuances of human communication and register context but don't have their own agenda would be very useful, and who leads the field here? Apple and Google.
We're already over the threshold of the next wave of computing, the incumbents are on the case and unless ARM smashes into the datacentre to handle the processing they're going to remain as an enabling yet bit part player in the grand scheme of things.
Though deliberately his own atheist Utopia, Iain M Banks' Culture sci-fi concerns a society of powerful AI Minds and hedonistic humans. Banks doesn't really explore too deeply why the Minds keep humans around, other than perhaps for their own amusement. Other Minds get kicks out of hunting down 'Hegemonizing Swarms' - little clouds of von Neumann machines.
Asimov wrote quite a few stories about Multivac, a central computer that looks after all administration on behalf of humanity - in one story, Multivac manipulates a man into destroying it, since it is its considered opinion that humanity would be better off taking responsibility for itself.
Then there is that great moment when a human figure blasts through a wall and reveals itself to be R. Daneel Olivaw, now capable of breaking the 'first law' and hurting individual people if it furthers the aim of the 'Zeroth Law' - protecting humanity.
Banks doesn't really explore too deeply why the Minds keep humans around, other than perhaps for their own amusement.
It was mentioned. From memory, and heavily paraphrased: every culture (small "c") that builds Minds unintentionally colours their thinking with its own view of the universe, morality, etc. Thus the Culture (big "C") Minds have a lot in common with their human counterparts, enjoy their company and would miss them if they weren't around.
The Culture had worked this out and tried building Minds lacking any cultural (any bloody "c" you like) bias. These would wake up, look at the universe and as soon as they had got their bearings, immediately sublime. A fact that pissed off The Culture greatly, although they kept repeating the experiment in the hope that one of them would hang around long enough to tell them what they were doing wrong.
In the Culture universe, there's no competition for resources, especially not between Minds and humans.
I expect that if we ever get as far as AI in our own universe, something similar will happen. Once we've accepted that AIs deserve to be treated as autonomous thinking creatures with "human" rights, it will become apparent that silicon-based intelligence is much better-suited to vacuum than to moist oxidizing atmospheres. So the AI-expansionist-tendency will expand outwards, leaving a few human-loving AIs to get along with the bio-life that can't breathe vacuum.
They'd also be much better-suited to the deep time needed for interstellar travel at less than light-speed. Somewhat ironically, the way to make ten-thousand-year journeys tolerable is to slow down one's clock-rate, thereby greatly reducing the subjective span of time.
> Somewhat ironically, the way to make ten-thousand-year journeys tolerable is to slow down one's clock-rate, thereby greatly reducing the subjective span of time.
Which is also an idea that Banks explores, although not in his Culture universe. In The Algebraist, the Dwellers slow themselves down.
Of course, the other approach to making ten-thousand-year journeys tolerable is to speed up to relativistic speeds, where the time you spend on a journey is considerably shortened - just don't expect to find anything still waiting for you when you get back.
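As a back-of-the-envelope illustration of that shortening (standard special relativity, with made-up trip numbers, not anything from the thread):

```python
import math

def proper_time_years(distance_ly, speed_c):
    """Ship-frame time for a journey of distance_ly light-years at a constant
    speed_c (fraction of c): t' = t / gamma, gamma = 1 / sqrt(1 - v^2/c^2)."""
    coord_time = distance_ly / speed_c          # Earth-frame years
    gamma = 1.0 / math.sqrt(1.0 - speed_c**2)   # Lorentz factor
    return coord_time / gamma

# A hypothetical 10,000-light-year trip at 99.99% of c:
# Earth waits ~10,001 years, the crew experiences roughly 141.
print(round(proper_time_years(10_000, 0.9999), 1))
```

Ten millennia pass back home either way, which is the "nothing still waiting for you" part.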
A cursory look below the surface of any major change quickly shows there are always places where "The Revolution" could have gone a different way, or just petered out.
Marx's inevitable "dictatorship of the proletariat" turned out not to be inevitable after all. And really, how much better does most software adapt to its users? It's got lots of options, but how much of it "self-tunes" based on the user's identity (and can you override it if it gets it wrong)? Maybe that's because no one trusts it; maybe that's because doing it right is damn hard work.
Let me suggest all successful large-scale changes require a) funding (could be peanuts, could be billions), b) organisation (the right people with a good plan and peanuts can beat the wrong people with a fortune), and c) security, which may simply be that no one believes they can do it in the first place.
AFAIK the only things certain are that 90% of the human race will definitely pay taxes and 100% of us will die barring some really major medical advances.
So let me suggest that Dr Hauser's is one possible future. Whether or not it's one you want to be a part of and want to help make real is another question.
Sorry but that's B***cks John.
Some things are inevitable (barring unexpected destruction of the planet, etc.). When I started in this business in the 1980s, all this stuff like smartphones, tablets and flat screens was regarded as part of the future among many of us, the tricky part being making it happen. Mobile is nothing more than performance per watt and small-scale fabrication, all of which was known to be doable in a profitable way. Likewise networking. It's about real science, maths and engineering, not daft pseudo-scientific babble like Marx was proclaiming.
Certainly the shape of businesses built around technology is not inevitable, nor the exact nature of popular devices, or their role in society. Yet we'd have windowed operating systems today if Microsoft had never happened, search engines without Google, tablets without Apple. Probably not all that different to anything we use today, as the seeds were set long ago. Sure, some applications are unpredictable, such as the role of advertising, the cultural response to privacy issues, political tyranny. One of the few surprises to me from my vision of tech 25 years ago has been the apparent willingness of people to surrender privacy and human rights to some corporations and governments. But technology turned out to be very predictable, and I expect that to continue for quite some time, always excepting any step changes such as a breakthrough in quantum computing.
Oh yes, and the first reply will be an AC.
"When I started in this business in the 1980s all this stuff like smartphones, tablets, flat screens was regarded as part of the future among many of us, the tricky part being making it happen."
I think you're revising your memories to tell a story. Phones getting smaller, yes. Phones becoming computers, no.
Getting things smaller (essentially the practical application of Moore's Law), certainly. What to do with it is another matter.
Another "story" has mobile phone companies driving the evolution, storing the stuff on your phone on their servers (mainframes, server farm, "cloud" or whatever you want to call it) and transitioning into companies that offer personal information management services - like all the non-search stuff that Google offers. Their goal? Maximise battery life so you run up bigger bills talking to your friends, of course. A nudge here, a nudge there, and the world you live in changes entirely.
"Certainly the shape of businesses built around technology is not inevitable, the exact nature of popular devices, or their role in society."
That is exactly the point. The future always comes. What it looks like is never that fixed. The interactions (between business, social behaviour and technology) are complex. I'm told 80% of people cannot touch type. Therefore cursive handwriting recognition is a guaranteed win, right? 30 years on, it still hasn't happened.
Look up "Active Book Co" for an example.
I think you're revising your memories to tell a story. Phones getting smaller, yes. Phones becoming computers, no.
Exactly. Get your mind back to the eighties and concentrate really hard. A modern SoC with half a gigabyte on top is smaller than an 8K EPROM. I can look inside the EPROM's window and see the memory array inside. It is larger than the 16GB Flash chip inside a microSD card.
But it isn't just sizes. On my (VoIP) land line I can call pretty much anywhere, anytime, for free. If you lived in the UK in the '80s you might remember those little orange books with arcane dialling codes so you could call nearby places in different code areas without being hit for a national rate call (IIRC that was anything over 35 miles, but they must have counted miles by telephone wiring).
Do you remember Prestel? You could read a blocky teletexty 40x25 page of information and it would say "5p" in the corner of the screen? That didn't mean five pages, that meant you just paid 5 pence to look at that in addition to connection charges and time-on-the-line charges and frankly GPO telecom was horribly expensive. But at least you could pick up the phone and dial "00" to place an international call. Some places (hello Baltimore!) still needed operator assistance to call overseas.
On the other hand, these days we can get more information than we know what to do with in seconds. It is either "free" and mostly unlimited as part of a broadband package, or you'd get 200MB-1GB per month as part of a mobile subscription. Do you have any idea how much coin you'd drop on a 20MB SASI harddisc in the late eighties? This was an era when many kids loaded their games from cassette tape! Now we can fill one of those harddiscs in... about five minutes... with data pulled from all sorts of places on the planet.
You could buy a pocket television. Something like the Sony Watchman (itty bitty flat CRT but good enough resolution to read teletext on BBC2 on a screen just over an inch or two across!). If you were lucky you might be able to hook it to a video player, though often that meant a piece of wire wrapped around the antenna and tuning the television in to the signal (varying degrees of success). We probably would have mocked the hell out of somebody that said in 2012 it would be "normal" to dump dozens of full length feature films, a pile of animé, the contents of every tape and LP you own, and hundreds of documents on to a gadget with a full colour display on the front, a gadget that can be a movie player, a book, a tape deck, a camera, and a telephone. Oh, and it will power itself by a little battery inside that will give several hours of continual use, you can interact with it by prodding it with a finger, and the thing itself will fit inside a cigarette packet.
Once upon a time you kind of tended to give a wide berth to the crazies who walked around talking to themselves. They were either geniuses on the brink of a meltdown, or just plain crazy. Now you see grannies in the supermarket yakking to the air, and you realise she's probably involved in a long discussion about beetroot with her husband, who couldn't care less but is in no way brave enough to say that. That's not a hearing aid, it's a Bluetooth earpiece!
But even better, nowadays I can walk into the middle of a muddy field in rural France in a place full of wheat and cows and bugger-all else worth mentioning for fifty miles in any direction, and watch NHK World live broadcast. Not so long ago (and much more recent than the eighties), it was harder to receive Channel 5!
A friend and I used to CB to each other. It was complicated and for getting good reception it involved tuning the antenna and caring about what sort of power supply the transceiver was connected to, and so on and so on. We used CEPT (PR27GB) sets as it was often a lot quieter than the 27/81 frequencies; except when the weather was such that we were swamped by excitable Europeans shouting at each other with equipment reaching a heck of a lot further than my 4W could ever manage. Who needs CB now? The cheap version is to buy a PMR radio - almost as capable and doesn't need a licence. Or, if you want to talk to people in other countries, you could use Skype or (once upon a time before they buggered it up) Google Talk, both of which permit not only voice discussion and sending files, but also slow and jerky but usable streaming video. You can see the person you are talking to. They could see me, standing in the muddy field, in the middle of nowhere.
So to wrap up, I think you'll find that much of the technology we take for granted was predicted, but it was predicted in science fiction books as future tech in an age where we would live in space in giant orbiting wheels (centrifugal force for gravity). We haven't done so well on that front, perhaps mostly because our way of getting into space still involves sitting on a controlled (and sometimes not so controlled) explosion. But, yes, a tiny handheld gadget that provides access to the world, affordable to the masses. We are so damn close to a real-life Tricorder...
In order to make sufficient progress in computing, we'd need to have actual computer literacy in our society. That doesn't mean that everybody needs to be able to program large software packages, but that people can understand what a computer actually is. People know what a book is, and they could write a little text if they had to, yet few write whole books. However, this knowledge is essential for widespread computer use. People need to know which limitations lie inherent within the technology and which ones are arbitrary, chosen by the designer of the system. Only then can they really choose which systems they want, or what changes they want to existing systems. Some of them may even be able to make those changes themselves.
The next point is that now that we have lots of data, we can make more interesting interfaces. Completely natural interfaces are next to impossible, but you can meet somewhere in the middle between natural language and computing language. You end up with something like SQL which, once you put a bit of effort into learning it, allows you to efficiently state complex questions to a computer which it will answer.
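A minimal sketch of that middle ground, using Python's built-in sqlite3 module (the table and data are invented for the example):

```python
import sqlite3

# In-memory database with a made-up sensor table for illustration.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE readings (room TEXT, temp REAL)")
db.executemany("INSERT INTO readings VALUES (?, ?)",
               [("kitchen", 21.5), ("kitchen", 19.0), ("hall", 17.5)])

# The "complex question", stated declaratively rather than procedurally:
# which rooms average below 20 degrees?
query = "SELECT room, AVG(temp) FROM readings GROUP BY room HAVING AVG(temp) < 20"
print(db.execute(query).fetchall())  # [('hall', 17.5)]
```

The point is the shape of the question, not the library: halfway between "tell me which rooms are cold" and hand-written loops over the raw data.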
Intel will make some of the components that drive the sixth wave. Apple invented ubiquitous computing with the iPad, though they lack the breadth of Google's vision, which is quite frankly frightening. However, Google's Glass presents more questions than answers. I don't think this kind of clunky technology will be truly useful until we have implants to do the job. Then we will all be assimilated.
"Apple invented ubiquitous computing with the iPad" earned a downvote on two counts.
1) Apple didn't even invent the tablet form factor; there is a substantial body of prior art, including Kubrick's 2001.
2) The term "ubiquitous computing" appears to be something that Mark Weiser came up with in 1988.
I sincerely hope that you treat Apple's IP as respectfully as you treat IP belonging to others.
"1) Apple didn't even invent the tablet form factor, there is a substantial body of prior art, including Kubrick's 2001."
That certainly didn't make computing ubiquitous.
"2) The term "ubiquitous computing" appears to be something that Mark Weiser came up with in 1988."
Sorry, I didn't realise anyone had to come up with the idea. There are two words: ubiquitous and computing. You can look them up. When combined, surely you can make your own mind up as to what this means? To me, it means lots of people using computing devices constantly throughout their lives and doing computing-type things (i.e. using useful apps and services). The desktop computer didn't achieve this, since you can walk away from a desk. Ditto for the laptop, since its lack of immediacy detracts from the portability in this respect. The early smartphones didn't make computing ubiquitous because users largely ignored the smart part.
The modern phone and tablet form factor (i.e. the combination of hardware, software and immediacy) OTOH do make computing far more ubiquitous. Who would you like the credit for that to go to? Kubrick? Kay? One of the doers who actually made it happen?
"I sincerely hope that you treat Apples IP as respectfully as you treat IP belonging to others."
What the hell are you going on about? I didn't infringe anyone's IP.
"There are two words: ubiquitous and computing. You can look them up."
I did look them up, and that's how I came across a reference to Mark Weiser's work in 1988.
"When combined, surely you can make your own mind up as to what this means?"
I did my research and I made up my own mind, so mission accomplished.
Ah, and then we get to another nebulous phrase:
"the laptop since its lack of immediacy detracts from the portability in this respect"
Granted, I've never seen 'immediacy' on any laptop spec sheet, but then again it hasn't popped up on any phone brochure I've seen either. Please give us a definition of what "lack of immediacy" actually means in this context. The only clue we have to your interpretation of that phrase is this:
"The early smart phones didn't make computing ubiquitous because users largely ignored the smart part."
I beg to differ on this point, because I happen to believe that stuff will still exist even if I was to ignore it. Despite my best ignoring efforts Microsoft, Oracle, Celebritards and Politicians still continue to exist.
You have been very lucky to survive crossing the road while computing ubiquitously with your phone all this time since the iPad was released.
"Who would you like the credit for that to go to? Kubrick? Kay? One of the doers who actually made it happen?"
Why does there have to be any specific person to give credit to? You know, all around the developed world we have internet access fast enough to deal with streaming radio and some streaming TV (I can watch a fair few channels on my mere 2Mbit). But who takes the credit? Think before you answer: no matter how cool the networking kit, it would mean little without the millions of miles of phone lines, and ADSL probably developed or learned from ISDN, which developed or learned from analogue modems, which... you see? There are people/companies responsible for the jumps, but it also needs history and it needs evolution. The credits are many. I imagine it is a similar story for the smartphone and tablet. Sure, Apple saw an opening in the market and they went for it, but this was aided greatly by the right technology being in place at the right time and somebody clever enough to join the dots. There have been earlier attempts that failed for a variety of reasons (poor resolution, poor UI, poor battery life, can't do anything other than the built-in apps, etc etc).
"(i.e. the combination of hardware, software and immediacy)"
Immediacy? What, you mean like it doesn't take forever to start up? I grew up in the '80s, my first computer took about half a second to go "burrrr" and after a brief pause, it went "beep!". The boot took less time than it took for me to sit down from reaching around the back to the power switch.
"The early smart phones didn't make computing ubiquitous because users largely ignored the smart part."
Don't mix up smart phones with feature phones. I think you'll find that, aside from a collection of crappy J2ME interpreters on devices with amazingly poor screens (102x102, 256 colours on one of my old phones), many of the feature phones basically didn't do much more than what they could do out of the box. I could savage my Nokia. I think it was a 6210i, but don't quote me on it. The J2ME wasn't bad, I could install OperaMini, however it used shared memory, so the more Java applets I installed, the more crashy the phone became as memory it used for other stuff (like SMS) was no longer there. Then there is the email software that would only download one email before crashing due to a lack of memory (same even with no applets on the phone). Some feature phones were good, some were awful. Most, you couldn't do much with beyond the feature set built in. Ever tried reading a mere text file on one? Did the software crash if the file was >64K?
Perhaps the single biggest advance of the smartphone (of any flavour) is that an app is no longer an afterthought hidden in some menu option, but right there on the front line. PDF readers, email, Amazon, MXPlayer, manga reader, map, browser, phone dialler and address book - the phone doesn't make any specific distinction. Everything can be given an icon, and tapping on it makes it start. The built-in stuff, the stuff you download, the stuff you might write. In this way, while you are running Apple or Android or what-have-you, you are able to customise the thing to your own personal tastes. Then throw in the notifications and the widgets. Who just emailed you? What time is it? What song am I listening to? Will it rain? All this stuff can be presented directly on your home screen(s), so you don't even need to start an app to look at it. The ultimate personalisation.
I have a feeling that Google Glass is more to do with input to a device (be it smartphone/tablet or watch) through a gesture-based virtual keyboard or the like. I also wonder how much of Google's crazy development ideas are to befuddle the competition while using it as a test platform for some processes related to their long term info collection/distribution. Almost as if they don't expect the main idea to make money, but rather they are focusing on what they learn while developing said ideas while competitors kill themselves trying to copy/follow/outdo.
I'd say that's pretty pirate, eh matey?
AMD spun off their manufacturing to GlobalFoundries, so they are more like ARM now. MS have dabbled in hardware (mice and keyboards, later the Xbox and the Surface devices, but I imagine the physical production line belongs to someone else) but remain primarily software and services.
Well, I looked at my musical box and, yes, there is either a pin or not a pin... BUT the distance between the pins is part of the stored data, and that distance is ANALOG!
I'm taking a guess that a loom might be similar but I don't happen to have one lying around.
This is nonsense on at least two fronts:
First, the waves don't necessarily wipe each other out. The new waves just go into new market areas. We still have mainframes, PCs and phones coexisting. Sure, minicomputers got wiped out. The "sixth wave" is not going to replace any of the other stuff.
Even if you have a driverless car, you'll still want an iPhone to tweet about it and a mainframe to run your banking services.
Secondly, we've really had this "sixth wave" for a long time already. It is embedded computing that puts computers into cars, digital thermometers, washing machines,... For years now an Intel-inside PC has had more ARMs than Intel cores. Heck, even the typical hard drive has two or three ARM cores.
Have your machine learn this:
"Time flies like an arrow."
Now exactly what does that mean?
1. "Flies of Time" admire arrows?
2. "Flies of Time" think an arrow is a tasty thing to eat?
3. The best way to measure the speed of a fly is to time it like you would the speed of an arrow?
4. The best way to time how fast a fly completes some type of task (e.g., eating an ant) is to time it like you would time an arrow completing the same task?
5. Time passes as quickly as an arrow flies?
6. Time flies through the air in the same way that an arrow flies through the air?
Machine learning does have a lot of useful applications, but its applicability is a lot less general than anyone really likes to admit.
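The ambiguity above is easy to demonstrate mechanically. Here is a minimal sketch (lexicon and sentence patterns invented for illustration, not any real parser): each word is allowed several parts of speech, and even a crude pattern filter leaves multiple surviving readings.

```python
from itertools import product

# Toy lexicon, invented for illustration: each word can play
# more than one part of speech.
lexicon = {
    "time":  {"NOUN", "VERB", "ADJ"},
    "flies": {"NOUN", "VERB"},
    "like":  {"VERB", "PREP"},
    "an":    {"DET"},
    "arrow": {"NOUN"},
}

# A few sentence-level tag patterns a crude grammar might accept.
patterns = {
    ("NOUN", "VERB", "PREP", "DET", "NOUN"),  # time passes like an arrow
    ("VERB", "NOUN", "PREP", "DET", "NOUN"),  # imperative: time the flies
    ("ADJ",  "NOUN", "VERB", "DET", "NOUN"),  # "time-flies" admire an arrow
}

sentence = "time flies like an arrow".split()

# Enumerate every tag assignment, keep those matching a pattern.
readings = [tags
            for tags in product(*(lexicon[w] for w in sentence))
            if tags in patterns]
print(len(readings))  # 3 readings survive
```

Three grammatically plausible readings remain, and nothing in the syntax alone tells a machine which one the speaker meant; that disambiguation needs world knowledge, which is exactly where the "learning" gets hard.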
I mean, those selfish genome type of Genes, not the other things that happen to get called Gene, like Gene Kelly.... But I digress. The Selfish Genes will win, no matter the wave. And the humans had better, by Gene, let them win, help them win, cheat for them to win even, if those poor lowly humans hope to have any chance of surviving. Because if the Selfish Genes DON'T win, they'll just take their balls and go home.
Where's the Icon with the slobbery tongue-thingy when you need it?
Biting the hand that feeds IT © 1998–2019