
back to article Future of computing crystal-balled by top chip boffins

If you thought that the microprocessor's first 40 years were chock full of brain-boggling developments, just wait for the next 40 – that's the consensus of a quartet of Intel heavyweights, past and present, with whom we recently spoke. At the 4004's 40th birthday party in a San Francisco watering hole on November 15, The Reg got …

COMMENTS

This topic is closed for new posts.
Silver badge

Daft thing is ...

This laptop+docking station is roughly eight years old. It provides all the computing power I need to run the Ranch, and all the Internet connectivity that I require for voice, data & video communications world-wide.

Note that I am an adult, not a gamer.

Anonymous Coward

You speak as though being a gamer is a bad thing... GPGPU wouldn't exist if there weren't a financial reason for it, and gamers pay the bills for the research that really makes it all possible. If we only needed advanced 2D, we wouldn't have all the excess power in graphics cards that we can now re-purpose... which is making its way into supercomputers.

If you're short-sighted enough to suggest that, on the whole, humans have hit a limit on the processing power we need unless we play games... I suppose you didn't need more than 64K of memory either?

Ru
Meh

So what you're saying is,

if you have no need to run compute-intensive applications, you don't need a powerful computer? That's some piercing insight right there.

Leaving aside your mildly offensive assumption that adults could not possibly wish to play games, there's a whole world of applications out there which actually need powerful hardware (and haven't just bloated to fill the space). I for one am extremely glad that I can compile one particular project in 60 seconds these days, when it was more like 12 minutes ten years ago.

Thumb Up

That's all very well, but will they play Crysis 6?

Seriously though, excellent couple of articles Rik. Stuff like this is the reason I love the Reg, and will forgive all the (usually Apple) clickbait articles to come back and read.


It's a shame Faggin isn't a bit younger...

... because I rather fancy a wager on the success of neuro-computing.

Say $50 on the Human Brain Project ( http://www.humanbrainproject.eu/ ) turning out something useful if it gets ten years' funding.

The snag with teaching yourself neuroscience is that it's really an entirely new subject; you'd be better off getting real experts involved. So HBP involves people like Henry Markram (Neuroscience), Steve Furber (ARM designer), and Seth Grant (Human Genome project, neuro-genomics expert).

Dave Lester (APT Manchester / HBP project)

Silver badge
Coat

>Intel's first strained silicon processor was the "Prescott" Pentium 4 of 2004.

That's the effect of poor code name choice for you!

Silver badge

Mimic the human brain!

I started computing when the 4004 came out, using machines the size of a large car. Twenty years earlier, the thermionic LEO ran the whole of Lyons' business: thousands of shops and warehouses, with employees paid and stock controlled.

We now have machines a billion times more powerful than the 4004 on each desk, and we can just about manage to e-mail accounts to pass the buck, or spend all day arguing over which font to use on something that only a computer should ever see.

They can already mimic the human brain - just turn them off!

Angel

In my yoof...

..., I did one of the newfangled MScs giving an intense grounding in all things digital, including design and fab (I was working in opto, so had no real need to pursue this practically).

At that time, the discussion was about 1 µm devices, with the perceived barrier being the retardation effect, in which device channels started acting as waveguides rather than mere connections.

Can anyone point me to a relatively low tech discussion of what happened to the physics challenges?

Anonymous Coward

"retardation effect"

Particularly prominent in technology with a certain fruit on the label.

(ducks to avoid flames)

Silver badge
Coat

Really?

"...The purpose of the computer, said the man who helped bring them into so many aspects of our lives, is to help us be "better human beings in terms of the things that matter to a human, which are creativity, loving, and living a life that is meaningful."..."

That may be what he thinks, but has he asked the white mice what they think we're for?

So long, and thanks for all the fish in my pocket...

Gold badge
Unhappy

21st century lithography, 21st century materials, 1970s instruction set

BTW, atomic diameters are roughly 1/10 of a nm, so *roughly* 10 years of further scaling is foreseeable in terms of density.
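For the curious, here is a back-of-the-envelope sketch of that density arithmetic; the starting node, shrink factor, and cadence are my own assumptions for illustration, not figures from the article.

```python
# Rough scaling sketch: start from a 22 nm node, apply the classic ~0.7x
# linear shrink every two years, and see when features approach atomic
# dimensions (atoms are ~0.1 nm across, so ~1 nm is only a few atoms wide).
node_nm = 22.0                      # assumed 2011 starting point
years = 0
while node_nm > 1.0:                # stop when features are ~10 atoms across
    node_nm *= 0.7                  # ~0.7x linear shrink per generation
    years += 2                      # assume one generation every two years
    print(f"after {years:2d} years: ~{node_nm:4.1f} nm")
# Single-digit nanometres arrive after about a decade, and the few-atom scale
# within roughly two, so "roughly 10 years" of comfortable headroom is the
# right ballpark under these assumptions.
```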

The business about the fin-shaped transistors having better *off* characteristics is interesting, as in principle that should lower the off current by (potentially) quite a bit.

Handy when you've got an effect that's wasting power at *any* clock frequency even when the processor is doing nothing.

And on the subject of Mr (Dr?) Faggin's pessimism about cognitive computing, I'd remind people of Arthur C. Clarke's observations about learned men saying what can and cannot be done.

On the whole depressing.

JDX
Gold badge

Fascinating Article

Thanks El Reg, nice amongst the Apple rumours!


This is a piece of misinformation. Progress in chip process scaling is nearly over: there is no way to go lower than 14 nm, so there are only two "process shrinks" to go.

Bronze badge
Unhappy

Big hot throbbing x86 forever

Intel would rather battle the laws of physics than create a new instruction set? No wonder there's so much research using different CPUs, GPUs, and DSPs. The science behind low-level software must evolve too. One thing that's sorely lacking is executing parallel tasks with low latency. Researchers, compiler designers, and game developers could probably rattle off a dozen other areas where low-level software hasn't kept up.
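To make that latency point concrete, here is a minimal sketch (my own illustration, not anything from the article) that measures how much it costs just to hand a trivial task to a thread pool and wait for the answer, compared with running it inline:

```python
# Dispatch overhead for fine-grained parallel tasks: submit a near-empty task
# to a thread pool and wait for its result, versus calling it directly.
import time
from concurrent.futures import ThreadPoolExecutor

def tiny_task(x):
    return x + 1                            # almost no work: overhead dominates

N = 10_000

with ThreadPoolExecutor(max_workers=4) as pool:
    t0 = time.perf_counter()
    for i in range(N):
        pool.submit(tiny_task, i).result()  # submit + wait, one task at a time
    dispatched = time.perf_counter() - t0

t0 = time.perf_counter()
for i in range(N):
    tiny_task(i)                            # the same work, run inline
inline = time.perf_counter() - t0

print(f"per-task dispatch overhead: ~{(dispatched - inline) / N * 1e6:.1f} us")
# On typical hardware the overhead is tens of microseconds per task (many
# thousands of instructions), which is why fine-grained, low-latency
# parallelism remains an awkward fit for today's software stacks.
```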

Silver badge

Going non-extreme?

With tablets and phones and things like the Mac Air, there may be a push to scale tech down to meet other objectives, such as portability, providing "just enough" power rather than oodles.


Material costs

A lot of the cost factor comes down to the size of the silicon wafer. They keep etching smaller and smaller features onto the same-sized wafer, and increasing the wafer size would increase costs. If they can find another material to etch onto that is cheaper to produce, then perhaps we can overcome some of the cost factors and just increase the size somewhat (or, if the material is better, keep the same sizes or go smaller and cheaper).
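As a rough illustration of why die size against a fixed wafer size dominates cost, here is a quick sketch using a standard first-order dies-per-wafer estimate; the 300 mm wafer and the die areas are assumptions for illustration, not figures from the article.

```python
import math

def dies_per_wafer(die_area_mm2, wafer_diameter_mm=300):
    """First-order estimate: gross wafer area over die area, minus an
    edge-loss term for partial dies around the wafer's circumference."""
    d = wafer_diameter_mm
    return int(math.pi * (d / 2) ** 2 / die_area_mm2
               - math.pi * d / math.sqrt(2 * die_area_mm2))

for area in (400, 200, 100):                 # hypothetical die areas, mm^2
    print(f"{area:3d} mm^2 die -> ~{dies_per_wafer(area)} candidate dies")
# Halving the die area (roughly what a process shrink buys you) more than
# doubles the dies cut from the same fixed-cost wafer, which is where most
# of the per-chip cost saving comes from.
```

(Per-die yield losses would tilt the numbers further in favour of smaller dies, but the basic geometry is the point.)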

I'm still waiting for a modular approach to processors, i.e. if I want to make my system faster, I plug an extra CPU/GPU chip onto, or alongside, the existing CPU/GPU. I know you can get boards with multi-processor slots, but I am thinking more consumer-grade here.


Mimicking the human brain isn't necessarily a good target

We can't do 10 million floating point operations per second; the brain simply isn't designed for that kind of stuff. Some people I know couldn't do one floating point operation per ten million seconds. The brain is good at what it's good at, but it's not good at what it's not good at, and the reason we have tools is to complement our natural abilities. A hammer is made of metal so that it can bash nails in. If a hammer "mimicked a human fist" then it would be useless - we can't punch nails into the wall (with the possible exception of Chuck Norris).


Re: Human FLOPS

Ah, we can't *consciously* do ten million flops per second. But maybe we are doing them *unconsciously* all the time while processing the world. Example: those people who can find the n-th root of huge numbers faster than a computer. Sure, to them it feels like manipulating colours or something, but how are they arriving at the correct answer unless they are tapping into some kind of serious processing ability that maybe exists at the cellular level in everybody, just more directly accessible by them?

JDX
Gold badge

re:There is no-way to go lower than 14nm

Well, now that a random person has said it on the internet, won't Intel be feeling silly!


This is deliberate misinformation, targeted at shareholders, with the aim of keeping share prices high.


Future of Computing

You may have missed another "Future of Computing" three-part series in EDN.

http://www.edn.com/article/520059-The_future_of_computers_Part_1_Multicore_and_the_Memory_Wall.php

Allegedly Intel is toast... or some such.
