Intel and chip-tech house Numonyx unveiled a new technology on Wednesday that the companies claim will enable non-volatile memory to break through NAND's 20nm barrier and scale down to process sizes as tiny as 5nm - and do so cost-effectively. What's more, the resulting stacked memory arrays could potentially usurp current DRAM …
Reg, why don't you publish an article...
...about one of these new nonvolatile technologies actually hitting the mass market? THEN perhaps I'll pay attention. All these "still in the experimental stage" articles depress me.
I remember hearing that name a long time ago, where one Ovshinsky had found a way to make transistors with glass instead of silicon, and was looking to revolutionize solar cells and the like. It's been a long journey to acceptance for this technology since it first received publicity, although this isn't the first time it has had some real-world success.
Can I be the first to say...
This is what I have been waiting for: all forms of storage unified into one!
No more separate RAM, HDDs and so on. Then they can fit it on the same chip as the CPU, and with the massive reduction in computer components it can all go massively parallel.
So ten years from now maybe we could have a computer with a hundred cores, and each core has 1TB of memory that you can use how you want. I hope they can do it at a good cost as well.
Imagine if you want to upgrade your computer you just add more unified chips.
Yeah, that's my dream of how computers could be in the future. Let's just hope it doesn't get strangled by a bloated OS.
Allow me to dream here, dammit!
"... that would allow the munging together... "
A choice word!! Never heard that one before. More! Conjugated in irregular perfect and imperfect tenses, please.
I like this!
I love this!
The only downside is that if your memory goes bad then so does your storage.
I couldn't disagree with you more, Charles - I find stuff like this fascinating to read, plus knowing about something before it becomes mainstream is never a bad thing =)
Don't hold your breath
These technologies often take a really long time to become mainstream and sometimes never do.
NAND flash was invented in the 1980s but only really got going in around 2003.
FRAM never made it into the mainstream.
It isn't just making the stuff more reliable. It has to be cost effective too.
NAND flash and DRAM will be with us for a long time yet.
Also found it fascinating ....
But surely if you're aiming at layering stuff onto chips then it's the second layer that's hardest rather than the first?
I did at first, too, and have been hearing forever this talk of the fabled "unified memory". But after hearing tons of talk about MRAM, RRAM, PCM, Racetrack memory, and so on...with absolutely nothing commercially viable to show for it, I've become rather jaded and want everyone to just cut to the chase. "Show me the money, sonny!" Tell me when I can have these things in my computer, iPod, or so on so that I can actually ENJOY the benefits for myself.
It must be XIP then?
i.e. I guess it must be properly addressable and support eXecute In Place operation if they're predicting replacing DRAM.
NAND can't be addressed the same way as DRAM or NOR flash, it's a block device conceptually similar to a hard drive but with a lot more quirks. This is one of the reasons it took so long to replace NOR flash in cellphones - to use NAND flash for code, you need to have DRAM to execute it from. Early all-NAND devices needed twice as much DRAM as the NOR devices they were replacing.
Still, this stuff sounds promising. A smaller process and highly granular writes should mean it uses less power than NAND too.
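To make the addressing difference above concrete, here is a minimal sketch (with an assumed 2KB page size; the geometry is illustrative, not any real part's) contrasting a byte-addressable, XIP-capable memory with a NAND-style block device that must transfer a whole page to deliver a single byte:

```python
PAGE_SIZE = 2048  # assumed NAND page size for the demo

def xip_read(mem: bytes, addr: int) -> int:
    """Byte-addressable read: fetch exactly one byte, as a CPU
    executing in place (XIP) from NOR or PCM would."""
    return mem[addr]

def nand_read(mem: bytes, addr: int):
    """NAND-style read: the controller transfers the whole page
    containing addr. Returns (page, bytes_transferred)."""
    start = (addr // PAGE_SIZE) * PAGE_SIZE
    page = mem[start:start + PAGE_SIZE]
    return page, len(page)

mem = bytes(range(256)) * 32          # 8 KiB of demo "flash"
b = xip_read(mem, 5000)               # 1 byte moved
page, moved = nand_read(mem, 5000)    # 2048 bytes moved for the same byte
```

This is why an all-NAND phone needed DRAM to execute from: the CPU can't fetch individual instructions out of a page-oriented device.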
A half a century and...
Since the sixties, Stan's phase change amorphous memories have always been "just around the corner from full production."
The guy is good and ya' have to hand it to him: He's been right for half a century. Always just around the corner.
It is what I am waiting for
The unified memory - get rid of all this copying data back and forth, don't bother 'loading' an app into RAM, just run it where it is, modify a few bytes here and a few bytes there, and only duplicate when you need to.
Now all we need are OSes which would work with this architecture.
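Today's OSes already approximate this "run it where it is" idea with memory mapping, which is one route such an architecture could take. A minimal sketch using Python's `mmap` (the file name and contents are made up for the demo): the file is mapped into the address space and a few bytes are modified in place, with no explicit load/save cycle.

```python
import mmap
import os
import tempfile

# Stand-in for a region of persistent, byte-addressable storage.
path = os.path.join(tempfile.mkdtemp(), "store.bin")
with open(path, "wb") as f:
    f.write(b"persistent state lives here")

# Map it and edit in place - no read-into-RAM, modify, write-back dance.
with open(path, "r+b") as f:
    with mmap.mmap(f.fileno(), 0) as m:
        m[0:10] = b"PERSISTENT"  # modify a few bytes where they live

with open(path, "rb") as f:
    data = f.read()
# data now begins with b"PERSISTENT"
```

With truly unified memory the mapping step itself would disappear: the "file" and the working copy would be the same bytes.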
Indiana Jones and the search for the holy grail of memory...
PCM is good, but it isn't the holy grail of memory. It's better than Flash in a number of ways (although Flash wins in a few others), but it's not going to replace RAM, ever: PCM still wears out. In theory PCM can take a lot more writes than Flash, but it's not like RAM, which allows effectively unlimited writes. Still, PCM offers a far higher number of writes, combined with the fact that it can address individual bits, which greatly helps increase its storage lifespan. Flash works OK but has a lot of limitations. One advantage Flash does have is multi-bit (multi-level) storage, which increases its density (but partly at the expense of reducing the number of reliable writes).
But I don't see how Flash is going to be displaced by any other memory technology until that technology offers better storage densities than Flash. (Until then, any rival will remain a very niche-market memory, and some people will unfairly laugh at it for not being mass market.) The point where Flash loses the density race won't come until we are approaching much smaller scales, as it's hard to scale Flash down. (Intel aims to start 11nm by about 2015-2016.) So we are looking at a few years at least in which Flash is unchallenged in storage.
The only two technologies I've seen so far that could in theory overtake Flash in memory density (in the next few years) are memristors and maybe, just maybe, PCM, but both still wear out, so neither is a RAM-style replacement.
I am, however, unclear on the potential of FRAM; I'm not sure if it wears out eventually. FRAM does have the potential to be very fast, though - in theory it could beat DRAM - so if it doesn't wear out, that's one to watch very closely.
As for MRAM, this one is even more unclear to me. In theory it doesn't wear out, it's as fast as RAM, and it has densities similar to DRAM - but unfortunately neither DRAM nor MRAM comes close to the density of Flash. The problem is that this leaves MRAM without a market. It can't replace DRAM, because that would require a redesign of computers, which won't happen for the vast majority of them (so MRAM stays niche), and it can't replace Flash, because it can't match Flash's storage density (so again MRAM stays niche). MRAM seems to be trapped by market forces at the moment. I think one big hope for MRAM is that it can also save power. That could open up many markets, with a single chip replacing both DRAM and smaller Flash parts - the phone market, perhaps. By replacing both DRAM and Flash while saving power, on and off, it has a lot of potential. It could also do very well in embedded applications for the same reasons, acting as both DRAM and some storage space.
If they could crack the density issue with MRAM, it would become the holy grail of memory. That would suddenly make it extremely big business.
From a marketing point of view, I can't help thinking that in the future instant-on PCs would be a major selling point for most people, who hate waiting for things to boot up. It's not hard to imagine a future where the idea of waiting for a computer to boot becomes as old-fashioned as listening to a modem is to us now. (Imagine two people switching their machines on at the same time: one instantly goes ping and is ready to use, while the other has to sit and watch for a few minutes as it slowly gets ready.) I suspect most people would end up laughing at boot times as very old-fashioned. But I think it's going to take a big company making its own high-density MRAM for its own sub-notebook to really shake up the market and kick-start MRAM sales - and if one does, I think it could end up making a fortune from MRAM. My guess is that companies like Samsung and Sony could do it. (My money is on Samsung. :)
> Now all we need are OSes which would work with this architecture.
something along the lines of single-level store? as in Multics or OS/400?
heard this one before
Numonyx previously used BJTs for PCM. They introduce OTS here for multilayer operation, but are actually only demonstrating a single layer. Having studied OTS, I can say they are going out on a limb. Just convincing the industry to use PCM instead of flash is difficult; now we also have to rely on an OTS instead of a transistor. Two immature technologies... bad move.
Look, you whiners out there who want the vendors to "just show me the money, sonny" obviously have never worked at the coal face. I have, for virtually my entire career (22 years and counting). It takes time, and timing, to shepherd a new technology from the lab all the way to the retail store. And big companies and governments share the common trait of moving like snails when it comes to change, so it's a good thing the bleeding edge takes time to advance - it gives folks like me time to prepare the culture and infrastructure to accept the new technology. I rely on El Reg (among many others) to bring me the news of where all these advances are -- good job, Reg!
So quit yer yapping, kick back, soak in the future-looking buzz, and imagine what YOU would do with PCM memory. THEN, TELL INTEL!! How do you think they know which technologies to get behind, which to put on a back burner to cook a bit longer, and which to shelve? Yeah, partly the technology itself tells them... but in great part, it's *US*, their potential customers, who tell them. [And for the record, I've been telling them we need PCM memory for a few years now.] It's up to us to look over the technology, understand how it works, see if we can find any potential flaws or drawbacks, imagine how we'd use it, and tell the vendors who are trying to bring these technologies to market.
If you don't -- if you just sit back and kvetch about the news -- then you really can't complain about the speed or capacity of your technology, now can you?