The original news release was issued by Stanford News and written by Tom Abate.
Our need for fast and efficient computers isn’t going anywhere, and, as people have said time and time again, Moore’s law is starting to lose its breath. But if the law of diminishing returns and the limits of silicon chips give you panic attacks, you can stop worrying. More than one lab is taking steps to ensure that our current and near-future computers are left in the dust once we need to make the switch to the next generation. Most recently, researchers from Stanford University have made a considerable contribution to making silicon chips look like ancient history.
Their work is, for all intents and purposes, a paradigm shift from the two wildly different ways we currently tackle computer memory. The first type – volatile memory – has superior speed but loses its data once the power is turned off. It has found its footing in computer RAM, where that speed is an invaluable asset for working with processors to store data during computations at speeds measured in nanoseconds. The second type – nonvolatile memory – is considerably slower but retains its data without power, which makes it well suited for flash storage.
Now, Stanford-led research shows that an emerging memory technology, based on a new class of semiconductor materials, could deliver the best of both worlds, storing data permanently while allowing certain operations to occur up to a thousand times faster than today’s memory devices. The new approach may also be more energy efficient.
“A thousandfold increase in speed coupled with lower energy use suggests a path toward future memory technologies that could far outperform anything previously demonstrated,” said Aaron Lindenberg, an associate professor of materials science and engineering at Stanford and of photon science at the SLAC National Accelerator Laboratory, who led the research team.
Today, memory chips are commonly based on silicon technologies that efficiently switch electron flows on and off, representing the ones and zeroes that drive digital software. But researchers continue searching for new materials and processes that use less energy and require less space than silicon solutions.
Phase-change memory is one possible next-generation technology. Scientists have known for some time that certain materials have flexible atomic structures that offer interesting electronic possibilities. For instance, phase-change materials can exist in two different atomic structures, each of which has a different electronic state. A crystalline, or ordered, atomic structure permits the flow of electrons, while an amorphous, or disordered, structure inhibits it.
Researchers have developed ways to flip-flop the structural and electronic states of these materials – changing their phase from one to zero and back again – by applying short bursts of heat, supplied electrically or optically.
Phase-change materials are attractive as a memory technology because they retain whichever electronic state corresponds to their structure. Once their atoms flip or flop to form a one or a zero, the material stores that data until another energy jolt causes it to change. This ability to retain stored data makes phase-change memory nonvolatile, just like the silicon-based flash memory in smartphones.
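To make that two-state picture concrete, here is a minimal conceptual sketch in Python of a single phase-change cell. The class and method names are hypothetical, chosen purely for illustration; the sketch only mirrors the behavior described above (crystalline conducts and reads as a one, amorphous blocks current and reads as a zero, and the state persists until the next energy jolt), not any real device interface.

```python
# Illustrative sketch only: a toy model of one phase-change memory cell.
CRYSTALLINE = "crystalline"  # ordered structure: conducts electrons, read as a 1
AMORPHOUS = "amorphous"      # disordered structure: blocks electrons, read as a 0


class PhaseChangeCell:
    """One bit of toy phase-change memory."""

    def __init__(self):
        self.structure = AMORPHOUS  # start in the disordered, non-conducting state

    def jolt(self, bit):
        """A short burst of energy (heat, supplied electrically or optically) flips the structure."""
        self.structure = CRYSTALLINE if bit == 1 else AMORPHOUS

    def read(self):
        """Reading checks conductivity; no power is needed to keep the state."""
        return 1 if self.structure == CRYSTALLINE else 0


cell = PhaseChangeCell()
cell.jolt(1)        # energy jolt: amorphous -> crystalline
print(cell.read())  # prints 1, and keeps printing 1 until the next jolt (nonvolatile)
```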
But permanent storage is only one desired attribute. A next-generation memory technology also needs to perform certain operations faster than today’s chips. By using extremely precise measurements and instrumentation, the researchers sought to demonstrate the speed and energy potential of phase-change technology – and what they found was encouraging.
The new research focused on the unimaginably brief interval when an amorphous structure begins to switch to crystalline, when a digital zero becomes a digital one. This intermediate phase, in which charge flows through the still-amorphous structure as if it were crystalline, is known as “amorphous on.”
Using a sophisticated detection system, the Stanford researchers jolted a small sample of amorphous material with an electrical field comparable in strength to a lightning strike. Their instrumentation detected that the amorphous-on state – the start of the flip from zero to one – occurred less than a picosecond after they applied the jolt.
To grasp the brevity of a picosecond: it’s roughly the time it would take a beam of light, traveling at 186,000 miles per second, to pass through two pieces of paper.
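For readers who want to check that comparison, the arithmetic is straightforward. The sketch below assumes only the speed of light quoted above and a typical sheet thickness of roughly 0.1 mm, which is an assumption rather than a figure from the study.

```python
# Back-of-the-envelope check of the "two pieces of paper" comparison.
speed_of_light_miles_per_s = 186_000  # figure quoted in the article
meters_per_mile = 1609.34
picosecond_s = 1e-12                  # one trillionth of a second

distance_m = speed_of_light_miles_per_s * meters_per_mile * picosecond_s
print(f"Light travels about {distance_m * 1000:.2f} mm in one picosecond")
# ~0.30 mm, roughly the combined thickness of a couple of sheets of ordinary paper
# (a single sheet is typically assumed to be on the order of 0.1 mm thick).
```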
Showing that phase-change materials can be transformed from zero to one by a picosecond excitation suggests that this emerging technology could store data many times faster than silicon RAM for tasks that require memory and processors to work together to perform computations.
Space is always a consideration in design, and previous experiments have shown that phase-change technology has the potential to pack more data in less space, giving it a favorable storage density. Taking energy into account, researchers say the electrical field that triggered the phase change was of such a brief duration that it points toward a storage process that could become more efficient than today’s silicon-based technologies.
Finally, although this experiment did not establish precisely how much time would be required to completely flip an atomic arrangement from amorphous to crystalline or back, these results suggest that phase-change materials could perform superfast memory chores and permanent storage – depending on how long the thermal excitation is engineered to stay inside the material.
Much work remains to turn this discovery into functioning memory systems. Nonetheless, attaining such speed using a low-energy switching technique on a material that can store more information in less space suggests that phase-change technology has the potential to revolutionize data storage.
“A new technology that demonstrates a thousandfold advantage over incumbent technologies is compelling,” Lindenberg said. “I think we’ve shown that phase change deserves further attention.”