Bridging light and electrons

A new ferroelectric memory device stores and retrieves data using both light and electricity, offering a compact and scalable solution to bridge electronic and photonic computing systems.
Did you know that nearly every microprocessor today — from those in smartphones to supercomputers — still runs on a design architecture conceived in the 1940s?
Known as the von Neumann architecture, it separates memory and processing units and connects them via a shared data channel. While the architecture has underpinned decades of progress, it’s showing its age in the face of increasingly data-intensive technologies such as artificial intelligence. Moving data between memory and processor takes time, wastes energy and hamstrings performance.
Associate Professor Gong Xiao has a solution: a compact, silicon-compatible memory cell that stores data using light or electricity, and reads it in both formats simultaneously. At the Department of Electrical and Computer Engineering, College of Design and Engineering, National University of Singapore, Assoc Prof Gong led a team to build this photonic-electronic memory using a thin film of ferroelectric material on a silicon ring resonator — a design that retains data without power and operates at telecom wavelengths.

Together with his team, Associate Professor Gong Xiao built a photonic-electronic memory using a thin film of ferroelectric material on a silicon ring resonator.
The stable, low-power and multi-modal memory device could help bridge the gap between photonic and electronic systems. It is also compatible with existing chip manufacturing processes, which makes it easier to scale for real-world applications, from high-speed data centres to processors that mimic the human brain.
The researchers’ findings were published in Light: Science & Applications on 23 August 2024.
Overcoming the memory bottleneck
Photonic microchips, which process data using light instead of electricity, offer faster speeds and lower power usage. However, many photonic devices lack a simple and efficient way to store information directly as light. Each time data must be saved, light signals are converted back into electrical ones, and then converted back to light for processing. This back-and-forth adds delays and chips away at the benefits of photonic computing.
To realise the potential of photonics, researchers have been searching for a memory device that can store data using both electrical and optical inputs, and output data in either form. It would also need to retain information even when the power is off — a feature known as non-volatility.
While various solutions, such as phase-change films, have shown some promise, they tend to introduce instability, programming complexity or incompatibility with the standard silicon processes that underpin modern computing.
Assoc Prof Gong’s team took a fresh approach. They designed a memory cell around a silicon ring resonator, a tiny structure through which light circulates continuously. Onto this resonator, the team applied an ultra-thin layer of aluminium-doped hafnium oxide — a ferroelectric material that responds to small electrical changes by shifting the orientation of its internal electric dipoles.
“Ferroelectric materials let us control a kind of built-in polarity,” explains Assoc Prof Gong. “Once set by a voltage or even light, that state holds, and we don’t need to keep using power to maintain the data.”
This change in polarity affects two things at once: it alters the capacitance (how the device responds electrically) and the refractive index (how it interacts with light). As a result, the same memory cell can be written and erased using either electrical pulses or light, and read through both electrical sensing and optical signals — with no interference between the two.
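To make the dual readout more concrete, here is a minimal numerical sketch. It is illustrative only: the function, the resonance order, the ring size and the index and capacitance values are all assumptions, not figures from the published device. The stored polarization state shifts the ring's effective refractive index, which moves its resonance wavelength for optical readout, and it also changes the cell's capacitance for electrical readout.

```python
# Illustrative sketch: one stored polarization state read out two ways.
# All values are hypothetical; they are not taken from the published device.

# Ring resonator: resonance condition m * lambda_res = n_eff * L_ring
m = 58                      # resonance order (assumed)
L_ring = 31.4e-6            # ring circumference in metres (assumed, ~10 um radius)

def resonance_wavelength(n_eff):
    """Resonance wavelength of the ring for a given effective index."""
    return n_eff * L_ring / m

# Two polarization states of the ferroelectric layer shift the effective
# index and the capacitance by small, distinct amounts (assumed values).
states = {
    "P_down (bit 0)": {"n_eff": 2.8600, "capacitance_fF": 12.0},
    "P_up   (bit 1)": {"n_eff": 2.8608, "capacitance_fF": 15.5},
}

for name, s in states.items():
    lam = resonance_wavelength(s["n_eff"])
    print(f"{name}: resonance = {lam * 1e9:.3f} nm, C = {s['capacitance_fF']} fF")

# Optical readout: a probe laser parked near one resonance sees high or low
# transmission depending on where the resonance has shifted.
# Electrical readout: a small AC signal senses the capacitance directly,
# without disturbing the stored polarization.
```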
By varying the strength of the input signals, the team was also able to store multiple levels of data in a single cell — not just simple ones and zeroes, but multiple states such as 00, 01, 10 and 11. This means more storage in a smaller footprint and less overall energy use per bit. Further tests showed that the memory was stable, energy-efficient and reliable across many repeated uses. It also retained data for long periods and required only low voltages to operate.
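The multi-level storage can be sketched in the same spirit. In the hypothetical snippet below, the write amplitudes, thresholds and noise figure are made up for illustration: four partial-polarization levels map onto the four two-bit symbols, and reading back is simply a matter of thresholding the measured signal.

```python
# Hypothetical sketch of 2-bit-per-cell storage via partial polarization.
# Write amplitude selects one of four levels; readout thresholds the measured
# signal (optical transmission or capacitance) back into bits.

# Assumed mapping from write-pulse amplitude (V) to a 2-bit symbol
write_levels = {0.0: "00", 1.0: "01", 2.0: "10", 3.0: "11"}

def read_symbol(measured_level, thresholds=(0.5, 1.5, 2.5)):
    """Map an analogue readout back to a 2-bit symbol by thresholding."""
    if measured_level < thresholds[0]:
        return "00"
    elif measured_level < thresholds[1]:
        return "01"
    elif measured_level < thresholds[2]:
        return "10"
    return "11"

# A slightly noisy readout still lands in the correct bin
for written_v, symbol in write_levels.items():
    measured = written_v + 0.1          # small, assumed readout error
    print(f"wrote {symbol} -> read {read_symbol(measured)}")
```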
Towards unified computing systems
Acting as a bridge between photonic and electronic systems, the team’s invention could enable faster, more efficient platforms for data centres, optical interconnects and neuromorphic processors that mimic how the brain processes information.
In addition, because the design is CMOS-compatible, it can be integrated into existing chip manufacturing processes and scaled to dense memory arrays. The team is currently focused on enhancing switching speed, reducing power consumption and integrating additional components directly on-chip to improve overall system efficiency. In parallel, they are pushing toward a large-scale demonstration of photonic computing tightly integrated with electronic systems. This hybrid approach aims to leverage the scalability and versatility of heterogeneous integration, paving the way for high-performance, energy-efficient computing platforms suitable for next-generation data processing and AI workloads.
Capacitive memory gets a charge
While ferroelectric materials have opened new vistas in photonic-electronic memory, their potential stretches far beyond. In a recent review, Assoc Prof Gong takes a bird’s-eye view of how ferroelectric capacitive memories (FCMs) could shape the future of data storage and energy-efficient computing.
Published in Nano Convergence on 22 January 2025, the review explores how these emerging memory devices store data not by switching resistive states, but by altering capacitance — a charge-based method that enables non-destructive readout, ultra-low power consumption and better scalability in chip arrays.
Unlike conventional resistive memories, FCMs offer unique advantages in large-scale systems: they sidestep common pitfalls like voltage loss and “sneak path” errors, which often plague dense memory networks. The review also highlights how FCMs could accelerate computing-in-memory for machine learning — performing computations within memory cells to reduce data transfer overhead.
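As a rough illustration of the computing-in-memory idea the review points to, a column of capacitive cells can perform a multiply-accumulate: each cell's stored capacitance acts as a weight, the applied voltages act as inputs, and the charge collected on a shared line is the weighted sum. This is a conceptual sketch with made-up numbers, not the scheme described in the paper.

```python
# Conceptual sketch: multiply-accumulate inside a column of capacitive cells.
# Q_total = sum(C_i * V_i): stored capacitances act as weights and the
# applied voltages act as inputs. All values are illustrative assumptions.

capacitances_fF = [10.0, 14.0, 12.0, 16.0]   # stored "weights" (assumed)
input_voltages_V = [0.2, 0.0, 0.3, 0.1]      # applied "inputs" (assumed)

# Charge contributed by each cell, accumulated on the shared read line
total_charge_fC = sum(c * v for c, v in zip(capacitances_fF, input_voltages_V))
print(f"accumulated charge = {total_charge_fC:.2f} fC")

# Sensing charge on a capacitor, rather than driving current through a
# resistive state, is what allows a non-destructive, low-power readout.
```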