A chip off the old block

A single transistor can mimic the neural and synaptic behaviours of the human brain, bringing biologically inspired computing closer to reality.
The human brain is a remarkable computing machine. IBM’s Summit, once the world’s fastest supercomputer, can perform 200 quadrillion (that’s 15 zeros) calculations per second while drawing around 15 megawatts — enough to power thousands of homes. By comparison, the human brain, weighing just over a kilogram, achieves more than five times the computational capacity, all while consuming less energy than a household lightbulb.

Issue 05 | May 2025
Mind-blowing stuff. It’s little wonder, then, that scientists are so keen to replicate the performance of the human brain.
Researchers led by Associate Professor Mario Lanza from the Department of Materials Science and Engineering, College of Design and Engineering (CDE), National University of Singapore (NUS), have demonstrated that a single, standard silicon transistor, the fundamental building block of microchips used in computers, smartphones and almost every electronic system, can function like a biological neuron and synapse when operated in a specific, unconventional way.

The research team’s work presents a highly scalable and energy-efficient solution for hardware-based artificial neural networks (ANNs). This brings neuromorphic computing — where chips could process information more efficiently, much like the human brain — closer to reality. Their study was published in the journal Nature on 26 March 2025.
Putting the brains in silicon
The world’s most sophisticated computers already exist inside our heads. Studies show that the human brain is, by and large, more energy-efficient than electronic processors, thanks to its almost 90 billion neurons, which form some 100 trillion connections with each other, and to synapses that tune their strength over time — a process known as synaptic plasticity, which underpins learning and memory.
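The neuron-and-synapse behaviour described above can be captured in a minimal leaky integrate-and-fire model. This is a generic textbook sketch, not code from the study; the weight, leak and threshold constants are arbitrary assumptions chosen for readability.

```python
# Toy sketch: a leaky integrate-and-fire neuron with a simple
# Hebbian-style weight update (crude synaptic plasticity).
# All constants are illustrative, not taken from the research.

def simulate(inputs, weight=0.5, v_rest=0.0, threshold=1.0,
             leak=0.9, learn_rate=0.05):
    """Integrate weighted input spikes; fire and reset at threshold.
    When the neuron fires together with an active input, the
    synaptic weight is strengthened."""
    v = v_rest
    spikes = []
    for x in inputs:                 # x = 1 if presynaptic spike, else 0
        v = leak * v + weight * x    # leaky integration
        if v >= threshold:           # neuron fires
            spikes.append(1)
            v = v_rest               # reset membrane potential
            if x:                    # "fire together, wire together"
                weight += learn_rate
        else:
            spikes.append(0)
    return spikes, weight

spikes, w = simulate([1, 1, 1, 0, 1, 1, 1, 1])
print(sum(spikes), round(w, 2))   # prints: 3 0.65
```

Repeated coincident activity strengthens the connection, so the same input train elicits spikes more readily over time — the essence of the plasticity the article describes.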
For decades, scientists have sought to replicate this efficiency using ANNs, which are loosely inspired by how the brain processes information and have recently driven remarkable advances in artificial intelligence (AI). But while they borrow biological terminology, the similarities run only skin deep — software-based ANNs, such as those powering large language models like ChatGPT, have a voracious appetite for computational resources and, hence, electricity. This makes them impractical for many applications.
Forging New Frontiers
Neuromorphic computing aims to mimic the computing power and energy efficiency of the brain. This requires not only re-designing system architecture to carry out memory and computation at the same place — called in-memory computing — but also the development of electronic devices that exploit physical and electronic phenomena capable of replicating how neurons and synapses work. However, current neuromorphic computing systems are stymied by the need for complicated multi-transistor circuits or emerging materials that are yet to be validated for large-scale manufacturing.
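The in-memory computing idea can be sketched in a few lines: if weights are stored as conductances in a resistive crossbar, applying input voltages produces output currents that equal a matrix-vector product — computed where the data lives, with no shuttling between memory and processor. The numbers below are illustrative assumptions, not values from the paper.

```python
# Illustrative in-memory computing sketch: in a resistive crossbar,
# weights are stored as conductances G[i][j]. Applying input voltages V
# yields column currents I_j = sum_i V_i * G[i][j] by Ohm's and
# Kirchhoff's laws, so the multiply happens inside the memory array.

def crossbar_mvm(G, V):
    """Current on each column line of the crossbar."""
    rows, cols = len(G), len(G[0])
    return [sum(V[i] * G[i][j] for i in range(rows)) for j in range(cols)]

G = [[0.1, 0.2],    # conductances (siemens, illustrative)
     [0.3, 0.4]]
V = [1.0, 0.5]      # input voltages
print(crossbar_mvm(G, V))   # column currents: about [0.25, 0.4]
```

In a physical array this multiply-accumulate happens in a single analogue step, which is where the energy and latency savings over a conventional fetch-compute-store loop come from.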
“To enable true neuromorphic computing, where microchips behave like biological neurons and synapses, we need hardware that is both scalable and energy-efficient,” said Assoc Prof Lanza.
The researchers have now demonstrated that a single, standard silicon transistor, when arranged and operated in a specific way, can replicate both neural firing and synaptic weight changes — the core mechanisms of biological neurons and synapses. This was achieved by adjusting the resistance at the transistor’s bulk terminal, which allowed the team to control two key physical phenomena within the device: impact ionisation, which generates a current spike akin to the activation of an electronic neuron, and charge trapping, which causes the resistance state to persist over time, mimicking the long-term behaviour of a synapse. Building on this, the team designed a two-transistor cell, called Neuro-Synaptic Random Access Memory (NS-RAM), that can switch between neuron and synapse modes dynamically.
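As a rough mental model of the two operating modes — a deliberate simplification for intuition, not the NS-RAM circuit itself — the cell can be pictured as follows: in neuron mode, current builds until an impact-ionisation-like avalanche fires a spike; in synapse mode, each programming pulse traps charge, so the shifted resistance persists between operations. All numbers are made up.

```python
# Highly simplified toy model (an assumption, not the paper's design) of
# a cell that switches between "neuron" and "synapse" modes.

class ToyCell:
    def __init__(self):
        self.charge = 0.0        # trapped charge (persistent synaptic state)
        self.current = 0.0       # internal current (transient neuron state)

    def neuron_step(self, drive, threshold=1.0, gain=0.4):
        """Accumulate current; emit a spike when it passes the threshold,
        loosely mimicking an impact-ionisation avalanche."""
        self.current += gain * drive
        if self.current >= threshold:
            self.current = 0.0   # spiking resets the cell
            return 1
        return 0

    def synapse_pulse(self, amount=0.1):
        """Trap charge; the resistance shift persists between calls."""
        self.charge += amount

    def conductance(self, g0=1.0):
        """Read-out: trapped charge lowers the effective conductance."""
        return g0 / (1.0 + self.charge)

cell = ToyCell()
spikes = [cell.neuron_step(1.0) for _ in range(5)]   # neuron mode
for _ in range(3):
    cell.synapse_pulse()                             # synapse mode
print(spikes, round(cell.conductance(), 3))
```

The key point the sketch illustrates is that one element supplies both behaviours: a fast, self-resetting spike and a slow, non-volatile state change, selected by how the cell is driven.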
“Other approaches require complex transistor arrays or novel materials with uncertain manufacturability, but our method makes use of commercial CMOS (complementary metal-oxide-semiconductor) technology, the same platform found in modern computer processors and memory microchips,” explained Assoc Prof Lanza. “This means it’s scalable, reliable and compatible with existing semiconductor fabrication processes.”
Through experiments, the NS-RAM cell demonstrated low power consumption, maintained stable performance over many cycles of operation and exhibited consistent, predictable behaviour across different devices — all desirable attributes for building reliable ANN hardware suited to real-world applications. The team’s breakthrough marks a step change in the development of compact, power-efficient AI processors that could enable faster, more responsive computing.
“The transistors we used are not cutting-edge, but rather traditional 180-nanometre node transistors, which can be produced by Singapore-based companies,” added Assoc Prof Lanza. “Now that we’ve understood the operating mechanism, it’s more a matter of microelectronic design.”
Memristors for fast and energy-efficient AI
In another paper, also published in Nature on 16 April 2025, Assoc Prof Lanza and his team dissected how the memristor industry is advancing and how it will affect our lives.
Memristive circuits offer a major advantage for AI hardware as they enable data to be stored and processed simultaneously. This reduces energy consumption and latency, making them more efficient than conventional architectures that separate memory and processing units. From the current state of transistor-based memory, to the rise of memristors, as well as their scalability challenges and the central role of collaboration between academia, startups and chip manufacturers, the researchers offered expert analysis and forward-looking insights into this emerging technology in their comprehensive review.
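The store-and-process-in-one-place property can be illustrated with a common behavioural abstraction of a memristor — a generic textbook model, not a specific device from the review. The conductance acts as a stored analogue weight, nudged up or down by programming pulses and read in place; the step size and bounds below are illustrative assumptions.

```python
# Sketch of a simple behavioural memristor model: conductance is an
# analogue weight, moved by +1/-1 voltage pulses within physical bounds
# and read in place via Ohm's law. Constants are illustrative only.

def program(g, pulses, g_min=0.0, g_max=1.0, step=0.05):
    """Apply a train of +1 (potentiate) / -1 (depress) pulses;
    conductance stays clipped to [g_min, g_max]."""
    for p in pulses:
        g = min(g_max, max(g_min, g + step * p))
    return g

def read(g, v_read=0.2):
    """In-place 'multiply': the read current is v_read * g."""
    return v_read * g

g = program(0.5, [+1, +1, +1, -1])   # potentiate thrice, depress once
print(round(g, 2), round(read(g), 3))
```

Because the weight never leaves the device — programming and reading both act on the same conductance — there is no memory-to-processor data movement, which is the efficiency argument the review develops.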
“The development of memristive materials and devices has been extensively studied for more than 15 years, and now it is time to start to focus on hybrid memristor-transistor implementations that result in realistic products,” said Assoc Prof Lanza. “We are now working on a multidisciplinary project to bring 2D memristors into reality, joining efforts from materials science, physics, chemistry, electrical engineering and even AI.”