

Scientists create AI chip that computes at the speed of light

News

Mar 11, 2026, 13:30

Disruption snapshot


  • A new AI chip uses light to run neural networks. Photonic processors could deliver much faster speeds and far lower energy use than today’s electrical GPUs.


  • Winners: Data centers and AI companies benefiting from cheaper compute. Losers: Electricity-intensive GPU infrastructure if photonic chips reduce power needs dramatically.


  • Watch for hybrid chips combining electrical logic with photonic neural network layers. Early deployments in data centers would signal the technology moving from labs to industry.

AI chips using light may sound like science fiction.


But a new breakthrough suggests they could reshape the future of computing.

 

Researchers at the University of Sydney just unveiled a prototype AI chip that runs neural networks using light instead of electricity.


The chip was developed at the university’s Sydney Nano Hub and detailed in a new paper published in Nature Communications.

 

At first glance, it sounds like a niche academic experiment. It’s not. If this technology scales, it could change the economics of AI computing in a big way.

 

Today’s AI boom runs on electrical chips. GPUs and CPUs push electrons through billions of transistors to perform the math behind neural networks. That architecture built the modern AI industry.

 

This new chip takes a completely different path.


Instead of electrons moving through circuits, the processor sends photons through nanoscale photonic structures. As light moves through those structures, the physics of the system performs the calculations used in machine learning.

 

Each structure is only tens of micrometers wide, roughly the width of a human hair. Together they form what researchers call a photonic neural network. Because the system works with light, it processes information at picosecond speeds.

 

In testing, the chip analyzed more than 10,000 MRI scans and classified them with accuracy between 90% and 99%.

 

Those results are impressive. But the bigger story isn’t just speed or accuracy.

 

It’s energy.

 

AI training and inference consume enormous amounts of power. Data centers packed with GPUs already strain electricity grids around the world. As models grow larger, the cost of power becomes one of the biggest limits on how fast the industry can expand.

 

Photonic computing could change that math.

 

Light can carry information with far less energy loss than electrical signals moving through silicon. If photonic processors can run neural networks efficiently at scale, they could dramatically cut the power required for AI workloads.

 

That would reshape the economics of AI hardware. Lower energy costs mean cheaper AI compute. And cheaper compute means more companies can build and run advanced models.

 

For investors, this is the kind of early breakthrough that’s easy to ignore. It’s still a prototype coming out of an academic lab.

 

But many of the technologies that built today’s semiconductor industry started the same way.

 

The companies that figure out how to commercialize photonic AI chips could end up building the next generation of AI infrastructure. And that’s where the biggest stock opportunities often begin.

 

The disruption behind the news: Electricity has been the bottleneck of AI scaling.

 

Every new generation of AI models uses more power than the last.

 

Light changes that equation.

 

Today’s AI infrastructure runs on massive GPU clusters that consume huge amounts of electricity. A single high-end AI training cluster can draw tens of megawatts. Globally, data centers could consume more than 1,000 terawatt-hours of electricity each year within a decade if AI demand keeps growing.

 

Photonic computing attacks that problem at the physics level.

 

Electrons encounter resistance as they move through circuits. Resistance creates heat. Heat requires cooling. Cooling requires even more power. The entire AI compute stack is working against thermodynamics.

 

Photons don’t follow those rules.

 

Light can move through materials with very little resistance and almost no heat. When neural network math is built directly into photonic structures, the computation happens as light moves through the chip. The physics performs the calculation. No heavy electrical switching needed.
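As a rough illustration of “the physics performs the calculation”: a photonic mesh effectively applies a fixed linear transform to incoming light in a single pass, with no switching between steps. The sketch below simulates that behavior in plain Python. The function name, weights, and inputs are hypothetical illustrations, not the Sydney chip’s actual design.

```python
# Illustrative sketch (not the actual chip design): a photonic
# neural-network layer behaves like a fixed matrix applied to incoming
# light amplitudes -- the "computation" is the propagation itself.

def photonic_layer(weights, inputs):
    """Apply a fixed linear transform, as a photonic mesh would in one pass of light."""
    return [sum(w * x for w, x in zip(row, inputs)) for row in weights]

# A hypothetical 2x3 "mesh" with its weights baked into the structure,
# applied to a 3-channel input signal.
W = [[1.0, 0.0, 2.0],
     [0.0, 3.0, -1.0]]
print(photonic_layer(W, [1.0, 2.0, 3.0]))  # [7.0, 3.0]
```

The key difference from a GPU is that nothing here would be executed step by step: the “matrix” is the physical structure, and the answer emerges as fast as light crosses it.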

 

That changes the cost curve.

 

If photonic chips deliver even a 10x improvement in energy efficiency, the economics of AI infrastructure change fast. A hyperscale data center that would need $100M in power infrastructure could need far less. Training runs that cost millions in electricity could become dramatically cheaper.

 

But the hidden shift is where AI hardware spending goes next.

 

Today, about 30% to 40% of the operating cost of large AI clusters is electricity and cooling. Cut compute energy by 10x and the bottleneck moves. The main cost becomes data movement and memory access. Those already account for an estimated 70% to 90% of energy use inside modern AI chips.


In other words, photonic compute could make the math almost free while the real expense becomes moving weights and activations around the system. If that happens, the next race in AI hardware won’t just be photonic processors. It will also include optical interconnects and new memory architectures designed to keep up with them.
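The cost-shift argument above can be checked with back-of-envelope arithmetic. The sketch below takes the cited 70% to 90% range for data movement’s share of chip energy and asks what a 10x cut in compute energy does to the total. The function name and figures are illustrative assumptions, not measurements.

```python
# Back-of-envelope: if data movement already dominates chip energy,
# even a 10x cut in compute energy barely moves the total -- and makes
# data movement the overwhelming bottleneck.

def energy_after_photonic(move_share, compute_speedup=10):
    """Return (new total energy, data-movement share) relative to a baseline of 1.0."""
    compute_share = 1.0 - move_share
    new_total = move_share + compute_share / compute_speedup
    return new_total, move_share / new_total

for move_share in (0.70, 0.80, 0.90):
    total, share = energy_after_photonic(move_share)
    print(f"data movement {move_share:.0%} -> total energy {total:.0%}, "
          f"movement now {share:.0%} of budget")
# e.g. data movement 80% -> total energy 82%, movement now 98% of budget
```

That is the quiet implication: a 10x compute-energy win trims total chip energy by well under 30%, while pushing data movement past 95% of the remaining budget.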

 

Speed is the second disruption.

 

These photonic systems compute in picoseconds. That’s trillionths of a second. Even modest gains here could reshape real-time AI workloads such as autonomous systems, robotics, medical imaging, and edge AI devices.

 

Faster and more energy-efficient chips could also enable entirely new consumer devices where powerful AI runs locally instead of in distant data centers. That’s one reason many technologists believe wearable interfaces like AI-powered glasses could eventually replace smartphones, as explored in this article asking whether AI glasses can replace our phones.

 

Right now the AI hardware market is dominated by companies like Nvidia and Advanced Micro Devices building ever-larger electrical accelerators. Photonic chips represent a completely different architecture. If the technology scales, it won’t compete on incremental performance improvements; it will rewrite the entire hardware stack.

 

At the same time, researchers are experimenting with multiple radically different computing architectures, from photonics to neuromorphic processors to quantum systems, such as the recent breakthrough where scientists built a quantum brain on a chip for drug discovery.


In another wild experiment, scientists tested lab-grown brains as a potential AI chip replacement.

 

What to watch next

 

Prototype photonic chips look promising but scaling them is the hard part.

 

Manufacturing and integration will decide whether this becomes real infrastructure.

 

The next 24 months will show whether photonic AI remains a science project or becomes an industry.

 

Three signals matter.

 

First, integration with existing silicon manufacturing. If photonic neural networks can be built using standard semiconductor processes, adoption could move much faster. If not, they could remain research projects.

 

Second, system scale. This prototype handled 10,000 MRI images. Commercial AI systems run models with billions of parameters. The race is to build photonic networks large enough to support modern AI models.

 

Third, hybrid architectures. The most likely near term path isn’t pure photonic computing. We will see hybrid chips where electrical processors handle control logic and photonic layers perform the heavy neural network math.
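That division of labor can be sketched as a toy forward pass: a simulated photonic mesh performs each linear transform, while ordinary code supplies the control flow and the nonlinearity between layers. Everything below is a hypothetical illustration of the hybrid split, not a description of any announced chip.

```python
# Toy hybrid forward pass: "photonics" (simulated here) does the linear
# algebra, while electronics handle control logic and nonlinearities.

def photonic_matmul(weights, x):
    # Stand-in for a photonic mesh: one fixed linear transform per pass of light.
    return [sum(w * v for w, v in zip(row, x)) for row in weights]

def relu(x):
    # The nonlinearity stays in the electrical domain.
    return [max(0.0, v) for v in x]

def hybrid_forward(layers, x):
    for W in layers:
        x = relu(photonic_matmul(W, x))  # optics for math, electronics for control
    return x

# Two hypothetical layers: 2x2 then 1x2.
layers = [[[1.0, -1.0], [2.0, 0.5]], [[1.0, 1.0]]]
print(hybrid_forward(layers, [3.0, 1.0]))  # [8.5]
```

In a real hybrid chip, the interesting engineering problem is the boundary: converting between electrical and optical signals at each layer costs energy, so the fewer crossings per inference, the better the hybrid pays off.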

 

If that hybrid approach works, photonic accelerators could start showing up in data centers before the decade is over.

 

AI demand is rising faster than traditional chips can scale efficiently. AI models keep getting better.


At some point, something will have to replace the electron-heavy architecture that runs today’s models.


AI processors using light could just be it.
