
Lab-grown brains learn to play video game Doom
Disruption snapshot
Scientists showed living human neurons can function as computing hardware. A biological computer played Doom in a lab dish. This introduces biology as a potential alternative computing architecture.
Winners: bio-computing startups, AI infrastructure builders, synthetic biology tools. Losers: traditional GPU-only compute models and data-center operators facing rising power costs.
Watch neuron scaling milestones. If lab systems grow from ~1 million to ~10 million neurons and improve task performance, biological processors could start competing with silicon AI hardware.
What if the next AI chip weren't made from silicon, but from human brain cells?
Scientists in Australia just proved this is possible.
Biotech researchers at Cortical Labs used their CL1 biological computer platform to grow about 800,000 to 1 million human neurons in a lab dish, then had them play Doom.
Those neurons were connected to a computer interface that translated the Doom game environment into electrical signals the cells could understand. The neurons fired back their own signals, which the system converted into in-game actions like moving around or firing weapons.
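The closed-loop pattern described above can be sketched in a few lines: encode the game state into a stimulation site, read back spike counts, and map the most active readout region to an in-game action. Everything here, the electrode zones, channel counts, and spike data, is an illustrative assumption, not Cortical Labs' actual interface:

```python
# Hypothetical sketch of a stimulate-and-decode loop (assumed design,
# not the real CL1 interface).
ACTIONS = ["move_forward", "turn_left", "turn_right", "fire"]

def encode_state(enemy_bearing_deg):
    """Map an enemy's bearing to one of 8 stimulation zones (illustrative)."""
    return int(enemy_bearing_deg % 360 // 45)

def decode_spikes(spike_counts):
    """Pick the action whose readout region fired the most."""
    region_totals = [sum(spike_counts[i::4]) for i in range(4)]
    return ACTIONS[region_totals.index(max(region_totals))]

# One loop iteration with fabricated spike data:
site = encode_state(90)            # enemy at 90 degrees -> zone 2
spikes = [0, 1, 5, 0, 1, 2, 0, 3]  # 8 readout channels, made up
action = decode_spikes(spikes)
```

The interesting part is that the same loop runs in both directions continuously: the game state shapes the stimulation, and the cells' responses shape the game.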
This builds on Cortical Labs’ 2022 experiment where similar neuron clusters learned to play Pong.
The difference is that Pong is extremely basic. Doom is a big jump: a fast 3D environment that demands navigation, enemy tracking, and constant decision-making.
Right now the neurons still play like beginners. But the signals coming from the dish show something interesting. The network is adapting and getting better over time.
The setup blends living neurons with traditional hardware to create what the company calls a biological computer.
It's an early glimpse of what AI chips may look like in the future.
The disruption behind the news: Brains are already the most efficient AI chips.
These neurons aren’t simulated intelligence.
They are living cells functioning as computing elements.
And they’re already learning inside a machine.
For decades computing improved by shrinking transistors and packing more of them onto silicon chips. That approach is running into physical limits and rising energy costs. Training a large AI model today can consume millions of kilowatt-hours of electricity. A small cluster of neurons operates on roughly milliwatts.
That’s the economic shock inside this experiment.
A dish with fewer than 1 million neurons learning a complex task suggests an alternative computing path. The human brain runs on about 20 watts while delivering far greater efficiency than today’s AI systems. Even a basic biological computing unit could eventually provide dramatically better energy efficiency than GPUs.
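To put that 20-watt figure in perspective, here is a rough back-of-envelope comparison. The GPU wattage and electricity price are assumptions chosen for illustration, not figures from the experiment:

```python
# Back-of-envelope energy comparison (all inputs are assumptions).
GPU_WATTS = 700        # approximate board power of an H100-class GPU
BIO_WATTS = 20         # the human brain's rough power draw, per the article
PRICE_PER_KWH = 0.10   # assumed electricity price, USD

hours_per_year = 24 * 365
gpu_kwh = GPU_WATTS * hours_per_year / 1000   # ~6,132 kWh per year
bio_kwh = BIO_WATTS * hours_per_year / 1000   # ~175 kWh per year

gpu_cost = gpu_kwh * PRICE_PER_KWH
bio_cost = bio_kwh * PRICE_PER_KWH
ratio = GPU_WATTS / BIO_WATTS                 # 35x gap under these assumptions
```

Under these assumed numbers, a single GPU draws 35 times the brain's power budget, and data centers run thousands of them.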
Now imagine data centers filled with hybrid bio-silicon systems.
The adoption path is easy to see. Demand for AI is exploding while computing supply is limited by power availability, chip manufacturing capacity, and hardware costs. If a biological system can perform learning tasks at even 10% of the energy cost of GPUs, the economics of AI infrastructure change.
There’s another advantage. Neurons naturally learn through feedback. You don’t need trillion-parameter models or massive datasets to train them. Instead, they adapt continuously through signals and rewards. That could create systems that improve while running, rather than requiring months of training runs.
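That feedback-driven style of learning can be caricatured in silico with a tiny bandit loop: no dataset at all, just actions, rewards, and running estimates. The payoff probabilities and exploration rate are arbitrary assumptions for illustration:

```python
import random

random.seed(42)

# Two-armed bandit learned purely from reward feedback, no training data.
# Arm 1 pays off 80% of the time, arm 0 only 20% (assumed values).
payoff = [0.2, 0.8]
value = [0.0, 0.0]   # running reward estimates per arm
counts = [0, 0]

for _ in range(2000):
    # Epsilon-greedy: mostly exploit the best estimate, sometimes explore.
    if random.random() < 0.1:
        arm = random.randrange(2)
    else:
        arm = value.index(max(value))
    reward = 1.0 if random.random() < payoff[arm] else 0.0
    counts[arm] += 1
    value[arm] += (reward - value[arm]) / counts[arm]  # incremental average
```

The system improves while it runs, which is the property the article is pointing at: learning as a continuous side effect of operation rather than a separate, expensive training phase.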
Right now Cortical Labs is working with around 200,000 neurons during gameplay experiments. That number sounds small until you remember the human brain contains about 86 billion neurons. Scaling even a small fraction of that inside controlled biological hardware would create computing systems very different from today’s machines.
And unlike traditional chips, biological systems can self-organize.
A less obvious economic implication appears earlier in the AI pipeline. Training data is one of the largest hidden costs in modern AI. Many frontier models rely on datasets containing hundreds of billions or even trillions of tokens. Those datasets must be collected, cleaned, labeled, and sometimes licensed.
If a biological system can learn mainly through reward signals, similar to how the Doom neurons improve through feedback loops, it could reduce the need for massive datasets. In practice that means simulated environments and reinforcement signals might replace large portions of curated training data.
Even cutting training data needs by 90% would remove one of the most expensive constraints in AI development. That would shift the competitive advantage away from companies that control the most data and toward those that build the best bio-compute interfaces.
What to watch next
Today, it’s Doom running on a dish of neurons.
Tomorrow, systems like it could be learning practical tasks.
That shift may happen faster than many people expect.
First, watch scaling. If neuron clusters grow from 1 million to 10 million cells, capabilities could increase rapidly. Biological networks become much more powerful as connections multiply between neurons.
Second, watch remote access. Cortical Labs already allows developers to interact with these neuron systems through cloud interfaces. That means biological computing could enter developer ecosystems sooner than most people assume.
Third, watch cost curves. A high-end GPU like Nvidia’s H100 sells for about $30,000 and consumes large amounts of electricity. If a biological unit eventually delivers comparable learning capacity at even one-tenth the energy cost, data center economics could shift quickly.
Regulation will likely struggle to keep up. These systems run on living human neurons, and if companies start scaling them commercially, you can expect ethical debates to ramp up fast.
But the direction is pretty clear. What we’re seeing is an early step toward machines that mix silicon with human biology.
Neurons playing Doom may look like a novelty at first glance, but it signals something larger. If biology becomes part of the computing stack, the way high-end computers are built and the economics behind them could change in a big way.