Why Is OpenAI Spending $51 Million on a Chip Startup?
OpenAI is investing $51 million in a startup building neuromorphic chips that mimic the human brain, a bet that could advance the capabilities of AI and potentially change the way we use AI completely.

Neil Regole
Updated April 30, 2024

A green eye (symbolizing OpenAI) visualizing its own universe, generated with Midjourney
Reading Time: 5 minutes
Artificial intelligence capabilities are advancing at a breakneck pace thanks to innovations by leading AI labs like OpenAI. Now, OpenAI plans to catapult its AI developments even further into the future with a $51 million investment in Rain AI, a startup focused on groundbreaking neuromorphic chip technology.
With customized hardware tailored to mimic neural pathways in the human brain, OpenAI could soon smash through existing limitations on how intelligent and responsive AI systems can be. If the collaboration between these two trailblazing companies pans out as planned, new frontiers of artificial intelligence could rapidly unfold, from how seamlessly AI interfaces with online platforms to unlocking general intelligence exceeding human cognition.
Why Is OpenAI Spending $51 Million?
OpenAI signed an agreement in 2019 to potentially purchase $51 million worth of neuromorphic chips from Rain AI, and documents from Rain AI state that Sam Altman has personally invested more than $1 million in the company. The documents obtained by Wired also cite OpenAI's letter of intent as a factor that helped Rain AI secure funding. A Saudi Arabia-affiliated fund led a $25 million fundraise announced by Rain in early 2022, but the Biden administration later forced it to sell its stake.
OpenAI is also said to be launching its own artificial intelligence chip venture, driven by the high prices and scarcity of GPU chips from Nvidia that are currently limiting AI development. There's clearly a market for more AI chips: Nvidia makes billions of dollars selling AI GPUs yet can't keep up with demand, and its high prices and long waitlists are hindering the industry.
To give you an idea, Nvidia reported $14.5 billion in data center revenue (mostly from AI chips) in a single quarter last year, a 206% increase year over year. Massive companies such as Microsoft (which uses OpenAI systems) and Meta purchased over 150,000 GPUs from Nvidia. But to clearly understand why OpenAI was willing to invest so much in Rain AI, we first need to look at what Rain AI is and what it's capable of doing.
What Is Rain AI?
Rain AI is a San Francisco-based startup located less than a mile from OpenAI's headquarters. It is working on neuromorphic processors (NPUs), computer chips designed to mimic aspects of the neural architecture of the human brain. Compared to traditional graphics processing units (GPUs), these NPUs aim to provide more computing power and better energy efficiency for AI applications.
Rain claims its NPUs could deliver 100 times more computing power while using 10,000 times less energy than GPUs during AI model training. By mimicking key characteristics of biological neural networks, Rain's neuromorphic processors aim to bring AI capabilities more in line with human cognition in terms of efficiency and performance. This brain-inspired approach is what separates Rain from traditional AI hardware makers in the race to build better AI.
How Do Neuromorphic Processors Work?
Neuromorphic processors (NPUs) are computer chips that mimic aspects of biological neural networks like those found in the human brain. They work differently than traditional computer processors; to give you a better perspective, here is how they work:
- Input data comes into the NPU as electrical spikes, similar to signals between neurons. This input reaches artificial synapses, basic computational units that connect the input and output layers.
- Each synapse has a weight that determines the strength of the connection from inputs to outputs. These synaptic weights are adjusted during training as the NPU learns to perform tasks.
- When enough weighted input spikes accumulate, the receiving neuron is activated enough to fire its own output spike, propagating the signal through the network.
- Learning occurs through synaptic plasticity - the adaptation of synaptic weights across neuronal connections. This process allows the NPU to learn and maintain memory like a brain.
- Unlike traditional processors that execute instructions sequentially, NPUs perform parallel processing across their neural network architecture, enabling faster processing for AI applications.
The end result behaves like an artificial neural network, capable of learning tasks through examples and environmental interaction, while the brain-inspired design enables efficient computation using far less energy than traditional hardware.
How Rain Chips Could Change AI in the Future
Rain's first chips rely on the RISC-V architecture, essentially an open-source chip blueprint backed by big names like Google and Qualcomm. Rain's hardware is geared toward edge devices such as smartphones, drones, and even cars: electronic devices distributed throughout the world rather than housed in massive data centers.
The key point is that Rain wants to create chips capable of both training AI models and running them after deployment. Most edge chips today handle only inference, so training-capable hardware would be useful if OpenAI wanted to push AI intelligence to devices rather than the cloud.
It remains to be seen exactly how OpenAI will use Rain's technology. Still, having chips perform end-to-end AI, from training to inference, directly on devices would be significant. It could put advanced intelligence into things like robots and self-driving cars and potentially change the way we use AI completely.
Want to Learn Even More?
If you enjoyed this article, subscribe to our free newsletter where we share tips & tricks on how to use tech & AI to grow and optimize your business, career, and life.