The Future Is Analog: New Twists on an Old Technology Might Unlock the Future of AI
By Alec Sorensen




Advances in AI have led to increasingly complex large language models (LLMs) capable of sophisticated tasks. However, these models demand enormous computing power, posing challenges for AI scalability and raising environmental concerns.


Analog computing, using continuous signals instead of discrete binary data for calculations, dates back to 200 BC (check out the Antikythera Mechanism). Modern analog computers can enable faster processing and reduced power consumption, making them attractive for energy-hungry LLMs. By minimizing data movement and leveraging analog circuit parallelism, analog computing could improve AI efficiency.
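To make the data-movement point concrete, here is a minimal NumPy sketch (purely illustrative; the matrix sizes and conductance scaling are assumptions, not a real device model) of how a resistive crossbar performs a matrix-vector multiply in place: weights sit as conductances, inputs arrive as voltages, and Ohm's and Kirchhoff's laws do the multiply-accumulate directly in the wires, with no weight movement.

```python
import numpy as np

# Minimal sketch: analog in-memory matrix-vector multiply on a resistive crossbar.
# Weights are stored as conductances G (siemens); the input vector is applied as
# voltages V. Ohm's law gives branch currents G_ij * V_i, and Kirchhoff's current
# law sums them along each column, so the column currents are I = G.T @ V --
# the multiply-accumulate happens in place, without moving the weights.

rng = np.random.default_rng(0)

weights = rng.normal(size=(4, 3))          # logical weight matrix (4 inputs x 3 outputs)
conductances = np.abs(weights) * 1e-4      # map |weights| to conductances (illustrative scaling)
signs = np.sign(weights)                   # real devices use paired columns for +/- weights

voltages = rng.uniform(0.0, 0.2, size=4)   # input activations encoded as read voltages

# Column currents: each output is a sum of G * V products, computed "in the wires".
currents = (signs * conductances).T @ voltages

# Reference digital result, up to the conductance scaling factor.
digital = weights.T @ voltages * 1e-4
print(np.allclose(currents, digital))      # True: same MAC, different physics
```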


Why Now?

The rapid growth of LLMs like GPT has accelerated the need to address scalability and power consumption. At the same time, advances in analog computing are opening the door to new AI applications.


Historically, analog computing faced reliability and accuracy issues due to variations in material quality, temperature, voltage, and other factors. However, advances in materials science, fabrication, and error reduction may solve these problems. Moreover, many AI applications have higher error tolerance; for example, an image classification tool can still be useful at 95% accuracy, while traditional computing tasks require 99.9%+ accuracy.
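As a rough illustration of that error tolerance, the toy example below (the synthetic data, classifier, and noise levels are assumptions for illustration, not measurements) perturbs the weights of a simple linear classifier with Gaussian "device noise" and shows that accuracy degrades gracefully rather than collapsing.

```python
import numpy as np

# Minimal sketch of why error tolerance matters: perturb the weights of a toy
# linear classifier with analog-style noise and watch accuracy degrade gracefully.

rng = np.random.default_rng(1)

# Two Gaussian blobs, roughly linearly separable.
X = np.vstack([rng.normal(-1, 1, (500, 2)), rng.normal(1, 1, (500, 2))])
y = np.array([0] * 500 + [1] * 500)

# "Train" a linear decision rule from the class means (enough for illustration).
w = X[y == 1].mean(axis=0) - X[y == 0].mean(axis=0)
b = -w @ (X[y == 1].mean(axis=0) + X[y == 0].mean(axis=0)) / 2

def accuracy(w_noisy, b_noisy):
    preds = (X @ w_noisy + b_noisy > 0).astype(int)
    return (preds == y).mean()

for sigma in [0.0, 0.05, 0.1, 0.2]:
    # Add weight noise proportional to the mean weight magnitude.
    noisy_w = w + rng.normal(0, sigma * np.abs(w).mean(), size=w.shape)
    print(f"weight noise {sigma:.2f}: accuracy {accuracy(noisy_w, b):.3f}")
```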


Why You Should Care

LLMs could disrupt almost every industry, but only if their scalability and power problems are solved. If analog computing is part of the answer, the consequences would be significant for the $280B data center market, not to mention the $500B semiconductor market.

Have a technology angle for us, or just want to talk about commercialization? Let us know at info@tradespace.io

 

Technologies Shaping Analog Computing

Patent Data from Tradespace IP Platform
  1. Memristors: Short for "memory resistor," memristors can retain their resistance state even when the power is turned off. They are being used to mimic the behavior of brain synapses, enabling energy-efficient neuromorphic computing systems that learn and adapt over time (see the sketch after this list).

  2. Optical Computing: Leverages light instead of electricity for data processing. Offers high-speed parallelism and low power consumption, making it an attractive option for implementing large-scale neural networks and other complex AI models.

  3. Biological Neurons: Researchers are exploring ways to create artificial neurons or harness living neurons to build biologically-inspired computing systems that can efficiently process information and learn in a manner similar to the human brain.
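Below is a minimal, hypothetical memristor-synapse model (the class name, conductance bounds, and update step are illustrative assumptions, not a specific device) showing the two properties that make memristors attractive as analog synaptic weights: the state changes in response to programming pulses and persists when no pulse is applied.

```python
# Minimal sketch of a memristor-like synapse (illustrative model, not a real device):
# conductance moves between g_min and g_max in response to voltage pulses and is
# retained between reads -- the nonvolatile behavior that lets it act as a weight.

class MemristorSynapse:
    def __init__(self, g_min=1e-6, g_max=1e-4, g_init=5e-5, step=5e-6):
        self.g_min, self.g_max, self.step = g_min, g_max, step
        self.g = g_init  # conductance state, i.e. the stored "weight"

    def pulse(self, polarity):
        """Potentiate (+1) or depress (-1) the conductance; clamp to device limits."""
        self.g = min(self.g_max, max(self.g_min, self.g + polarity * self.step))

    def read(self, voltage=0.1):
        """Low-voltage read: current = G * V, without disturbing the stored state."""
        return self.g * voltage


syn = MemristorSynapse()
for _ in range(5):
    syn.pulse(+1)          # "learning": repeated potentiation pulses
print(syn.read())          # state persists until the next write pulse
```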

 

Analog Startups: Going Beyond AI Chips


Mythic AI, probably the best-known analog chipmaker, raised over $170M from investors like BlackRock and Hewlett Packard Enterprise but ran out of funding in late 2022. What does Mythic's collapse imply about commercial viability?


Poor Timing: Mythic's main issue was timing. Between 2012 and 2022, traditional chip companies continued to benefit from Moore's law, which predicts that the number of transistors on an IC (and hence its processing power) doubles roughly every two years. Today, transistor sizes are nearing the atomic scale, limiting chipmakers' ability to boost processing power by shrinking transistors.


Mythic's Revival: Meanwhile, the exponential growth in LLM complexity has forced chipmakers to address requirements that were once only theoretical. This nascent demand fueled Mythic's re-emergence from the ashes with a $7M investment from Lux Capital last month.


New Entrants: Beyond Mythic, a vibrant startup ecosystem has emerged around analog computing, with many companies eschewing the crowded AI chip market in favor of approaches that could reshape the broader computing market.

 

Large Chip-Makers Join the Party


Big Bets on Resistive RAM: Nearly 80% of the IP developed by large chipmakers and fabs (e.g., Intel, IBM, TSMC, Samsung) is focused on ReRAM, which uses memristors to store analog deep neural network (DNN) synaptic weights.


Notably missing from the top IP generators is NVIDIA, the current leader in AI data center technologies (primarily GPUs). While NVIDIA does have some interesting internal R&D (IRAD) in biological computing, it has generated 90% fewer analog computing patents than its peers.


China leads the way in emerging approaches: While biological and optical computing approaches to AI are at lower technology readiness levels (TRLs) than memristors, Chinese companies and affiliated research centers have driven significant IP production. Notably, State Grid Corporation and Beijing Lynxi Tech lead biological compute IP generation, while Huawei leads the way in optical computing for AI.

 

Analog Computing IP Commercialization Opportunities


We triaged thousands of analog computing patents to highlight three portfolios of available IP from leading US research institutions with unique implications for AI. Click on an opportunity to see more detail on our IP Marketplace.



 

Was this analysis helpful? Sign Up Here to get an Emerging IP Spotlight on a different technology every other week.
