🔮 E09: Analog Computing: The Once and Future King
Underrated: Mixed-signal analog-digital ICs capture 50% of the $60bn edge AI hardware market.
Photo: The Antikythera Mechanism, the oldest analog computer
It’s Sunday, yeah what of it? I’m out here hustling, what are you doing? E05 gave you neuromorphic computing, and E08 brought you optical computing. I’m here today for analog computing. You will be getting the full gamut of exotic computing over the next few months, from your quantums and your mechanicals to your moleculars, magnetics and acoustics. Oh yeah, computing with acoustic waves. Strap in.
But honestly, turning our world into zeros and ones was a mistake. Our world is continuous and messy, right? The clarity of binary is all nice and convenient, but it’s cargo cult math, isn’t it? The world can’t be turned into ones and zeros and then modelled effectively, can it? What if we used a new type of computer? One that was designed to compute the world as it is?
You've just imagined 'Upstairs Downstairs', a new quiz show devised and hosted by David Brent.
Anyway, that's enough fun for you. Now you learn.
🗿 Analog Computing
TLDR:
Summary: Analog computing is a computing paradigm where information is represented and processed directly as continuous analog signals, rather than being encoded digitally. Analog computers utilize analog electronic circuits to perform operations on the amplitude, frequency, or phase of analog signals to accomplish useful computation.
Viability (4): Already a mature $90bn market. R&D is focused on making analog ICs more precise, reliable, and programmable, which would make them more useful for broader applications at the edge like AI, robotics and biosensing.
Drivers (4): Supply-side, analog ICs have got materially cheaper and more useful through better fabrication and interfacing technologies, especially mixed-signal ICs. And ReRAM has made analog storage viable. Demand-side, AI is basically matrix multiplication, which analog ICs can do very efficiently.
Novelty (5): Analog computing has similar efficiency, parallelism, and real-time processing benefits as photonics. But it is superior when implementing non-linear effects like multiplication or frequency mixing. Likely the only candidate for edge applications with non-linear effects where input signals are continuous.
Diffusion (4): Analog ICs are expensive ASICs, and have widespread use for specific applications across many industries. But broader adoption is limited due to lack of precision and programmability. Deep learning may very well be the next major application, but the issue of precision still needs to be solved.
Impact (4): The most likely impact is that mixed-signal ICs combine the benefits of analog and digital circuits on a single chip, serving the growing edge AI and wearable markets.
Timing (Now: 2020-2025): Invest now because the edge AI opportunity went through a catalytic moment in 2022 with LLMs.
Underrated. Depends on whether mixed-signal ICs or analog ICs sufficiently solve the precision challenge and serve AI inference applications. If they can, the market size would be on the mid-to-upper end of the $500 billion to $5 trillion range.
2030 Prediction: Mixed-signal analog-digital ICs capture 50% of the $60bn edge AI hardware market.
Summary
A useful starting point is at the top.
Electronic computing harnesses the movement of electrons through a diverse array of technologies to represent and process information. It uses electronic components like transistors and capacitors to manipulate and process information, and it encompasses both the digital and analog computing paradigms. Alternative computing approaches include optical, quantum, mechanical, molecular, magnetic, and acoustic.
The most common and general type of electronic computing is digital computing which encodes information into binary digits, leveraging the on/off states of transistors in logic gates to create flexible, reprogrammable, and noise-immune systems.
Analog computing is a less common and more specific type of computing that represents information as continuously variable voltages instead of 1s and 0s, relying on electronic analog circuits to manipulate signals directly for efficient but less precise computation.
Analog computing is maybe the oldest emerging technology. Unlike quantum, optical or molecular, analog computing already won. Then lost. The earliest analog computers were special-purpose machines, for example, the tide predictor developed in 1873 by Lord Kelvin. One hundred and fifty years later, digital computing has eclipsed analog computing. Analog machines were displaced by digital electronic computers from around the 1940s, mainly driven by the need for high-speed calculations for ballistics and code-breaking during World War II. Digital computers offered greater precision, accuracy and programmability compared to analog systems.
Digital computers went on to take over the world. Although slightly less well-known (by me, at least), analog computers carved out a $90 billion niche for signal processing, communications, and power management applications. Analog ICs are one of the three electronic semiconductor segments, alongside logic (CPU, GPU, etc.) and memory (NAND, DRAM, etc.). Logic is the largest segment, with $180 billion in 2022 sales, then memory with $130 billion, and analog ICs with $90 billion. Growing at roughly seven per cent annually, the analog market will be worth $100 billion by 2027.
So shall we all pack up and go home here? Analog computing is already a large and growing market. Take my money, right? Well yeah, but what if… and stay with me here. What if analog computers get more precise, accurate and programmable? As previously explored with neuromorphic and optical computing here and here, we should think about analog computing as one of many alternative computers in a new heterogeneous computing era.
TLDR: Analog computing is a computing paradigm where information is represented and processed directly as continuous analog signals, rather than being encoded digitally. Analog computers utilize analog electronic circuits to perform operations on the amplitude, frequency, or phase of analog signals to accomplish useful computation.
Viability (4)
Analog computing is a significant $90 billion mature market. There are two types of analog ICs: general-purpose ICs and application-specific analog ICs. General-purpose ICs have four segments: amplifiers and comparators, interface (mainly Analog-to-Digital Converters (ADCs) and Digital-to-Analog Converters (DACs)), power management, and signal conversion. Application-specific analog ICs include customized chips for end-user applications in defence, automotive and other industries. Major suppliers include large multinational companies that we all know and love like Texas Instruments, Analog Devices, Infineon and STMicroelectronics.
R&D continues to improve power efficiency, advance sensor interface capabilities, and integrate analog and digital circuitry on a single chip. However, improvements in high-speed data conversion and in precision and accuracy have the potential to materially grow the analog IC market. First, high-speed data conversion in ADCs and DACs would make hybrid analog/digital chips more compelling for applications beyond wireless communications, audio processing, and instrumentation, opening up new applications like mobile robotics and biosensing. ADC sampling rates now reach several gigasamples per second (GSPS) with resolutions ranging from 12 to 24 bits and beyond, offering greater dynamic range and finer signal detail. Second, on the precision side, progress is coming from circuit and layout optimization, noise and distortion reduction, and new materials like gallium nitride (GaN).
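As a rough sense-check on what those resolutions buy you: the ideal signal-to-noise ratio of an N-bit converter is about 6.02N + 1.76 dB, so every extra bit adds roughly 6 dB of dynamic range. A quick sketch:

```python
def ideal_adc_snr_db(bits: int) -> float:
    """Ideal SNR of an N-bit converter, limited only by quantization noise."""
    return 6.02 * bits + 1.76

for bits in (12, 16, 24):
    print(f"{bits}-bit: ~{ideal_adc_snr_db(bits):.0f} dB of dynamic range")
# 12-bit: ~74 dB, 16-bit: ~98 dB, 24-bit: ~146 dB
```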
TLDR: Already a mature $90bn market. R&D is focused on making analog ICs more precise, reliable, and programmable, which would make them more useful for broader applications at the edge like AI, robotics and biosensing.
Drivers (4)
On the supply side, there have been three main drivers of interest in analog computing. First, advances in analog and mixed-signal integrated circuits. Improved analog IC fabrication techniques allow higher precision and complexity in custom analog circuit designs, enabling more powerful analog computing modules. Second, the development of resistive RAM (ReRAM), which stores data by resistance rather than charge, allowing direct storage of analog values. Combined with analog computing ICs, ReRAM provides efficient computational memory. Third, improved analog-digital conversion: faster and higher-resolution analog-to-digital and digital-to-analog converters enable seamless interfaces between analog and digital systems.
On the demand side, low-power edge devices continue to be a driver, but really, renewed interest is because of AI. Deep learning models, and recent transformer models in particular, are really just matrix multiplication machines. Analog integrated circuits are well-suited to efficiently implementing matrix multiplication operations. At the core, matrix multiplication relies on repeated steps of multiplying values and summing the results. Analog ICs can perform multiplication using simple, compact four-quadrant multiplier circuits that take voltage inputs and output their product. Sums are computed with op-amp summation circuits cascaded to accumulate results. Keeping computations in the analog domain avoids costly digitization and leverages the parallelism and density possible on an IC substrate. Analog matrix multiplication performance scales nearly linearly by adding more parallel multiplier circuits.
By minimizing data transfers and being optimized for matrix math, analog ICs achieve far greater efficiency for neural network workloads compared to flexible but power-hungry digital processors.
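A minimal sketch of the idea, with NumPy standing in for the physics: each output is a multiply-accumulate, and additive Gaussian noise models the electronic noise that limits analog precision (the noise level here is an illustrative assumption, not a measured figure):

```python
import numpy as np

rng = np.random.default_rng(0)

def analog_matvec(W, x, noise_std=0.01):
    """Model an analog matrix-vector multiply: multiplier cells feeding
    op-amp summers compute W @ x, plus noise from mismatch and thermal effects."""
    ideal = W @ x
    noise = rng.normal(0.0, noise_std * np.abs(ideal).max(), size=ideal.shape)
    return ideal + noise

W = rng.normal(size=(4, 8))   # weights, e.g. one neural network layer
x = rng.normal(size=8)        # input activations encoded as voltages
print(np.round(W @ x, 3))                # exact digital result
print(np.round(analog_matvec(W, x), 3))  # noisy analog estimate
```

The point of the sketch is the error model: the answer comes out almost right and almost free, and the whole game is whether "almost" is good enough for the workload.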
TLDR: Supply-side, analog ICs have got materially cheaper and more useful through better fabrication and interfacing technologies, especially mixed signal ICs. And ReRam has made analog storage viable. Demand-side, AI is basically matrix multiplication for which analog ICs can do very efficiently.
Novelty (5)
A table like this is misleading at best, and flat-out wrong at worst. But the process of comparing computing approaches like this does help put the computing environment in some context. It’s helpful because it is clear that there will not be one ring to rule them all. Analog computing is strong on energy efficiency, parallelism, real-time processing, and nonlinearity. These features make it suitable for high-performance, real-time continuous data processing. However, photonic systems are also compelling for these use cases, potentially at lower power consumption and higher throughput. The advantage of analog over photonic is that photonic systems struggle with non-linearity (where the output is not directly proportional to the input).
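Frequency mixing is a nice concrete example of the non-linearity point: multiplying two tones, which is exactly what a four-quadrant analog multiplier does, produces sum and difference frequencies (cos a · cos b = ½[cos(a−b) + cos(a+b)]). A quick numerical check:

```python
import numpy as np

fs = 10_000                                   # sample rate, Hz
t = np.arange(0, 1.0, 1 / fs)
mixed = np.cos(2 * np.pi * 1000 * t) * np.cos(2 * np.pi * 1200 * t)

spectrum = np.abs(np.fft.rfft(mixed))
freqs = np.fft.rfftfreq(len(mixed), 1 / fs)
print(freqs[spectrum > 0.25 * spectrum.max()])  # [ 200. 2200.] — difference and sum
```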
As with hybrid optoelectronic devices, hybrid analog-digital chips are an attempt to capture some of the bandwidth and energy-efficiency benefits of analog whilst retaining the speed, precision and cheaper manufacturing of digital electronics.
TLDR: Analog computing has similar efficiency, parallelism, and real-time processing benefits as photonics. But it is superior when implementing non-linear effects like multiplication or frequency mixing. Likely the only candidate for edge applications with non-linear effects where input signals are continuous.
Diffusion (4)
Broadly
Analog ICs are already widely adopted across a broad range of industries. The primary restraint on broader use is cost, especially relative to digital logic ICs. The simplest explanation is that we do not have a general-purpose analog processor like a CPU. There is some versatility within analog ICs like Analog-to-Digital Converters (ADCs) and Digital-to-Analog Converters (DACs), but generally, they are customised for specific applications. They are ASICs. The fact they are specific means the market size is limited, and as such, they lack the economies of scale that benefit digital chips. This can make analog ICs less cost-effective, especially when digital solutions can meet the required specifications. It also costs more to design an analog IC. Unlike their digital counterparts, analog circuits are susceptible to noise, temperature, and manufacturing process variations. Designing robust analog circuits requires expertise and meticulous consideration of these factors, making it a more specialised field. Finally, as with optical computing, the primary barrier is the dominance of digital computing: the sheer availability of digital ICs has led to a preference for digital solutions in many applications.
For AI
A specific restraint for analog ICs applied to AI is the issue of precision. While analog hardware can efficiently perform matrix multiplications, implementing high-precision nonlinear activation functions poses a key challenge for analog neural network implementations. Analog circuits inherently have limited precision due to electronic noise and variability. But neural networks require high precision activations to model complex functions accurately. Quantizing down to low bit precision causes substantial accuracy loss. High-precision analog activations demand large, complex circuitry, erasing analog efficiency benefits. Potential solutions under exploration include leveraging higher resolution data converters only for activations, adaptive calibration techniques, hybrid analog-digital designs, and new training algorithms to make networks robust to low-precision analog circuits. Innovations that overcome the activation precision bottleneck while maintaining efficiency advantages could enable fully analog neural network implementations. But precision limitations remain a central challenge analog must address.
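Of those workarounds, the hybrid analog-digital design is the easiest to sketch: do the matrix multiplies in the noisy analog domain and apply the non-linear activation digitally at full precision. A toy two-layer version (the noise level is again an illustrative assumption):

```python
import numpy as np

rng = np.random.default_rng(1)

def analog_layer(W, x, noise_std=0.02):
    """Matrix multiply in the 'analog' domain: efficient but noisy."""
    y = W @ x
    return y + rng.normal(0.0, noise_std * np.abs(y).max(), size=y.shape)

def digital_relu(y):
    """Non-linear activation computed digitally at full precision."""
    return np.maximum(y, 0.0)

W1, W2 = rng.normal(size=(16, 8)), rng.normal(size=(4, 16))
x = rng.normal(size=8)
out = analog_layer(W2, digital_relu(analog_layer(W1, x)))
print(np.round(out, 3))
```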
TLDR: Analog ICs are expensive ASICs, and have widespread use for specific applications across many industries. But broader adoption is limited due to lack of precision and programmability. Deep learning may very well be the next major application, but the issue of precision still needs to be solved.
Impact (4)
Scenario 1: Fail (<$100bn) Score: 2/3. Low Probability.
Analog computing sort of taps out at the current set of applications. It is never able to achieve the precision required to efficiently serve AI. And this lack of precision is a fundamental barrier to extending analog ICs into new applications.
Scenario 2: Unite ($500bn-5tr) Score: 4. High Probability.
This is the mixed-signal IC story. We get analog-digital ICs that efficiently allocate some signals to analog circuits and others to digital circuits depending on precision, power consumption, latency, and bandwidth requirements. Hybrid ICs turn out to be the right balance for transformer models in edge devices serving the large AI inference market. Hybrid ICs also serve new biomedical and wearable use cases like in-ear biosignals.
Scenario 3: Replace (>$5tr) Score: 5. Low Probability.
The big bull case is that analog computers go on to eat into a very large chunk of the digital computing market. For this to happen, analog chips need to effectively solve the precision problem and/or the programmability problem. Things like ultra-low-noise circuits, improved fabrication consistency, and active trimming are a start. But we would need to reduce noise via breakthroughs in materials, components, fabrication, calibration, and architecture.
TLDR: The most likely impact is that mixed-signal ICs combine the benefits of analog circuits and digital circuits on the same chip to serve the edge AI and wearable markets.
Timing (2020-2025)
The market is already large enough for risk capital. The supply-side innovations are happening fast enough, and market demand, especially for edge AI, is growing fast enough, that risk capital should enter the market now. The timing risk relates to the timeline for fabrication at sufficient volumes, which will be 5-7 years from a standing start, depending on what we mean by “standing start” and by “sufficient”.
TLDR: Invest now because the edge AI opportunity went through a catalytic moment in 2022 with LLMs.
Overrated or Underrated?
Underrated. The rating depends on whether mixed-signal ICs, or ultimately analog ICs, are able to sufficiently solve the precision challenge and serve AI inference use cases. If they can, the scenario 2 market size would be on the mid-to-upper end of the $500 billion to $5 trillion range.
2030 Prediction
Mixed-signal analog-digital ICs capture 50% of the $60bn edge AI hardware market.
Open Questions
Precision - Can analog computing achieve sufficient precision for complex tasks given inherent noise? Or will hybrid digital approaches always be needed? Especially relevant for applying analog ICs to LLMs.
Specifically, can we develop robust quantization-aware training, whereby we train neural nets at artificially low numerical precision so that models become tolerant of low-bitwidth analog hardware? (A toy sketch follows this list.)
Programmability - How much programmability can analog systems have vs. fixed-function hardware? Can algorithmic training help?
Models - Can we develop accurate models of analog computational noise, errors, and faults to design more robust systems? SPICE (Simulation Program with Integrated Circuit Emphasis) is a widely used analog circuit simulator. However, SPICE remains a detailed circuit-level tool. New techniques are needed to abstract key noise behaviors to the system-level for analog computing architectures and software co-design.
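To make the quantization-aware training question above concrete, here is a toy sketch: the forward pass sees weights rounded to a low-bitwidth grid plus analog-style noise, while gradient updates are applied to a full-precision copy (the straight-through estimator idea). Bit-width and noise level are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(2)

def analog_forward(w, bits=4, noise_std=0.02):
    """Round weights to a low-bitwidth grid, then add noise mimicking analog variability."""
    scale = np.abs(w).max() / (2 ** (bits - 1) - 1) + 1e-12
    return np.round(w / scale) * scale + rng.normal(0.0, noise_std, size=w.shape)

# Toy task: fit y = X @ w_true with weights that must survive 4-bit noisy storage.
X = rng.normal(size=(256, 8))
w_true = rng.normal(size=8)
y = X @ w_true

w, lr = rng.normal(size=8) * 0.1, 0.05
for _ in range(500):
    err = X @ analog_forward(w) - y        # forward pass uses the degraded weights
    w -= lr * X.T @ err / len(y)           # straight-through: update the clean copy
print("final loss:", float(np.mean((X @ analog_forward(w) - y) ** 2)))
```

The loss plateaus at the quantization-and-noise floor rather than zero, which is exactly the tolerance the open question asks whether real networks can be trained to live with.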
Startups to Watch
It’s actually quite hard to parse which startups are doing analog neuromorphic, digital neuromorphic, analog-digital, or pure analog. So this is certainly an incomplete list:
Mythic (United States): Analog AI inference chips for edge computing
Anabrid (Germany): Analog-Digital Hybrid Computing (Interview)
Aspinity (United States): Analog chips for edge machine learning
Encharge AI: Mixed-signal In-Memory Computing (IMC)
Also, before you go, remember I keep banging on about deep geothermal energy? First, there is this from Austin Vernon summarising the state of play in depth. He then put me right on my incorrect view that the industry was sort of waiting for high-voltage electro-pulse (HVEP) rock-breaking technology. He says conventional drilling techniques will be sufficient. He concludes:
“drillers could adapt conventional techniques to reach 60,000' depths and that selling process heat would be more profitable than producing electricity.”
It’s happening. Brain recording next…