🔮E08: Optical Computing
Are electronic computers the result of path dependency, or are they the only computers we will ever reliably produce cheaply and at scale?
Reading time: 8 mins.
So here we are again, then. Every week apparently. Write every week they said. It’s good to hold yourself to account they said. Writing helps clarify thinking they said.
I dunno, man. It’s also hard work. I’ve got work to do over here. Inbox is piling up. The car needs a service. The solicitor is on my case. And you’ve all got me hunched over the laptop furiously trying to explain the difference between photonic computing, optical computing, optoelectronic computing, and photonic integrated circuits.
It takes 8 minutes to read. And it took me like 8 hours to write. I'm trying to give you a million dollars worth of game for $9.99 over here. Kinda.
If you are new here, the least you can do is subscribe.
If you are a regular, thanks so much I appreciate you. But come on, give me a squeeze would you.
Onto the show. Here are some open questions:
Are electronic computers the result of path dependency, or are they the only computers we can reliably produce cheaply and at scale?
It’s 2050: do electronic computers still exist? Do quantum and photonic computers run the core, and neuromorphic/analogue run the edge? What is the advantage of electronic, other than cost and incumbency? Analogue was replaced by electronic, so we know paradigm shifts do occur.
A corollary: if optical computers are useful for high-performance computing, is the right mental model that they are best suited to datacentres? If so, is quantum computing a threat? Even noisy intermediate-scale quantum (NISQ) computers will likely be best suited to HPC applications. Are photonic and quantum computing competitive? Or will they eventually “merge”, with the winning approach being a quantum photonic computer?
Photonics is theoretically more performant than electronics, but we have yet to manufacture photonic computers at scale. Is this a function of the inherent linearity of photons (light doesn't easily interact with light, making non-linear operations hard) or just a money/processing challenge? Basically, can photonic computers ever be general-purpose?
Much love, Lawrence
🔮 Optical Computing
Context
Once you get into the details of new computing approaches like photonics, quantum, and analogue, you end up in a debate between science and manufacturing. Scientifically, non-conventional computers are better across critical dimensions like power consumption, bandwidth, and latency. But then the actual hard questions around adoption and timing relate to manufacturing. All the barriers are around the scale and complexity of the electronic CMOS semiconductor supply chain and the difficulties in fabricating anything that doesn't fit almost perfectly into the existing processes.
So we end up with digital neuromorphic chips and optoelectronic chips: designs that take some of the benefits of new architectures and substrates but connect neatly to other electronic components and processes. We should think about the computing transition in these terms. We're on a path from a unipolar computing world with electronics as the hegemon to a multipolar world with dominant electronic power but rising photonic, quantum, and analogue powers. Photonics, analogue, and quantum will try to fit into the existing world by incorporating many electronic features and processes to smooth the transition. Quantum and photonics might even merge. The big question is, how big will these new powers get? How much electronic land will they capture? And will any of the new approaches become a new hegemon?
Scorecard
Viability: how technically mature is the technology? (3/4)
Drivers: how powerful are adoption forces? (5)
Novelty: how much better relative to alternatives? (3)
Diffusion: how easily will it be adopted? (2/3)
Impact: how much value is created? (3/4)
Timing: when will the market be suitable for risk capital? (2020-2025/2030+)
Overrated/Underrated: Correctly Rated
2030 Prediction: Optoelectronic processing chips will have a 30% market share in high-performance computing.
Connections: Li-Fi, Optogenetics, Quantum Hardware, Holographic Memory, Photonic Memory, Optical RAM
Summary
Optical computing uses light particles, or photons, instead of electrons to perform computations. Optoelectronic computing is a sub-field that uses both photons and electrons, combining optical and electronic elements to leverage the advantages of each while mitigating their limitations. Both are segments of the broader field of photonic computing, which encompasses optical computing and other technologies that use light for information-processing tasks. Photonics is the most expansive term, covering the generation, manipulation, and detection of photons; its applications range from fibre-optic communication to medical diagnostics and laser manufacturing.
Optical computing leverages the properties of light, including its speed and capacity to carry large amounts of data, to achieve higher processing speeds and energy efficiency compared to classical electronic computing. While classical computing relies on electric currents in circuits to perform calculations, optical computing uses light waves to transfer and process data. It involves components like optical transistors and optical fibres, which handle multiple light signals at the same time.
Viability: how technically mature is the technology? (3/4)
Optoelectronic computing is close to commercialisation, within a 1-2 year period. An optoelectronic chip, also known as a photonic integrated circuit (PIC), is a device that integrates multiple optical (light-based) and electronic functions onto a single chip. Such a chip aims to utilise the best of both worlds (I bet you forgot Jay-Z and R Kelly did an album called Best of Both Worlds. Try finding that on Spotify 👀): the high speed and high bandwidth of optics combined with mature electronics technology and processes. PICs take data in as an electronic signal, convert it to an optical signal using a modulator, process the data as light, then convert the signal back into an electronic signal using a photodetector before it is output from the chip. The primary challenges are manufacturing and performance, especially the efficient conversion between optical and electrical signals: current chips consume roughly 30% of their energy converting electrical signals into photons and back. However, much R&D work is going into reducing this figure.
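To make that round trip concrete, here is a toy sketch of the electrical-optical-electrical pipeline a PIC performs. Nothing here models any real vendor's hardware; the stage names and numbers are illustrative only.

```python
import numpy as np

def modulate(bits):
    """Electrical -> optical: encode bits as light amplitudes (on-off keying)."""
    return np.asarray(bits, dtype=float)

def optical_stage(amplitudes, weights):
    """Compute in the optical domain: an analog element-wise weighting,
    the kind of linear operation photonics performs natively."""
    return amplitudes * weights

def photodetect(amplitudes, threshold=0.25):
    """Optical -> electrical: photodetectors measure intensity (|amplitude|^2)
    and a threshold restores a clean digital signal."""
    return (np.abs(amplitudes) ** 2 > threshold).astype(int)

bits = [1, 0, 1, 1]
weights = np.array([0.9, 0.9, 0.9, 0.3])  # the last channel is heavily attenuated
out = photodetect(optical_stage(modulate(bits), weights))
print(out)  # -> [1 0 1 0]: the attenuated channel falls below threshold
```

The modulate/photodetect steps are exactly where the ~30% conversion energy tax mentioned above is paid, which is why all-optical designs try to eliminate them.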
All-optical computing with no electronic components is far less mature, still mainly in the lab demonstration phase. The primary goal is still to build a scalable optical transistor to replace the electronic transistor. This would avoid the need for modulators and photodetectors entirely, simplifying the design and reducing power consumption. Numerous approaches include reflective mirrors, semiconductor nanocrystals, laser switching, and non-linear optical materials. A few promising approaches are now in the chip design phase.
Drivers: how powerful are adoption forces? (5)
Supply: Driven by developments on the supply side: advancements in silicon photonics, photodetectors, and non-linear optical devices. Firstly, manufacturing advancements, particularly in silicon photonics, allow for cost-effective, high-volume production of optical components. This includes improved lithography techniques and materials science advancements. Lightmatter's 'Mars', the world's first chip to integrate 4,096 Mach-Zehnder interferometers (MZIs, devices that split and recombine light, allowing precise control of its phase and amplitude), showcases these large-scale integration capabilities. Secondly, the efficiency of converting between optical and electrical signals is critical to optical computing performance. Innovations like lightwave looping and more efficient photodetectors and modulators enhance this efficiency, increasing speed and reducing power consumption. Lastly, the ability to perform non-linear operations is crucial for computing. Advances in designing and manufacturing non-linear optical devices, such as optical logic gates and amplifiers, enable more complex and powerful optical computing systems. This includes developing materials with high non-linear optical coefficients and new device architectures.
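For intuition on what a single MZI does, here is a minimal numpy model: two 50:50 beam splitters around a tunable phase shifter. The beam-splitter matrix below is one common textbook convention; real devices add loss and crosstalk.

```python
import numpy as np

# One ideal 50:50 beam splitter (directional coupler) as a 2x2 unitary.
BS = (1 / np.sqrt(2)) * np.array([[1, 1j], [1j, 1]])

def mzi(theta):
    """Mach-Zehnder interferometer: splitter, internal phase shift, recombiner.
    Tuning theta steers input power between the two output ports."""
    phase = np.diag([np.exp(1j * theta), 1.0])
    return BS @ phase @ BS

# Launch all the light into port 0 and see where it comes out.
inp = np.array([1.0, 0.0])
for theta in (0.0, np.pi / 2, np.pi):
    out = mzi(theta) @ inp
    powers = np.abs(out) ** 2
    print(f"theta={theta:.2f}  port powers={powers.round(3)}")
```

At theta=0 the light crosses fully to the other port, at theta=pi it stays put, and at theta=pi/2 it splits 50:50; a mesh of thousands of these, each with its own phase setting, is what lets a chip like Mars implement arbitrary linear transforms on light.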
Demand: Three major factors drive the optical computing market: AI growth, power consumption concerns, and quantum computing advancements. AI applications like machine learning require processing vast amounts of data at high speed, a task where optical computing could excel due to its high-speed, parallel data processing capabilities. Optical computing can quickly perform the linear algebra operations and matrix multiplications that are essential to AI algorithms. Power consumption is a significant concern in computing, with estimates suggesting 10%-20% of the world's total power could go to AI inference by the decade's end. Optical computing, which can transmit data with less energy and generate less heat, could be a more sustainable option. Lastly, the quantum computing industry's development has influenced optical computing's progression: developments in controlling and detecting photons for quantum applications have advanced optical computing technologies. Well-funded companies like Xanadu Quantum Technologies, ORCA Computing, PsiQuantum, Pasqal, and Atom Computing are leading the way.
Novelty: how much better relative to alternatives? (3)
I say this when discussing any computing hardware, but the market is for performing calculations. Right now, the vast majority of calculations are performed by electronic computers, and optical computers will compete with them to run computations. Computing with photons versus electrons theoretically offers superior speed and performance and lower power consumption. That is still a theoretical advantage, though: in practice, prototypes don't achieve anywhere near the theoretical speed or bandwidth benefits because of error accumulation, scaling problems, and signal interference. That said, we can expect better control and miniaturisation as manufacturing capabilities develop.
We shouldn’t think of optical computing as a drop-in replacement for electronic computing; rather, it will win market share in high-performance computing. Electronic computers will remain much cheaper for the foreseeable future and enjoy an extremely strong incumbent advantage, so optical computing will find customers where some cost can be traded off for extreme performance. Matrix multiplication tasks, common in many AI applications like LLMs, are an obvious candidate. So are climate modelling and other complex simulations.
Diffusion: how easily will it be adopted? (2/3)
Optical computing faces serious headwinds to adoption. Like any other non-electronic, non-CMOS computing architecture, it needs many components and manufacturing processes to change. Of the different configurations, optoelectronic computing faces the easiest path to adoption because it can communicate directly with electronic memory and I/O and can run existing software. However, design and fabrication are complex and require precise control over light properties, making development more challenging and costly than traditional electronic chips. Integrating with existing electronic systems is another hurdle, requiring efficient conversion methods between optical and electronic signals. While decreasing, the cost of optoelectronic components can still be higher than equivalent electronic components. Material selection is also challenging, as the materials used need specific properties for efficient light generation, manipulation, and detection. Related is the issue of the reliability and lifespan of optoelectronic components, which can be sensitive to environmental conditions.
For all-optical chips, the barriers are worse. Add to all of the above an inability to use electronic fabrication processes and the need for new software and algorithms. Not ideal…
Impact: how much value is created? (3/4)
The future computing landscape will be heterogeneous. One ring will not rule them all. The era of a single dominant computing approach is over. We already see specialisation in the electronic domain, with tasks offloaded to ASICs and FPGAs. Compute specialisation will continue from electronic computation to photonic, analogue and quantum in a 25-year timeframe. I wish I had a better mental model for how these tasks will break down. I am furiously scribbling crazy things on a 2x2 matrix, but I'm not there yet. The task is to map different calculations onto a matrix with compute complexity and maybe accuracy, latency, or cost. I haven’t figured it out yet. Like Pepe Silvia.
The right way to think about the impact of optical computing is to weigh the probability of manufacturing an all-optical chip at scale and reasonable cost. This involves solving the issues of logic-level restoration, cleaning up the signal to avoid error accumulation; scalability, connecting multiple logic gates in a sequence; higher fan-out, driving multiple logic gates from one output; and input-output isolation, again, to avoid signal interference and errors. If these are solvable problems, then optical computing has the potential to perform a vast array of high-performance computing tasks, including a lot of AI training, most likely in the data centre. Suppose you think that the inherent linearity of photons, the difficulty of making light interact with light, will always mean electronic processing is cheaper and easier to manufacture. In that case, optical computing will mean optoelectronic computing and have a relatively limited impact.
Timing: when will the market be suitable for risk capital? (2020-2025/2030+)
Optoelectronic computers (2025-2030) will come to market in the next few years and are likely to find beachhead markets for massively parallel tasks like matrix multiplication for machine learning and high-performance computing like climate modelling.
Optical computers (2030+): The timeline for a scalable all-optical computer is unknowable. Even a fabrication breakthrough in 2023 would still take 5+ years to reach significant volumes, and that ignores the need for optical memory and other SoC components.
2030 Prediction
Optoelectronic processing chips will have a 30% market share in high-performance computing.
Overrated or underrated?
Correctly Rated. Boring, I know, but I think by now the industry knows the potential benefits and is fully aware of the challenges of manufacturing photonic chips.
Startups to watch
State of the Future Connections
Li-Fi: Li-Fi never really took off. Would an optical computer help?
Optogenetics. Controlling cells with light would surely be a good application area for an optical processor.
Quantum Hardware. One of the approaches to building a quantum computer is photonics. In a photonic quantum computer, quantum information is stored in the quantum states of photons, such as their polarization or path. Quantum gates are performed by manipulating photons using beam splitters, phase shifters, and wave plates. Measurements are performed using devices like photodetectors.
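As a sketch of that path encoding (an ideal, lossless single-photon picture; the beam-splitter convention below is one common choice), the same 2x2 unitaries that route classical light double as quantum gates:

```python
import numpy as np

# Path encoding: a single photon in one of two waveguides is a qubit.
ket0 = np.array([1.0, 0.0])  # photon in the upper path
ket1 = np.array([0.0, 1.0])  # photon in the lower path

# An ideal 50:50 beam splitter acts as a single-qubit gate on path states.
BS = (1 / np.sqrt(2)) * np.array([[1, 1j], [1j, 1]])

def phase(phi):
    """Phase shifter in the lower arm: rotates the relative phase."""
    return np.diag([1.0, np.exp(1j * phi)])

# One beam splitter puts the photon into an equal superposition of paths.
superposed = BS @ ket0
print(np.abs(superposed) ** 2)  # -> [0.5 0.5]: either detector clicks 50:50

# Add a pi phase shift and a second beam splitter: interference steers the
# photon deterministically back to the upper detector.
interfered = BS @ phase(np.pi) @ BS @ ket0
print(np.abs(interfered) ** 2)  # -> [1. 0.]
```

This is why photonic quantum computing and classical optical computing share so much hardware DNA: the interferometers are identical, and only the light source (single photons versus bright laser light) and the detectors change.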
Think of all those connections.
Connections.