A Timeline of Hardware Delivering AI: from CPUs to Photonics

Artificial intelligence has advanced hand-in-hand with innovations in computing hardware. While algorithms drive the intelligence, hardware provides the raw capability to process, store, and transmit the vast amounts of data required. Without the right silicon, even the smartest model will struggle to succeed.

In this article, we take a look at the history of AI hardware: what it is, and how it has evolved to meet the demands of bigger and better AI models*.

1. 1970s - CPUs: The Universal Processor

The Central Processing Unit (CPU) has been the backbone of computing since the early 1970s, long before modern AI emerged. Originally handling all computational tasks, CPUs remain essential today as coordinators of complex systems.

  • What it is: A general-purpose processor with a small number of powerful cores, optimised for sequential tasks and branching logic.
  • Key function: Executes instructions in a fetch-decode-execute cycle, using caches, branch prediction, and pipelining.
  • Role in AI: Early neural networks, classical machine learning, and lightweight inference.
  • Major players: Intel, AMD, ARM.
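The fetch-decode-execute cycle mentioned above can be made concrete with a toy sketch in Python; the tiny instruction set and program below are invented purely for illustration, not modelled on any real CPU:

```python
# Toy illustration of a CPU's fetch-decode-execute cycle.
# The instruction set here is invented for illustration only.

program = [
    ("LOAD", 5),     # load the constant 5 into the accumulator
    ("ADD", 3),      # add 3 to the accumulator
    ("MUL", 2),      # multiply the accumulator by 2
    ("HALT", None),  # stop execution
]

def run(program):
    acc = 0          # accumulator register
    pc = 0           # program counter
    while True:
        op, arg = program[pc]   # fetch the next instruction
        pc += 1
        if op == "LOAD":        # decode and execute
            acc = arg
        elif op == "ADD":
            acc += arg
        elif op == "MUL":
            acc *= arg
        elif op == "HALT":
            return acc

print(run(program))  # (5 + 3) * 2 = 16
```

Real CPUs add caches, branch prediction, and pipelining on top of this basic loop, but the loop itself is the same idea.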

2. Late 1990s-2000s - GPUs: Unlocking Parallelism

Graphics Processing Units emerged in the late 1990s, initially for rendering video game graphics. Starting with the introduction of NVIDIA’s CUDA in 2006, GPUs have become indispensable for training AI models owing to their massive parallelism.

  • What it is: A processor with thousands of smaller cores that perform many tasks in parallel.
  • Key function: Performs large-scale matrix and vector operations simultaneously, ideal for neural network training.
  • Role in AI: Training large models and high-throughput inference.
  • Major players: NVIDIA, AMD, Intel.
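The "many operations in parallel" point is easiest to see in code. The sketch below (the shapes are illustrative) shows the kind of matrix arithmetic at the heart of neural networks; on a GPU, each output element can be computed by a separate core at the same time, while NumPy on a CPU produces the same result more slowly:

```python
import numpy as np

# The core of neural-network training is matrix arithmetic like this.
# Each of the 64 x 256 output elements is an independent dot product,
# which is why the work maps so well onto thousands of GPU cores.
rng = np.random.default_rng(0)
batch = rng.standard_normal((64, 512))     # 64 inputs, 512 features each
weights = rng.standard_normal((512, 256))  # one dense layer's weights

activations = batch @ weights              # 64 x 256 independent dot products
print(activations.shape)                   # (64, 256)
```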

3. 1980s-2000s - FPGAs: Reconfigurable Acceleration

Field-Programmable Gate Arrays have been used since the 1980s in embedded and industrial applications. Their ability to implement digital circuits that can be reprogrammed on the fly and perform high-speed, concurrent processing led to their adoption in AI applications in the late 2000s.

  • What it is: Integrated circuits with reconfigurable internal logic, allowing custom digital circuit designs to be implemented after manufacturing.
  • Key function: Creates custom circuits tailored to specific AI tasks.
  • Role in AI: Fast inference, telecom AI workloads, and edge acceleration.
  • Major players: AMD (Xilinx), Intel (Altera), Lattice Semiconductor, Amazon.

4. Mid-2010s - Bespoke ASICs for AI: Purpose-Built Performance

  • What it is: Application-Specific Integrated Circuits (ASICs) are chips custom-designed and optimized for a single application or task.
  • Key function: Maximizes performance per watt by eliminating unnecessary flexibility.
  • AI role: Hyperscale training and inference, autonomous driving chips, domain-specific tasks, and edge AI.
  • Major players: Google, Tesla, Cerebras, Graphcore, Mythic, Tenstorrent.

5. Mid-2010s - Compound Semiconductors: High-Speed Networking

Materials such as gallium nitride (GaN) and gallium arsenide (GaAs) began playing a larger role in AI infrastructure in the mid-2010s, especially for networking and high-frequency components.

  • What it is: Semiconductors made from compound materials, such as gallium arsenide (GaAs) or indium phosphide (InP), offering higher electron mobility and faster performance than silicon.
  • Key function: Enables faster signal transmission and higher efficiency at high voltages and frequencies.
  • Role in AI: High-speed signal processing, power-efficient AI hardware, and integration of ultrafast optical interconnects.
  • Major players: Coherent, Qorvo, Broadcom, STMicroelectronics, Infineon.

6. Late 2010s - TinyML Devices: AI at the Edge

Coinciding with the rapid evolution of the Internet of Things (IoT), TinyML hardware integrates AI acceleration into ultra-low-power microcontrollers for on-device intelligence. It brings many of the capabilities of the larger AI hardware discussed here, including CPUs, GPUs, and ASICs, into a compact, fast, and energy-efficient form factor.

  • What it is: Low-power chips running small, often quantized models entirely on-device.
  • Key function: Processes data locally without relying on cloud servers.
  • Role in AI: On-device sensing, IoT intelligence, and energy-efficient inference.
  • Major players: STMicroelectronics, NXP, Renesas, Espressif.
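To give a flavour of how a model is squeezed onto a microcontroller, here is a simplified sketch of the symmetric 8-bit quantisation commonly used in TinyML; the weight values are invented for illustration:

```python
import numpy as np

# Sketch of 8-bit quantisation: float weights are mapped to int8 so a
# microcontroller can store and multiply them cheaply. This is a
# simplified symmetric scheme; real toolchains add per-channel scales,
# zero points, and calibration.
weights = np.array([-0.42, 0.07, 0.31, -0.11, 0.25], dtype=np.float32)

scale = np.abs(weights).max() / 127.0          # map the largest weight to 127
q = np.round(weights / scale).astype(np.int8)  # compact int8 representation
dequantised = q.astype(np.float32) * scale     # approximate recovery

print(q)
print(np.max(np.abs(weights - dequantised)))   # small rounding error
```

The model shrinks to a quarter of its 32-bit size and integer arithmetic is far cheaper on small chips, at the cost of a bounded rounding error.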

7. Late 2010s - Neuromorphic Computing: Brain-Inspired Design

Emerging in the late 2010s, neuromorphic chips mimic the brain’s structure, using spiking neural networks for event-driven, energy-efficient AI computation.

  • What it is: Hardware designed to emulate neurons and synapses, enabling brain-inspired, event-driven computation.
  • Key function: Processes information only when events (“spikes”) occur, rather than on a fixed clock cycle, greatly reducing power consumption.
  • Role in AI: Robotics, real-time perception, adaptive learning, and other AI tasks requiring low-latency, energy-efficient processing.
  • Major players: Intel, IBM, BrainChip.
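The event-driven idea can be illustrated with a minimal "leaky integrate-and-fire" neuron, the basic unit of spiking networks. The constants below are illustrative, not taken from any particular chip:

```python
# Minimal leaky integrate-and-fire neuron: the membrane potential decays
# over time, input currents push it up, and the neuron emits a spike only
# when a threshold is crossed -- so work happens only on events.
def lif(inputs, leak=0.9, threshold=1.0):
    v = 0.0          # membrane potential
    spikes = []
    for current in inputs:
        v = v * leak + current   # leaky integration of the input
        if v >= threshold:
            spikes.append(1)     # fire...
            v = 0.0              # ...and reset
        else:
            spikes.append(0)
    return spikes

print(lif([0.6, 0.6, 0.0, 0.9, 0.6]))  # [0, 1, 0, 0, 1]
```

Between spikes nothing needs to be computed or transmitted, which is where the power savings of neuromorphic hardware come from.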

8. Late 2010s - Processing-in-Memory (PIM): Breaking the Memory Wall

Emerging in the late 2010s, PIM addresses the processor-memory bottleneck by performing computation directly inside or next to memory arrays, boosting efficiency for data-intensive AI workloads.

  • What it is: Memory chips with compute capability built directly into or alongside the memory arrays themselves.
  • Key function: Minimizes data movement, lowering latency and power consumption.
  • AI role: Efficient inference for large models and data-heavy workloads.
  • Major players: Samsung, SK Hynix, Mythic.
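A back-of-envelope sketch shows why computing where the data lives pays off. The energy figures below are illustrative ballpark values chosen for this example, not measurements of any real device:

```python
# Back-of-envelope sketch of the "memory wall": moving an operand in
# from off-chip DRAM can cost orders of magnitude more energy than the
# arithmetic performed on it. Both energy figures are assumptions made
# for illustration only.
mac_energy_pj = 1.0      # one multiply-accumulate (assumed)
dram_access_pj = 100.0   # fetching one operand from DRAM (assumed)

n_ops = 1_000_000

conventional = n_ops * (mac_energy_pj + dram_access_pj)  # shuttle every operand
in_memory = n_ops * mac_energy_pj                        # compute where data lives

print(conventional / in_memory)  # ~100x difference under these assumptions
```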

9. Early 2020s - Photonic AI: Computing with Light

In an exciting new approach to AI hardware, photonic processors are being developed which use light instead of electricity for computation, promising higher speeds and lower energy use.

  • What it is: Hybrid chips which leverage photons to perform operations.
  • Key function: Executes matrix calculations through optical interference and waveguides.
  • AI role: Supports energy-efficient training and inference for large and small-scale AI models.
  • Major players: Lightmatter, Lightelligence, PsiQuantum.
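As a rough sketch of the idea: a single Mach-Zehnder interferometer acts on two optical amplitudes as a 2x2 unitary matrix set by phase shifts, and meshes of such units compose larger matrix multiplications as the light propagates. The parameterisation below is one common textbook form, shown here for illustration:

```python
import numpy as np

# A Mach-Zehnder interferometer transforms two optical amplitudes by a
# 2x2 unitary matrix determined by two phase shifts (theta, phi).
def mach_zehnder(theta, phi):
    return np.exp(1j * theta / 2) * np.array([
        [np.exp(1j * phi) * np.cos(theta / 2), -np.sin(theta / 2)],
        [np.exp(1j * phi) * np.sin(theta / 2),  np.cos(theta / 2)],
    ])

light_in = np.array([1.0 + 0j, 0.0 + 0j])   # all light in the top waveguide
light_out = mach_zehnder(np.pi / 2, 0.0) @ light_in

print(np.abs(light_out) ** 2)                # optical power split 50/50
```

The multiplication happens passively as light interferes, which is the source of the speed and energy claims for photonic AI.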

10. Emerging - Quantum Computing for AI

Quantum processors are still largely experimental for AI, but they offer the potential to solve certain optimization, simulation, and combinatorial problems dramatically faster and more efficiently than classical hardware.

  • What it is: Hardware that uses quantum bits (qubits) to exploit superposition and entanglement to perform computations beyond the reach of classical systems.
  • Key function: Executes quantum gates and circuits for AI-related tasks, including optimization, probabilistic sampling, and hybrid classical-quantum training.
  • AI role: Expected to enable specialized AI workloads, such as complex optimization, large-scale simulations, and novel machine learning algorithms that are infeasible on classical hardware.
  • Major players: IBM, Google Quantum AI, Rigetti, D-Wave.
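A single qubit can be simulated classically in a few lines, which helps make "superposition" concrete: the state is a vector of two complex amplitudes, and gates are unitary matrices acting on it:

```python
import numpy as np

# Minimal simulation of one qubit. A classical bit is 0 or 1; a qubit's
# state is a 2-vector of complex amplitudes, and measurement
# probabilities are the squared magnitudes of those amplitudes.
ket0 = np.array([1.0, 0.0])                   # the |0> state
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)  # Hadamard gate

state = H @ ket0                              # equal superposition of 0 and 1
probabilities = np.abs(state) ** 2            # Born rule: measurement odds

print(probabilities)                          # [0.5 0.5]
```

Simulating n qubits this way takes 2^n amplitudes, which is exactly why real quantum hardware is interesting: it holds that state natively.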

11. Looking Ahead

The progression of AI hardware shows a clear trend where each generation builds upon and complements the last: CPUs orchestrate workloads, GPUs drive training, and ASICs and FPGAs provide specialized acceleration. Meanwhile, emerging technologies like photonic processors and quantum computing are poised to unlock entirely new levels of speed, efficiency, and capability.

The future is not about one dominant architecture but a coordinated ecosystem where each technology plays its part to drive the next wave of AI innovation.

* While we have put rough dates on each of the technologies discussed, innovation in each of these fields remains ongoing and thriving as computing technology pushes toward greater speed, efficiency, and capability.

This blog has been co-authored by Rebecca Frith and Luke Jones.

Rebecca Frith

Rebecca is a patent attorney working in our engineering team at Mewburn Ellis. She has a first-class MEng degree in General Engineering from Durham University where she specialised in electronics. After graduating, she worked for three years at a technology consulting firm as an electronics and firmware engineer. As a technology consultant Rebecca dealt with a variety of research and development projects for the defence and aerospace industries, including projects in computer vision, data security in machine learning, sensing devices, radar modelling, radio communications and safety assured electronics design.

Email: rebecca.frith@mewburn.com