Experimentally realized in situ backpropagation for deep learning in photonic neural networks

Sunil Pai, Zhanghao Sun, Tyler W. Hughes, Taewon Park, Ben Bartlett, Ian A.D. Williamson, Momchil Minkov, Maziyar Milanizadeh, Nathnael Abebe, Francesco Morichetti, Andrea Melloni, Shanhui Fan, Olav Solgaard, David A.B. Miller

Research output: Contribution to journal › Article › peer-review

Abstract

Integrated photonic neural networks provide a promising platform for energy-efficient, high-throughput machine learning with extensive scientific and commercial applications. Photonic neural networks efficiently transform optically encoded inputs using Mach-Zehnder interferometer mesh networks interleaved with nonlinearities. We experimentally trained a three-layer, four-port silicon photonic neural network with programmable phase shifters and optical power monitoring to solve classification tasks using "in situ backpropagation," a photonic analog of the most popular method to train conventional neural networks. We measured backpropagated gradients for phase-shifter voltages by interfering forward- and backward-propagating light and simulated in situ backpropagation for 64-port photonic neural networks trained on MNIST image recognition given errors. All experiments performed comparably to digital simulations (>94% test accuracy), and energy scaling analysis indicated a route to scalable machine learning.
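The core idea of in situ backpropagation — recovering phase-shifter gradients from the interference of forward- and backward-propagating fields — can be illustrated numerically. The sketch below is an assumption-laden toy model, not the paper's hardware protocol: it represents the fixed coupler meshes as arbitrary unitaries `A` and `B`, places a row of programmable phase shifters between them, and shows that the adjoint-field interference term reproduces the finite-difference gradient of a squared-error loss.

```python
import numpy as np

rng = np.random.default_rng(0)

def random_unitary(n, rng):
    """Random n x n unitary via QR decomposition (phase-fixed)."""
    z = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
    q, r = np.linalg.qr(z)
    d = np.diag(r)
    return q * (d / np.abs(d))  # rescale columns so the result is Haar-like

n = 4  # four-port device, as in the experiment
A = random_unitary(n, rng)            # coupler mesh before the phase shifters (toy stand-in)
B = random_unitary(n, rng)            # coupler mesh after the phase shifters (toy stand-in)
theta = rng.uniform(0, 2 * np.pi, n)  # programmable phase shifts
x = rng.normal(size=n) + 1j * rng.normal(size=n)  # input optical field
t = rng.normal(size=n) + 1j * rng.normal(size=n)  # target output field

def loss(theta):
    """Squared-error loss of the linear photonic layer y = B diag(e^{i theta}) A x."""
    y = B @ (np.exp(1j * theta) * (A @ x))
    return np.sum(np.abs(y - t) ** 2)

# Adjoint ("in situ backpropagation") gradient:
f = np.exp(1j * theta) * (A @ x)   # forward field just after the phase shifters
y = B @ f
a = B.conj().T @ (y - t)           # error field propagated backward through B
grad = -2 * np.imag(np.conj(a) * f)  # interference of forward and backward fields

# Sanity check against central finite differences
eps = 1e-6
fd = np.array([
    (loss(theta + eps * np.eye(n)[k]) - loss(theta - eps * np.eye(n)[k])) / (2 * eps)
    for k in range(n)
])
print(np.allclose(grad, fd, atol=1e-4))
```

In the actual device, the analogous interference terms are read out with on-chip optical power monitors rather than computed digitally; this sketch only checks the underlying calculus of the adjoint method.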

Original language: English
Pages (from-to): 398-404
Number of pages: 7
Journal: Science
Volume: 380
Issue number: 6643
DOIs: yes
Publication status: Published - 28 Apr 2023
Externally published: Yes
