TY - JOUR
T1 - Experimentally realized in situ backpropagation for deep learning in photonic neural networks
AU - Pai, Sunil
AU - Sun, Zhanghao
AU - Hughes, Tyler W.
AU - Park, Taewon
AU - Bartlett, Ben
AU - Williamson, Ian A.D.
AU - Minkov, Momchil
AU - Milanizadeh, Maziyar
AU - Abebe, Nathnael
AU - Morichetti, Francesco
AU - Melloni, Andrea
AU - Fan, Shanhui
AU - Solgaard, Olav
AU - Miller, David A.B.
N1 - Publisher Copyright:
© 2023 The Authors.
PY - 2023/4/28
Y1 - 2023/4/28
N2 - Integrated photonic neural networks provide a promising platform for energy-efficient, high-throughput machine learning with extensive scientific and commercial applications. Photonic neural networks efficiently transform optically encoded inputs using Mach-Zehnder interferometer mesh networks interleaved with nonlinearities. We experimentally trained a three-layer, four-port silicon photonic neural network with programmable phase shifters and optical power monitoring to solve classification tasks using "in situ backpropagation," a photonic analog of the most popular method to train conventional neural networks. We measured backpropagated gradients for phase-shifter voltages by interfering forward- and backward-propagating light and simulated in situ backpropagation for 64-port photonic neural networks trained on MNIST image recognition given errors. All experiments performed comparably to digital simulations (>94% test accuracy), and energy scaling analysis indicated a route to scalable machine learning.
UR - http://www.scopus.com/inward/record.url?scp=85159244384&partnerID=8YFLogxK
U2 - 10.1126/science.ade8450
DO - 10.1126/science.ade8450
M3 - Article
C2 - 37104594
AN - SCOPUS:85159244384
SN - 0036-8075
VL - 380
SP - 398
EP - 404
JO - Science
JF - Science
IS - 6643
ER -