TY - JOUR
T1 - KirchhoffNet
T2 - End-to-End Analog Circuit Acceleration for ODE-Based Neural Networks
AU - Zheng, Su
AU - Gao, Zhengqi
AU - Sun, Fan Keng
AU - Boning, Duane S.
AU - Rohrer, Ron
AU - Yu, Bei
AU - Wong, Martin D.F.
N1 - Publisher Copyright:
© 1982-2012 IEEE.
PY - 2025/12/23
Y1 - 2025/12/23
N2 - This paper introduces KirchhoffNet, a novel class of neural network models inspired by the principles of analog electronic circuitry, specifically Kirchhoff’s laws. KirchhoffNet operates as an analog circuit, where the network input is represented by initial node voltages, and the output corresponds to the node voltages at a specific time. The dynamics of the node voltages are governed by learnable parameters on the edges, and the evolution of these voltages follows a system of ordinary differential equations (ODEs). Despite the absence of traditional neural network components such as convolutional layers, KirchhoffNet achieves outstanding performance across a wide range of machine learning tasks. We further demonstrate that KirchhoffNet is capable of computing diffusion models, making it a promising candidate for accelerating modern generative AI applications. Most notably, KirchhoffNet can be implemented as a high-speed and low-power analog integrated circuit, which introduces a compelling advantage: irrespective of the number of parameters in the network, its on-chip forward calculation can always be completed within a short time. This property makes KirchhoffNet a highly attractive and scalable paradigm for implementing large-scale neural networks, opening new avenues in the realm of analog neural networks for artificial intelligence.
AB - This paper introduces KirchhoffNet, a novel class of neural network models inspired by the principles of analog electronic circuitry, specifically Kirchhoff’s laws. KirchhoffNet operates as an analog circuit, where the network input is represented by initial node voltages, and the output corresponds to the node voltages at a specific time. The dynamics of the node voltages are governed by learnable parameters on the edges, and the evolution of these voltages follows a system of ordinary differential equations (ODEs). Despite the absence of traditional neural network components such as convolutional layers, KirchhoffNet achieves outstanding performance across a wide range of machine learning tasks. We further demonstrate that KirchhoffNet is capable of computing diffusion models, making it a promising candidate for accelerating modern generative AI applications. Most notably, KirchhoffNet can be implemented as a high-speed and low-power analog integrated circuit, which introduces a compelling advantage: irrespective of the number of parameters in the network, its on-chip forward calculation can always be completed within a short time. This property makes KirchhoffNet a highly attractive and scalable paradigm for implementing large-scale neural networks, opening new avenues in the realm of analog neural networks for artificial intelligence.
UR - https://www.scopus.com/pages/publications/105026086390
UR - https://ieeexplore.ieee.org/document/11313598/authors#authors
U2 - 10.1109/TCAD.2025.3647615
DO - 10.1109/TCAD.2025.3647615
M3 - Journal article
AN - SCOPUS:105026086390
SN - 0278-0070
JO - IEEE Transactions on Computer-Aided Design of Integrated Circuits and Systems
JF - IEEE Transactions on Computer-Aided Design of Integrated Circuits and Systems
ER -