Another day, another review. This time I return to the short-review series with a paper close to my scientific and career background. I have already posted here about photonics and its applications in information processing, so readers of The Information Age should not find the topic unfamiliar.
The paper I briefly review here also appeared in the MIT Technology Review's weekly Arxiv papers section. It continues our journey through current efforts in neural networks, and it is another breakthrough in the area. This time we are presented with work by another large group of researchers, this one on optical computing: specifically, the deeply interesting possibilities of neuromorphic computing with photonic chips. As the MIT review notes, all-optical computing has long been a dream of computer science in general. The paper deals with what the authors call a 'neural compiler', placing it at the heart of scientific computing, at its very mechanics.
The MIT review also cautions against undue enthusiasm, and rightly so. We are indeed at the very beginning of what might be a wave of breakthroughs in computing with neural networks, but given the complexity and sensitivity of the issues involved, misguided decisions could in the end compromise achievements or delay significant applications.
Neural networks are taking the world of computing by storm. Researchers have used them to create machines that are learning a huge range of skills that had previously been the unique preserve of humans—object recognition, face recognition, natural language processing, machine translation. All these skills, and more, are now becoming routine for machines.
So there is great interest in creating more capable neural networks that can push the boundaries of artificial intelligence even further. The focus of this work is in creating circuits that operate more like neurons, so-called neuromorphic chips. But how to make these circuits significantly faster?
Optical computing has long been the great dream of computer science. Photons have significantly more bandwidth than electrons and so can process more data more quickly. But the advantages of optical data processing systems have never outweighed the additional cost of making them, and so they have never been widely adopted.
That has started to change in some areas of computing, such as analog signal processing, which requires the kind of ultrafast data processing that only photonic chips can provide.
At the heart of the challenge is to produce an optical device in which each node has the same response characteristics as a neuron. The nodes take the form of tiny circular waveguides carved into a silicon substrate in which light can circulate. When released this light then modulates the output of a laser working at threshold, a regime in which small changes in the incoming light have a dramatic impact on the laser’s output.
Crucially, each node in the system works with a specific wavelength of light—a technique known as wave division multiplexing. The light from all the nodes can be summed by total power detection before being fed into the laser. And the laser output is fed back into the nodes to create a feedback circuit with a non-linear character.
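The weighted-sum-then-nonlinearity structure described above can be sketched in a few lines. This is only an illustrative toy, not the paper's device model: the channel powers, weights, and the steep sigmoid standing in for the laser-at-threshold response are all made-up values.

```python
import numpy as np

# Illustrative sketch of one broadcast-and-weight node. Each upstream node
# emits on its own wavelength (WDM); a weight bank attenuates each channel,
# total power detection sums them, and a laser biased at threshold applies
# a sharp nonlinearity to the summed signal.

def laser_at_threshold(total_power, threshold=1.0, sharpness=10.0):
    """Stand-in for the laser response: near threshold, small changes in
    incoming power produce large output swings (modeled as a steep sigmoid)."""
    return 1.0 / (1.0 + np.exp(-sharpness * (total_power - threshold)))

# Powers arriving on three WDM channels (arbitrary units) and their weights.
channel_powers = np.array([0.4, 0.9, 0.3])
weights = np.array([0.5, 0.8, -0.6])  # a negative weight = balanced detection

total = np.dot(weights, channel_powers)  # total power detection sums channels
output = laser_at_threshold(total)       # result is fed back into the network
print(output)
```

The key point the sketch captures is that the summation is done physically (all wavelengths land on one detector), so the only per-node electronics needed are the weights and the threshold element.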
An important question is just how closely this non-linearity mimics neural behavior. Tait and co measure the output and show that it is mathematically equivalent to a device known as a continuous-time recurrent neural network. “This result suggests that programming tools for CTRNNs could be applied to larger silicon photonic neural networks,” they say.
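For readers unfamiliar with the model class, a continuous-time recurrent neural network is defined by a simple differential equation per node. The sketch below shows that equation and integrates a two-node example; the weights, inputs, and time constant are hypothetical values for illustration, not parameters from the paper.

```python
import numpy as np

# CTRNN dynamics: tau * dx_i/dt = -x_i + sum_j W_ij * sigma(x_j) + u_i
# where sigma is a saturating nonlinearity. All numbers below are illustrative.

def sigma(x):
    return np.tanh(x)  # smooth saturating nonlinearity

def ctrnn_step(x, W, u, tau=1.0, dt=0.01):
    """One forward-Euler step of the CTRNN dynamics."""
    dxdt = (-x + W @ sigma(x) + u) / tau
    return x + dt * dxdt

# Two-node example: with weak symmetric coupling and constant input,
# the state relaxes to a fixed point.
W = np.array([[0.0, 0.5], [0.5, 0.0]])
u = np.array([0.2, -0.1])
x = np.zeros(2)
for _ in range(5000):  # integrate to t = 50
    x = ctrnn_step(x, W, u)
print(x)
```

The isomorphism claim means that a tool which solves for `W` and `u` in this model (a "neural compiler") can, in principle, be mapped directly onto microring weight settings on the chip.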
The results show just how fast photonic neural nets can be. “The effective hardware acceleration factor of the photonic neural network is estimated to be 1,960 × in this task,” say Tait and co. That’s a speed up of three orders of magnitude.
That opens the doors to an entirely new industry that could bring optical computing into the mainstream. “Silicon photonic neural networks could represent first forays into a broader class of silicon photonic systems for scalable information processing,” say Tait and co.
Of course much depends on how well the first generation of electronic neuromorphic chips perform. Photonic neural nets will have to offer significant advantages to be widely adopted and will therefore require much more detailed characterization. Clearly, there are interesting times ahead for photonics.
We report first observations of an integrated analog photonic network, in which connections are configured by microring weight banks, as well as the first use of electro-optic modulators as photonic neurons. A mathematical isomorphism between the silicon photonic circuit and a continuous neural model is demonstrated through dynamical bifurcation analysis. Exploiting this isomorphism, existing neural engineering tools can be adapted to silicon photonic information processing systems. A 49-node silicon photonic neural network programmed using a “neural compiler” is simulated and predicted to outperform a conventional approach 1,960-fold in a toy differential system emulation task. Photonic neural networks leveraging silicon photonic platforms could access new regimes of ultrafast information processing for radio, control, and scientific computing.
Further reading is always recommended, especially concerning the topics crucial for this paper: Hopf bifurcation and wavelength-division multiplexing. The authors conclude with this paragraph:
We have demonstrated a reconfigurable analog neural network in a silicon photonic integrated circuit using modulators as neuron elements. Network-mediated cusp and Hopf bifurcations were observed as a first proof-of-concept of an integrated broadcast-and-weight system. Neural network abstractions are powerful tools for bridging the gap between physical dynamics and useful application, and silicon photonic manufacturing introduces opportunities for large-scale photonic systems. Simulations of a 49 modulator neuron network performing an emulation task estimated a 1,960× speedup over a verified CPU benchmark. At increased scale, silicon photonic neural networks could be applied to unaddressed computational areas requiring ultrafast, reconfigurable, and efficient hardware processors. Furthermore, silicon photonic neural networks could represent first forays into a broader class of silicon photonic systems for scalable information processing.
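Since the Hopf bifurcation is central to the paper's proof of concept, here is the textbook normal form of that bifurcation (not the paper's device equations): as a parameter μ crosses zero, a stable fixed point gives way to a stable limit cycle of radius √μ, which is the qualitative onset of oscillation observed in the photonic network.

```python
import numpy as np

# Hopf normal form:
#   dx/dt = mu*x - omega*y - x*(x^2 + y^2)
#   dy/dt = omega*x + mu*y - y*(x^2 + y^2)
# For mu < 0 the origin is stable; for mu > 0 trajectories settle onto a
# limit cycle of radius sqrt(mu).

def simulate(mu, omega=1.0, dt=0.001, steps=50000):
    x, y = 0.1, 0.0  # small perturbation from the origin
    for _ in range(steps):
        r2 = x * x + y * y
        dx = mu * x - omega * y - x * r2
        dy = omega * x + mu * y - y * r2
        x, y = x + dt * dx, y + dt * dy
    return np.hypot(x, y)  # final oscillation amplitude

print(simulate(mu=-0.1))   # below the bifurcation: amplitude decays toward 0
print(simulate(mu=0.25))   # above: amplitude settles near sqrt(0.25) = 0.5
```

Observing this transition experimentally, mediated by the network weights rather than by a single device parameter, is what makes the demonstration a network-level rather than a component-level result.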
Text images: MIT Technology Review and the paper reviewed
Featured Image: Photonic Wirebonding for Optical Multi-Chip Systems (PHOIBOS)