Researchers at the University of Utah have developed an optical neural engine (ONE) that tackles partial differential equations (PDEs) with remarkable speed and energy efficiency. If you’ve ever struggled with sluggish, resource-heavy simulations, this breakthrough might just change the game.
Instead of relying on traditional digital computation, the team uses diffractive optical neural networks and optical matrix multipliers to encode the equations into light. Variables are represented by physical properties of the light field, such as intensity and phase, and as light propagates through the engine's components, it is directly transformed into the solution.
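The core idea can be sketched numerically. The following toy model (my own illustration, not the authors' actual system) treats an optical matrix multiplier as a complex-valued transmission matrix acting on a light field: the input vector is encoded in the field's amplitude, propagation through the element performs the multiplication, and photodetectors read out intensity.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy sketch: an optical matrix multiplier modeled as a complex
# transmission matrix W acting on the incoming light field.
n_in, n_out = 4, 3

# Hypothetical "trained" optical element (random here for illustration).
W = rng.normal(size=(n_out, n_in)) + 1j * rng.normal(size=(n_out, n_in))

# Encode an input vector into the optical field (amplitude encoding).
x = np.array([0.2, 0.5, 0.1, 0.8])
field_in = x.astype(complex)

# Light propagating through the element performs the multiply physically;
# here we emulate that with an ordinary matrix-vector product.
field_out = W @ field_in

# Photodetectors measure intensity (|field|^2), giving the readout.
intensity = np.abs(field_out) ** 2
print(intensity.shape)
```

In the real device, the matrix entries are set by fabricated or programmable optical elements, so the multiplication costs essentially no digital arithmetic once the light is launched.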
The researchers demonstrated the engine's prowess on several complex PDEs, including Darcy flow (which models fluid movement through porous media), the magnetostatic Poisson equation, and even the Navier-Stokes equations that govern fluid dynamics. Led by Assistant Professor Weilu Gao and Ph.D. candidate Ruiyang Chen, their work, published in Nature Communications, opens up new avenues for efficient, large-scale scientific computations.
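For context, here is the kind of computation a conventional digital solver performs for such problems: a minimal finite-difference solve of the 1D Poisson problem -u''(x) = f(x) with zero boundary values (my own simplified baseline; the paper's problems are higher-dimensional and solved optically).

```python
import numpy as np

# Solve -u''(x) = f(x) on (0, 1) with u(0) = u(1) = 0
# by central finite differences on a uniform grid.
n = 99                              # interior grid points
h = 1.0 / (n + 1)
x = np.linspace(h, 1 - h, n)
f = np.pi**2 * np.sin(np.pi * x)    # chosen so the exact solution is sin(pi x)

# Tridiagonal stiffness matrix approximating -d^2/dx^2.
A = (np.diag(2.0 * np.ones(n)) -
     np.diag(np.ones(n - 1), 1) -
     np.diag(np.ones(n - 1), -1)) / h**2

u = np.linalg.solve(A, f)
error = np.max(np.abs(u - np.sin(np.pi * x)))
print(error)
```

Dense solves like this scale poorly as grids grow, which is exactly the cost the optical engine aims to sidestep by computing with light instead.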
Former team member Yingheng Tang, now a research scientist at Lawrence Berkeley National Laboratory, explained that the optical approach not only speeds up simulation but also cuts energy use. This shift from conventional methods could significantly impact fields like geology and chip design, where fast, effective modeling is essential.