A Swinburne University of Technology led team has demonstrated the world’s fastest and most powerful optical neuromorphic processor for artificial intelligence (AI), operating at more than 10 trillion operations per second and capable of processing ultra-large-scale data.
The research published in the journal Nature represents an enormous leap forward for neural networks and neuromorphic processing in general.
Artificial neural networks, a key form of AI, can learn and perform complex operations with wide applications to computer vision, natural language processing, facial recognition, speech translation, playing strategy games, medical diagnosis, and many other areas. Inspired by the biological structure of the brain’s visual cortex system, artificial neural networks extract key features of raw data to predict properties and behaviour with unprecedented accuracy and simplicity.
Led by Swinburne’s Professor David Moss, Dr. Xingyuan (Mike) Xu (Swinburne, Monash University), and Distinguished Professor Arnan Mitchell from RMIT University, the team achieved an exceptional feat in optical neural networks: dramatically accelerating their computing speed and processing power.
The team demonstrated an optical neuromorphic processor operating more than 1000 times faster than any previous processor. The system also processed record-sized, ultra-large-scale images, enough to achieve full facial image recognition, something that other optical processors have been unable to accomplish.
“This breakthrough was achieved with ‘optical micro-combs’, as was our world-record internet data speed reported in May 2020,” said Professor Moss, Director of Swinburne’s Optical Sciences Centre and recently named one of Australia’s top research leaders in physics and mathematics in the field of optics and photonics by The Australian.
While state-of-the-art electronic processors such as the Google TPU can operate beyond 100 TeraOPs/s, this is done with tens of thousands of parallel processors. In contrast, the optical system demonstrated by the team uses a single processor and was achieved using a new technique of simultaneously interleaving the data in time, wavelength, and spatial dimensions through an integrated micro-comb source.
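The parallelism described above can be sketched numerically. The toy model below (an illustration only, not the authors' implementation; the function name and values are hypothetical) treats each micro-comb wavelength as one channel carrying a single weight, so that a full multiply-accumulate, the core operation of a convolutional layer, happens in one parallel step when the photodetector sums all channels:

```python
# Simplified, hypothetical model of wavelength-parallel computing:
# each comb wavelength carries one weight; the photodetector's output
# is the sum over all wavelength channels, i.e. a dot product.
import numpy as np

def comb_dot_product(weights, data):
    """Model one optical multiply-accumulate step.

    Channel i (one comb wavelength) carries weights[i] * data[i];
    summing the detected power of all channels yields the dot product.
    """
    channels = np.asarray(weights) * np.asarray(data)  # per-wavelength products
    return channels.sum()  # summation at the photodetector

# Example: a flattened 3x3 convolution kernel spread across 9 wavelengths,
# applied to one image patch serialised in time.
kernel = np.array([0.1, 0.2, 0.1, 0.2, 0.4, 0.2, 0.1, 0.2, 0.1])
patch = np.ones(9)
print(comb_dot_product(kernel, patch))  # → 1.6
```

In the real device the products are formed optically and many such dot products run concurrently across time, wavelength, and spatial dimensions; the sketch only shows why a single comb source can replace many parallel electronic multipliers.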
Micro-combs are relatively new devices that act like a rainbow made up of hundreds of high-quality infrared lasers on a single chip. They are much faster, smaller, lighter and cheaper than any other optical source, according to the study.
Co-lead author of the study, Dr. Xu, Swinburne alum and postdoctoral fellow with the Electrical and Computer Systems Engineering Department at Monash University, said, “This processor can serve as a universal ultrahigh-bandwidth front end for any neuromorphic hardware, optical or electronic-based, bringing massive-data machine learning for real-time ultrahigh-bandwidth data within reach.”
“We’re currently getting a sneak peek of how the processors of the future will look. It’s really showing us how dramatically we can scale the power of our processors through the innovative use of micro-combs,” Dr. Xu explained.
RMIT’s Professor Mitchell added, “This technology is applicable to all forms of processing and communications — it will have a huge impact. Long term we hope to realise fully integrated systems on a chip, greatly reducing cost and energy consumption.”
“Convolutional neural networks have been central to the artificial intelligence revolution, but existing silicon technology increasingly presents a bottleneck in processing speed and energy efficiency,” said a key supporter of the research team, Professor Damien Hicks, from Swinburne and the Walter and Eliza Hall Institute.
This breakthrough shows how new optical technology makes such networks faster and more efficient. According to the scientists, it is also a profound demonstration of the benefits of cross-disciplinary thinking: having the inspiration and courage to take an idea from one field and use it to solve a fundamental problem in another.
(This story has been published from a wire agency feed without modifications to the text.)