27 July 2022
A new AI system uses associative learning techniques rather than AI's traditional neural networks, challenging the conventional wisdom that artificial neurons and synapses are the sole building blocks of AI.
Researchers at Oxford University’s Department of Materials, working in collaboration with colleagues from Exeter and Münster, have developed an on-chip optical processor capable of detecting similarities in datasets up to 1,000 times faster than conventional machine learning algorithms running on electronic processors.
The new research published in Optica took its inspiration from Nobel Prize laureate Ivan Pavlov’s discovery of classical conditioning. In his experiments, Pavlov found that by providing another stimulus during feeding, such as the sound of a bell or metronome, his dogs began to link the two experiences and would salivate at the sound alone. The repeated associations of two unrelated events paired together could produce a learned response – a conditional reflex.
Co-first author Dr James Tan You Sian, who did this work as part of his DPhil in the Department of Materials, University of Oxford, said: ‘Pavlovian associative learning is regarded as a basic form of learning that shapes the behaviour of humans and animals – but its adoption in AI systems is largely unheard of. Our research on Pavlovian learning in tandem with optical parallel processing demonstrates the exciting potential for a variety of AI tasks.’
The neural networks used in most AI systems often require a substantial number of data examples during the learning process – training a model to reliably recognise a cat could use up to 10,000 cat/non-cat images – at a considerable computational and processing cost.
Rather than relying on the backpropagation favoured by neural networks to ‘fine-tune’ results, the Associative Monadic Learning Element (AMLE) uses a memory material that learns patterns in order to associate similar features in datasets – mimicking, in the case of a ‘match’, the conditional reflex observed by Pavlov.
The AMLE inputs are paired with the correct outputs to supervise the learning process, and the memory material can be reset using light signals. In testing, the AMLE could correctly identify cat/non-cat images after being trained with just five pairs of images.
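The AMLE itself is an optical hardware element, but the principle it embodies – record a small number of supervised input/output pairings, then recognise new inputs by their similarity to stored associations, with no backpropagation and a resettable memory – can be illustrated in software. The sketch below is a toy analogue under stated assumptions: the feature vectors, similarity measure, and class names are invented for illustration and are not from the paper.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

class AssociativeLearner:
    """Toy associative learner: store (features, label) pairings,
    then classify a new input by its closest stored association."""

    def __init__(self):
        self.memory = []  # list of (features, label) associations

    def associate(self, features, label):
        # 'Training' is simply recording the supervised pairing --
        # no backpropagation, no iterative weight updates.
        self.memory.append((features, label))

    def reset(self):
        # Software analogue of resetting the memory material.
        self.memory.clear()

    def classify(self, features):
        # A 'match' is the stored association most similar to the input.
        best = max(self.memory, key=lambda m: cosine_similarity(m[0], features))
        return best[1]

learner = AssociativeLearner()
# Five labelled pairs, echoing the article's five-image cat/non-cat test
# (the three-element feature vectors here are entirely made up).
learner.associate([0.9, 0.8, 0.1], "cat")
learner.associate([0.8, 0.9, 0.2], "cat")
learner.associate([0.1, 0.2, 0.9], "non-cat")
learner.associate([0.2, 0.1, 0.8], "non-cat")
learner.associate([0.85, 0.75, 0.15], "cat")

print(learner.classify([0.9, 0.85, 0.1]))  # nearest stored pairing wins
```

Note the contrast with a neural network: there is no training loop at all, which is why so few examples suffice when the features themselves are already discriminative.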
The considerable performance capabilities of the new optical chip over a conventional electronic chip are down to two key differences in design:
- a unique network architecture incorporating associative learning as a building block rather than using neurons and a neural network
- the use of ‘wavelength-division multiplexing’ to send multiple optical signals on different wavelengths on a single channel to increase computational speed.
The chip hardware uses light to send and retrieve data, maximising information density – several signals on different wavelengths are sent simultaneously for parallel processing, which increases the detection speed of recognition tasks; each additional wavelength adds another parallel computation.
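The parallelism that wavelength-division multiplexing provides can be modelled in a few lines: several independent signals share one physical channel, each carried on its own wavelength, and each wavelength is processed independently. The sketch below is a software model only – in the optical chip the per-wavelength operations happen simultaneously in hardware, whereas here a loop merely mimics their independence; the wavelength values, signals, and weights are illustrative assumptions, not figures from the paper.

```python
# Toy model of wavelength-division multiplexing (WDM).

def multiplex(signals):
    """Combine per-wavelength signals into one 'channel',
    modelled as a dict keyed by wavelength (nm)."""
    return dict(signals)

def process_channel(channel, weights):
    """Apply the same weighted-sum operation to every wavelength.
    On the optical chip these run concurrently; this comprehension
    only captures that each wavelength is processed independently."""
    return {wl: sum(x * w for x, w in zip(sig, weights))
            for wl, sig in channel.items()}

# Three signals multiplexed onto one channel (wavelengths are made up).
channel = multiplex({
    1550: [1.0, 0.5, 0.2],
    1551: [0.3, 0.9, 0.4],
    1552: [0.6, 0.1, 0.8],
})
results = process_channel(channel, weights=[0.5, 1.0, 2.0])
print(results)
```

Doubling the number of wavelengths in the channel doubles the work done per pass – which is the source of the speed-up the article describes.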
Professor Wolfram Pernice, co-author from Münster University explained: ‘The device naturally captures similarities in datasets while doing so in parallel using light to increase the overall computation speed – which can far exceed the capabilities of conventional electronic chips.’
An associative learning approach could complement neural networks rather than replace them, clarified co-first author Professor Zengguang Cheng, now at Fudan University.
‘It is more efficient for problems that don’t need substantial analysis of highly complex features in the datasets,’ said Professor Cheng. ‘Many learning tasks are volume-based and don’t have that level of complexity – in these cases, associative learning can complete the tasks more quickly and at a lower computational cost.’
‘It is increasingly evident that AI will be at the centre of many innovations we will witness in the coming phase of human history. This work paves the way towards realising fast optical processors that capture data associations for particular types of AI computations, although there are still many exciting challenges ahead,’ said Professor Harish Bhaskaran, who led the study.
The full paper, ‘Monadic Pavlovian associative learning in a backpropagation-free photonic network,’ is available in the journal Optica.
Notes to editors
For further information or to arrange an interview, please contact the University of Oxford press office at email@example.com or on +44 (0)1865 280528
The paper ‘Monadic Pavlovian associative learning in a backpropagation-free photonic network’, published in Optica is available here: https://doi.org/10.1364/OPTICA.455864
Images and video
Images and a short explainer are available to download: https://www.dropbox.com/sh/q6tl6htu9z4urds/AABl0GS01uW25TZLObS6iC1Wa?dl=0
About the University of Oxford
Oxford University has been placed number 1 in the Times Higher Education World University Rankings for the sixth year running, and number 2 in the QS World Rankings 2022. At the heart of this success is our ground-breaking research and innovation.
Oxford is world-famous for research excellence and home to some of the most talented people from across the globe. Our work helps the lives of millions, solving real-world problems through a huge network of partnerships and collaborations. The breadth and interdisciplinary nature of our research sparks imaginative and inventive insights and solutions.
Through its research commercialisation arm, Oxford University Innovation, Oxford is the highest university patent filer in the UK and is ranked first in the UK for university spinouts, having created more than 200 new companies since 1988. Over a third of these companies have been created in the past three years. The university is a catalyst for prosperity in Oxfordshire and the United Kingdom, contributing £15.7 billion to the UK economy in 2018/19, and supports more than 28,000 full time jobs.