Introduction
While artificial intelligence (AI) has proven incredibly valuable for numerous applications, ranging from robotics and medicine to economics and computational cognition, it is in many ways dwarfed by biological intelligence. For example, a cockatiel parrot can navigate and learn unknown environments at 35 km/h, manipulate objects, and use human language, with a brain consuming merely 50 mW of power. By comparison, an autonomous drone of comparable mass and size consumes 5,000 mW of power while being limited to pretrained flight in a known environment, with limited capacity for real-time learning. Deep learning with artificial neural networks (ANNs) is a predominant method in AI. ANNs, however, are limited to slow generalization over massive data, offline training, and batched optimization. In contrast, biological learning is characterized by fast generalization and online, incremental learning. Spiking neural networks (SNNs) closely follow the computational characteristics of biological learning and stand as a new frontier of AI. SNNs comprise densely connected, physically implemented “silicon neurons,” which communicate with spikes. SNNs have been realized in various hardware designs, including IBM’s TrueNorth, Intel’s Loihi, NeuroGrid, SpiNNaker, and BrainDrop.
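To make spike-based communication concrete, the sketch below simulates a leaky integrate-and-fire (LIF) neuron, the standard abstraction behind many silicon-neuron designs. All parameter values here are illustrative defaults, not taken from any of the chips above.

```python
import numpy as np

def simulate_lif(current, dt=1e-4, tau_rc=0.02, tau_ref=0.002, v_th=1.0):
    """Euler simulation of a leaky integrate-and-fire neuron.

    Returns spike times (seconds). Parameters are illustrative:
    tau_rc is the membrane time constant, tau_ref the refractory period.
    """
    v, refractory = 0.0, 0.0
    spikes = []
    for step in range(len(current)):
        if refractory > 0.0:          # neuron is silent right after a spike
            refractory -= dt
            continue
        # leaky integration: voltage decays toward the input current
        v += (dt / tau_rc) * (current[step] - v)
        if v >= v_th:                 # threshold crossing emits a spike
            spikes.append(step * dt)
            v = 0.0
            refractory = tau_ref
    return spikes

# a constant suprathreshold input produces a regular spike train
spikes = simulate_lif(np.full(10000, 2.0))   # 1 s of input at dt = 0.1 ms
```

The interspike interval follows from the membrane time constant plus the refractory period, which is why a fixed input current yields a fixed firing rate.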
Programming a neuromorphic system is a challenging endeavor, as it requires the ability to represent, manipulate, and retrieve data with spike-based computing. One theoretical framework designed to address these challenges is the neural engineering framework (NEF). NEF provides a formalism for representing high-dimensional mathematical constructs with spiking neurons, enabling the implementation of functional large-scale neural networks. It has been used to design a broad spectrum of neuromorphic systems, ranging from vision processing to robotic control. NEF was compiled to work on each of the neuromorphic hardware designs listed above via Nengo, a Python-based “neural compiler,” which translates high-level descriptions to low-level neural models.
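To make NEF's representation principle concrete, the following NumPy sketch encodes a scalar with LIF tuning curves and recovers it with regularized least-squares decoders, following the standard NEF formulation. The gain and bias distributions are illustrative assumptions, not parameters of any particular system.

```python
import numpy as np

rng = np.random.default_rng(0)

def lif_rate(J, tau_rc=0.02, tau_ref=0.002):
    """Steady-state LIF firing rate for input current J (rate-mode NEF)."""
    out = np.zeros_like(J)
    active = J > 1.0
    out[active] = 1.0 / (tau_ref + tau_rc * np.log1p(1.0 / (J[active] - 1.0)))
    return out

# a population of n neurons represents a scalar x in [-1, 1]
n = 50
x = np.linspace(-1.0, 1.0, 41)              # sample points over the range
encoders = rng.choice([-1.0, 1.0], size=n)  # preferred directions (1-D case)
gains = rng.uniform(1.0, 5.0, size=n)       # illustrative gain/bias spread
biases = rng.uniform(-1.0, 3.0, size=n)

# encoding: tuning curves a_i(x) = G[alpha_i * e_i * x + b_i]
A = lif_rate(gains[:, None] * encoders[:, None] * x[None, :] + biases[:, None])

# decoding: ridge-regularized least squares yields linear decoders d,
# so that x_hat = d . a(x) reconstructs the represented value
lam = len(x) * (0.1 * A.max()) ** 2
d = np.linalg.solve(A @ A.T + lam * np.eye(n), A @ x)
x_hat = A.T @ d

rmse = np.sqrt(np.mean((x_hat - x) ** 2))   # reconstruction error
```

Transformation in NEF follows the same recipe: solving for decoders against f(x) instead of x yields a population that computes f.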
One of the most promising directions for neuromorphic systems is real-time continuous learning. A neuromorphic continuous-learning framework was recently shown to handle temporal dependencies spanning 100,000 time steps, converge rapidly, and use few internal state variables to learn complex functions spanning long time windows, outperforming state-of-the-art ANNs. Neuromorphic systems, however, can realize their full potential only when deployed on neuromorphic hardware. While NEF has previously been adapted for both digital and hybrid (analog/digital) neuromorphic circuitry, we propose a detailed, fully analog design for NEF-based online learning. Our circuit design utilizes OZ, an analog implementation of the NEF-inspired spiking neuron we recently proposed. OZ is a programmable spiking neuron that can support arbitrary response dynamics. We used online learning to represent high-dimensional mathematical constructs (encoding and decoding with spiking neurons), to transform one neuromorphic representation into another, and to implement complex dynamical behaviors. We further designed a circuit emulator, allowing the evaluation of our electrical designs at scale. We used the emulator to demonstrate adaptive learning-based control of a six-degree-of-freedom robotic arm. Our design supports the three fundamental principles of NEF (representation, transformation, and dynamics) and can therefore be of potential use for various neuromorphic systems.
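As a minimal illustration of error-driven online learning in the NEF setting, the sketch below applies a PES-style decoder update (Δd_i = κ·E·a_i) to a small rate-coded population. The rectified-linear responses, population parameters, and learning rate are illustrative assumptions, simplified from the spiking dynamics an analog implementation would use.

```python
import numpy as np

rng = np.random.default_rng(1)

# a small rate-coded population (rectified-linear responses for brevity;
# a spiking implementation would use LIF dynamics instead)
n = 30
encoders = rng.choice([-1.0, 1.0], size=n)
gains = rng.uniform(0.5, 2.0, size=n)
biases = rng.uniform(0.0, 1.0, size=n)

def rates(x):
    return np.maximum(gains * encoders * x + biases, 0.0)

# online learning: decoders start at zero and are nudged to reduce
# the representation error at every time step
decoders = np.zeros(n)
kappa = 1e-4                              # learning rate (illustrative)
for _ in range(5000):
    x = rng.uniform(-1.0, 1.0)            # randomly sampled stimulus
    a = rates(x)
    error = x - a @ decoders              # error signal E = x - x_hat
    decoders += kappa * error * a         # PES: delta d_i = kappa * E * a_i

# after training, the population decodes x with low residual error
xs = np.linspace(-1.0, 1.0, 21)
rmse = np.sqrt(np.mean([(rates(x) @ decoders - x) ** 2 for x in xs]))
```

Because the update depends only on locally available quantities (the error signal and each neuron's activity), it maps naturally onto per-neuron analog circuitry.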