RFR: Extending autoassociative memories

Proposed by Cerenaut and the Whole Brain Architecture Initiative
(What is a Request for Research/RFR?)

Background

There is potential to extend Machine Learning toward more human-like capabilities by introducing an episodic memory, which enables one-shot learning and memory replay (consolidating memories for long-term storage without catastrophic interference).

Algorithms inspired by the hippocampus provide a promising approach; a recent example is AHA (Kowadlo 2019). AHA uses an autoassociative memory based on the well-known Hopfield network. However, autoassociative memories are generally limited by the following two constraints:

  • Batch vs incremental: they learn one batch at a time and must be reset between batches (illustrated in the sketch below)
  • Static vs sequences: they store static patterns and do not learn sequences
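
For concreteness, here is a minimal NumPy sketch of a standard binary Hopfield network with the outer-product Hebbian rule (the class and parameter names are ours, chosen for illustration). It shows the batch constraint: storing a new batch of patterns resets the weights, discarding whatever was learned before.

```python
import numpy as np

class HopfieldNetwork:
    """Minimal binary Hopfield network with batch Hebbian learning."""

    def __init__(self, n_units):
        self.n = n_units
        self.w = np.zeros((n_units, n_units))

    def store_batch(self, patterns):
        """Store a batch of {-1, +1} patterns via the outer-product rule.
        The batch constraint: weights are reset first, so previously
        stored patterns are lost when a new batch arrives."""
        self.w = np.zeros((self.n, self.n))
        for p in patterns:
            self.w += np.outer(p, p)
        np.fill_diagonal(self.w, 0)  # no self-connections
        self.w /= len(patterns)

    def recall(self, cue, steps=10):
        """Iterate the dynamics from a (possibly noisy) cue until it
        settles near the closest stored pattern."""
        s = cue.astype(float).copy()
        for _ in range(steps):
            s = np.sign(self.w @ s)
            s[s == 0] = 1
        return s
```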

Autoassociative memories have many other applications, so removing these constraints would have broad impact.

Relatively little research has addressed these constraints; a good summary is given by Shen 2010. Existing projects take very different approaches, such as SOMs (Yamada 1999), growing networks (Sudo 2009), multi-layer fully connected ANNs (Shen 2010), or multiple Hopfield networks (Maurer 2005).

A promising model of the autoassociative component of the hippocampus provides inspiration for improving a Hopfield network (Jensen 1996a, 1996b).

Aim and Outline

The aim of this project is to extend a Hopfield network:

  • to learn incrementally, so that it continues to learn new samples while forgetting old ones, and
  • to learn sequences as well as static samples (i.e. it becomes hetero-associative)

The proposed approaches are:

  1. Incremental Learning:
    Devise a Hebbian-like learning rule with a forgetting factor, and a configuration for incremental learning (see the first sketch below).
  2. Sequential Learning:
    Use the ideas of the Jensen 1996 model, in which a decaying period of plasticity follows each sample, to learn connections between time-consecutive samples and hence enable sequential learning (see the second sketch below).
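
A minimal sketch of approach 1, assuming a simple exponential forgetting factor (the `forget` and `rate` values here are illustrative assumptions, to be tuned experimentally):

```python
import numpy as np

def store_incremental(w, pattern, forget=0.95, rate=0.05):
    """Hebbian-like update with a forgetting factor. Old memories decay
    geometrically (forget < 1) while each new {-1, +1} pattern is
    written in, so no reset between samples is required."""
    w = forget * w + rate * np.outer(pattern, pattern)
    np.fill_diagonal(w, 0)  # keep the no-self-connection convention
    return w
```

With forget < 1 the network gradually overwrites its oldest memories, rather than failing catastrophically when its capacity is exceeded.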

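A rough sketch of approach 2, loosely inspired by the decaying-plasticity idea in Jensen 1996: a trace of recent activity decays over time, and each new pattern is bound to that trace, producing asymmetric (hetero-associative) weights between time-consecutive samples. The `decay` and `rate` parameters are our assumptions for illustration:

```python
import numpy as np

def learn_sequence(patterns, decay=0.5, rate=0.1):
    """Build asymmetric weights linking time-consecutive {-1, +1}
    patterns. A plasticity trace decays after each sample, so the
    current pattern becomes associated with a fading copy of its
    predecessors."""
    n = patterns[0].size
    w = np.zeros((n, n))
    trace = np.zeros(n)
    for p in patterns:
        w += rate * np.outer(p, trace)  # bind current pattern to the trace
        trace = decay * trace + p       # decay, then add the new sample
    return w

def recall_next(w, pattern):
    """Cue with the current pattern to retrieve the next one."""
    s = np.sign(w @ pattern)
    s[s == 0] = 1
    return s
```

Because the weights are asymmetric, cueing with pattern t retrieves pattern t+1, which is exactly the hetero-associative behaviour the project aims for.
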
Successful execution of this project would likely result in publishable work.

Status

Open

URLs and References

Jensen, O., & Lisman, J. E. (1996a). Theta/gamma networks with slow NMDA channels learn sequences and encode episodic memory: Role of NMDA channels in recall. Learning & Memory, 3(2–3), 264–278. https://doi.org/10.1101/lm.3.2-3.264

Jensen, O., & Lisman, J. E. (1996b). Hippocampal CA3 region predicts memory sequences: Accounting for the phase precession of place cells. Learning & Memory, 3(2–3), 279–287. https://doi.org/10.1101/lm.3.2-3.279

Kowadlo, G., Ahmed, A., & Rawlinson, D. (2019). AHA! An “Artificial Hippocampal Algorithm” for Episodic Machine Learning. Retrieved from http://arxiv.org/abs/1909.10340

Maurer, A., Hersch, M., & Billard, A. G. (2005). Extended Hopfield network for sequence learning: Application to gesture recognition. Lecture Notes in Computer Science, 3696, 493–498. https://doi.org/10.1007/11550822_77

Shen, F., Yu, H., Kasai, W., & Hasegawa, O. (2010). An associative memory system for incremental learning and temporal sequence. Proceedings of the International Joint Conference on Neural Networks, 1–8. https://doi.org/10.1109/IJCNN.2010.5596780

Sudo, A., Sato, A., & Hasegawa, O. (2009). Associative memory for online learning in noisy environments using self-organizing incremental neural network. IEEE Transactions on Neural Networks.

Yamada, T., Hattori, M., & Ito, H. (1999). Sequential learning for associative memory using Kohonen feature map. Proceedings of the International Joint Conference on Neural Networks (IJCNN’99).