Sen Cheng is head of the working group Computational Neuroscience. © RUB, Marquard

Memory: New model unites two rival theories

Computer models help researchers understand memory.

Neuroscientists are debating whether certain parts of human memory consist of two separate systems or just one. A new computer model, created by computational neuroscientists from Bochum, shows how the two theories can be reconciled. The research team led by Prof Dr Sen Cheng and Dr Jing Fang published its findings in the journal “Neural Computation”.

Neuroscientists distinguish between episodic memory, the part of the memory system that stores experiences, and semantic memory, which stores facts. If you recall drinking your first cup of coffee at home this morning, you are using your episodic memory. The knowledge of what a cup is is stored in your semantic memory.

Valid arguments on both sides

Whether one or two separate systems in the brain are responsible for these two forms of memory is currently under debate. There is experimental evidence for both theories.

The research group from Bochum therefore proposes a model in which two separate systems exist but are deeply intertwined. They translated their theory into a computer model.

In this model, the semantic system is active first and processes the raw data, that is, the sensory input we receive from our surroundings. In a second step, the episodic memory is activated and creates an abstract memory from the information provided by the semantic system. This memory is stored as a sequence of memory snippets, which Sen Cheng calls “snapshots”.
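The two-stage process described above can be sketched in code. Note that this is a minimal illustration, not the authors’ actual model: the class names, the dictionary-based semantic layer, and the list-of-snapshots episodic store are all simplifying assumptions made for this example.

```python
# Illustrative sketch (not the published model): a semantic layer first
# abstracts raw sensory input; an episodic store then records the result
# as a sequence of "snapshots".

class SemanticLayer:
    """Maps raw sensory input to abstract concepts (assumed simple lookup)."""

    def __init__(self):
        self.concepts = {}  # raw input -> learned concept label

    def learn(self, raw, concept):
        self.concepts[raw] = concept

    def encode(self, raw):
        # Familiar inputs map to a compact concept; unfamiliar ones stay raw.
        return self.concepts.get(raw, raw)


class EpisodicStore:
    """Stores each experience as a sequence of semantic snapshots."""

    def __init__(self):
        self.episodes = []

    def store(self, raw_sequence, semantic):
        # The episodic memory is built from the semantic system's output,
        # not from the raw sensory data itself.
        snapshots = [semantic.encode(x) for x in raw_sequence]
        self.episodes.append(snapshots)

    def recall(self, index):
        return self.episodes[index]


semantic = SemanticLayer()
semantic.learn("hot dark liquid in ceramic vessel", "cup of coffee")

episodic = EpisodicStore()
episodic.store(
    ["wake up", "hot dark liquid in ceramic vessel", "leave home"],
    semantic,
)

print(episodic.recall(0))
# -> ['wake up', 'cup of coffee', 'leave home']
```

In this toy version, the semantic representation determines what the episodic snapshot looks like, which mirrors the intertwining of the two systems that the model proposes.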

Snapshots in the brain

Until now, it has been assumed that an episodic memory is stored as a single snapshot, much like a Polaroid photo of an experience. The RUB scientists believe that instead a series of snapshots is stored and combined into one episodic memory.
To support their theory, the scientists tested the predictions of their computer model in an experiment with human participants.

Predictions are confirmed

The participants had to remember how and in which direction eight abstract objects moved on the computer screen, a test of episodic memory. Before the test, they were allowed to interact with four of the eight objects in order to store them, according to the theory, in their semantic memory.

The tests showed that the participants remembered the mode and direction of an object’s movement more easily when they had seen the object before, that is, when it was represented in their semantic memory. This finding matches the prediction of the computer model and shows that semantic and episodic memory are as intertwined as the researchers expected. Further research is needed to show whether and how the two systems influence each other in both directions: “In the next step, we are going to investigate how the episodic memory influences the semantic memory,” says Cheng.

Funding

The study was conducted within the Collaborative Research Centre 874 at Ruhr-Universität Bochum. The interdisciplinary research group has been funded by the Deutsche Forschungsgemeinschaft (DFG) since 2010 and investigates how the brain processes sensory input into complex behavior and memory.

Original publication

Jing Fang, Naima Rüther, Christian Bellebaum, Laurenz Wiskott, Sen Cheng: The interaction between semantic representation and episodic memory, in: Neural Computation, 2017, DOI: 10.1162/neco_a_01044

Published

Wednesday
21 February 2018
2:26 pm

By

Judith Merkelt-Jedamzik

Translated by

Judith Merkelt-Jedamzik, Julia Crispin
