Pigeons classify what they see into categories and thus help research. © SFB874/Susanne Troll

Biopsychology

Exploring the strategies of categorisation

Using a novel research method, Bochum-based research team identifies universal principles in categorisation learning.

Our mental ability to divide the complex world into categories makes our daily life much easier. But how do we categorise? What kind of stimulus properties do we assess? Researchers at Ruhr-Universität Bochum (RUB) have come a step closer to answering these questions with the help of pigeons. They discovered that birds use different strategies to successfully learn categories. To gather data, the researchers used a novel research method. To this end, they combined so-called virtual phylogenesis, in which artificial stimuli are generated by computers, with a machine learning approach, namely an automated evaluation of the birds’ pecking behaviour. They have published the findings of their research in the January issue of the journal Animal Cognition.

Categorisation harnesses knowledge for new experiences

Colloquially referred to as pigeonholing, categorisation learning often has a rather negative connotation in the public eye. Yet the basic cognitive ability to categorise offers a significant advantage: it condenses the flood of objects and events in our environment on the basis of commonalities and makes the knowledge that we have accumulated usable for new experiences.

In the field of science, the aspects of stimuli that determine classification into a category have long been the subject of controversial debate. The study conducted by the Bochum-based research team now offers insights into this question – through a research approach using computer-generated stimuli in combination with a machine learning analysis of the pecking behaviour of pigeons. “We specialise in working with these animals,” points out Dr. Roland Pusch, lead author of the study. “Pigeons have a highly developed visual system and show excellent performance in behavioural tests. This makes them an excellent model system to tackle this question.”

Pigeons’ specific pecking behaviour facilitates detailed analysis

The biopsychologists trained the pigeons to distinguish between digitally produced images on a screen and divide them into categories by pecking at the monitor. “We precisely defined the properties of the image stimuli,” says Pusch, outlining the process. “Through so-called virtual phylogenesis, we created two object families with 20 members each on the computer. Based on its properties, each object clearly belonged to family X or family Y and could thus be categorised accordingly by the animals.” “The trump card in our research series was the specific pecking behaviour of pigeons,” adds project leader Professor Onur Güntürkün. “After training, pigeons use pecking to indicate whether an object belongs to a category or not. At the same time, they also mark exactly the spot on the object that was decisive for their categorisation choice.”

Based on the automated recording, the researchers pinpointed the locations on the objects that the pigeons touched when they made their choices on the monitor. “The pecking behaviour of individual animals was very consistent. This leads us to the conclusion that the animals attach importance to very specific characteristics of the stimuli,” says Pusch. “Interestingly enough, despite identical behaviour, these preferences are different in each individual; in other words, each pigeon has its very own specific characteristics that it considers important in the two families of objects. This suggests that categorisation learning is not limited to a single learning strategy.”

According to Pusch and Güntürkün, the combination of virtual phylogenesis and the machine learning approach offers a lot of potential for subsequent research in the field of categorisation learning. For example, the method opens up the possibility of studying species-specific behavioural strategies in comparative experiments in addition to its sensory basis. Beyond the behavioural analysis, neuronal processes that trigger categorisation learning in the brain could also be explored in detail.

Funding

The study was funded by the Collaborative Research Centre 874 (SFB 874) of the German Research Foundation. The SFB 874 “Integration and Representation of Sensory Processes” has existed at Ruhr-Universität Bochum since 2010. Its researchers examine how sensory signals generate neuronal maps, and how this leads to complex behaviour and memory formation.

Original publication

Roland Pusch, Julian Packheiser, Charlotte Koenen, Fabrizio Iovine, Onur Güntürkün: Digital embryos: a novel technical approach to investigate perceptual categorization in pigeons (Columba livia) using machine learning, in Animal Cognition, 2022, DOI: 10.1007/s10071-021-01594-1

Press contact

Dr. Roland Pusch
Biopsychology
Faculty of Psychology
Ruhr-Universität Bochum
Germany
Phone: +49 234 32 24037
Email: roland.pusch@rub.de

Prof. Dr. Dr. h.c. Onur Güntürkün
Biopsychology
Faculty of Psychology
Ruhr-Universität Bochum
Germany
Phone: +49 234 32 26213
Email: onur.guentuerkuen@rub.de


Published

Thursday
03 February 2022
9:17 am

By

Anke Maes

Translated by

Anke Maes
