This is what executives inadvertently reveal

Their voices hint at the profit expectations of their companies. A team from Bochum is using artificial intelligence to investigate this phenomenon.

Company executives know more than they say – and this is exactly what analysts and investors want to know. An interdisciplinary team from the economics and engineering departments at Ruhr University Bochum, Germany, has shown that all you have to do is listen very closely: the voice unintentionally reveals information that can be used to predict whether a company will make a profit or a loss in the following year more accurately than conventional methods based on published figures. To this end, the team led by Dr. Doron Reichmann, head of the FAACT research group (FAACT stands for Finance/Accounting/Auditing/Controlling/Taxation) at the Faculty of Management and Economics, used machine learning. With the help of their models, the researchers managed to beat the usual capital market returns by nine percentage points. They published their study online on the SSRN platform on 28 December 2022.

No-one is in complete control of their voice

When executives talk about their company’s performance and profit expectations, they are not one hundred per cent transparent: they want to present things in a favourable light, especially to potential investors and analysts. The latter, on the other hand, want as realistic a picture as possible and read between the lines. “Savvy analysts seek personal contact with executives in order to pick out moods from the conversation,” says Doron Reichmann. “They know that the voice reveals information. The formation of language is such a complex process that no-one is in complete control of their voice.”

However, it’s not easy to describe exactly what they are looking for. “Machine learning is particularly useful when we, as humans, are unable to define precise rules for certain processes,” says Charlotte Knickrehm from the Chair for Industrial Sales and Service Engineering. “For example, we can infer emotions from voices. It’s quite an easy thing to do – but we don’t know how, exactly, we do it.” She and her colleagues use machine learning techniques to predict from statements made by executives whether their company will make a profit or a loss in the following year.

More than 8,000 recordings

To this end, they used over 8,000 recordings made during earnings conference calls, where executives discuss the financial performance of their companies with investors and analysts. These audio files were converted into spectrograms that visualise the acoustic information. The spectrograms then served as input to a neural network that can recognise patterns in images. The researchers trained it with data from 2015 to 2020, feeding it both the spectrograms and the information on whether the respective company had made a profit or a loss in the year following the recording.
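The study itself does not publish its code; the following is a minimal sketch of such a pipeline, assuming the call recordings are available as audio files and using librosa for the spectrograms and a small PyTorch convolutional network. All file handling, layer sizes and hyperparameters are illustrative assumptions, not the Bochum team's actual setup.

```python
# Minimal sketch: earnings-call audio -> mel spectrogram -> CNN profit/loss classifier.
# Architecture and hyperparameters are assumptions for illustration only.
import librosa
import numpy as np
import torch
import torch.nn as nn

def audio_to_spectrogram(path: str) -> np.ndarray:
    """Load an audio file and convert it to a log-scaled mel spectrogram."""
    signal, sr = librosa.load(path, sr=16_000)
    mel = librosa.feature.melspectrogram(y=signal, sr=sr, n_mels=128)
    return librosa.power_to_db(mel, ref=np.max)

class SpectrogramCNN(nn.Module):
    """Toy convolutional network mapping a spectrogram to a single profit/loss logit."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.AdaptiveAvgPool2d(1),
        )
        self.classifier = nn.Linear(32, 1)  # 1 logit: profit (1) vs. loss (0)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.classifier(self.features(x).flatten(1))

# Training would pair each spectrogram with the label "profit or loss in the
# following year" and minimise a binary cross-entropy loss, e.g.:
# loss = nn.BCEWithLogitsLoss()(model(batch_spectrograms), batch_labels)
```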

And it worked: AI was six to nine percentage points more accurate in predicting whether a company would make a profit or loss in the following year than conventional models based on the company’s published figures.

Combination of analyst estimates and AI is most effective

In the second step, the researchers combined their predictions with analysts’ estimates of profit or loss. “This improved the predictions considerably once again,” points out Doron Reichmann. “The combination was 40 per cent more accurate than the analysts’ estimate alone. This means that if an analyst is ten per cent better at estimating whether a company will make a profit or loss than a coin toss, the combination with our method is 14 per cent better. That doesn’t sound like much, but it makes a lot of difference in financial markets.”
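To see where the quoted 14 per cent figure comes from, here is a back-of-the-envelope version of that arithmetic; the 50 per cent coin-toss baseline is the only added assumption.

```python
# Illustration of the "40 per cent more accurate" claim relative to a coin toss.
coin_toss = 0.50                      # baseline accuracy of a random guess
analyst_edge = 0.10                   # analyst is 10 points better than a coin toss
combined_edge = analyst_edge * 1.40   # the combination improves that edge by 40%

print(f"Analyst alone:         {coin_toss + analyst_edge:.0%}")   # 60%
print(f"Analyst + voice model: {coin_toss + combined_edge:.0%}")  # 64%
```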

In order to assess whether their method would also have earned more money on the stock market, the researchers simulated buying shares in companies for which their models issued a favourable forecast. The result: “In the years under review, we would have beaten the capital market by about nine percentage points,” says Doron Reichmann.
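The study describes this as a simulated trading strategy; the snippet below sketches the general idea of such a backtest with made-up numbers. The column names, the toy data and the equal-weighting rule are assumptions for illustration, not the study's actual design.

```python
import pandas as pd

# Hypothetical input: one row per company with the model's forecast and the
# stock's return over the following year (all values invented for illustration).
calls = pd.DataFrame({
    "company":          ["A", "B", "C", "D"],
    "predicted_profit": [True, False, True, False],
    "next_year_return": [0.12, -0.05, 0.20, 0.01],
})

# Simple long-only rule: buy, equal-weighted, every stock with a favourable forecast.
portfolio = calls[calls["predicted_profit"]]
strategy_return = portfolio["next_year_return"].mean()
market_return = calls["next_year_return"].mean()

print(f"Strategy: {strategy_return:.1%}, market: {market_return:.1%}")
```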

Original publication

Jonas Ewertz, Charlotte Knickrehm, Martin Nienhaus, Doron Reichmann: Listen Closely: Using vocal cues to predict future earnings, in: SSRN, 2022, DOI: 10.2139/ssrn.4307178

Press contact

Dr. Doron Reichmann
Head of the FAACT research group
Faculty of Management and Economics
Ruhr-University Bochum
Germany
Phone: +49 234 32 27683
Email: doron.reichmann@rub.de


Published

Thursday
12 January 2023
9:14 am

By

Meike Drießen (md)

Translated by

Donata Zuber
