Algorithms for counting people in a crowd

Study: People trust algorithms more than themselves or others

Algorithms - i.e. programs designed to solve a problem based on a set of instructions - have become an indispensable part of our everyday lives: we trust the navigation system to show us the shortest route to our destination, Spotify to suggest suitable songs, and online dating platforms to provide us with the perfect partner for life. In fact, some people seem so confident in these programs that they fall asleep at the wheel of a self-driving car.

Some of this trust is justified. After all, algorithms are indeed usually better than people at solving analytical tasks. However, we also trust the programs when we shouldn't. That is the finding of a recent study from the USA, in which researchers carried out three experiments with a total of 1,500 participants.

The more complex, the more trustworthy

In the experiments, the study authors showed participants a series of pictures, each containing a certain number of people. Participants then had to estimate how many people they saw and could compare their answer with either the estimate of thousands of other people or that of an algorithm.

The more people there were in the pictures, the more likely participants were to rely on the algorithm. The problem: algorithms are not always particularly good at recognizing many objects and people in images - especially when people stand close together and are difficult to separate from one another. According to the study authors, it is problematic when people trust decisions solely because they come from an algorithm.

Algorithms that discriminate

Examples from application processes have shown how negatively algorithms can sometimes affect decisions. Time and again, algorithms that were meant to help find the most suitable applicant for a specific position have discriminated against women or other groups - mostly because the data and instructions the program had been fed painted a skewed picture.

The study authors therefore warn against trusting algorithms too much and too quickly, without taking into account the programs' possible biases and level of maturity. According to the study, politicians and decision-makers in particular should become more aware of what algorithms can and cannot do. (Jakob Pallinger, April 14, 2021)