Socio-Technical Analysis of the So-Called AMS Algorithm
TU Wien Informatics was part of the research team investigating possible bias and structural inequalities of the AMS assistance system.
The new Labour Market Opportunities Assistance System (AMAS) is to be introduced at the Austrian Public Employment Service (AMS) from 2021 onwards to calculate job seekers' future opportunities on the labor market, based on statistics from past years. The so-called "AMS algorithm" is the subject of controversy: critical voices warn of an algorithmic entrenchment of discrimination on the labor market. The Institute for Technology Assessment (ITA) of the Austrian Academy of Sciences, together with a team from our research unit Multidisciplinary Design and User Research, analyzed the system's technical functioning and social effects on behalf of the Upper Austrian Chamber of Labour.
Technology as a value-neutral tool is a myth: socio-scientific technology research has shown that the design and use of technology always anchor specific values, norms, and interests in society. Big-data analyses, too, are surrounded by an aura of truth, objectivity, and accuracy that is not empirically substantiated. These two misconceptions have also accompanied the gradual introduction of the AMS algorithm, which is currently on hold following a ruling by the data protection authority.
How the Algorithm Works
Job seekers' future chances on the labor market are calculated from statistics of past years. Job seekers are divided into three groups according to their "integration chances," and different resources for continuing education are allocated to each group. The algorithmic system looks for correlations between job seekers' characteristics and successful employment. The features include age, country group, gender, education, care obligations, health impairments, past employment, previous contacts with the AMS, and the labor market situation at the place of residence. The aim is to invest primarily in those job seekers for whom the support measures are most likely to lead to reintegration into the labor market.
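The mechanism described above can be sketched in a few lines of code. This is an illustrative toy model only: the weights, feature encoding, and group thresholds below are hypothetical and do not reproduce the AMS's actual statistical model; the sketch merely shows how a score computed from personal characteristics can sort people into three support groups.

```python
from math import exp

def integration_chance(features: dict) -> float:
    """Toy logistic score for a job seeker's 'integration chance'.

    The weights are hypothetical stand-ins for a subset of the
    features named in the study (age, health, care obligations, ...).
    """
    weights = {
        "age_over_50": -0.6,
        "health_impairment": -0.7,
        "care_obligations": -0.3,
        "higher_education": 0.5,
        "recent_employment": 0.8,
    }
    score = 0.1 + sum(w * features.get(name, 0) for name, w in weights.items())
    return 1 / (1 + exp(-score))  # value in (0, 1)

def classify(chance: float) -> str:
    """Bin the score into three groups using hypothetical cut-offs."""
    if chance >= 0.66:
        return "high"    # deemed likely to find work without much support
    if chance >= 0.25:
        return "medium"  # main target group for support measures
    return "low"         # "far from the labor market"
```

Even in this toy version, the core concern of the study is visible: characteristics such as age or health impairments enter the score with negative weights derived from past data, so historical disadvantage directly lowers a person's group assignment.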
The project investigated the AMS algorithm’s socio-technical dimensions based on current research, specifically from the Critical Data Studies and the area of fairness, accountability, and transparency in socio-technical systems. The study looked into AMAS’s objectives at the organizational and operational level and checked which guiding principles inform the system’s technical design. Further, the researchers asked how AMAS was technically implemented, which data on past occupational histories were used, and what components the AMS algorithm consists of. Which forms of bias, discrimination, and error rates play a role, how these were explicitly taken into account, and possible effects by the system’s use on job seekers were also addressed.
Based on the algorithm's classification, people placed in the lowest group, with low chances on the labor market, are denied helpful AMS measures. The algorithmic classification thus discriminates against groups "far from the labor market": people who have been looking for a job for an extended period, mostly older people, people with health problems, and low-skilled workers. During the system's development, hardly any procedures were applied to avoid bias, and the system offers no safeguards against possible structural inequalities of treatment in its application.
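A basic check of the kind the study found missing could look as follows. This is a hedged sketch, not a procedure the AMS actually runs: it simply compares, for hypothetical demographic groups, how often each group is assigned to the "low" class, since a large gap between groups is a first indicator of disparate impact worth auditing.

```python
from collections import defaultdict

def low_class_rates(records):
    """Share of 'low' classifications per group.

    records: iterable of (group_label, assigned_class) pairs,
    e.g. ("over_50", "low"). Group labels are illustrative.
    """
    totals = defaultdict(int)
    lows = defaultdict(int)
    for group, assigned in records:
        totals[group] += 1
        if assigned == "low":
            lows[group] += 1
    return {g: lows[g] / totals[g] for g in totals}

# Hypothetical classification outcomes for two age groups.
sample = [
    ("under_50", "medium"), ("under_50", "high"),
    ("over_50", "low"), ("over_50", "low"), ("over_50", "medium"),
]
rates = low_class_rates(sample)
```

In this fabricated sample, older job seekers land in the "low" class far more often than younger ones; a real audit would run such comparisons over the actual classification data for every protected attribute.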
"The research project on the AMS algorithm emphasizes the need for interdisciplinary research. Only through a holistic analysis of algorithmic systems, one that includes technical as well as social aspects, can we realistically assess these systems' effects. As computer scientists, in the sense of a digital humanism, our task is increasingly to move away from the technological determinism that proposes a primarily technical solution for every social problem. The Centre for Informatics and Society (C!S) takes up this challenge and researches these problems at the intersection of technology and society," Florian Cech, co-author of the study, is convinced.
In the development of algorithmic systems for (semi-)governmental institutions such as the AMS, anti-discrimination measures as well as system and data transparency are required to enable a comprehensible evaluation from a technical, fundamental-rights, democratic, and rule-of-law perspective. In addition, the study recommends rights of access and objection for those affected, public consultations, and training in new skills for AMS advisors and clients wherever algorithmic systems are used in the public sector.
The project was conducted in interdisciplinary cooperation with Florian Cech and Fabian Fischer from TU Wien Informatics’ research unit Multidisciplinary Design and User Research and the Centre for Informatics & Society, and Gabriel Grill (PhD student at the School of Information, University of Michigan).