"The Individual Voting Machine" by David von Netzer

All under control?

What would it be like if our society were controlled exclusively by algorithms? Under the direction of Prof. Fabian Hemmert, students from the Industrial Design department at the University of Wuppertal have developed dystopian products that are intended to sharpen our view of undesirable futures. He explains the project in this interview.
5/15/2023

“Controlled by algorithms. Finally. A new era has begun. Governments and courts have been entirely replaced by AI-based systems. Previous problems with human bias and corruption are thus consigned to the past. Every citizen has the right to algorithmic monitoring during their waking hours for their own safety and the safety of all. The system is designed according to human values of transparency, happiness, productivity, fairness, and individuality. Humans can be selfish. Our system cannot. Welcome to the future,” reads the introduction to the project website "aicracy" of the Industrial Design course at the University of Wuppertal.

Anna Moldenhauer: Prof. Hemmert, how did this project come about?

Fabian Hemmert: In our seminar, we chose to explore what our society might look like if the technical developments we are currently seeing continue in a dystopian direction – an exercise in critical design. For this, we analyzed different areas, such as insurance and our legal system. What might happen if we are one day ruled by the algorithm? To investigate, our students – David Hrlic, David von Netzer, Alexander Görts, Christopher Weld, and Piet Becker – spent the semester researching, identifying potential applications, developing concepts and sketches, and then building physical prototypes in our workshop. I created a time-lapse of this process, and the video also conveys our approach: the bulk of a design process is conceptualization, and that involves extensive research. We then embedded the designs in a fictional everyday story set in a possible dystopian future.

The basic idea of Aicracy is rooted in an algorithm that controls our behavior to a significant extent. Algorithms learn from humans – so in this vision, would designers then be a kind of god-like authority that defines the values for our society by means of product design?

Fabian Hemmert: The question of who programs the algorithm is of course a crucial one – we discussed this issue in the seminar and came to the conclusion that citizens could have the opportunity to submit votes on socially relevant issues to the algorithm every Sunday via “The Individual Voting Machine”. The algorithm then adjusts the rules of society accordingly. This could be a vote on a speed limit, for example: if the majority votes in favor, the rule applies. Little balls are used for voting – the more compliant a citizen is, the more balls they have available – and they can of course place more than one ball on a specific question if they want to give their vote particular weight there. So more voting balls mean a greater say.
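Purely as an illustration – and not part of the project itself – the weighted “ball” voting described above could be sketched roughly as follows; the names, the compliance_score, and the ball_budget rule are invented for this example:

```python
from dataclasses import dataclass

@dataclass
class Citizen:
    name: str
    compliance_score: float  # 0.0–1.0, assigned by the system (hypothetical)

    def ball_budget(self, max_balls: int = 10) -> int:
        # The more compliant a citizen, the more voting balls they receive.
        return max(1, round(self.compliance_score * max_balls))

def tally(votes: list[tuple["Citizen", str, int]]) -> str:
    """Each vote is (citizen, 'yes'/'no', number of balls placed).
    The side with more balls wins; the algorithm adopts that rule."""
    totals = {"yes": 0, "no": 0}
    for citizen, choice, balls in votes:
        balls = min(balls, citizen.ball_budget())  # cannot spend more than allotted
        totals[choice] += balls
    return "rule adopted" if totals["yes"] > totals["no"] else "rule rejected"

# Example: a Sunday vote on a speed limit
alice = Citizen("Alice", compliance_score=0.9)  # well-adjusted: 9 balls
bob = Citizen("Bob", compliance_score=0.2)      # non-conforming: 2 balls
print(tally([(alice, "yes", 9), (bob, "no", 2)]))  # -> rule adopted
```

The sketch only makes the inequality of the system visible: a compliant citizen can simply outvote a non-conforming one.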

"The Transparency Bracelet" by Piet Becker

Those who behave according to the rules get more voting balls for their choices, or even better prices in the supermarket, as I saw in your film.

Fabian Hemmert: Exactly! This already shows that a system guided by algorithms does not act fairly. What was also interesting is how all the processes we discussed within the project are already present in our world – at least in a weakened form. We are quite used to insurance companies, for example, giving us discounted rates if we behave according to the rules.

Would there be a chance for the designers to intervene if, for example, there were technical problems or the system had too destructive an impact on people?

Fabian Hemmert: Our vision did not provide for this; the highest authority here is the technology.

During the seminar, did you talk about the responsibility designers have when they design products whose technology can be directed against people or could be used by people to cause harm to another person?

Fabian Hemmert: Not explicitly, but I think our example shows very well what happens when designers absolve themselves of responsibility and allow things to be created on autopilot. Then it becomes inhuman, and this poses a dilemma: in our technologized world, we cannot take on all responsibility ourselves, but neither can we absolve ourselves of it completely. It’s about finding the right balance. Communication and participation play a big role – the system appears participatory because citizens are allowed to vote every week on how the algorithm is programmed further. In reality, however, this voting is itself already inhuman. It is also conceivable that the system could run decentrally on a blockchain and would have to be installed on every cell phone on a mandatory basis, for example. This would mean there would not even be a server that could simply be switched off in an emergency. If you take this idea a step further, you quickly come to the question of why you still need people to make human decisions at all.

"The Happiness Patch" by Alexander Görts

The products have names like “The Transparency Bracelet,” “The Happiness Patch,” “The Fairness Basket,” and “The Productivity Chair,” and the respective descriptions also use catchphrases like “free of charge” to convey the impression that their use can only be beneficial. I find this way of positively framing a dystopia for the marketing of products very interesting. The project is also a good example of how an algorithm can change human moral concepts or values.

Fabian Hemmert: That’s right, reading these texts can really make you anxious and afraid, mainly because demand for such a system is not unlikely. I can certainly imagine countries where efficient working is paramount and so these “measures” might be used, including the “positive” communication.

There are already numerous sci-fi films that focus on technology that controls humanity. Why is this topic nevertheless interesting for designers?

Fabian Hemmert: The generation now beginning their studies at our university is concerned not with getting rich, but with finding meaning in their design. Their work needs to be holistically “right”. This was the starting point from which we looked at the kind of dystopia this drive for optimization might sometimes lead to.

"The Fairness Basket" by David Hrlic

Do you get the impression that the students’ approach to design changed during the project?

Fabian Hemmert: Our project was fictional, so the students naturally had a lot of creative freedom. I always ask my students at the beginning of their studies if they see anything in their environment that is not designed, because everything around us is designed – it was once an idea and then it became real. We perceive every object in our everyday life as “normal,” but in each case it was conceived by a human being. And this in itself shows how much responsibility designers have. They certainly became more aware of this during the seminar, and it was also frightening for the students to see that in some parts of the world many dystopian design ideas are already common practice.

Where does the project go from here?

Fabian Hemmert: There is no intention to launch the products on the market; rather, “Aicracy” is intended to stimulate a discussion about how we would like the future to look. People have the option to participate in the discussion via the hashtag #aicracy. What if we had parliaments that were controlled by algorithms? What might be better than today, what might be worse? For example, would I allow constant monitoring of my vital signs in exchange for a better rate on my health insurance? If we manage to raise awareness in this regard through the project, then it will already have been successful.

"The Productivity Chair" by Christopher J. Weld