Regulation of technology calls for a different view of humans

When it comes to regulating digital technology, such as generative AI and algorithms, the focus is often on the technology itself rather than on the human who interacts with it; when humans are considered, it is rarely as they truly are. We must acknowledge that humans are never entirely predictable; otherwise, regulation will never be effective, argues Prof. Esther Keymolen in her inaugural address. On Friday, April 19th, she will assume the chair of Regulation of Digital Technology at Tilburg University. In her work, she focuses on the ethical and philosophical questions surrounding technology regulation.

Digital technology increasingly permeates our lives: social media, navigation systems, ChatGPT, and many other applications have become commonplace. Our lives are even partly shaped by them. With new laws and regulations, our data in these systems are protected as much as possible, and the risks associated with these new technologies are minimized: think of the GDPR (General Data Protection Regulation), the AI Act, and the EU's Ethics Guidelines for Trustworthy AI.

The human aspect is indeed taken into account in all this regulation, but often not effectively. Esther Keymolen illustrates this through four categories of people involved in digital technology: the citizen, the professional user or developer, the civil servant, and the scientist.

As citizens, for example, we accept cookies without knowing exactly what we are agreeing to; we don't read the fine print. Sometimes we need different terms of use than those provided, such as managing privacy at the group level rather than individually, or we may not want to know certain data that are available, such as the risk of an inherited disease. Regulation assumes that humans themselves control their data, but this human is a fiction, according to Keymolen.

Technology professionals and tech companies are expected to design and use technology according to ethical and legal principles, but this is often not straightforward. What is fair technology, for example? How do you uphold human dignity in technological practices? Such complex legal and ethical issues are increasingly being formulated as technical questions with a programmable answer. However, not all aspects of human life can be captured in data and systems. Therefore, sometimes a broader public discussion or political decision-making is more appropriate than a technical solution.

Civil servants working with digital technology make great efforts to do it right, for example, by checking whether algorithms discriminate. But often this is not enough: what happens to people who don't fit into a box? Technology does not always offer an easy solution here.

When advice is needed, scientists are often consulted, but they also do not always have a clear answer when it comes to the use of technology. Science can outline various scenarios, but there are always questions that must be answered in the political arena of democracy.

Positive view of humans

According to Keymolen, the solution is to see humans not as lacking when it comes to digital technology, but - following the philosopher Helmuth Plessner - as creative, dynamic individuals connected to their environment. According to this positive view of humanity, humans are constantly shaping life, often with the help of technology, and they are never done with it. It is this variability that characterizes humans and that regulation must protect.

Therefore, we must not only invest in aligning technology with humans, which remains essential but will never be perfect. We must also devise solutions for when things go wrong, arrange alternatives, provide checks and balances, and devise exit strategies. A pluralistic approach is necessary. Only when we cherish the unpredictability of humans will technology truly serve us.

Inaugural speech

Prof. Esther Keymolen will deliver her inaugural speech on Friday, April 19, 2024, at 4:15 p.m. in the auditorium of Tilburg University. Title: Technological Times: Looking out for the Human. The speech can also be followed via livestream.