Science Talk with Julia Neidhardt
Earlier this spring, Julia Neidhardt joined a panel discussion exploring the intersection of technology, society, and ethics. Together, the experts on the panel examined why safeguarding our digital future—from data storage to AI—requires a robust framework, concluding that regulation does not have to be a barrier to innovation.
The discussion was part of the “Science Talk” series, an event format hosted by the Austrian Federal Ministry of Women, Science and Research (BMFWF), where researchers discuss current, socially relevant topics on a public panel. Audience members are actively invited to contribute questions and network with the experts.
Regulation Does Not Have to Be an Obstacle to Innovation
Regulation does not have a good reputation in technical contexts. Yet it is often not a barrier to innovation but a necessary step, the experts concluded at a “Science Talk” earlier this spring. “Geopolitical developments clearly show, for example, why data storage in Europe is so important,” said Ben Wagner from the Interdisciplinary Transformation University Austria (IT:U) in Linz. The innovation potential of this measure is usually overlooked. In the past, this discussion was held solely as a human rights debate, whereas today it is mainly negotiated in the context of defense and security. “However, ethical and human rights obligations can also be linked with economic requirements,” said Wagner. Such cooperation could offer great opportunities within a European regulatory framework.
Examples of this can be found in various infrastructure sectors that are trying to become independent, from the user domain to the mobile communications industry. “Of course, this is associated with costs, and the market is only European. But this European market is so imperative because it is currently becoming apparent that dependencies are more expensive in the long run,” explained Wagner, professor of democracy and human rights at IT:U.
Considering Social Aspects of Non-Neutral Technology
Ethical questions now belong to the core area of engineering. This is also connected to the idea of socio-technical systems: according to this concept, technology is not neutral, but embedded in a social context. “With a building, this might be less obvious than with ChatGPT, but both interact with humans in different ways,” said Wagner. “Because it is so direct and clear with ChatGPT and the like, we are currently realizing how strongly technology has always shaped our lives.” The discussion about trust in technical solutions will become “incredibly important” in our society over the next few years, added Anna-Vera Deinhammer, an endowed professor at the FH Wien der WKW and integral engineering scientist. For example, to reduce the CO2 emissions of an existing building, consumption data—ideally from the past ten to fifteen years—is very useful. “But this won’t work without an accompanying societal discussion. Because if I have this data for certain addresses, I can draw further conclusions,” Deinhammer pointed out.
New Threat: Agentic Systems
The impacts of technical innovations have now become very tangible for all of us, noted Julia Neidhardt, a researcher at TU Wien and UNESCO Co-Chair on Digital Humanism. At least since the release of ChatGPT, these impacts have been apparent to everyone through discussions about the consequences of AI, for instance for the labor market or the school system. With new agentic tools like OpenClaw, new threats emerge: the AI agent can access numerous apps and take over various tasks, which raises security and privacy concerns because it acts autonomously on behalf of the user and also requires passwords to do so. “This makes people more aware of how powerful these things are. In my opinion, this will also lead to a greater need for protection—including on a political level,” Neidhardt continued.
And what options are there for dealing with this in the future, apart from regulation? “Ideally, we will manage to integrate all these questions much more strongly into the training of engineers,” said Wagner. Accordingly, he demanded that this education be made much more interdisciplinary, with a focus on social responsibility.
Curious about the panel discussion?
You can watch the panel discussion (German only) on the BMFWF’s YouTube channel.
Curious about our other news? Subscribe to our news feed, calendar, or newsletter, or follow us on social media.