Technology has transformative power – and this power is generally for good. To keep new technologies’ potential in check, we must consider whether and how to regulate them: through industry-wide codes of conduct, other soft or hard law mechanisms, co-regulation, or perhaps through code itself. We should think hard so as not to overregulate, lest we stifle innovation; but we should think harder still not to underregulate, lest we lose our personal freedoms.
We invite speakers from renowned international universities to discuss with us relevant case law and the institutional legitimacy of the judiciary, administrative agencies, and other supervisory elected and non-elected bodies in the field of law and technology.
The talks are open to the public, but you need to register via innovationsrecht(at)uni-graz.at in order to receive the link to the meetings.
Speakers and Topics
May 11 – Professor Valsamis Mitsilegas
The Privatisation of Pre-emption in the Digital Age: A Critical Appraisal of EU Proposals on the Prevention of the Dissemination of Terrorist Content Online
Private actors are increasingly called upon to co-operate with law enforcement authorities in the fight against crime and terrorism. This is particularly the case in the digital age, where access to and regulation of personal data and the digital behaviour of citizens is a priority for the state in developing a pre-emptive paradigm of security governance. The seminar will address this privatisation of pre-emption by critically evaluating EU proposals for a legal framework on the prevention of the dissemination of terrorist content online. By focusing on the changing nature of obligations imposed upon the private sector in this context, the seminar will explore the consequences of the privatisation of pre-emption for fundamental rights and the rule of law.
May 12 – Dr. Alessandro Corda
The Privatization of Punishment version 2.0: Criminal Records, Digital Technologies, and the New Punitive City
The talk will discuss a new facet of the privatization of punishment and its effects. While privatization represents a well-established phenomenon in modern criminal justice operations, less understood are the technological, market, and governmental forces that in recent years have dramatically reshaped the production, dissemination, and use of criminal record data. The focus will be on a reconceptualization of theories of penal entrepreneurialism that more directly addresses the role of technology and corporate interests in the field of criminal record management. A new paradigm (‘penal entrepreneurialism version 2.0’) will be utilized to describe and critically assess the new, multifaceted, and often problematic interactions between private actors autonomously collecting, commodifying, and variously using criminal record data, technological developments, and the criminal justice system.
May 18 – Professor Frank Pasquale
New Laws of Robotics: Defending Human Expertise in the Age of AI
How far should AI be entrusted to assume tasks once performed by humans? What is gained and lost when it does? What is the optimal mix of robotic and human interaction? New Laws of Robotics makes the case that policymakers must not allow corporations or engineers to answer these questions alone. The kind of automation we get—and who it benefits—will depend on myriad small decisions about how to develop AI. Pasquale proposes ways to democratize that decision making, rather than centralize it in unaccountable firms. Sober yet optimistic, New Laws of Robotics offers an inspiring vision of technological progress, in which human capacities and expertise are the irreplaceable center of an inclusive economy.
June 1 – Professor Elisabeth Hödl & Professor Lucas MacClure
Professor Elisabeth Hödl: AI journalism and the role of the media in shaping public opinion
The Internet has profoundly changed the way people access and engage with news. In the early days of computer culture, the hope was for participation, democratization of access to knowledge, liberation from discrimination, and decentralization of power. Three decades later, the networked society presents a more differentiated picture: content is generated not only by humans, but also by machines. Today, easy-to-use and affordable technologies allow text, image, and audiovisual content to be generated – but also manipulated – in an automated way. AI journalism and automated journalism are shaking up the media industry and, with it, the question of responsibility for content. What is the role of a functioning press in the state, and what are the implications of automated content for law and society?
Professor Lucas MacClure: Online Speech Regulation in Latin America
In addition to teaching in International Studies at Boston College, Professor MacClure is a visiting adjunct professor of Law at Adolfo Ibáñez University. He holds a J.S.D. and an LL.M. from Yale Law School, where he specialized in comparative constitutional law and political science, and an LL.B. from the University of Chile School of Law. He is a member of the Chilean bar association (Colegio de Abogados de Chile). His research and teaching interests include US and Latin American constitutional law, Internet law, and jurisprudence.
June 2 – Dr. Matthias C. Kettemann
Here Come the Global Content Judges: Democratizing Online Speech Rules through Oversight Boards?
The algorithms and norms that online platforms use to conduct content governance substantially impact how we communicate online. But how are these private normative orders set and legitimated? Can they be contested? Recent European legislative efforts, like the DSA, point to more transparency and accountability duties. Platforms have to inform users why content is deleted. This is good, but it doesn’t help with the underlying legitimacy problem of setting and ruling on speech rules.
June 9 – Professor Sylvie Delacroix
Bottom-up Data Trusts: Disturbing the ‘One Size Fits All’ Approach to Data Governance
From the friends we make to the foods we like, via our shopping and sleeping habits, most aspects of our quotidian lives can now be turned into machine-readable data points. For those able to turn these data points into models predicting what we will do next, this data can be a source of wealth. For those keen to replace biased, fickle human decisions, this data – sometimes misleadingly – offers the promise of automated, increased accuracy. For those intent on modifying our behaviour, this data can help build a puppeteer’s strings. As we move from one way of framing data governance challenges to another, salient answers change accordingly. Just as the wealth-redistribution framing of those challenges tends to be met with a property-based, ‘it’s our data’ answer, when one frames the problem in terms of manipulation potential, dignity-based, human rights answers rightly prevail (via fairness- and transparency-based answers to contestability concerns). Positive data-sharing aspirations tend to be raised within altogether different conversations from those aimed at addressing the above concerns. Professor Sylvie Delacroix’s data trusts proposal challenges these boundaries.