Johan Rochel, a philosopher who holds a PhD in law, is an EPFL scientist working to develop teaching resources on philosophy and the ethics and law of technology. In 2023, he received a DRIL Grant to make a series of videos for the EPFL community, which came out in March 2024, and began teaching a new course on The Ethics and Law of Artificial Intelligence.
Tell us about the “Ethics of AI” videos you have made.
This project involves crafting four short videos on digital ethics for EPFL students, each around eight minutes long, that are entertaining and a bit more informal and colloquial than a MOOC. We decided on the concept of a visual podcast with Adrien Miqueu, an SHS assistant teacher and talented illustrator. The support of the CEDE team has also been impressive.
Where did the idea for this project come from?
The Social and Human Sciences (SHS) Program at EPFL is crazy interesting: 150 courses on almost everything you can think of in terms of humanities. But teachers come and go, and when they leave, everything is lost. To keep this knowledge, we are testing a process where SHS instructors can record short videos without a lot of investment of time and energy. Then if they leave EPFL, we still have their short videos and retain that knowledge. The final vision is something like a Netflix of humanities: an easily accessible digital trace of what’s going on in the SHS program. These short videos about the ethics of AI are an attempt to concretize this vision.
Who is your target audience for these videos?
These videos have been prepared for EPFL students. I’ve tried to address these future engineers who, whatever their discipline, will be confronted with questions linked to the ethics of digital technologies, and more specifically, the ethics of AI. Beyond the EPFL student community, the videos will be of interest to anyone who develops AI solutions and anyone who buys and integrates them into their business processes. The idea was that these videos shouldn’t look at all like a traditional course. We were inspired by scientific YouTubers who use creative ways to make their knowledge accessible. I hope that the gently quirky and humorous side of the videos will help to raise awareness of this subject.
What kinds of feedback have you received?
So far, the feedback has been very positive: the offbeat approach and useful nature of the content mean that viewers can understand the subject quickly. After watching the four videos, people should have a good overview of the issues involved, and then they can delve deeper into what interests them.
You also started teaching a new course on AI ethics in autumn 2023. What was the inspiration behind this course?
Some people believe that being a good engineer means just mastering the technical aspects. From that standpoint, ethics is at best a “nice-to-have.” But a more realistic view is that engineers are individuals who work in a specific setting – a company or public-sector organization, for example – where they interact with colleagues, are bound by legal requirements, and must consider society’s values. All this relates to ethics, and therefore ethics should be a key part of a good engineer’s education. Our graduates must be able to identify ethical questions and challenges, and then know where to look for the resources to address them.
The students recognize that a good engineer is one with both technical and ethical skills. For me and my colleagues, it’s a paradigm shift in terms of what we teach and how we should train tomorrow’s engineers. It’s a deep-seated movement with a lot of momentum, not least because it’s part of a broader push towards greater sustainability.
The new class is offered in the master’s program in Digital Humanities and is open to master’s students from different sections. It covers general topics for anyone dealing with digital technology and specifically AI. An important part of the curriculum will focus on the ethical challenges associated with the design of AI systems, which is one of the core skills taught at a university like EPFL.
How was the autumn 2023 course?
It was an excellent experience. More than 80 students enrolled and the interim feedback they gave was great. They’d been waiting for a class like this! We covered a lot of issues related to the ethical and legal challenges of AI. In the weekly 45-minute exercise session, we used the ethics analysis frameworks discussed in class to examine real-world case studies and engage in social and political debates. We had master’s students from at least five different programs, which made our discussions very interesting and informative. But I had to put a time limit on the discussions – when you’re talking about the measures that should be introduced to control autonomous weapons systems, for example, people can keep debating for hours! Of course, the students who chose to enroll in this class are naturally highly motivated and already know a lot about the issues we discuss. The big challenge is to reach those who are less motivated.
What are your hopes for the class?
Anyone outside EPFL will tell you that today, it’s unthinkable that someone could get a top engineering degree without having a strong foundation in ethics. We need to do the important work of tailoring conventional humanities classes to the needs of engineering students. And that’s where CDH has real expertise. We’re all about being at the interface, and we need to take this interface seriously. It’s also important to bear in mind that humanities teachers can learn a lot from engineering students.
One of my key messages in the class is that we don’t bring ethics into AI systems – ethics is already there, and we need to make it explicit. When you design software – a chatbot, for example – ethics is already part of the process; you just don’t see it. It’s there when you make choices, tap into your values and goals, and assess trade-offs. What I want is to give students a toolbox for identifying these steps as ethical issues. It’s a good way to start the conversation. I tell them: “I’m not going to teach you something new; I’m just going to help you do what you do better.”
Author: Stephanie Parker