ETHICS, COMPUTING, AND AI
Coda | Computing for the People: Ethics and AI
A post-panel conversation
"We need to ensure that the efficiency or sheer strength of the technical process isn't all that goes into the thinking about AI tools — that consequences for humans are a major part of the discussion. At the end of the day, the products or algorithms produced are for people."
— Melissa Nobles, Kenan Sahin Dean, MIT School of Humanities, Arts, and Social Sciences
Related story: The path to ethical, socially beneficial AI
At an MIT panel on Ethics and AI, leaders from government, philanthropy, academia, and industry say collaboration is the key to creating AI tools that serve the public interest well.
28 February 2019
Celebration of the MIT Schwarzman College of Computing
In a coda to the panel, Computing for the People: Ethics and AI, Melissa Nobles, Kenan Sahin Dean of the MIT School of Humanities, Arts, and Social Sciences and Jennifer Chayes, Technical Fellow and Managing Director of Microsoft Research, spoke to reporters about MIT's interdisciplinary vision for the new college. The following is a lightly edited transcript of their conversation.
• • •
Melissa Nobles: The main pedagogical concern is figuring out how to organize the curriculum so that it's mindful of specific issues. We need to ensure that the efficiency or sheer strength of the technical process isn't all that goes into the technical thinking, and that consequences for humans are a major part of the discussion. At the end of the day, the products or algorithms produced are for people.
Toward this end, the MIT Provost, Marty Schmidt, has established five faculty working groups to think through important issues related to the new college and to get a collective sense of what the faculty is thinking about the impacts and responsibilities of computing. I'm co-chair (with Julie Shah, assistant professor of aeronautics and astronautics) of a group looking at curriculum. One model we're considering is a computer science class in which, in every section, a philosophy PhD student examines a particular ethical dimension of machine learning, investigating algorithmic bias, for instance.
Our group is not just looking at computation classes. We're also looking at the larger ecosystem in which students live, aiming to ensure that all students develop awareness of the larger social, economic, and political context in which we function. So imagine students taking a combined economics and data science class, or a political science class about Congress that also looks at technology regulation. There's a way all of our disciplines can take on the larger issues of technology that are quite separate from AI as such.
Jennifer Chayes: It's also really important to have ethics and philosophy embedded in the technical classes, and the technical embedded in the social sciences. If you come out of college with a master's in computer science and have an ethics question just slapped on at the end, that's too late. If you have ethics integrated into the education, the first thing you do when you get a dataset is ask: what are the biases in this dataset? You ask, if my dataset contains biases, will my algorithm amplify them? And then, how do I come up with another algorithm that will actually break those biases?
The same thing applies on the other side: don't slap technology onto the end of a degree in political science. You want political scientists to come out feeling confident and thinking critically about data, not intimidated by it.
Panelists for "Computing for the People: Ethics and AI," L to R: Thomas Friedman (Moderator), Ursula Burns, Jennifer Chayes, Ash Carter, Darren Walker, and Megan Smith
In a conversation after the panel, Dean Nobles emphasized that the goal of the new college is to advance computation and to give all students a greater “awareness of the larger political, social context in which we’re all living.”
This is the MIT vision for developing “bilinguals” — engineers, scholars, professionals, civic leaders, and policymakers who have both technical expertise and an understanding of complex societal issues.
Nobles: There is a fundamental role here for the liberal arts fields. They provide an understanding of the role of ideology and democracy, and the relationship between citizen and state. Technology is changing the relationship, but those are primary political questions. And that's what a liberal arts education can offer. As far as I'm concerned, those of us in academia have to take more seriously what we've always known to be true — and we've got more homework, which is to better understand technology.
Chayes: Technology companies are being evaluated on how they deal with privacy, and are now being regulated on this issue in Europe. The next big issue is bias, and fairness scholars from my labs are consulting with the EU on how to audit and mitigate bias. Just as the EU pushed the world on privacy, it will push the world on bias. Every company will be looking to MIT, or to any university able to graduate people who don't have ethics slapped onto their education at the end, because that is going to be the imperative.
Nobles: Some companies want that right now. They're concerned that we're not producing students capable of this kind of thinking. I don't believe, once we get this kind of education right, that any MIT student will be disadvantaged by having gone through it. Students will be even more sought after, because they will be well trained technically and able to think much more broadly about society and social issues.
Ethics and AI: Perspectives from MIT
Panelists for "Computing for the People: Ethics and AI"
Thomas L. Friedman (Moderator), New York Times columnist; Ursula Burns, CEO, VEON, Ltd.; Ash Carter, Director, Belfer Center for Science and International Affairs, Harvard Kennedy School, and former US Secretary of Defense; Jennifer Chayes, Technical Fellow and Managing Director of Microsoft Research; Megan Smith, former US Chief Technology Officer; and Darren Walker, President of the Ford Foundation
Prepared by MIT SHASS Communications
Editorial team: Emily Hiestand and Leda Zimmerman
Photographs: Rose Lincoln