Ethical Foundations Class Introduces Ethical Thinking to Computer Science Students

Posted by Julianne Hodges on Sunday, April 1, 2018

Computers touch every part of our daily lives, from work to shopping to social media, and behind computer programs are human computer scientists making decisions. How do we make sure that these decisions don’t harm others?

A new computer science class co-taught by lecturers Alison Norman and Sarah Abraham aims to explore this question and train young computer scientists to think about ethical issues from the start, as reported by KXAN last month.

Ethical Foundations of Computer Science is a one-hour course consisting of class discussion and student presentations that first launched this spring. Students in the class think about and discuss a wide range of issues, from privacy to harassment, and how technology influences the ethical decisions surrounding those issues. Norman had wanted to create a class like this for a while, and last year, when the undergraduate office offered funding for a course with an ethics and leadership flag, she and Abraham submitted a proposal for Ethical Foundations.

“I feel like our students are young for the most part, so they don’t yet have the perspective on how much software is affecting the world,” Norman said. “I'm concerned that it's possible to make ethical decisions in implementing programs and not even realize that that’s what you’re doing.”

Although Abraham’s upper-division elective Contemporary Issues in Computer Science course also covers ethical issues, this class is unique because it offers these discussions to freshmen. This is by design: Norman said the class is supposed to set up a foundation for ethical thinking, hence the name, that could be built upon in later classes that the students take.

“The goal is that we have this conversation in the first year, they get practice thinking about these issues, and then all of the faculty can build off of it in their classes,” Norman said. “What we’re trying to do is set it up so that it’s an ongoing conversation the whole time they’re here.”

So that all computer science freshmen can start with this ethical foundation, Norman and Abraham have considered making Ethical Foundations mandatory for a computer science degree. However, this raises some challenges; a required course would need to accommodate many more students per class. Currently, the class has only about 60 students, and while it would be easy to scale the course up into a giant lecture class, it would be hard to maintain the active, thoughtful participation currently required of students, Norman said.

“The question is, how do you scale up something that is heavily group-based and intended to be very discussion-heavy,” Abraham said. “That’s what we’re currently mulling over, what would need to be changed to make that work while still giving them a comprehensive education, rather than just talking at them for an hour.”

In contrast with Contemporary Issues, which goes more in-depth with technical issues like cyber warfare or fake news, Abraham said Ethical Foundations deals with more personal issues, such as cheating and treating others in a respectful and welcoming manner, before expanding to larger issues like privacy or hacking.

“It’s more targeted towards helping people adjust to college life, making sure they can behave respectfully towards each other and build towards ethical behavior, hence why it’s ‘ethical foundations,’” she said.

One topic that the class has already covered, Abraham said, is the hidden biases in data and technology. For example, an ongoing series by investigative journalism organization ProPublica explores the machine bias found in Facebook’s restriction of hate speech, computer algorithms that predict future crime and more.

“We assume that computers will be unbiased, but it turns out that when you’re looking at statistics and data, all of that is very influenced by a variety of things,” Abraham said. “Becoming unbiased is a lot more challenging than people might think. Just because it’s a computer doesn’t mean it’s entirely fair.”

A real-world example of these biases, according to Norman, is smart watches that have had trouble measuring the pulse of people with dark or tattooed skin.

“Because there’s no diversity in tech, no one had stopped to think that it was going to marginalize large numbers of people,” Norman said. “We’re going to talk about how it’s not only unethical to make them feel like they don’t belong as part of the society, but also, it’s bad business.”

Abraham hopes students leave the class thinking about these issues as they find jobs and continue with their careers.

“I hope that they get an awareness of the decisions they’re making and how their decisions affect others, and through the magic of technology, lots and lots of others,” Norman said.