Teaching Ethics in Computer Science, Part II

Art by Klara Auerbach


In the second episode of a two-part series, hosts Morgan Awner ’21 and Rachel Lim ’21 examine how Brown’s Computer Science department has implemented an ethics curriculum to help students comprehend the ethical implications of the code they produce.


Our interviewees are Andy van Dam, Tim Edgar, and Deborah Hurley, who are professors at Brown University, as well as Signe Golash and Hal Triedman, who are current students at Brown.


MORGAN: Welcome back to BPRadio. I’m Morgan.

RACHEL: And I’m Rachel.

MORGAN: This is the second episode in our two-part series on ethics in computer science – and more specifically, how ethics can be implemented into Brown’s computer science program. 

RACHEL: In our last episode we talked about Brown’s new ethics TA program for computer science and learned more about what this program looks like in its first year of implementation. As a refresher, we learned that ethics TAs design additional materials for technical computer science courses that are intended to encourage critical thought about the ethical consequences of code. 

MORGAN: But we also heard that the introduction of ethics TAs is only the beginning of a series of transformations for Brown’s computer science program… We know that we’re trying to move in the direction of making ethics a central consideration in computer science. But what can we actually hope to achieve with this kind of program? How can we use education to steer technology in a better direction? And how can we ensure that Brown’s CS graduates code cautiously and ethically as they face a wave of new and powerful technologies that are only gaining momentum?

RACHEL: On this episode, we’ll be talking to professors at Brown who teach courses in the computer science department. We’ll be trying to figure out what we can and cannot accomplish by re-conceptualizing the way we teach. If we’re going to make changes to the way we teach computer science, we need to first understand what our goals are and determine what we’re actually striving for. 

MORGAN: These questions are fundamentally concerned with the nature of teaching computer science in universities. There are few more qualified to speak on the teaching of computer science than Andy van Dam. Andy helped to found the computer science department at Brown and served as its first chair. This fall, he’s teaching two courses: one on computer graphics, and a popular introductory course, colloquially known as “CS15.” Every year, hundreds of students go through CS15. Andy’s one of the most important people in the computer science community at Brown. But he’s worried about the speed of progress in the technology industry today.

ANDY VAN DAM: We’ve had a kind of libertarian, Wild Wild West approach to how tech can be successful in society: growth at all costs, you know, move fast and break things. That kind of says we don’t give a shit about what we break; that’s the price we all pay for progress.

MORGAN: Andy takes a cynical but realistic view. He recognizes that the forces motivating life-changing technology advancements aren’t the same as those recommending that we step on the brakes and think about what we’re doing. He doesn’t have faith in corporations to prioritize ethical concerns over profits. As long as companies have their bottom line, Andy says, they’ll do what they think will make them money.

ANDY: I think what we’re learning is that, surprise, surprise, growth-oriented corporations, which is all of them, value growth and profit above pretty much everything else. It’s become part of our culture since the business schools promulgated this model of shareholder value and quarterly profits and the market and so on and so forth. Being a force for good in society faded from the value system; it wasn’t so much part of shareholder value, even though building enduring institutions once was. Profits rule now, and in that kind of atmosphere, self-regulation? Forget about it.

MORGAN: When brilliant ideas are only evaluated by how profitable they are, they’re unlikely to be considered for their potential societal effects. The drive to prioritize ethics won’t come from corporations. And as Andy explains, it will need to include a broader understanding of the non-technological forces at play.

ANDY: But what I’m trying to say is you can’t separate out the responsibility of people who work in big tech from what’s happening with our society with our government, up to including the Supreme Court, Citizens United, the influx of money as a corrupting influence and everything.

MORGAN: Andy sees the technology industry as inseparable from the corrupting influence of money. Corporations are never going to place ethical concerns at the forefront, and we can’t expect the people on the business end of tech companies to decide what’s best for the world.

RACHEL: Like Andy, Professor Tim Edgar is concerned about the pace and direction of progress today. Professor Edgar is teaching a non-technical course in the computer science department this semester called “Computers, Freedom, and Privacy.” He stressed the fact that students going into the technology industry need to be equipped with the skills to consider issues holistically.

TIM EDGAR: Move fast and break things. And the question is, okay, that sounds inspiring, but what things are we actually breaking? That’s something people should be asking. Just to be clear, I don’t actually think that having an ethics TA program, or even thinking about ethics in a sophisticated way when you’re engaged in any technical field, and certainly in computer science, is going to answer all those questions. You know, that suddenly, because we’ve armed our graduates with this knowledge and these skills, we’re going to transform the world and no one will ever create computer programs that can be misused. But I do think it’s helpful when people who have those technical skills recognize the potential for misuse, and then at least are able to alert society: hey, you know, I’m building this thing. There’s a positive value I see to it, but there are also these downsides. We have to have a conversation as a broader society. Do we want to build this thing? Do we want it to be regulated? Are there ways we can mitigate the downsides? To have those conversations earlier, rather than later, is I think the goal of a program like this ethics TA program, because it helps the people who are actually doing the work to recognize those issues, instead of, you know, relying on the rest of society to grapple with the consequences of a technology once it’s already been developed and deployed out there in society.

RACHEL: Like the ethics TAs we spoke to in the last episode, Professor Edgar stressed that no computer science program can ever hope to teach students what the “right” answer is to these questions. He brought up an example from the 2016 US presidential election:

TIM: I remember, after the issue around fake news and the 2016 election, I read some fabulous stories about coders in Silicon Valley wondering whether you could create code that would spot fake news. And on the one hand, I kind of felt, good, you know, I’m glad they’re thinking about the issue. And maybe there are technical tools that can be helpful in figuring out certain kinds of trolling and how that works. But then there was another part of me that scratched my head and thought, you know, good luck creating an algorithm that’s going to decide what’s true and false. For all of recorded history, human beings have been grappling with questions of truth and falsity. And so it seems that one thing that’s important about a program like this is that it takes students who may not be used to thinking in those terms, the way humanities and social science people are very comfortable with, and just gets them to spot those issues and to analyze them in a way that’s more sophisticated than, you know, how do I put in the line of code that says if this does something wrong, it self-destructs?

RACHEL: These problems are too complex to be solved with simple solutions. There isn’t an “ethical algorithm” for every problem, and we won’t make progress dealing with these issues if we only seek out technical solutions. As Professor Edgar was saying, it’s more important to teach students how to recognize these kinds of issues and give them a framework for engaging with them. Like Professor Edgar, Deborah Hurley is a professor who’s teaching a non-technical course in the computer science department at Brown this semester. The topic of this course is Cybersecurity Ethics, and in teaching it Professor Hurley has focused on an interdisciplinary approach to exploring the interplay of computers and ethics.

DEBORAH HURLEY: The cybersecurity ethics course is an ethics course; that’s the front, that’s the tip of the spear. And then we look at many technological issues as part of that. It’s inherently multidisciplinary, so we look at legal issues, economic issues, social and technological issues, looking at it in a very holistic way. So that’s one way to do it, and students have been very responsive to that.

MORGAN: Professor Hurley believes that students naturally want to understand the effects of what they’re creating. She’s optimistic about the future of computer science education because she thinks that computer scientists are eager to learn holistically, and will develop the skills to consider what they’re creating in a more nuanced way if given the opportunity.

DEBORAH: And that is not only in computer science; we see in other science and technology areas that there’s a really strong zeal to understand, and I really use the word zeal on purpose, to understand the implications of the tech they’re creating, or will be creating. So there’s a very strong student demand for more training in ethics and more ability to understand that.

RACHEL: These proposed changes to computer science education are not insignificant. It’s asking a lot more of students to learn how to understand both the technical and social implications of what they’re studying. But in a sense, these kinds of changes simply demand that people understand the nuances and complexities of what they’re doing. Professor Hurley mentioned that the zeal for holistic understanding is strong across different sciences. Furthermore, Professor Edgar pointed out that there’s nothing inherent in the technical aspects of computer science as a discipline that demands this kind of holistic consideration.

TIM: One reason that we’re focused on this with computer science is just the nature of the times that we live in in 2019. I imagine that during the Cold War, you could imagine having an ethics component to physics, for example, with a lot of physicists going off to work on, you know, nuclear weapons or nuclear power. That’s still true today. I think any science or technology or engineering is a wonderful discipline. It allows you to think rigorously and build things. But sometimes people have the mistaken idea that, well, I like building things, my dad was an engineer, my brother’s an engineer, I’d be sort of unusual in coming to a different career path. And it’s like, I like to tinker with things, I like to build things, I like it because there are right and wrong answers, right? You know, one plus one equals two. That attracts people sometimes to science and technology fields, and that’s a beautiful and elegant part of science and technology. But you’re still living in society.

MORGAN: The comparison to nuclear physics is striking. It shows us an example of how scientific research doesn’t exist in a vacuum. The physicists who produced the revolutionary scientific work that was needed to create nuclear weapons were not the same people who were making strategic military decisions. Whenever a scientific advancement has the power to profoundly transform our lives, there rises a possibility for the kind of misalignment of interests we see today. Just like during the Cold War, a problem can arise when the technical is not joined with the ethical. Professor Hurley took the comparison a step further. She mentioned that certain professions actually require individuals to agree to abide by an ethical standard:

DEBORAH: So, historically, and even thousands of years ago, we have identified various professions where the work a person is doing is so critical to individuals and to society at large that we have developed an explicit ethical code of conduct that everyone is trained in and does their work in accordance with. Obviously the Hippocratic oath, thousands of years old, is a classic example. Similarly, attorneys have ethics training and ongoing ethical training requirements, and a very strong ethical code that they must abide by, and there are penalties if they don’t: they’re disbarred and so forth, penalized in other ways. So, in my own view, I think it’s entirely appropriate that people in computer science, who are going to be working in the field and producing systems that are part of our infrastructure, health care, the food supply, everything you can really think of, would be having some kind of ethical training. To me that’s a natural evolution of the growth and pervasiveness of the field. And that is accompanied by a very strong surge in current students who really, extremely strongly, want to understand the social and economic implications of the technologies they are developing.

RACHEL: While the ethical oaths in medicine and law don’t translate directly to computer science, they still offer an important insight. People whose professions give them significant power to affect the lives of other people should have to agree not to use that power in a harmful or unethical way. Thinking about ethics this way, it makes sense to revisit the question of ethics TAs at Brown. We’ve seen that ethical considerations are important to consider at the forefront of a number of disciplines. Are the changes we’ve talked about making to the computer science department at Brown applicable in other areas? Can we imagine ethics TAs for other subjects, like biology, physics, economics, and so on? We asked Signe and Hal, the computer science ethics TAs we spoke to last episode. Here’s Signe:

SIGNE GOLASH: I think this sort of structured program could absolutely work in biology, in history, economics, any sort of field that’s going to have an impact on people. I mean, there are already huge ethical considerations in medicine. So this sort of program, as a very long-term goal, could potentially implement ethical thinking into whenever you are designing something which is going to have an impact on people, whether it’s a product, or some sort of research, or anything that is going to impact the outside world, where you’re thinking: what are the ethical considerations of this? And that doesn’t have to be, like, super deep philosophical questions of normative ethics, you know, which system of ethics am I using. It can really just be: is this accessible? Is this inclusive? What are the potential harms of this product? What are potential misuses of this product? Even those small considerations, and some small tweak of your product, can really have a large impact on the way users interact with it or the way that it impacts people. Just integrating that into the way people think about the things they’re designing and the things they’re creating can, I think, really have a great impact.

MORGAN: It’s crucial to think about what education should strive to accomplish. We shouldn’t expect that our universities simply provide people with the skills they’ll need to make a living. Education should also seek to ensure that innovation is beneficial to the world at large. A college education shouldn’t just give you a narrow and specialized set of skills. It should give you the tools you need to be an ethical citizen. Here’s what Hal had to say:

HAL TRIEDMAN: Making sure that the people who are going out into the world of computer science, the students who are graduating from the department, are seeing these concepts and not focusing on computer science as a discipline that has a strict bottom line, like, financially speaking. Because I think that, you know, the conglomeration of so many services and companies has happened, and these gigantic corporations serve as a really powerful draw for students through the CS department. And for a certain amount of time that was fine, but at the same time, it shouldn’t be a pre-professional program. Not in every sense. I mean, it should prepare you to be a computer scientist in the real world, but it also should equip you with a set of skills and a frame of mind that lets you critically engage with the world.

MORGAN: As we’ve learned from investigating Brown’s CS program, knowledge is power. And, in the famous words of Voltaire, with great power comes great responsibility. 

RACHEL: Making significant changes to the way we structure our education will not be an easy task. And we know that teaching students how to build the next generation of technology or the next big business doesn’t get us anywhere on its own. The ethics TA program at Brown does not represent a perfect and immediate solution to teaching ethical computer science. It is rather a first step, a way for us to start to learn how to effectively encourage and nurture critical thinking.

MORGAN: Thanks for tuning in to our series on ethics in computer science education. We hope to see you next time on BPRadio.