How ethics is becoming a key part of research in tech

June 7, 2023, 5:54 p.m.

Stanford aims to conduct research to “benefit society,” but how are researchers ensuring their work lives up to this mission? Select centers across campus are making concentrated efforts to incorporate ethics into science and technology research with programming that includes fellowships, research reviews and coursework for interested academics.

Ethical dilemmas in science, engineering and technology at Stanford have made news in recent months, whether through the Doerr School’s decision to accept funding from fossil fuel companies, alleged academic dishonesty among students using ChatGPT or accusations that the University president falsified data in research papers. Although ethical programming on campus is largely voluntary, programs and coursework are available to prepare researchers for the ethical dilemmas they will face.

“The way that we frame problems for research, the way that we pursue them, the methodologies we choose — those are all ethically laden,” said Anne Newman, Research Director & Honors Program Advisor at the McCoy Family Center for Ethics in Society and Staff Director of the Embedded EthiCS Program.

In the medical sciences, safeguards are built into the institution to monitor research involving human and animal subjects, since such work has direct impacts on those subjects. Within the Research Compliance Office at Stanford, the Human Subjects Institutional Review Board (IRB) reviews research proposals to confirm that projects with human subjects are safe and ethical before they begin.

However, some research domains — many of them rapidly growing fields with large-scale societal impacts, like artificial intelligence (AI) — have been largely left out of IRB reviews because they don’t work directly with animal or human test subjects. Some ethics centers on campus are seeking to change this.

Building off the IRB model, in 2020 the Ethics in Society Review (ESR) board was created under the McCoy Family Center, the Center for Advanced Study in Behavioral Sciences (CASBS) and Human-Centered AI (HAI) to make ethics a core part of research in computer science. The ESR acts similarly to the IRB by examining ethical concerns to minimize potential harm of the research before a project is approved for funding.

This process is integrated into grant proposal applications at HAI. After HAI reviews the technical merits of a proposal, it is handed off to the ESR, which assigns an interdisciplinary panel of faculty to review it. The panel advises researchers on ethical issues, identifying challenges and providing additional guidance on the ethical component of the research. Once the review is complete, the panel either recommends that the funding agency release research funds or suggests further iterations of the review process.

The ESR is not meant to determine whether a proposal should be funded, but rather to analyze the unintended consequences of the research before a project begins. Discussing what the ESR does, Betsy Rajala, Program Director at CASBS, said, “Every time you touch research these questions come up and [it’s better to] think about them sooner rather than later.”

There has been support from researchers who have gone through the ESR process. According to a study published in Proceedings of the National Academy of Sciences (PNAS), all 16 grant teams asked to iterate with the ESR said they would do so voluntarily in the future, and 67% of that group said the process influenced the design of their research problem.

In addition to this formal grant process, the ESR board has received ad hoc requests to review the ethical implications of proposed research projects, thus beginning to fill the ethical review gap in the fields of CS and AI. Quinn Waeiss, a postdoctoral fellow in the Stanford Center for Biomedical Ethics and former CASBS Program Director, said, “I think AI is a good archetype for the types of consequences that can come to be in all types of research.” They have also expressed interest in, and begun, expanding the program beyond HAI into other areas of engineering and technology.

For graduate students who are not submitting proposals to HAI, the McCoy Family Center offers other programs, including its Graduate Fellowships. These fellowships are open to all graduate students and aim to create an interdisciplinary space for students to discuss the ethical dilemmas and implications of their research and of the institution of academia.

Explaining these offerings, Newman said, “It’s important to emphasize, as we think about ethics outside of philosophy, that it’s not about teaching people what to think but how to think [through ethical issues], and acquiring the language for engaging in conversations about ethics.”

The program only offers up to 12 fellowships, so to reach more students, the McCoy Family Center also offers small grants for students and postdocs who would like to create programming related to ethics and society, ranging from lecture series to workshops.

Other fellowships offered at the University surrounding the topics of engineering, technology and ethics in society include the HAI Graduate Fellowship program, the Technology & Racial Equity Graduate Fellowship and the Tech Ethics & Policy Summer Fellowships also offered by HAI.

Numerous undergraduate- and graduate-level classes cover the intersection of ethics with engineering and technology, but one model is the Embedded EthiCS program. This program embeds ethical issues into undergraduate CS coursework by placing questions about ethics alongside traditional coding problems in assignments and lectures. Integrating ethics into technical work strives to bring ethical conversations to a broader range of students, according to Newman – not just those already interested in the topic.

However, Mav Levin ’25, an undergraduate in computer science, emphasized that thoughtful design of such coursework is paramount to keeping students engaged. Levin suggests there needs to be a better balance between building deep understanding of the use cases of technical content and carefully identifying which real ethical questions are worth asking.

“I think learning ethics is important, especially in CS,” Levin said. “I actually enjoy the ethiCS content that is technical or technical-adjacent. However, I strongly dislike the parts that are vague and repetitive.”

Expanding to graduate students is something those at the Engineering, Society, and Technology (EST) hub are interested in pursuing, according to Rajala and Waeiss. This involves working with professors to integrate ethical case studies and content into coursework. For example, over the last two years Waeiss has worked a mock ESR into the coursework for CS 247A: Design for AI, exposing students to the ethical review process and providing a space to think about the societal implications of their work.

But those involved in these programs say there is still a large need to shift the community surrounding ethics in society, and to change how research is approached and valued at the university level. They also emphasize the importance of interdisciplinary work — a difficulty within academia, since researchers are often siloed in their fields and value is placed on publishing in high-impact journals specific to a field.

“At the intersection of the disciplines are the answers to a lot of our big societal issues related to the impacts of technology,” said Ashlyn Jaeger, Associate Director of the Ethics & Technology Initiatives. “If we were talking more across the disciplines it would be easier to identify potential negative downstream complications but also to brainstorm solutions.”

This article has been updated to clarify the ESR panel’s role in releasing research funds.