Algorithmic Bias in Online Test-Monitoring Programs

by Vins

Social distancing, made necessary by the COVID-19 pandemic, has transformed education. One of these transformations, Nora Caplan-Bricker reports for the New Yorker, is educational institutions’ increased reliance, with the pivot to remote learning, on software programs designed to monitor students taking exams for possible signs of cheating. Caplan-Bricker’s report draws on interviews and studies to give a picture of how these test-monitoring systems introduce new, implicit racial biases into higher education. Despite these issues, remote-proctoring companies such as Proctorio, ProctorU, Examity, and ExamSoft have profited from the lack of trust between higher education institutions, professors, and their students by building mistrust and racist ideology into the technology itself.

Proctorio operates as a browser plug-in that can detect whether a student’s gaze is directed at the camera, how often they look away from the screen, and how much they type and move their mouse, comparing each student’s rate of activity to a class average. On the basis of these measures, the software calculates a “suspicion score” for each student, flagging students who deviate too much from the norms for possible acts of cheating.

Students of color are more likely to be flagged by the technology because the software uses datasets in which white faces are overrepresented, leading to problems in how accurately such software can identify people of varied sex, age, and racial background. A 2019 study by the National Institute of Standards and Technology (NIST) found higher rates of false positives for Asian and African American faces relative to images of Caucasians, with the highest rates of false positives for African American women.

“The surge in online-proctoring services has launched a wave of complaints,” Caplan-Bricker reported. For instance, students with dark skin describe how proctoring software fails to discern their faces; low-income students are flagged for unsteady Wi-Fi or taking exams in rooms shared with family members; and Proctorio’s “ID Verification” procedure, which requires students to pose for a photograph with an ID that may bear a previous name, has outed transgender students.

In effect, Caplan-Bricker’s report describes the commercialization of policing ‘academic fraud,’ which places students of color at risk of both unconscious and direct bias. “When they can’t get to know you as a good student, it furthers the weird distrust everyone is feeling,” Grace Massamillo, a student at the University of Texas at Austin who was ejected from an exam after she resized the text on her computer screen, told the New Yorker. “I felt like I was fighting to prove my academic integrity more than my knowledge.”

As we continue to grapple with the COVID-19 pandemic, the use of test-monitoring and surveillance software in higher education has made establishment media headlines. For example, in March 2021, USA Today published a first-person opinion article about the invasive role of remote learning technology. While this piece addressed biases in facial recognition technology, it did not address the higher misidentification and labeling rates for students of color. CNBC also published an article on the topic in March 2021, but its coverage focused on how students have “learned new ways to cheat” during the pandemic, with only passing mention of software that aims to identify cheating. The article did not address concerns about racial bias in test-monitoring software. Overall, establishment news coverage has buried the topic of racial bias in the test-monitoring software in use at colleges and universities across the United States.

Source: Nora Caplan-Bricker, “Is Online Test-Monitoring Here to Stay?” The New Yorker, May 27, 2021.

Student Researchers: Emily Inman, Aidan Burke, Grace Sherwood, and Kathleen Boulton (University of Massachusetts Amherst)

Faculty Evaluator: Allison Butler (University of Massachusetts Amherst)