Dive Brief:
- ProctorU, a company that offers remote test proctoring services, announced Monday that it will no longer offer exam monitoring that relies solely on artificial intelligence to flag potential rule violations and send them to instructors for review.
- The company will now only offer proctoring services in which a trained proctor either monitors students remotely while they take their exams or reviews the recordings after their tests.
- ProctorU says it is making the change because faculty members reviewed few of the exams the software flagged and AI isn't capable of assessing human intent. Remote proctoring companies have also been accused of offering services with racially biased AI and invasive software.
Dive Insight:
Online proctoring companies have come under scrutiny for using AI that flags possible cheating too frequently. Their software often has access to students' webcams, microphones and web browsers and looks for behavior it deems suspicious, such as looking away from the monitor too often.
But those opposed to remote proctoring say it increases students' anxiety during tests and penalizes them for innocent behaviors, such as reading questions aloud. Some reports have also criticized online proctoring companies for using facial recognition systems, which are less accurate at identifying women and dark-skinned people than other groups.
ProctorU said it is moving away from AI-monitored exams for three reasons: test providers weren't consistently reviewing the sessions the software flagged, the technology created more opportunities to unfairly implicate students in misconduct, and it increased instructors' workload.
Schools and testing authorities were only reviewing about 11% of test sessions that ProctorU's AI flagged for suspicious activity, the company said in its announcement. Moreover, it was frequently tagging innocuous behavior, such as a student rubbing their eyes.
Now, the company will have remote proctors either monitor live exams or review recordings after the tests have taken place. In those instances, AI will still flag potential issues, but one of the company's trained proctors plus a second person will need to verify suspicious behavior before passing a report on to the college instructor, who makes the final decision.
"A human can discern the difference between you turning around and asking someone a question about an exam or you turning around and talking to your four-year-old who's asking for a drink of water," said ProctorU founder and chief strategy officer Jarrod Morgan. The company is working with colleges that use its AI-only services to transition them to those that use the company's human proctors.
However, some privacy advocates believe remote proctoring is harmful to students, even if it relies less heavily on AI. Albert Cahn, founder and executive director of the Surveillance Technology Oversight Project, said remote proctoring can cause undue stress for students.
"I'm glad to see the company pulling back one of its most disruptive product offerings," Cahn said. "It's still unclear to me how they can defend this model at all." Project-based learning and other types of assessments are an alternative to such "high-stakes tests," he added.
Some colleges have been embracing that approach. Contra Costa College, in California, issued guidance in October encouraging instructors to use alternative assessments — such as projects, presentations and essays — instead of remotely proctored exams.
The college created the guidance after students told officials that online proctoring was "anxiety provoking," said Maritez Apigo, distance education coordinator and an English professor at the college. "They're saying, you know, they're not able to concentrate on what they're being tested on because they're just so worried about looking like they're cheating, even if they're not trying to cheat," Apigo said.
San Francisco State University took a stronger stance last year, when the faculty senate passed a resolution restricting or banning some third-party remote proctoring.