By Derek Newton
If you follow me, or my writing on education, you know I regularly explore the topic of cheating, especially in higher education.
As part of that process, I run across items that are interesting but that probably won’t be explored or built into full articles. Rather than shelve them, I will make a better effort to share them – for whatever they’re worth.
Here are two.
That Report on “Snooping”
A few weeks back, an organization called Surveillance Technology Oversight Project (STOP) released a report called “Snooping Where We Sleep.” It was supposed to be about remote test proctoring, the service that colleges and others use to limit cheating in online testing.
The third line of that report says, “The need for these tools is not clear: evidence suggests that students cheat less on online exams than in traditional classroom settings.”
I found that interesting because that’s not what the evidence says. The evidence, in fact, shows just the opposite – that students cheat far more frequently online. Here are a few examples.
I’d previously written about recent research from scholars at the University of South Carolina on exactly this point – whether students are more likely to cheat in online settings, especially during Covid-19.
A different academic research study, also from this year, found, “With the advent of online learning, that ability for students to engage unseen with faculty has grown, as has the ability for students to cheat and rarely get caught.”
Another from 2014 said, “Students in online courses have the highest tendency to cheat, with more than 70% admitting to cheating.”
And I wrote a lengthy story in the Washington Post on cheating during Covid. It’s really not a close call. It’s far more common online and growing rapidly.
So, the idea that “evidence suggests that students cheat less on online exams” is not credible. Neither is much of the rest of the report, which features little but recycled news stories. But if there were any evidence on the point of online cheating, as they claim, that would be news.
The footnote the report uses for that “evidence” links to a list of closed schools in New Jersey. The report lists the title of the footnoted article as “NJ Schools to Shut Down Wednesday; See Tri-State Closures Here, NBC N.Y.” So it probably isn’t a case of a bad link. That’s the article that’s listed, and it matches the link. But it says nothing about the incidence of cheating, either online or elsewhere.
Assuming it is just a mistake, and being quite interested in any research that contradicts that established literature, I e-mailed STOP. Twice. Then I called. No response.
That got me curious about the group itself – STOP. So, I looked up their 990, the form all non-profits are required to file with the IRS. It shows eight donations of $5,000 or more, including one of $18,250 and another of $50,000. The 990 form requires organizations to list who gave those donations, including name and address. But the STOP form is blank here – listing the amounts, but not the names of the donors.
It seems that the organization that’s ostensibly about privacy is pretty private about who is funding them. Maybe that makes sense. But it could help explain why they’re interested in making it easier to cheat in college.
And, of course, the report was blindly reported by CODA without, it seems, any skepticism or verification.
When No One is Watching, Even When They Should Be
That CODA piece was awful reporting.
An example of good reporting was this piece out of Iowa. It was on a rather routine audit report on online education at the University of Iowa.
Like many colleges, Iowa uses a remote proctoring service to monitor online exams – Proctorio, which only records the exam session and uses AI systems to “flag” behavior that may be cheating. Teachers get notices about the flagged exams and can review the recording to make a call as to whether cheating happened and what to do about it.
The report, as the article shows, found that only about 13% of the flagged sessions – tests that the proctoring company suspected may include cheating – were reviewed by faculty or school personnel.
That’s just as bad as it sounds.
And it means that one of two things is happening in the 87% of flagged but unreviewed test sessions. One, that cheating is going on and no one is noticing. More accurately, that no one is concerned enough to even check – even when there’s direct reason to suspect it’s happening. Which means that, even when they’re flagged, students are getting away with cheating nearly nine times in ten.
Option two is perhaps worse — that teachers are simply trusting the AI system and acting accordingly, failing or disciplining students without seeing the evidence. Since no AI system can be perfect, that would mean that students who are not cheating are being punished for it anyway.
It’s probably the first option because you’d think that an honest student, wrongly accused, would raise hell. They should.
But honestly, that no one is watching recorded test sessions is not news. People who follow cheating know that’s the case just about everywhere. It’s a dirty secret that remote proctoring that only records test sessions is useless. Very few, shockingly few, are ever watched by a human.
You could argue that simply recording a session is a deterrent to cheating, that the possibility of being caught may keep someone from trying. But students aren’t dumb. As word gets around that no one is watching, even the deterrence aspect won’t work.
Students don’t even need to be clever – they just need to be able to do math. It’s an easy calculation that no teacher is going to watch dozens or even hundreds of test sessions. If an exam is 90 minutes and the class has 45 students and 15% are flagged for suspicious activity, that’s more than 10 hours of watching test video for just one class. That’s never going to happen.
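For what it’s worth, here is that calculation spelled out: a quick back-of-envelope sketch in Python using the example numbers above (rounding the flagged count up to a whole session is my own assumption).

```python
import math

# Back-of-envelope estimate of how long it takes one teacher to review the
# flagged exam recordings from a single class. The figures are the example
# numbers from the paragraph above.
students = 45        # students in the class
exam_minutes = 90    # length of each recorded exam session
flag_rate = 0.15     # share of sessions flagged as suspicious

flagged_sessions = math.ceil(students * flag_rate)    # 7 sessions
review_hours = flagged_sessions * exam_minutes / 60   # 10.5 hours

print(f"{flagged_sessions} flagged sessions is about {review_hours:.1f} hours of video to review")
```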
In fact, the math of reviewing those videos, the hours it requires, exonerates teachers for not doing it. What professor has that kind of time? Even if they wanted to chase cheating — and they don’t — they don’t have the time. So, video goes unviewed and even obvious cheating goes unaddressed.
The problem is the system. Record-and-flag doesn’t work. Sure, it’s cheaper. And it lets deans say how seriously they’re taking academic integrity, even though they have to know that no one is really watching.