This feature first appeared in the Summer 2020 issue of Certification Magazine.
With the COVID-19 pandemic impacting all segments of society, including the closure of IT certification testing centers around the globe, remote online proctored testing has stepped to the forefront of the IT certification world. I'm curious to look at it from the standpoint of program owners, and to address the thoughts and concerns of individual test takers.
For programs that were already using some degree of online proctored testing (OLPT), it has been an easy transition to go completely OLPT. In general, any exams scheduled for on-site testing were pushed back a few months, and they will likely be pushed back even further. The timing is right for programs to go completely OLPT, so test takers should get set up to take near-future exams remotely.
This usually means getting an external webcam (i.e., not an integrated laptop cam) as well as setting up any required software. Waiting until just before the scheduled exam time to get set up could result in missing the start time and forfeiting the exam fee. Most organizations also recommend testing the external camera at least a week ahead of time.
Testing providers who use live proctors — a person is assigned to observe the test as you take it and can pause or end the test if they see any unusual behavior — will be sticklers for scheduled test appointments. You should not expect to switch your exam time after it's been set. There is at least one vendor that has live proctors working from home, but there is still a schedule to maintain.
In looking at testing providers, please be aware there is a tremendous difference between a live proctor, who watches the exam as you take it, and proctors who tape an exam session and review it later. We'll circle back to this more passive proctoring in a moment.
What is online proctored testing?
During an online proctored exam, the exam candidate takes the test at home, or from some other convenient location where they are isolated from anyone else. Special software is needed to ensure the test taker's computer is locked during the test — no non-essential functions can be accessed. The testing provider and exam candidate schedule a fixed time and date for the test.
Again, there is set-up involved to not only get the computer ready but also to validate the test taker with facial recognition and ID validation. Once the exam is underway, the test taker is observed, or proctored, by the test delivery vendor.
Certification program owners put parameters in place for the test taker to follow. If any of those parameters are violated during the test, then the exam can be stopped. These parameters, along with the test delivery vendor's expertise in monitoring test taking, help preserve the exam's integrity.
Online test taking parameters
Certain conditions must be maintained during any online exam. Test takers are not permitted to speak aloud, which sometimes throws off individuals who like to read test questions aloud to themselves before working on the solution.
It can jolt the test taker when the proctor interrupts the exam and tells them to stop talking. One key reason for this is that a test taker who reads each question aloud could be recording the questions to sell them later on an exam black market.
One restriction that may seem obvious is that the exam candidate may not get up and go to the restroom during a test. Programs should warn the test taker to use the restroom before starting the exam. Similarly, ducking out of the view of the camera may be allowed once, but twice will get a test stopped.
If any other person enters the room, or it becomes evident that someone else was already in the room, then the test will be stopped. Any unusual activity, such as repeatedly glancing to the side, is not allowed. Under the practiced eye of a live proctor, such behavior will be flagged and the exam stopped.
External cameras
As mentioned earlier, an external camera is generally needed for a live online exam. This refers to a webcam that must be plugged into your computer and mounted, generally along the top edge of your monitor, so as to provide a full and unobscured view of the exam candidate. Some programs have a policy that allows only the use of an integrated, or internal, webcam, but that practice is fading away.
Cheating
The process of catching cheaters during an online proctored exam is both simple and straightforward. The exam candidate is monitored at all times for unusual activity, some of which we've described already. It may sound difficult to detect, but you don't necessarily even have to be a skilled exam proctor to tell that something fishy is happening.
In fact, watching a test cheater in action will leave many people amazed that they would even try to cheat. The external camera plays a key role in confirming the cheating, often including a walkthrough of the premises that essentially allows the proctor to scan the room where the test will be taken. Between knowing the lay of the land and watching the test taker closely, suspicious behavior, which generally involves looking or listening, is easy to spot.
What is even more surprising is the level of protest from cheaters when their exams are stopped. It is remarkable what cheaters will say and do, up to and including threatening a lawsuit (though this is quite rare). Proctors generally have to be firm, expressing that the program owner is simply protecting the integrity of the exam. Other test candidates will appreciate what is being done.
Advantages of online proctored testing
With online proctored testing, exams can be taken beyond normal office hours, and a test taker does not have to travel. It also allows test providers to better meet the expectations of exam candidates. In the 21st century, most individuals in the IT industry expect to be able to take an online proctored test. Not offering one makes a company look behind the times.
Certification executives who have no experience with proctored online exams will need to be educated about their security and validity, but many will have more open minds than they might have even just six months ago. Living through a pandemic is making everyone look at what is possible with a more open mind, so expect more and more programs to go this route.
Bearing in mind our earlier mention of angry cheaters, some certification program managers may question whether remote proctored tests are legally defensible. The short answer is an emphatic Yes. Testing providers have a legal right to require compliance with online proctoring protocols.
Were a case ever to come before a judge, which to date has been a very rare occurrence, the testing provider would need to demonstrate simply that a clear process had been followed to create and deliver the exam. That is essentially what legally defensible means. It does not mean a judge would attempt to determine whether a question and answer are correct.
From a program owner standpoint, understanding what services and capabilities are offered by a given testing provider helps determine whether or not to align with that provider. Bear in mind that if exams are not live proctored, but only taped for review later, the very threat of a test being stopped is lost — removing a major deterrent to cheaters. Passive proctoring also means more effort for the program owner, as taped testing sessions have to be reviewed. (Who has time for that?)
Scenario-based questions can reduce attempts at cheating
Some testing providers offer state-of-the-art facilities, but many testing centers are old and getting older, making them more easily exploitable by cheaters. This is another reason why moving to online proctored testing makes sense. Bandwidth may be an issue in countries like India, which means careful scheduling will be required to offer exam slots during windows when there is less internet traffic.
Concerns about cheating shouldn't be brushed aside, but there are mitigating factors. If an exam is completely scenario-based, for example, then the potential for cheating is dramatically lessened. Even if someone got a copy of the exam ahead of time, they would still need to figure out the answers ... not a likely occurrence if they cannot decipher the scenario.
Creating scenario-based questions directly addresses some of the systemic causes of cheating by making it impossible to simply memorize a range of potential answers. The key, in my experience, is to create entire exams with scenario-based questions and multiple-choice answering. In this way, the program owner is proactive rather than reactive. It is far more difficult to create such exams, but the payoff is exams which are extremely difficult to memorize and cheat on.
By using scenarios, the test taker has to have real-world experience to parse and answer the questions correctly. Scenario-based questions validate real-world experience. Imagine 60 different scenarios a test taker needs to both recognize and understand. This, in my experience, is more efficient than using performance-based tests, which are even more costly and time-consuming to create.
Colleges and universities
A recent news report shone a spotlight on cheating at a Boston-area university, where the students were all sent home during the pandemic. Not having a properly secured remote testing environment led to students actively trading information on other devices while taking the exam.
A level and fair playing field is needed to keep this from occurring in the future. Colleges and universities will have to address this issue quickly and with a greater degree of due diligence. Please see the sidebar to this article (included below) from a Ukrainian college instructor who discusses his personal approach to testing.
From skeptic to believer
In 2005, I was approached by testing provider Kryterion with an offer to engage in online proctored testing, and I was very skeptical. Spending time and energy chasing exam cheats should not be the focus of any certification program. Once the shift was made to create exams that are entirely scenario-based, however, my concerns about using online proctored exams vanished.
Now my most important question is whether to shift entirely to online proctored testing, with on-site testing strictly limited to special events and conferences. Given that the short-term future of major events and conferences is suddenly in doubt, online testing may be the only way to go, at least for a while.
In the overall scope of things, bad actors, cheaters, and test cheat sites are a small minority of the IT certification community. There are still some post-testing analytics to flag suspected cheats, but for the most part, the focus on these bad actors is no longer needed.
There are also other methods that certification programs can use to keep these bad actors in check. The vast majority of test takers are honest, and those are the people we want to spend our attention on. Online proctored testing solves a lot of present logistical problems, and could represent a certification testing future from which we never turn back. Lean into the shift, and please stay safe, everyone!
Sidebar: Observations from the classroom about testing security
When it comes to testing, I allow my students to use certain study materials. I do not allow them to share information with each other during a test, which can be very difficult to track. I adopted this approach from my scientific advisor, who practiced it for years.
What I have learned from letting students take this open book approach is somewhat astonishing. The influence of using study materials on the end result appears to be very moderate, if it exists at all. Good students perform well, and bad students perform poorly — whether or not they have access to study materials during the exam.
There are, of course, some parameters in place. First, I only allow study materials that have been prepared in either printed or semi-oral media (voice-activated, such as cell phones). Second, I define cheating as copying somebody else's work or communicating directly with another individual. The latter is harder to track, because I allow the use of smartphones, the internet, and apps.
In order to address copying, I use a simple method that is explained to my students in advance. If I find two similar solutions, with similar mistakes and similar grammar, then I ban both. I make no attempt to determine who actually did the work and who only copied it. Similar text (approach, symbols, notation, graphs), similar mistakes (mathematical inaccuracies), and similar grammar get both tests flagged.
I've found that, to some extent, every person has an individual style that is as accurate as a fingerprint. The correct solution could be the same, but mistakes are always unique! This is even more true of grammatical errors. Every person has their own vocabulary and education, and it is highly unlikely that two people will write a few sentences in exactly the same words, using commas in the same wrong places, and making the same misspellings or other mistakes.
This simple rule works quite well. Every time we have a written test where students have to solve problems, I end up banning a few papers. It motivates students to work alone in the future.
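For readers curious how such a similarity check could be automated, here is a minimal, hypothetical sketch in Python. It is not the author's actual process, which he does by eye; it simply uses the standard library's difflib to compare every pair of submitted answers and report those whose wording overlaps beyond an arbitrary threshold, leaving the real judgment to the instructor. The student identifiers, answer texts, and 0.8 threshold are all illustrative.

```python
from difflib import SequenceMatcher
from itertools import combinations

def flag_similar_pairs(submissions, threshold=0.8):
    """Return (id_a, id_b, ratio) for pairs of answers that look suspiciously alike.

    `submissions` maps a student identifier to the raw text of that student's
    solution; the 0.8 threshold is an illustrative guess, not a calibrated value.
    """
    flagged = []
    for (id_a, text_a), (id_b, text_b) in combinations(submissions.items(), 2):
        ratio = SequenceMatcher(None, text_a, text_b).ratio()
        if ratio >= threshold:
            flagged.append((id_a, id_b, round(ratio, 2)))
    return flagged

# Two near-identical answers (same wording, same misspelling) are flagged;
# a genuinely independent solution is not.
answers = {
    "student_1": "The integral diverges becuase the exponent is less than one.",
    "student_2": "The integral diverges becuase the exponent is less than one!",
    "student_3": "Since p < 1, the comparison test shows that the integral diverges.",
}
print(flag_similar_pairs(answers))  # e.g. [('student_1', 'student_2', 0.98)]
```

Shared "fingerprints" such as the repeated misspelling above are exactly what drives the similarity score up, which mirrors the manual rule described here.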
Regarding exchanging texts with another person nearby, or asking a question on a forum, these problems are more difficult to address. In general, the best approach is to have a semi-oral type of control, where you can ask the student how they solved a problem, or even what they have written in their work.
In cases where I cannot speak to a student, there are a few things I can do. During our lectures, we use certain notation, reference approaches, and so on. Very often, other instructors use different notation, and sometimes the notation itself reveals where the text comes from.
For example, I have already learned about some articles in Wikipedia from my students' written work on exams and can recognize them. If I see suspicious signs, like an approach to solving the problem we never used, an unusual notation, or phrases from Wikipedia, it raises a flag and I may have a conversation with that student.
This rarely ever happens in practice, because we discuss the potential problem only if the solution provided by the student in the exam is right. In practice, such questionably derived solutions are almost always wrong, and, in the end, it does not matter what the exact source was if the question was answered incorrectly.
I have also found, to my surprise, that I can give students practice tasks very similar to the ones on the test, with detailed solutions provided up front, then leave everything the same on the test and change only the numbers. All they need to do is take my solution and put another number into it.
This does improve marginal results a little bit, but the pattern still tends to be the same. Good students perform well and bad students perform poorly, no matter the form and content of the test. Now, with classes meeting remotely, we face another challenge: how to test students online with a multiple-choice test.
My college does not have the option of using a commercial tool, so I plan to use the following strategy: First of all, no questions or tasks should be such that the answer can be Googled. I'm planning to generate questions in .png format, so students cannot copy and paste them. Also, each question should involve critical thinking, not a simple yes/no answer or a basic definition.
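As a rough illustration of that first control, the sketch below (a hypothetical example, not the author's actual tooling) renders a question's text into a .png file using the Pillow imaging library, so the question circulates as an image rather than as selectable text. The font, dimensions, sample question, and file name are all assumptions for the sake of the example.

```python
import textwrap
from PIL import Image, ImageDraw, ImageFont  # pip install pillow

def question_to_png(text: str, out_path: str) -> None:
    """Render a question's text into an image so it cannot be copied and pasted as text."""
    lines = textwrap.wrap(text, width=70)    # naive word wrap at roughly 70 characters
    font = ImageFont.load_default()          # swap in a TrueType font for real use
    margin, line_height = 20, 16
    width, height = 640, 2 * margin + line_height * len(lines)

    image = Image.new("RGB", (width, height), "white")
    draw = ImageDraw.Draw(image)
    for i, line in enumerate(lines):
        draw.text((margin, margin + i * line_height), line, fill="black", font=font)
    image.save(out_path)

# Illustrative usage: write one question to a file that can be embedded in an online quiz.
question_to_png(
    "A fair die is rolled three times. Explain how you would find the "
    "probability that the sum of the three results is exactly 5.",
    "question_01.png",
)
```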
My second control, which is critical, is time constraints. If you have only 20 minutes to answer 20 questions, then you simply do not have time to search for an answer.
A problem that remains to be solved is the possibility of a student using a proxy to take the test in their place. Unless there is some validation or testing environment to address this, post-analysis will be required. But this brings me back to one of my key observations. For whatever reason, it remains very rare that a really bad student will submit very good work.
Vitaly Golomoziy teaches college courses in Ukraine and is a colleague of Peter Manijak at Magento, an Adobe Company.