This article was originally published on BizEd.
EHL learns how to fast-track remote exams and rethink student assessment.
When the COVID-19 pandemic prompted the Swiss government to close all of its higher education institutions in mid-March 2020, Ecole hôtelière de Lausanne (EHL) shut down its campus in less than three hours. Our institution’s primary campus is located in Lausanne, Switzerland, and we had to send more than 2,000 students to their homes in 90 countries.
At the time, we were planning to hold spring exams in June. EHL did not yet have the ability to test students remotely, and it was daunting to consider how much work we would have to do to make such tests possible. However, we realized we had a unique opportunity to make a major shift in a short period. Within a month of EHL’s closure, we had cleared bureaucratic hurdles, prepared alternative exam formats, and vetted new proctoring software. By July 10, 2020, we had used our learning management system (LMS) to administer more than 12,000 online exams to 1,545 undergraduate students sheltering in place around the world.
The successful rollout of remote exams now makes our former process of on-campus testing look antiquated and impractical. In fact, we have no plans to return to our old approach to exam delivery. The whole experience has made us reassess how we think about exams, assessment, and remote learning in general. At the same time, it validated our approach to problem solving and our commitment to applying service-oriented thinking to our processes.
When change happens suddenly, it is important to think strategically, rely on common values, and have a dedicated team working on solutions. That attitude has helped us tremendously through some very tough days.
Replacing an outdated process
Before COVID-19, EHL students took exams on campus over the course of two weeks, with two exam sessions a day. About 50 percent of the exams were delivered over campus-based PCs set up by the IT department. The other 50 percent consisted of paper exams, which required 100,000 sheets of paper per session. Exams were closed book. To deter cheating, we randomly assigned seat numbers to students.
The logistics of running exams on campus proved laborious on many levels. For the IT department, it meant setting up 500 laptops and troubleshooting technical issues. For the administration, it required printing reams of paper before each test, and then electronically scanning each completed exam to allow students to review their answers online later. We also had to hire, train, and schedule 50 external exam invigilators and 20 academic assistants to watch over the students during test time.
In retrospect, it is clear that our previous system was imperfect. We used many resources to plan our schedule and deploy our personnel. We also encountered technical difficulties when PCs crashed, which made the process hard on the students and the IT department.
Even before COVID-19 occurred, our institution had begun investigating how to use our LMS to conduct exams. That meant we had a bit of a head start once the pandemic hit, but we still faced a number of challenges in the brief time we had before delivering the first tests online in June.
Taking the initial steps
The decision to move forward with remote exams was not easy. After all, we would have to find a way to uphold our mission of quality education, test students located in 90 different countries, and discourage cheating. In fact, we considered asking students to return to campus in August to take exams in person once the virus subsided. But we abandoned that option because it would have been costly for the students—and there was no guarantee that the pandemic would be over by then.
With remote exams the only viable solution, we had to ask for approval from the Swiss regulatory organization that accredits higher education in our region. This body must sign off on all fundamental changes to curriculum.
The first response was a resounding no. What followed were weeks of wrangling between EHL and the regulatory body, a process made more frustrating because we were all working remotely and had to communicate through phone and video calls. Finally, the organization made an exception for us, deciding that EHL could pioneer remote exams for higher ed in the region. That made perfect sense to us, since it is part of the EHL mission statement to pioneer hospitality education.
Our next step was to create a steering committee that comprised representatives from IT, academia, and business operations. This team not only helped during negotiations with the Swiss authorities, but it also defined the overall plan, designated resources, monitored progress, and acted as a liaison with important stakeholders. As executive dean and managing director of the school, I worked with the team to meet our goals.
Addressing cheating concerns
One of the team’s first tasks was to determine how to effectively monitor remote exams—which started with reviewing the way our institution assesses the risk of dishonesty. In general, educators hold one of two common views about cheating. Some take an extreme position: They believe most students will find a way to cheat, so schools must devise solutions that minimize the risk. Others take a more benign view: They believe most students will not cheat, so institutions don’t need to impose harsh measures.
At EHL, we take a middle ground. Our philosophy is that most of our students will follow our honor code, but we still must put safeguards in place. We also let students know they will be sanctioned if they’re caught cheating.
This perspective influenced our next steps in several ways. First, we decided to make remote exams open book, thereby mitigating the temptation for students to cheat and alleviating the stress they might experience when taking exams online. We believe that, at the bachelor’s level, it’s more important to use tests to allow students to display their competencies and prove that they can do the work rather than to verify that they have memorized the material. This is especially true at the end of the curriculum, when students are more seasoned.
We also tried to minimize potential cheating by having all students sit each exam at the same time, regardless of location. Multiple time zones made this challenging, but after much discussion, the committee chose noon Central European Time. This meant that 5 a.m. was the earliest possible start time for any of our students, and 10 p.m. was the latest.
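The arithmetic behind that decision can be checked directly. The sketch below converts a noon Central European start time into local start times for a few sample time zones; the specific zones are illustrative assumptions, not EHL’s actual student roster.

```python
# Sketch: local exam start times when every exam begins at noon
# Central European Time (Europe/Zurich). The student time zones
# below are illustrative examples, not EHL's real enrollment data.
from datetime import datetime
from zoneinfo import ZoneInfo  # requires Python 3.9+

EXAM_START = datetime(2020, 7, 1, 12, 0, tzinfo=ZoneInfo("Europe/Zurich"))

SAMPLE_ZONES = ["America/Chicago", "Asia/Shanghai", "Pacific/Auckland"]

for zone in SAMPLE_ZONES:
    local = EXAM_START.astimezone(ZoneInfo(zone))
    print(f"{zone}: exam starts at {local.strftime('%H:%M')} local time")
```

For a July exam, a student in the central United States would start at 5 a.m. and one in New Zealand at 10 p.m., which matches the spread the committee accepted.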
Finally, we invested in anti-fraud software to ensure the integrity of exams that would take place on students’ computers. After the steering committee looked at two proctoring vendors, we selected software that accommodated global users and integrated well with Moodle, our LMS. The software did have one quirk: It runs as a Google Chrome extension. Not until later did we realize that Google Chrome is unstable in China and could prevent 97 EHL students in that country from successfully completing their exams. We decided that if any of those students failed, we would give them the option of retaking exams in August 2020.
The software that we selected deters cheating because, once it’s enabled, it can record a student’s activities via the computer’s audio and webcam. It also collects data that shows if a student attempts to right-click, print or copy and paste material, navigate to other pages outside the exam, or take the test on the same network as another student. Suspicious behavior is flagged in a report after the exam is completed, and we can decide after further review whether we need to take disciplinary action against the student.
Preparing all participants
Changing such a familiar, well-worn process can cause people to worry about the unknown. To assuage the fears of students and faculty, the steering committee doubled down on preparing everyone for the reality of remote exams.
On the EHL intranet, the team posted FAQs, instructions, and user guides. It created a step-by-step video to show users what to expect when installing the Chrome extension and how to access the exam on the LMS. The team also held mock tests on the LMS so students and faculty could get comfortable with the system. In addition, it organized a hotline to assist students who might have questions during the exam. Through it all, the committee sent out a regular cadence of communications to keep everyone up-to-date on the latest news.
As we worked furiously toward a mid-June deadline, two things took us by surprise. We thought that new technology would pose a problem for faculty in their 50s. Instead, they embraced the challenge, telling us they saw remote exams as an opportunity to learn new skills.
Conversely, we had expected that our millennial and Generation Z students would find it easy to make the transition to digital tests. But we discovered that many had trouble adapting to remote learning and were getting frustrated. While they were comfortable with the digital environment, they weren’t used to being autonomous and they needed a human connection. Their helplessness and sense of frustration might have arisen in part because, as students of a hospitality-focused institution, they are the type of people who crave human contact and excel at customer service.
Still, we realized they weren’t as self-directed as we thought. They already were stressed from having to study from home during the pandemic, and now they were worried about online exams. As a result, they needed more guidance, more support, and more structure than we had initially anticipated. In response, the faculty and academic assistants offered extra tutorials to students, while a coaching center guided them on more effective learning and study methods. More important, in early June, we began heavily focusing on providing support during the exam session.
Administering the first tests
Remote exams took place the last week of June and the first week of July. The June exams served as a pilot; they were retake exams for students who had failed their exams the previous semester. The July exams were a much larger session, as students had tests for each class. In general, the exams went well.
The hotline proved highly effective. Students used it to ask questions about exams or get assistance with technical issues. We had three people managing the hotline and two backups on call. Those providing support included members of the steering committee, the administration, academia, and IT. Faculty and their assistants were on call for questions pertaining to their specific exams. If no one was available to talk, students were forwarded to a message box and their calls were promptly returned.
During the first three days of remote exams, the hotline addressed more than 1,000 calls a day. But it became clear to us that students felt the most stress during their first online exams; by their second and third tests, they had become more comfortable with the process, and calls dropped significantly. We were pleased with the way the hotline, the FAQs, the extra tutorials, and the coaching center smoothed the exam process, and we will continue to use them in the future.
We were also pleased with the proctoring software. Once our students completed their exams, we received a report that highlighted suspicious activity, but we did not investigate every single incident. Instead, we established a threshold. If a student surpassed that threshold by engaging in a particular number of suspicious actions, we then investigated further by watching sections of the video of the student taking the test. To respect the privacy of the student data, we authorized only a limited number of persons to watch those videos. We also established strict rules for data storage.
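The triage logic described above—count each student’s flagged incidents, and only investigate those who exceed a threshold—can be sketched in a few lines. The record format, field names, and threshold value here are illustrative assumptions, not the vendor’s actual report schema.

```python
# Sketch: threshold-based triage of proctoring flags, assuming the
# software exports one (student_id, flag_type) record per incident.
# The threshold value and data shape are hypothetical illustrations.
from collections import Counter

REVIEW_THRESHOLD = 5  # hypothetical cutoff for further investigation

def students_to_review(flag_records, threshold=REVIEW_THRESHOLD):
    """Return sorted student IDs whose flag count exceeds the threshold."""
    counts = Counter(student_id for student_id, _flag in flag_records)
    return sorted(sid for sid, n in counts.items() if n > threshold)

# Usage with made-up data: only s01 exceeds the threshold.
records = [("s01", "copy_paste")] * 7 + [("s02", "tab_switch")] * 2
print(students_to_review(records))
```

Only students returned by such a filter would have their exam videos reviewed, which is what let a small authorized group handle follow-up rather than watching every recording.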
Ultimately, we had to discipline 1.5 percent of the students who took exams in July and 4 percent of the students who failed their July exams and retook them in August.
The proctoring software significantly reduced the number of people we needed to administer the exams—just eight to ten academic assistants, as well as two staff members from the administration. And we only needed a handful of people to review the reports and follow up on suspicious behavior.
Overall, we consider the rollout of remote exams a success. Several factors contributed to our ability to carry out the initiative in a compressed timeline:
The EHL philosophy. EHL faculty, students, and staff are dedicated to creativity and innovation. They’re hard-wired to be very customer-centric, just like the hospitality industry itself. Everyone on my team asked, “What do you need? Because this is my problem, too.” All of them were invested in making this happen.
A dedicated steering committee with project management experience. Team members worked very closely together. They knew how to coordinate operational steps, how to implement projects efficiently, how to assess risks, and how to communicate with stakeholders. Because we were dealing with such tight deadlines, this experience was particularly important—no one wasted time asking how to do something. They already knew how to do it. This knowledge enabled the committee to move very quickly.
The faculty members of our academic and assessment boards. The academic board oversees the quality of EHL’s academic programs and, with the dean’s input, has authority to develop and execute initiatives. The assessment board reports to the academic board and helps the dean review the institution’s evaluation and assessment philosophy. The members of both boards are faculty, which proved invaluable in getting other professors to support remote exams. For example, the assessment board researched and defined open-book exams for the institution and promoted the format to other faculty members.
I had created these groups in the fall of 2019, and I highly recommend that other institutions consider instituting similar boards on their campuses. Deans can select people who are excellent teachers, have expertise, and are committed to moving education forward. These boards also improve the relationship between administrators and professors, because they encourage faculty to come up with new ideas rather than waiting for the administration to push its own agenda.
The future of exams
Now that EHL has successfully administered remote exams to students in the bachelor’s program, we see the writing on the wall. Even if we could, we wouldn’t go back to the old system.
It’s easy to see why. First, remote exams require fewer people to organize and manage the process—as I mentioned, we were able to drastically reduce the numbers of our invigilators. We once needed about 70 people to keep an eye on students for each exam session. Now we use proctoring software and have just a few staff on hand to review suspicious activity.
Second, remote exams are simpler to execute than on-site tests. We no longer have to reserve rooms on campus, set up 500 PCs, or scan every completed paper. Third, online exams are environmentally friendly because of the reduction in paper use. Finally, administering online exams has fostered among faculty a growing acceptance of open-book tests.
As we have created exams on our existing LMS platform, we also have activated new features. For instance, we have instituted word counts as a way to limit the length of text answers. We use another tool to detect whether students are plagiarizing.
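A word-count limit on text answers is simple to reason about. The standalone check below illustrates the idea; it is a sketch, not Moodle’s actual implementation, and the limit shown is an arbitrary example.

```python
# Sketch: enforcing a word limit on a free-text exam answer.
# Illustrative only; the real check runs inside the LMS (Moodle).
def within_word_limit(answer: str, limit: int) -> bool:
    """True if the answer contains at most `limit` whitespace-separated words."""
    return len(answer.split()) <= limit

print(within_word_limit("Open-book exams test competencies, not recall.", 10))
```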
The success of our first round of remote exams has caused us to rethink assessment in higher education. At the end of the day, what matters is not how we deal with a crisis, integrate technology, or handle an external disruption like COVID-19. What matters is pedagogy. What matters is how we challenge ourselves to find the best ways to support individual learning and create meaningful, powerful, actionable educational experiences for our learners.