Detection of Malpractice in E-exams by Head Pose and Gaze Estimation

Document Type

Article

Publication Title

International Journal of Emerging Technologies in Learning

Abstract

Examination malpractice is deliberate wrongdoing, contrary to official examination rules, intended to place a candidate at an unfair advantage or disadvantage. The proposed system presents a new use of technology to identify malpractice in e-exams, a need that has grown with the expansion of online education. Current solutions to this problem either rely entirely on manual labor or contain vulnerabilities that an examinee can exploit. The proposed application is an end-to-end system that, with the help of visual aids, assists an examiner/evaluator in deciding whether a student has completed an online exam without any probable attempt at malpractice or cheating. The system works by categorizing the student’s visual focus of attention (VFOA) data, capturing head pose and eye gaze estimates using state-of-the-art machine learning (ML) techniques. It requires only that the student (test-taker) have a functioning internet connection and a webcam to transmit the feed. The examiner is alerted when the student’s VFOA wavers from the screen more than X times, where X is a predefined threshold. Once this threshold is crossed, the application saves the student’s data for the periods when the VFOA is off the screen and sends it to the examiner, who manually checks it and marks whether the action was attempted malpractice or merely a momentary lapse in concentration. The system uses a hybrid approach with two different classifiers: one is applied when gaze values are read successfully; when gaze reading fails, for reasons such as poor transmission quality or glare from spectacles, the model falls back to the default classifier, which uses only head pose values to classify the attention metric. This attention metric is then used to map the student’s VFOA and assess the likelihood of malpractice. The model achieves an accuracy of 96.04 percent in classifying the attention metric.
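The hybrid-classifier fallback and threshold-based alerting described in the abstract can be illustrated with a minimal sketch. This is not the authors' implementation: the classifier decision rules, the angle cutoffs, the frame fields, and the threshold value used here are assumptions made purely for illustration.

```python
from dataclasses import dataclass, field
from typing import Optional, List

# Hypothetical per-frame observation: head pose angles are assumed to be
# available every frame, while gaze angles are present only when the gaze
# estimator succeeds (e.g. no glare or transmission loss, per the abstract).
@dataclass
class Frame:
    yaw: float                           # head pose yaw (degrees)
    pitch: float                         # head pose pitch (degrees)
    gaze_yaw: Optional[float] = None     # None when gaze reading failed
    gaze_pitch: Optional[float] = None
    image_id: str = ""                   # reference to the captured image

@dataclass
class ProctorState:
    threshold_x: int = 5                 # assumed value of the predefined threshold X
    off_screen_count: int = 0
    flagged_frames: List[Frame] = field(default_factory=list)

def attention_from_gaze(frame: Frame) -> bool:
    """Primary classifier: uses gaze values (stand-in rule for illustration)."""
    return abs(frame.gaze_yaw) < 20 and abs(frame.gaze_pitch) < 15

def attention_from_head_pose(frame: Frame) -> bool:
    """Fallback classifier: head pose only (stand-in rule for illustration)."""
    return abs(frame.yaw) < 30 and abs(frame.pitch) < 20

def process_frame(frame: Frame, state: ProctorState) -> bool:
    """Classify one frame's VFOA; return True if the examiner should be alerted."""
    # Hybrid approach: prefer the gaze-based classifier, fall back to the
    # head-pose-only classifier when gaze values could not be read.
    if frame.gaze_yaw is not None and frame.gaze_pitch is not None:
        on_screen = attention_from_gaze(frame)
    else:
        on_screen = attention_from_head_pose(frame)

    if not on_screen:
        state.off_screen_count += 1
        state.flagged_frames.append(frame)   # saved for manual examiner review

    # Alert once the number of off-screen events exceeds the threshold X.
    return state.off_screen_count > state.threshold_x
```

In this sketch the flagged frames accumulate in `ProctorState.flagged_frames`, standing in for the saved off-screen data that the abstract says is sent to the examiner for manual marking.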

First Page

47

Last Page

60

DOI

10.3991/ijet.v16i08.15995

Publication Date

1-1-2021
