AI Camera and Mic Tracking: Face and Noise Detection
Ensure candidates are always looking at the test and not receiving any audio prompts.

Trusted by 40,000 customers to conduct 20 Million+ Tests

How AutoProctor's AI prevents impersonation and audio cues

No Face or Multiple Faces Detected
Candidates may step away from the computer to look at their phone, or ask a friend to look at the questions. We detect both events.
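
The logic behind these two flags can be sketched as a simple mapping from a face detector's per-frame count. This is illustrative only; the detector, function name, and labels here are assumptions, not AutoProctor's actual implementation:

```python
def classify_frame(face_count):
    """Map a face detector's count for one frame to a violation label.

    `face_count` is assumed to come from any off-the-shelf face
    detector run on a webcam frame.
    """
    if face_count == 0:
        return "No Face Detected"        # candidate stepped away
    if face_count > 1:
        return "Multiple Faces Detected"  # someone else is in frame
    return "OK"                           # exactly one face: no violation
```

A frame with exactly one face passes; zero or more than one face is flagged for review.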

Audio Prompts Detected
If someone next to the candidate speaks the answers aloud, that gets recorded. You can play the recording to determine whether the candidate was being given cues.

Random Photos During Exam
We take photos at random intervals throughout the test. You can look at the photos to determine if it was the right candidate attempting the test.
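
As a rough sketch, taking photos at random intervals can be as simple as sampling timestamps uniformly across the exam window. The function name and parameters below are hypothetical, not AutoProctor's actual code:

```python
import random

def schedule_photos(duration_seconds, num_photos, seed=None):
    """Pick random, unpredictable timestamps within the exam window.

    Returns the capture times in chronological order. A fixed `seed`
    is only for reproducible demos; in production it would be omitted
    so candidates cannot predict when photos are taken.
    """
    rng = random.Random(seed)
    return sorted(rng.uniform(0, duration_seconds)
                  for _ in range(num_photos))
```

Because the times are sampled independently for every session, a candidate cannot know when the next photo will be captured.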
Start conducting exams with AutoProctor today
Experience seamless online proctoring and elevate your assessment process.
Frequently Asked Questions
Is a human proctor watching the candidates?
No! The process is completely automated. Our AI analyzes the camera feed as the candidate attempts the test and then detects any violations. There is no human anywhere in the process.
How do you handle candidate privacy and data?
Do read our Privacy Policy to understand this better. Unlike most other proctoring tools, we don't store the full exam session. We only store evidence of violations (and a few random photos, if enabled). So if a candidate doesn't do anything suspicious, no data is stored.
Secondly, we comply with a whole host of student privacy laws in North America, and with the GDPR in Europe. Do read our Privacy Policy for more details.
Lastly, we do not sell any of this data. As outlined in the Privacy Policy, the data is automatically erased within a few months of being generated.
What happens if a candidate blocks or revokes camera or mic access?
If camera or mic access has not been granted, or has been revoked, we don't let the candidate continue with the test. So, they will have to grant access to see the questions. If they block the camera with their hand or otherwise obstruct it, you will see it as a violation and the Trust Score will be lower.
Why does the AI sometimes report No Face Detected even though a face is visible?
Like most AI applications, AutoProctor can make mistakes. Even though you may see a face in the photo, it may be categorized as No Face Detected. In almost all cases, this is because the candidate (i) isn't in a well-lit environment, (ii) isn't looking directly at the camera, (iii) doesn't have a plain background, or (iv) isn't fully within the camera frame. By changing these things about the test-taking environment, the candidate can get the AI to recognize their face.
How accurate is noise detection?
Noise detection is fairly accurate. You may see a few false positives, because AutoProctor cannot distinguish between human and environmental noise. So, you'll have to play the audio and then decide whether the candidate was cheating.
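
A minimal sketch of why false positives happen: a basic noise detector thresholds the loudness (RMS energy) of an audio window, and a plain amplitude threshold fires on any loud sound, speech or not. The threshold value and function names below are illustrative assumptions, not AutoProctor's actual pipeline:

```python
import math

def rms_energy(samples):
    """Root-mean-square amplitude of one window of audio samples."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

def noise_detected(samples, threshold=0.1):
    """Flag a window whose loudness exceeds the threshold.

    Energy alone cannot tell a whispered answer from a slammed door,
    which is why a human still reviews the flagged recordings.
    """
    return rms_energy(samples) > threshold
```

A silent room stays below the threshold; any sufficiently loud sound, human or environmental, gets flagged.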