What happens when a software tool violates someone’s privacy? Far from an edge case, this is an issue we see quite frequently in online proctoring.
Organizations want to be sure that people taking their training are who they say they are, and that they're participating in good faith. But this often requires participants to hand over personal data and submit to measures like ID verification, room scans and audio recording.
Who’s responsible for ensuring this is all done legally and ethically? Recent legislative shifts have placed the onus firmly on the organizations doing the testing, rather than the software providers themselves.
In this post, we'll look at a few key laws and cases from Canada, the UK and the United States to shed some light on this evolving landscape and its implications for anyone who runs online testing and training.
The Canadian Anti-Spam Legislation (CASL) was created in 2014 “to reinforce best practices in email marketing and combat spam and related cyber threats.” Among other things, CASL requires a request for consent and written acknowledgement when using ‘invasive computer programs.’ CASL also requires organizations that use invasive software to provide users with assistance in removing it afterwards.
So, what is an invasive computer program, and how do you know if your online proctoring tool qualifies as one?
By blocking restricted sites, monitoring web use and requiring camera access, online proctoring services clearly fall under CASL. Additionally, few, if any, online proctoring plug-ins provide uninstall assistance. Most stay active by design, in an effort to save the proctoring provider from having to deal with additional support requests.
So what’s the risk of CASL noncompliance? If you’re an online training provider and your proctoring system leaves you liable to CASL violations, you could face regulatory penalties of up to $10 million per violation.
The General Data Protection Regulation (GDPR) implemented by the European Union has set a universal standard for data privacy and security. Under GDPR, fines for inappropriate collection, storage, transfer or deletion of personal data fall on the data controller, rather than the data processor — a key distinction that, like CASL, shifts responsibility away from third-party software vendors.
For online proctoring, this means that the organization running the testing is considered the data controller, while the proctoring tool is the data processor. Testing and training organizations are ultimately liable for how they deploy and maintain software that handles personal data.
As Ireland’s Data Protection Commission put it in a judgment against Slane Credit Union Limited, “processors cannot be used by controllers as a legislative safety net, and it is essential that due diligence is carried out to ensure[...] the protection of personal data.”
GDPR allows for fines of up to 20 million EUR or 4% of a company's total worldwide annual revenue, whichever is higher. Critically, any company that handles EU residents' personal data is subject to GDPR — regardless of where it's based.
Think liability is only an issue in Canada and the EU? Think again. American testing and training organizations are increasingly coming under regulatory scrutiny for their proctoring policies.
Illinois' Biometric Information Privacy Act (BIPA) is a prime example. BIPA places liability on organizations that collect or use biometric data in a way that violates user privacy (this includes failing to secure written consent from participants, and failing to publish a retention schedule stating when the information will be permanently destroyed).
In one BIPA case, BNSF Railway, a freight rail company, was initially ordered to pay $228 million for scanning the fingerprints of more than 45,000 truck drivers without consent. (The case was eventually settled for $75 million.) Critically, the third-party vendor that manufactured the scanning software was not implicated in the charges — once again demonstrating that liability rests with the organization collecting data, rather than the vendors it uses to collect it.
Another case with clear implications for online training and testing providers is a 2022 lawsuit against Cleveland State University. In its judgment, a federal court found that the University's remote testing requirement of a room scan constituted an unreasonable search in violation of students' Fourth Amendment rights.
Cleveland State used two separate proctoring providers to facilitate these scans; both were named in the decision, but neither was fined. This case not only underscores the need for careful selection and management of third-party proctoring services, but also illustrates how easily organizations can find themselves in legal jeopardy over the actions of their software providers.
The examples above aren’t just particularly punitive outliers; they underscore an important trend in online privacy legislation. Increasingly, courts and regulators around the world are requiring organizations to be responsible for the actions of any third-party software, plug-ins or services they use.
To navigate this liability landscape effectively, online training and testing providers must adopt proactive compliance measures, including thorough due diligence, robust compliance monitoring systems, employee education and legal consultation when necessary.