Most of the recently enacted state-level consumer data privacy statutes include a focus on automated decisions that “produce legal or similarly significant effects” concerning an individual. Most of these statutes simply allow consumers to opt out of such automated decision-making processes, but some go further.
For example, the Minnesota Consumer Data Privacy Act (“MCDPA”) contains provisions that address automated decision-making based on the profiling of a consumer’s personal data. The MCDPA grants consumers a standard set of rights, such as the right to know what personal data is collected and with whom it is shared, the right to obtain a copy of the data, and the right to opt out of certain data processing. But it also grants additional rights related to automated decision-making. As a real-world example, consider the use of automated, computer-based profiling of personal data to decide whether to issue credit. Under the MCDPA, consumers can opt out of having their personal data profiled in this manner, but there are other options.
In particular, controllers and processors must establish procedures to inform consumers of:
- The existence of the automated decision-making process, how it works, and what factors are considered
- Why the particular decision was made and/or what factors led to the decision
- What the consumer could change to obtain a different result, and
- The right to obtain a reevaluation based on changed or updated data
For consumer advocates, the animating principles here are potential bias in the use of automated decision-making and the lack of any human oversight.
These are the same animating principles for those concerned about the use of automated employment decision tools (“AEDTs”). AEDTs are computer programs and AI modules used to make hiring and promotion decisions. Examples include programs that evaluate resumes, AI modules that rate video job interviews based on responses, body language, and behaviors, and software that screens and filters applications based on keywords. For those who oppose these new technologies, bias is a large concern, along with the absence of human involvement.
Interestingly enough, opponents of the use of AEDTs may find some value in reviewing how the data privacy statutes are being drafted. Most data privacy statutes do not apply to employment decisions, so those statutes will not be directly useful in limiting any bias or discrimination generated by the use of AEDTs. However, at least three states have enacted statutes specific to the use of AEDTs: New York, Colorado, and Illinois. In each case, the statutes are aimed at notifying job applicants that AEDTs are being used and, except for Illinois, requiring some sort of bias impact study. Consumer advocates will probably find such statutory regimes underwhelming. Something more in line with the rights granted by the MCDPA, as discussed above, might be more appropriate.
Contact The Consumer Data Privacy and Compliance Attorneys At Revision Legal
For more information, contact the experienced Consumer Data Privacy and Compliance Lawyers at Revision Legal. You can contact us through the form on this page or call (855) 473-8474.