On May 18, 2023, the U.S. Equal Employment Opportunity Commission (EEOC) issued a new technical assistance document titled “Assessing Adverse Impact in Software, Algorithms, and Artificial Intelligence Used in Employment Selection Procedures Under Title VII of the Civil Rights Act of 1964.”
This technical assistance is part of the EEOC's 2021 agencywide initiative to ensure that the use of software such as artificial intelligence (AI), machine learning and other emerging technologies in hiring and other employment decisions complies with the federal civil rights laws enforced by the agency. The new guidance builds on the Uniform Guidelines on Employee Selection Procedures (UGESP) adopted by the EEOC in 1978, as well as guidance issued last year addressing the use of artificial intelligence in hiring within the context of the Americans with Disabilities Act.
The technical assistance addresses the potential discriminatory impact of using algorithmic decision-making tools, defined as the computer analysis of data that an employer relies on, in whole or in part, when making employment decisions. The guidance highlights the following examples of such software available to employers:
How can employers tell if their algorithmic decision-making tools are in danger of violating federal employment discrimination laws? According to the EEOC, a selection tool that produces an adverse selection rate for individuals with one or more protected characteristics may be indicative of discrimination. The technical guidance reminds employers that although AI systems have the appearance of objectivity, they are developed by humans and therefore are subject to the societal and personal biases that can create disparate outcomes in hiring.
The EEOC provides direction on how to evaluate the extent to which bias may permeate an employer's automated process. The technical assistance directly states that the "four-fifths rule" can be applied to AI tools to help identify disparate impact. This test, described in detail in the UGESP, treats the selection rate for one group as "substantially" different from the selection rate for another group if the ratio of the two rates is less than four-fifths (or 80%). For example, suppose an employer's hiring tool selects Black applicants at a rate of 30% and White applicants at a rate of 60%. Because the ratio of those two rates (30/60, or 50%) is lower than four-fifths, the selection rate for Black applicants is substantially different from the selection rate for White applicants and could evidence discrimination against Black applicants.
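The arithmetic behind the four-fifths rule can be sketched in a few lines. The function below is an illustrative helper, not part of the EEOC guidance; the 30%/60% figures reproduce the example above.

```python
def four_fifths_check(rate_a: float, rate_b: float) -> tuple[float, bool]:
    """Compare two selection rates under the UGESP four-fifths rule.

    Returns the ratio of the lower rate to the higher rate, and a flag
    that is True when the ratio falls below 0.8 (i.e., the rates are
    "substantially" different under the rule of thumb).
    """
    lower, higher = min(rate_a, rate_b), max(rate_a, rate_b)
    ratio = lower / higher
    return ratio, ratio < 0.8

# Example from the guidance: 30% selection rate vs. 60% selection rate.
ratio, flagged = four_fifths_check(0.30, 0.60)
print(f"ratio = {ratio:.2f}, flagged = {flagged}")  # ratio = 0.50, flagged = True
```

A ratio of 0.50 is well below the 0.80 threshold, so this tool's outcomes would warrant further scrutiny.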
The EEOC reiterates that the four-fifths rule is a good rule of thumb, but quickly dashes employers' hopes of calculating their way into compliance with a simple formula. In some situations, the four-fifths rule will not be a reasonable substitute for a test of statistical significance — for example, where a large number of selections is made and the ratio alone may not reflect the tool's actual impact on different protected groups. As with traditional selection processes, employers should subject AI tools to holistic review; compliance with any one test cannot disprove discriminatory outcomes. The EEOC recommends that employers conduct self-analyses and audits on an ongoing basis. The EEOC makes clear, however, that employers need not discard their existing AI tools; rather, they should make adjustments to remedy discriminatory selection rates. Because algorithms can be adjusted, failing to do so may expose an employer to liability.
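To illustrate why the four-fifths ratio alone can mislead when selection volumes are large, here is a minimal sketch of a two-proportion z-test, one common statistical-significance check. The applicant counts are hypothetical assumptions chosen for illustration, not figures from the guidance.

```python
import math

def two_proportion_z(selected_a: int, total_a: int,
                     selected_b: int, total_b: int) -> float:
    """Two-proportion z-statistic for comparing selection rates.

    A large |z| (roughly above 1.96 for a 5% significance level) suggests
    the difference in rates is unlikely to be due to chance alone.
    """
    p_a = selected_a / total_a
    p_b = selected_b / total_b
    pooled = (selected_a + selected_b) / (total_a + total_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / total_a + 1 / total_b))
    return (p_a - p_b) / se

# Hypothetical large applicant pools: selection rates of 78% and 75%.
# The ratio (0.75 / 0.78 ≈ 0.96) passes the four-fifths rule, yet with
# thousands of applicants the gap can still be statistically significant.
z = two_proportion_z(3900, 5000, 3750, 5000)
print(f"z = {z:.2f}")
```

Here |z| exceeds 1.96, so a statistician could flag the disparity as significant even though the four-fifths rule is satisfied — which is why the EEOC cautions against relying on the rule of thumb alone.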
Many employers may hope to sidestep these concerns by outsourcing AI hiring tools to third-party vendors. The technical assistance, however, states that employers may still be liable for their agents' violations of federal employment discrimination laws. Employers therefore should take steps to determine whether vendors or developers are building and auditing their AI tools for discriminatory impact. The EEOC recommends specifically asking vendors whether they relied on the four-fifths rule, or on other court-approved standards such as statistical significance, when auditing their product.
Tips and Takeaways
The technical assistance urges employers to take a hands-on approach to auditing AI usage in their hiring processes. The following tips may aid employers in that task:
For questions about how artificial intelligence presents both risks and opportunities for employers, contact the authors of this article.