AI Update: EEOC Issues New Technical Assistance Under Title VII

The EEOC has just issued new Technical Assistance to help employers comply with anti-discrimination laws when implementing employment selection procedures that use AI, software, or algorithmic processes (what the EEOC refers to as “algorithmic decision-making tools”). While recognizing that such tools may be used in a range of ways in the employment context, the EEOC is clear that the Technical Assistance document applies specifically to the use of such tools for “employment decisions such as hiring, promotion, and firing.” 

Overall, the EEOC’s Technical Assistance underscores that employers using algorithmic decision-making tools in relation to these key employment-related decisions must pay close attention to the requirements of Title VII and the Uniform Guidelines on Employee Selection Procedures (“UGESP”). As the EEOC itself described in releasing the Technical Assistance: “The document explains the application of key established aspects of Title VII of the Civil Rights Act to an employer’s use of automated systems, including those that incorporate artificial intelligence.”

Based on our expertise in assessing and auditing AI-enabled employment selection tools, we have identified the following key takeaways for employers from the new Technical Assistance document:

      • The EEOC is focusing on whether the algorithmic decision-making tool has an “adverse impact” on the basis of race, color, religion, sex, or national origin.

      • Meeting the so-called “four-fifths rule” is not sufficient to demonstrate that an algorithmic decision-making tool does not have an adverse impact.
            • The “four-fifths rule,” which is set forth in UGESP, states that the selection rate for one group will be considered “substantially different” from the selection rate of another group if the ratio of the two selection rates is less than four-fifths (80%). Some developers of AI- and algorithm-enabled employment selection tools have started marketing their tools as “valid” because they meet this standard.

            • The EEOC Technical Assistance makes it clear that employers using algorithmic decision-making tools cannot rely solely on the four-fifths rule to determine whether the tool complies with Title VII requirements, but rather need to conduct a deeper statistical assessment (a brief illustrative sketch follows this list):

              “The EEOC might not consider compliance with the rule sufficient to show that a particular selection procedure is lawful under Title VII when the procedure is challenged in a charge of discrimination…. [E]mployers that are deciding whether to rely on a vendor to develop or administer an algorithmic decision-making tool may want to ask the vendor specifically whether it relied on the four-fifths rule of thumb when determining whether use of the tool might have an adverse impact on the basis of a characteristic protected by Title VII, or whether it relied on a standard such as statistical significance that is often used by courts.”

      • Proactive steps can help an employer avoid problems down the line:

            “The EEOC encourages employers to conduct self-analyses on an ongoing basis to determine whether their employment practices have a disproportionately large negative effect on a basis prohibited under Title VII or treat protected groups differently. Generally, employers can proactively change the practice going forward…”
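
To make the contrast concrete, the short sketch below works through a hypothetical example. The applicant counts, group labels, and the choice of a two-proportion z-test are illustrative assumptions only, not figures or methods taken from the EEOC's Technical Assistance; the point is simply that a tool can satisfy the four-fifths rule of thumb while the underlying difference in selection rates is still statistically significant.

```python
from math import sqrt, erf

# Hypothetical applicant pool -- illustrative numbers only, not from the EEOC document.
group_a_applicants, group_a_selected = 1_000, 300   # selection rate 30.0%
group_b_applicants, group_b_selected = 1_000, 245   # selection rate 24.5%

rate_a = group_a_selected / group_a_applicants
rate_b = group_b_selected / group_b_applicants

# Four-fifths (80%) rule of thumb from UGESP: compare the lower selection rate
# to the higher selection rate.
impact_ratio = min(rate_a, rate_b) / max(rate_a, rate_b)
print(f"Impact ratio: {impact_ratio:.3f} (rule-of-thumb threshold: 0.800)")

# Two-proportion z-test: is the difference in selection rates statistically
# significant even though the impact ratio clears the 80% threshold?
pooled = (group_a_selected + group_b_selected) / (group_a_applicants + group_b_applicants)
std_err = sqrt(pooled * (1 - pooled) * (1 / group_a_applicants + 1 / group_b_applicants))
z_stat = (rate_a - rate_b) / std_err
normal_cdf = 0.5 * (1 + erf(abs(z_stat) / sqrt(2)))
p_value = 2 * (1 - normal_cdf)          # two-sided p-value
print(f"z = {z_stat:.2f}, two-sided p-value = {p_value:.4f}")
```

In this hypothetical, the impact ratio (roughly 0.82) clears the 80% threshold, yet the difference in selection rates is statistically significant at conventional levels (p < 0.01), which illustrates why the EEOC cautions against relying on the four-fifths rule of thumb alone.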

To discuss how Resolution Economics can help your company conduct an audit to assess whether any of the algorithmic decision-making tools you use for hiring, promotion, or firing may have an adverse impact, contact a member of the Resolution Economics AI Bias Audit Team.

Written by:

Margo Pave
Director
Washington D.C.
mpave@resecon.com
202.524.1658
