
By Samantha Egge, Legal Intern, and Rachel H. Khedouri, Esq.

In May, both the Department of Justice (“DOJ”) and the Equal Employment Opportunity Commission (“EEOC”) released new guidance concerning employers’ use of artificial intelligence (“AI”) – that is, the use of computer software or applications to complete tasks commonly done by humans. Both agencies cautioned employers that use AI programs or similar software algorithms about their obligation not to use such technologies in ways that may be discriminatory, including against individuals with disabilities in violation of the Americans with Disabilities Act (“ADA”). Although the DOJ enforces disability discrimination laws with respect to state and local government employers, and the EEOC enforces those laws as to private and federal government employers, the guidance from both agencies is instructive for all employers that use AI in hiring and other employment decisions. The full text of the guidance is available HERE (DOJ) and HERE (EEOC).

Employers who use AI in their employment processes should consider taking the following steps in accordance with the new guidance:

STEP ONE: Identify any AI or similar software processes being used in employment decision making.

While AI is most commonly used in recruitment and hiring processes, using such technology to aid decision making in any aspect of employment is subject to ADA regulation. For example, some employers may use AI programs or algorithmic measures to evaluate performance, determine pay or promotions, and establish other terms and conditions of employment. When identifying AI processes, employers should also consider AI used by external vendors authorized to act on the company’s behalf – the company may be liable for ADA violations regardless of whether it creates or houses these programs internally. The guidance warns employers not to rely on vendor claims that their products are “bias free,” as such advertisements likely refer to racial or gender bias rather than discrimination on the basis of disability.

STEP TWO: Evaluate whether current use of AI and similar software programs may violate the ADA.

The guidance highlights potential AI uses that may violate the ADA, such as lack of accessibility for users with disabilities, limited accommodation options, and the use of tools that either “screen out” or identify individuals with disabilities rather than evaluating job skills. Employers should consider whether AI user interfaces and materials are accessible to individuals with various types of disabilities and whether any processes would result in “disability-related inquiries” or seek information that qualifies as a “medical examination” prior to a conditional offer of employment, in violation of the ADA. For employers who utilize vendors, the EEOC’s guidance provides helpful questions for evaluating vendor compliance with ADA regulations HERE.

Additional examples of potentially non-compliant uses of AI from the guidance include:

  • Resume scanners that prioritize applications using certain keywords;
  • Computer or typing tests that rate individuals based on keystroke counts;
  • Chatbots that screen out job candidates based on pre-defined requirements;
  • Testing software that rates job or cultural “fit” by comparing applicants to current successful employees;
  • Facial and vocal recognition software that scores an individual’s expressions and responses during video interviews; and
  • Automated rejections based on screening questions that may identify a disability.
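
To illustrate the “screen out” concern mechanically, the sketch below shows a deliberately simplified, hypothetical keyword-based resume filter (not an actual tool or example from the guidance). Rigid automated rules like this measure wording rather than ability, which is how a qualified candidate – including one whose disability affects how a skill appears on a resume – can be rejected without any human review:

```python
# Hypothetical sketch: a simplified keyword-based resume screener.
# The keyword list and resumes are illustrative assumptions, not taken
# from the DOJ or EEOC guidance.

REQUIRED_KEYWORDS = {"project management", "sql"}  # illustrative only

def screens_out(resume_text: str) -> bool:
    """Return True if the resume lacks any required keyword."""
    text = resume_text.lower()
    return any(kw not in text for kw in REQUIRED_KEYWORDS)

# Two candidates describing comparable skills in different words:
resume_a = "Led project management for data teams; expert in SQL."
resume_b = "Led data teams; expert in relational database querying."

print(screens_out(resume_a))  # False -- passes the filter
print(screens_out(resume_b))  # True  -- screened out despite equivalent skill
```

The point of the sketch is that the tool evaluates keyword presence, not job skills – precisely the kind of automated rejection the guidance flags as potentially non-compliant when it disproportionately excludes individuals with disabilities.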

STEP THREE: Rectify any potential violations and adopt preventive measures.

After identifying technologies with potential to discriminate against individuals with disabilities, employers must take steps to make any necessary adjustments to comply with the ADA, including offering reasonable accommodations unless doing so would cause an undue hardship.

The EEOC’s guidance offers some “promising practices” for employers, including:

  • Proactively posting reasonable accommodation notices and request procedures on job listings;
  • Training staff to recognize requests for reasonable accommodation;
  • Creating efficient and timely processes for evaluating reasonable accommodation requests;
  • Ensuring that AI tools only measure abilities that are truly necessary for the job;
  • Using tools that have been designed to be accessible to individuals with a wide variety of disabilities; and
  • Working with any vendors that use AI programs to ensure that tools are compliant and reasonable accommodation requests are managed in accordance with ADA obligations.

Employers who use AI also should ensure such usage complies with state and local law, including New York City’s ban on using such tools to screen candidates for hire or promotion unless a bias audit has been conducted (scheduled to go into effect January 1, 2023).

If you have any questions relating to this new guidance or would like assistance in reviewing your company’s policies and practices relating to AI, please reach out to the NFC Attorney with whom you typically work or call us at 973.665.9100.

