AI Hiring Meets 1970s Law

Expert analysis from Fisher Phillips
March 4, 2026

A new lawsuit claims AI screening tools may be creating “consumer reports” without FCRA compliance. Here is what employers need to know.

Context

Artificial intelligence has transformed recruiting. Now it may be colliding with a 55-year-old federal statute.

On January 20, two job applicants filed a class action in California state court against Eightfold AI, alleging the company’s hiring platform created “hidden credit reports” on candidates without complying with the federal Fair Credit Reporting Act.

Most AI hiring lawsuits so far have centered on discrimination. This one takes a different path: consumer protection law.

Why It Matters

If a court agrees that certain AI screening outputs qualify as “consumer reports” under the FCRA, the implications extend far beyond one vendor.

Employers could face:

  • New disclosure and authorization obligations
  • Pre-adverse and adverse action notice requirements
  • Contractual certification duties
  • Class action exposure, even absent bias claims

This theory does not hinge on whether an algorithm is discriminatory. It hinges on process.

For business leaders, that means AI governance is no longer just about fairness and bias. It is about compliance architecture.

Core Idea

AI is artificial intelligence, not regulatory immunity.

If an AI vendor assembles third party data about candidates, generates predictive assessments, and delivers them to employers for hiring decisions, a court could decide that activity looks a lot like a consumer reporting agency under the FCRA.

And if that is true, decades old compliance rules apply.

What the Lawsuit Alleges

The plaintiffs claim that when they applied to employers using Eightfold’s platform, the system:

  • Pulled information from third party sources such as LinkedIn and other public databases
  • Analyzed massive volumes of global workforce data
  • Generated inferences about characteristics, abilities, and predicted success
  • Ranked candidates on a 0 to 5 scale
  • Filtered candidates before any human review

The legal theory is that these assessments constitute “consumer reports” under the federal Fair Credit Reporting Act and California’s ICRAA, triggering strict procedural requirements.

Eightfold has publicly stated that it does not scrape social media or similar sources. The case is in its early stages, and the facts will be tested in court.

The Legal Framework: FCRA in an AI World

Enacted in 1970, the Fair Credit Reporting Act regulates consumer reporting agencies that provide information used for employment decisions.

The statute defines a “consumer report” broadly. It covers communications about a person’s character, general reputation, personal characteristics, or mode of living used for employment purposes.

If applicable, the FCRA requires:

  • Standalone written disclosure that a consumer report may be obtained
  • Written authorization from the applicant
  • Pre-adverse action notice with a copy of the report and a summary of rights
  • Adverse action notice after a final decision
  • Certifications between the employer and the reporting agency

Many employers rigorously follow these steps for traditional background checks. The open question is whether certain AI screening tools fall into the same category.

Five Practical Takeaways for Employers

1. Understand What Your AI Vendor Actually Does

Do not assume your platform only parses resumes.

Ask direct questions:

  • Does it pull data beyond the application?
  • Does it generate predictive scores or rankings?
  • Does it compare candidates against large external datasets?
  • Does it create inferences about traits or future performance?

You cannot assess regulatory risk if you do not understand the data flow.

2. Audit Vendor Contracts and Certifications

If a vendor could be considered a consumer reporting agency, it should be obtaining FCRA certifications from you.

Review your agreements and onboarding processes. Confirm whether:

  • Proper disclosures are being delivered
  • Written authorizations are collected
  • Pre-adverse action procedures are operationalized

Do not rely solely on a vendor’s position that the FCRA does not apply. Conduct your own risk assessment.

3. Break Down Internal Silos

Background check compliance often sits with HR or legal.

AI recruiting tools may sit with talent acquisition or IT.

Map your full hiring tech stack. Ensure FCRA compliance protocols, if applicable, extend to all third party screening tools, not just traditional background vendors.

4. Prepare for Scrutiny

Regulators and plaintiffs’ attorneys are actively testing AI employment systems.

Document your diligence. Maintain records of vendor vetting. Periodically review AI hiring practices with counsel before you find yourself responding to a demand letter or subpoena.

Proactive governance is cheaper than reactive defense.

5. Consider Risk Beyond the Statute

Even if you conclude the FCRA does not apply, reputational risk remains.

Opaque, black-box hiring tools can erode candidate trust. In a competitive talent market, perceived fairness and transparency affect employer brand and offer acceptance rates.

Legal defensibility is the floor. Trust is the ceiling.

Closing Thought

This case may take years to resolve. But the signal is already clear.

AI in hiring is no longer just a technology decision. It is a compliance decision, a governance decision, and a brand decision.

The companies that win will not be the ones with the flashiest algorithms. They will be the ones that treat AI as infrastructure, subject to the same rigor as finance, privacy, and risk.

In 2026, the question is not whether you use AI in recruiting.

It is whether you are using it with eyes wide open.

About

Fisher Phillips

Fisher Phillips, founded in 1943, is a leading law firm dedicated to representing employers in labor and employment matters. With nearly 600 attorneys across 38 U.S. offices and 3 in Mexico, it combines deep expertise with innovative solutions to help businesses navigate workplace challenges.
