What the Sirius XM case means for employers—and 10 steps to stay out of court
Context
AI is speeding up hiring, but it’s also speeding up lawsuits. On August 4, Sirius XM was hit with a federal discrimination suit alleging its AI-powered applicant tracking system downgraded a candidate based on proxies for race. The case, Harper v. Sirius XM Radio, is the latest in a wave of legal challenges targeting algorithmic bias in employment decisions.
Why It Matters
For business leaders, the stakes are clear: a single biased algorithm can expose an organization to class-action litigation, reputational damage, and multimillion-dollar liability. Employers don’t get a free pass just because the bias comes from a vendor’s software. Courts and regulators are increasingly holding companies accountable for how AI is deployed in hiring.
Core Idea
AI is not a shield from discrimination law. If your hiring tools replicate bias—whether intentional or not—you could be the next test case.
What You Need to Know
10 Action Steps for Employers Using AI in Hiring
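One of the most common first audit steps is checking hiring outcomes against the EEOC's "four-fifths rule" of thumb: if any group's selection rate falls below 80% of the highest group's rate, the result may indicate adverse impact and warrants closer review. A minimal sketch of that check follows; the group names and numbers are hypothetical.

```python
# Minimal sketch of an adverse-impact ("four-fifths rule") check,
# a common first step when auditing AI-assisted hiring outcomes.
# Group labels and counts below are hypothetical examples.

def selection_rates(outcomes):
    """outcomes: {group: (selected, total_applicants)} -> {group: rate}"""
    return {g: sel / total for g, (sel, total) in outcomes.items()}

def four_fifths_check(outcomes, threshold=0.8):
    """Compare each group's selection rate to the highest group's rate.

    Returns {group: (impact_ratio, flagged)}, where flagged is True when
    the ratio falls below the EEOC four-fifths (0.8) rule of thumb.
    """
    rates = selection_rates(outcomes)
    top = max(rates.values())
    return {g: (r / top, r / top < threshold) for g, r in rates.items()}

# Hypothetical screening results from an applicant tracking system:
results = {"group_a": (45, 100), "group_b": (27, 100)}
print(four_fifths_check(results))
# group_b's impact ratio is 0.6, below 0.8, so it would be flagged for review
```

A flag here is not proof of discrimination, but it is exactly the kind of documented, repeatable check that demonstrates good-faith governance if an AI hiring tool is later challenged.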
Closing Thought
AI is not a legal blind spot. It’s a magnifying glass. Employers who fail to build fairness and accountability into their systems are inviting scrutiny, lawsuits, and reputational fallout. The playbook is clear—govern, audit, and document now, before a plaintiff does it for you.
Fisher Phillips, founded in 1943, is a leading law firm dedicated to representing employers in labor and employment matters. With nearly 600 attorneys across 38 offices in the U.S. and 3 in Mexico, it combines deep expertise with innovative solutions to help businesses navigate workplace challenges.
