By Jodie Zerega

The Perils of AI Resume Screening: Law Firms Are Missing Out on Great Candidates

Updated: Feb 21, 2023

Resume screening can be tedious, especially during on-campus interview season. I spent countless hours on initial reviews while working as the Director of Recruiting at a large national law firm. We posted minimum hiring criteria, but we allowed anyone to drop their resume for consideration. Together with the Hiring Committee, we reviewed each and every one, even though we were well aware that most of the applicants wouldn’t be a fit. So I can understand why law firms would be keen to expedite the screening process.


Unfortunately, the shortcut many firms have landed on—using artificial intelligence (AI) to screen resumes, with humans reviewing only the resumes that the AI tool approves—is deeply problematic. These tools work well enough for cookie-cutter qualified candidates, but they frequently exclude lawyers with tremendous potential who have taken a less traditional path. The efficiency gain comes at a steep cost: missing out on valuable talent—in particular, diverse talent.

Rigid screening rules are no substitute for human judgment

Automated screening tools take a fundamentally mechanical approach. It’s far easier for a computer to weed out resumes based on class rank than to judge a candidate’s intangible leadership abilities or personal factors that may have affected academic performance. Consider, for example, a candidate who worked full time during law school and finished in the top 11% of the class. A human reviewer will immediately recognize the challenge that this candidate faced and will assess the academic record in light of the candidate’s unusual circumstances. By contrast, an automated screening tool is likely to perform a more simplistic analysis: the firm is looking for people who finished in the top 10%, therefore this candidate is excluded.
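For readers curious what that mechanical rule looks like in practice, here is a minimal sketch of a hard-cutoff screen. The field names and threshold are invented for illustration; real screening products are more elaborate, but the core logic of a rank cutoff works like this:

```python
# Minimal sketch of a rigid, rule-based resume screen.
# Field names and the 10% threshold are hypothetical.
# A hard class-rank cutoff has no way to weigh context,
# such as working full time during law school.

def rigid_screen(candidate, max_rank_percent=10):
    """Pass only candidates at or above the posted rank threshold."""
    return candidate["class_rank_percent"] <= max_rank_percent

candidate = {
    "name": "A. Candidate",
    "class_rank_percent": 11,   # top 11% of the class
    "worked_full_time": True,   # context the rule never even reads
}

print(rigid_screen(candidate))  # False: the top-11% candidate is excluded
```

Note that the `worked_full_time` field is never consulted: the rule has no mechanism for the kind of context a human reviewer weighs instinctively.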


You might question how frequently cases like that actually arise. In my experience as a recruiter, the answer is: all the time. I worked with a diverse female candidate, now thriving as a law firm associate, who was having no luck getting interviews because she had attended a Tier 3 law school. She graduated magna cum laude from undergrad and summa cum laude from law school, then went on to earn an LLM with strong marks. When an experienced recruiter comes across a record like that, we know there must be more to the story. Sure enough, this candidate had been accepted at many Tier 1 law schools, but she chose the Tier 3 school because her mom had been diagnosed with cancer, and that school was close to her mom’s home.


Fortunately, I was able to draw on my relationships with firms to get this candidate the comprehensive evaluation she deserved. Firms quickly realized her exceptional potential, and the story ended happily. Firms that rely on AI technology to screen lateral candidate resumes will miss out on a candidate like this, every single time. Had she not had my assistance at the beginning of her job search, she would not have the amazing job she has now.


I’m hardly the only person to come to this conclusion. About six months ago, an AmLaw 50 firm asked Zerega Consulting for help with a lateral associate search. We found them an exceptional candidate who was the perfect culture fit, with the personality to be both a dynamic attorney and a great colleague. He also brought very good academic credentials (top 15% of the class, Tier 1 law school). The firm’s Director of Recruiting was delighted when we presented this candidate, and the partners hired him on the spot. But the director had to acknowledge an uncomfortable fact: the firm had missed out on this candidate when it conducted interviews at his law school, because his grades didn’t meet the threshold imposed by its automated screening process. She confided that this was exactly why the firm no longer relies on AI to screen resumes for on-campus interview slots.


Keyword searches will not identify the full pool of qualified lateral candidates

AI recruiting tools’ poor performance isn’t limited to their failure to highlight promising non-traditional candidates. They also struggle to classify lawyers’ prior work experience appropriately. Imagine a firm seeks to identify lateral candidates with products liability experience. Some lawyers know to include in their LinkedIn profiles and resumes multiple keywords corresponding to their experience, but others haven’t optimized their profiles in this way. I can guarantee you there will be highly qualified candidates who describe their expertise as civil litigation instead of products liability. A savvy recruiter will be aware of that likelihood and will ask the right questions to figure out the precise nature of the candidate’s experience. The AI tool most likely will not.
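The keyword problem can be seen in an equally small sketch. The resume snippets below are invented, and real tools use fancier matching than an exact-phrase search, but the failure mode is the same: only resumes that use the search term itself survive the screen.

```python
# Naive keyword screen over hypothetical resume snippets.
# An exact-phrase match finds the optimized profile but misses the
# equally qualified lawyer who calls the same work "civil litigation".

resumes = {
    "optimized": "Defended manufacturers in products liability actions.",
    "unoptimized": "Ten years of civil litigation defending manufacturers.",
}

def keyword_screen(text, keyword="products liability"):
    """Pass only resumes containing the exact search phrase."""
    return keyword in text.lower()

matches = [name for name, text in resumes.items() if keyword_screen(text)]
print(matches)  # ['optimized'] -- the civil-litigation resume is dropped
```

A recruiter reading the second snippet would immediately ask what kind of civil litigation it was; the screen never gets that far.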



Firms willing to invest the time will gain a competitive advantage

To be clear, I am not categorically averse to the use of AI in the recruiting process. Several of my client law firms use AI products at a later stage (typically after an initial interview) to help predict whether a candidate will succeed in the role. The candidate is asked questions about how they would handle hypothetical client situations, technical questions related to the subject matter of the role, and personality questions to predict culture fit and likely length of tenure. The basic idea is to match personality traits and other performance indicators to the traits displayed by top-performing current associates. My clients find it valuable, and I have no objection to this use case.


But initial screening is a different matter. Even if AI tools get it right in the majority of cases, that doesn’t excuse allowing exceptional candidates to slip through the cracks before the firm has an opportunity to assess them fully. In a tight market for talent, with so many firms taking a similar AI-based screening approach, there is a material competitive advantage to be gained by firms willing to invest more time at the screening stage.


As with any technology, AI can add value when used wisely. Initial screening is best left to humans.
