There’s been something bubbling in the world of AI and recruitment lately…on one side, recruiters are staring at dashboards filled with hundreds (sometimes thousands!) of applicants per role. On the other side, candidates are trying to keep up, using AI to move faster, sound sharper, and stay competitive (it’s still a bit of an employer’s market after all).
Both sides are using AI, which we thought would make life easier for all of us, yet somehow, both sides are feeling more friction than before. What I’ve noticed since AI entered the chat is that the volume has exploded, and the signals that helped us before have gotten harder to find.
Today, AI is everywhere in a recruiter’s workflow. It helps scan resumes, match candidates to job descriptions, and filter out non-negotiables like work eligibility or location constraints. It helps them keep up with response timelines that didn’t exist a few years ago (hi, anti-ghosting legislation!).
And truthfully, with the sheer volumes we’re seeing today, the system can’t function without it. But here’s what’s often misunderstood: AI isn’t making hiring decisions. Every serious hiring process still has a human in the loop reviewing candidates, especially for anything that isn’t an obvious “no.” Recruiters use AI to narrow the field, not to choose the winner.
The decision still comes down to pattern recognition built from experience. And that’s where things start to break down. Because what recruiters are seeing now, more than ever, is sameness.
A recruiter opens ten resumes, then twenty, then fifty, and at some point, they all start to blur together…the formatting is nearly identical, the phrasing feels familiar, the accomplishments sound impressive, but vague in the exact same way. It’s not that the candidates aren’t qualified. It’s that nothing stands out.
This is the unintended side effect of AI on the candidate side. People are using the same tools and prompts, and in trying to optimize, they’ve stripped away the very thing that would make them memorable: their individuality.
Recruiters notice this immediately, even if they can’t always articulate it. They’ll say things like, “This looks good, but something feels off,” or “I don’t really know who this person is.”
That’s the signal problem.
The resume is just the first filter. The real clarity comes in conversation. And lately, something new has been showing up…candidates using AI in real time during interviews.
First, I’ll see someone glance repeatedly at another screen. Then their answers stretch longer than they need to, and they say all the right things, but it feels fluffy. There’s no friction, no pauses, no moments of thinking out loud. It’s almost too clean. Then I ask a follow-up, and that’s where it breaks.
When someone is speaking from experience, they can go deeper and give specifics. They can tell you what went wrong, what they changed, and what they’d do differently today. When someone is leaning on AI, the answers stay high-level. They sound correct, but they don’t hold up under pressure.
Recruiters pick up on this quickly. Not because they’re trying to catch people in a “gotcha” moment, but because they’re trained to look for evidence of real work.
And this trend is continuing post-interview as well. These days, I rarely receive follow-up emails (which is a shame and a lost art!), and when I do, they often feel like they’ve been run through AI. They’re polished, but stripped of any real personality or reflection. It’s a missed opportunity, because a simple, genuine note is one of the easiest ways to show enthusiasm and help me understand who you actually are.
I had a recent experience that captures this perfectly.
I was interviewing candidates for a role that required working with ERP and accounting-grade data. Things like general ledger, sub-ledger, AR, AP. Not surface-level familiarity, but real experience navigating messy, complex datasets, reconciling inconsistencies, and understanding how things tie together behind the scenes.
During the interview, the candidate was honest. They said they didn’t have direct experience in that area...and to me that wasn’t a total dealbreaker, since they’d had some adjacent exposure. The rest of the conversation went fairly well, but it stayed pretty high-level, so I was still on the fence. I was considering moving them to the next step and having someone else validate my read.
Then the follow-up email came. It included a highly detailed plan addressing the core problem we’d discussed, including areas they had earlier shared they didn’t have direct experience with. Now, it’s completely fair to follow up and expand on something you didn’t get the chance to cover in the interview. But this level of detail didn’t match the conversation at all, and that disconnect is what raised concern.
The issue wasn’t that the candidate used AI. It’s that the output replaced their actual experience. It introduced doubt instead of strengthening credibility. The candidate went from a maybe to a no, because I felt the credibility was no longer there.
AI isn’t the problem. In fact, the candidates who use it well tend to perform better.
They use it before the interview. They break down job descriptions, map their experience, and think through how they’d answer questions that will likely get asked. They’re also using it to research the company, understand context, and show up prepared.
But when the conversation starts, they put it away. They rely on their own words, examples, and thinking…and it shows. They’re not perfect, they take a moment to pause, they adjust in real-time, and they sometimes say, “I haven’t done that exactly, but here’s something similar.” That’s what builds trust.
There’s a subtle shift underway worth paying attention to. AI has made it easy to sound polished…to the point that polish is no longer memorable. What stands out now is specificity, real examples, thinking out loud, and the honesty to say what you haven’t done.
These are things AI can’t fully replicate, especially in real time. And because recruiters are seeing more AI-generated content than ever, those human signals are becoming more valuable, not less.
If you’re navigating the job market right now, the instinct might be to lean even harder on AI: optimize more, refine every sentence and bullet point until it sounds perfect. But that’s not what will set you apart.
Use AI to get ready and prepare more effectively, but don’t let it replace you. Let your resume sound like you, let your answers reflect what you’ve actually done, and let your follow-ups be simple and authentic (even if they’re not worded perfectly).
Because the truth is, recruiters aren’t looking for the most polished candidate. They’re looking for someone they can believe. And in a sea of sameness, belief comes from what feels real.