Here are two statistics that tell the whole story.
87% of companies now use AI somewhere in their hiring process. Between 40% and 80% of candidates use AI to write their CVs and cover letters.
Both sides have adopted the same technology. Both sides think it gives them an edge. And both sides are making the process worse.
Recruiters are using AI to screen out candidates faster. Candidates are using AI to generate applications faster. The result? More volume, more noise, less signal, and a hiring process that feels increasingly impersonal for everyone involved.
Harvard Business Review put it plainly at the start of this year: "AI has made hiring worse." Not because AI is bad — but because both sides are using it to optimise for speed and volume instead of quality and connection.
We think there's a better way. Not less AI — but different AI. AI that amplifies what humans do well, rather than automating what they should be doing themselves.
The recruiter's AI problem
Screening that screens out
The most common use of AI in recruitment is candidate screening: parsing CVs, matching keywords, scoring applicants against job descriptions, and filtering the pipeline down to a shortlist.
On paper, this makes sense. A recruiter managing 20 open roles can't read every CV that comes in. AI filtering reduces a pile of 300 applications to 30 in seconds.
In practice, it's creating two problems.
First, keyword-matching AI is notoriously blunt. It rewards candidates who mirror the job description word-for-word and penalises those who describe the same experience differently. A senior engineer who writes "led platform migration" might be screened out because the job spec says "cloud infrastructure transformation." Same skill, different language. Rejected by a parser that doesn't understand context.
Second — and this is the one nobody talks about — when recruiters rely on AI screening, they stop developing the instinct that makes good recruiters great. The ability to read between the lines of a CV. To spot a career trajectory that suggests someone is about to make a leap. To recognise that a candidate from an adjacent industry might be the best hire precisely because they bring a different perspective. AI doesn't do this. And when recruiters outsource their judgement to AI screening, they lose the skill over time.
Automation that dehumanises
The other major AI use case in recruitment is communication automation: templated outreach, AI-drafted rejection emails, chatbot-driven scheduling, and automated status updates.
Again, the intent is sound. Recruiters are drowning in admin, and automating repetitive communications frees time for higher-value work.
But the execution is often terrible. Candidates can tell when they're receiving an AI-generated message. The language is slightly wrong — too polished, too generic, too obviously not written by a person who knows their name. When every touchpoint in a recruitment process feels automated, candidates disengage. They stop responding. They ghost — not because they're rude, but because they don't believe a human is on the other end.
The irony is stark: recruiters adopted AI to save time on communication, and it's causing the very ghosting problem that costs them placements.
Where AI actually helps recruiters
None of this means recruiters should avoid AI. It means they should use it differently.
AI is exceptional at operational discipline — the unglamorous but critical work that keeps a pipeline healthy. Task prioritisation: which candidates need a follow-up today, which clients are waiting for feedback, which placements are at risk. Pipeline monitoring: flagging when a candidate has gone quiet, when a contract has stalled, when a competitor might be circling.
This is AI as safety net, not decision maker. It doesn't tell you which candidate to hire — it makes sure you don't lose a placement because you forgot to send a follow-up email on Tuesday.
It's the difference between AI that replaces your judgement and AI that protects your revenue. The first makes you dispensable. The second makes you better.
The candidate's AI problem
The application arms race
On the candidate side, the AI adoption curve has been even steeper. Tools like ChatGPT, Teal, and Huntr have made it trivially easy to generate tailored CVs, cover letters, and even practice interview answers for any role in seconds.
The result is an explosion of application volume. If it takes 30 seconds to tailor a CV instead of 30 minutes, why not apply to 50 roles instead of five? The logic feels irresistible.
But here's what the data shows: 80% of hiring managers say they can spot an AI-written CV, and many report that AI-generated applications are actively hurting candidates' chances. The submissions are technically competent but interchangeable. They lack the specificity, the personality, and the evidence of genuine thought that makes a hiring manager pause and say "this person is different."
When everyone uses the same tools to produce the same polished output, nobody stands out. The AI advantage cancels itself out, and candidates are back to square one — except now they've applied to 50 roles they can barely remember instead of five they genuinely cared about.
The preparation gap
Where AI genuinely helps candidates is in preparation — not generation.
Using AI to research a company, understand their market position, identify likely interview questions, and stress-test your answers is enormously valuable. It's like having a well-informed friend who can help you think through your approach.
Using AI to generate your CV from scratch, write your cover letter, and rehearse scripted answers is like sending your well-informed friend to the interview instead of going yourself. It might get you through the door, but it won't survive the conversation.
The candidates who are winning in 2026 aren't the ones who generate the most applications. They're the ones who use AI to prepare more deeply for fewer, more targeted applications — and then bring their real selves to the conversation.
The trust deficit
There's a deeper problem emerging. Only 26% of candidates trust AI to evaluate them fairly. When candidates know that an algorithm might be screening their CV, they start writing for the machine instead of the human. They stuff keywords. They mirror job descriptions. They optimise for parsability rather than clarity.
And on the other side, when recruiters know that candidates are using AI to generate applications, they start discounting everything they read. "Is this really their experience, or did ChatGPT write it?" Trust erodes in both directions, and the recruitment process becomes a game of mutual scepticism rather than genuine evaluation.
This is the AI arms race: both sides using technology to gain an advantage, and both sides ending up worse off.
What human-first AI actually looks like
We think the entire framing is wrong. The question isn't "how can AI make hiring faster?" It's "how can AI make the humans in the process better at being human?"
For recruiters, that means AI that handles operational discipline — the follow-ups, the pipeline monitoring, the task prioritisation — so they can spend their time on what actually closes placements: building relationships, understanding candidates, and providing the consultative value that no algorithm can replicate.
For candidates, that means AI that provides structure and insight — tracking applications, surfacing preparation materials, flagging deadlines — so they can focus on what actually lands offers: showing up as a real person with genuine expertise and a clear story.
This is the principle behind everything we build at M2TalentsTech.
TalentSyncHub uses AI to automate the operational work that causes recruiters to drop the ball — smart task management, automated follow-ups, placement protection alerts — not to replace the judgement and relationships that make great recruiters irreplaceable.
ApplicantGrid uses AI to give candidates structure and preparation tools — CV review, interview prep, application tracking in five languages — not to generate generic applications that hiring managers immediately discard.
AI that amplifies humans. Not AI that replaces them.
The principle, not just the pitch
This isn't just our product positioning. It's a genuine conviction about where recruitment technology needs to go.
The companies racing to "automate recruitment" are solving the wrong problem. The problem was never that humans are too slow. It's that the tools humans use are fragmented, unintelligent, and built without understanding what recruiters and candidates actually need.
A recruiter doesn't need AI to read CVs for them. They need AI to make sure they never forget to follow up, never lose track of a candidate, and never miss a placement because something fell through the cracks.
A candidate doesn't need AI to write their CV. They need AI to help them stay organised across 30 applications in three languages, prepare deeply for the interviews that matter, and maintain the mental clarity that a chaotic job search destroys.
The companies that get this right — that build AI in service of human capability rather than as a replacement for it — will define the next era of recruitment technology.
We intend to be one of them.
For Recruiters
Are you a recruiter?
TalentSyncHub protects your placements and reclaims your time with AI-powered operations.
Discover TalentSyncHub
For Job Seekers
Looking for job search tools?
ApplicantGrid is your personal job search command centre. One inbox. Every language. Every application.
Discover ApplicantGrid