Artificial Intelligence (AI) is no longer a futuristic concept in recruitment—it’s here, quietly reshaping how organisations attract, assess and hire talent across education, public and private sectors. From AI-powered CV screening and chatbots to predictive analytics and bias reduction tools, the technology promises efficiency and consistency. Yet, these advances also bring challenges—particularly around empathy, fairness and the human connection that can underpin good recruitment practice.
In my role as a careers education professional in higher education, I see this shift from both sides. On one hand, I support students preparing to navigate AI-driven recruitment systems as they transition from education into industry. On the other, I work with employers adapting their hiring strategies to balance innovation with inclusivity. It’s a front-row seat to both the excitement and the unease: the promise of efficiency and the risk of losing human connection.
Lived Experience Shapes Perspective
Some students approach me, CV in hand, wide-eyed and uncertain. They’ve spent months perfecting their applications, only to discover that an algorithm might decide whether a human ever sees their profile. It’s not that AI is inherently unfair—it’s that the experience can feel impersonal, even intimidating.
For candidates from marginalised backgrounds, and for neurodiverse applicants, these systems can exacerbate anxiety. Employers, meanwhile, often assume AI equals objectivity without fully appreciating its limitations. Observing both sides has taught me that technology alone cannot ensure fairness; human guidance remains essential.
AI in Education: Automation Meets Accessibility
Universities have been quietly experimenting with AI for years. Turnitin flags plagiarism and now detects AI-generated content (Turnitin, 2024). Platforms like Texthelp and Ginger provide personalised support for neurodiverse learners, giving educators insights into individual learning needs (Texthelp, 2023).
Applied to recruitment, these tools demonstrate potential—but they also serve as a cautionary tale. An algorithm might flag content as “suspicious” without understanding context, culture, or creativity. Tools designed to support neurodiverse candidates can unintentionally reinforce stereotypes if not carefully overseen. Technology can point the way, but it cannot replace judgment, empathy, or nuance.
AI in Public and Private Sector Recruitment
Across sectors, AI promises to streamline recruitment. Applicant Tracking Systems (ATS) can sift through hundreds of CVs in seconds, AI-powered screening identifies candidates whose skills match the role, and chatbots provide instant engagement and guidance. Predictive analytics forecast performance, while bias reduction tools aim to make hiring fairer (IBM, 2025).
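To make the sifting step concrete, here is a minimal sketch of how a keyword-based screen might score a CV against a list of required skills. The function, keyword list and example CV are all invented for illustration; commercial ATS and screening tools use far richer parsing and ranking than this.

```python
# Minimal sketch of a keyword-based CV screen (illustrative only; real ATS
# products use far more sophisticated parsing and ranking than word matching).
import re

def score_cv(cv_text: str, required_skills: list[str]) -> float:
    """Return the fraction of required skills mentioned anywhere in the CV."""
    words = set(re.findall(r"[a-z+#]+", cv_text.lower()))
    matched = [skill for skill in required_skills if skill.lower() in words]
    return len(matched) / len(required_skills)

# Hypothetical job-advert keywords and candidate CV.
required = ["python", "sql", "communication", "teamwork"]
cv = "Final-year student with projects in Python and SQL; strong teamwork."

print(f"Match score: {score_cv(cv, required):.0%}")  # 75%: 'communication' never appears
```

Crude as it is, this is why candidates are often advised to mirror the wording of the job advert: a strong CV that describes the same skills in different language can score poorly.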
The benefits are clear: speed, consistency and scale. Employers can manage high volumes of applications, reduce administrative burden and make more data-informed decisions. Candidates may receive faster responses and, in theory, a fairer process.
But AI is only as unbiased as the data it learns from. Historical recruitment patterns can be replicated—and even amplified (Bogen and Rieke, 2018). Even the most sophisticated algorithm cannot assess motivation, potential, or cultural fit—qualities central to meaningful recruitment. This is where human oversight isn’t optional; it’s critical.
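To make that replication risk concrete, the toy sketch below "learns" a shortlist rule from fictional historical decisions in which a seemingly neutral proxy feature (a hobby, in this invented data) happens to track group membership. Every number and feature here is an assumption for illustration; no real system is this crude, but the mechanism of inheriting past skew is the same.

```python
# Toy sketch of how a screen trained on past decisions replicates their skew.
# The data, the proxy feature and the 50% threshold are all invented.

# Hypothetical historical records: (proxy_feature, hired).
history = [("rowing", True)] * 40 + [("rowing", False)] * 10 \
        + [("other", True)] * 10 + [("other", False)] * 40

# "Training": learn the historical hire rate per proxy value,
# then shortlist any future applicant whose value scored 50% or more.
features = sorted({feature for feature, _ in history})
rates = {}
for value in features:
    outcomes = [hired for feature, hired in history if feature == value]
    rates[value] = sum(outcomes) / len(outcomes)

shortlist = {value: rate >= 0.5 for value, rate in rates.items()}
print(rates)      # {'other': 0.2, 'rowing': 0.8}
print(shortlist)  # {'other': False, 'rowing': True} - yesterday's skew, automated
```

The screen has learned nothing about ability; it has simply automated the pattern in the decisions it was shown.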
Real-World Case Studies
Unilever integrated AI-driven video interviews to assess cognitive and emotional attributes alongside CVs. Hiring time dropped from six months to four weeks, over 50,000 hours of interviews were saved, and diversity hires increased by 16% (Unilever, 2025).
BuzzFeed, using IBM’s Watson Candidate Assistant, engaged candidates in personalised career conversations and matched them to roles. Sixty-four per cent more applicants progressed to in-person interviews, improving both efficiency and inclusivity (IBM, 2025).
These examples are inspiring—but also a reminder: AI amplifies what humans set in motion. Without oversight, the same technology can unintentionally reinforce bias.
Balancing Innovation with Empathy
AI cannot replace intuition, empathy, or human understanding. Students need guidance to navigate automated systems: optimising CVs for ATS, building authentic digital profiles and engaging meaningfully with recruiters. Employers must retain human checkpoints, critically evaluate AI outputs and ensure technology complements—not replaces—human judgment.
Without empathy, recruitment risks reducing applicants to data points. With oversight, however, AI can help level the playing field, making opportunities more accessible for diverse talent.
Practical Reflections
- Audit AI tools regularly to identify bias and ensure fairness (see the sketch after this list).
- Keep human checkpoints central to recruitment decisions.
- Be transparent with candidates about AI usage to build trust.
- Equip students to understand and navigate AI-mediated recruitment confidently.
- Collaborate with employers to embed inclusivity in recruitment practices.
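On the first of these points, a simple starting metric for an audit is a selection-rate comparison in the spirit of the "four-fifths" rule of thumb: if one group's selection rate falls below roughly 80% of the highest group's rate, the tool warrants closer scrutiny. The sketch below uses made-up counts and a hypothetical grouping purely to illustrate the calculation.

```python
# Minimal adverse-impact check (four-fifths rule of thumb); counts are made up.
def selection_rates(outcomes: dict[str, tuple[int, int]]) -> dict[str, float]:
    """outcomes maps group name -> (selected, total applicants)."""
    return {group: selected / total for group, (selected, total) in outcomes.items()}

outcomes = {"group_a": (60, 200), "group_b": (30, 200)}  # hypothetical counts
rates = selection_rates(outcomes)
benchmark = max(rates.values())

for group, rate in rates.items():
    ratio = rate / benchmark
    flag = "review" if ratio < 0.8 else "ok"
    print(f"{group}: rate={rate:.0%}, ratio={ratio:.2f} -> {flag}")
```

A check like this is a prompt for human review rather than a verdict: it says nothing about why the rates differ or whether the underlying criteria are justified.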
Final Thoughts
AI is here to stay. Used thoughtfully, it can improve efficiency, transparency and inclusivity. Misapplied, it risks eroding trust and perpetuating inequities. Recruitment that combines AI with human insight, empathy, and ethical oversight benefits candidates, organisations and the wider labour market.
AI can do a lot—but it cannot understand lived experience. That is where humans remain indispensable.
References
Bogen, M. and Rieke, A. (2018) Help wanted: An examination of hiring algorithms, equity and bias. Upturn. Available at: https://www.upturn.org/reports/2018/hiring-algorithms/ (Accessed: 10 October 2025).
IBM (2025) BuzzFeed case study. Available at: https://www.ibm.com/case-studies/buzzfeed (Accessed: 10 October 2025).
Texthelp (2023) Supporting neurodiverse students with AI tools. Available at: https://www.texthelp.com/en-gb/ (Accessed: 17 October 2025).
Turnitin (2024) AI detection in academic submissions. Available at: https://www.turnitin.com/ (Accessed: 17 October 2025).
Unilever (2025) Unilever AI recruitment case study. Available at: https://www.gobeyond.ai/ai-resources/case-studies/unilever-ai-video-interview-recruitment-diversity (Accessed: 19 October 2025).