Will algorithms choose your next lab colleague?

When researchers apply for a job through the office of the pro-rector for research at Feevale University in Novo Hamburgo, Brazil, data from their CVs and publication records are read not only by hiring managers, but also by artificial intelligence (AI) tools. These help to screen candidates by evaluating their productivity and suitability for the role. “Obviously, these analyses are complemented by other steps, including manual data checking and interviews,” says pro-rector Fernando Spilki. Although AI is not yet used for recruitment across the university, it’s a growing approach, he says.

At the National University of Malaysia on the outskirts of Kuala Lumpur, computer scientist Kalaivani Chellappan has similarly embraced AI for recruitment. Her research centre, which trains unemployed graduates for jobs in technology, used to face considerable setbacks owing to hiring mismatches. Now, using generative AI, Chellappan has developed a screening tool trained on more than a decade of CV data. It integrates a personality-assessment instrument developed with a psychology specialist to help her to make better hiring choices. “The goal is to improve hiring accuracy and reduce turnover,” she says.

According to a 2024 Nature survey of hiring managers, Feevale and Chellappan are outliers. In our poll of more than 1,100 hiring managers across universities, research institutes and industry in 77 countries, just 8% of hirers in academia said they used AI in the recruitment process. Meanwhile, 36% said they weren’t using AI to recruit, but would consider doing so in future. A larger share — 54% — said they were not using it, and would not do so. By contrast, AI has made greater inroads in industry: 49% of industry respondents reported using AI in their hiring processes (see Nature 634, 737–740; 2024).

The global recruitment industry was valued at an estimated US$870 billion in 2024.
As that industry looks to AI to deliver faster, more targeted recruitment services, and as science-related industries embrace AI-assisted recruitment, it might be only a matter of time before those technologies enter academic hiring more widely, too. If so, scientists who don’t keep up with the demands of AI-driven recruitment could lose out, and candidates who don’t understand what the AI tools look for might risk their applications being rejected without ever being seen by a human.

Conversely, although jobseekers can use AI to make sure their CVs and cover letters closely match the requirements of a vacancy, this could produce applications that are so similar that it becomes difficult to tell candidates apart. “There are certainly productivity gains to expect in the short term, but it might paradoxically make the whole matching process less effective,” says Pierre Lison, a researcher studying natural language processing and machine learning at the Norwegian Computing Center in Oslo.

Eyeballs on CVs

AI is finding its way into the recruitment industry outside academia. Earlier this year, in a LinkedIn survey of 1,271 recruiters across 23 countries, 37% of respondents reported experimenting with generative AI or integrating it into their workflow — a significant rise from 27% the year before (see go.nature.com/44ntvfj). Just scrolling through the professional-networking site throws up a multitude of talent-acquisition businesses advertising their AI-powered services. Many boast that their technology will save time, take over mundane tasks and eliminate unconscious bias.

Such companies regularly approach Jim Harrington, a talent-acquisition director at the International AIDS Vaccine Initiative, a non-profit organization in New York City. “Eighty per cent of them say ‘We have some proprietary AI tool that helps us search to find the right candidate’,” he says.
“It’s part of every solicitation.” Although he sometimes uses generative AI tools for routine tasks, Harrington remains dubious about these companies’ offerings. AI can be useful in weeding out candidates who are clearly not eligible for a role, he says, but he does not think it’s good at picking the best candidates. Moreover, handing over sensitive data to an AI system could constitute a data risk. “My approach has always been that we need to get human eyeballs on CVs,” he says.

Recruiters who use AI to match and assess candidates are just one side of the coin. People are also using AI to tailor their applications and CVs to specific job openings — a practice that one-quarter of recruiters in Nature’s hiring survey said they found worrisome. Specifically, some were concerned that it would misrepresent candidates’ true abilities. It also puts employers off — receiving an application that looks as if it was written by AI “immediately creates some scepticism around the candidate”, says Harrington.

A June episode of the Nature Careers Podcast discussed other potential applications of AI in hiring (see Nature https://doi.org/g9p622; 2025). These included using a chatbot as a “thinking partner” to brainstorm what to include in an application; as a tool to tweak the language for flow and style; and as a way to help with job hunting, for instance by suggesting alternative career paths that suit the jobseeker’s academic background.

Colin Fisher, PhD programme director for the management school at University College London, says he has noticed a decrease in spelling and grammar mistakes in applications. He thinks that could be down to what he calls “relatively benign” AI usage: passing text through AI-based grammar assistants such as Grammarly. There are probably applicants who use AI to write their personal statements, too, Fisher admits.
But those who rely too heavily on AI are likely to be identified at the interview stage, he says: “In theory, we’d catch someone who didn’t really understand their own essay in that process.”

Colin Fisher attributes a drop in spelling and grammar mistakes in university applications to ‘relatively benign’ AI use. Credit: Sam Bush

A more systemic concern is ‘signal corruption’ — when overuse of generative AI tools results in job applications that are so homogenized that it becomes difficult to identify uniquely talented or qualified candidates. This process, outlined in a 2024 publication by scientists at the Massachusetts Institute of Technology in Cambridge, could paradoxically make the recruitment process less effective overall (J. Kaashoek et al. in An MIT Exploration of Generative AI https://doi.org/ps4s; 2024). The researchers note that the problem is amplified because AI tools are also making it easier for candidates to apply for more jobs.

Douglas Anderson, a chemist at Thermo Fisher Scientific in Eugene, Oregon, says this already happens when candidates use online recruitment portals to submit one-click applications for vacancies with pre-filled information from their profiles and CVs. “These auto-applications don’t look to be very sophisticated and are relatively easy to spot,” Anderson says. They rarely make it past his company’s first screening stage, which also uses AI, he adds. “In my experience, not one of our hires has been through an automated, one-button-style application.”

Barriers in academia

Nature reached out to half a dozen academic institutions to ask them about their AI use in recruitment. Most declined to respond, but the University of California, Berkeley, and the Swiss Federal Institute of Technology in Lausanne (EPFL) both said they did not use AI tools for centralized recruiting activities.
However, they noted that they had limited overview of the issue, because final candidate review and hiring decisions tend to be made by the relevant departments or lab principal investigators.

Jukka Luoma, who studies workplace AI use at Aalto University in Espoo, Finland, says that this decentralized hiring structure could partly explain why AI is not used much in academic hiring. “It might be challenging to develop a one-size-fits-all AI tool for hiring in the university context,” he says.

Other factors might play a part, too, Luoma says, including the long-term nature of academic employment, in which each hire has the potential to affect a research group or department for years to come. “Hiring faculty is a very high-stakes game, as you are hiring people to be your colleagues for decades,” he says.

But perhaps the biggest barrier is the risk posed by feeding data from applicants into online AI systems. Commercial tools such as ChatGPT do not keep uploaded data private, and using them with personal data can violate privacy laws. The European Union introduced legislation in August 2024 that classifies recruitment as a high-risk use case for AI and imposes substantial regulatory requirements on organizations that want to use it for that purpose. This has slowed the adoption of AI hiring practices in Europe, regardless of sector, Luoma says. “My guess is that universities are more cautious doing this sort of high-risk stuff.”

Platform pioneers

Universities’ recruitment challenges — including decentralized decision-making and high-stakes hires — could explain why academic job platforms that post vacancies from many institutions have become trailblazers in developing AI hiring tools.

One such platform is AcademicTransfer, a government-funded, not-for-profit recruitment system based in Utrecht, the Netherlands.
The site, which lists nearly every Dutch academic job opening, has been developing AI-based tools for the past four years. These include a tool that uses large language models (the technology behind chatbots such as ChatGPT) to rank applications for job postings. The tool, which is currently being piloted by the platform, was developed after universities using AcademicTransfer became overwhelmed by hundreds of applications, especially for jobs in which the principal investigator was popular, says managing director Jeroen Sparla. “That’s why we started this pilot with the recruiters, to see if algorithms could assist in which candidates would be the best fit, and which would be the worst.”
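AcademicTransfer has not published the details of its ranking pipeline, so the sketch below is a rough illustration of the general idea only: scoring each application against a posting and sorting by score. A simple keyword-overlap score stands in for the language-model scoring step, and the posting, applications and function names are all hypothetical.

```python
# Rough sketch of ranking job applications against a posting.
# A real pilot like AcademicTransfer's would ask a large language
# model to score each application; here a keyword-overlap score
# stands in for the model so the ranking logic runs without any
# external service. All texts and names below are hypothetical.

def score(posting: str, application: str) -> float:
    """Fraction of the posting's words that the application also uses."""
    keywords = set(posting.lower().split())
    words = set(application.lower().split())
    return len(keywords & words) / len(keywords)

def rank(posting: str, applications: list[str]) -> list[str]:
    """Order applications from best apparent fit to worst."""
    return sorted(applications, key=lambda a: score(posting, a), reverse=True)

posting = "postdoc in microscopy and quantitative image analysis"
applications = [
    "I manage grant budgets and lab administration",
    "My PhD combined microscopy with quantitative image analysis",
]
for application in rank(posting, applications):
    print(application)  # the microscopy application prints first
```

Swapping `score` for a call to a hosted model is where real deployments diverge from this toy — and where the data-privacy concerns raised earlier in the piece come in.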