Of course, one would expect many students who graduate with a doctorate in AI to join an AI company, whether a startup or a Big Tech giant.
According to the 2021 Stanford Artificial Intelligence Index Report, the share of North American AI PhD graduates who joined industry after graduation increased from 44.4% in 2010 to roughly 48% in 2019, while the share of new AI PhDs entering academia fell from 42.1% to 23.7% over the same period.
It probably does not help that private industry is willing to pay top dollar for AI talent.
According to figures from salary negotiation service Rora, the biggest AI startups, like OpenAI and Anthropic, post hair-raisingly high salaries of around $700,000 to $900,000 for new researchers. And Google has apparently gone as far as providing large grants of restricted stock to attract top data scientists.
While AI graduates no doubt welcome the trend (who wouldn't kill for a starting salary that high?), it's having an alarming impact on academia.
A 2019 survey co-authored by researchers at the Hebrew University of Jerusalem and the Cheung Kong Graduate School of Business in Beijing found that nearly 100 AI faculty members left North American universities for industry jobs between 2018 and 2019, an outsized cohort for such a highly specialized corner of computer science. Between 2004 and 2019, the study reports, Carnegie Mellon lost 16 AI faculty members, while its peers the Georgia Institute of Technology and the University of Washington each reportedly lost around a dozen.
The faculty exodus has opened a Pandora's box of downstream effects. The Hebrew University and Cheung Kong survey concluded it had an especially stark impact on entrepreneurship: students who graduated from universities where those professors used to work went on to found fewer AI companies. According to the survey, the chilling effect is strongest in the years immediately following a faculty departure, and is especially pronounced when the departing professors are replaced by faculty from lower-ranked schools or by untenured professors.
Maybe that's why AI companies and labs are increasingly looking for talent in industry, not universities.
A new report from VC firm SignalFire suggests the share of AI hires coming from top schools such as Caltech, Harvard, Princeton, Yale and Stanford, or holding doctorates, has dropped significantly from a peak of around 35% in 2015. In 2023, the figure was closer to 18%, as AI companies began to seek out and hire more candidates without graduate degrees.
"We found a high concentration of the best AI talent among a handful of startups when historically we saw this clustering at public giants like Google," said Ilya Kirnos, cofounder and CTO of SignalFire. "That led us to look at where the best AI talent was moving across the industry, and whether talent was more correlated with top universities or top startups."
To reach those conclusions, SignalFire homed in on a slice of the best AI talent through two different channels: academic papers and open-source project contributions. (Kirnos concedes that not every AI researcher publishes or contributes to open-source, but says the report is intended to offer a "representative slice" of the AI talent ecosystem, not the entire pie).
SignalFire cross-referenced authors at major AI conferences like NeurIPS and ICML with university employment listings to identify AI faculty. It then matched contributors to popular AI software projects on GitHub with public employment feeds, like LinkedIn, to identify top overall contributors.
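SignalFire hasn't published its pipeline, but the cross-referencing it describes can be pictured with a short, hypothetical sketch: join conference authors against university faculty rosters, and GitHub contributors against publicly listed employers. Every file name and column below is an assumption made for illustration, not SignalFire's actual data or method.

```python
# Hypothetical illustration of the kind of cross-referencing SignalFire describes.
# File names and columns are invented for this sketch; name-based joins are noisy
# in practice and would need deduplication and manual review.
import pandas as pd

authors = pd.read_csv("conference_authors.csv")        # columns: name, paper_id, year
faculty = pd.read_csv("university_faculty.csv")        # columns: name, university, title
contributors = pd.read_csv("github_contributors.csv")  # columns: name, repo, commits, employer

def normalize(s: pd.Series) -> pd.Series:
    """Crude name normalization so exact joins have a chance of matching."""
    return s.str.lower().str.strip()

for df in (authors, faculty, contributors):
    df["name_key"] = normalize(df["name"])

# 1) Conference authors who also appear on a university faculty roster.
academic_ai = authors.merge(faculty, on="name_key", suffixes=("", "_faculty"))
print(len(academic_ai), "author-paper rows matched to faculty rosters")

# 2) Top open-source contributors, ranked by commits and tagged with employer type
#    via a hand-maintained (and here, hypothetical) lookup table.
top_contributors = (
    contributors.sort_values("commits", ascending=False)
    .drop_duplicates("name_key")
    .head(1000)
)
employer_categories = pd.read_csv("employer_categories.csv")  # columns: employer, category
summary = (
    top_contributors.merge(employer_categories, on="employer", how="left")
    .groupby("category")["name_key"]
    .count()
)
print(summary)
```

In practice, name-based matching like this is messy (collisions, missing profiles, inconsistent spellings), which is another reason Kirnos frames the result as a "representative slice" rather than a census.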
Kirnos says SignalFire's data shows a growing tendency among AI companies (e.g. Hugging Face, Stability AI, Midjourney) to bypass the prestigious graduate hiring pools in favor of the research communities springing up around new AI tradecraft (see: prompt engineering). And that, Kirnos claims, is a good thing, because it has the potential to lower the industry's barriers to entry for non-PhDs.
"This will create demand for new ways to assess recruiting candidates for real-world software engineering experience," Kirnos said. "Instead of filtering by university brand names, we may see employers seek out new ways to screen applicants for expertise in building functional products out of the stack the company actually uses."
Diversity is in the eye of the beholder, of course.
Stanford reported that AI PhD programs were decidedly homogeneous in 2019, with white students making up 45% of AI PhD graduates. So were AI teams in industry. In its State of AI 2022 report, McKinsey found that the average share of employees developing AI solutions who identify as racial or ethnic minorities was a paltry 25%, and that 29% of organizations had no minority employees working on AI whatsoever.
For his part, Kirnos believes universities could do more to give their students a chance to work on research and production in settings closer to those they'll encounter in industry. "Engineering is moving, more than ever, away from building whole products from scratch in a vacuum, toward cobbling together stacks of AI models, APIs, enterprise tools and open source software," he said.
But this writer hopes that universities wake up to the trend and act on it. Highly selective AI doctoral programs deserve condemnation for their exclusivity, at the very least, and for the way that exclusivity centralizes power and accelerates inequality. For my part, though, I'm reluctant to welcome a future in which industry, through hiring and other means of influence, exercises growing control over the direction of the AI field.