Last week, prominent Google artificial intelligence researcher Timnit Gebru said she was fired by the company after managers asked her to retract a research paper or remove her name from it, and she objected. Google maintains that she resigned, and Alphabet CEO Sundar Pichai said in a company memo on Wednesday that he would investigate what happened.

The episode is a pointed reminder of tech companies’ influence and power over their field. AI underpins lucrative products like Google’s search engine and Amazon’s virtual assistant Alexa. Big companies pump out influential research papers, fund academic conferences, compete to hire top researchers, and own the data centers required for large-scale AI experiments. A recent study found that, among tenure-track faculty at four prominent universities who disclose their funding sources, the majority had received backing from Big Tech.

Ben Recht, an associate professor at the University of California, Berkeley, who has spent time at Google as visiting faculty, says his fellow researchers sometimes forget that companies’ interest doesn’t stem only from a love of science. “Corporate research is amazing, and there have been amazing things that came out of Bell Labs and PARC and Google,” he says. “But it’s weird to pretend that academic research and corporate research are the same.”

Ali Alkhatib, a research fellow at the University of San Francisco’s Center for Applied Data Ethics, says the questions raised by Google’s treatment of Gebru risk undermining all of the company’s research. “It feels precarious to cite because there may be things behind the scenes, which they weren’t able to talk about, that we learn about later,” he says.

Alkhatib, who previously worked in Microsoft’s research division, says he understands that corporate research comes with constraints. But he would like to see Google make visible changes to win back trust from researchers inside and outside the company, perhaps by insulating its research group from other parts of Google.

The paper that led to Gebru’s exit from Google highlighted ethical questions raised by AI technology that works with language. Google’s head of research, Jeff Dean, said in a statement last week that it “didn’t meet our bar for publication.”

Gebru has said managers may have seen the work as threatening to Google’s business interests, or seized on it as an excuse to remove her for criticizing the lack of diversity in the company’s AI group. Other Google researchers have said publicly that Google appears to have used its internal research review process to punish her. More than 2,300 Google employees, including many AI researchers, have signed an open letter demanding the company establish clear guidelines on how research will be handled.

Meredith Whittaker, faculty director at New York University’s AI Now Institute, says what happened to Gebru is a reminder that, although companies like Google encourage researchers to consider themselves independent scholars, corporations prioritize the bottom line above academic norms. “It’s easy to forget, but at any moment a company can spike your work or shape it so it functions more as PR than as knowledge production in the public interest,” she says.

Whittaker worked at Google for 13 years but left in 2019, saying the company had retaliated against her for organizing a walkout over sexual harassment and had moved to undermine her work raising ethical concerns about AI. She helped organize employee protests against an AI contract with the Pentagon that the company ultimately abandoned, although it has taken up other defense contracts.

Machine learning was an obscure corner of academia until around 2012, when Google and other tech companies became intensely interested in breakthroughs that made computers much better at recognizing speech and images.

The search and ads company, quickly followed by rivals such as Facebook, hired and acquired leading academics, and urged them to keep publishing papers in between work on company systems. Even traditionally tight-lipped Apple pledged to become more open with its research, in a bid to lure AI talent. Papers with corporate authors and attendees with corporate badges flooded the conferences that are the field’s main publication venues.