Safiya Noble is one of the most respected voices on the societal impact of technology companies. She is an Associate Professor at the University of California, Los Angeles, where she serves as Co-Director of the UCLA Center for Critical Internet Inquiry. She is the author of Algorithms of Oppression, a book that challenges the idea that search engines offer a level playing field for all forms of ideas, identities, and activities. You can read an excerpt here.
Professor Noble’s first encounter with racism in search happened in 2009, when a friend mentioned: “You should see what happens when you google ‘black girls.’” She was stunned to discover that most of the results were related to porn or sex, even though those words were not included in the search box.
Google blocked explicit content from AdWords in 2014. And yet, even today it’s possible to find hypersexualized results when searching for “Latino girls” or “Asian girls.” “What we know about Google’s responses to racial stereotyping in its products is that it typically denies responsibility or intent to harm, but then it is able to ‘tweak’ or ‘fix’ these aberrations or ‘glitches’ in its systems,” Professor Noble wrote in her book, published in 2018 and reviewed here by The New York Times.
It’s important for journalists to understand what data bias is and how to report on it. It’s equally important for people who work at tech companies to be educated about the histories of marginalized people, so they don’t repeat the same mistakes.