Over the weekend of Aug. 8, social media users came across what looked like a hack of Google’s image search algorithm. It turns out that if you search Google Images for the phrase “white American doctor,” the results are dominated by images of Black doctors.
This led many social media users to ask whether the search giant had been hacked by activists, or whether Google itself had manufactured the results. As it happens, you don’t have to be an SEO pro to figure out why these particular images show up in the image search.
Keep scrolling to understand this phenomenon and the importance of phrasing and keywords when tagging your images.
Why does Googling “white American doctor” result in images of Black doctors?
When Twitter user @_TheProphecy tweeted, “Lmaooo idk who hacked the google algorithm but I am crying…google ‘white American doctor’ & press images,” people wondered whether there was a glitch in the search algorithm or whether the search engine had been hacked. Responders to the tweet pointed out similar results for practically any profession preceded by the words “white American,” as well as for searches like “portraits of European people.”
But while many were wondering whether the results were an indication of a skewed algorithm or even a hack, it turns out the results are only a reflection of the way users themselves input information online.
Last year, a similar conversation was sparked by Twitter user @JenniferMascia, who tweeted about an equally odd search result. Jennifer asked her followers, “So I do a @Google search for ‘Boston’s black neighborhoods’ and the 1st result that pops up is a list of Boston’s ‘worst’ neighborhoods. How does that happen?”
Google public liaison Danny Sullivan explained to Jennifer that "the [results] page was strong on the aspect of 'Boston neighborhoods' generally and likely seemed more list-like than other pages in the results, so got automatically picked as a featured snippet."
He also said that while the featured snippet system usually works as it should, it isn't perfect and reporting through Google's Feedback tool helps clear up these mistakes when they happen.
He continued to explain that “when people post images of white couples, they tend to say only ‘couples’ & not provide a race. But when there are mixed couples, then ‘white’ gets mentioned. Our image search depends [heavily on] words — so when we don’t get the words, this can happen.”
So, is Google showing a bias that favors images of African-Americans?
Essentially, Danny was saying that Google is only a mirror for how users search for content and how web pages are written to answer those queries. It isn’t Google or its algorithms that are biased; the bias lies in how people describe, or fail to describe, the images they publish. That’s why it’s important to match the words near an image, including its alt text and caption, with the keywords users are actually going to search for.
Search Engine Journal conducted an investigation by analyzing the code of the images that were appearing at the top of the search for “white American doctors.” They found that the code of the web pages on which the images of African-American doctors appeared had the keywords “white American doctor” in their headings, in close proximity to the image.
So when you search the term on Google, the search engine mines the code of various web pages, and if it finds the keywords physically close to the image markup, those images can surface in the results.
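To make the mechanic concrete, here is a toy sketch in Python. Google’s real ranking pipeline is vastly more complex and not public; this hypothetical `ImageContextParser` only illustrates the idea described above — associating an image with the heading text that sits near it in the page’s code, then matching query words against that text.

```python
from html.parser import HTMLParser


class ImageContextParser(HTMLParser):
    """Collects each <img> together with the heading text that precedes it."""

    def __init__(self):
        super().__init__()
        self._in_heading = False
        self._last_heading = ""
        self.images = []  # list of (src, nearby heading text)

    def handle_starttag(self, tag, attrs):
        if tag in ("h1", "h2", "h3"):
            self._in_heading = True
            self._last_heading = ""
        elif tag == "img":
            # Associate the image with whatever heading came just before it.
            self.images.append((dict(attrs).get("src", ""), self._last_heading.strip()))

    def handle_endtag(self, tag):
        if tag in ("h1", "h2", "h3"):
            self._in_heading = False

    def handle_data(self, data):
        if self._in_heading:
            self._last_heading += data


def matches_query(page_html, query):
    """Return srcs of images whose nearest heading contains every query word."""
    parser = ImageContextParser()
    parser.feed(page_html)
    words = query.lower().split()
    return [src for src, heading in parser.images
            if all(w in heading.lower() for w in words)]


page = """
<h2>White American doctor salaries</h2>
<img src="doctor-photo.jpg">
<h2>Hospital news</h2>
<img src="building.jpg">
"""
print(matches_query(page, "white american doctor"))  # ['doctor-photo.jpg']
```

The photo itself could show anyone; the match happens purely on the words near the image, which is exactly why pages headed “white American doctor” surfaced photos of Black doctors.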
All this ties into how search engines interpret and understand a user’s intent when writing SEO-friendly content. Search Engine Journal asks the pertinent question, “If people don’t generally refer to Caucasian doctors as white doctors, how would publishers refer to Caucasian doctors on web pages?”
Google can only reflect back how we use search engines and how we upload content and write web pages. So the next time you’re creating a web page, remember that the text around your images has a huge influence on how those images rank in search results.
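A quick way to act on that advice is to audit your own pages for images published without alt text. The sketch below uses Python’s standard-library `html.parser`; the `AltTextAuditor` class is a made-up name for this example, not a real tool.

```python
from html.parser import HTMLParser


class AltTextAuditor(HTMLParser):
    """Flags <img> tags whose alt attribute is missing or empty."""

    def __init__(self):
        super().__init__()
        self.missing = []  # srcs of images with no usable alt text

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attributes = dict(attrs)
            if not attributes.get("alt"):
                self.missing.append(attributes.get("src", "(no src)"))


auditor = AltTextAuditor()
auditor.feed('<img src="a.jpg" alt="A doctor in scrubs"><img src="b.jpg">')
print(auditor.missing)  # ['b.jpg']
```

Every image the auditor flags is an image whose meaning a search engine can only guess at from whatever other words happen to sit nearby.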
Tag your images correctly, kids!