Tess Torelli | March 13, 2018

YouTube is still struggling to figure out how to keep from recommending divisive content like conspiracy theories.

On Monday afternoon, YouTube users on Twitter highlighted problems with the site’s autocomplete feature, which automatically suggests search queries after a user starts typing a few letters or a word, as well as with its recommendation engine.

The thread was spawned by the New York Times’ recent editorial exploring YouTube’s role as “one of the most powerful radicalizing instruments of the 21st century.”

The video site, owned by Alphabet’s Google, has been the subject of several recent investigations showing how it highlights extreme content, like conspiracy theories or hyper-partisan points of view, over more measured videos.

For example, YouTube’s and Google’s autocomplete boxes deliver starkly different answers to the same queries, with the video site suggesting controversial or fake points of view that Google’s search box does not surface.

In the screenshots, YouTube results appear on the left and Google results on the right. Both searches were done in incognito mode to prevent prior search history from affecting the results.
