Google wants you to believe it is doing its utmost to counter violent extremist content on its YouTube platform. Yet the tech company has been criticized for its record of turning a blind eye to the proliferation of such propaganda, and we should not expect much progress from its latest initiative to steer prospective jihadists away from the path to radicalisation and violence.
To be fair, YouTube has recently made several long-overdue changes in an effort to combat terrorist propaganda, but these come more than three years after ISIS began disseminating material such as beheading videos and calls for violence on the platform. Corporations have since left YouTube in droves after seeing their ads posted next to extremist content, and European governments have threatened regulation and stiff fines. It is hard to understand why a $600 billion company took so long to recognize a problem seen so clearly by the public, other businesses, and policymakers.
This past July, Google introduced the Redirect Method, an initiative that uses keywords to intersperse counter-narrative videos among YouTube's search results for extremist content. The idea is that individuals searching for ISIS videos will instead find videos that counter ISIS's extremist narratives, for example by depicting the true brutality of ISIS's rule and how the terrorist organisation fails to uphold true Islamic principles. The hope is that users searching for terrorist and extremist material will ultimately be deterred from supporting groups like ISIS. There are, unfortunately, four problems with this approach.
The first is that ignoring unwanted content is easy for every user—including jihadists. You probably ignored a popup or ad minutes before reading this article.
When people deal with large quantities of information, they sort it based on its immediate usefulness. When searching the internet for a specific video, any hint that a piece of content is extraneous is grounds for disqualification. This isn’t to say that counter-narratives can’t be compelling or deeply affecting, but they are easy to ignore.
Second, the amount of extremist content on YouTube still far exceeds the supply of counter-narrative videos. Searches conducted by CEP in August, using keywords outlined in Google's Redirect Method pilot program, showed that extremist content outnumbered counter-narrative material almost two to one. It is troubling that YouTube would institute a system of narrative diversion while still allowing so much violent propaganda to be uploaded and easily accessible.
Third, the problem of extremist content on YouTube isn't only about direct searches and messaging. ISIS supporters on the encrypted messaging platform Telegram regularly post links to videos on YouTube and other video sites. Often the videos are unlisted, meaning they cannot be found through keyword searches and can only be reached on YouTube via a direct link. No matter how much counter-narrative content YouTube adds, the online armies of ISIS supporters will continue to purposely post and link to content there.
The fourth problem is that Google's measures come woefully late, given that they were first announced in February 2016. In the roughly 17 months since Google previewed what later became the Redirect Method, a number of high-profile terrorist attacks have been linked to content found on YouTube.
According to press reports and official statements, Pulse nightclub shooter Omar Mateen used YouTube to watch video lectures from al-Qaeda operative and radical preacher Anwar al-Awlaki. Similarly, Manchester bomber Salman Abedi relied, in part, on an ISIS bomb-making instructional video on YouTube to build his explosive device. That same video was still on YouTube almost two months after the May 22 suicide bomb attack.
Additionally, the members of the Barcelona terrorist cell who accidentally blew themselves up prior to the August 17 vehicular attack were making a TATP-based explosive device, the same type of bomb shown in the ISIS instructional video.
Google's counter-narrative plan is simply too little, too late. While it has made improvements in taking down violent extremist content—after years of media attention, threats of regulation, and loss of ad revenue—much remains to be done. Google and other tech companies are still not acting consistently, systematically, and transparently to remove and block terrorist content. In fact, they are largely attempting to disown their ethical responsibility to be more vigilant. The Redirect Method makes that abundantly clear.