When ‘Sorry’ is No Longer Enough

The recent hearings at the US Senate and House of Representatives have not quashed the troubling issues surrounding Facebook. CEO Mark Zuckerberg and his teams of lawyers and strategists projected concern and said, once again, that Facebook would make changes. The question that remains is how effective those changes will be this time. In the wake of reports that the private information of up to 87 million unsuspecting Facebook users was harvested by Cambridge Analytica, a data mining company well known for its involvement in the Trump election campaign and the Brexit referendum, Zuckerberg turned on the charm offensive. This was not the first time, and will likely not be the last, that Facebook has snapped to attention when faced with criticism. The question for regulators across the world is: why should we wait for Facebook to own up to the next privacy or public safety scandal and come forward with apologies in hand? Why not put regulations in place to protect users now?

The Counter Extremism Project (CEP) has documented numerous instances in which Facebook made express policy changes only after public accusations, a scandal, or pressure from lawmakers. While one would hope that Facebook is continuously working to improve security on its platform, there is no excuse for making so many policy changes after wrongdoing has been exposed rather than taking preventative action. This pattern raises the suspicion that more scandals are in the making due to as-yet-undiscovered lapses in Facebook’s current policies and practices, and it calls into question whether Facebook is doing all it can to take down terrorist content posted on its platform or to remove hate speech.

Cambridge Analytica was not the first time Facebook received backlash for mishandling user data. Facebook first issued an apology on the issue in 2007, when a feature called Beacon tracked and shared users’ online activity without expressly asking them for permission. Five years later, after the company admitted to a year-long data breach that exposed the personal information of six million users, Facebook promised to “work doubly hard to make sure nothing like this happens again.” Nonetheless, this year Facebook again rushed to update its data policy immediately after the latest scandal, publishing at least five press releases explaining new measures and adjustments. Aside from the enormous breach of user privacy that the Cambridge Analytica scandal represents, it raises the question: if Facebook can track people’s movements and preferences so well, why can’t it identify terrorist content on its platform and remove it before it is viewed?

For more than a decade, Facebook has faced criticism for the misuse of its platform on issues ranging from the publication of inappropriate content to user privacy and safety. However, rather than taking preventative measures, Facebook chooses to wait and then rushes to make policy changes after the damage has already been done.

In an age where terror attacks are committed by people who are radicalized by watching content hosted on online platforms, the time has long passed for regulators to make Facebook stop apologizing after the horse has bolted.

Recently, many European countries have raised the question of the legal status of Facebook and other online platforms amid concerns over the proliferation of online extremist content. These platforms disseminate information yet circumvent the responsibilities imposed on publishers. Researchers at CEP continue to find extremist content published on Facebook, so claims made by Mr. Zuckerberg during the recent hearings in Washington were incorrect and left a harmful impression of security that simply does not exist. We need regulators and policymakers on both sides of the Atlantic who will hold these tech companies accountable to standards for identifying and permanently removing terrorist content, standards that are effective, transparent, and enforced throughout the value chain.
