How can social media companies support civil society?

In August, the UN issued a report that accused Myanmar’s military of “genocidal intent.” Minutes after the release of the report, Facebook announced that it had discovered a covert campaign, orchestrated by Myanmar’s military, which sought to stoke fears about the Rohingya minority and undermine Myanmar’s civilian government. Facebook banned 20 accounts associated with the military and removed another 12 accounts and 46 pages for “coordinated inauthentic behaviour.” It was an unprecedented move by a social media company, and it prompted praise from human rights activists and proponents of transparency.

But buried in the UN’s full report were other damning indictments of Facebook. The company failed to provide the UN Fact Finding Mission with “country-specific data” about hate speech on the platform and, with embarrassing irony, took weeks to respond to the UN’s request to remove Facebook posts that targeted a human rights activist for allegedly cooperating with investigators.

From the United States to the Philippines and Myanmar, bad actors have used social media to misinform, harass, and silence people around the world. Faced with scathing criticism and looming regulation, social media companies have only just started to act. After its platform promoted conspiracy theories during breaking news events, YouTube recently made changes to surface authoritative sources more prominently to viewers.

In testimony before Congress, Twitter’s CEO lauded the company’s efforts to encourage healthier public debate. Facebook has committed to increasing the number of local content moderators, and has hired an independent group to audit the human rights impact of its platform in Myanmar.

I have worked on some of these problems with civil society groups in Myanmar, Pakistan, and Sri Lanka, where the dominance of social media is shaping public discourse, and decisions in Silicon Valley have had far-reaching consequences. Viewed in this context, what else can social media companies do?

First, companies must work with and support local communities when developing their policies. Citizen groups understand the political and social challenges of their countries best, to say nothing of local languages, in which hate speech cannot easily be monitored automatically without causing collateral damage. In many countries, civil society has also been critical to uncovering state-sponsored harassment and hate speech online, and even to averting violence, as we learned from groups in Myanmar.

Second, companies need to give researchers, journalists, and human rights defenders access to the data on their platforms. Facebook and other companies have said that data restrictions are necessary to protect user privacy, but those restrictions also set back people who are trying to counter radicalisation, protect electoral integrity, and stop the spread of misinformation.

This year alone, researchers have used public data from social media to show how hate speech fuelled anti-refugee attacks in Germany and how fake accounts peddling misinformation persisted well after the US elections. Using Facebook’s Groups API, my own research showed how hate speech exploded at the beginning of the Rohingya crisis. This research allows us to hold companies accountable, and to develop interventions – community dialogues, digital literacy, and more – that can help tackle some of these problems.
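To make the data-access point concrete, the sketch below shows in rough outline how a researcher might have pulled public posts from a group feed through Facebook’s Graph API and run a crude keyword count over them. It is illustrative only: the access token, group ID, and watch-list terms are placeholders, the Groups API has since been heavily restricted, and real studies rely on curated, language-specific lexicons built with local partners rather than simple keyword matching.

    import requests

    GRAPH_URL = "https://graph.facebook.com/v2.12"  # API version is an assumption
    ACCESS_TOKEN = "YOUR_ACCESS_TOKEN"              # placeholder credential
    GROUP_ID = "1234567890"                         # hypothetical public group ID

    # Purely illustrative watch-list; real research uses curated,
    # language-specific lexicons developed with local partners.
    WATCH_TERMS = {"example_slur_1", "example_slur_2"}

    def fetch_group_posts(group_id, token, limit=100):
        """Yield posts from a group's feed, following the API's pagination links."""
        url = f"{GRAPH_URL}/{group_id}/feed"
        params = {"access_token": token, "fields": "message,created_time", "limit": limit}
        while url:
            resp = requests.get(url, params=params, timeout=30)
            resp.raise_for_status()
            payload = resp.json()
            for post in payload.get("data", []):
                yield post
            # The 'paging.next' URL already carries the query parameters.
            url = payload.get("paging", {}).get("next")
            params = None

    def count_flagged_posts(posts):
        """Count posts whose text contains any watch-list term (a crude proxy)."""
        flagged = 0
        for post in posts:
            text = (post.get("message") or "").lower()
            if any(term in text for term in WATCH_TERMS):
                flagged += 1
        return flagged

    if __name__ == "__main__":
        posts = fetch_group_posts(GROUP_ID, ACCESS_TOKEN)
        print("Posts matching watch-list terms:", count_flagged_posts(posts))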

Finally, companies should adopt a human rights-based approach in their work.

The UN Guiding Principles on Business and Human Rights suggest, at a minimum, policies that set out a company’s human rights responsibilities and expectations. Adopting more concrete measures, such as “assessing actual and potential human rights impacts” on the ground, could help to prevent situations like the one in Myanmar, and UN investigators have called on Facebook to adopt such a framework.

As repressive regimes adopt more sophisticated strategies – infiltrating online groups, cyberbullying the opposition, and deploying automated bots – the challenges faced by civil society will undoubtedly grow.

Countering these tactics will require a range of responses, and academics, journalists, and citizen groups will need to work together. Earlier this year, Mark Zuckerberg insisted that Facebook’s aim is to “empower and build” groups like fact-checkers and others that “can help figure out these new issues on the Internet.” Genuine empowerment means giving these groups the information and the opportunity to participate in policymaking, as well as the space to criticize the companies themselves.
