Facebook: 99 percent attitude, 1 percent show

EPA-EFE/ETIENNE LAURENT

Facebook CEO Mark Zuckerberg arrives on stage during the VivaTech fair in Paris, France, 24 May 2018. The annual commercial convention runs from 24 to 26 May.

But what about accountability?



Facebook has grown too fast for its own good. Unable to keep up with its own explosive pace, the company keeps finding itself either getting in trouble, or not doing enough to stay out of trouble. And the waters keep running deeper and deeper.

Last week, it was made public that a bug in Facebook’s code changed the default privacy setting for new posts for as many as 14 million users. The bug set affected users’ posts to “public” regardless of their previously chosen settings, the company has confirmed. Those users may have been posting content publicly while believing they were sharing it, as usual, with just friends or with smaller user-defined subgroups such as family-only.

Users can typically decide whether their posts are visible to their friends or to the public, and set their preferred audience for regular use. The bug caused the platform to make the default setting “public”, overriding the users’ chosen preferences. “We’d like to apologize for this mistake,” Erin Egan, Facebook’s chief privacy officer, said in a statement. Egan confirmed that Facebook has fixed the issue and is notifying every user who was affected.

Facebook founder and CEO Mark Zuckerberg’s appearance before the European Parliament made an impact, either because of what he said or because of the awful mismanagement of the whole hearing. The fallout is still rumbling on over whether it was a coup that he attended or an embarrassment that he was allowed to get off so easily. Aside from the questions around data, fake news and marketing, he pulled out his “joker card” when answering probes about illegal content on the platform. The card in question is the statistic that Facebook’s AI can now “flag 99 percent of the ISIS and al-Qaeda related content that we end up taking down before any person in our community flags that for us.” Listeners are evidently supposed to be wowed by the claim. Ninety-nine percent is a lot. It’s almost everything. Except, it isn’t.


Content relating to terrorism

First of all, does this number only refer to “ISIS and al-Qaeda content” – what about other illegal content? Secondly, 99% of what Facebook “ends up taking down” could mean anything. How much terrorist content flows freely on the platform, but isn’t flagged? Zuckerberg doesn’t say. Thirdly, “before any person in our community flags that for us” doesn’t mean no-one sees it. The question remains: how long does this content remain online before it is removed?

This week, research provided by the Counter Extremism Project (CEP) and seen by New Europe showed that it stays online for too long. The organisation’s researchers found more than a handful of pieces of illegal content on Facebook that had remained available for anywhere from over a day to several years. Much of this content consists of photos, news items and videos promoting ISIS messages, and at times it includes footage of beheadings, amputations and shootings. So either Facebook’s AI is not as effective as its CEO asserted, or we have been deceived.

Look at how he frames the anecdote, starting with Facebook’s origins in his college dorm room. In its early stages, Zuckerberg says, Facebook relied on its ‘community’ to flag content, which Facebook would then “look at” – a term he uses twice. Zuckerberg then mentions the sophisticated AI tools the company now has, which “flag more of the content up front”, and the tens of thousands of people the company employs to “review more of” it. This wording is important. Zuckerberg plants the seed that ‘flagging’ and ‘removal’ are not two distinct things. He wants the listener to hear “flagging” and to assume that ‘removal’ follows as a matter of course. Observers seem to have fallen for that hook, line and sinker.

But we’re not buying it, and we want action. Facebook cannot continue to brandish this 99% figure as if it were a solution to everything. Quite apart from the fact that 1% of Facebook’s reach is still far too much, we cannot accept being fooled by dubious wording, and Facebook needs to ensure that illegal content does not appear anywhere on its platform.

Child pornography

Another sensitive issue is the use of the platform for the exchange of child pornography. A large investigation by the BBC in March of last year ended with its journalists being reported to the police by Facebook and plans for an interview cancelled. During its investigation, the BBC flagged 100 images which appeared to break Facebook’s guidelines. According to the BBC, they included:

pages explicitly for men with a sexual interest in children

images of under-16s in highly sexualised poses, with obscene comments posted beside them

groups with names such as “hot xxxx schoolgirls” containing stolen images of real children

an image that appeared to be a still from a video of child abuse, with a request below it to share “child pornography”

Of the 100 images, only 18 were removed. According to Facebook’s automated replies, the other 82 did not breach “community standards”.

The BBC team also reported five convicted paedophiles who had Facebook profiles to the platform. Facebook’s own rules forbid convicted sex offenders from having accounts, yet none of the profiles were taken down.

The story becomes absolutely bizarre from there. The BBC showed its findings to Anne Longfield, the Children’s Commissioner for England. With quotes in the media slamming the company’s practices, Facebook finally agreed to an interview with the BBC, “on condition the BBC provided examples of the material that it had reported, but had not been removed by moderators.”

When the BBC did so, Facebook reported the journalists to the UK’s National Crime Agency. The chairman of the Commons media committee, Damian Collins, said this move was “extraordinary – because you’re trying to help them clean up their network, from material that shouldn’t be there”.

Facebook later provided a statement. “We have carefully reviewed the content referred to us and have now removed all items that were illegal or against our standards,” it said. “This content is no longer on our platform. We take this matter extremely seriously and we continue to improve our reporting and take-down measures.” … “It is against the law for anyone to distribute images of child exploitation.” … “When the BBC sent us such images we followed our industry’s standard practice and reported them to Ceop [Child Exploitation & Online Protection Centre].” … “We also reported the child exploitation images that had been shared on our own platform. This matter is now in the hands of the authorities.”

This story, which resulted in the BBC and its journalists being investigated by the authorities for trying to uncover a culture of turning a blind eye to child pornography, doesn’t suggest the company wants to propagate illegal content. It does, however, suggest that its attitude towards becoming more accountable is not just reactive, but that action is taken only when it becomes absolutely necessary, after the fact. And perhaps, one can deduce, that the company can also be vindictive.

Wildlife

And if you think that Facebook has more work to do than removing the kind of content that may inspire a terrorist attack or that relates to child pornography, you’re right.

An in-depth investigation by Wired Magazine, published on June 5 and entitled ‘How Facebook Groups Became a Bizarre Bazaar for Elephant Tusks’, shows how Facebook is turning a blind eye to wildlife traffickers who use such groups to market and trade their wares around the world. Organisations like TRAFFIC and the Wildlife Justice Commission have fought in the past to uncover the size and scope of online trafficking.

“While the research generated some publicity, there have been no significant repercussions for Facebook,” according to Wired.

The whistle-blower at the centre of the Wired exposé filed a 94-page complaint last August alleging that, by facilitating illegal acts via its platform, profiting from them in the form of ads, and failing to disclose the risk of this type of abuse to its shareholders, “Facebook is violating Securities and Exchange Commission regulations governing publicly traded companies.”

Wired explains that “attached to the complaint are dozens of pages of exhibits with screenshots showing photos of what appears to be lion fangs sitting below Facebook posts for Google and Pizza Hut, big cat claws for sale appearing just above an ad for Uber, a smooth grey rhino horn resting on a scale.”

Just by scrolling through the pictures attached to the 6,000-word Wired article, you begin to get a real sense of the magnitude of the problem.

Back in April in the US, four hours into Zuckerberg’s testimony before the House Energy and Commerce Committee, Georgia representative Buddy Carter asked Zuckerberg if he knew that “there are some conservation groups that assert that there’s so much ivory being sold on Facebook that it’s literally contributing to the extinction of the elephant species.”

“Congressman, I have not heard that,” Zuckerberg replied.

That’s just not good enough.

Zuckerberg’s performance in the European Parliament in May was a glimpse into the way the company has handled these problems: 99% attitude and 1% show. Because occasionally, Zuckerberg will show up for that photo-op.
