Facebook may be 'pivoting' to something worse
Over the past few months, Mark Zuckerberg has spoken at length about his grand plan for fixing Facebook.
In short, it involves “pivoting” - as they say - to a more private social network, one which focuses on closed spaces, like groups and messaging, rather than the public News Feed.
He unveiled this plan in March, a year after the Cambridge Analytica scandal hit.
At the time, I noted that critics were concerned that the shift would mean Facebook was abdicating some of its responsibilities. Making Facebook more private would arguably not remove the problems of abuse - though it would make it harder for outsiders to find instances of Facebook’s failures.
Recent stories have demonstrated that concern was perhaps justified.
On Monday, ProPublica revealed the existence of a private Facebook group which contained disturbing jokes allegedly posted by US Border Patrol agents.
The investigative site said comments included mockery of migrants who had died in custody, as well as aggressive, sexist remarks about prominent female politicians. The group has existed for more than three years and has almost 10,000 members.
A Facebook spokesperson told the BBC: "We want everyone using Facebook to feel safe. Our community standards apply across Facebook, including in secret groups. We're co-operating with federal authorities in their investigation."
Separately, a report last month from California-based investigative group Reveal exposed groups in which police officers, from more than 50 departments across the country, shared racist memes, Islamophobic material and conspiracy theories.
And the Washington Post detailed a flurry of groups offering bogus cancer treatment “advice”, such as to “use baking soda or frankincense” instead of chemotherapy. These groups are allowed to flourish - the Post reported at least two with more than 100,000 members.
Facebook said it provides related news stories alongside posts that might contain misinformation, but no statistics are available on how effective this measure is.
(Facebook has, however, banned some women who shared images of their mastectomy scars as an act of solidarity with, and encouragement for, others facing their own battles with cancer.)
Hidden from view
What makes these examples of abuse more significant than what we’ve seen in the past? They show how Facebook’s strategy can push its problems into the shadows.
ProPublica was only able to observe the Border Patrol group because a source sent it screenshots - otherwise it was entirely hidden from view.
Reveal had to use specially written software that cross-referenced members of hate groups against users signed up to legitimate pages about police work.
The Washington Post reporter was able to access some groups, but was swiftly banned and blocked when it became clear who she was.
Even Facebook finds it more difficult to hold itself accountable when it comes to groups.
The site has said its ability to use algorithms and AI to detect hate speech and misinformation still falls short, meaning it relies heavily on users reporting inappropriate content.
In groups, this of course becomes far less likely: the inappropriate content is the reason people joined in the first place. And Facebook has shown limited willingness to proactively look for these kinds of abuse itself.
Groups have, of course, been a feature on Facebook since the early days. But never before have they had such prominence.
Facebook, as directed by its leader, is aggressively pushing people to use groups more often. There’s an advertising campaign - which includes hand-painted murals - and a new button placed front and centre in its mobile app. Private is the new public.
“This vision could backfire terribly,” warned French journalism professor Frédéric Filloux in 2018. “An increase in the weight of 'groups' means reinforcement of Facebook’s worst features - cognitive bubbles - where users are kept in silos fueled by a torrent of fake news and extremism.”
Make no mistake: few, if any, of the problems Facebook is “working hard” on at the moment would have come to light were it not for external pressure from journalists, lawmakers, academics and civil rights groups.
The examples I’ve raised here pose a question: is Facebook fixing itself, or merely making it harder for us to see that it's broken?