It’s 1 July. Only 31 days to go. Thirty-one days of the snowballing Facebook boycott, whether in solidarity with Stop Hate for Profit or, simply and coincidentally, as a pause or suspension because enough is enough (again).
Facebook says in its defence (without pausing for breath): "We try really, really hard and will try harder still and somehow we will make this brand and social safety thing go away. By the way, if anyone’s listening, we already make loads of it go away and aren’t Twitter and YouTube every bit as much of a problem? The truth is that there are bad people who say bad things and some of them say it on Facebook."
They are right! The utility of Facebook, enjoyed by a couple of billion people and a few million advertisers, comes with a price. No-one, literally no-one in history, has come close to creating a platform for the frictionless dissemination of content that works as well as Facebook.
The trouble is that friction is sometimes essential. The obvious analogy is pain. Without pain, most humans would not survive for a month. Pain adds necessary friction to our daily lives and prevents us from touching hot stoves, drinking boiling water and walking into traffic.
Perhaps, then, Facebook needs to add friction to the system. The obvious move is to pre-screen posts, comments, shares or, indeed, any content at all before it is recommended by an algorithm, as I argued in Campaign last week.
Let’s assume, however, that pre-screening everything is a bridge too far, while pre-screening nothing leans too heavily on machine learning that has not yet learned enough.
Here’s a halfway house – three rules, sketched in code after the list:
- No group can be started without the group moderator providing authenticated identity. All in-group content would be subject to external moderation and would be "open" to those moderators, even if closed to uninvited parties.
- Algorithmic content recommendation would be category-specific and allowed one category at a time, starting with (say) sports, arts, food, travel or lifestyle, and subject to constant review. News organisations would require specific and undoubtedly contentious vetting – more friction. Facebook would make those decisions with its Oversight Board and have to defend them.
- Suspend live streaming as a public utility and restrict it to the vetted content providers described above.
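Taken together, the three rules are just gates on three actions: starting a group, being recommended and going live. As a purely illustrative sketch – every name, type and check below is hypothetical, not a description of Facebook’s actual systems – the gating logic might look like this in Python:

```python
# Illustrative sketch of the three "friction" rules as gating logic.
# All names and checks are hypothetical, not Facebook's real systems.

from dataclasses import dataclass

# Rule 2: recommendation opens one vetted category at a time.
ALLOWED_RECOMMENDATION_CATEGORIES = {"sports", "arts", "food", "travel", "lifestyle"}

@dataclass
class Account:
    identity_authenticated: bool  # Rule 1: authenticated identity
    is_vetted_provider: bool      # Rules 2 and 3: passed the (contentious) vetting

def may_start_group(moderator: Account) -> bool:
    """Rule 1: no group without an authenticated moderator, whose
    in-group content stays open to external moderation."""
    return moderator.identity_authenticated

def may_recommend(content_category: str, author: Account) -> bool:
    """Rule 2: algorithmic recommendation is category-specific;
    news requires specific vetting."""
    if content_category == "news":
        return author.is_vetted_provider
    return content_category in ALLOWED_RECOMMENDATION_CATEGORIES

def may_live_stream(account: Account) -> bool:
    """Rule 3: live streaming is no longer a public utility;
    only vetted providers keep it."""
    return account.is_vetted_provider

if __name__ == "__main__":
    anon = Account(identity_authenticated=False, is_vetted_provider=False)
    vetted = Account(identity_authenticated=True, is_vetted_provider=True)
    print(may_start_group(anon))          # False: identity friction
    print(may_recommend("sports", anon))  # True: open category
    print(may_recommend("news", anon))    # False: unvetted news
    print(may_live_stream(vetted))        # True: restricted utility
```

The point is the shape, not the code: each gate is trivial to evaluate, and each adds friction precisely where the potential for harm concentrates.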
The imposition of these three "rules" would mean that groups are no longer a place to hide, that sharing is far harder to game and that the invitation to infamy is withdrawn. At the same time, closed communities can exist and prosper, and legitimate safe content is promoted.
The promise of the internet was to shine light into dark places; it was never to create a libertarian utopia. Facebook may have earned its power, but its behaviour over the years has cast reasonable doubt on its motives. With power comes responsibility. In this case, that responsibility is a requirement for friction in the system.
Facebook can’t get to "zero" in the incident count, nor can Twitter or YouTube; there are indeed bad people who do bad things. For some advertisers, other third parties and some media, that may be a "forever" problem, but for many the controls above will represent real change.
With that comes the opportunity to switch the lights back on in a safer environment – safer for business and safer for the public.
Rob Norman is the former chief digital officer at GroupM