Against 'Against Facebook'
A highly speculative attempt at redefining our collective hatred and its lightning rod
There are few things that people agree on today. Everyone's extremely polarised. Whole countries' worth of people are worried about civil wars. Rhetoric is divisive. Norms are torn to shreds. But one thing seems clear in this confusing universe: Facebook is evil.
In fact it's hard to find a positive take on Facebook that isn't authored by a certain Zuckerberg. He's hated so much that when he personally donated $70 million to a hospital, in exchange for putting his name on it, the city's supervisors voted, somehow, to condemn the decision (they took the money though; moral high grounds don't pay enough). And that's not because they hate his family, or philanthropy at large, but because of Facebook.
So this is my attempt at a contrarian take. A few concessions upfront: I'm not arguing they're saints. I'm sure they've bent the law as much as any corporation, and that they haven't gone overboard investing the billions they earn into the things-people-say-they-should-do.
And a secondary bias - I don't use Facebook much and I don't have an Instagram account. The two Facebook properties I do use though are WhatsApp (because I need to talk on something) and Portal (try making a toddler talk on the phone, it's like chasing chickens).
To start with, it's always amused me that Facebook and Google are sometimes considered as powerful as nation states, thanks to their giant user bases and billions in cash, yet are simultaneously looked at as so fragile they'd be scared of threats from tiny startups they'd overpay to squash.
Secondly Facebook has somehow ended up in a position that's almost by definition unwinnable. But it's also not a fair position. For instance, the asks of Facebook are:
1. They should seamlessly connect the billions around the globe
2. They should create spaces for people to meet each other and connect (a digital town square) and have private and public conversations
3. They should offer ad targeting so precise you feel like you're being violated, while not overcharging for ads (i.e., driving advertisers' costs down to the lowest common denominator)
4. They should drive all inflammatory, pornographic, violent, <insert other horrible characteristic> content out and make it impossible for anyone to post it
5. They should ensure steadfast user security, especially over private data, and make it impossible for anyone else to manipulate users
There needs to be a social law, the equivalent of Arrow's Impossibility Theorem, to show that it's impossible to satisfy this group of demands.
For instance, 5 runs afoul of 2 and 3. If you can't talk to others or get messages from everyone, then that's not Facebook. And 4 runs contrary to 1: if you have 2.5 billion people, a third of the world's population, using your platform regularly, with all the permutations of their posts and comments and shares, how the hell can you police that?
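To put rough numbers on the policing problem, here's a back-of-envelope sketch. Every per-user and per-review figure below is a hypothetical assumption of mine, not a reported Facebook statistic; only the 2.5 billion user count comes from the essay.

```python
# Back-of-envelope check on moderating everything at Facebook's scale.
# All rates are illustrative assumptions, not real figures.

users = 2_500_000_000            # monthly active users (from the essay)
items_per_user_per_day = 1       # assume one post/comment/share per user per day
seconds_per_review = 10          # assume a moderator needs 10 seconds per item
work_seconds_per_day = 8 * 3600  # one 8-hour shift

items_per_day = users * items_per_user_per_day
reviews_per_moderator = work_seconds_per_day // seconds_per_review  # 2,880 items/shift
moderators_needed = items_per_day // reviews_per_moderator

print(f"{items_per_day:,} items per day")
print(f"{moderators_needed:,} full-time moderators to review every item")
```

Even with these generous assumptions (one item per user per day, ten seconds per judgment, no appeals, no context, no languages), you end up needing a workforce the size of a large city, just to glance at everything once.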
Also, the ability to do 2 is the whole reason the damn thing grew. If you had to go through friction, this would've been stopped in its tracks.
And if your argument is that they did things to grow the company, well, that's kind of their job. They're not meant to also shoulder the moral responsibility for the unintended consequences of their platform a decade out. That's asking way too much of any company! Sure, we can and should provide guidelines on what's appropriate and what's not. No strip-mining Central Park, for example. But those guidelines have to be clear and relatively unambiguous. No-doing-bad-things seems ambiguous. No-making-things-that-have-negative-externalities is worse.
Nick Bostrom explores this in a different way. Let's assume that machines can surpass humans in intelligence. What will happen to our conflicts then? Today we all live in a semi-stable equilibrium where adversarial selection works to our advantage. We set up whole countries to run that way. Our legal system runs that way.
In the future, when AI is superintelligent, will we be able to specify what we want well enough that it won't find a loophole? Considering Goldman Sachs finds loopholes practically as a business model, with everyone looking at them and praising them, that doesn't seem particularly likely. If they can outwit our collective societal intelligence on a regular basis, what chance do we have against a superintelligent AI?
In a different fashion this shows the problem with the situation as it is. There's a complex mess of laws, rules, norms and expectations that underlie our interactions. If you want to do anything at scale, as you must when 2.5 billion people use your product, then you have to put those laws, rules, norms and expectations into code. And that's just impossibly hard to do.
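To make that concrete, here's a toy sketch of what happens when you encode even the simplest norm ("don't insult people") as code. The rule list and examples are entirely hypothetical and nothing like a real moderation system, but the failure modes scale up with the sophistication.

```python
# Toy illustration: a norm ("no abusive language") naively encoded as code.
# The banned list and examples are hypothetical, not any platform's actual rules.

BANNED_SUBSTRINGS = ["ass"]  # a crude attempt at encoding the norm

def violates_norm(post: str) -> bool:
    """Flag a post if it contains any banned substring."""
    text = post.lower()
    return any(bad in text for bad in BANNED_SUBSTRINGS)

# The norm sounds unambiguous; the code isn't:
print(violates_norm("You're an ass"))             # True  - the intended catch
print(violates_norm("I teach a classics class"))  # True  - false positive
print(violates_norm("You utter a$$"))             # False - trivially evaded
```

One line of policy becomes three problems: what you meant to catch, what you catch by accident, and what slips through anyway. Now multiply by every norm, every language, and every jurisdiction.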
So let's look at the larger societal accusations.
Facebook has broken democracy
The argument for, as far as I can tell, is that trolls fed misinformation into the Facebook system, where citizens took that misinformation and ran with it, thinking it was information.
Quoting Prof Sir John Curtice, "You've got to go back and think of the big story. Put it like this, if you are arguing 400,000 leaflets could have made a difference to the UK's membership of the European Union, that strikes me as a very good way of demonstrating the fragility of that membership."
I'm not sure where the problem lies. I agree that it's bad for people to believe wrong things, I just don't have an easy way to fix it.
Maybe we can have "better journalism" to solve it? Didn't work in the past so I'm not sure why it'll work now. For a book length treatment of the problems with that thought see The Truth.
Perhaps it's the fact that Facebook allowed people to buy ads targeted at certain people in certain jurisdictions that subverted election laws in those countries? Very fair. But the solution would be for Facebook to now be in charge of ensuring accuracy of its advertisements according to the separate policies in 200+ countries.
If that doesn't destroy the ability of every silicon valley startup to acquire users easily I'm not sure what will.
Or maybe it's that Facebook allowed known false speech from elected officials? But again, it's not Facebook's fault that some politicians lie. Are they now shouldering the editorial responsibility for the whole public discourse? Surely if there's ire to be spent, it should be on those that are lying in the first place.
Facebook sells your data
For one thing, no, they don't. Not selling your data is their whole business model, in fact: I have your data and nobody else does. Otherwise why wouldn't the advertisers just buy the data outright instead of paying Facebook for ads?
Also, there are actual companies out there that do sell your data. They're called data brokers, and there are a fair few of them. One of the larger ones, Acxiom, claims “more than 62 countries, 2.5 billion addressable consumers and more than 10,000 attributes—for a comprehensive representation of 68 percent of the world’s online population.” LiveRamp sold Acxiom to Interpublic, one of the largest ad agencies, for $2.3 billion.
It’s worth mentioning again that the reason Facebook is valuable is not because they sell data. It’s because they make something that makes 2.5 billion people visit their sites a month! Their evil is made real because people across race, creed, religion, gender and politics opt in.
Facebook makes you sad and depressed
2.5 billion MAUs. Which is 1/3rd of the whole world. And in the world we see:
Incidence of suicide - 1.4% of deaths worldwide in 2017, c.1.5m people a year in 2020
Incidence of depression - 264m people worldwide
None of this is new with Facebook. Suicide and depression are, unfortunately, part of the human tapestry, and an unregulated output of whatever means we currently use to talk to each other.
Because on the internet there are no physical barriers, as there are in the real world, to stop intruders or to filter messages.
Maybe the trouble is not that it happens, but that Facebook makes it happen at a much larger scale. Sure, but is the bullying problem that you're being harassed by a random person in Bolivia? Or is it your same high school assholes repurposing your newsfeed?
Once again it's absolutely a problem. But it's a bigger problem than Zuck not wanting to fix it.
And even if Facebook does make you depressed, surely that wasn't part of their Series B fundraise deck. It didn't show up in the S1. It's not their stated purpose. Should we regulate them? Sure. Should we tell our kids to use it less? Absolutely. But there's a difference between stopping kids from using things they don't know how to use responsibly yet (as if anyone does!) and calling them evil.
There used to be a happy idyllic time (pre-2005) when the world was divided into nice little zones, each with its own rules and dominion within. Everyone was ecstatic then. I'm sure honey tasted sweeter and milk was made by happy cows.
Sure there were companies that sold things across borders, but they were things for the most part. And things have to move in physical space. So the borders remained sacrosanct.
Information was freer, and Google made out okay, but because most information is searched for, the issue was slightly less relevant.
But now, we have large groups of the world, spanning countries and continents, sitting in one place. And what's more, for a couple thousand bucks you can show them your message!
In a dystopian way, this fulfils the promise of the internet, that we can actually reach everyone.
Facebook have been trying desperately to make people like them throughout this. They've said mea culpa, repeatedly. They've changed their API. They've deleted millions of accounts. They've suspended tens of thousands of apps.
So, would it be better for everyone if social media didn't exist? It would make starting companies much harder, it would change career trajectories for a fair number of people (mostly negatively), it would substantially increase our advertising spend, and there would be fewer cat videos on the internet.
There would also be less cyberbullying (but not less bullying overall), one less vector for suicide, one less vector for depression, and one less place to get your poorly informed political ideas from.
But the Daily Mail will still exist. And Fox News will still be broadcast. That won't change. I don't see Rupert Murdoch getting nearly the same attention, even though he's arguably used an existing medium where the norms and practices are far better understood.
Genies, once let out, don't go back inside the bottle easily.
In the late 19th century there used to be between 6 and 12 mail deliveries a day, the equivalent of a rather slow email system. The postal service was on fire!
You could get multiple news deliveries; newspapers were despatched through pneumatic tubes, with the first trial happening in 1863.
We've always been suckers for faster information and not particularly great at either filtering or assessing said information.
This isn't a new problem, or a Facebook problem, it's a human problem.
And if we choose to look at it as a specific instance of a bad actor acting badly, then we're not going to be able to actually do anything meaningful because we're solving the wrong problem. And this is true even if the bad actor was actually behaving badly. That’s why the essay isn’t titled ‘For Facebook’.
There's a group of people who, when told that Bill Gates is donating billions to solve malaria, respond that it won't wash his red, red hands clean of the sins of Microsoft. But maybe instead of taking the easy way out by hating Zuckerberg and calling Facebook Satan, it's worth thinking about how we should live as a society in an age where borders don't apply to information.
We've been asking them to please solve all of humanity's communication problems, please, and soon if you can, with no leakage. Bear in mind that human communication, in all its varied formats, has been a constant source of debate for as long as there have been people. We amend our laws regularly. There's a reason we haven't stopped with the Ten Commandments. Every corner case that emerges requires a fix, and that's how we slowly build up our norms and institutions.
It's not going to get fixed with an algorithm and a wish.