Greedy Algorithms And The Need For Illegibility
On Sauron's Eye and the need for wider opacity
The primary thing when you take a sword in your hands is your intention to cut the enemy, whatever the means. ... If you think only of hitting, springing, striking or touching the enemy, you will not be able actually to cut him. — Miyamoto Musashi, The Book of Five Rings
To avoid losing a piece, many a person has lost the game
Edit: A better metaphor might have been the Eye of Sauron of public attention, and the myriad ways to escape it through secrecy, including the One Ring of reduced process transparency. Highlighted by George.
One of the few things that almost everyone I know agrees on is that reading the news has become a chore. Those who have stopped are glad they did. Those who've cut back wish they could cut back more. And those who haven't yet wish they could. It's about as common a consensus as exists today.
And the reason, to a large extent, seems to be that news outlets effectively trade in sensationalism over any kind of journalistic fidelity to the "truth". Whether or not this is atypical, historically speaking, it does make me think that letting information free-flow around the world at light speed could have some downsides.
Let's start with Matt Yglesias' idea of Secret Congress. He says:
progress in these areas, if it happens, will tend to come from a handful of members on different sides of the aisle getting interested in the topic and working quietly with other members to make deals to make it happen
He describes this as an instance of "pulling the rope sideways", Robin Hanson's suggestion for making the highest-impact move.
The belaboured metaphor aside, this makes a hell of a lot of sense. While everyone else is fighting over hugely inflammatory, crazily complicated theories around immigration or urban crime or the Green New Deal or face masks, a few folks get together quietly and get real work done.
His argument is that coding problems as highly partisan is what gets them into trouble. Republicans won't pass anything that's seen as too Democratic, and vice versa. So the ACA gets called Obamacare and gets shot down, even though it was copied from Mitt Romney's plan.
This pattern points to a broader heuristic for how certain actions get classified as highly partisan: we're acting as greedy algorithms, trying to optimise not just the goal but every single step towards achieving it.
Greedy algorithms are not great to work with. They excel at local, stepwise optimisation, to the detriment of the global goal. It's the inability to understand that occasionally you need to take suboptimal steps now to reach an optimal outcome later. It's something to watch out for.
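A minimal sketch of the failure mode, using a toy example of my own (not from the post): making change with coins of 1, 3 and 4. The greedy algorithm grabs the biggest coin that fits at every step and ends up using more coins than an approach willing to make "worse" early picks.

```python
def greedy_change(amount, coins=(4, 3, 1)):
    """Locally optimal: always take the largest coin that still fits."""
    used = []
    for coin in coins:  # coins sorted largest-first
        while amount >= coin:
            amount -= coin
            used.append(coin)
    return used


def optimal_change(amount, coins=(4, 3, 1)):
    """Globally optimal via dynamic programming: best[a] is the
    shortest coin list summing to a, built up from smaller amounts."""
    best = {0: []}
    for a in range(1, amount + 1):
        candidates = [best[a - c] + [c] for c in coins if a - c in best]
        if candidates:
            best[a] = min(candidates, key=len)
    return best.get(amount)


print(greedy_change(6))   # [4, 1, 1] -- three coins
print(optimal_change(6))  # [3, 3]    -- two coins
```

Greedy's first move (take the 4) is the best single step, yet it forecloses the better overall answer. Passing on the locally best option is exactly the kind of move a step-by-step optimiser never makes.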
That's also what the application of Musashi to other parts of life gets wrong. Not every action should be about cutting. Sometimes it should be about setting up a move several steps down the line. Not every move can or should be about not losing anything.
And yet this is also exactly what we seem to be doing in most public settings.
Take masking, for instance. It should've been innocuous; at one point it even had plenty of bipartisan support. Since it wasn't something that could be immediately cast in a highly partisan light, the Secret Congress theory suggests it should've passed through under the radar, but for the fact that a spotlight was suddenly put on it.
And when a spotlight is put on it we act as downside minimising Musashis, trying to win with every cut, and become greedy algorithms.
But this is in no way steeped purely in politics. The same thing applies to Apple when they decided to scan your phone for child abuse images, or Facebook when they thought Instagram for kids was a good idea, or Amazon after they went through the elaborate process of choosing where to put their second HQ.
Whatever gets public attention gets pilloried. Whatever issue gets the limelight becomes the next nexus of controversy.
Part of the reason is that this is just a change in scale, not a change in scope. We can see politicians and businessmen get pilloried, and completely unjust public witch hunts in pretty much every era we care to look at. I'm sure if we had enough parchments to pore over or hieroglyphics to decipher we'd see "this pharaoh is a moron, dams kill crocodiles" arguments there too.
But it seems to happen more today, because we've all become greedy algorithms. The difference today is the ability to be instantaneously aware of every single utterance of every single conversation, in real time. And as much of a blessing as this is, it's also a curse.
If you're a CEO, being constantly interrupted during the decision-making process is a great way to lose your train of thought and get frustrated.
For instance, I think the hyperloop is really cool and completely, utterly impractical. Elon Musk's idea of boring tunnels under cities so single cars can drive through them single file seems lunacy.
But I also know that asking Elon to defend his reasons, his ethics and his business plan at the same time is unworkable. There are just too many unknowns. The best thing to do is to just check if it'll have some drastic downsides, and if not to just try it. Sometimes you just need to let someone try something, and then you can jump in and criticise. We don't need to have an opinion every single step of the way.
If the steps along the way are all seen to be verboten to tread on, where every single sword swing has to be a killer swing, we end up in this dumb game of jousts.
We supplement a lack of objective clarity with an overabundance of detail.
Politics got worse even as we started getting ever more information about the actual process of governance. Until recently, information had meant input: something you could act on.
But here information comes without the possibility of action, which meant the only action one could take is vociferously shouting into the void. As soon as that void took 140 character shape it got embedded into the media ecosystem, which dutifully amplifies all voices it deems powerful and interesting.
If we want competent bureaucrats or technocrats or businessmen or economists or laypeople to actually solve any of our problems, it would truly be helpful for them to be given the space to try, rather than be consistently heckled from the cheap seats.
If we ask that every plan be highly legible not just in its goals and broad outline but in its every execution step, we are setting ourselves up for failure. Legibility is hugely helpful: Byrne makes a strong case that it's how the tech world helps us see like a state, and that the increased legibility it brings helps us achieve much higher scalability.
But prematurely demanding legibility, to know and debate the purpose of each step as a group, is a great way to kill most ideas and most plans. Just as "explain to me exactly how you're going to grow next year", taken to the nth degree, is a great way for a VC to kill a startup. At best it can work as a forcing function to think, not as a way to actually get a plan.
We hate it when this happens in real life. If someone were doing it to you, you'd call it micromanaging or backseat driving. And yet we're extremely happy to do it to any and all public plans. One could call it Monday-morning quarterbacking, but it's worse than that: it's quarterbacking the quarterback while he's trying to throw the ball.
Until recently we didn't have the transparency to see someone else's action plan at every step. Now every misstep the government makes is further evidence that it sucks. Every hack of Facebook is evidence that Zuck is a profit-seeking space lizard. Every rise in bitcoin is an indication that the USD is going to collapse. Every negotiation by a politician is a sellout to corporate interests, and every conversation with a journalist is a way to make the pithiest case possible in an instant.
We're all day traders in information these days. We probably should acknowledge it and be wary of it.