I.
A lot of people seem to hate Silicon Valley with a passion. Recently I commented on a post by Tyler, a writer at The Atlantic, suggesting that tech isn’t monomaniacally bad, and got hit with a barrage of comments about how I’m wrong. Technologists deliberately engineered our addiction to social media and smartphones, they said. They forced advertising on us, to the point that you have to use ad blockers just to get on the internet[1]. They never asked us for permission before building driverless cars.
This is a common theme in arguments against tech. The anger seems to stem from the fact that we live in a world of tech, immersed in it, without ever having had a say in the matter. It’s the water we swim in. And within it there are precious few choices but to use Google, to use Meta, to use AI provided by OpenAI or Google. Want to book a flight? Research something for work? Talk to a friend? Call your parents? Check in on a sleeping baby? All of it needs tech in the Silicon Valley sense.
The immediate counterargument is that you could just not use it. You could “exit”, in the Hirschmanian sense. But that’s not a realistic possibility in 2025. Even the remote tribes who haven’t seen other human beings in centuries aren’t safe. So you’re left with “voice”: to complain, to make your grievances heard.
And why wouldn’t you? The anger is that people had no choice or say in the matter. If you want to do any of those things, many of which didn’t exist before 2000 by the way, then you have no choice but to use one of the megacorporations that rose up in the last couple of decades. And so you hate the technologists who built the thing.
Silicon Valley’s real sin here isn’t addiction or monopoly per se[2]; it is draining long-standing frictions from daily life so hard and fast that hidden costs pop up faster than society can patch them.
If you make something easy to use, people will use it more. And as you make it easier still, other constraints emerge, including the constraint that it becomes much harder for users to cognitively deal with the result.
The other choice was to not use the tech at any point in its inexorable rise. But that was a coordination problem, and people suck at solving coordination problems[3]. These alternatives are nice to imagine, but lest we forget, we collectively turned our noses up at paid versions of browsers, social media, search engines, forums, blogs, literally anything that had a free-plus-ads alternative. Because nobody loves a paywall. As Stewart Brand put it so well:
Information wants to be free
Much as it’s ridiculed, including by me at times, Silicon Valley does try to build things that people want, and people want their lives to be easy. That’s why every annoyance your parents had to deal with has been cut down to something you can solve with a swipe on your phone today. Yes, from booking tickets to researching topics to coding to talking to friends to checking in on your sleeping baby.
Silicon Valley finds every way imaginable to remove frictions from our lives. Every individual actor in tech works independently to find the next part of life that has any demonstrable friction and remove it, from finding love with a swipe to outbound sales enablement. Startups try to build on it, large companies try to capitalise on it, and VCs try to fund it. YC literally has it as their motto: ‘Make Something People Want’.
That’s how Google and Facebook built an advertising empire: advertising let them give us what we wanted for free. We demanded it. And embedded network effects and compounding investments meant they could grow without anyone else able to compete with them, because who can compete with free?
The logic is as follows. Tech tries to make things easier to use. The easier things are to use, the more we use them. When we use them more, we get a supply glut and, often, centralization, because giving things to us for free requires enormous scale. That outcompetes everything else. Which means they become utilities. Which means there is no competition. Which means they will not compete on the things you might consider important. Which means when they make decisions, you feel like you do not have a say. Which means you feel alienated, and lash out.
II.
In any complex system, when we remove a bottleneck the constraint moves somewhere else. This is true in operations, like when you want to set up a factory. It’s also true in software engineering, when you want to optimise a codebase. It’s part of what makes system-wide optimisation really difficult. It’s Amdahl’s law: “the overall performance improvement gained by optimizing a single part of a system is limited by the fraction of time that the improved part is actually used”.
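Amdahl’s law fits in one line: if a fraction p of the work is sped up by a factor s, the overall speedup is 1 / ((1 − p) + p/s). A minimal sketch, with purely illustrative numbers:

```python
def amdahl_speedup(p: float, s: float) -> float:
    """Overall speedup when a fraction p of the work is sped up by factor s."""
    return 1.0 / ((1.0 - p) + p / s)

# Speed up 95% of a system by 20x: the overall gain is only ~10x.
print(round(amdahl_speedup(0.95, 20), 2))    # 10.26

# Make that 95% essentially instant and the untouched 5% still caps you at 20x.
print(round(amdahl_speedup(0.95, 1e12), 2))  # 20.0
```

The untouched fraction is exactly the relocated constraint: however hard you optimise one part, the remainder sets the ceiling.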
To optimise, you have to automate. And the increase in supply that reducing friction brings is the defining feature of automation; it always creates new externalities. On AI-assisted coding, Karpathy had a tweet about the problem of LLMs providing plenty of support for writing code but not enough for reviewing it. In other words, they take away too much friction from one part of the job and add it to another. You should read the full tweet, but the key part is here:
You could say that in coding LLMs have collapsed (1) (generation) to ~instant, but have done very little to address (2) (discrimination). A person still has to stare at the results and discriminate if they are good. This is my major criticism of LLM coding in that they casually spit out *way* too much code per query at arbitrary complexity, pretending there is no stage 2. Getting that much code is bad and scary. Instead, the LLM has to actively work with you to break down problems into little incremental steps, each more easily verifiable. It has to anticipate the computational work of (2) and reduce it as much as possible. It has to really care.
As it's easier to create content it becomes harder to discover it and even harder to discern it.
She wrote a wonderful essay on this topic, discussing how friction is effectively relocated from the digital into the physical world as we move into a simulated economy where friction, like gravity, doesn’t apply. You could think of it as a ‘Conservation of Friction’: it gets moved from the digital realm to the physical one. Or at least, our obsession with reducing friction reduces it in one place but doesn’t eliminate it elsewhere. She wrote:
But friction isn’t the enemy!!!! It’s information. It tells us where things are straining and where care is needed and where attention should go.
And it's not all bad news. Because friction is also where new systems can emerge. Every broken interface, every overloaded professor, every delayed flight is pointing to something that could be rebuilt with actual intention.
But it’s not just that; the question is why the removal of friction caused such widespread dismay this time around.
III.
Now, the story of most technological and economic revolutions is also the story of reducing friction. Consumers demand it. The history of humanity is one of a massive increase in capability, innovation and growth!
Reduction in friction means we increase the volume of what’s being supplied. That increase in volume can even produce a winner-take-all market when there are network effects, as there are with anything touching human preferences. Which changes the nature of the market: when supply becomes much easier, demand shifts too.
Now, led by AI, we’re at a historic height of friction reduction[4]. Look at education. Clay Shirky writes about the incredible change in education brought about by ChatGPT. Students write papers instantly (“I saved 10 hours”) and learn far less, since they no longer go through the tedium of research, discovery or knowledge production.
While students could also use it to speed up the process of learning new things, and many do, they’re caught up in a Red Queen race. “You’re asking me to go from point A to point B, why wouldn’t I use a car to get there?” as one student said.
"I've become lazier. AI makes reading easier, but slowly causes my brain to lose the ability to think critically."
This is the problem in a nutshell. We can’t stop ourselves from using these tools because they help us a lot. And we get hurt by using them because they steal something of value in the process. You have to figure out how much to use the tool even as you use it, and build that price signal internally. Eating a thousand cookies would have external manifestations you could use to guide your behaviour; what about a thousand Instagram reels?
And you can’t opt out of it.
Our obsession with reducing friction reduces it in one place but doesn’t eliminate it elsewhere. We agree to these externalities through collective inaction. Everyone adopting is the Nash equilibrium: individually rational, perhaps collectively costly.
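The adoption trap has the shape of a prisoner’s dilemma. A toy sketch with made-up payoffs (the numbers are purely illustrative, not measurements of anything): adopting the frictionless tool is the best reply whatever the other person does, so everyone adopts, even though everyone abstaining would leave both better off.

```python
# payoffs[(my_action, their_action)] = (my payoff, their payoff); hypothetical numbers
payoffs = {
    ("abstain", "abstain"): (3, 3),  # everyone keeps the old frictions, but also the old life
    ("abstain", "adopt"):   (1, 4),  # you abstain alone: left behind
    ("adopt",   "abstain"): (4, 1),  # you adopt alone: pure advantage
    ("adopt",   "adopt"):   (2, 2),  # everyone adopts: the equilibrium we live in
}

def best_response(their_action: str) -> str:
    """My payoff-maximising reply to a fixed action by the other player."""
    return max(["abstain", "adopt"], key=lambda a: payoffs[(a, their_action)][0])

# Adopting dominates no matter what the other player does...
assert best_response("abstain") == "adopt"
assert best_response("adopt") == "adopt"
# ...yet mutual adoption pays (2, 2), worse for both than mutual abstention's (3, 3).
```

By symmetry the same holds for the other player, so (adopt, adopt) is the unique Nash equilibrium, and no individual deviation can get us back to (abstain, abstain).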
IV.
So what can one do, but complain about the existence of these technologies, and bemoan their very existence, dreaming of a simpler time?
Every time someone complains about how they’re addicted to Twitter and wishes they could lock their phone away for a bit, they’re hoping for a world where the extreme ease and affordances that modernity has brought us are pulled back just a little bit. To go back to a simpler time.
But we can’t. We’re in this collectively. There is no market of one. The choice we’re not given is the choice to take certain choices off the table. Where’s the setting on Instagram for “turn this off for 2 hours unless I finish my work first”? In the relentless competition that reducing friction brings, there is no place for a tool that adds intentional friction.
These stories are everywhere. It used to be that the way you applied for a job was to know a guy or maybe even to get a guy to send a letter to another guy on your behalf. We even used to get PhDs that way.
And then it got easier to apply for jobs. You had online portals. You had middlemen. You had resumes that would get sent in, seen by screeners, seen by HR, vetted against a set of criteria, and then you got an interview. Many rounds of interviews. Much more efficient.
Except for when everyone figured out how to do it and started sending in resumes en masse, causing incredible chaos in the system. It’s Jevons paradox with a vengeance! The internet supercharged this. And as a result we’re in a situation where people routinely apply for hundreds of jobs and don’t get a callback, and the only way to get a job is to know someone.
The increase in supply brings with it new costs - more cognitive load, and more search costs.
That’s why we started telling jobseekers they need a personalised resume and a personalised cover letter, trying to find a way to get candidates to put effort in. Same for college applications.
Until AI entered the picture.
A “barrage” of AI-powered applications had led to more than double the number of candidates per job while the “barrier to entry is lower”, said Khyati Sundaram, chief executive of Applied, a recruitment platform.
“We’re definitely seeing higher volume and lower quality, which means it is harder to sift through,” she added. “A candidate can copy and paste any application question into ChatGPT, and then can copy and paste that back into that application form.”
And why is this a particular problem? Because search costs are too high!
Cinder is part of a growing list of US-based tech companies that encounter engineering applicants who are actually suspected North Korean nationals. These North Koreans almost certainly work on behalf of the North Korean government to funnel money back to their government while working remotely via third countries like China. Since at least early 2023, many have applied to US-based remote-first tech companies like Cinder.
The fact that it’s North Koreans social-engineering their way into jobs is not the most pertinent part here, though it is darkly funny. The point is that we created a frictionless experience and as a result are dealing with a supply glut we have no easy way to solve. Now we have to find a way to automate dealing with the supply glut, which will create new loopholes, which we will then have to automate away, which will …
Every process you find that works is a secret that you get to exploit. It works until it is no longer a secret. When it’s no longer a secret and everyone is happy to do the same thing to succeed you can no longer rely on that strategy. Helping solve frictions that we had before creates new ones that we have to learn to contend with.
There are so many examples of this, but here’s one more that I like. Steve Jobs once talked about the importance of good storytelling, given the limits of animated movies.
In animation, it is so expensive that you can't afford to animate more than a few % more than it's going to end up on screen. You could never afford to animate 10x more. Walt Disney solved the problem decades ago and the way he solved it was to edit films before making them: you get your story team together and you do storyboards.
Animation, even CGI, used to be much harder to do. Which meant that directors and storytellers had to work very very hard to figure out where to use it. They had to be careful. The story came first.
This is no longer a constraint. CGI is easier and cheaper, so more widely used. Now a typical Hollywood studio spends more time in post-production than in pre-production and shooting combined. Cheap flexibility led to a supply glut. The constraints changed.
V.
Which brings us back to tech and its obsession with reducing friction. Part of that is market forces, but that is because we as consumers and users hate friction. We say we like the alternative and dream of Thoreau, though we’d rather post about dreaming of being Thoreau on Instagram than actually go waldenponding.
But we can’t exit the digital world, not easily, so we feel stuck. And unlike with conveyor belts or software engineering, when we reduce the friction of our own demands, the new bottlenecks it introduces aren’t easily visible, nor easily fixable. How do you deal with the fact that we now get hundreds of messages from multiple apps, deluged in incoming information that swamps our ability to process it?
Removing friction changes human behaviour. And it’s hard to deal with the consequences of that change in behaviour overnight. But we do learn; we learn to counter them, and build the next generation of fixes. Whether that’s soot from the industrial revolution polluting our roads or advertisements polluting our information.
The reason we feel the urge to lock our phones away is that we’re not used to having a constantly-on, always-aware portal into the entire world readily available. The reason vibe-coding leads to review fatigue, getting 30 or 300 PRs a day instead of 3 and being buried under them, is that we’re not used to doing that yet. The reason using AI to write leads to high-profile hallucinations is that we still haven’t learnt how to use it better.

Friction is how we know where to focus next. That’s why we dislike it as users even if some of it might be good for us. That’s why Silicon Valley tries to eliminate it. And that is what spurs criticisms like Tyler’s. Friction’s existence is an indication of new ecological niches being created within our demandscape. It’s not conserved, but neither is it eradicated. It moves. It hides. It finds new places to make itself known. And that’s how you learn where to focus next.
And until people learn to catch up with a frictionless existence, we try to add friction back into our lives to make them livable. To deal with friction seeping out of easy communication and into heavier cognitive load. We already do, for some things. Touching grass. No-meeting Wednesdays. Setting ‘Away’ on Slack. Saying you’ll only check your email at 4 PM.
Every one of these can feel like an individual imposition in a world that tech built, a small piece of personal rebellion, even if that world was built only to give everyone what they wanted. A small escape from engineered dependence on tools that are no longer our own.
Since every advance is couched in terms of “you are now liberated from [X]”, and we can so easily think of a way that [X] was important to being human, it is easy to fight back[5]. This is hardly new. Socrates wouldn’t write anything down for fear of hurting his memory. But Socrates was so clearly wrong in this! Writing things down kickstarted civilisation, most memorably through his student Plato, and Plato’s student Aristotle. It just needed to disseminate through society so people could figure out how to deal with the drawbacks.
Human scale isn’t machine scale, nor is it economy scale. Our neurons only fire so fast, our societies only adapt so fast, and until they do we might be prisoners of our own drive to make life better.
Some of these fixes will be built by new startups and new technologies, some by new laws or guidelines or processes, and some, as Planck said, by the older folks simply aging out. And until then we will see a lot more Voice from people dissatisfied with the world they live in, annoyed at the choices they didn’t individually influence, because they are unable to Exit.
[1] They also argued tech and Silicon Valley enabled wars and are to blame for Palestinian children getting bombed, but I took that to be an everything-bagel criticism of “the way the world is”.

[2] It’s the rare monopoly that charges less, often nothing, versus the Standard Oil type of monopoly.

[3] Coordination problems are usually solved through regulation or mass public protest, and neither seemed appropriate when you’re being given the world for free. The old monopoly arguments didn’t even apply, since, again, it was being given to you for free.

[4] One of the most magical realisations I’ve had was when I grokked that the digital world, which seems free, is not: there is a thermodynamic, material, physical cost to information. The equations I’d learnt about this suddenly became real. If you truly understood it you could even have made a fair bit of money, realising that electricity and cooling are real physical manifestations of your AI usage and someone will need to actually build them.

[5] This is why people get really angry at “you don’t need to write your own emails anymore” but not at “you don’t need to fill your own Salesforce sheets anymore”.