Zac Acker
4 minute read
Jack Dorsey just laid off nearly half of Block. 4,000 people. Gone. And his explanation was essentially: AI is here, we don't need you anymore.
"A significantly smaller team, using the tools we're building, can do more and do it better," he wrote to shareholders. Then Block's stock surged 24% in after-hours trading.
That's the world we're in right now.
The AI Layoff Playbook Is Getting Crowded
Block isn't alone. Salesforce CEO Marc Benioff cut his customer support headcount from 9,000 to 5,000. His quote about it was almost admirably blunt: "I need less heads." Microsoft has shed 15,000 jobs this year while posting record profits. Amazon, Meta, Google. The list keeps growing, and AI is the justification on almost every press release.
There's a pattern here that's worth naming: every one of these companies exploded in size during 2020-2022, hired like they were building cities, and now needs to correct for it. Block went from 3,800 employees in 2019 to over 10,000 before the cuts. The company forgot what it was building and over-hired through the fog.
AI didn't create this problem. It's being used to solve it while giving executives a cleaner story to tell investors.
Bloomberg called it "AI-washing." A way to dress up old-fashioned cost-cutting as technological futurism. That's probably too cynical as a complete take, because AI capability is genuinely advancing fast. The pandemic bloat and the AI revolution arrived at a convenient moment for each other, and not everyone losing a job right now is losing it to a model.
What Actually Happens When You Cut Too Deep
Companies that ran this experiment before Block discovered the same things. AI couldn't handle nuance. It couldn't handle escalations or edge cases. Service quality dropped. Productivity went backwards. Teams spent their time fixing AI output instead of doing actual work. And the worst part: they'd thrown away expertise they couldn't get back quickly.
This is the thing that doesn't show up in the shareholder letter. When you cut 4,000 people who know how your customers think, who've developed judgment over years of doing the actual work, you lose capacity. You lose institutional memory. You lose the people who knew which edge cases mattered and could tell a real problem from noise.
That's skill debt. And it accumulates fast once you start running too lean.
The Harder Question
AI will absolutely reshape how organizations operate. That's not debatable. What is debatable is whether the current wave of layoffs reflects that reality or just exploits the narrative.
Two months before cutting 4,000 support jobs, Benioff told Fortune that AI "augments" people, not replaces them, and that companies still needed "humans in the loop." Eight weeks later he was cutting nearly half his support division. The whiplash there isn't a small thing. It's a CEO figuring out the story as he goes.
Dorsey's version is at least more consistent. He built Goose, Block's internal AI tool, and has been public about the bet: smaller, flatter teams using AI can out-execute larger ones. Maybe he's right. Maybe in two years Block is a case study in getting it right. Or maybe they find out in 18 months that the judgment and relationships they cut are the things that were actually compounding.
The companies that will win aren't the ones that cut fastest. They're the ones that cut right, the ones that know which work AI can genuinely absorb and which work requires the kind of judgment that only comes from doing something long enough to understand it.
What This Means If You're Building
If you're running a team or advising on one, the question isn't whether to bring AI in. You should. The question is what you're actually automating and what you're protecting.
Repetitive, high-volume, low-judgment work: automate it. Routine report generation, standard correspondence, first-pass code review, data transformation. This is table stakes now.
Complex client relationships, architectural decisions, edge case handling, institutional knowledge: protect it. These are exactly where AI fails publicly and expensively. The companies learning that lesson the hard way are doing it in front of their customers.
The third category is the one people get wrong most often: work that looks repetitive but isn't. Support that feels templated until someone has a problem the template can't handle. Code that looks straightforward until it has to live in a real environment with real constraints. This is where cutting headcount based on AI efficiency projections gets organizations into trouble. The model performs well in testing. Production is different.
Run AI on your work before you decide who's doing it. See what it actually produces. That sequence matters more than any projection deck.
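That sequence can be made concrete as a shadow evaluation: replay a sample of real tickets through the model and measure how often its draft would still need a human rewrite, before anyone touches the org chart. The sketch below is purely illustrative; `draft_reply` is a hypothetical stand-in for a model call, not a real API.

```python
# Hypothetical shadow-evaluation sketch: before reassigning support work to
# a model, replay real tickets through it and count how often a human would
# still have to step in. All names here are illustrative assumptions.

def draft_reply(ticket: str) -> str:
    """Stand-in for a model call. Handles the templated case and
    signals NEEDS_HUMAN for anything off-template."""
    if "refund" in ticket.lower():
        return "Your refund has been processed."
    return "NEEDS_HUMAN"

def override_rate(tickets: list[str]) -> float:
    """Fraction of tickets where the draft still needs human judgment."""
    overrides = sum(1 for t in tickets if draft_reply(t) == "NEEDS_HUMAN")
    return overrides / len(tickets)

sample = [
    "Please refund my last charge",            # templated
    "Refund request for order 1182",           # templated
    "My account was flagged and my client is furious",        # edge case
    "Integration breaks only under our compliance config",    # edge case
]
print(f"override rate: {override_rate(sample):.0%}")
```

The point isn't the toy classifier; it's that the override rate on real work, not a vendor benchmark, is the number that should drive any headcount decision.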
The Stock Goes Up. Watch What Comes After.
Block surged 24% on the layoff news. Salesforce investors cheered the headcount reduction. The market is rewarding this story right now, which means more companies will tell it.
The real test comes in 12 to 18 months. Whether Block's smaller team actually ships faster. Whether Salesforce's support quality holds under AI, or whether "efficiency gains" turn into churn they didn't see coming. Whether the expertise that walked out the door shows up as institutional debt on a future earnings call.
The narrative is AI. The reality is messier, slower, and far more dependent on whether the people making these calls actually understand what the people they're cutting were doing every day.
Most of them don't. That's the part nobody puts in the shareholder letter.