When AI Stops Being Theoretical

March 11, 2026. Atlassian cuts 1,600 people—10% of its workforce. CEO Mike Cannon-Brookes sends an email. Twenty minutes later, affected employees find out. They get 6–12 hours of Slack access to say goodbye.
The reason? "AI changes the mix of skills we need."
February 26, 2026. Block cuts 4,000 people—nearly half its workforce. CEO Jack Dorsey posts on X: "Intelligence tools have changed what it means to build and run a company."
Stock surges 25% after-hours.
Most coverage treated these as layoff stories. Labor impact. Employee morale. The human cost of automation.
They missed it.
This is a founder psychology story. The moment you stop talking about what AI will do and start acting on what it's already doing. The shift from theoretical to operational. The decision to make the hard call before you're forced to.
And most founders are avoiding that calculus entirely.
The False Binary
Cannon-Brookes didn't say "AI replaces people." He also didn't say "AI doesn't replace people."
He said: "It would be disingenuous to pretend AI doesn't change the mix of skills we need or the number of roles required in certain areas."
Operational honesty. Not the hedged language most CEOs use when they're trying to have it both ways. Not "AI will augment our workforce" while quietly planning cuts. Not "We're becoming AI-first" while keeping headcount flat to appease investors.
He named the shift and acted on it.
Dorsey did the same: "We're not making this decision because we're in trouble. Our business is strong. But something has changed."
Both moved from strength, not desperation. Both made the call before the market forced them to. Both framed it as adapting to operational reality, not reacting to financial pressure.
And that's what makes most founders uncomfortable.
Because if AI has already changed the economics of your business—if the work you're hiring for can genuinely be done better, faster, cheaper with tools instead of headcount—then waiting isn't strategic patience. It's denial.
The Timing Problem
There's a version of this story where these decisions look premature. Where founders cut too early, regret it, and spend 2027 rehiring.
Forrester already reported that 55% of companies regret AI-related layoffs. The criticism writes itself: overhyped technology, rushed decisions, human cost.
But look at the pattern.
Block's headcount jumped from 3,800 in 2019 to 10,000+ by 2022. Pandemic hiring. The "growth at all costs" model that defined the 2010s. Then Dorsey cut back to 6,000—still above pre-pandemic levels—and said explicitly: "I'd rather get there honestly and on our own terms than be forced into it reactively."
Founder judgment call. Not "Do I believe AI will change things eventually?" but "Do I act on what I know is happening now, or do I wait until the market forces my hand?"
Waiting feels safer. You can point to uncertainty. "We're still evaluating." "We want to see more proof." "Let's not overreact to hype."
But the cost of waiting isn't neutral.
Every quarter you carry headcount you don't need, you're paying an opportunity cost. Not just salary and overhead—though that's real—but strategic focus. Coordination tax. The organizational drag that comes from managing more people than the work requires.
Dorsey framed Block's move as enabling "a significantly smaller team using the tools to do more and do it better." Not layoffs for margin expansion. Layoffs to remove friction so the company can move faster.
That reframe matters. It changes the question from "Can we afford to cut?" to "Can we afford not to?"
What Gets Automated, What Doesn't
The gap in most AI adoption conversations is specificity.
"AI will change work" is abstract. It lets you nod along without committing to anything.
"AI has already replaced the work these 900 R&D roles were doing" is concrete. It forces you to either agree and act, or disagree and explain why your business is different.
Atlassian cut heavily in software R&D. Block cut across functions where "intelligence tools" could replace execution work.
The common thread: execution tasks that are bounded, repeatable, and don't require novel judgment.
Customer support. Data analysis. Content creation. Basic QA. Meeting documentation. Code review. Reporting.
These aren't trivial jobs. They're jobs that used to require smart, capable people—and still do in most companies. But the threshold for "requires a human" has moved.
What hasn't moved: strategic thinking, customer relationships, creative problem-solving, cross-functional coordination. The work that's messy and context-dependent.
The mistake most founders make is thinking this is binary. Either a role is "safe" or it's "automatable."
It's not. It's a gradient. And the gradient is shifting every quarter.
A role that was 80% judgment and 20% execution two years ago might be 50/50 today. And if you're honest about it, you realize you're paying full price for the judgment you still need while subsidizing the execution work you don't.
Cannon-Brookes and Dorsey are naming that operational reality.
The Coordination Tax
Headcount isn't just a cost. It's a coordination tax.
More people = more meetings. More alignment. More communication overhead. More onboarding. More performance reviews. More everything that isn't building the product or serving the customer.
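The arithmetic behind this is old and well known: in a fully connected team, pairwise communication channels grow quadratically with headcount. A minimal sketch (illustrative only; real teams don't use every channel, but the curve is the point):

```python
# Pairwise communication channels in a team of n people: n * (n - 1) / 2.
# Doubling the team roughly quadruples the potential coordination surface.
def channels(n: int) -> int:
    return n * (n - 1) // 2

for size in (6, 12, 25, 50):
    print(f"{size:>3} people -> {channels(size):>5} channels")
```

Halving a 50-person org to 25 doesn't halve the coordination surface; it cuts it from 1,225 channels to 300, roughly a 4x reduction.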
AI doesn't just replace tasks. It collapses entire layers of coordination.
When you automate customer support, you're not just saving salary. You're eliminating the need to manage a support team. No hiring. No training. No scheduling. No escalation paths. No performance issues.
When you automate reporting, you're not just saving an analyst's time. You're removing the need to explain what you want, review the output, go back and forth on revisions, schedule a meeting to discuss it.
Friction disappears.
And the smaller the team, the less coordination tax you pay. Which means you move faster. Which compounds over time.
Dorsey's framing, "a significantly smaller team using the tools to do more and do it better," isn't just about productivity per person. It's about removing the drag that comes from managing a larger organization.
That's why the stock surged. Not because investors love layoffs. Because they believe a leaner, faster Block can outmaneuver competitors who are still carrying organizational weight they don't need.
The Honesty Gap
What made Cannon-Brookes' statement land—"It would be disingenuous to pretend AI doesn't change the mix"—is that it named what everyone already knows but most leaders won't say.
AI has changed the mix.
Not in theory. Not eventually. Not "we're exploring use cases."
Right now. Today. The work you hired for last year can be done differently this year. And if you're not acting on that, you're either in denial or you're choosing to subsidize inefficiency because it feels safer than making the hard call.
Most founders are stuck in the honesty gap. They know AI is real. They see the demos. They use the tools personally. But they haven't internalized what it means operationally.
So they hedge. "We're AI-first, but we're not replacing people." "We're investing in AI, but we believe in human-centered work." "We're experimenting, but we're taking a measured approach."
All of that might be true. But it's also a way to avoid the uncomfortable question: If AI can do this work better than a human, why are we still paying a human to do it?
Cannon-Brookes and Dorsey didn't hedge. They acted. And they framed it honestly: this isn't about being in trouble. It's about adapting to a shift that's already happened.
The Counterfactual No One Tracks
The problem with acting early is you'll never know if you were right.
If you cut headcount in March 2026 and your competitors wait until Q4, you'll spend the next six months second-guessing. Did I move too fast? Did I cut too deep? What if the tools aren't ready?
And you won't have an answer. Because the counterfactual—what would have happened if you waited—doesn't exist.
But here's what you will have: six months of moving faster than your competitors. Six months of lower burn. Six months of operating with less coordination tax.
And by Q4, when they're forced to make the same cuts reactively—because the market demanded it, because their board pressured them, because they ran out of runway—you'll be past it.
You'll have reorganized. You'll have rebuilt workflows around the new reality. You'll have figured out what works and what doesn't.
They'll be starting from zero.
That's the edge. Not being right. Being early. And being willing to act on incomplete information because the cost of waiting is higher than the cost of being wrong.
What This Means for You
If you're running a company right now, you're facing some version of this decision.
Maybe not 10% cuts. Maybe not 40% cuts. But the question is the same: Are you carrying headcount you don't need because it feels safer than acting on what you know is true?
Here's how to think about it:
1. Audit execution vs judgment. Go role by role. What percentage of the work is bounded execution vs novel judgment? If it's >50% execution, you need to be honest about whether a tool can do it better.
2. Count the coordination tax. How much time do you spend managing people vs building? How many meetings exist just to keep everyone aligned? What would happen if you had half the team and twice the tools?
3. Name the shift. Stop hedging. Either AI has changed the economics of your business or it hasn't. If it has, act. If it hasn't, explain why your business is different. But don't pretend you're still evaluating when you already know.
4. Move from strength, not desperation. The companies that make these calls before they're forced to will have rebuilt by the time their competitors are reacting. The companies that wait will spend 2027 playing catch-up.
5. Be honest about the trade-offs. You will lose things. Institutional knowledge. Diverse perspectives. The buffer that comes from having extra capacity. But you'll gain speed, focus, and the ability to move without coordination drag. Decide which matters more.
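Step 1 of the audit above can be reduced to a spreadsheet-grade calculation. A hypothetical sketch, where the role names and fractions are placeholders for your own numbers and the 50% threshold comes straight from step 1:

```python
# Flag roles where bounded execution work dominates, per the >50% rule above.
# All figures are hypothetical placeholders, not real benchmarks.
roles = {
    "support_agent":   0.80,  # fraction of the role that is bounded execution
    "data_analyst":    0.65,
    "account_manager": 0.30,
    "product_lead":    0.15,
}

THRESHOLD = 0.50  # the ">50% execution" line from step 1

def automation_candidates(roles: dict[str, float]) -> list[str]:
    """Return roles whose execution fraction exceeds the threshold."""
    return [name for name, exec_frac in roles.items() if exec_frac > THRESHOLD]

print(automation_candidates(roles))
```

The output is a list of roles to examine honestly, not a verdict; the point of the exercise is forcing a number onto each role instead of a vague "safe or automatable" label.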
The Real Test
The hardest part isn't making the call. It's living with it.
Because you won't know if you were right. You'll just know you acted on what you believed was true, and you'll spend the next year finding out if the bet holds.
Cannon-Brookes and Dorsey made the bet. Time will tell if they were early or premature.
But one thing is certain: the founders who are still hedging in March 2026—still talking about AI in the future tense, still carrying headcount they know they don't need, still waiting for more proof before acting—won't have that luxury by the end of the year.
The shift has already happened. The only question is whether you act on it now or wait until the market forces your hand.
And if you wait, you won't be making a strategic decision.
You'll be making a reactive one.
About Jaxon Parrott
Jaxon Parrott is founder of AuthorityTech and creator of Machine Relations — the discipline of using high-authority earned media to influence AI training data and LLM citations. He built the 5-layer Machine Relations stack to move brands from un-indexed to definitive AI answers.
Read his Entrepreneur profile and follow him on LinkedIn and X.