53% of CEOs say their teams can't align on their AI priorities. If you've been through a previous technology wave, you've probably seen this play out already: leadership wants one thing, teams are doing another, and nobody can explain why the gap keeps growing.
But misalignment is a symptom. The cause is something more structural, and it explains not just why AI and digital technology projects fail, but why they keep failing the same way across every technology wave. I've started calling it the information gap.
The three-sided disconnect
Every AI or digital technology initiative that actually delivers needs three things to line up: business strategy (what the organization wants to achieve), technology capability (what AI and digital technology can actually deliver), and operationalization (whether the organization can make it work in practice).
The problem is that these three perspectives live in different parts of the organization, and in most companies, they never meet.
Leadership defines strategy based on board priorities, market trends, and executive offsites. Technical teams evaluate capability based on vendor demos, proof of concepts, and benchmark data. Operations knows what actually works on the ground, but that knowledge is scattered across teams, informal processes, and practices that nobody ever documents.
Each side is working from a different picture. Leadership's strategic intent and teams' operational reality rarely line up, and that disconnect is the information gap in practice.
No single person holds all three views. That's the information gap. The same gap showed up in ERP rollouts, cloud migrations, and data platform projects; AI just makes it more visible because the feedback loop is faster and more brutal.
Why the math makes it worse
You might assume this is just a communication problem. Better meetings, a shared Slack channel, maybe an offsite with cross-functional workshops. It's not, and the math explains why.
Communication channels in a team grow with the square of its headcount: among n people, there are n(n-1)/2 possible pairwise channels (Brooks' Law). With 10 people, you have 45 possible channels. With 50, you're at 1,225. With 500, that's nearly 125,000. The information needed for good technology decisions is distributed across all of them, and there's a hard limit on how many any one person can actually maintain: about 150 stable working relationships, according to Dunbar's research.
Past that threshold, information flow breaks down structurally. At 500, departmental boundaries harden and political silos form. At 1,500, cross-functional visibility becomes nearly impossible without dedicated coordination infrastructure. At 5,000, leaders are operating on filtered summaries that bear only a loose resemblance to operational reality.
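If you want to see the arithmetic for yourself, a few lines of Python make the quadratic growth concrete (a minimal illustration of the n(n-1)/2 formula, evaluated at Dunbar's thresholds):

```python
def channels(n: int) -> int:
    """Possible pairwise communication channels among n people: n(n-1)/2 (Brooks' Law)."""
    return n * (n - 1) // 2

# Channel counts at the scale thresholds discussed above
for n in (10, 50, 150, 500, 1500, 5000):
    print(f"{n:>5} people -> {channels(n):>10,} possible channels")
# →    10 people ->         45 possible channels
# →    50 people ->      1,225 possible channels
# →   150 people ->     11,175 possible channels
# →   500 people ->    124,750 possible channels
# → 1,500 people ->  1,124,250 possible channels
# → 5,000 people -> 12,497,500 possible channels
```

A 10x increase in headcount produces roughly a 100x increase in possible channels, which is why adding people (or meetings) cannot keep pace with the coordination load.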
The scaling problem goes beyond how many channels exist. What matters is what happens to the information flowing through them.
What actually diverges
As organizations scale past those communication thresholds, departments don't just struggle to talk to each other. They drift apart in five specific ways. Each one makes the next worse. And if you've worked in any mid-to-large organization, you've probably lived through all five.
1. Tool fragmentation
It starts with tools. Marketing runs HubSpot. Sales runs Salesforce. Finance runs SAP. The integration between them is held together with duct tape and someone's custom spreadsheet. The average enterprise runs hundreds of applications, yet only about 29% of them actually talk to each other. Every department picked the tool that made sense for their work. Nobody planned the incompatibility. It just accumulated.
2. Workflow divergence
With different tools come different workflows. Ask three departments how they handle customer onboarding and you'll get three completely different answers. Same activity on paper, different steps, different approvals, different data captured along the way. It happened one workaround at a time, each team adapting to their own tools and constraints.
3. Vocabulary silos
Then the vocabularies split. What operations calls "efficiency," finance calls "cost reduction," and engineering calls "optimization." They might be describing the same improvement; they can't tell because the words don't match. And it goes deeper than terminology. These are different mental models shaped by years of domain immersion, different ways of framing problems, different definitions of success. Employees waste roughly three hours a day just searching for information that already exists somewhere in the organization.
4. Ownership fragmentation
Nobody owns the full picture. The CRM is sales' territory. The ERP belongs to finance. The knowledge base is support's domain. Each team optimizes for its own metrics and targets, with its own definition of what "good" looks like. If that sounds familiar, you're not alone: 82% of enterprises say silos disrupt their most important workflows.
5. Ground truth drift
And then you arrive at the endpoint. Different teams with different tools, different processes, different vocabulary, and different ownership develop genuinely different versions of reality. Marketing's "customer" is a persona and a segment. Sales' "customer" is a pipeline stage. Support's "customer" is a ticket history. Same entity, three representations, none of them wrong exactly, but incompatible when you need a single answer. Conway's Law predicts this: the systems an organization builds mirror its communication structure. If departments don't talk, their data won't either. 68% of enterprise data goes unanalyzed. This is a big part of why.
These five forces aren't independent. Tools fragment processes. Fragmented processes create vocabulary silos. Silos fragment ownership. And fragmented ownership lets ground truth quietly drift apart across the organization. Each layer compounds the last. This is the environment into which organizations try to deploy AI.
Invisible knowledge (Enterprise)
In large organizations (2,500+ employees), the information gap is primarily a visibility problem, compounded by the fact that each department's version of reality has already drifted. The knowledge exists somewhere. An engineer in logistics knows exactly which data pipelines would break under a new AI or automation workload. A team lead in customer service knows which processes could genuinely benefit from digital technology and which ones are too messy to touch. A mid-level manager understands the dependencies between departments that no org chart captures.
But that knowledge is invisible to the people making investment decisions. Each management layer filters information upward, usually in the direction of optimism. By the time it reaches leadership, the picture is incomplete and disconnected from ground truth.
This is the classic information flow breakdown, and it's why enterprises with access to the best talent and the biggest budgets still fail at roughly the same rate as everyone else. They had the same problem with digital transformation and data programs; AI just accelerated the consequences.
The external maze (Small businesses)
For smaller organizations (50 to 250 employees), the gap looks completely different. They don't have a visibility problem; most people can see across the entire company. But the external information gap they face is at least as large as what enterprises deal with internally. The market is flooded with specialized solutions, each designed for a slightly different problem, each claiming to be the right fit.
Hundreds of AI and digital tools, each with its own vocabulary, its own use case framing, and a different definition of what "ready" means. A 150-person manufacturing company trying to evaluate these options is facing the same kind of fragmentation that enterprises face across their departments, except it's the entire market, and they don't have a team of analysts to sort through it. Nobody on the team has spent years evaluating technology feasibility across different organizational contexts or mapping capability dependencies. That knowledge simply doesn't exist internally.
Every solution they add to fix one problem introduces new tools, new workflows, and new integration gaps. The divergence cascade doesn't just happen inside large organizations. It happens to small ones too, one well-intentioned software purchase at a time.
Enterprises can't see across their own organization. Small businesses can't see across the market of solutions. The information gap is just as wide; it's just pointing outward instead of inward. Same result: technology decisions made on incomplete information.
The double bind (Mid-market)
Mid-market organizations (250 to 2,500 employees) get the worst of everything. They're large enough that internal silos have formed, well past Dunbar's threshold, so they face the same tool, process, and vocabulary divergence as enterprises. But they're too constrained to build the dedicated coordination infrastructure that large enterprises rely on. AI adoption is widespread at this scale, but 70% still need outside help and 39% say lack of in-house expertise is their biggest barrier. Large enough that the problems are real, too constrained to solve them the way enterprises do.
And they face the external gap too. The same market fragmentation that overwhelms small businesses hits mid-market companies just as hard, except they're also trying to coordinate across departments that already can't see each other clearly. Marketing might need external AI expertise that doesn't exist internally (the small business pattern). Supply chain might need cross-functional coordination to assess what they already have (the enterprise pattern). And every function is evaluating against a market of solutions that each speak a different language. Every tool they bring in to solve one problem adds another node to the internal fragmentation. Same organization, same budget, three compounding problems.
The distributed knowledge problem
"The knowledge of the circumstances of which we must make use never exists in concentrated or integrated form, but solely as the dispersed bits of incomplete and frequently contradictory knowledge which all the separate individuals possess."
Hayek arrived at this insight nearly a century ago, writing about economic planning, but swap "economic planning" for "AI and technology decisions" and the observation still holds. The knowledge needed to make good technology decisions is distributed across your organization in tribal practices, workflows, tacit expertise, and informal processes. No consultant interview, no vendor assessment, and no executive strategy session captures it all.
That's why AI and digital technology initiatives fail even when the technology itself works perfectly. And it gives you a mental model for nearly every adoption challenge: pilot purgatory traces back to nobody mapping what the pilot depends on beyond the lab. Wrong use case selection happens when strategy gets set without operational input. Skills gap surprises appear when leadership assumes capabilities that don't exist on the ground. Each one is a specific failure of information, not technology. And behind each, you'll usually find the same pattern: tools that don't talk to each other, processes that diverged years ago, and teams operating on different versions of reality.
Closing the information gap
If the knowledge needed for good digital tech and AI decisions is scattered across tribal practices, workflows, and tacit expertise, the solution has to go where that knowledge lives. It can't be another top-down framework, another consulting engagement that interviews executives and calls it discovery, or another series of pilots that test feasibility in a vacuum and assumes the rest will be easy.
That's why we built AI Readi. It's a discovery platform that systematically surfaces what's distributed across your organization: capabilities, processes, readiness, and the informal knowledge that nobody documents. Insights come from the people who actually do the work, not filtered summaries from management layers. Every use case connects back to strategic objectives, so strategy, capability, and operationalization stay aligned by design. The platform builds the shared picture that no single person could hold, one contribution at a time.
If you want to see how systematic discovery replaces incomplete assumptions, that's what we built: click Get Started above.
The information gap is structural and mostly invisible to the people inside it, but its symptoms aren't: pilot purgatory, strategies that collapse on contact with reality, experiments nobody coordinates.
Next week: how the information gap creates pilot purgatory and top-down AI strategies that keep failing. The root cause is invisible, but its effects are everywhere.
Sources
- 74% of companies struggle to achieve and scale value from AI — Boston Consulting Group, AI Adoption Survey (October 2024), 1,400+ C-suite executives
- 10/20/70 rule: 10% algorithms, 20% technology/data, 70% people and processes — BCG X prescriptive framework (Beauchene, 2023; Duranton, 2025)
- Brooks' Law: communication channels = n(n-1)/2 — Frederick Brooks, "The Mythical Man-Month" (1975)
- Dunbar's social cascade thresholds (150 → 500 → 1,500 → 5,000) — Robin Dunbar, Oxford evolutionary psychology research
- 70% of mid-market firms need outside help with AI; 39% cite lack of in-house expertise — RSM 2025 Middle Market AI Survey (966 executives, firms with $10M–$1B revenue)
- Hayek distributed knowledge quote — Friedrich Hayek, "The Use of Knowledge in Society," American Economic Review (1945)
- Enterprise application integration rates (29% connected) — MuleSoft 2025 Connectivity Benchmark Report (1,050 IT leaders, Vanson Bourne)
- 82% of enterprises report data silos disrupting critical workflows; 68% of enterprise data goes unanalyzed — IBM, The Data Differentiator, IBM Think
- Employees waste ~3 hours/day searching for information — Coveo 2025 EX Relevance Report (4,000 U.S./U.K. employees at 5,000+ person companies, Arlington Research)
- Conway's Law: systems mirror communication structure — Melvin Conway, "How Do Committees Invent?" (1968)
- 53% of CEOs say teams struggle to align on priorities; only 10% say companies are ready for AI disruption — The Adecco Group, Leading Through the Great Disruption (May 2025), 2,000 C-suite leaders across 13 countries