TL;DR
A BCG survey of 625 CEOs and board members found that 61% of chief executives believe their boards are rushing AI transformation. Three-quarters of board members rate their AI knowledge as on par with or ahead of their peers, but nearly 40% of CEOs say their boards lack an informed view of AI strategy, and more than half say hype is distorting boardroom judgment.
Sixty-one per cent of chief executives say their boards are pushing AI transformation too fast, according to a global survey of 625 leaders published by Boston Consulting Group. The research, titled Split Decisions, polled 351 CEOs and 274 board members at companies with at least $100 million in annual revenue and found a consistent pattern: boards and CEOs agree that AI matters, but disagree on how quickly it should be deployed, how well boards understand it, and how much of a CEO's job now depends on delivering returns from it.
The findings land at a moment when AI FOMO has become a dominant force in corporate strategy. More than half of the CEOs surveyed said that hype around artificial intelligence is distorting their boards' judgment, and nearly 40 per cent said their boards lack an informed view of how AI is reshaping growth strategy. One in three said their board overestimates the human capabilities that AI can replace.
The confidence gap
The survey's most striking finding is the disconnect between how board members rate their own AI knowledge and how their CEOs rate it. Three-quarters of board members said their AI understanding is on par with or ahead of their peers. CEOs were far less impressed: nearly 40 per cent said their boards lack an informed view of how AI is reshaping growth strategy. The implication is that many boards are making consequential decisions about AI on the basis of knowledge their chief executives consider inadequate.
BCG's Julie Bedard, a managing director and partner, said the gap can be closed if CEOs take direct responsibility for board education. Rather than delegating AI briefings to a chief technology officer or an outside consultant, she argued, CEOs should personally lead upskilling sessions that demonstrate what current tools can and cannot do, and should frame AI in terms that distinguish between tasks where the technology substitutes for humans and tasks where it complements them.
That distinction is more important than it sounds. Boards that treat AI as a wholesale replacement for human labour are likely to push for faster, broader deployment than the technology can support. Boards that understand AI as a complement to human work are more likely to approve investments that are scoped to realistic outcomes. The survey suggests that too many boards are in the first camp, and that the consequences of FOMO-driven investment decisions in AI are becoming harder to ignore.
The accountability mismatch
The survey also exposed a gap in how CEOs and boards perceive accountability for AI results. CEOs estimated that 35 per cent of their performance evaluation now depends on delivering AI-related returns on investment. Board members put the figure at 27 per cent. The eight-percentage-point difference suggests that CEOs feel more pressure to show AI results than their boards realise they are applying.
This matters because it shapes behaviour. A CEO who believes more than a third of their evaluation hinges on AI outcomes has a strong incentive to prioritise AI projects, even if those projects are premature or poorly scoped. A board that believes the figure is lower may not understand why its CEO is resisting calls to move faster, or may underestimate the operational risk of accelerating deployment to meet perceived expectations.
Judith Wallenstein, a BCG managing director and senior partner who leads the firm's global CEO Advisory practice, said CEOs need to bring their boards along on the same learning journey they have taken, but compressed and focused on building genuine understanding rather than surface-level awareness. The engineering and operational realities of AI deployment are considerably messier than the boardroom presentations that often precede investment decisions.
What the survey does not say
It is worth noting what the research does not cover. The survey does not measure whether the CEOs who say their boards are rushing are themselves correct in their caution, or whether some boards are right to push harder. It is possible that in certain industries, faster AI adoption is exactly the right strategy and that CEO resistance reflects organisational inertia rather than sound judgment. The data captures a perception gap, not a verdict on who is right.
The survey also does not break down results by industry, geography, or company size beyond the $100 million revenue threshold, which limits the conclusions that can be drawn about specific sectors. A board pushing AI transformation at a financial services firm faces a very different risk profile from a board doing the same at a manufacturing company, and the survey treats both identically.
What the research does establish is that the most senior leaders at large companies are not aligned on the most consequential technology investment of the current era. Approximately 80 per cent of both CEOs and board members agreed that prospective board candidates should be required to demonstrate a measurable understanding of how AI can reshape their industry, a finding that suggests both groups recognise the knowledge gap even if they disagree on its severity.
The harder question
The deeper issue the survey raises is whether traditional board governance is suited to decisions about AI at all. Boards typically meet a handful of times per year, rely on management presentations for information, and are composed of members whose primary expertise may lie in finance, regulation, or sector-specific operations rather than technology. That structure worked well when the pace of technological change allowed for quarterly deliberation. It is less clear that it works when the questions that matter most about AI require technical fluency that most board members do not have.
BCG's recommendation, that CEOs should personally educate their boards, is practical but also reveals the problem. If the chief executive is the primary source of a board's AI understanding, the board's ability to independently evaluate the CEO's AI strategy is compromised. The survey does not propose a solution to this structural tension, but it does make the tension visible.
For companies trying to scale AI in 2026, the message is that alignment at the top is not optional. Boards that push too fast risk approving projects that fail to deliver returns. CEOs who move too slowly risk losing competitive ground. And for both groups, the temptation to let AI substitute for clear thinking rather than support it is a risk that no survey can fully quantify.