If your board can’t explain how AI affects your business model, your earnings and your risk profile, you have a governance problem. And in 2026, governance problems become liability problems.
That’s the bottom line. AI has moved from the IT roadmap to the boardroom agenda. But for the vast majority of boards, it hasn’t moved into the governance framework. That gap – between AI deployment and AI oversight – is now one of the fastest-growing sources of director liability in corporate governance.
This isn’t a technology issue. It’s a fiduciary one.
The Numbers Tell a Stark Story
The data on the AI literacy gap is sobering. According to McKinsey, 66% of board directors report having “limited to no knowledge or experience” with AI. Nearly one in three say AI doesn’t even appear on their board agendas.
Meanwhile, 88% of organisations are deploying AI in at least one business function. Only 25% have board-approved AI governance policies.
That’s not a gap. That’s a chasm. This is exactly the kind of mismatch that draws regulatory scrutiny, investor pressure and legal exposure.
ISS expects boards to demonstrate AI literacy and documented oversight frameworks by the 2026 proxy season. Glass Lewis and BlackRock have already updated their stewardship guidelines. Directors face withhold recommendations, reputational damage and potential Caremark-style derivative claims.
Why Fiduciary Duty Now Extends to AI
Fiduciary duty has always been the “gap-filler” in corporate governance. When new risks outpace existing charters and bylaws, courts look at whether the board asked the right questions and provided proper oversight.
AI is now that risk.
It affects your earnings, your total shareholder return, your cost structure and your competitive position. When boards don’t understand AI well enough to oversee it, three things happen:
- Risks surface too late: Often after audits, public exposure or regulatory action
- Strategic potential is wasted: Leaders lack the confidence to pursue AI-driven innovation responsibly
- Accountability becomes unclear: No one owns the outcomes of automated decisions
AI-related securities class actions doubled from 2023 to 2024. Average settlement values for D&O claims have climbed to roughly $56 million. Plaintiffs don’t need to prove the AI system failed. They need to prove the board failed to govern the AI system. That distinction matters.
The Governance Lag Is the Real Risk
Most boards aren’t ignoring AI. Many are forming committees, hiring advisors and adding agenda items. The problem is that formal structures are outpacing actual readiness.
A recent Fortune 500 survey found that 70% of executives say their companies have AI risk committees. Yet only 14% say they are fully ready for AI deployment. The gap between governance on paper and governance in practice is where the real exposure lives.
Only 16% of Russell 3000 companies have even one director with specialised AI skills. Among those that do, the expertise is concentrated in a handful of sectors: Information Technology, Industrials and Consumer Discretionary. Everyone else is flying with limited instruments.
And here’s the uncomfortable truth: you can’t govern what you don’t understand. An AI governance framework that sits in a binder while directors lack the literacy to ask meaningful questions isn’t governance. It’s theatre.
As Mark Kelly, Founder of AI Ireland, puts it: “Boards don’t need to understand how AI works at a technical level. They need to understand how it’s changing their business, their risk profile and their competitive position. That’s not a technology conversation; it’s a leadership one.”
What AI Literacy Actually Means for a Board
Let’s be clear about what AI literacy does and doesn’t require at the board level.
Directors don’t need to become data scientists. They need to understand enough to ask the right questions:
- Where is AI being used across our operations today?
- What data feeds those systems, and who owns it?
- How do we detect when an AI system starts contradicting our values or commitments?
- What’s the ROI framework for our AI investments?
- Who is accountable when an automated decision goes wrong?
This is the same standard we apply to financial literacy, cybersecurity and regulatory compliance. No one expects every director to be a CPA. But every director must understand the balance sheet well enough to fulfil their duty of care.
AI is no different. The organisations with digitally and AI-savvy boards outperform their peers by nearly 11 percentage points in return on equity. Those without AI-savvy boards fall 3.8% below their industry average. AI literacy isn’t just protection; it’s a competitive moat.
How to Close the Gap Before It Closes on You
The boards that move first will be best positioned, both for regulatory confidence and competitive advantage. Here’s what that looks like in practice:
1. Run an AI Audit
Map where AI is already operating in your organisation. Most boards are surprised by how many AI-enabled tools are already embedded in operations, from HR screening to financial forecasting to customer service. You can’t govern what you haven’t catalogued.
2. Invest in Director Education
Arrange structured briefings on AI capabilities, limitations, and business implications. This isn’t a one-off seminar. It’s a recurring programme, just like your annual compliance and cybersecurity training.
3. Assign Clear Oversight
Decide which committee owns AI governance. Many boards split this between Audit and Risk (for controls), a Technology Committee (for architecture and vendor risk) and Nomination/Governance (for board composition and talent).
4. Define Metrics That Matter
Only about 15% of boards currently receive AI-related metrics from management. You need KPIs for AI performance, risk indicators for model failures and bias and reporting on regulatory compliance. What gets measured gets governed.
5. Bring in Outside Expertise
Where internal skill gaps exist, consider advisory directors, fractional AI governance officers or external consultants for structured board education. The cost of building literacy is a fraction of the cost of its absence.
The Window Is Closing
The era of “passive awareness” around AI is over. Proxy advisors are watching. Investors are asking questions. Regulators are tightening requirements, and the legal standard for board oversight now explicitly includes AI.
Every quarter you delay closing the AI literacy gap is a quarter of unmanaged risk on your balance sheet and a quarter of missed opportunity for AI-driven value creation. This is a fiduciary issue. Treat it like one.
Ready to close the AI literacy gap in your boardroom?
Book an AI Leadership Board Presentation with AI Ireland – a focused, hands-on session built specifically for Boards of Directors and senior leadership teams.
In this session, your board will:
1. Define your AI Ambition
Get clear on where AI fits in your business model, your industry and your growth plan. No vague vision statements. A real, grounded ambition your board can stand behind.
2. Learn how to compete with AI
Understand how competitors and disruptors are already using AI to reshape cost structures, customer experience and market position. See what’s coming and decide how to respond.
3. Move from strategy to execution
Bridge the gap between boardroom discussion and operational reality. Walk away with a practical framework to advance your AI strategy into measurable action across the business.
This isn’t a technology briefing. It’s a governance and leadership session that builds the AI fluency your board needs to fulfil its fiduciary duty with confidence.
Book your AI Leadership Board Presentation with AI Ireland today.
Frequently Asked Questions
Q: Is AI oversight really a fiduciary duty for board directors?
A: Yes. Courts and regulators increasingly view AI as a material business risk that falls within the board’s existing duty of care and oversight. Proxy advisors like ISS and Glass Lewis now expect boards to demonstrate AI literacy and documented governance frameworks. Failure to oversee AI deployment can expose directors to derivative claims and withhold recommendations.
Q: Do board members need to become technical AI experts?
A: No. Directors need conceptual fluency, not coding skills. They should understand where AI is deployed, what data it uses, how risks are monitored, and who is accountable for outcomes. Think of it like financial literacy – you don’t need to be an accountant, but you must understand the balance sheet.
Q: What is the business cost of the AI literacy gap?
A: Research shows that companies with AI-savvy boards outperform peers by nearly 11 percentage points in return on equity. Meanwhile, AI-related D&O claims are rising sharply, with average settlements around $56 million. The cost of building AI literacy is far less than the cost of governing without it.
Q: How quickly do boards need to act on AI governance?
A: Immediately. ISS expects documented AI oversight frameworks by the 2026 proxy season. Over 88% of organisations are already deploying AI, but only 25% have board-level governance policies. Boards that act now protect their directors, improve their D&O positioning, and create a defensible record of fiduciary diligence.
Q: What’s the first step a board should take to close the AI literacy gap?
A: Start with an organisation-wide AI audit. Map every AI tool and system currently in use. Many boards are surprised by how much AI is already embedded in daily operations. From there, build a structured education programme and assign clear committee-level oversight responsibilities.
Call to Action
If you’d like to delve deeper into how these trends can reshape your organisation, we would be delighted to discuss them in more detail. Invite Mark Kelly, Founder of AI Ireland, to speak at your next team meeting, conference or strategy session. We can explore practical ways to harness AI responsibly, meet sustainability goals, and navigate the evolving consumer landscape. Let’s work together to ensure Ireland remains at the vanguard of innovation in 2026 – and beyond.
