AI Governance: How to Implement a Governance Framework in Just 4 Hours a Year
Only 28 percent of S&P 100 companies have a board-level AI governance framework, formal AI oversight, and a formal AI policy all at once. The remaining 72 percent—and nearly all European SMEs—are flying blind. Starting in August 2026, the EU AI Act will put an end to this luxury. It’s time to implement AI governance—but not the way Big Four consultants say to.
Why is it important to address the AI governance framework—now?
According to a 2025 study by Glass Lewis, only 55 percent of S&P 100 companies disclose board-level oversight of AI, and only 45 percent have a documented AI policy. This rate is lower among Fortune 500 companies, disastrous among mid-sized companies, and virtually nonexistent among SMEs.
Meanwhile, WilmerHale’s AI Governance Playbook, published in January 2026, states unequivocally: AI governance is not optional—it is a legal and strategic necessity. According to the fourth-quarter risk index from the Diligent Institute and Corporate Board Member, technology is the top risk for 60 percent of executives, far ahead of economic (33 percent) and tariff-related (23 percent) risks.
The EU AI Act will be fully applicable to high-risk systems starting August 2, 2026. If your company uses AI for recruitment, credit scoring, performance evaluation, or customer service automation—this applies to you.
The 4 Steps You Really Need to Take — According to Deloitte and WilmerHale
Two key publications from the Harvard Law School Forum on Corporate Governance—the Deloitte AI Governance Roadmap and the WilmerHale Playbook—identify four steps. I’ll break these down into terms that make sense for SMEs:
1. AI Inventory: Know what you’re using. Make a list of every AI tool your company uses—ChatGPT, Copilot, AI-powered CRM features, automated email responses, AI image generation. Most companies are surprised to find that they use 3–5 times as many AI tools as management is aware of. This is the “shadow AI” problem, and Deloitte considers it the most acute risk.
2. Risk classification. The EU AI Act defines four risk levels: prohibited (unacceptable), high, limited, and minimal. Most AI applications used by SMEs (chatbots, content generation, data analysis) fall into the minimal-risk tier—but if you use AI in HR decisions or credit scoring, that falls under the high-risk category and requires a conformity assessment.
3. Appointing a responsible person. You don’t need an “AI board”—you need one person whose job description includes overseeing AI. According to PwC’s 2025 Annual Corporate Directors Survey, companies that explicitly designated a person in charge achieved a 73 percent AI adoption rate, compared to 31 percent for those where this was not clearly defined.
4. Audit trail. Every high-risk AI system must automatically keep tamper-evident logs of its operation, so that inputs, events, and decisions can be reconstructed later. This becomes a legal requirement in August 2026. The good news is that most modern AI platforms (Claude, GPT, Copilot) already expose usage logs by default—they just need to be archived.
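To make the four steps concrete, here is a minimal, illustrative Python sketch of an inventory with risk tiers and a tamper-evident audit log. The tool names, owners, use-case categories, and log format are my assumptions for illustration, not anything prescribed by the EU AI Act:

```python
import json
import hashlib
from datetime import datetime, timezone

# Step 1 - inventory: every AI tool in use, each with an owner (step 3).
# Entries below are hypothetical examples.
INVENTORY = [
    {"tool": "ChatGPT",     "use_case": "content generation", "owner": "Marketing"},
    {"tool": "Copilot",     "use_case": "code assistance",    "owner": "IT"},
    {"tool": "CV screener", "use_case": "recruitment",        "owner": "HR"},
]

# Step 2 - risk classification: a deliberately simplified mapping onto the
# Act's tiers; a real assessment needs legal review, not a lookup table.
HIGH_RISK_USE_CASES = {"recruitment", "credit scoring", "performance evaluation"}

def classify(use_case: str) -> str:
    return "high" if use_case in HIGH_RISK_USE_CASES else "minimal"

# Step 4 - audit trail: an append-only JSON Lines file where every entry
# records the hash of the previous one, so edits to history are detectable.
def append_log(path: str, entry: dict, prev_hash: str) -> str:
    entry = {**entry,
             "timestamp": datetime.now(timezone.utc).isoformat(),
             "prev_hash": prev_hash}
    line = json.dumps(entry, sort_keys=True)
    with open(path, "a") as f:
        f.write(line + "\n")
    return hashlib.sha256(line.encode()).hexdigest()

for item in INVENTORY:
    item["risk"] = classify(item["use_case"])

print([i["tool"] for i in INVENTORY if i["risk"] == "high"])  # ['CV screener']
```

Even this toy version surfaces the point of step 1: once the list exists, the high-risk items (here, the recruitment screener) are immediately visible and can be assigned an owner and a logging requirement.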
What most consultants don’t tell you: governance ≠ bureaucracy
The Deloitte Roadmap and the PwC framework are designed for enterprise-level companies. A company with 50 to 500 employees does not need an AI Ethics Board, a Chief AI Officer, or ISO 42001 certification from day one.
What you'll need:
- A one-page (A4) document stating which AI tools you use, who is responsible, what data may be entered, and what is prohibited. This is the AI Policy.
- A quarterly 30-minute review: new tools, new risks, new rules.
- Annual training: 60 minutes for the entire company—not about AI technology, but about the rules for using AI.
That amounts to a total of 4 hours of management time per year. The maximum fine under the EU AI Act is 35 million euros or 7 percent of global annual turnover, whichever is higher. The figures speak for themselves.
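As a concrete starting point, the one-page AI Policy could look like this hypothetical skeleton (company name, tools, and rules are illustrative placeholders, not a legal template):

```markdown
# AI Policy, Example Co. (v1.0, reviewed quarterly)

## Approved tools
ChatGPT (Team plan), Microsoft Copilot, in-house CRM assistant.

## Responsible person
Jane Doe (IT lead): approves new tools, runs the quarterly review.

## Data rules
- Never paste customer personal data or contracts into public AI tools.
- Company documents may be used only via company-approved accounts.

## Prohibited
- AI-made decisions about hiring or credit without human review.
- Unapproved ("shadow") AI tools used on company data.
```

Four short sections on one page are enough to cover the inventory, the responsible person, and the rules—the rest is the quarterly review.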
The Gloster Experience
I launched the AI governance process at Gloster in 2024—well before the EU AI Act came into effect. We use 23 unique AI skills and 7 MCP connectors in our daily operations. Our inventory now includes over 40 AI tools: Claude, Copilot, Kie.ai, SerpApi, and in-house automation tools.
The lesson: Governance doesn’t slow down innovation. It speeds it up—because everyone knows what’s expected, and there’s no need to rehash the basics with every decision.
AI Governance To-Do List — Tomorrow Morning
Open a blank document, then:
- List the names of all the AI tools your company uses—check with IT, marketing, sales, and HR.
- Assign someone to review the list every quarter.
- Write down the three most important rules (e.g., “We do not copy customer data into ChatGPT”).
Done. This is the minimum viable product of AI governance—and it’s more than what most European SMEs are doing today.
Frequently Asked Questions (FAQ)
What is AI governance, and why is it important for SMEs?
AI governance is the regulatory framework governing the corporate use of artificial intelligence: who can use it, with what authorization, and with what data. Due to the EU AI Act, it will be mandatory rather than optional starting in 2026.
How do I implement an AI governance framework at a small company?
You don’t need a legal department—all you need is an AI policy document, a designated person in charge, and a one-hour review every quarter. At Gloster, it operates with just four hours of board time per year.
What is an AI audit trail, and why is it mandatory?
The AI audit trail records what the AI system did, what data it used, and what results it produced. The EU AI Act will make this mandatory for high-risk systems starting in August 2026.
Do Hungarian companies need an AI policy template?
Yes—a simple AI policy that outlines permitted tools, data protection restrictions, and the approval process. This forms the foundation of compliance and internal security.
Related articles
- EU AI Act: What Hungarian Business Leaders Need to Know in 2026 — Specific Regulations, Deadlines, and Fines
- We’re no longer building chatbots—we’re building a digital workforce — How can AI governance be scaled across not one but 23 AI skills?
- 7 custom AI connectors that save 15–20 hours of work per week — The technical architecture that needs to be governed
