With a deep understanding of customer-centricity and change, Full Fathom Five founder Antony Roberts is considered a thought leader in digital, customer experience, and ecommerce, with awards ranging from 'Automotive Website of the Year' and a global best-practice award for eCommerce to recognition as an IBM 'New Creator'.
Full Fathom Five are digital and customer experience specialists who have faced the problems you are facing, and who bring extensive experience and lessons learnt in getting it right. They work with deep integrity, an openness to understand and learn, and a genuine desire to deliver value for the people they work with.
AI is transforming how organisations work, but it’s also creating confusion.
New rules like the EU AI Act mean every company now needs to understand where and how AI is being used across its business. The challenge? AI tools are prolific, easily accessible, and often used outside of business policy. In short, most organisations don’t actually know.
An EU AI audit is simply a structured way to find out.
A focused AI audit helps you get EU AI Act-ready, surface risks early, and pinpoint where to streamline costs and scale high-impact AI initiatives.
It’s not about catching people out or slowing innovation; it’s about building confidence that your organisation is using AI safely, responsibly, and productively.
Here’s how to do it.
1. Map Your Current AI Use
Start with what you already have.
Work with IT and business leaders to list all approved AI tools, whether that’s customer-facing chatbots, data analysis platforms, or AI-assisted content tools like Copilot or Grammarly.
Then, look beyond the official systems. Many employees are already experimenting with AI on their own using tools like ChatGPT, Canva AI, or Otter.ai.
That’s your Shadow AI layer, and it’s where hidden risks often sit.
2. Create a Safe Space for Honesty
If people fear punishment, they won’t tell you what they’re using.
The best audits begin with an AI Engagement Campaign: a short, supportive exercise that invites staff to share which tools they use and why, without blame or judgement.
This helps you understand what’s genuinely helping them work smarter, and what needs to be managed differently.
3. Build Your AI Inventory
Once you’ve combined your IT inventory and AI usage data, document everything in a central AI Inventory.
Each entry should record what the tool does, who uses it, what data it touches, and its risk level.
This becomes your single source of truth: essential for compliance, but also a foundation for smarter decision-making.
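If it helps to picture the structure, here is a minimal sketch of what a single inventory entry might look like in code. The field names, risk labels, and example values are illustrative assumptions, not a prescribed schema.

```python
from dataclasses import dataclass, field

# Illustrative sketch only: field names and risk labels are assumptions,
# not a prescribed EU AI Act schema.
@dataclass
class AIInventoryEntry:
    tool_name: str                     # e.g. "Copilot"
    purpose: str                       # what the tool does
    owner: str                         # team responsible for the tool
    users: list[str] = field(default_factory=list)         # who uses it
    data_touched: list[str] = field(default_factory=list)  # what data it touches
    risk_level: str = "unclassified"   # e.g. "minimal", "limited", "high"
    approved: bool = False             # approved tool, or shadow AI?

# Example entry combining IT records with staff-reported usage
entry = AIInventoryEntry(
    tool_name="ChatGPT",
    purpose="Drafting and summarising internal documents",
    owner="Marketing",
    users=["Content team"],
    data_touched=["Draft copy only, no personal data"],
    risk_level="limited",
    approved=False,
)
print(entry)
```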
4. Classify and Prioritise Risks
Not all AI is equal under the EU AI Act.
Some tools are low-risk, like grammar assistants. Others, such as recruitment screeners or AI that affects people’s rights, are high-risk.
Categorising each tool helps you see where to focus your attention.
The key is proportionate governance: applying the right amount of oversight, not bureaucracy, to match the level of risk.
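As a rough illustration of proportionate governance, the sketch below maps the EU AI Act’s broad risk tiers to an indicative level of oversight. The tier names follow the Act, but the example controls are assumptions for illustration, not legal advice.

```python
# Illustrative sketch: the tier names follow the EU AI Act's broad risk
# categories, but the example controls are assumptions, not legal advice.
OVERSIGHT_BY_TIER = {
    "unacceptable": "Prohibited: do not deploy",
    "high": "Full governance: impact assessment, human oversight, documented controls",
    "limited": "Transparency: disclose AI use and apply a light-touch review",
    "minimal": "Record in the inventory and monitor periodically",
}

def proportionate_oversight(tool_name: str, tier: str) -> str:
    """Return the indicative level of oversight for a tool's risk tier."""
    action = OVERSIGHT_BY_TIER.get(tier, "Unclassified: review and assign a tier")
    return f"{tool_name}: {action}"

print(proportionate_oversight("Grammar assistant", "minimal"))
print(proportionate_oversight("Recruitment screener", "high"))
```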
5. Establish Ongoing Governance
An AI audit isn’t a one-off event.
Create a small AI Governance Group (typically leaders from IT, HR, Comms, and Legal) to keep the inventory up to date, review new tools, and maintain awareness of changing regulations.
Add simple policies that promote responsible, confident AI use, not just restrictions.
Why It Matters
Done well, an AI audit turns uncertainty into opportunity.
It shows your organisation exactly where AI adds value, where it carries risk, and how to get the balance right.
It protects people, data, and reputation, while freeing teams to innovate safely.
At Full Fathom Five AI, we call that evidence-based, not fear-based, AI enablement.
Full Fathom Five AI
Website: https://www.fullfathomfive.uk/
Email: hello@fullfathomfive.uk