
EU AI Act Checklist: 7 Steps to Compliance by August 2026

From AI inventory to risk classification to an exportable compliance report: 7 concrete steps to get your company EU AI Act compliant by August 2026. With timeline, common mistakes, and a takeaway checklist.

March 22, 2026
Yannick | SimpleAct Team
6 min read

Less than five months until August 2, 2026. From that date, companies must be able to prove that their AI systems are documented, classified, and compliant. Those who aren't prepared risk fines of up to 35 million euros or 7% of global annual revenue.

That sounds like a lot. It is. But the good news: preparation isn't rocket science if you approach it step by step.

This checklist shows you what you should have done by August 2026. Not as a legal treatise, but as a practical roadmap for SMEs.


What already applies (since February 2025)

Before we look ahead: two obligations are already in force. If you haven't addressed these yet, they should be at the top of your priority list.

🚫 Prohibited AI practices (Art. 5)

Social scoring, manipulative AI systems, and untargeted biometric surveillance in public spaces are banned. Check whether you have any such systems in use. If so: shut them down immediately.

🎓 AI literacy (Art. 4)

All employees who work with AI systems must have sufficient AI literacy. This applies across all risk classes and to every AI system. Document your training.


The checklist: 7 steps to August 2026

The following steps take you from the initial inventory to full compliance. Work through them in order.


1. Create your AI inventory

Map all AI systems in your company. Not just the obvious ones like ChatGPT or Copilot, but also embedded AI features in CRM, HR, and ERP systems. Ask every department. Shadow AI is real.

Record for each system: name, provider, purpose, affected department, responsible person, and type of data processed.
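If you track the inventory in structured form rather than a spreadsheet, one record per system can look like this sketch. The field names mirror the attributes listed above but are our own suggestion, not prescribed by the EU AI Act:

```python
from dataclasses import dataclass, field

@dataclass
class AISystemRecord:
    # Fields follow the inventory attributes above;
    # names are illustrative, not mandated by the Act.
    name: str                 # e.g. "Copilot"
    provider: str             # e.g. "Microsoft"
    purpose: str              # what the system is used for
    department: str           # affected department
    owner: str                # responsible person
    data_types: list[str] = field(default_factory=list)  # type of data processed

inventory = [
    AISystemRecord(
        name="ChatGPT",
        provider="OpenAI",
        purpose="Drafting customer emails",
        department="Sales",
        owner="J. Doe",
        data_types=["customer names", "order details"],
    ),
]
```

Even this minimal structure makes the later steps (classification, reporting) far easier than free-text notes scattered across departments.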

2. Determine risk classes

Classify every AI system into one of the four risk classes: unacceptable risk (prohibited), high risk, limited risk, minimal risk. The use case determines the class, not the technology.

Pay special attention to: AI in recruiting, credit scoring, medical diagnostics, automated decisions about people. These almost always fall under high risk (Annex III).
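Because the use case drives the class, the assessment can be expressed as a small rule set. The following is a deliberately simplified illustration with hypothetical flag names; the real Annex III categories and Art. 5 prohibitions are considerably more granular:

```python
def classify(use_case: dict) -> str:
    """Simplified rule-based risk classification (illustrative only).

    The flags below are hypothetical; a real assessment under the
    EU AI Act requires a far more detailed questionnaire.
    """
    if use_case.get("social_scoring") or use_case.get("untargeted_biometric_surveillance"):
        return "unacceptable"           # prohibited under Art. 5
    if use_case.get("annex_iii_area"):  # e.g. recruiting, credit scoring, medical diagnostics
        return "high"
    if use_case.get("interacts_with_humans") or use_case.get("generates_content"):
        return "limited"                # transparency obligations apply
    return "minimal"

print(classify({"annex_iii_area": True}))     # high
print(classify({"generates_content": True}))  # limited
```

The point of encoding the rules is traceability: the same inputs always yield the same class, and you can show an auditor why a system was classified the way it was.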

3. Assign responsibilities

Designate a person or team responsible for AI compliance. Clarify: Who registers new systems? Who runs assessments? Who reviews and approves? Without clear ownership, documentation stays incomplete.

4. Implement transparency obligations (limited risk)

For limited-risk AI systems (chatbots, image generators, text generators), transparency obligations apply: users must know they're interacting with AI, and AI-generated content must be labeled.

Specifically: Create an internal labeling policy, train employees, adapt processes for external communications.

5. Meet high-risk requirements

High-risk AI systems face the strictest requirements. This is the most demanding part of the checklist, but also the most important.

Risk management system (Art. 9): Identify, assess, and mitigate risks across the entire lifecycle.
Data quality (Art. 10): Review training and test data. Document bias risks.
Technical documentation (Art. 11): Comprehensive description of the system, its functionality, and limitations.
Record-keeping (Art. 12): Automatic logging of outputs and decisions.
Transparency (Art. 13): Clear information for operators and affected persons.
Human oversight (Art. 14): Natural persons must be able to monitor and intervene.
6. Train your employees

The AI literacy obligation (Art. 4) is already in effect. But training also matters operationally: employees need to know which AI tools they can use, what data they may input, and which outputs they need to verify. Document all training.

7. Create your compliance report and establish a review process

The end result is an exportable report: which AI systems are in use, how they've been classified, what measures have been taken. This report must be kept current. Establish a regular review cycle (e.g. quarterly) to capture changes and update your documentation.
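If your inventory and classifications already live in structured form, generating the export is trivial. A minimal sketch, assuming illustrative record fields and file names of our own choosing:

```python
import csv
import json
from datetime import date

# Illustrative records; in practice these come from your inventory.
systems = [
    {"name": "ChatGPT", "risk_class": "limited", "measures": "labeling policy, training"},
    {"name": "Recruiting screener", "risk_class": "high", "measures": "Art. 9-14 checklist"},
]

# JSON export with a generation date, so each report is versioned.
report = {"generated": date.today().isoformat(), "systems": systems}
with open("compliance_report.json", "w") as f:
    json.dump(report, f, indent=2)

# CSV export for spreadsheet-oriented reviewers or auditors.
with open("compliance_report.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["name", "risk_class", "measures"])
    writer.writeheader()
    writer.writerows(systems)
```

Regenerating the report on each review cycle, rather than hand-editing a document, is what keeps it current with minimal effort.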


Timeline: What should be done when

Timeframe | Task | Priority
Already due | Check prohibited practices, ensure AI literacy | Immediate
Now to May 2026 | AI inventory, risk classification, assign responsibilities | High
May to July 2026 | Transparency obligations, high-risk documentation, training | High
From August 2026 | Ongoing review process, keep documentation current | Continuous

Note on the Digital Omnibus: In November 2025, the EU Commission proposed postponing the high-risk deadlines to December 2027. This proposal still needs to be adopted by Parliament and Council. Our recommendation: continue planning for August 2026. A postponement is not guaranteed, and the AI literacy obligation and prohibitions are already in effect.


The most common mistakes in implementation

Common mistakes

"This doesn't apply to us" without actually checking. Almost every company uses AI today.

Classifying everything as "Minimal Risk" because it's quicker. Without a structured assessment, you risk misclassification.

Not assigning responsibilities. If nobody owns it, nothing happens.

Waiting for the Digital Omnibus. A potential delay is a bonus, not a planning basis.

Better approach

Actively ask every department. IT alone doesn't have the full picture.

Use guided questionnaires that derive risk classes through fixed rules. Traceable and repeatable.

Set up compliance as a team project. IT, DPO, business units, and management at one table.

Start now and use the head start that most competitors don't have yet.


Checklist to take away

EU AI Act Compliance Checklist

☐ Prohibited AI practices checked (Art. 5)

☐ AI literacy of employees ensured (Art. 4)

☐ Complete AI inventory created

☐ Risk class determined per system

☐ Responsible person / team assigned

☐ Transparency obligations implemented for limited-risk systems

☐ High-risk requirements met (Art. 9-14)

☐ Employees trained and training documented

☐ Compliance report exportable

☐ Review process established


The fastest way through this checklist

You can do every one of these steps manually. With spreadsheets, Word documents, and email chains. That works until it doesn't.

Or you use a solution built exactly for this process. SimpleAct walks you through every element of this checklist: register AI systems, classify them with rule-based assessments, work through compliance checklists per risk class, log all changes, and export your report. All in one place, traceable, audit-ready.

Get started for free →


This article is for general information purposes only and does not constitute legal advice. EU AI Act requirements may change through the planned Digital Omnibus package. If in doubt, we recommend seeking legal counsel. Last updated: March 2026.


About SimpleAct: SimpleAct is a German compliance platform that helps companies structurally document their AI systems in accordance with the EU AI Act. From registration to risk assessment to exportable audit reports. All in one place.

Learn more →

Tags

EU AI Act · AI Compliance · Risk Classes
Yannick Heisler

Author · SimpleAct Team · Sales and personal consulting