EU AI Act compliance deadline: 2 August 2026

Before compliance,
you need an inventory.

Regulators don't ask “are you compliant?” first. They ask “what AI systems do you actually have?” A complete AI system inventory is the foundation of every EU AI Act assessment.

Free to use. No sign-up required. CSV format — open in Excel, Google Sheets, or Notion.

Already know what AI systems you have?

Tell us your context — we'll come back with a tailored quote. No commitment.

Request a Quote →

Why this matters

The 3 most common mistakes companies make

01

No AI system inventory

Companies ask “are we compliant?” when regulators expect something different first: “what AI systems do you actually have?” Without an inventory, compliance work becomes guesswork.

02

Assuming existing processes cover the AI Act

ISO standards, product safety rules, and software quality systems do not automatically satisfy AI Act obligations. Dataset governance, bias evaluation, and Annex IV documentation are entirely new requirements.

03

Confusing provider and deployer responsibilities

Using AI from a vendor does not transfer all compliance responsibility to the vendor. If you use AI for hiring, credit scoring, or employee monitoring, you are the deployer, and deployers have their own obligations.

The template

What an AI system inventory looks like

Five example systems — from clearly low-risk to definitely high-risk. Your inventory is the starting point for every compliance decision that follows.

AI System | Source | Purpose | Risk Category | Your Role | Data Subjects | Annex III
HR Screening AI | Third-party vendor | Rank and filter job applicants | High Risk | Deployer | Job applicants | Annex III, pt. 4 — Employment
Credit Scoring Model | Internal ML model | Evaluate loan applications and set limits | High Risk | Provider + Deployer | Customers | Annex III, pt. 5(b) — Credit
Predictive Maintenance Model | Internal ML model | Detect equipment failure before it occurs | Likely Low Risk | Provider | None | —
Customer Service Chatbot | Third-party (OpenAI) | Handle inbound support queries | Limited Risk | Deployer | Customers | —
Employee Monitoring System | Internal | Track productivity and flag anomalies | High Risk | Deployer | Employees | Annex III, pt. 4 — Employment

High Risk — full set of high-risk obligations
Limited Risk — transparency obligations only
Low Risk — no mandatory obligations
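
Prefer to build the file yourself? Below is a minimal sketch of how the same inventory might look as raw CSV. The column names mirror the example table above and are illustrative; the downloadable template may name or order them differently.

AI System,Source,Purpose,Risk Category,Your Role,Data Subjects,Annex III
HR Screening AI,Third-party vendor,Rank and filter job applicants,High Risk,Deployer,Job applicants,"Annex III, pt. 4 — Employment"
Customer Service Chatbot,Third-party (OpenAI),Handle inbound support queries,Limited Risk,Deployer,Customers,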

Know your role

Provider or deployer — or both?

The EU AI Act assigns different obligations depending on your role. Many companies are both — and don't realise it.

Provider

Develops or places the AI system on the market.

Key obligations

  • Technical documentation (Annex IV)
  • Conformity assessment
  • CE marking
  • Registration in EU database

Example: A startup that builds and sells an AI hiring tool.

Deployer

Uses the AI system in their own operations.

Key obligations

  • Human oversight mechanisms
  • Usage logs & monitoring
  • Fundamental rights impact assessment (where required)
  • Staff training

Example: An employer using an AI tool to screen CVs.

⚠️

Many companies are both. If your team built the AI system and also uses it internally — you are provider and deployer. Both sets of obligations apply.

Free resource

Start your inventory today.

The compliance deadline is 2 August 2026. The inventory is step one — and it's free. Download the template, map your systems, and you'll have a foundation that every regulator, investor, and auditor will expect to see.

Want the full diagnostic?
⚖️

LexSutra turns your inventory into a full compliance diagnostic.

Once you know what AI systems you have, the next question is: are any of them high-risk under Annex III? LexSutra runs an 80-question diagnostic across all 8 EU AI Act obligation areas and delivers a graded PDF report — colour-coded, legally cited, version-stamped — that you can hand to a regulator, investor, or board.

  • Human expert review on every report
  • Legal citations included
  • Prioritised remediation roadmap
  • Policy version-stamped
Request a quote — tell us your context