Before compliance, you need an inventory.
Regulators don't ask "are you compliant?" first. They ask "what AI systems do you actually have?" A complete AI system inventory is the foundation of every EU AI Act assessment.
Free to use. No sign-up required. CSV format — open in Excel, Google Sheets, or Notion.
Already know what AI systems you have?
Tell us your context — we'll come back with a tailored quote. No commitment.
Why this matters
The 3 most common mistakes companies make
No AI system inventory
Companies ask "are we compliant?" when regulators expect something different first: "what AI systems do you actually have?" Without an inventory, compliance work becomes guesswork.
Assuming existing processes cover the AI Act
ISO standards, product safety rules, and software quality systems do not automatically satisfy AI Act obligations. Dataset governance, bias evaluation, and Annex IV documentation are entirely new requirements.
Confusing provider and deployer responsibilities
Using AI from a vendor does not transfer all compliance responsibility. If you use AI for hiring, credit scoring, or employee monitoring — you are the deployer, and deployers have their own obligations.
The template
What an AI system inventory looks like
Five example systems — from clearly low-risk to definitely high-risk. Your inventory is the starting point for every compliance decision that follows.
| System | Source | Purpose | Risk level | Your role | Affected persons | AI Act basis |
|---|---|---|---|---|---|---|
| HR Screening AI | Third-party vendor | Rank and filter job applicants | High Risk | Deployer | Job applicants | Art. 6 — Employment |
| Credit Scoring Model | Internal ML model | Evaluate loan applications and set limits | High Risk | Provider + Deployer | Customers | Art. 6 — Credit |
| Predictive Maintenance Model | Internal ML model | Detect equipment failure before it occurs | Likely Low Risk | Provider | None | — |
| Customer Service Chatbot | Third-party (OpenAI) | Handle inbound support queries | Limited Risk | Deployer | Customers | — |
| Employee Monitoring System | Internal | Track productivity and flag anomalies | High Risk | Deployer | Employees | Art. 6 — Employment |
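Since the template ships as a CSV, the five example systems above can be generated with a few lines of Python. This is a sketch only: the column names and the `write_inventory` helper are our own illustration and may differ from the downloadable template's exact headers.

```python
import csv

# Illustrative column names -- the actual template headers may differ.
COLUMNS = ["system", "source", "purpose", "risk_level",
           "role", "affected_persons", "ai_act_basis"]

# The five example systems from the table above.
SYSTEMS = [
    ["HR Screening AI", "Third-party vendor", "Rank and filter job applicants",
     "High Risk", "Deployer", "Job applicants", "Art. 6 - Employment"],
    ["Credit Scoring Model", "Internal ML model", "Evaluate loan applications and set limits",
     "High Risk", "Provider + Deployer", "Customers", "Art. 6 - Credit"],
    ["Predictive Maintenance Model", "Internal ML model", "Detect equipment failure before it occurs",
     "Likely Low Risk", "Provider", "None", "-"],
    ["Customer Service Chatbot", "Third-party (OpenAI)", "Handle inbound support queries",
     "Limited Risk", "Deployer", "Customers", "-"],
    ["Employee Monitoring System", "Internal", "Track productivity and flag anomalies",
     "High Risk", "Deployer", "Employees", "Art. 6 - Employment"],
]

def write_inventory(path: str) -> None:
    """Write the example AI system inventory to a CSV file."""
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(COLUMNS)
        writer.writerows(SYSTEMS)

if __name__ == "__main__":
    write_inventory("ai_inventory.csv")
```

The resulting file opens directly in Excel, Google Sheets, or Notion; one row per system is the granularity every later compliance decision hangs on.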
Know your role
Provider or deployer — or both?
The EU AI Act assigns different obligations depending on your role. Many companies are both — and don't realise it.
Provider: develops or places the AI system on the market.
Key obligations
- Technical documentation (Annex IV)
- Conformity assessment
- CE marking
- Registration in EU database
Example: A startup that builds and sells an AI hiring tool.
Deployer: uses the AI system in their own operations.
Key obligations
- Human oversight mechanisms
- Usage logs & monitoring
- Fundamental rights impact assessment
- Staff training
Example: An employer using an AI tool to screen CVs.
Many companies are both. If your team built the AI system and also uses it internally — you are provider and deployer. Both sets of obligations apply.
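The role logic above reduces to two independent yes/no questions. As a sketch (the `roles` function and `OBLIGATIONS` mapping are our own illustration, with the obligation labels taken from the lists above, not an exhaustive legal list):

```python
# Headline obligations per role, as summarised above -- illustrative, not exhaustive.
OBLIGATIONS = {
    "provider": [
        "Technical documentation (Annex IV)",
        "Conformity assessment",
        "CE marking",
        "Registration in EU database",
    ],
    "deployer": [
        "Human oversight mechanisms",
        "Usage logs & monitoring",
        "Fundamental rights impact assessment",
        "Staff training",
    ],
}

def roles(built_it: bool, uses_it: bool) -> list[str]:
    """Return the EU AI Act roles a company holds for one system."""
    result = []
    if built_it:
        result.append("provider")  # develops or places the system on the market
    if uses_it:
        result.append("deployer")  # uses the system in its own operations
    return result

# A team that built an AI hiring tool and screens its own CVs with it:
both = roles(built_it=True, uses_it=True)  # ["provider", "deployer"]
```

Note that the two questions are independent: answering yes to both means both obligation sets apply in full, which is exactly the trap many companies miss.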
Start your inventory today.
The compliance deadline is 2 August 2026. The inventory is step one — and it's free. Download the template, map your systems, and you'll have a foundation that every regulator, investor, and auditor will expect to see.
LexSutra turns your inventory into a full compliance diagnostic.
Once you know what AI systems you have, the next question is: are any of them high-risk under Annex III? LexSutra runs an 80-question diagnostic across all 8 EU AI Act obligation areas and delivers a graded PDF report — colour-coded, legally cited, version-stamped — that you can hand to a regulator, investor, or board.