Scope & Applicability
Who is really obligated under the AI Act?
Many think the AI Act only affects AI developers. But the regulation places an entire chain of actors under obligation – from manufacturers to distributors to companies using AI. Find out which role you play.

The 5 Central Actors of the AI Act
The AI Act applies to everyone who develops, offers, or uses AI systems in the EU – including companies based outside the EU if their products are on the market here. The regulation divides responsibility among different "actors". Your role determines your obligations.
1. Provider
A provider develops an AI system or has it developed in order to place it on the market under their own name or brand. They bear primary responsibility for conformity and must ensure risk management, technical documentation, and the conformity assessment.
2. Deployer (User / Operator)
A deployer is any company that uses an AI system in the course of its professional activity (purely private use is excluded). Deployers of high-risk systems must ensure human oversight, follow the instructions for use, and monitor how the system functions.
3. Importer
An importer is an EU-based company that places an AI system from a provider established in a third country on the EU market. They must verify that the provider has conducted the conformity assessment, that the CE marking is affixed, and that technical documentation exists.
4. Distributor
A distributor is part of the supply chain and makes an AI system available on the EU market without being a provider or importer. They must exercise due diligence to verify that the system has the required CE marking and documentation.
5. Authorised Representative
An authorised representative is a person or company established in the EU, appointed in writing by a non-EU provider to perform certain obligations under the AI Act on its behalf, for example acting as a contact point for authorities.
New Obligation for All: AI Literacy
Regardless of role and risk class, the AI Regulation introduces an overarching obligation of "AI literacy". Providers and deployers of AI systems must ensure that their personnel, and others operating the systems on their behalf, have a sufficient level of AI literacy.
This is more than just technical knowledge. It includes understanding the possibilities and limitations of AI, as well as the ethical, social, and legal implications. Proactively building competence in your company is therefore essential.
Checklist: How to Classify Yourself Correctly
To prepare for the AI Regulation, you should act now. The following steps will help you gain clarity for your company:
- 1. Inventory: Identify all AI systems that are developed, purchased, or used in your company (a simple way to record such an inventory is sketched after this list).
- 2. Role clarification: Determine your role for each system according to the definitions above. Are you a provider, a deployer, or both?
- 3. Risk assessment: Classify each system according to the 4 risk classes of the AI Act.
- 4. Derive obligations: Analyze the specific requirements arising from your role and risk class.
- 5. Governance & competence: Establish internal guidelines for AI use and plan the development of AI competence in your team.
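In practice, steps 1 to 4 amount to maintaining a structured register of your AI systems. The short Python sketch below is purely illustrative and not prescribed by the AI Act; the field names and category labels are our own shorthand for the roles and risk classes described above.

```python
from dataclasses import dataclass, field
from enum import Enum


class Role(Enum):
    # Roles as described in the actor overview above (illustrative labels)
    PROVIDER = "provider"
    DEPLOYER = "deployer"
    IMPORTER = "importer"
    DISTRIBUTOR = "distributor"
    AUTHORISED_REPRESENTATIVE = "authorised representative"


class RiskClass(Enum):
    # The four risk classes of the AI Act (our own shorthand labels)
    PROHIBITED = "prohibited practice"
    HIGH_RISK = "high-risk"
    LIMITED_RISK = "limited risk"
    MINIMAL_RISK = "minimal risk"


@dataclass
class AISystemEntry:
    """One entry in a company-internal AI inventory (illustrative only)."""
    name: str
    purpose: str
    roles: list[Role]               # a company can hold several roles for one system
    risk_class: RiskClass
    obligations: list[str] = field(default_factory=list)  # e.g. "human oversight"


# Example entry: a purchased CV-screening tool used in recruiting
example = AISystemEntry(
    name="CV screening tool",
    purpose="Pre-selection of job applicants",
    roles=[Role.DEPLOYER],
    risk_class=RiskClass.HIGH_RISK,
    obligations=["human oversight", "follow instructions for use", "monitor functionality"],
)
```

A register of this kind also supports steps 4 and 5: the obligations per entry follow from the combination of role and risk class, and the complete list is a natural starting point for internal AI governance guidelines.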
Read Also
The 4 Risk Classes of the AI Act Simply Explained
The heart of the AI Act is the classification into risk classes. Understand the logic from prohibited AI to high-risk systems.
What is the AI Act? A Simple Explanation
A quick overview of the goals, scope, and key definitions of the new EU AI Regulation.