FAQ: AI Literacy Obligations (EU AI Act)

As a Trooper.AI customer, you may be developing or deploying AI systems on our infrastructure. That’s why we’ve put together this FAQ to help you better understand your obligations under the EU Artificial Intelligence Act (AI Act) — particularly when it comes to AI Literacy.


What is “AI Literacy”?

AI Literacy refers to the ability of individuals to understand, interact with, and make informed decisions about AI systems. According to Article 3(56) of the EU AI Act, it includes:

  • Understanding how AI works in general,
  • Being aware of its capabilities and limitations,
  • Knowing how to use it safely, ethically, and responsibly.

Who needs to be AI literate?

Anyone involved in the development, deployment, or use of an AI system should have a basic level of AI literacy. This includes:

  • Developers and data scientists,
  • Product managers,
  • Business stakeholders,
  • End users who interact with the AI system,
  • Support staff and external contractors.

The depth of understanding required depends on the role and context of use.


I’m using Trooper.AI GPU servers to run my own model. Does this apply to me?

Yes — if you’re deploying your own AI solution, you’re considered a “deployer” under the EU AI Act. This means:

  • You are responsible for ensuring your team understands the AI system’s purpose and behavior.
  • You should provide training or documentation that promotes safe and informed usage.

Even if your system is not classified as high-risk, AI literacy is still required under Article 4 of the AI Act.


What should AI literacy training include?

While there’s no fixed curriculum, good AI literacy programs often cover:

  • What the AI system does and how it makes decisions,
  • Common risks (e.g. bias, misuse, misinterpretation),
  • Ethical considerations and human oversight,
  • Legal requirements (e.g. data protection, fairness),
  • Practical guidelines for safe and appropriate usage.

Training can take the form of internal workshops, onboarding material, e-learning modules, or documentation.


What if I only use pre-built models like GPT or Copilot?

Even when using pre-built models, your team still needs to understand:

  • That they are interacting with an AI system,
  • What the system is designed to do — and what it’s not,
  • How to recognize and respond to incorrect or harmful outputs.

If you use AI in a business context (e.g. for decision-making, analysis, or customer interactions), this awareness is part of your compliance responsibility.


Is there a deadline to comply with the AI literacy requirement?

Yes. Article 4 of the EU AI Act has applied since February 2, 2025, but enforcement by national authorities is expected to begin on August 2, 2026. That gives you time to:

  • Assess your current knowledge levels,
  • Provide appropriate training,
  • Document your efforts.

We recommend starting as soon as possible to avoid last-minute compliance issues.


Does Trooper.AI provide AI literacy training?

Trooper.AI does not offer its own AI literacy training courses, but:

  • We provide educational content and best practices in our documentation,
  • We’re happy to point you toward trusted learning resources,
  • We’re available to help you evaluate whether your current practices meet your responsibilities.

See also: AI Literacy


What if I have questions about my specific use case?

Please reach out to us! We’re happy to help clarify how the AI Act may apply to your specific deployment. Email us at:

📧 support@trooper.ai

Together, we can build powerful and responsible AI systems that benefit everyone.