Last updated: August 3, 2025

On August 2, 2025, a critical milestone in European AI regulation took effect: the EU AI Act’s comprehensive obligations for providers of “new” general-purpose AI models (GPAI models) became legally binding.

This represents the second major compliance deadline under the landmark European Union Artificial Intelligence Act, following the February 2025 implementation of prohibited AI practices and AI literacy requirements.

What Are General-Purpose AI Models? (GPAI Definition EU AI Act)

General-Purpose AI Model Definition: Under Article 3 of the EU AI Act, a general-purpose AI model is legally defined as “an AI model, including where such an AI model is trained with a large amount of data using self-supervision at scale, that displays significant generality and is capable of competently performing a wide range of distinct tasks regardless of the way the model is placed on the market and that can be integrated into a variety of downstream systems or applications.”

Examples of General-Purpose AI Models:

  • Large Language Models (LLMs) like GPT, Claude, Llama
  • Multimodal AI systems (text, image, audio processing)
  • Foundation models used across multiple applications
  • AI models designed for integration into various downstream systems

This definition specifically excludes AI models used solely for “research, development or prototyping activities before they are placed on the market.”


European AI Act Compliance Course: From Basics to Full Mastery

The EU AI Act is here—and compliance is now a must. This course gives you the tools to turn complex AI regulation into action. Learn the Act’s core principles, risk categories, and obligations, then put them into practice with ready-to-use templates and checklists.

€299

EU AI Act Timeline: New vs Old General-Purpose AI Models

The EU AI Act is being rolled out in stages, with different obligations taking effect over time. Understanding when specific rules apply—especially for high-risk systems and general-purpose AI models—is critical for staying ahead of compliance. Here’s a breakdown of the most important dates you need to know.

New GPAI Models (placed on market from August 2, 2025):

  • Immediate compliance required from August 2, 2025
  • Must meet all Article 53 obligations from day one
  • No grace period for new market entrants

Old GPAI Models (placed on market before August 2, 2025):

  • Extended deadline: August 2, 2027 for full compliance
  • Two-year grace period for existing models
  • Allows existing providers time to adapt operations

The key question to ask yourself is: “When was your AI model first placed on the EU market?” The answer determines your compliance timeline under the EU AI Act.
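As an illustrative sketch (not legal advice), the timeline logic above boils down to a single date comparison. The cut-off and deadline dates below are the ones described in this article; the function name is our own:

```python
from datetime import date

# Key dates from the EU AI Act GPAI timeline (illustrative only, not legal advice)
GPAI_OBLIGATIONS_START = date(2025, 8, 2)   # "new" models must comply from day one
LEGACY_FULL_COMPLIANCE = date(2027, 8, 2)   # "old" models get a two-year grace period

def article_53_deadline(placed_on_market: date) -> date:
    """Return the Article 53 compliance deadline for a GPAI model,
    based solely on when it was first placed on the EU market."""
    if placed_on_market < GPAI_OBLIGATIONS_START:
        return LEGACY_FULL_COMPLIANCE   # existing ("old") model
    return placed_on_market             # new model: immediate compliance

print(article_53_deadline(date(2024, 6, 1)))   # old model -> 2027-08-02
print(article_53_deadline(date(2025, 9, 15)))  # new model -> 2025-09-15
```

In practice, determining the exact “placed on the market” date can itself require legal analysis; the sketch assumes that date is already known.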

Four Core EU AI Act Obligations for GPAI Providers (Article 53)

Technical Documentation Requirements (Article 53(1)(a))

Mandatory Documentation:

  • Comprehensive technical documentation of the AI model
  • Training and testing process documentation
  • Evaluation results and performance metrics
  • Information specified in Annex XI of the EU AI Act
  • Must be kept up-to-date and available upon request

Regulatory Access: Documentation must be provided to:

  • AI Office (Brussels-based EU regulator)
  • National competent authorities
  • Available within reasonable timeframes upon official request

Downstream Provider Information (Article 53(1)(b))

Documentation for AI System Integrators:

  • Clear information on model capabilities and limitations
  • Technical specifications enabling downstream compliance
  • Elements specified in Annex XII of the EU AI Act
  • Balance between transparency and intellectual property protection

Purpose: Enable AI system providers to:

  • Understand model capabilities and limitations
  • Comply with their own EU AI Act obligations
  • Make informed integration decisions according to the Code of Practice for GPAI providers

Copyright Compliance Policy (Article 53(1)(c))

Legal Requirements:

  • Policy complying with EU copyright law
  • Specific compliance with Directive (EU) 2019/790
  • Identification of rights reservations in training data
  • Use of “state-of-the-art technologies” for rights identification
  • Respect for copyright and related rights throughout model development

Training Data Summary (Article 53(1)(d))

Public Transparency Requirement:

  • Detailed summary of training data content
  • Sufficiently detailed to understand data sources and composition
  • Must be made publicly available
  • Follows the template provided by the AI Office

Are you ready for the EU AI Act’s deadline?

eyreACT’s AI Act compliance platform will help organisations like yours seamlessly navigate these complex requirements. Be among the first to access our comprehensive solution for AI system classification, risk assessment, and ongoing compliance management.

Open-Source AI Models: EU AI Act Exemptions and Limitations

Partial Exemptions for Open-Source GPAI Models

Exempted from Article 53 obligations:

  • Technical documentation requirements (Article 53(1)(a))
  • Downstream provider information requirements (Article 53(1)(b))

Still required for open-source models:

  • Copyright compliance policy (Article 53(1)(c))
  • Training data summary (Article 53(1)(d))

Open-Source Qualification Criteria (Article 53(2))

Must meet ALL criteria:

  • Released under free and open-source license
  • Allow access, usage, modification, and distribution
  • Parameters publicly available (including weights)
  • Model architecture information publicly available
  • Model usage information publicly available

Critical Exception: Systemic Risk Models

No exemptions for high-compute models: Open-source general-purpose AI models with systemic risk (exceeding 10^25 FLOPs in training) face full compliance obligations regardless of open-source status.

Examples of potentially affected models:

  • Large open-source LLMs (Llama 3, Mistral Large, etc.)
  • High-parameter multimodal models
  • Any open-source model above the 10^25 FLOP threshold
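The exemption logic described in this section can be sketched as a small decision function. This is an illustrative simplification only (real qualification under Article 53(2) involves licensing and disclosure criteria that a boolean flag cannot capture), and the function and label names are our own:

```python
# Illustrative sketch of the Article 53 open-source exemption logic
# described above. Not legal advice; real qualification is more nuanced.

SYSTEMIC_RISK_FLOPS = 1e25  # training-compute threshold for systemic risk

def applicable_article_53_obligations(open_source: bool, training_flops: float) -> set:
    obligations = {
        "technical_documentation",    # Art. 53(1)(a)
        "downstream_provider_info",   # Art. 53(1)(b)
        "copyright_policy",           # Art. 53(1)(c)
        "training_data_summary",      # Art. 53(1)(d)
    }
    # Qualifying open-source models are exempt from (a) and (b) --
    # unless they cross the systemic-risk threshold, which removes
    # all exemptions regardless of licensing.
    if open_source and training_flops < SYSTEMIC_RISK_FLOPS:
        obligations -= {"technical_documentation", "downstream_provider_info"}
    return obligations
```

Note how an open-source model above the threshold receives exactly the same obligation set as a closed model, which is the “critical exception” described above.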

EU AI Act Codes of Practice for GPAI Compliance

Codes of Practice Framework

Development timeline:

  • Codes of practice deadline: May 2, 2025
  • Purpose: Provide concrete guidance for demonstrating Article 53 compliance
  • Legal status: Compliance with approved codes creates presumption of conformity

Compliance options:

  1. Follow approved codes of practice (presumption of conformity)
  2. Comply with European harmonised standards (when available)
  3. Demonstrate alternative adequate compliance (requires Commission assessment)

Key benefit: Codes of practice will translate abstract legal requirements into practical, actionable steps for GPAI providers.

Enforcement: AI Office and National Authorities

Centralised EU Enforcement

AI Office responsibilities:

  • Primary regulator for general-purpose AI models
  • Located in Brussels, Belgium
  • Centralised approach ensures consistent interpretation
  • Authority to request technical documentation
  • Oversight of systemic risk models

National competent authorities:

  • Support AI Office enforcement
  • Focus on AI systems integrating GPAI models
  • Coordinate with AI Office on compliance matters

Penalties and Consequences

Non-compliance risks:

  • Administrative fines up to 3% of annual worldwide turnover or €15 million, whichever is higher
  • Market access restrictions in EU
  • Reputational damage and regulatory scrutiny
  • Potential additional liability under national law for serious violations

Systemic Risk Models: Additional EU AI Act Obligations

10^25 FLOPs Threshold (Articles 51 and 55)

Computational threshold: General-purpose AI models exceeding 10^25 floating-point operations (FLOPs) in training computation face additional “systemic risk” obligations.

Additional requirements for systemic risk models:

  • Model evaluations and systemic risk assessments
  • Risk mitigation measures implementation
  • Cybersecurity protection requirements
  • Incident reporting obligations to AI Office
  • Red-teaming and adversarial testing
  • Monitoring of downstream use where feasible

No open-source exemptions: Even open-source models above this threshold must comply with full systemic risk obligations.
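To get a feel for the scale of the threshold, a widely used rule of thumb estimates dense-transformer training compute as roughly 6 × parameters × training tokens. This heuristic is not prescribed by the AI Act and actual classification may consider other evidence; the model size and token count below are hypothetical:

```python
# Rough training-compute estimate using the common "6 * N * D" rule of
# thumb (N = parameters, D = training tokens) for dense transformers.
# NOT a method prescribed by the AI Act; illustrative only.

SYSTEMIC_RISK_FLOPS = 1e25  # Article 51 presumption threshold

def estimated_training_flops(params: float, tokens: float) -> float:
    return 6.0 * params * tokens

# Hypothetical 70B-parameter model trained on 15 trillion tokens:
flops = estimated_training_flops(70e9, 15e12)
print(f"{flops:.2e}")                  # ~6.3e24 FLOPs
print(flops >= SYSTEMIC_RISK_FLOPS)    # False: just under the threshold
```

The example shows why the threshold bites mainly at the frontier: even a large open model of this hypothetical size sits below 10^25 FLOPs, while the next scale-up of parameters or data would cross it.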

Examples of Potentially Affected Models

  • GPT-4 class models and larger
  • Claude 3 Opus and equivalent models
  • Large multimodal models (text, image, audio)
  • Next-generation foundation models

EU AI Act Compliance Checklist for GPAI Providers

Immediate Actions (Before August 2, 2025)

Assessment phase:

  •  Determine if your AI model qualifies as “general-purpose” under Article 3 definition
  •  Calculate training computation (FLOPs) to assess systemic risk threshold
  •  Identify market placement date (before or after August 2, 2025)
  •  Evaluate open-source licensing options and qualification criteria

Documentation preparation:

  •  Develop technical documentation system (Annex XI requirements)
  •  Create downstream provider information package (Annex XII requirements)
  •  Implement copyright compliance policy and procedures
  •  Prepare comprehensive training data summary for public disclosure

Operational readiness:

  •  Monitor AI Office codes of practice development (due May 2, 2025)
  •  Establish communication channels with AI Office and national authorities
  •  Set up incident reporting procedures (for systemic risk models)
  •  Plan ongoing compliance monitoring and documentation updates

Strategic Considerations for AI Companies

Market timing decisions:

  • Consider impact of August 2, 2025 deadline on product launch schedules
  • Evaluate benefits of pre-deadline vs. post-deadline market entry
  • Assess competitive implications of compliance requirements

Open-source strategy evaluation:

  • Analyze benefits and limitations of open-source licensing for compliance
  • Consider systemic risk threshold implications for open-source models
  • Evaluate community and business model implications

Resource allocation planning:

  • Consider third-party compliance support and legal counsel
  • Budget for compliance infrastructure development
  • Plan for ongoing compliance monitoring and reporting

Book a Demo and Simplify AI Act compliance

The EU AI Act is more complex than the GDPR, but we help you nail it. From automated AI system classification to ongoing risk monitoring, we’re building the platform of developer-friendly, business-friendly tools you need to confidently deploy AI within the European regulatory framework.

Frequently Asked Questions: EU AI Act GPAI Compliance

Q: What happens if my AI model is already on the market before August 2, 2025?

A: Your model is considered “old” and has until August 2, 2027 to achieve full compliance with Article 53 obligations.

Q: Do open-source models need to comply with all requirements?

A: Open-source models are exempt from technical documentation and downstream provider information requirements, but must still implement copyright compliance policies and publish training data summaries. Models with systemic risk (>10^25 FLOPs) must comply with all requirements regardless of open-source status.

Q: How is “placed on the market” defined?

A: The EU AI Act defines placing on the market as “the first making available of an AI system or a general-purpose AI model on the Union market.” This typically occurs when a model is first offered for commercial use or integration by third parties.

Q: What are the penalties for non-compliance?

A: Administrative fines can reach up to 3% of annual worldwide turnover or €15 million, whichever is higher, along with potential market access restrictions and other regulatory consequences.

Q: When will the codes of practice be available?

A: Codes of practice must be ready by May 2, 2025, providing three months for providers to understand and implement guidance before the August 2, 2025 deadline.

Looking Forward: GPAI Compliance Evolution Globally

The August 2025 deadline represents a significant moment in AI governance: it is the first major test of how the AI Act applies to the foundation models that underpin much of today’s AI innovation. The outcome of this implementation phase will likely influence how the broader AI Act provisions are interpreted and enforced when they come into effect in August 2026, and may shape how other countries follow the EU’s lead.

Model providers should begin preparation immediately, as the technical documentation and policy requirements will take significant time to develop and implement properly. The upcoming codes of practice, expected by May 2025, will provide crucial guidance for practical compliance approaches.

This regulatory milestone reflects the EU’s commitment to establishing comprehensive AI governance while balancing innovation needs with risk management and transparency requirements.


Key Definitions

  • GPAI (General-Purpose AI): AI models designed to perform a wide range of tasks and adaptable across multiple domains without fine-tuning for a specific application.
  • Code of Practice: A voluntary set of principles and guidelines designed to help organizations meet regulatory requirements.
  • AI Act: The European Union’s comprehensive regulation on artificial intelligence, setting obligations for various categories of AI systems.
  • Systemic Risk: Refers to AI models with potential wide-scale impact, including risks to public safety or democratic processes, as outlined in Article 55.
  • Transparency: Requirements to disclose how AI models work, their capabilities, limitations, and training methods.
  • Copyright (in AI context): Legal obligations related to using copyrighted content in AI training and ensuring the rights of content creators are respected.
  • Safety and Security: Measures ensuring AI systems operate reliably and do not pose harm to users, systems, or society at large.
  • Multi-Stakeholder Process: A collaborative development approach involving representatives from industry, academia, government, and civil society.
  • Signatory: An organization that formally agrees to adhere to the Code of Practice.
  • AI Office: The EU body responsible for implementing and monitoring compliance with the AI Act, including coordination of the Code of Practice.

Important Disclaimer

We will update this article as further information or clarifications are released by the European Commission and the AI Office. As of the date above, this reflects the latest verified information available; always review any new guidance directly.
