GUIDELINES ON GPAI UNDER THE AI ACT

Introduction  

With the adoption of Regulation (EU) 2024/1689 (the “AI Act”), the European Union has introduced a horizontal framework governing the entire life‑cycle of artificial‑intelligence systems. Article 96(1) of the AI Act requires the Commission to issue guidelines clarifying the practical application of the Regulation (the “Guidelines”). The Guidelines cover four core areas: the definition of a general‑purpose AI (“GPAI”) model (Section 2); the identification of the provider placing a GPAI model on the market (Section 3); the exemption conditions for certain models released under an open‑source license (Section 4); and enforcement methods and compliance timelines (Section 5).

When does a model qualify as GPAI?

A model is generally considered GPAI if:

  • the training compute exceeds 10²³ FLOP (floating‑point operations);
  • it is capable of generating language (text or audio) or visual content (text‑to‑image or text‑to‑video);
  • it displays significant generality, i.e. it can competently perform a broad range of distinct tasks.

A model that surpasses the 10²³ FLOP threshold but is confined to a single purpose (e.g. pure speech‑to‑text transcription, image upscaling, inpainting, etc.) may not fall within the GPAI category, absent the required generality.
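
By way of illustration only, the indicative criteria above can be expressed as a simple check. The following is a minimal Python sketch, not a legal test: the function name, parameters and constant are our own simplification, and the actual qualification remains a case‑by‑case assessment.

  # Hypothetical helper mirroring the indicative criteria above; not a legal test.
  GPAI_COMPUTE_THRESHOLD = 1e23  # training compute, in FLOP

  def is_gpai(training_flop: float,
              generates_language_or_visual_content: bool,
              displays_significant_generality: bool) -> bool:
      # All three indicative criteria must be satisfied together.
      return (training_flop > GPAI_COMPUTE_THRESHOLD
              and generates_language_or_visual_content
              and displays_significant_generality)

  # A 2e23-FLOP speech-to-text-only model exceeds the compute threshold but
  # lacks the required generality, so it would not qualify.
  print(is_gpai(2e23, True, displays_significant_generality=False))  # False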

The GPAI life-cycle and related obligations

The life‑cycle of a GPAI model begins with the first large pre‑training run. From that moment, the provider must comply with the obligations laid down by the AI Act, including:

  • preparing and keeping up to date the technical documentation (Art. 53(1)(a)‑(b) AI Act);
  • adopting an EU copyright‑compliance policy (Art. 53(1)(c));
  • publishing a public summary of the training data (Art. 53(1)(d)); and
  • for GPAI models with systemic risk (training compute above 10²⁵ FLOP, or designation by the Commission), carrying out continuous risk assessment and mitigation and reporting serious incidents (Art. 55).

These measures must span the entire life‑cycle of the model, including any subsequent retraining, warm‑restart or fine‑tuning stages after it is placed on the market.

Who is the provider (and when does one become it)?

A provider is the person who develops, or has developed, a GPAI model and places it on the Union market for the first time, whether for payment or free of charge. Placement can occur, inter alia, via:

  • software libraries, packages or public repositories (e.g. direct download);
  • APIs or cloud services;
  • on‑premise or edge distribution; or
  • integration in chatbots, mobile apps or core internal services delivered to third parties in the EU.

Where a GPAI model is developed by multiple natural persons or legal entities, the provider is deemed to be the actor that assumes overall leadership of the project, i.e. the project coordinator. The Commission nonetheless retains a degree of flexibility, stressing that such scenarios, undoubtedly among the most debated and delicate, must be examined on a case‑by‑case basis. In every instance, the assessment must be anchored in the definition of “placing on the market” set out in the Blue Guide.

The AI “production line” typically involves a network of actors who contribute not only to the development but also to the availability of the software. Paragraph 59 of the Guidelines is therefore crucial: “The upstream actor should then be considered the provider of the model, unless the upstream actor has excluded, in a clear and unequivocal way, the distribution and use of the model on the Union market, including its integration into AI systems that are intended to be placed on the Union market or put into service in the Union”.

Where a GPAI model is embedded in an AI system, which thereby itself becomes general‑purpose, it is necessary to distinguish between:

  • the original provider of the model, responsible for GPAI‑specific obligations; and
  • any downstream modifier who introduces significant changes to the model.

A strong indicator that the modifier should be considered a provider is that the training compute used for the modification exceeds one‑third of the compute used to train the original model. Where that value is unknown, the one‑third threshold is applied instead to the following reference values (see the sketch after this list):

  • 10²⁵ FLOP if the base model already has systemic risk; or
  • 10²³ FLOP in all other cases.
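
Purely as an illustration, the one‑third indicator can be sketched as follows in Python; the names, the fallback logic and the simple boolean outcome are our own reading, and the Guidelines treat this as an indicator rather than a bright‑line rule.

  # Hypothetical sketch of the one-third compute indicator; values in FLOP.
  SYSTEMIC_RISK_THRESHOLD = 1e25  # presumption of systemic risk
  GPAI_THRESHOLD = 1e23           # indicative GPAI threshold

  def modifier_is_provider(modification_flop: float,
                           original_flop: float | None,
                           base_has_systemic_risk: bool) -> bool:
      # Use the original model's training compute when known; otherwise fall
      # back to the relevant regulatory threshold as the reference value.
      if original_flop is None:
          original_flop = (SYSTEMIC_RISK_THRESHOLD if base_has_systemic_risk
                           else GPAI_THRESHOLD)
      return modification_flop > original_flop / 3

  # Fine-tuning with 5e22 FLOP on a base model of unknown compute and no
  # systemic risk: 5e22 > 1e23 / 3 ≈ 3.3e22, so the indicator is met.
  print(modifier_is_provider(5e22, None, base_has_systemic_risk=False))  # True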

Open-source exemption: key conditions

Providers that release the model under a free and open‑source license and make publicly available the parameters (including the weights), the model architecture and usage information are exempt from certain obligations, namely:

  • drafting/updating technical documentation for the AI Office and for downstream actors (Art. 53(1)(a)‑(b)); and
  • where applicable, appointing an authorized representative in the EU in accordance with Art. 54.

The exemption does not apply if the model:

  • is classified as GPAI with systemic risk;
  • is distributed under licenses restricting use (e.g. research‑only or non‑commercial terms) or imposing fees or personal‑data collection as a condition of access/use; or
  • fails to publish the complete technical information required for integration.

Even where the exemption applies, the provider must still comply with the remaining applicable provisions, including the adoption of a copyright‑compliance policy and the publication of the training‑data summary.

Lexify as Your Legal Advisor

The Guidelines provide an operational roadmap for all stakeholders in the AI value chain, clarifying qualification criteria, scope of obligations and possible exemptions. Lexify continuously monitors regulatory developments and assists European and international providers in launching and operating GPAI models in compliance with the AI Act. For further information or support, our legal team is at your disposal.

