Over the next 18 months, three forces collide: the EU AI Act’s model transparency duties, the EU Cyber Resilience Act vulnerability reporting start date (11 September 2026), and the operationalisation of ISO 42001 and the NIST AI RMF across enterprises. The connective tissue is an AI Bill of Materials (AI BOM), a structured inventory of models, datasets, training lineage, dependencies, and risks. If you are building, buying, or deploying AI, you will soon be asked to show it. Here is what to know and how to stand up an AI BOM in 90 days.
An AI BOM is a machine-readable inventory that lists every dataset, model, and software component used to build and operate an AI system, including versions, sources, training and evaluation artefacts, and the relationships between them. Think of it as an SBOM plus for AI: it captures model lineage and data provenance, not just code libraries.
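To make that concrete, here is a minimal sketch of what one AI BOM record might capture, using plain Python dataclasses and JSON. The field names and example values are illustrative only, not drawn from any formal schema (the OWASP and SPDX work is still evolving).

```python
from dataclasses import dataclass, field, asdict
import json


@dataclass
class AIBOMComponent:
    name: str                      # a model, dataset, or library
    version: str                   # pinned version or content hash
    component_type: str            # "model" | "dataset" | "library"
    source: str                    # where it came from (registry, vendor, internal)
    licence: str                   # licence or usage terms
    relationships: dict = field(default_factory=dict)         # lineage links
    evaluation_artefacts: list = field(default_factory=list)  # links to eval reports
    known_risks: list = field(default_factory=list)           # recorded risks and mitigations


# Example entry: a fine-tuned model, its base model, and its training dataset.
support_model = AIBOMComponent(
    name="support-assistant",
    version="2.3.0",
    component_type="model",
    source="internal-registry",
    licence="proprietary",
    relationships={
        "fine_tuned_from": ["open-base-model:7b-v1.2"],
        "trained_on": ["support-tickets-2024q4"],
    },
    evaluation_artefacts=["evals/support-assistant-2.3.0/report.html"],
    known_risks=["PII exposure in training data (mitigated by redaction step)"],
)

# Serialise to machine-readable JSON for audit and tooling.
print(json.dumps(asdict(support_model), indent=2))
```

The point is not the particular schema but that every asset is pinned, sourced, and linked to its lineage in a form other tools can consume.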
The concept has matured quickly. OWASP runs an AI BOM project with guidance and a generator initiative under its GenAI Security workstream. Security leaders and vendors are publishing implementation guides, and researchers have proposed extending the SPDX SBOM standard to cover AI supply chains.
Why it matters: without a transparent record of training data, fine-tuning steps, embedded third-party models, and evaluation results, you cannot credibly attest to safety, IP provenance, or bias mitigation, nor respond quickly when a dependency becomes vulnerable.
The EU AI Office’s GPAI Code of Practice (published 10 July 2025) provides a Model Documentation Form for transparency obligations under Article 53 and sets out safety and security expectations for systemic risk models under Article 55. Obligations for new GPAI models began in August 2025, with enforcement powers beginning in August 2026 and legacy model compliance required by August 2027. An AI BOM operationalises this documentation at scale.
Requirements for high-risk AI systems under the AI Act become applicable on 2 August 2026.
From 11 September 2026, manufacturers must report actively exploited vulnerabilities and severe security incidents for all products with digital elements, via the ENISA single reporting platform that becomes operational for this purpose. AI BOMs tie model and dataset provenance into SBOM-style vulnerability processes, which becomes essential once these reporting obligations begin.
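A hypothetical triage sketch shows the mechanics: given an advisory naming an affected component and versions, walk the AI BOM and list every AI system that embeds it. The BOM layout and advisory format here are assumptions for illustration, not an official CRA or ENISA schema.

```python
from typing import Iterable

# A flattened view of the AI BOM: which system uses which component at which version.
ai_bom = [
    {"system": "support-assistant", "component": "vector-db-client", "version": "1.4.2"},
    {"system": "support-assistant", "component": "open-base-model", "version": "7b-v1.2"},
    {"system": "claims-triage", "component": "vector-db-client", "version": "1.5.0"},
]

# Incoming advisory for an actively exploited vulnerability.
advisory = {"component": "vector-db-client", "affected_versions": {"1.4.2", "1.4.3"}}


def impacted_systems(bom: Iterable[dict], advisory: dict) -> set[str]:
    """Return the AI systems that include an affected component version."""
    return {
        entry["system"]
        for entry in bom
        if entry["component"] == advisory["component"]
        and entry["version"] in advisory["affected_versions"]
    }


# Feeds the internal decision on whether a report or customer notice is due.
print(impacted_systems(ai_bom, advisory))  # {'support-assistant'}
```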
ISO 42001 provides a management system for AI. The NIST AI RMF and its Generative AI Profile (July 2024) provide risk based controls for GenAI. An AI BOM becomes the evidence spine linking these frameworks to concrete assets, versions, and documentation.
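One way to picture that evidence spine, sketched below under assumed conventions: each BOM component points to the framework clauses it evidences and the documentation behind the claim, and anything without documentation is a visible gap. The clause labels are placeholders, not quotations from ISO 42001 or the NIST AI RMF.

```python
# Map BOM components to the framework references they evidence.
evidence_map = {
    "support-assistant:2.3.0": {
        "framework_refs": ["ISO42001:data-management", "NIST-AI-RMF:MAP"],
        "documentation": ["model-card.md", "evals/report.html"],
    },
    "support-tickets-2024q4": {
        "framework_refs": ["ISO42001:data-management"],
        "documentation": [],  # gap: dataset has no provenance record yet
    },
}

# Surface components that cannot yet back an audit claim.
gaps = [asset for asset, record in evidence_map.items() if not record["documentation"]]
print("Components missing evidence:", gaps)
```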
A robust AI BOM includes:
- model identities, versions, and lineage, including base models and fine-tuning steps
- datasets with sources, licences, and provenance
- embedded third-party models and software dependencies, with versions
- training and evaluation artefacts and results
- known risks and their mitigations
- the relationships between all of the above
There is active experimentation with extending the SPDX standard for AI artefacts.
Financial-sector entities that have had to comply with DORA since 17 January 2025 can strengthen their incident reporting and third-party oversight with AI BOM evidence. NIS2 enforcement, rolling out between October 2024 and 2026 depending on the member state, similarly benefits from explicit AI dependency mapping.
Is AI BOM a formal standard?
Not yet. OWASP is leading practical work and researchers have proposed SPDX extensions.
How does it relate to model cards?
Model cards are human-readable summaries written for people. AI BOMs are machine-readable inventories that feed security, audit, and compliance tooling, so the two complement rather than replace each other.
Do I need an AI BOM if I only deploy vendor models?
Yes. You still control configuration, guardrails, datasets, and risk mitigations. Request vendor documentation aligned to the GPAI Code and an AI BOM excerpt.
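As a sketch of how that fits together, a deployer can layer its own additions (configuration, guardrails, retrieval datasets, mitigations) over a vendor's excerpt. The structure and keys below are assumptions for illustration only.

```python
# Vendor-supplied excerpt of their AI BOM / model documentation.
vendor_excerpt = {
    "model": "vendor-llm",
    "version": "4.1",
    "training_data_summary": "see vendor Model Documentation Form",
}

# The deployer's own layer: what you control and must evidence yourself.
deployer_overlay = {
    "system": "claims-triage",
    "guardrails": ["prompt-filter:0.9.1"],
    "retrieval_datasets": ["policy-docs-2025-06"],
    "risk_mitigations": ["human review of denials"],
}

# The combined record is what you would actually be asked to show.
combined_bom_entry = {"vendor": vendor_excerpt, "deployment": deployer_overlay}
print(combined_bom_entry)
```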