If your firm uses ChatGPT, Copilot, Harvey or any other artificial intelligence tool, the EU Artificial Intelligence Act already applies to you. And no, you can't ignore it until 2027.

The AI Act (Regulation (EU) 2024/1689) entered into force on 1 August 2024 with a staggered implementation timeline. What many firms don't realise is that Article 4, the obligation to ensure AI literacy, has applied since 2 February 2025. It's already live.

What exactly does Article 4 say?

The text is intentionally broad: providers and deployers of AI systems must take measures to ensure, to their best extent, a sufficient level of AI literacy among their staff and any other persons dealing with the operation and use of AI systems on their behalf, taking into account their technical knowledge, experience, education and training, and the context in which the systems are used.

What does "sufficient level" mean? It depends on context. The European Commission published a Q&A document in May 2025 clarifying the criteria, but the essence is this: people who use AI tools in your organisation must understand what AI is, how it works, which systems are in use, and the associated risks and opportunities.

Who does it affect?

Everyone who uses AI. Not just tech companies. If a solicitor in your firm uses ChatGPT to draft documents, or an accountant uses Copilot to analyse data, your firm is a "deployer" of AI systems under the Regulation's definition.

The Commission has confirmed that the article's reference to "other persons" covers not only employees but also contractors, service providers and even clients who use AI systems on the organisation's behalf. It's a broad interpretation.

Are there penalties for non-compliance?

There's no specific direct fine for breaching Article 4 alone. But — and this is important — supervision and enforcement by national market surveillance authorities begins on 2 August 2026.

In practice, a lack of AI training is likely to be treated as an aggravating factor in investigations for other AI Act breaches. If your firm uses a high-risk AI system and something goes wrong, and the investigation reveals that responsible staff lacked adequate training, regulators can use that gap to justify more severe penalties. We're talking fines of up to €15 million or 3% of total worldwide annual turnover, whichever is higher.

What should my firm do?

1. Audit which AI tools are being used

Map every AI tool your team uses: ChatGPT, Copilot, Harvey, transcription tools, document analysis, chatbots... Include both officially sanctioned tools and those staff may be using on their own initiative.

2. Assess training needs

Not everyone needs the same level of knowledge. A layered approach works well: basic general training for all staff, role-specific training for those who use AI daily, and technical training for those responsible for oversight.

3. Document everything

Documentation is key. Record what training has been given, to whom, when and on what topics. If a regulator ever asks, you need to be able to demonstrate that you took reasonable measures.

4. Establish usage policies

Define clear rules: which tools are authorised, what they can be used for, what information must not be entered into them (client personal data, for example) and who is responsible for overseeing their use.

5. Review periodically

AI evolves quickly. The policies and training you establish today will need updating. The European Commission has described its guidance as a "living document" that will be updated over time.
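If it helps to make steps 1 and 3 concrete, the audit and the documentation trail can be kept as a simple structured register. Here is a minimal sketch in Python; the class and field names (`AITool`, `TrainingRecord`, `permitted_uses` and so on) are illustrative, not prescribed by the Act, and a spreadsheet would serve the same purpose.

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class AITool:
    name: str                  # e.g. "ChatGPT"
    sanctioned: bool           # officially approved, or shadow use found in the audit
    permitted_uses: list[str] = field(default_factory=list)

@dataclass
class TrainingRecord:
    person: str
    topic: str                 # e.g. "AI basics", "high-risk oversight"
    delivered_on: date

@dataclass
class AIRegister:
    tools: list[AITool] = field(default_factory=list)
    training: list[TrainingRecord] = field(default_factory=list)

    def untrained(self, staff: list[str]) -> list[str]:
        """People with no recorded training at all: the gap a regulator would ask about."""
        trained = {record.person for record in self.training}
        return [person for person in staff if person not in trained]

register = AIRegister(
    tools=[AITool("ChatGPT", sanctioned=True, permitted_uses=["drafting"])],
    training=[TrainingRecord("A. Solicitor", "AI basics", date(2025, 3, 1))],
)
print(register.untrained(["A. Solicitor", "B. Accountant"]))  # ['B. Accountant']
```

The point of the `untrained` check is that the register answers the regulator's question directly: who uses which tool, who has been trained, and who hasn't.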

The broader AI Act timeline

Article 4 is just the beginning. The full timeline includes important milestones: since 2 August 2025, governance rules and obligations for general-purpose AI (GPAI) models have applied. From 2 August 2026, transparency rules and the requirements for high-risk AI systems apply. And by 2 August 2027, providers of GPAI models placed on the market before 2 August 2025 must be fully compliant.

An opportunity, not just a burden

The AI Act isn't just a regulatory obligation. Firms that get ahead of it will have a real competitive advantage: they'll be able to offer their clients the reassurance that they work with a firm that understands and complies with the world's most advanced AI regulation.

For solicitors in particular, this is doubly relevant: not only must they comply themselves, but their clients will ask them how to comply. Being prepared means being relevant.