The EU AI Act’s AI literacy obligation applied from 2 February 2025. It applies to anyone doing anything with AI where there is some connection to the EU – to providers and deployers of any AI systems.
The AI Act gives little away on what compliance should look like, though. Fortunately, the Commission’s AI Office recently provided guidance in the form of Questions & Answers, setting out its expectations on AI literacy.
The obligation
Providers and deployers of AI systems must “take measures to ensure, to their best extent, a sufficient level of AI literacy of their staff and other persons dealing with the operation and use of AI systems on their behalf” (Article 4).
Recital 20 sums up the requirement as equipping the relevant people with “the necessary notions” to make informed decisions about AI systems.
The definition also refers to making an informed deployment, as well as gaining awareness about the opportunities and risks of AI and the possible harm it can cause.
Who needs to be AI literate?
Providers, deployers, and affected persons, as well as staff and other persons dealing with the operation and use of AI systems.
The Commission confirms that this covers anyone within the provider’s / deployer’s operational remit, so it could include contractors, service providers, or clients.
What is a “sufficient” level of AI literacy?
The Commission will not impose strict (or specific) requirements, as this is context-specific.
Organisations need to tailor their approach – for example, organisations using high-risk AI systems might need “additional measures” to ensure that employees understand those risks (and, in any event, will need to comply with their Article 26 obligation to ensure staff dealing with AI systems are sufficiently trained to handle the AI system and ensure human oversight).
Where employees only use generative AI, AI literacy training is still needed on relevant risks such as hallucination.
The Commission does not plan to provide sector-specific guidance, although the context in which the AI system is provided or deployed is relevant.
For those who already have deep technical knowledge, AI literacy training may still be relevant – the organisation should consider whether they understand the risks and how to avoid or mitigate them, as well as other relevant knowledge such as the legal and ethical aspects of AI.
The Commission points to its living repository on AI literacy as a potential source of inspiration.
Is there a “human-in-the-loop” exemption?
No – in fact, AI literacy is more important for humans in the loop. To provide genuine oversight, they need to understand the AI systems they are overseeing.
What are the consequences of not complying?
Enforcement will be carried out by market surveillance authorities and can begin from 2 August 2026 (when the provisions on their enforcement powers come into force).
The Commission includes a question on whether penalties could be imposed for non-compliance occurring from 2 February 2025 once enforcement begins, but does not provide an answer, merely stating that there will be cooperation with the AI Board and all relevant authorities to ensure coherent application of the rules.
The detail on what enforcement will look like is also yet to come. The AI Act does not provide for any specific fines for non-compliance with the AI literacy obligation. In its AI Pact webinar on 20 February 2025, the Commission flagged that although Article 99 of the AI Act sets maximum penalties in other areas, it does not prevent member states from including specific penalties for non-compliance with the AI literacy obligation in their national laws. The Commission also flagged that AI literacy would likely be taken into account following a breach of another obligation under the AI Act.
The Commission also mentions the possibility of private enforcement, with individuals suing for damages – but acknowledges that the AI Act does not create a right to compensation.
Our take
The Commission does not give much away on what AI literacy programmes should look like – but, ultimately, as it highlights, what is “sufficient” will be personal to each organisation.
To shape an AI literacy programme, it will first be necessary to work through the following questions (a short illustrative sketch follows the list):
- Who are the different stakeholders involved in using AI? This needs to cover everyone – those involved in AI governance, developers, anyone involved in using AI, service providers, clients, and affected persons.
- What does each group already know, and what does each group need to know? For example, AI governance committee members may need a deeper understanding of how AI works. Data scientists may need to focus on legal and ethical issues. For employees making occasional use of generative AI, a shorter session on the risks and how the organisation manages them could be appropriate.
- What medium would be most appropriate? For example, a workshop format might work well for AI governance committee members or data scientists, while an e-learning module could be sufficient for employees making occasional use of generative AI.
- When will the training be delivered? As mentioned above, the obligation already applies.
- How will we track attendance and ensure that completion is sufficiently high?
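To make those questions concrete, here is a minimal sketch, in Python, of one way an organisation might record its training matrix and monitor completion per stakeholder group. Everything in it – the group names, topics, media, and the 90% completion target – is an illustrative assumption, not something prescribed by the AI Act or the Commission’s Q&A.

```python
# Hypothetical sketch only: modelling an AI literacy plan and tracking
# completion per stakeholder group. All names and thresholds below are
# illustrative assumptions, not requirements from the AI Act or the
# Commission's guidance.
from dataclasses import dataclass


@dataclass
class TrainingGroup:
    """One stakeholder group and the training planned for it."""
    name: str
    topics: list[str]   # what this group needs to know
    medium: str         # e.g. workshop, e-learning
    headcount: int
    completed: int = 0  # how many people have finished the training

    @property
    def completion_rate(self) -> float:
        # Guard against empty groups to avoid division by zero.
        return self.completed / self.headcount if self.headcount else 0.0


# Illustrative groups, loosely following the examples in the list above.
plan = [
    TrainingGroup("AI governance committee",
                  ["how AI works", "legal and ethical aspects"],
                  medium="workshop", headcount=8, completed=8),
    TrainingGroup("Data scientists",
                  ["legal and ethical aspects", "risk mitigation"],
                  medium="workshop", headcount=25, completed=19),
    TrainingGroup("Occasional generative AI users",
                  ["key risks (e.g. hallucination)", "internal AI policy"],
                  medium="e-learning", headcount=400, completed=312),
]

TARGET = 0.9  # assumed internal completion target, not a legal threshold

for group in plan:
    status = "OK" if group.completion_rate >= TARGET else "follow up"
    print(f"{group.name} ({group.medium}): "
          f"{group.completion_rate:.0%} complete – {status}")
```

Even a lightweight record like this makes it easier to evidence, if ever asked, who was trained on what and when – which is the practical core of demonstrating “sufficient” AI literacy measures.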
The Commission’s guidance deals with the specific AI literacy obligation under the AI Act. But in reality, AI literacy is important for all organisations using AI, regardless of whether the AI Act applies. AI literacy is essential for building a strong AI governance programme equipped to manage the range of legal and organisational risks that come with AI use.