{"id":96541,"date":"2025-08-27T14:53:11","date_gmt":"2025-08-27T14:53:11","guid":{"rendered":"https:\/\/www.newsbeep.com\/uk\/96541\/"},"modified":"2025-08-27T14:53:11","modified_gmt":"2025-08-27T14:53:11","slug":"ai-literacy-is-the-next-big-compliance-challenge-for-business","status":"publish","type":"post","link":"https:\/\/www.newsbeep.com\/uk\/96541\/","title":{"rendered":"AI literacy is the next big compliance challenge for business"},"content":{"rendered":"<p>        Businesses can no longer rely on simply deploying AI responsibly; they must also prove their people understand how to use it<\/p>\n<p>Nobody reading Computing will need persuading of the growth of AI. Figures reveal 78% of global companies use AI, <a href=\"https:\/\/explodingtopics.com\/blog\/companies-using-ai\" rel=\"nofollow noopener\" target=\"_blank\">with 71% deploying GenAI<\/a> in at least one function. The spread of AI, combined with a lack of understanding of basic tech principles, is at the heart of regulatory change, with a new focus from legislators on AI literacy.<\/p>\n<p>As usage rises, so do regulatory expectations. A new and little-discussed provision of the EU AI Act &#8211; Article 4 on AI literacy &#8211; places a clear obligation on organisations to ensure that all their people (including contractors and suppliers) understand the tools they\u2019re using.<\/p>\n<p>This provision, which came into effect on 2nd February 2025, creates new compliance risks and operational challenges, especially for UK businesses trading in the EU. 
Whilst regulatory enforcement starts in August 2026, we\u2019re already seeing private litigation threatened to enforce literacy obligations.<\/p>\n<p>        AI knowledge isn\u2019t just for IT teams<\/p>\n<p>Article 4 states that users and those impacted by AI systems must have \u201csufficient AI literacy to make informed decisions.\u201d That doesn\u2019t just mean IT teams, developers or data scientists; it extends to HR staff using AI in hiring, marketing teams using GenAI, and even third-party contractors.<\/p>\n<p>It\u2019s easy for some organisations to assume the AI literacy provisions don\u2019t apply to them simply because they\u2019re not in the tech industry. However, deployers of AI systems are also included. This could catch many organisations that don\u2019t think they are dealing with AI at all.<\/p>\n<p>The European Commission published additional guidance this spring, defining AI literacy as the \u201cskills, knowledge and understanding\u201d needed to use or interact with AI systems responsibly. This includes:<\/p>\n<p>          Understanding how AI systems work and the data they use;<br \/>\n          Recognising risks such as hallucination, discrimination or bias;<br \/>\n          Knowing when and how to apply human oversight;<br \/>\n          Being aware of legal obligations under the EU AI Act and other relevant frameworks.<\/p>\n<p>The time has come for businesses to fully get to grips with AI and to train staff to prevent misuse.<\/p>\n<p>        Who do the rules apply to?<\/p>\n<p>The scope of Article 4 is broad. It applies to any organisation using AI in the EU, even if they\u2019re based elsewhere. That includes UK businesses deploying AI tools within EU operations or offering AI-enabled services into EU markets.<\/p>\n<p>Crucially, non-compliance doesn\u2019t just affect the tech team. 
If a customer service chatbot misleads users or a hiring algorithm perpetuates bias, the business could be held liable.<\/p>\n<p>As regulators sharpen their focus, the risks associated with Shadow AI are also rising. AI bans just don\u2019t work; at best they shift AI use to personal devices, where the risk of harm may be greater. According to a McKinsey study, 90% of employees use AI and 21% are heavy users. That\u2019s why staff training and clear policies are essential.<\/p>\n<p>There\u2019s a generational risk too. Digital natives are more likely to find the tools they need to do the job via social media or search. Without proper guidance, this can open organisations up to risk. Including everyone in a well-thought-out AI literacy programme can reduce misuse and strengthen compliance.<\/p>\n<p>        Consequences of non-compliance<\/p>\n<p>The AI literacy obligation came into effect on 2nd February 2025, but enforcement by national authorities across the EU begins from 3rd August 2026. EU Member States will determine their enforcement strategy and level of penalty. The European Commission has highlighted that enforcement must be proportionate and based on the individual case. Factors such as gravity, intention, and negligence should be considered.<\/p>\n<p>The European AI Office exists to provide expertise, foster innovation, and coordinate regulatory approaches &#8211; but it does not directly enforce Article 4.<\/p>\n<p>Whilst the new EU regulatory regime takes shape, the primary risk for organisations not meeting AI literacy requirements is civil action, with various pressure groups actively monitoring AI use. Complaints could also be made to GDPR regulators if the use of personal data is not lawful, fair and reasonable. 
We\u2019ve already seen complaints like this made against several social media companies and a UK business involved in a <a href=\"https:\/\/noyb.eu\/en\/bumbles-ai-icebreakers-are-mainly-breaking-eu-law\" rel=\"nofollow noopener\" target=\"_blank\">popular online dating app<\/a> which used AI for \u2018icebreakers\u2019 in initial introductions.<\/p>\n<p>        Practical steps to prepare<\/p>\n<p>All of this shows businesses cannot afford to wait until 2026 to act. National regulators are already developing plans for audits and enforcement. Practical preparation means tackling both governance and culture.<\/p>\n<p>Here are five steps for legal and compliance teams to consider:<\/p>\n<p>          Map your AI estate<br \/>Conduct a comprehensive audit of AI systems used across your business. 
Include tools used for decision-making, customer interaction, or generating content, whether built in-house or provided by a third party.<br \/>\n          Develop and deliver targeted AI literacy training<br \/>Training shouldn\u2019t be generic. It must be tailored to users\u2019 roles and risk exposure. For example, HR teams using AI in hiring must understand issues around bias, data protection, and explainability.<br \/>\n          Review contracts and third-party relationships<br \/>If vendors or service providers are using AI systems on your behalf, they may need to meet AI literacy standards too. Ensure these obligations are reflected in contracts.<br \/>\n          Create internal policies on AI use and governance<br \/>Establish clear policies on acceptable AI use, approval processes, and human review. Treat this with the same rigour as data protection or anti-bribery frameworks.<br \/>\n          Engage the board and embed a culture of responsible AI use<br \/>AI is now a board-level issue. Leadership must set expectations around responsible innovation, transparency, and compliance.<\/p>\n<p>        The bigger picture<\/p>\n<p>The introduction of Article 4 marks a clear regulatory shift. Businesses can no longer rely on simply deploying AI responsibly; they must also prove their people understand how to use it. Just as GDPR transformed how organisations handle data, the EU AI Act is reshaping how AI is implemented, monitored, and explained across the workforce. 
What was once good practice is now a legal obligation.<\/p>\n<p>Jonathan Armstrong is a Partner at <a href=\"https:\/\/puntersouthall.law\" rel=\"nofollow noopener\" target=\"_blank\">Punter Southall Law<\/a><\/p>\n","protected":false},"excerpt":{"rendered":"Businesses can no longer rely on simply deploying AI responsibly, they must also prove their people understand how&hellip;\n","protected":false},"author":2,"featured_media":96542,"comment_status":"","ping_status":"","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[21],"tags":[733,4323,1346,1215,306,86,56,54,55],"class_list":{"0":"post-96541","1":"post","2":"type-post","3":"status-publish","4":"format-standard","5":"has-post-thumbnail","7":"category-computing","8":"tag-artificial-intelligence","9":"tag-computing","10":"tag-financial-services","11":"tag-law","12":"tag-regulation","13":"tag-technology","14":"tag-uk","15":"tag-united-kingdom","16":"tag-unitedkingdom"},"_links":{"self":[{"href":"https:\/\/www.newsbeep.com\/uk\/wp-json\/wp\/v2\/posts\/96541","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.newsbeep.com\/uk\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.newsbeep.com\/uk\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.newsbeep.com\/uk\/wp-json\/wp\/v2\/users\/2"}],"replies":[{"embeddable":true,"href":"https:\/\/www.newsbeep.com\/uk\/wp-json\/wp\/v2\/comments?post=96541"}],"version-history":[{"count":0,"href":"https:\/\/www.newsbeep.com\/uk\/wp-json\/wp\/v2\/posts\/96541\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.newsbeep.com\/uk\/wp-json\/wp\/v2\/media\/96542"}],"wp:attachment":[{"href":"https:\/\/www.newsbeep.com\/uk\/wp-json\/wp\/v2\/media?parent=96541"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.newsbeep.com\/uk\/wp-json\/wp\/v2\/categories?post=96541"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.newsbeep.com\/uk
\/wp-json\/wp\/v2\/tags?post=96541"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}