<p>DRCF Responsible AI Forum 2026: What Businesses Need To Know (Part 2) – New Technology</p>
<p>By Lee Ramsay, Lewis Silkin</p>
<p>Somewhere on a leafy British road, the sign pictured above sits in quiet absurdity. Drivers read it, ponder its existence and accelerate away none the wiser. It's a sign about a sign that isn't in use.</p>
<p>But all is not as strange as it first seems. Typically, the sign is a placeholder indicating that the variable message sign on a nearby gantry isn't working yet or is being tested.</p>
<p>AI regulation in the UK finds itself in similar territory: there are signs that things are afoot, but concrete signposting seems some way off. The <a href="https://www.ft.com/content/e759a712-eddf-4bdd-b4d9-03446f8c6545?syn-25a6b1a6=1" target="_blank" rel="nofollow noopener">Financial Times</a> recently reported that an AI Bill looks unlikely in the next King's Speech, which is due to take place in mid-May.</p>
<p>Put simply, the UK is still in testing mode on AI regulation, trying to work out where it wants to go and how it wants to get there.</p>
<p>This is why the <a href="https://www.drcf.org.uk/" target="_blank" rel="nofollow noopener">Digital Regulation Cooperation Forum</a> (DRCF) has assumed such importance in shaping Britain's current approach to AI. On 10 March 2026, the forum convened its second Responsible AI Forum, at which all those involved in AI could better understand the who, what, when, where, why and how of UK AI rules in 2026.</p>
<p>What follows is our second article on the practical lessons for senior leadership teams and general counsel who need to understand (and thrive in) the shifting world of AI regulation.</p>
<p>What is the DRCF?</p>
<p>The DRCF launched in 2020 as a voluntary forum to ensure greater cooperation between regulators, given the unique challenges posed by the regulation of online platforms. It brings together four UK regulators with responsibilities for digital oversight: the Competition and Markets Authority; the Financial Conduct Authority; the Information Commissioner's Office; and Ofcom.</p>
<p>Agentic AI</p>
<p>With the recent publication of the <a href="https://ico.org.uk/about-the-ico/research-reports-impact-and-evaluation/research-and-reports/technology-and-innovation/tech-horizons-and-ico-tech-futures/ico-tech-futures-agentic-ai/" target="_blank" rel="nofollow noopener">ICO's Tech Futures report on Agentic AI</a>, we have some insight into the regulatory thinking around this emerging technology. While the ICO's work focuses on the data protection implications of deploying AI, it was interesting to hear from Ofcom, UKAI and Responsible Intelligence about trends and developments in agentic AI.</p>
<p>Accountability in the agentic AI supply chain is one issue senior leaders will be pondering before embarking on any projects. There clearly needs to be discussion around where boundaries lie within the agentic ecosystem and the web of relationships this emerging technology inevitably brings. If something were to go wrong – and there are already many examples to choose from – how will liability be assessed and determined, and how will this reflect on your brand's reputation as well as the bottom line?</p>
<p>Place agentic AI in the consumer context and you significantly up the ante. Transparency and explainability will be key to compliance, but also to ensuring consumers understand exactly what they are signing up to and how they will be protected in line with existing laws and guidance. We were directed to several publications on agentic AI recently released by the CMA which provide insight into the regulator's thinking.</p>
<p>While agentic AI presents opportunities, senior leaders need to step back and take time to make informed decisions. It is important to experiment, but also to build in pilot phases, extensive testing and feedback loops before launching such a product.</p>
<p>It is imperative to protect consumers from harm, whether caused directly or indirectly, and to avoid potential exploitation, particularly of vulnerable consumers who may not understand the possible outcomes of using agentic AI. Competition must also exist across the whole AI stack, so that consumers can give clear consent to transparent decision-making made with their best interests at heart.</p>
<p>If businesses don't get this right, they will likely find themselves falling foul of existing laws and facing accusations of manipulation, price discrimination and a lack of competition in the goods and services offered.</p>
<p>Consumer trust is paramount, and at the moment trustworthiness in AI, let alone agentic AI, is a huge issue. Many people are aware of fraud and misinformation, but when it comes to agentic AI, unless there is transparency and explainability, as well as the right to challenge decisions, it seems uptake will not match the hype. That said, responsible development with safeguards in place, allowing for genuine, freely given, informed consent, might see a very different landscape in 12 months' time.</p>
<p>Children's wellbeing</p>
<p>Against the backdrop of the government's consultation on UK children's digital wellbeing, covering social media age bans, curfews, AI chatbots and gaming (for more, see our article <a href="https://www.lewissilkin.com/insights/2026/03/02/uk-government-launches-much-anticipated-consultation-on-childrens-online-wellbei-102mlli" target="_blank" rel="nofollow noopener">here</a>), the DRCF hosted a session on growing up with chatbots.</p>
<p>Depending on your point of view, the expert panel shared some fascinating/terrifying statistics: one third of UK teenagers use a chatbot for an emotional relationship, 56% of UK teenagers believe AI can think, 23% believe AI can feel emotions and 40% have no concerns about taking advice from an AI chatbot.</p>
<p>Children exhibit a high level of trust in chatbots, often blurring the boundary between what is a chatbot and what is a friend. While LLMs are improving all the time, they don't have empathy – they are merely learning how to respond to emotional questions, and we know they don't always get the response right.</p>
<p>This is why the ICO's <a href="https://ico.org.uk/for-organisations/uk-gdpr-guidance-and-resources/childrens-information/childrens-code-guidance-and-resources/age-appropriate-design-a-code-of-practice-for-online-services/" target="_blank" rel="nofollow noopener">Age Appropriate Design Code</a> is so important, and for those exploring AI tools that are in scope, it is essential to invest time and resources in getting this right. Media and regulatory scrutiny in this area is at an all-time high globally, and no one wants to be making headlines for the wrong reasons.</p>
<p>Well-designed chatbots that don't process children's special category data, have safety-by-design features enabled, and are segmented to suit stages of cognitive development do have a positive role to play in children's lives.</p>
<p>The discussion concluded that children and parents need to upskill to understand the technology children are being exposed to and to have informed conversations about its use. Companies operating in this space need to carefully consider their legal, regulatory and ethical obligations and ensure they always put the best interests of the child first. This raises nuanced questions about revenue models, creating dependency, doom-scrolling and so on – but that, as they say, is for another day!</p>
<p>UK Government AI roadmap</p>
<p>To round off the day, Mary Jones, Director of AI Strategy and Preparedness at DSIT, provided an update on the UK government's AI Strategy. It may come as a surprise that 75% of the <a href="https://www.gov.uk/government/publications/ai-opportunities-action-plan/ai-opportunities-action-plan" target="_blank" rel="nofollow noopener">AI Opportunities Action Plan</a> is complete one year on, with 38 out of the 50 recommendations met.</p>
<p>The UK government is clear that this is the foundation on which it can now build by "looking up and looking out" to get AI working across the economy and ensure responsible but faster adoption. It is hoped that the five established AI Growth Zones will be key to unlocking private investment, driving job creation and building the required data centre capacity. An additional £500 million of funding to back UK AI companies is also in place.</p>
<p>While work is underway to upskill workers, it is clear that the target of 10 million by 2030 is an ambitious one. It is hoped the appointment of sectoral industry champions will provide impetus across various industries and make real progress towards this goal.</p>
<p>A new Future of Work unit has also been established to ensure responsible AI adoption while monitoring disruption and the impact of AI on the labour market. It is also tasked with ensuring AI boosts jobs and growth while helping workers to upskill and adapt. A wide remit, and one to watch with interest!</p>
<p>Responsible AI: a quick reminder</p>
<p>The UK hasn't created a single AI regulator. Instead, it's asking existing bodies (the FCA, Ofcom, the ICO and others) to police AI within their own patches.</p>
<p>Five principles currently sit at the heart of this approach:</p>
<ul>
<li>safety, security and robustness;</li>
<li>transparency and explainability;</li>
<li>fairness;</li>
<li>accountability and governance; and</li>
<li>contestability and redress.</li>
</ul>
<p>Each regulator interprets the principles for its own remit.</p>
<p>So what?</p>
<p>While the DRCF highlights the importance of working collaboratively with other digital regulators and industry, there is still an air of waiting, wondering and the unknown. The sign isn't absurd; it's there as a placeholder, which is very much what the current state of play seems to reflect when it comes to AI in the UK.</p>
<p>Everyone is navigating this new world together, and while there are competitive pressures to realise the benefits of AI, it is essential to heed the placeholder: have your strategy agreed, governance in place and appropriate use cases for whichever AI tools you are using.</p>
<p>The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.</p>