{"id":896,"date":"2024-09-16T13:03:53","date_gmt":"2024-09-16T12:03:53","guid":{"rendered":"https:\/\/thethinkingmachine.uk\/?p=896"},"modified":"2024-09-16T13:05:15","modified_gmt":"2024-09-16T12:05:15","slug":"x","status":"publish","type":"post","link":"https:\/\/thethinkingmachine.uk\/?p=896","title":{"rendered":"Public Authority Automated Decision-Making Bill Doomed"},"content":{"rendered":"\n<p>Lord Clement-Jones (LibDem) has introduced a private members\u2019 bill to regulate the use of artificial intelligence algorithms and automated decision-making technologies by public authorities, citing the need to avoid another scandal like the Post Office Horizon scandal. Reflecting the EU rules, if \u201ccomputer says no\u201d to a benefit decision, immigration decision or similar, a citizen would have a right to access the information on why that happened, giving them the opportunity to challenge it. The <a href=\"https:\/\/bills.parliament.uk\/bills\/3760\">Public Authority Algorithmic and Automated Decision-Making Systems Bill<\/a> would also oblige the government to provide an independent dispute resolution service; to publish impact assessments of any automated or AI algorithms that influence decision-making processes (including a mandatory bias assessment to ensure compliance with the Equality Act and the Human Rights Act); to maintain a transparency register giving the public greater information about how systems are used; to ensure that any systems deployed by public authorities are designed with automatic logging capabilities, so that their operation can be continuously monitored and interrogated; and to prohibit the procurement of systems that are incapable of being scrutinised, including where public authorities are hampered in their monitoring efforts by contractual or technical measures or by the intellectual property interests of suppliers. 
Lord Clement-Jones claims that the Bill will be proactive, not reactive, in protecting citizens in their interactions with new technologies, but the bill fails to address many already-known risks from AI and merely extends existing computer law to AI.<\/p>\n\n\n\n<p>Parliamentarians introduce private members\u2019 bills as a mechanism to generate debate on important issues and to test opinion in Parliament, and very few private members\u2019 bills ever become law. Recent examples include the AI (Regulation) Bill and the ill-fated ten-minute rule bill for worker-centric AI, which was the starting point for the Trades Union Congress (TUC) April 2024 <a href=\"https:\/\/www.tuc.org.uk\/research-analysis\/reports\/artificial-intelligence-regulation-and-employment-rights-bill\">Artificial Intelligence (Regulation and Employment Rights) Bill proposal<\/a>. That proposal sets out a comprehensive range of new legal rights and protections to shield workers from the effects of AI, including outlawing automated decision-making on workers, something this Labour Government is likely to consider, as the unions are major contributors to Labour Party funds and have effectively bought the incoming Labour Government. It would also amend the basis of the previous Conservative government\u2019s proposals for an agile, pro-innovation AI framework of legislation.<\/p>\n\n\n\n<p>Although the new Labour government said it would \u201cseek to establish the appropriate legislation to place requirements on those working to develop the most powerful artificial intelligence models\u201d, the only AI-specific legislative measure was part of a \u201c<em>Product Safety and Metrology Bill<\/em>\u201d, which aims to respond \u201cto new product risks and opportunities to enable the UK to keep pace with technological advances, such as AI\u201d. 
(Metrology is \u201cthe science of measurement\u201d.)<\/p>\n","protected":false},"excerpt":{"rendered":"<p>Lord Clement-Jones (LibDem) has introduced a private members\u2019 bill to regulate the use of artificial intelligence algorithms and automated decision-making &hellip; <\/p>\n","protected":false},"author":1,"featured_media":888,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[1],"tags":[],"class_list":["post-896","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-uk"],"_links":{"self":[{"href":"https:\/\/thethinkingmachine.uk\/index.php?rest_route=\/wp\/v2\/posts\/896","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/thethinkingmachine.uk\/index.php?rest_route=\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/thethinkingmachine.uk\/index.php?rest_route=\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/thethinkingmachine.uk\/index.php?rest_route=\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/thethinkingmachine.uk\/index.php?rest_route=%2Fwp%2Fv2%2Fcomments&post=896"}],"version-history":[{"count":2,"href":"https:\/\/thethinkingmachine.uk\/index.php?rest_route=\/wp\/v2\/posts\/896\/revisions"}],"predecessor-version":[{"id":899,"href":"https:\/\/thethinkingmachine.uk\/index.php?rest_route=\/wp\/v2\/posts\/896\/revisions\/899"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/thethinkingmachine.uk\/index.php?rest_route=\/wp\/v2\/media\/888"}],"wp:attachment":[{"href":"https:\/\/thethinkingmachine.uk\/index.php?rest_route=%2Fwp%2Fv2%2Fmedia&parent=896"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/thethinkingmachine.uk\/index.php?rest_route=%2Fwp%2Fv2%2Fcategories&post=896"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/thethinkingmachine.uk\/index.php?rest_route=%2Fwp%2Fv2%2Ftags&post=896"}],"curies":[{"name":"wp","href":"https:\/\/api.w.or
g\/{rel}","templated":true}]}}