A recent study by Dr Kristina Irion brings together three EU policy areas that intersect in the digital age: consumer protection, EU governance of AI and EU external trade. 'They are becoming so intertwined,' says Dr Irion. 'My research is breaking up the silos and drawing insights from their interactions.' According to the research, the source code clause in trade law restricts the EU's right to regulate AI policy.
Dr Irion's study concludes that the EU position on source code in international trade agreements limits the EU's ability to regulate AI in the interests of consumers. This is particularly the case in the ongoing e-commerce negotiations at the WTO. These findings are relevant for future EU regulations that would introduce algorithmic controls requiring access to the source code or the interfaces of an AI system. On 26 January, Dr Irion presented the findings of her study to members of the European Parliament and the European Commission. Several media outlets have covered the study, including Euractiv, Politico and Tagesspiegel.
Why is AI regulation necessary?
Today, algorithmic decision-making is an integral part of many digital services, such as online shopping and social media applications. It is also used in financial services. To protect consumers from harmful artificial intelligence practices, such as (price) discrimination, inaccurate consumer information and algorithmic bias, the European Commission will propose rules on AI transparency early this year.
As AI applications are often delivered across borders from outside the EU, it is not only the EU's own regulations that matter, but also the design of trade disciplines. Members of the World Trade Organisation are currently negotiating plurilateral rules on e-commerce, which also include commitments on the non-disclosure of source code. Although these commitments are mainly presented as a tool against forced technology transfer, care must be taken to ensure that they do not obstruct national and European rules on AI transparency.
Research on cross-border use of AI technologies
The Institute for Information Law was commissioned by the German consumer organisation Verbraucherzentrale Bundesverband to carry out a study on the cross-border use of AI technologies and its impact on consumer law in the EU. In the current negotiations, the EU supports the inclusion of a special clause that prohibits participating countries from adopting national measures requiring access to, or transfer of, software source code, subject to limited exceptions. This is a cause for concern: if such a clause is not carefully conditioned, it may prevent future EU legislation addressing harmful AI.
Sloppily worded source code clause restricts EU law
The study concludes that the source code clause in trade law indeed restricts the EU's right to regulate AI policy in several important ways. This conclusion is surprising given that EU trade policy documents do not refer to AI but only to e-commerce, and that no direct link has been made between the software source code clause and algorithms. The study raises an important EU policy issue that needs to be democratically examined and debated before the EU agrees to a new software source code clause in a plurilateral WTO agreement on electronic commerce.
Two recommendations
Digitalisation is leading to more and more digital artefacts consisting of software source code. AI technology can create new risks for individuals and society, while trade law remains largely static after ratification. The source code clause is too broad in relation to domestic digital policies, which must be able to build on the interoperability of systems (their ability to exchange data), accountability and the verifiability of digital technologies.
The study makes two recommendations:
- The European Commission should clarify the impact of the source code clause on the EU's digital policy, especially in the area of consumer rights. In addition, this trade law clause should be abandoned, as software source code already enjoys copyright and trade secret protection; or
- The European Commission should limit the trade law clause to situations of forced technology transfer for dishonest commercial practices, or carve out measures on algorithmic accountability from its scope. This would be prudent and provide time to develop robust domestic policy, as well as international standards for accountable AI.