
The European Parliament has grave concerns about the Artificial Intelligence (AI) industry advancing without regulation and is taking measures to bring it under control in the EU. Patrick Sharkey discusses. Patrick is a journalism student who specialises in politics and sport across Europe.

The EU Parliament plans to ensure fair and safe use for consumers as AI technologies advance. It aims to update EU safety and liability rules in light of AI-enabled products, require unbiased algorithms and review structures, and ensure that humans remain ultimately in control. AI, machine learning, algorithmic-based systems and automated decision-making (ADM) are now advancing at a rapid pace, and MEPs want a strong set of rights to protect consumers in the context of artificial intelligence and ADM. Parliament’s Internal Market and Consumer Protection Committee approved on Thursday a resolution addressing several of these challenges.

When consumers interact with an ADM system, they should be “properly informed about how it functions, about how to reach a human with decision-making powers, and about how the system’s decisions can be checked and corrected”, says the committee. Those systems should only use high-quality and unbiased data sets and “explainable and unbiased algorithms” in order to boost consumer trust and acceptance, the resolution states. Review structures should be set up to remedy possible mistakes in automated decisions, and it should be possible for consumers to seek a human review of, and redress for, automated decisions that are final and permanent.

“Humans must always be ultimately responsible for, and able to overrule, decisions” taken via ADM processes, especially in relation to the medical, legal and accounting professions and the banking sector, MEPs underlined. The committee also took further action to adjust safety and liability rules to the new technologies: AI-enabled products could evolve and act in ways not envisaged when they were first placed on the market, so MEPs urge the Commission to table proposals adapting the EU’s safety rules for products.

The Product Liability Directive, adopted over 30 years ago, would also need to be updated to adapt concepts such as ‘product’, ‘damage’ and ‘defect’, as well as rules governing the burden of proof.

MEPs are now calling for a risk-assessment scheme for AI and ADM and for a common EU approach, and took further action to check differentiated pricing and discrimination. Under EU law, traders must inform consumers when the price of goods or services has been personalised on the basis of ADM, MEPs recall, asking the Commission to closely monitor the implementation of those rules. It must also check how the EU regulation banning unjustified geo-blocking is applied, to ensure that ADM is not being used to discriminate against consumers based on their nationality, place of residence or temporary location.

Petra De Sutter, Chair of the Internal Market and Consumer Protection Committee, said: “Technology in the field of artificial intelligence and automated decision-making is advancing at a remarkable pace. The committee has today (23rd January) welcomed the potential of these advances, while at the same time highlighting three important issues that need to be addressed.”


Patrick Sharkey
Donegal-based student studying journalism. Passionate about politics and sport.
