In a White Paper dedicated to the subject, the European Union stresses the need to develop ethical artificial intelligence. But some experts and industry players are already concerned about this strategy.
The European Union has placed a great deal of emphasis on ethics in its White Paper on artificial intelligence, a laudable ambition, but one that will not be easy to put into practice, experts warn.
The White Paper, published on 19 February, first stresses the importance of respecting the fundamental rights of citizens and warns, for example, of distortions in recruitment algorithms leading to discrimination.
Brussels recommends that future high-risk artificial intelligence systems (in health, for example) be certified, tested and monitored, as cars, cosmetics and toys are.
But some experts and industry players are already concerned about this emphasis on values and ethics.
“We spend so much energy setting up frameworks instead of moving forward,” the head of a successful French artificial intelligence startup, which holds many public contracts and prefers to remain anonymous, said last week.
“Ethics is used by the powers that be, large companies and institutions alike, as a way of talking about artificial intelligence without actually doing anything about it. It is all the more damaging because in the meantime, the United States, Canada and China are moving forward,” he explains.
Instead of big ambitious plans, “we could already be advancing the use of artificial intelligence for predictive maintenance of aircraft engines” (using AI to predict failures and intervene before they occur), which poses few ethical problems, he says.
VALUES, A STRATEGIC WEAPON?
Theodoros Evgeniou, a professor at the prestigious INSEAD business school, has issued an opinion with several other European and American researchers warning of the risks of a European approach that is too focused on its values.
“The spirit of the White Paper seems to be that Europe should use its values as a strategic weapon” against China and the United States, to make itself attractive in the global artificial intelligence race, he says. But “why would non-European countries prefer artificial intelligences trained on European values? I am not at all sure that it works,” he explains.
For example, the White Paper presents “explainability” as a cardinal value: it must be possible to understand exactly why an artificial intelligence system arrives at a given conclusion, such as refusing someone a loan.
For his part, Guillaume Avrin, head of artificial intelligence at the Laboratoire national de métrologie et d'essais (LNE), does not call into question the ethical requirements of the White Paper. But he regrets that the Commission says little about the means of verifying that artificial intelligences actually comply with those values.
“How do we ensure that artificial intelligence systems comply” with regulations, when they are often evolving and adaptive, with “non-linear” or even “chaotic” behaviours, he explains.
For example, “we still don't know how to evaluate the level of explainability of an artificial intelligence system,” he stresses.
“The question is not addressed in any detail; the White Paper merely mentions significant funding for a testing centre at the European Union level,” he regrets.
Laurence Devillers, an artificial intelligence researcher at the CNRS, also acknowledges that the Commission remains very discreet for the time being on the concrete modalities of “ethical” artificial intelligence.
But given the stakes, it was important to raise this issue of values loudly and clearly, says the researcher, who has just published “Emotional Robots: Health, Surveillance, Sexuality… and Where Does Ethics Fit In?” with Éditions de l'Observatoire.
“Yes, the challenge is to be creative and to build a positive and ethical economy. Europe has the resources to do that,” she says. “What will all those who protest today say when their child explains to them tomorrow that his job application has been rejected” by an intelligent machine, she asks.