Civil, and also criminal, liability in artificial intelligence involves complex technical and legislative issues, and it is fertile ground for future discussion and disputes within the EU framework and worldwide. An ex ante debate is necessary for European citizens and for the stakeholders involved, now and in the future. The sensitivity and solidarity (as well as the rights and obligations) of our society must be activated now, and society must be warned against its lack of confidence and its lethargy; a society saturated with information that leaves serious gaps in the key forums of the European debate. To contribute to the analysis of whether Artificial Intelligence (AI) should be defined as an entity with legal personality, and of the liability regime that such a decision would entail, we extract and analyse elements that we consider important in the 2019 report of the Expert Group on Liability and New Technologies - New Technologies Formation, entitled Liability for Artificial Intelligence and Other Emerging Digital Technologies. We find one of its key findings particularly relevant: "It is not necessary to give devices or autonomous systems a legal personality, as the harm these may cause can and should be attributable to existing persons or bodies."