
Published 17-10-2024, updated 23-01-2025
Versions
- 23-01-2025 (3)
- 23-01-2025 (2)
- 17-10-2024 (1)
Keywords
- Artificial Intelligence
- Parenting

This work is licensed under the Creative Commons Attribution - NonCommercial - NoDerivatives 4.0 International license.
Abstract
With the undeniable advancement of artificial intelligence, it is natural to ask whether AI systems can, by embedding standards in their algorithmic functions, confer greater scientific rigor on evidentiary data in areas such as the assessment of parental skills. This is a particularly delicate field in which the primary interest must be that of the minor, since the machine by itself cannot perceive signals such as tone of voice, facial expressions, or the emotional reaction to a given affirmation or denial. If, on the one hand, artificial intelligence could avoid the prejudices and conditioning typical of human beings, on the other it could invalidate the decision by lacking the human element, which in certain fields of law cannot be left out. It could be useful to set up AI systems that assist the consultant and can operate on the analysis of personality, family relationships, parental skills, and the testimonial reliability of adults and minors. The CTU (the court-appointed technical consultant), having to analyze the machine's decision-making process, would find themselves comparing it with their own; this would effectively force the expert to review their reasoning and to ask whether they explored all the points that the AI deemed important for the decision or, on the contrary, gave weight to other factors that should have been less decisive. Ultimately, it would be an AI-powered decision support system.