Attitudes and perceptions regarding algorithmic judicial judgement: barriers to innovation in the judicial system?

Abstract
This study aims to serve as a starting point for research into attitudes and perceptions regarding the use of algorithmic tools in the judicial system, and it explores possible barriers to innovation. The results reveal significant differences in the acceptance of algorithmic tools for judicial analysis between the general population, experts in data analytics and artificial intelligence (AI), and legal professionals. In addition, a legal background was negatively correlated with acceptance, indicating a more cautious and reserved stance among legal professionals towards the integration of such tools in the criminal justice domain, potentially rooted in concerns about objectivity, fairness and the preservation of legal principles in judicial processes. The study also shows that acceptance is influenced by the complexity of the tasks being automated, which underlines the importance of considering the level of automation and the degree of human intervention involved in the use of these tools. In short, the study highlights the importance of taking into account the views of society and, more specifically, of legal practitioners in order to encourage the adoption and effective implementation of algorithmic tools in the judicial sphere.
Article Details

This work is licensed under a Creative Commons Attribution-NoDerivatives 3.0 Unported License.
(c) Sandra Pérez Domínguez, Pere Simón Castellano, 2023
Copyright
Contents published in IDP are subject to a Creative Commons Attribution-No Derivative Works 3.0 Spain licence, the full text of which can be consulted at http://creativecommons.org/licenses/by-nd/3.0/es/deed.en.
Thus, they may be copied, distributed and broadcast provided that the author and IDP are cited, as shown in the recommended citation that accompanies each article. Derivative works are not permitted.
Authors are responsible for obtaining the necessary permission to use copyrighted images.
Assignment of intellectual property rights
The author non-exclusively transfers the rights to use (reproduce, distribute, publicly broadcast or transform) and market the work, in full or in part, to the journal’s editors in all present and future formats and modalities, in all languages, for the lifetime of the work and worldwide.
The author must declare that they are the original author of the work. The editors shall therefore not be held liable for any obligation or legal action arising from the submitted work in relation to the violation of third parties’ rights, whether intellectual property, trade secrets or any other right.
Sandra Pérez Domínguez, Miguel Hernández University of Elche
Graduate in Psychology with a master’s degree in Criminological and Victimological Intervention from the Miguel Hernández University of Elche (UMH). Predoctoral researcher at the UMH in collaboration with the company Plus Ethics. Lecturer on Intrafamily and Gender Violence in the UMH degree in Public and Private Safety. Researcher at the Crimina Center for the Study and Prevention of Crime and in the TEMEC research group (Technology, Mind and Social and Deviant Behaviour), both at the Miguel Hernández University of Elche. Member of the events organization team of the Spanish Network of Young Researchers in Criminology (REJIC).
Pere Simón Castellano, Universidad Internacional de la Rioja (UNIR)
Full professor of Constitutional Law at the Universidad Internacional de la Rioja (UNIR). Partner at the law firm Font Advocats, where he specializes in Digital Law and directs the Criminal Compliance area. Lecturer in undergraduate and graduate courses at various universities (UNIR, University of Girona, UOC, Nuclio Digital School, University of Salamanca, UNIANDES). Secretary of the Board of the ICT Law Section of the Barcelona Bar Association. Recipient of awards from the Spanish Data Protection Agency (2011) and the Basque Data Protection Agency (2015). Member of UNIR’s PENALCRIM research group.