Predicting the past: a philosophical critique of predictive analytics

Daniel Innerarity

If we address this topic from a conceptual and critical point of view, we need to examine three issues: 1) why predictions are too often right, 2) why, at the same time, they are so often mistaken, and 3) what consequences follow from the fact that our instruments of prediction ignore at least four realities that any forecast of the future must take into account, or whose limits it must at least acknowledge: a) that individuals cannot be fully subsumed into categories, b) that their future behaviour tends to have unpredictable dimensions, c) that propensity is not the same as causality, and d) that democratic societies must make the desire to anticipate the future compatible with respect for the open nature of the future.

Keywords:

predictive analytics, artificial intelligence, algorithms, democracy, future, freedom

Article Details

How to Cite
Innerarity, Daniel. “Predicting the past: a philosophical critique of predictive analytics”. IDP. Internet, Law and Politics E-Journal, no. 39, pp. 1-12. DOI: https://doi.org/10.7238/idp.v0i39.409672
Author Biography

Daniel Innerarity, University of the Basque Country / Euskal Herriko Unibertsitatea

Professor of political philosophy, Ikerbasque researcher at the University of the Basque Country and Chair of Artificial Intelligence and Democracy at the European University Institute. Former fellow of the Alexander von Humboldt Foundation at the University of Munich, visiting professor at the University of Paris 1-Sorbonne, and visiting fellow at the London School of Economics and at Georgetown University. His recent books in English include Ethics of Hospitality (2017), Democracy in Europe (2018), Politics in the Times of Indignation (2019) and A Theory of Complex Democracy (2023). He has received several awards, most recently the National Research Prize in the Humanities.

References

ABEBE, R.; KASY, M. (2021). “The means of prediction”. In: Acemoglu, D. (ed.). Redesigning AI. Work, democracy, and justice in the age of automation, pp. 87-91. Cambridge: Boston Review.

ACCOTO, C. (2019). Il mondo ex machina. Cinque brevi lezioni di filosofia dell’automazione. Milan: Egea.

ADAMS, V.; MURPHY, M.; CLARKE, A. (2009). “Anticipation: Technoscience, life, affect, temporality”. Subjectivity, vol. 28, no. 1, pp. 246-265. DOI: https://doi.org/10.1057/sub.2009.18

AGRAWAL, A.; GANS, J.; GOLDFARB, A. (2018). Prediction Machines. The Simple Economics of Artificial Intelligence. Boston: Harvard Business Review Press.

AMOORE, L.; PIOTUKH, V. (2015). “Life beyond big data: governing with little analytics”. Economy and Society, vol. 44, no. 3, pp. 314-366. DOI: https://doi.org/10.1080/03085147.2015.1043793

ANDREJEVIC, M. (2013). Infoglut: How Too Much Information Is Changing the Way We Think and Know. New York: Routledge. DOI: https://doi.org/10.4324/9780203075319

ANGWIN, J.; LARSON, J. (2016, December). “Bias in Criminal Risk Scores is Mathematically Inevitable, Researchers Say”. ProPublica [online]. Available at: https://www.propublica.org/article/bias-in-criminal-risk-scores-is-mathematical-inevitable-researches-say

ARENDT, H. (2017). Mensch und Politik. Stuttgart: Reclam.

BOELLSTORFF, T. (2013). “Making big data, in theory”. First Monday, vol. 18, no. 10. DOI: https://doi.org/10.5210/fm.v18i10.4869

BRAMAN, S. (2009). Change of State: Information, Policy and Power. Cambridge: The MIT Press.

BRAYNE, S. (2020). Predict and Surveil: Data, Discretion, and the Future of Policing. Oxford University Press. DOI: https://doi.org/10.1093/oso/9780190684099.001.0001

BROUSSARD, M. (2018). Artificial Unintelligence: How Computers Misunderstand the World. Cambridge: The MIT Press. DOI: https://doi.org/10.7551/mitpress/11022.001.0001

DERRIDA, J. (1994). “Nietzsche and the Machine”. Journal of Nietzsche Studies, no. 7, pp. 7-65.

ESPOSITO, E. (2011). The Future of Futures: The Time of Money in Financing and Society. Edward Elgar. DOI: https://doi.org/10.4337/9781849809115

ESPOSITO, E. (2021). Artificial Communication: How Algorithms Produce Social Intelligence. Cambridge: The MIT Press. DOI: https://doi.org/10.7551/mitpress/14189.001.0001

EUROPEAN COMMISSION (EC) (2015). Evidence-Based Better Regulation. European Commission [online]. Available at: https://commission.europa.eu/law/law-making-process/planning-and-proposing-law/better-regulation/better-regulation-guidelines-and-toolbox_en

FEDERAL TRADE COMMISSION (2016, January). Big Data. A Tool for Inclusion or Exclusion? Understanding the Issues. United States of America: Federal Trade Commission [online]. Available at: https://www.ftc.gov/system/files/documents/reports/big-data-tool-inclusion-or-exclusion-understanding-issues/160106big-data-rpt.pdf

FOERSTER, H. Von (2003). Understanding Understanding: Essays on Cybernetics and Cognition. New York: Springer. DOI: https://doi.org/10.1007/b97451

HILDEBRANDT, M. (2006). “Privacy and identity”. In: Claes, E., Duff, A., Gutwirth, S. (eds.). Privacy and the Criminal Law, pp. 43-57. Antwerpen/Oxford: Intersentia. DOI: https://doi.org/10.1007/s11572-006-9006-x

KAPOOR, S.; NARAYANAN, A. (2023). “Leakage and the Reproducibility Crisis in ML-based Science”. Patterns, vol. 4, no. 9. DOI: https://doi.org/10.1016/j.patter.2023.100804

MACKENZIE, A. (2015). “The production of prediction: What does machine learning want?”. European Journal of Cultural Studies, vol. 18, no. 4-5, pp. 429-445. DOI: https://doi.org/10.1177/1367549415577384

MASSUMI, B. (2007). “Potential politics and the primacy of preemption”. Theory & Event, vol. 10, no. 2. DOI: https://doi.org/10.1353/tae.2007.0066

MATZNER, T. (2018). “Grasping the ethics and politics of algorithms”. In: Sætnan, A. R., Schneider, I., Green, N. (eds.). The Politics of Big Data. Big Data, Big Brother, pp. 30-45. Oxford, New York: Routledge.

MAYER-SCHOENBERGER, V.; CUKIER, K. (2013). Big Data. A Revolution That Will Transform How We Live, Work, and Think. New York: Houghton Mifflin Harcourt.

MERTON, R. (1948). “The self-fulfilling prophecy”. The Antioch Review, vol. 8, no. 2, pp. 193-210. DOI: https://doi.org/10.2307/4609267

NOWOTNY, H. (2021). In AI We Trust. Power, Illusion and Control of Predictive Algorithms. Cambridge: Polity Press.

PORTER, T. M. (1995). Trust in Numbers. The Pursuit of Objectivity in Science and Public Life. Princeton University Press. DOI: https://doi.org/10.1515/9780691210544

SCHNEIDER, I. (2018). “Bringing the state back in. Big Data-based capitalism, disruption, and novel regulatory approaches in Europe”. In: Sætnan, A. R., Schneider, I., Green, N. (eds.). The Politics of Big Data. Big Data, Big Brother, pp. 129-175. Oxford, New York: Routledge.

STRAUSS, S. (2015). “Datafication and the Seductive Power of Uncertainty – A Critical Exploration of Big Data”. Information, vol. 6, no. 4, pp. 836-847. DOI: https://doi.org/10.3390/info6040836

TYLER, I. (2015). “Classificatory Struggles: Class, Culture and Inequality in Neoliberal Times”. The Sociological Review, vol. 63, no. 2, pp. 493-511. DOI: https://doi.org/10.1111/1467-954X.12296