Smart Technologies and the End(s) of Law

Novel Entanglements of Law and Technology

Mireille Hildebrandt

This timely book tells the story of the smart technologies that are reconstructing our world, probing their most salient functionality: the prediction and preemption of our day-to-day activities, preferences, health and credit risks, criminal intent and spending capacity. Mireille Hildebrandt claims that we are in transit between an information society and a data-driven society, which has far-reaching consequences for the world we depend on. She highlights how the pervasive employment of machine-learning technologies that inform so-called ‘data-driven agency’ threatens privacy, identity, autonomy, non-discrimination, due process and the presumption of innocence. The author argues that smart technologies undermine, reconfigure and overrule the ends of the law in a constitutional democracy, jeopardizing law as an instrument of justice, legal certainty and the public good. Finally, the book calls on lawyers, computer scientists and civil society not to reject smart technologies, explaining how further engaging with these technologies may help to reinvent the effective protection of the rule of law.

Chapter 5: Threats to fundamental rights in the onlife world

Subjects: law - academic, internet and technology law, legal philosophy

Extract

Before calculating a risk, you need to be aware of what threat you are facing. Moving to risk too soon and too fast has several drawbacks. The first is that you may take the threats for granted and start translating whatever is perceived as a threat into discrete data points, simply because doing so allows for sophisticated number crunching. This could lure you into skipping the stage of qualification and conceptualization that enables a reliable translation of perceived or expected threats into the objects and attributes of your data model. Qualification always precedes quantification, whether or not one pays explicit attention to it. To calculate the monetary value that people attach to their privacy, it does not suffice to offer them money for what you consider their privacy, for instance on the basis of their willingness to share location data. Qualifying the sharing of location data in the context of a scientific experiment as privacy is a quick and easy way to construct data models, but I dare say it has little to do with what most people understand as giving up privacy in real life. You might actually end up constructing quantifiable solutions that resolve numerically defined problems within the dataset, while these solutions bear no relation to the problems we need to resolve in the real world (whatever that is). The second drawback is that you may miss non-obvious or invisible threats because they are not – yet – computable or remain – as yet – intractable.
