Robot Law

Edited by Ryan Calo, A. Michael Froomkin and Ian Kerr

Robot Law brings together exemplary research on robotics law and policy – an area of scholarly inquiry responding to transformative technology. Expert scholars from law, engineering, computer science and philosophy provide original contributions on topics such as liability, warfare, domestic law enforcement, personhood, and other cutting-edge issues in robotics and artificial intelligence. Together the chapters form a field-defining look at an area of law that will only grow in importance.

Chapter 4: Lawyers and engineers should speak the same robot language

Bryant Walker Smith

Subjects: innovation and technology, technology and ICT, law – academic, internet and technology law, law and society, legal philosophy, legal theory, public international law, terrorism and security law, politics and public policy


Lawyers and engineers can, and should, speak to each other in the same language. Both law and engineering are concerned with the actual use of the products they create or regulate; they employ similar concepts and terms and have interconnecting roles. Yet confusion and inconsistency can leave a regulator’s system boundaries wholly incongruous with a developer’s system. This chapter emphasizes the importance of four concepts – systems, language, use, and users – to the development, regulation, and safety of robots. To guide the discussion, the author uses motor vehicle automation as an example and references a number of technical documents. The author finds that defining a system’s boundaries is a key conceptual challenge. Inconsistency in the use of language – particularly of the terms control, risk, safety, reasonableness, efficiency, and responsibility – leads to unnecessary confusion. Furthermore, there is no uniform understanding of “safety” from a technical, much less a legal, perspective. The author discusses how several of these concepts and terms are susceptible to numerous meanings, and suggests more effective uses of them. Developers and regulators have interconnecting roles in ensuring the safety of robots and must thoughtfully coordinate the technical and legal domains without conflating them. Additionally, humans should be understood as part of the systems themselves, as they remain central to the design and use of automated systems. The systems analysis introduced in this chapter reveals the conceptual, linguistic, and practical difficulties that developers and regulators will confront on the path of increasing automation. Sensibly defining automated systems requires a thoughtful dialogue between the legal and technical domains in the same robot language.
