4.1.2 Understanding the requirements of rules and regulations

Practical guidance - inspection and maintenance robots

Authors: RIMA project

Guidance on assessing liability and the role of humans in the context of robots for inspection and maintenance

Introduction

The intention of this entry is to provide guidance on the potential liability of humans where robots are used for inspection and maintenance purposes. It provides an overview of the types of issue that are likely to arise, in order to inform the approaches that should be adopted in this setting. The focus is on Europe and the robotic inspection and maintenance domain.

When considering potential liability issues in the context of autonomous robotic systems used for inspection and maintenance, there are a number of key considerations. The first is the body of regulations that applies, bearing in mind that the relevant legal frameworks in such a fast-moving industry are underdeveloped. The second is operator liability, which becomes particularly unclear where an autonomous system has decision-making capacity. The final issue discussed is the impact of any product defects.

Relevant legal framework(s)

The development of robotic technology and its increasing use mean that existing legal frameworks will need to develop in order to facilitate, as well as regulate, that use. This applies both in a general sense and, more specifically, to its application for inspection and maintenance.

There will, of course, be a need to account for the existing bodies of regulation that apply. These will vary depending on the nature of the robotic technology in question, as well as the nature of the environment in which it is to be used.

The location in which the robotic technology will be used is also important. Regulations at EU level will apply directly to any projects in the EU. Beyond this, individual member states have additional regulatory frameworks that may be applicable. There will also be industry-specific regulations (on required safety standards, for example) that have to be accounted for.

Apportioning liability where fault occurs

A key consideration is the potential liability issues that may arise in the event that robotic technology causes damage to property, to humans, or to both. This matters because, like any form of technology, robots can fail, be operated poorly, or be improperly maintained. Where harm results, the situation must be remedied, including the compensation of any victims.

Location

Location (i.e. the country of use) is an important factor here. Both direct and indirect approaches are used to regulate robotic technologies. From a direct perspective, a variety of EU directives may be applicable [1]. These set safety standards, while there also exist liability regimes for victim compensation which could provide guidance for potential legal reform specific to commercial robotics [2].

At the time of writing, a further piece of applicable EU legislation is the proposed ‘AI Act’, put forward by the European Commission. It is currently in the process of becoming an EU regulation and is expected to be a ground-breaking instrument for regulating AI systems according to their level of risk, with the most stringent requirements falling on higher-risk systems [3].

Indirect approaches concerning civil liability are also of particular interest here. These involve the application of laws designed specifically for the particular technology, or of tort law (which concerns damage caused through non-criminal actions, often through negligence), either of which may incorporate or refer to industry standards. Also of note is that individual EU member states have different systems of liability law. This means that the member state in which litigation occurs, and the law(s) which apply, are very much case specific. This could be influenced by numerous factors, including where any damage was caused, where the robot was used or operated from, or where the manufacturer or supplier is based [4].

User negligence

In many cases, liability issues concerning robotic technology will involve some level of user negligence. Users have a duty to exercise reasonable care when operating a robot. However, this is likely to change where autonomous technology is in operation, as the task of apportioning liability then becomes more arduous.

This advancement in robotics technology could mean that the party at fault is hard to determine: liability could fall, for example, upon the manufacturer, the programmer, or the operator.

The impact of ‘product defects’

In a general context, the European General Product Safety Directive is likely to apply to a large amount of robotic technology due to the broad definition of ‘product’ it provides [5]. Similarly, the Product Liability Directive seeks to protect consumers when products cause personal injury, death, or property damage [6]. There are also jurisdiction-specific approaches to product safety and product liability to consider [7]. However, because the use of robotic technology for inspection and maintenance is more likely to arise in commercial applications, these directives will not necessarily apply, given their focus on consumer use and interaction.

References

[1] Machinery Directive 2006/42/EC; Radio Equipment Directive 2014/53/EU; Electromagnetic Compatibility Directive 2014/30/EU; Gheraibia Y, Kao B, Alexander R, Morgan PDJ and Kilvington L (2020). Review of legal frameworks, standards and best practices in verification and assurance for infrastructure inspection robotics, p 9.

[2] Gheraibia Y, Kao B, Alexander R, Morgan PDJ and Kilvington L (2020). Review of legal frameworks, standards and best practices in verification and assurance for infrastructure inspection robotics, p 25.

[3] European Commission, ‘Proposal for a Regulation of the European Parliament and of the Council laying down harmonised rules on artificial intelligence (Artificial Intelligence Act) and amending certain Union legislative acts’ COM/2021/206 final. “Tort law” is “the name given to the branch of law that imposes civil liability for breach of obligations imposed by law. The most common tort is the tort of negligence which imposes an obligation not to breach the duty of care (that is, the duty to behave as a reasonable person would behave in the circumstances) which the law says is owed to those who may foreseeably be injured by any particular conduct.” (Thomson Reuters Practical Law, https://uk.practicallaw.thomsonreuters.com/6-107-7397)

[4] Gheraibia Y, Kao B, Alexander R, Morgan PDJ and Kilvington L (2020). Review of legal frameworks, standards and best practices in verification and assurance for infrastructure inspection robotics, p 25.

[5] General Product Safety Directive 2001/95/EC; Gheraibia Y, Kao B, Alexander R, Morgan PDJ and Kilvington L (2020). Review of legal frameworks, standards and best practices in verification and assurance for infrastructure inspection robotics, p 55.

[6] Product Liability Directive 85/374/EEC; Gheraibia Y, Kao B, Alexander R, Morgan PDJ and Kilvington L (2020). Review of legal frameworks, standards and best practices in verification and assurance for infrastructure inspection robotics, p 57.

[7] Gheraibia Y, Kao B, Alexander R, Morgan PDJ and Kilvington L (2020). Review of legal frameworks, standards and best practices in verification and assurance for infrastructure inspection robotics, p 61.

Contact us

Assuring Autonomy International Programme

assuring-autonomy@york.ac.uk
+44 (0)1904 325345
Institute for Safe Autonomy, University of York, Deramore Lane, York YO10 5GH
