Nick Hall HSE case study

Nicholas Hall is HM Principal Specialist Inspector (Advanced Automation and Cyber Security) for the Health and Safety Executive.

His work requires him to be at the cutting edge of research in AI and autonomy. After meeting Dr James Law at the University of Sheffield he became the regulatory partner for the CSI:Cobot demonstrator project, which demonstrated how research in the fields of safety engineering, machine learning, and cybersecurity can be applied to human-robot collaborative processes to address safety concerns and improve confidence.

Contact us

Assuring Autonomy International Programme

assuring-autonomy@york.ac.uk
+44 (0)1904 325345
Institute for Safe Autonomy, University of York, Deramore Lane, York YO10 5GH

"It was a natural fit. It hit all my topics, would enable me to work with people I wanted to work with, and it would usefully feed into our work at the HSE on robotics, autonomy, AI and cyber security."

The 'voice of the regulator'

Through weekly project meetings, Nick made suggestions to help make assurance more 'regulator friendly' and educated peers on risks and hazards. He recommended adding features to the digital twin developed by the project team to make it risk-based for real-world application. This involved collaborating with a team of around 15 people leading on sensing, cyber security, digital twinning, and safety synthesis.

Nick was also part of the education and training elements of the project, including serving as a judge at the Manufacturing Robotics Challenge. As a final element of the project, he designed and delivered a workshop that brought together 20 people from the HSE, industry, CSI:Cobot, and university research. The workshop helped participants develop detailed answers to issues they had clearly been thinking a lot about. It gave rise to discussions about what might need to change in standards or legislation and helped generate ideas for further research.

Influencing standards

Nick's learning and experience are likely to be a contributory factor in his thinking and technical work to develop a range of standards with collaborators from around the world.

"As a standards writer you draw on lots of influences, so the AAIP experience may contribute to work I'm involved with around the new industrial robot safety standards, ISO 10218 parts 1 and 2, and the new autonomous mobile robots standard, which I'll be working on from next year. More generally, ISO 12100, risk assessment for machinery, may include my thinking about where AI could be used and what dutyholders need to think about before using AI near a machinery safety system."

Future work

More generally, Nick reports that many parts of the HSE are interested in AI and autonomy, as are other regulators, who could benefit from the sort of collaboration he has enjoyed with the AAIP.

He thinks attitudes are shifting from 'AI + safety = no thanks' to a growing understanding that there is a need to develop along this route. Assurance is critical to that path. Consequently, the HSE is currently developing policy on assurance of autonomous systems that will, to some extent, have been influenced by his mutually beneficial relationship with the AAIP.

"Without the AAIP engagement I would probably have arrived at a similar position in my thinking; however, I think the AAIP has accelerated my development along this path, and it has been achieved in spite of competing demands. The AAIP has scale and a range of different demonstrators across sectors, and cross-learning is really beneficial to regulators, so accessing the AAIP's body of knowledge is valuable too."
