High Integrity Systems Research Group
Our research focuses on safety-critical software development and assurance across multiple domains, including emerging autonomous technologies.
It ranges from fundamental principles through to trial implementation, linking core computer science with disciplines including engineering, law, mathematics, philosophy and psychology.
Contact us
Professor John McDermid
High Integrity Systems Research Group lead
The group is well known for its work on assurance cases, having developed the widely used Goal Structuring Notation (GSN). Its work on assurance of machine learning (ML) includes argument patterns supporting each stage of the assurance process, e.g. training data selection. There is ongoing work with lawyers and philosophers (ethicists) to develop principled ethical assurance arguments and to find ways of representing and reasoning about responsibilities for the decisions and actions of autonomous systems.
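To illustrate the idea (a minimal sketch, not the group's tooling or the full GSN standard): a GSN-style assurance argument can be viewed as a tree in which goals are decomposed by strategies and ultimately supported by solutions (evidence), and a simple check can flag goals that are still undeveloped. The node kinds and the example argument fragment below are illustrative assumptions.

```python
# Minimal sketch of a GSN-style argument tree (assumed structure, for
# illustration only): Goals are decomposed via Strategies and supported
# by Solutions (items of evidence).
from dataclasses import dataclass, field

@dataclass
class Node:
    kind: str                      # "Goal", "Strategy", or "Solution"
    statement: str
    children: list["Node"] = field(default_factory=list)

def undeveloped_goals(node: Node) -> list[str]:
    """Return statements of goals with no supporting argument or evidence."""
    if node.kind == "Goal" and not node.children:
        return [node.statement]
    return [g for child in node.children for g in undeveloped_goals(child)]

# Hypothetical fragment of an ML assurance argument
arg = Node("Goal", "ML component is acceptably safe in its operating context", [
    Node("Strategy", "Argue over each stage of the ML lifecycle", [
        Node("Goal", "Training data are representative of the operating context", [
            Node("Solution", "Data coverage analysis report"),
        ]),
        Node("Goal", "Model performance meets safety requirements"),  # undeveloped
    ]),
])

print(undeveloped_goals(arg))  # ['Model performance meets safety requirements']
```

In a real assurance case each undeveloped goal would need further decomposition or evidence before the argument could be considered complete.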
There are several strands of work focused on the safety assurance and sociotechnical resilience of robotics and autonomous systems (RAS). Such systems move decision-making responsibility from human to machine, so we are developing analysis methods to identify hazards and to derive safety requirements that control those hazards. In parallel, we are developing mathematical methods for constructing decision-making algorithms together with proofs of their soundness. The group is also exploring the use of explanations of ML-based decision-making systems to support validation, and hence assurance, of such algorithms.
The group is involved in several major projects. The six-year Assuring Autonomy International Programme (AAIP), supported by Lloyd’s Register Foundation, focuses on the assurance and regulation of RAS; with its partners, the AAIP develops and evaluates assurance methods in domains as diverse as agriculture, autonomous driving, healthcare, maritime and space. The group leads the Trustworthy Autonomous Systems Node in Resilience, one node of the EPSRC-funded Trustworthy Autonomous Systems (TAS) programme, and is involved in two further TAS-funded projects on improving trust in autonomous systems, including the Assuring Responsibility for Trusted Autonomous Systems project. The group also has projects funded directly by a range of companies in several different sectors and by government bodies, including NHS Digital and Volvo Construction Equipment.
We advance the state of the art and practice in assuring the safety of complex computer-based systems, including autonomous systems such as self-driving cars. This is a critical step in realising the promised benefits of robotics and autonomy while ensuring the health and safety of the public. The group provides the conceptual underpinnings and key methods to design, implement, verify and assure robots and autonomous systems, with a special emphasis on assurance of safety. As these systems increasingly embrace ML and operate with greater degrees of autonomy, ethical challenges arise, and the group’s work is expanding to cover assurance of ethical as well as safe behaviour of ML-based systems.
More specific objectives include:
- providing verification and assurance processes for demonstrating the safety of ML-based systems
- developing methods for analysing robots and autonomous systems to derive safety requirements for the system as a whole, as well as for the sensing and decision-making subsystems
- developing theory and tools for the verification of these safety requirements
- developing methods for the synthesis of correct-by-construction decision-making control systems
- developing principles and guidelines for ethically assured development of resilient robotics and autonomous systems
- understanding how to ensure the safety of the interactions between humans and autonomous systems
- evaluating the efficacy of safety engineering methods.
The group influences industrial practice and regulations by engaging with industry and government across a range of sectors including aerospace, automotive, healthcare and maritime.
The group’s research has influenced national and international safety standards, guidelines and regulations. These include ISO 26262, the international functional safety standard for automotive electrical and electronic systems, which is applied throughout the automotive supply chain. The GSN method, which is used to document and present structured arguments that safety goals have been achieved, has advanced the implementation of safety case practice within regulatory structures and processes.
The group’s ongoing work on regulation and standards for autonomous driving will benefit much of UK society by establishing norms for assuring that the vehicles are safe and behave ethically. This work will also benefit citizens of other countries through influence on international standards. The group’s work in the healthcare sector is already influencing policy and is likely to benefit many hospital patients, for example by assuring and explaining the safety of recommendations from ML-based decision-support systems. The work will also benefit clinicians and is already helping to provide a sound technical basis for regulating such systems. This form of impact will increasingly be seen in other sectors including aviation, maritime and defence. The group also provides specialist research-led teaching to industry and government departments to support the transition of emerging principles and methods from research into practice.
Group members
Contact details | Role
---|---
Professor John McDermid | Academic staff (group lead)
Dr Yan Jia | Academic staff
mark.sujan@york.ac.uk | Academic staff
Nathan Hughes | Research Associate
Dr Calum Imrie | Research Associate
Marten Kaas | Research Associate
Dr John Molloy | Research Fellow
Ioannis Stefanakos | Research Associate
Haris Aftab | Postgraduate Research Student
John Bragg | Postgraduate Research Student
Brendan Devlin-Hill | Postgraduate Research Student
Jane Fenn | Postgraduate Research Student
Hasan Bin Firoz | Postgraduate Research Student
Frank Fischer | Postgraduate Research Student
Josh Hunter | Postgraduate Research Student
Vira Jogia | Postgraduate Research Student
Shakir Laher | Postgraduate Research Student
Dr Ernest Lim | Postgraduate Research Student
Berk Ozturk | Postgraduate Research Student
Tejas Pandey | Postgraduate Research Student
Annabelle Partis | Postgraduate Research Student
Nawshin Mannan Proma | Postgraduate Research Student
Georgia Sowerby | Postgraduate Research Student
Mohammad Tishehzan | Postgraduate Research Student
Danielle Turner | Postgraduate Research Student
Bernard Twomey | Postgraduate Research Student
Stephen Wright | Postgraduate Research Student
Qi Zhang | Postgraduate Research Student