John McDermid has been Professor of Software Engineering at the University of York since 1987 and twice served as Head of Department. He founded the High Integrity Systems Engineering (HISE) research group, and has researched, taught and consulted on the safety of complex computer-based systems internationally over the last 30 years. He originated the Goal Structuring Notation (GSN), now used widely for the structuring of safety cases, and has been involved in a number of projects adapting classical safety methods for computer systems and software. He also helped develop a number of safety standards, including CAP 670 SW01 used in air traffic management, and Def Stan 00-56 and 00-55 used in defence.
Professor McDermid’s research interests include the safety of robotics and autonomous systems and he has worked in a number of domains, particularly in aerospace. He continues to work on enhancing software engineering and safety practices in civil aerospace, for example through the SECT-AIR and ENCASE projects funded through the Aerospace Technology Institute (ATI). He is also involved in a range of projects in the maritime domain, including work on submarine systems. He has a long-standing collaboration on railway signalling with Beijing Jiaotong University, and collaborates with a number of other research, industrial and government organisations world-wide.
He was Head of the Department of Computer Science at the University of York from 2006 to 2012 and was elected a Fellow of the Royal Academy of Engineering in 2002. He was awarded an OBE in 2010.
Visit John McDermid's profile on the York Research Database to see publications, projects, collaborators, related work and more.
Dr Rob Alexander is a Lecturer in the High Integrity Systems Engineering (HISE) group in the Department of Computer Science at the University of York. His main research focus is automated testing, with a particular interest in the safety and security validation of autonomous robots. He is currently supervising research projects on mutation testing in model-driven engineering, justifying the dependability of safety decision support systems, and automated robot testing using simulation and metaheuristic search. He has published over 50 papers on these and related topics and is currently chairing the Safety Critical Systems Club’s working group on the safety of autonomous systems.
Dr Radu Calinescu is a Reader in Large-Scale Complex IT Systems at the University of York. He holds a DPhil in Computation from the University of Oxford, and was awarded a British Computer Society Distinguished Dissertation Award for his doctoral research. His research team is undertaking leading research on the use of mathematical modelling and analysis to improve the safety and performance of self-adaptive systems and processes. He has published over 120 research papers, has been a PI on projects funded by £1.75M of grants from Dstl, the Lloyd’s Register Foundation, EPSRC and the Wellcome Trust, and has led industrial R&D projects as the Senior Development Manager and Technical Architect of an Oxford University spin-off software company. He has chaired or served on the programme committees of over 80 international conferences and workshops, and is a member of the editorial board of Springer’s Computing journal, for the area of autonomic, adaptive and dependable computing.
Ibrahim Habli is a Senior Lecturer in the Department of Computer Science. His research interests are in the design and assurance of safety-critical systems, with a particular focus on intelligent systems (e.g. autonomous and connected driving) and Digital Health (e.g. ePrescribing and self-management apps). In 2015, he was awarded a Royal Academy of Engineering Industrial Fellowship through which he collaborated with the NHS on evidence-based means for assuring the safety of digital health systems.
Ibrahim’s work is highly interdisciplinary, with active collaborative links with clinicians, health scientists, economists and ethicists. His research is empirical and industry-informed, with collaborative projects with organisations such as Rolls-Royce, NASA, Jaguar Land Rover and NHS Digital. Ibrahim has led/co-led research programmes funded by the EPSRC, the Royal Academy of Engineering, the EU and industry. He has been a member of several international and national safety standardisation committees (e.g. DO-178C, MISRA and BSI).
Ibrahim mainly teaches on York’s postgraduate programmes in Safety-Critical Systems Engineering and supervises several PhD students (including practising engineers).
Richard Hawkins is a Senior Research Fellow in the Assuring Autonomy International Programme, where he is investigating the assurance and regulation of robotic and autonomous systems.
Prior to joining the Programme he was a Lecturer in the High Integrity Systems Engineering (HISE) research group at the University of York where his research focused on the safety and assurance of software systems and the development of assurance cases. He previously worked as a Software Safety Engineer for BAE Systems and as a safety adviser in the nuclear industry.
Mark is a dedicated educator with over 20 years of experience developing and delivering industrially oriented teaching, both on award-bearing programmes (PGCert, PGDip, MSc) and through non-credit-bearing Continuing Professional Development for staff involved in system safety. He is the educational lead for the Assuring Autonomy International Programme and a member of the University of York's Teaching Committee.
He leads the IET/BCS-accredited Safety-Critical Systems programmes at York. His dedication to education reaches beyond his own practice: he develops the teaching and academic skills of colleagues at the University of York through involvement with professional development programmes, as both a supervisor and chair of a Board of Studies.
The education he provides is highly relevant to industry, drawing on research and consultancy activities as well as best industrial practice. He was the European editor of ARP 4754A, the civil aerospace recommended practice for safety. He is currently involved in creating competency guidelines for those analysing and assessing the impact of digital security issues in safety-critical systems. Because data is a core enabling technology for autonomy, he is also researching data safety issues.
Youcef received his PhD in computer engineering from the University of Annaba, Algeria, in 2016. He is currently a Research Associate in the Assuring Autonomy International Programme and the Robotics for Inspection and Maintenance (RIMA) project.
His research interests include optimisation, machine learning, probabilistic risk and safety analysis and autonomous system safety. He has published over 20 papers on these and related topics. In 2016 Youcef received The National Innovation Award from the Algerian Government.
Calum Imrie is a Research Associate in the Assuring Autonomy International Programme, with a particular research focus on the safety and continuous assurance of robotics and autonomous systems. This is part of the wider aim to bridge the gap between academia and real-world applications.
Calum completed his PhD at the University of Edinburgh, within the Edinburgh Centre for Robotics. His thesis, titled Robustness of Interaction Control in Robot Swarms, focused on swarm robotics. During his studies, Calum also investigated low-level controllers, emotional robotics, reinforcement learning, and information-theoretic approaches to robot control.
Nikita Johnson is a Research Associate in the Assuring Autonomy International Programme.
Dr John Molloy joined the AAIP as a Research Fellow in 2020. His work includes the assurance of understanding and perception suites in autonomous systems, and the effects of environmental factors on sensor performance.
Previously, he spent ten years in the Electromagnetic Technologies Group at the National Physical Laboratory (NPL), where he was technical lead for autonomous systems, and future communications. He delivered several projects on perception in vehicular and marine autonomous systems including two reports on behalf of CCAV. He provided technical consultancy for external customers such as Lloyd’s Register, Osram GmbH and the USAF.
From 2015 to 2020 he was a visiting research scientist at the Hyper Terahertz Facility, University of Surrey, providing technical expertise on the development of high-power laser and electro-optic systems for remote sensing and quantum applications.
John received his EngD in Applied Photonics from Heriot-Watt University, Scotland (2016) for work on nonlinear optics and the development of solid-state THz Optical Parametric Oscillators (OPOs). He received his MSc in Photonics & Optoelectronic Devices from St Andrews & Heriot-Watt, Scotland (2010) and his BSc (Hons) in Physics & Astronomy from NUI Galway, Ireland (2006).
Previously in industry, he worked as a laser engineer, manufacturing diode-pumped solid-state lasers, and as an electronic engineer, developing high-speed high-capacity storage solutions for digital media.
Matt is a Research Fellow in the Assuring Autonomy International Programme, where he is currently researching software safety assurance in AI, and the safety of decision-making in autonomy. Prior to joining the AAIP, Matt worked for many years as an independent consultant in assuring the safety of software in complex, socio-technical systems.
Matt’s PhD research aims to identify, characterise, and eradicate the impediments to the adoption of recognised good practice in software safety assurance.
Mike is a Research Fellow on the AAIP, looking at autonomy in complex environments. He is also the Safety-Critical Systems Club (SCSC) Director and Events Coordinator and leads the SCSC Data Safety Initiative and Service Assurance Working Groups. He has been in industry for many years; his first safety project was a launch tracking system for the Ariane space launcher.
Chiara is a Research Associate in the Assuring Autonomy International Programme. Chiara’s research interests include machine learning and the development of dynamic safety cases. In particular she is currently working on autonomous system safety.
She previously completed a PhD in the Department of Electronics at the University of York, where she investigated the efficiency of evolutionary algorithms in monitoring and diagnosing two neurological disorders. She has a Master's degree in Artificial Intelligence and Robotics and a Bachelor's degree in Computer Engineering.
Zoe is a Research Associate in the Assuring Autonomy International Programme where her research concerns the ethics, regulation, and governance of autonomous systems. She is also a postgraduate researcher in the Department of Philosophy, interrogating the question of the allocation of moral responsibility for unintended harms caused by artificial agents.
Zoe was previously Chief Speechwriter at the Equality and Human Rights Commission, and she has also worked for members of the UK and European Parliaments. She holds an MA in Philosophy from the University of London, and a BA from the University of Oxford.
Programme management and administration
Sarah is Partnership and Communications Manager for AAIP. In her current role, Sarah leads work to identify suitable global partners for the AAIP, developing mutually beneficial collaborations to accelerate the safe adoption of robotics and autonomous systems. She leads the communications for the AAIP and works closely with the Programme team and external stakeholders to identify key messages and information to support the Programme's aims to generate international impact. Sarah's background is in the charity and educational sectors, where she has led the development of major partnerships and collaborations including securing the support of Boots, Asda and the Football Association for the No Smoking Day campaign. Her work has spanned partnerships, stakeholder communications and engagement, marketing, and fundraising.
Chrysle took up her role as Programme Administrator in December 2017. Previously, Chrysle was the Head of Department’s PA in Computer Science.
John is a part-time PhD research student. His research is within the area of the safety assurance challenges of machine learning systems, with a focus on the cost and reward mechanisms associated with reinforcement learning. John is a recent graduate of our MSc in Safety-Critical Systems Engineering and also holds an MEng (Hons) in Systems Engineering from Loughborough University. John is a chartered engineer and works full-time in industry as a product safety specialist and technical expert in software safety. He is also a member of several safety-related bodies including MISRA C++, and the SCSC’s Data Safety Initiative and Safety of Autonomous Systems working groups.
Naif is a research student in the Assuring Autonomy International Programme. His research is focussed on building a software performance model for cloud-based big data applications. Naif has a Bachelor's degree in Information Systems from King Khalid University, Saudi Arabia, and a Master's degree in Computer Information Systems from St. Mary’s University, San Antonio, USA. He has worked as an assistant systems analyst at Umm Al Qura University, Saudi Arabia, as a teaching assistant in the College of Arts and Sciences at King Khalid University, and as a lecturer at King Khalid University, where he was also head of the Information Systems department.
Helen is a chartered engineer with over 20 years’ industrial experience, working in a product development environment. Helen’s career in system safety started with British Rail Research. She then worked at Jaguar Land Rover for 15 years as a powertrain system safety technical specialist. In 2011 Helen joined Protean Electric, and in 2016 joined HORIBA MIRA as a chief engineer for functional safety. Helen has published a number of automotive functional safety related papers and promotes industry best practice through her work with MISRA, ISO and SAE groups. Helen is currently a member of the UK ISO 26262 functional safety for road vehicles group, a UK representative on the ISO 21448 Safety of the Intended Functionality (SOTIF) working group, and a MISRA Safety Case guidelines author. Helen is interested in the impact that highly automated driver assistance systems might have on the way industry develops and assures the safety of such vehicle features. As a doctoral research student at York, Helen is researching controllability considerations for highly automated and connected vehicle safety.
Josh is a PhD Research Student on the Assuring Autonomy International Programme. His research is focussed on safety within multi-agent reinforcement learning primarily for robotic teams. Prior to joining the University of York Josh obtained an MSc in Advanced Computer Science from the University of Birmingham, focusing his work on intelligent mobile robotics and multi-robot systems in both theoretical and practical settings. Much of this work was built upon Josh's foundations within game theory and reinforcement learning, which was gained through a BSc in Computer Science (Games Development) at the University of Wolverhampton.
Andrew Shepherd is a research student in the Assuring Autonomy International Programme. His research investigates the synthesis and verification of dynamic assurance cases for self-adaptive systems. Having completed an EngD in aerospace engineering with Cranfield University and Airbus, Andrew worked in industry as a systems engineer across land, air, maritime and information system domains. Recently, Andrew joined the civil service, where he continues his systems engineering work providing support to large engineering programmes. Andrew holds a Bachelor's degree in computer science from York as well as Master's degrees in flight dynamics and business administration from Cranfield.
Premathas (Prem) Somasekaram is a research student in the Assuring Autonomy International Programme and also a member of the enterprise systems research group at the University of York. His research focusses on introducing a probabilistic reasoning approach to improve the effectiveness of High-Availability Clusters (HACs) and to promote autonomous decision-making in HACs. Prem holds Bachelor's degrees in computer science from Uppsala University, Sweden and computer engineering from Linnaeus University, Sweden, as well as Master's degrees in information security from Luleå University of Technology, Sweden and computer science from Uppsala University, Sweden.
Ioannis Stefanakos is a research student in the Assuring Autonomy International Programme and also a member of the enterprise systems research group at the University of York. His interests include software engineering and formal verification. In particular, he is working on the verification of functional and non-functional properties of software systems. Having completed a Bachelor's degree in Information Technology, Ioannis started a PhD at the University of York. His project focuses on devising theory and tools that combine the formal verification of functional properties of software systems with the probabilistic model checking of their performance and dependability properties.
For 12 years, Bernard was the Electrical Superintendent for a Norwegian shipping company. He left and attended Loughborough University of Technology, where he graduated with a degree in electro-mechanical power engineering. In 1993 he joined Lloyd’s Register, and in 2007 was invited to become the Global Head of Electro-technical Systems within the Technology Directorate. In 2000 Bernard was appointed as an Advisor to the UK Naval Authority for Power and Propulsion. Since 2017 Bernard has been responsible for the Regulatory Development activities within marine at Rolls-Royce Marine, focussing on the Maritime Autonomous Infrastructure. During his career Bernard has presented and published around 50 technical papers and articles on many aspects of marine electro-technology systems, and is an active member of a number of research and special interest groups. He sits on the technical committees of two leading classification societies and two international committees, and is a member of ISO, BSI and CIMAC. In 2018 Bernard took up an offer of a part-time PhD research post in the Department of Computer Science at the University of York.
Saud has been a PhD student at the University of York since October 2016. He is a member of the enterprise systems group under the supervision of Dr Radu Calinescu and Dr Javier Camara. He holds an MSc degree in Computer Security from the School of Computer Science, University of Birmingham. His main research focus is the application of formal techniques to ensure performance properties of distributed self-adaptive systems.
Marie Curie Fellows
Fang Yan is an early-stage researcher in the Marie Skłodowska-Curie Innovative Training Network “SAS – Safer Autonomous Systems”. Her research interests are in model-based assurance case processes for robotic and autonomous systems.
She is also enrolled as a PhD student at the University of York, working with the AAIP group and the RoboStar group on technical solutions for assurance case generation and verification.
Fang holds a Master's degree in aviation maintenance engineering from ENSICA, France. She then worked for several years in the aviation software certification field in China, providing certification services, industry advisory, teaching, and training.