How can we develop autonomous vehicles that can explain the decisions they take?

Understanding the decisions taken by an autonomous machine is key to building public trust in robotics and autonomous systems (RAS). This project will design, develop, and demonstrate fundamental AI technologies in real-world applications to address this issue of explainability.

The aim of the project is to build robots, or autonomous vehicles, that can:

  • sense and fully understand their environment
  • assess their own capabilities
  • provide causal explanations for their own decisions

In on-road and off-road driving scenarios, the project team will study the requirements of explanations for key stakeholders (users, system developers, regulators). These requirements will inform the development of the algorithms that will generate the causal explanations.

The work will focus on scenarios in which the performance of traditional sensors (e.g. cameras) significantly degrades or fails completely (e.g. in harsh weather conditions). The project will develop methods that can assess the performance of perception systems and adapt to environmental changes by switching to another sensor model or a different sensor modality. For the latter, alternative sensing devices (including radar and acoustic sensors) will be investigated, as these can provide robust perception in situations where traditional sensors fail.
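The assess-and-adapt idea above can be illustrated with a minimal sketch. All names, thresholds, and the priority ordering here are hypothetical, not the project's actual implementation: each sensor pipeline reports a self-assessed confidence, and the system falls back to an alternative modality when the preferred one degrades.

```python
from dataclasses import dataclass

@dataclass
class SensorReading:
    modality: str        # e.g. "camera", "radar", "acoustic"
    confidence: float    # self-assessed perception quality in [0, 1]

def select_modality(readings, threshold=0.6,
                    priority=("camera", "radar", "acoustic")):
    """Pick the highest-priority modality whose self-assessed confidence
    clears the threshold; fall back down the list as traditional sensors
    degrade (e.g. cameras in harsh weather)."""
    by_modality = {r.modality: r for r in readings}
    for modality in priority:
        reading = by_modality.get(modality)
        if reading is not None and reading.confidence >= threshold:
            return modality
    # No modality is trusted: report the least-bad option so the
    # planner can slow down or hand back control.
    return max(readings, key=lambda r: r.confidence).modality

# Clear weather: the camera is trusted and preferred.
clear = [SensorReading("camera", 0.9), SensorReading("radar", 0.8)]
# Dense fog: camera confidence collapses, radar takes over.
fog = [SensorReading("camera", 0.1), SensorReading("radar", 0.8)]
```

In this sketch the switching logic is deliberately simple; the project's actual performance-assessment methods are learned from data rather than fixed thresholds.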

Project progress

The team has been extending their methods for interpreting and representing observations of the environment in human-understandable terms, which will allow end-users to comprehend algorithmic decisions. To this end, they have improved the way that traditional sensors (e.g. cameras, lasers) are used in complex and rare traffic situations. However, due to limitations of these sensors under certain environmental conditions (e.g. bad weather, poor illumination), the team has also investigated non-traditional sensors (e.g. radars, microphones) for understanding the drivable surface and for assessing the ability to measure the vehicle's ego-motion.

Rather than relying solely on one type of sensor, the team takes a more nuanced approach. To this end, they have developed a system that facilitates introspection (knowing when we don't know) using radar. They have recently released a radar dataset advocating the increased exploitation of these unusual sensors, and will apply the lessons learned here in planning the collection and release of a SAX dataset capturing complex, challenging driving scenarios outside of already well-investigated urban environments.

The team undertaking off-road trials at Blenheim in December 2019 with a specially adapted ORI Range Rover, equipped with vision and lidar sensors to gather data for testing localisation and perception algorithms for autonomy in challenging environments.

From left to right: Matt Towlson (Hardware Engineer, Principal Research Technician), Matthew Gadd (Postdoctoral Research Assistant, Researcher Co-Investigator), Lars Kunze (Departmental Lecturer, Co-Investigator), Daniele De Martini (Postdoctoral Research Assistant, Researcher Co-Investigator), and Oliver Bartlett (Trials Manager, Trials and Communications Organiser).
 
Earlier research focused on extremely robust localisation with unusual sensors - using only radar, or a combination of radar and satellite imagery - and on introducing audio as a source of information to use in conjunction with other sensors. The team saw promising initial results using radar to obtain a rich understanding of the objects and agents in the scene, towards safer operation of autonomous vehicles. Their recently released urban autonomy dataset is the largest radar-focused resource available to the community and has been gaining traction, having recently been accepted for formal publication and presentation at an international venue. Beyond this traditionally explored urban setting, the team has been busy planning and capturing some of the initial records for the more unusual off-road dataset.
 
The team recommenced trials after the initial UK-wide COVID-19 lockdown and will soon expand their off-road trial schedule to an alternative site in England.
 
They have begun diversifying their planned dataset release with urban, suburban, and motorway driving conditions, and have now formalised the sensing suite to be published as part of the release. Aside from fieldwork, the team has consolidated a two-year line of research in radar localisation and published an open-access journal paper on this work. They are now moving on from alternative sensing and developing introspective techniques for navigation and scene understanding, having recently developed a performance assessment system and novelty detection for camera segmentation.
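Novelty detection for segmentation can take many forms; one common and simple proxy, shown below purely as an illustrative sketch (not the team's actual method), is per-pixel predictive entropy: pixels where the network's class distribution is close to uniform are flagged as "the model does not know".

```python
import numpy as np

def novelty_mask(class_probs, entropy_threshold=0.5):
    """Flag pixels whose normalised predictive entropy is high,
    a simple proxy for segmentation uncertainty.

    class_probs: (H, W, C) softmax output of a segmentation network.
    Returns an (H, W) boolean mask of 'novel'/uncertain pixels."""
    eps = 1e-9
    # Per-pixel Shannon entropy over the class dimension.
    entropy = -np.sum(class_probs * np.log(class_probs + eps), axis=-1)
    # Normalise by the maximum possible entropy (uniform over C classes),
    # so the threshold is independent of the number of classes.
    entropy /= np.log(class_probs.shape[-1])
    return entropy > entropy_threshold

# One confident pixel and one maximally uncertain pixel (3 classes).
probs = np.array([[[0.99, 0.005, 0.005],
                   [1 / 3, 1 / 3, 1 / 3]]])
mask = novelty_mask(probs)
```

A downstream planner could treat flagged regions as unknown terrain and defer to an alternative sensor or a more conservative behaviour.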
 
Most recently the team has continued its data collection for their mixed off-road, urban, and motorway dataset. They are continually paying attention to the ultimate value of the dataset to the broader community and making sure that the planned annotations and ground truth are part of their trial schedule. Specifically, they are finalising a precise motion-estimation ground truth for off-road driving, looking into annotating their motorway dataset with vehicle locations, and recording a series of voiceovers with a professional driving instructor in tricky urban driving scenarios. In terms of scientific contributions, the team continues to investigate the effectiveness of their research radar for enabling robust vehicle autonomy, recently demonstrating its capability to propose sensible routes for an autonomous vehicle to follow.
Presentations and papers

"Sense-Assess-eXplain (SAX): Building Trust in Autonomous Vehicles in Challenging Real-World Driving Scenarios", in Proceedings of the IEEE Intelligent Vehicles Symposium (IV), Workshop on Ensuring and Validating Safety for Automated Vehicles (EVSAV), 2020

Project team