We ask and answer the practical ethical questions raised by the integration of robotics and connected autonomous systems (RCAS) into society, to ensure that any such systems are demonstrably beneficial, fair and trustworthy.
Robotics and connected autonomous systems have the potential to change the way that society works. To achieve this safely, we need to ask ourselves some questions.
What change do we want to see? What say should we have in it? How will it affect our rights as citizens? Will it narrow or widen existing social divisions, or create new ones?
These are broad societal questions of justice, fairness and trust, and it is the role of the Society and Ethics research pillar to answer them. This means harnessing an equally broad range of perspectives, reaching across academic disciplines in York and beyond, and engaging proactively with as representative a cross section of the population as possible.
Only in that way can we truly put the public good at the heart of the design, deployment and governance of these systems.
“For autonomous systems to be safe by design, they also have to be ethical by design. And that’s a conversation in which the ISA is determined to make every voice matter.”
Dr Philip Garnett, Research Lead
Activities and Partnerships
Administrative Fairness Lab
We are analysing the impact of autonomous systems on the actual and perceived fairness of 'frontline' interactions between people and government, compiling robust evidence on how RCAS can reinforce positive civic engagement across society.
We are working with housing associations to explore the potential use of RCAS to monitor and manage environmental health issues within properties, identifying and addressing the technical, ethical and legal issues this raises for government, landlords and tenants.
We are investigating the growing use of autonomous systems within organisations to monitor their activities and identify 'threats' to their performance. What impact will these data-led forms of management have on organisations, individual employees and society?