Dr Jennifer Chubb



My work focuses on the role of responsibility in science and the public perception of science and technology.

My background is in Philosophy (University of Leeds) and Social Science (University of York). I apply philosophical theory and applied ethics to sociological problems and consider myself an interdisciplinary researcher.

I completed my PhD in 2017 at the University of York, where I explored the notion of instrumentalism and epistemic responsibility in science and research. After my PhD, I took a research position at the University of Sheffield, focusing on knowledge systems and cultures, responsible metrics and assessment, and expertise in science policy, advice and diplomacy. I re-joined the University of York in 2019 as a postdoctoral research associate at the Digital Creativity Labs, where I began to focus on the social implications of emerging technologies and Artificial Intelligence (AI).

In 2020, I began a fellowship with XR Stories primarily focused on ethical and responsible storytelling. The fellowship, ‘AI, what’s that sound?’, brought together my love of music and my research interests by exploring the ways AI is sonically portrayed in public narratives, specifically in documentaries. Relatedly, I co-supervise a WRoCAH Collaborative Doctoral Award Studentship exploring live music venues as sites of cultural heritage, and I am currently researching public attitudes towards AI in music-making and gig-going. My research methods are predominantly qualitative, with experience of some quantitative research as well as documentary and narrative approaches.

I am Co-Director of the Science and Technology Studies Unit (SATSU), a member of the AsSIST-UK executive board and an editor for the Springer Nature journal Humanities and Social Sciences Communications. I am a committee member of the Wellcome Trust Early-Career Social Sciences Committee and an appointed advisor to the Better Images of AI project, tackling stereotypes in AI imagery. I regularly consult for external organisations, including policy and business. Additionally, I am Lead for Responsible Innovation for the Centre for Doctoral Training in Lifelong Safety Assurance of AI-enabled Autonomous Systems (SAINTS CDT).

I am always happy to talk about all things science and society!



Recent projects

Setting the Legal Tone: Towards a framework for the protection of rights in voice personality

(Co-I), with Peter Harrison (PI, Law) and James Tompkinson (Co-I, Languages and Linguistics). Funded by YorVoice.

AI is everywhere. What’s more, AI is not always what it seems. AI deepfakes enable the misuse of an individual’s personal image, but little is known about whether this ‘personal image’ includes the voice. Despite the attention drawn to rights pertaining to image and voice (see the SAG actors’ strike), their legal and ethical implications remain relatively under-explored. Many of the existing intellectual property tools that have been successful in protecting intangible concepts fail when it comes to protecting the voice, particularly the voice of the ordinary person. With a focus on the creative industries, we seek to understand the characteristics of the voice and how to protect it by examining the legal, linguistic and ethical dimensions of ‘personality of the voice’. In this project we ask: what are the characteristics of personality in voice, and can they be protected? And can a framework concerning the rights and responsibilities of voice personality be developed?

UKRI AI Centre for Doctoral Training in Lifelong Safety Assurance of AI-Enabled Autonomous Systems (SAINTS). Co-I and Responsible Innovation Lead.

SAINTS at the University of York is training 60 talented PhD students with the research expertise and skills needed to ensure that the benefits of AI-enabled Autonomous Systems (AI-AS) are realised without introducing harm as systems and their environments evolve. SAINTS is a single-university Centre for Doctoral Training with an enthusiastic and diverse leadership team, reflecting its ambition for its student cohorts: Prof Ibrahim Habli (Director), Dr Jenn Chubb, Dr Jo Iacovides, Prof Cynthia Iglesias, Dr Ana MacIntosh, Prof John McDermid, Dr Phillip Morgan, Dr Colin Paterson, Dr Zoe Porter, Prof Tom Stoneham, Prof Richard Wilson.

AI, what’s that sound? Public attitudes toward the role of Artificial Intelligence (AI) in music making and live music.

Artificial Intelligence (AI) and algorithms play an increasingly prevalent role in how we consume music and culture. Generative AI is expanding the boundaries of music, AI personalisation tools tailor listening experiences, and this expansion extends to immersive XR experiences of live music. The impacts of these developments have sparked debate and revolt across the creative industries. While there is research into how listeners and musicians perceive artificially created music, and into the effects of AI consumption on societies, musician and gig-goer attitudes towards the disruption and innovation taking place in our music making, performing and gig-going are only peripherally explored.

This research reflects on 75 survey responses from UK gig-goers and musicians about their attitudes towards AI in music. We find a relative acceptance of AI provided its use is made transparent and under certain conditions of co-creation. An overall resistance to AI’s ability to ‘mimic affect’ or reflect (let alone replace) the role of the human in art and music creation and live expression poses questions about human connection, authenticity and relatability in an age of AI music-making. One output of this work was a public exhibition at Streetlife, York.

The Sonic Framing of AI in documentaries - AI What's that Sound?

The potential of sound for storytelling and its impact on public perception is often ignored in favour of other aspects of a story’s production, such as its contextual, technical or visual framing. Yet the soundtrack and sound design of stories allow the storyteller to speak to the audience and can influence their attitudes towards, and understanding of, a narrative object. The aim of this research is to understand in what ways a failure to consider the sonic framing of AI influences or undermines attempts to broaden public understanding of AI. Based on our preliminary impressions, we argue that the sonic framing of AI is just as important as other narrative features. Some of this research is published with We and AI, with a journal article forthcoming in AI & Ethics.


Algorithms, Loss and Grief

This project aims to bring together expertise across the humanities and social sciences interested in Algorithms, Loss and Grief. In particular, I am interested in algorithmic hauntings or triggers affecting the grief process, digital death (and non-death), and the ethics of AI in understanding grief and loss.

A Vice Framework for AI

In this work, we consider whether a myopic focus on the moral positives of AI risks inadvertently creating corrupting techno-social systems, potentially making our future work environments even worse than they are already likely to be. Contemporary theorising about the ethical dimensions of AI indicates at least two problematic tendencies. The first is a default focus on concepts like autonomy, justice, fairness and non-discrimination, explicable in terms of the entrenchment of a liberal morality of the sort central to much contemporary moral and political philosophy. The second is a focus on concepts that track the positive dimensions of moral practice and experience, such as virtue, flourishing, moral progress and the good life (Vallor, 2016). Taken together, the result is large-scale frameworks for the moral appraisal of new technologies that fail to systematically address the negative aspects of moral life (Jobin et al., 2019): one failing is an emphasis on virtues and excellences to the occlusion of vices and failings; another is an affirmation of the edifying potential of new technologies to the relative neglect of their corrupting possibilities (Coeckelbergh, 2020, p. 59). We suggest that a more comprehensive assessment of the moral possibilities of new technologies in the future of work requires a more complex set of concepts and sensibilities, and that there are good empirical and theoretical reasons to consider the vicious and corrupting dimensions of new technologies. If so, exploring the negative moral possibilities of these technologies requires a greater sensitivity to what we will call technomoral vices.

AI in Society Lab – Think slowly and fix things! This network, co-led with ethicist Dr Zoe Porter in the Institute for Safe Autonomy (ISA), coordinates and supports thoughtful reflection on ethical AI development across the University.

Exploring the social implications of AI


The AI in Society (AIS) Lab is a forum at the University of York for academics and researchers working on the societal implications of AI.


AI Futures

In collaboration with the Digital Creativity Labs, this project explored the opportunity spaces (and gaps) noted in living with data and AI. The project yielded several outputs including invited talks and journal articles exploring AI in science and AI missing narratives.

I have published in journals including AI & Society, the International Journal of Child-Computer Interaction, the Journal of Empirical Research on Human Research Ethics, Theory and Research in Education, Studies in Higher Education, Higher Education Research and Development, Higher Education Quarterly and British Politics. I have also conducted a number of pieces of consultancy and written policy reports for the Royal Society, the European Commission, JISC, Research England and the Wellcome Trust.


Full publications list

Journal articles

Chubb, J. (2023). A Vice Framework for AI. Poster accepted at the Philosophy of AI Conference (PhAI 2023), Erlangen, 15-16 December 2023.

Chubb, J. A., & Beer, D. G. (2023). Establishing counterpoints in the sonic framing of AI narratives. AI and Ethics.

Chubb, J., Cowling, P., & Reed, D. (2022). Expert views about missing AI narratives: is there an 'AI story crisis'? AI & Society: Knowledge, Culture and Communication.

Weinstein, N., Chubb, J., & Wilsdon, J. (2022). Supported or Stressed While Being Assessed? How Motivational Climates in UK University Workplaces Promote or Inhibit Researcher Well-Being. Higher Education Quarterly (accepted).

Chubb, J., Cowling, P., & Reed, D. (2021). Speeding up to keep up: exploring the use of AI in the research process. AI & Society, 1-19.

Kidd, I. J., Chubb, J., & Forstenzer, J. (2021). Epistemic corruption and the research impact agenda. Theory and Research in Education, 19(2), 148-167.

Chubb, J., Missaoui, S., Concannon, S., Maloney, L., & Walker, J. A. (2021). Interactive storytelling for children: A case-study of design and development considerations for ethical conversational AI. International Journal of Child-Computer Interaction, 100403.

Samuel, G., Chubb, J., & Derrick, G. (2021). Boundaries Between Research Ethics and Ethical Research Use in Artificial Intelligence Health Research. Journal of Empirical Research on Human Research Ethics, 15562646211002744.

Weinstein, N., Chubb, J. A., Haddock, G., & Wilsdon, J. R. (2021). A conducive environment? The role of need support in the higher education workplace and its effect on academics' experiences of research assessment in the UK. Higher Education Quarterly, 75(1), 146-160.

Chubb, J., & Derrick, G. E. (2020). The impact a-gender: gendered orientations towards research Impact and its evaluation. Palgrave Communications, 6(1), 1-11.

Ingram, C., Chubb, J., Boardman, C., & Ursu, M. (2020). Generating Real-World Impact from Academic Research: Experience Report from a University Impact Hub. IEEE/ACM 42nd International Conference on Software Engineering Workshops (ICSEW'20), May 23-29, 2020, Seoul, Republic of Korea. ACM, New York, NY, USA, 8 pages.

Reichard, B., Reed, M. S., Chubb, J., Hall, G., Jowett, L., Peart, A., & Whittle, A. (2020). Writing impact case studies: a comparative study of high-scoring and low-scoring case studies from REF2014. Palgrave Communications, 6(1), 1-17. 

Watermeyer, R., & Chubb, J. (2019). Evaluating impact in the UK's Research Excellence Framework (REF): liminality, looseness and new modalities of scholarly distinction. Studies in Higher Education, 44(9), 1554-1566.

Hancock, S., Wakeling, P., & Chubb, J. (2019). 21st Century PhDs: Why we need better methods of tracking doctoral access, experiences, and outcomes. Research on Research Institute.

Chubb, J., & Reed, M. S. (2018). The politics of research impact: academic perceptions of the implications for research funding, motivation and quality. British Politics, 13(3), 295-311. Special Edition, invited contribution.

Chubb, J., & Reed, M. (2017). Epistemic responsibility as an edifying force in academic research: investigating the moral challenges and opportunities of an impact agenda in the UK and Australia. Palgrave Communications, 3(1), 1-5.

Chubb, J., Watermeyer, R., & Wakeling, P. (2017). Fear and loathing in the academy? The role of emotion in response to an impact agenda in the UK and Australia. Higher Education Research & Development, 36(3), 555-568. Special Edition, invited contribution.

Chubb, J., & Watermeyer, R. (2017). Artifice or integrity in the marketization of research impact? Investigating the moral economy of (pathways to) impact statements within research funding proposals in the UK and Australia. Studies in Higher Education, 42(12), 2360-

Book chapters

Baker et al. (2022). Do-it-together: Punk Methodologies for Researching the Heritage of Popular Music. Track 11 on documentary film. In prep.

Chubb, J. (2014). How does the impact agenda fit with attitudes and ethics that motivate research? In P.M. Denicolo (ed), Success in Research: Achieving Impact in Research. (pp. 20 – 32). London: Sage.

Chubb, J. (2014). What skills are needed to be an impactful researcher? In P.M. Denicolo (ed), Success in Research: Achieving Impact in Research. (pp. 113-126). London: Sage.

Book review 

Gemma Milne, Smoke & Mirrors: How Hype Obscures the Future and How to See Past It (2023)

Policy publications

XR Stories and SIGN (2022) Creative Futures. Written evidence submitted to the House of Lords Communications and Digital Committee.

A Taxonomy for AI in Science (2023). The Royal Society. (Forthcoming)

Chubb, J.A. (2022). Industry Briefing: Conversational AI for Children, Screen Network.

Šlosarčík, I., Meyer, N., & Chubb, J. (2020). Science diplomacy as a means to tackle infectious diseases: The case of Zika. In Young, M., Flink, T., & Dall, E. (eds.), Science Diplomacy in the Making: Case-based insights from the S4D4C project.

Šlosarčík, I., Meyer, N., & Chubb, J. (2020). Science diplomacy and infectious diseases: Between national and European narratives - Using science diplomacy for addressing global challenges. European Commission.

Stilgoe, J., Stirling, A., Chubb, J., Montana, J., & Wilsdon, J. (2018). A review of recent evidence on the governance of emerging science and technology. The Wellcome Trust. [3 citations]

Weinstein, N., Wilsdon, J., Chubb, J., Haddock, G., (2019). The Real Time REF Review: A Pilot Study to Examine the Feasibility of a Longitudinal Evaluation of Perceptions and Attitudes Towards REF 2021. Research England.

Weinstein, N., Wilsdon, J., Chubb, J., Haddock, G., (2019). The Real-Time REF Review. Full Working Paper. Research England. [16 citations]

Hancock, S., Wakeling, P., & Chubb, J. (2019). 21st Century PhDs: Why we need better methods of tracking doctoral access, experiences and outcomes. Research on Research Institute (RoRI) Report.

Underpinning research assistance in NESTA's Biomedical Bubble Report by Jones, R. & Wilsdon, J. (2018).

Chubb, J. A. (2019). Policy Shops in Higher Education. Report prepared for the University of Sheffield; HEIF-funded project.

External activities


Reviewing duties (2017-2023) AI and Society, New Media and Society, Humanities and Social Sciences Communication, Discover Artificial Intelligence, Responsible Innovation, Studies in Higher Education, Social Science & Medicine, Critical Studies in Education, Palgrave Communications, More than Human Centred Design Conference June 2020, Higher Education Research and Development, AI and Ethics, Arts and Humanities in Higher Education, Oxford Press, Routledge books. 

Editorial responsibilities - Humanities & Social Sciences Communications, Springer Nature.

Peer Review Committees – Wellcome Trust Early-Career Social Sciences Committee.

Public outreach – Exhibition co-curator, ‘Archive All Areas: People, Places, Memories & Music’ (2023).

Presentations, blogs and media

AI could help you find your next favourite song (interview)

Three key themes on artificial intelligence (2024)

Truth about killer robots: AI stories and narratives (2023)

‘AI in the research process’ (2023) A workshop for JISC and senior university managers. London, UK.

‘Speeding up to keep up?’ (2023). A paper for the Swiss National Science Foundation. Switzerland.

I worry about how my child will be affected by AI assistants. I'm teaching her how to fact-check their answers. (Pickburn, 2023). The Insider. 

Chubb, J.A. (2023) Exploring public attitudes to AI in music-making and gig-going. Article for Ensemble Magazine. The UK’s Association for Music Teachers. 

Chubb, J.A. (January, 2022). The AI Story Crisis. York Talks, University of York. 

Chubb & Cetin (2022). We need better AI imagery for better science communication.

Chubb, J.A. (2022). AI, Narratives, Sound and Equality. 

Chubb, J.A. (2022). Industry briefing. Conversational AI with Joi Polloi, BAFTA Award Winning Company.

Chubb, J. & Beer, D (2022). tl;dr – AI and the acceleration of research communication.

Chubb, J.A. (2021). Research report delivered to technology company Joi Polloi - a literature review of ethical considerations for the design of conversational AI for children’s storytelling.

Chubb & Maloney (2022). AI, what’s that sound? Stories and Sonic Framing of AI.

Chubb, J. A., et al. (2021). AI in research could be a rocket booster or a treadmill. Research Professional.

Ethics of Conversational AI for Children - York Festival of Ideas June 2021 online conference.

‘Conversational AI for Children’ - Ethics and Technological Review - Department of Computer Science, University of York January 2021.

‘Conversational AI for Children’ - Ethics and Technological Review - Department of Sociology, University of York December 2021.

Good Practices in the use of Machine Learning and AI by Research Funding Organisations: A virtual RoRI workshop in three acts – 11, 18 and 25 January 2021.

The Future of AI: People and Data. Aesthetica Film Festival. November 2020.

The Future of AI and Humanity. Yornight, 8 February 2020, with Professor Peter Cowling 

Invited Speaker for BBC Voice and AI. September 2020. Responsible Innovation and Design Ethics of Voice Technology ‘Beeb.’ 

Women in Technology. York. Talk, December 2020. AI and Ethics. 

Impact: The Teenage Years. San Servolo, Venice. Venice International University. Invited talk by the Coimbra Group, Coimbra Group High-Level Seminar on Research Policy, San Servolo, 6-7 December 2018.

Australian Broadcasting Centre (ABC NEWS) - In the age of hyper-productivity and hustling, I’m embracing learning for learning’s sake.


Australian Broadcasting Centre (ABC NEWS) - Academics dramatise expected outcomes.  

Australian Broadcasting Centre (ABC NEWS) - Study finds academics embellishing on grants. 

LSE Impact Blog - Impact a-gender? calling out the gendered nature of research impact and assessment.

Nature Index - Steps to a top-scoring impact case study. 2018. 

WonkHE article -  Why we need a real time REF Review to plan for 2027.

WonkHE article - Re-evaluating REF: have reforms changed how researchers experience the exercise? 2019.  

Times Higher Education - REF Review. 

Converge in Research Professional - REF Review. 

The Conversation - Academics feel pressure to embellish. 

The Conversation - The value of knowledge. 

Times Higher Education - Integrity and grants.

Are AI documentaries sonic playgrounds of hype? - AI Ethics, Talk for Women in Tech Network, York, Revolution Bar, 11 January 2024.




Teaching

  • Supervision
  • AI in Society (Sociology) (Module Convenor)



  • Supervision
  • Future of Story (School of ACT)
  • Researching Digital Life
  • MSc AI in the Creative Industries (Sociology and School of ACT) (Module Convenor)
  • PGT Dissertation



Other teaching


Director of Student Experience and Academic Lead for Supervision (UG/PGT)


PhD supervision

Yorgos Paschos - Assessing the (Sub)cultural Heritage Significance of Grassroots Music Venues.

Paul Ord - Algorithmic Hauntings: miscarriage and grief in online social media.

Contact details

Dr Jennifer Chubb
Dept of Sociology
University of York
YO10 5GD

Tel: +44 (0)1904 32 2163