My work focuses on the role of responsibility in science and the interplay between science and society.
My background is in Philosophy (University of Leeds) and Social Science (University of York); as such, I apply philosophical theory and applied ethics to sociological problems. My research expertise spans epistemic responsibility, ethics, socio-technical futures, the governance of science and research, algorithms and AI, knowledge systems and notions of epistemic value, meta-research, impact and metrics, scientific narratives, and the public perception of science and technology. In recent years my research has focused on responsible storytelling and the ethical development of Artificial Intelligence (AI). This interest in AI extends across a range of domains, with specialisms in higher education, health and the creative industries. I am particularly interested in virtue/vice epistemology and moral care as applied to digital futures. My current research focuses on public perceptions of the role of AI in music making and gig-going.
I completed my PhD in 2017 at the University of York where I explored the notion of instrumentalism and epistemic responsibility in science and research. I re-joined the University of York in 2019 as a postdoctoral research associate at the Digital Creativity Labs where I worked on AI futures. Prior to this I held a research position at the University of Sheffield, focusing on institutions, research policy, expertise in science policy, advice and diplomacy. In 2020, I began a fellowship with XR Stories primarily focused on ethical and responsible storytelling. The fellowship ‘AI, what’s that sound?’ brought together my love of music and research interests by exploring the ways AI is sonically portrayed in public narratives, specifically documentary. Relatedly, I co-supervise a WRoCAH Collaborative Doctoral Award Studentship exploring live music venues as sites of cultural heritage.
I am a board member of the Science and Technology Studies Unit, a member of the AsSIST-UK executive board and an editor for the Springer Nature journal Humanities and Social Sciences Communications. I am a committee member of the Wellcome Trust Early-Career Social Sciences Committee and an appointed advisor to the Better Images of AI project, which tackles stereotypes in AI imagery.
I am an interdisciplinary researcher focusing on the impact of science and technology on society and culture. My research broadly addresses the ethical implications of science for the public, practice and policy, and the role of stories in making sense of science. I have particular interests in the domains of music, higher education, interactive media and health, and in the implications for younger generations and marginalised groups.
In recent years, my research agenda has focused heavily on the implications of Artificial Intelligence (AI). I have conducted futures research and qualitative empirical work to understand what desirable outcomes look like in a future with AI: who benefits, and who is disadvantaged? I am interested in how we mitigate bias and injustice, and how we protect marginalised groups and young people. I have examined the ethical impact of AI in health, in science and in the creative industries. Specifically, I am interested in the role that stories play in shaping our understanding of the future: what dominant narratives exist about AI, and to what extent they differ, through hype and embellishment, from the current state of the technology.
Recent projects include:
AI, what’s that sound? Public attitudes toward the role of Artificial Intelligence (AI) in music making and live music.
Artificial Intelligence (AI) and algorithms play an increasingly prevalent role in how we consume culture, and our live music experience is changing. Immersive live experiences are proliferating: ABBA’s ‘Voyage’ famously launched as a digital holographic concert. During the pandemic, augmented reality and streaming allowed us to experience virtual venues in the comfort of our own homes. From Fortnite to Minecraft, some of the world’s biggest virtual gaming spaces hosted gatherings of thousands of music fans. At the same time, algorithms are influencing our music consumption and streaming behaviour. Want to know your ‘year unwrapped’? Algorithms will do that for you. Recommendation systems enable you to explore new and exciting genres and bands. What’s more, AI is creative: it can tell you a joke, paint a picture, tell you stories and compose songs. But does it know the joke is funny, or understand the beauty of a painting? What, then, are the likely impacts on live music and the future of music? Whilst there is research into how listeners and musicians perceive artificially created music, less is known about public attitudes towards the disruption and innovation taking place in our music making, performing and gig-going. I am running a survey about AI in music-making and gig-going; one output of this work was a public exhibition at Streetlife, York.
The Sonic Framing of AI in documentaries - AI, what’s that sound? The potential of sound for storytelling, and its impact on public perception, is often ignored in favour of other aspects of a story’s production, such as its contextual, technical or visual framing. Yet the soundtrack and sound design of stories allow the storyteller to speak to the audience and can influence their attitudes towards, and understanding of, a narrative object. The aim of this research is to understand in what ways a failure to consider the sonic framing of AI influences or undermines attempts to broaden public understanding of AI. Based on our preliminary impressions, we argue that the sonic framing of AI is just as important as other narrative features. Some of this research is published with We and AI, with a journal article forthcoming.
ALGorithms – This project aims to bring together expertise across the humanities and social sciences interested in Algorithms, Loss and Grief.
A Vice Framework for AI – In this work, we argue that if we myopically focus on the moral positives of AI, we risk inadvertently creating corrupting techno-social systems, potentially making our future work environments even worse than they are already likely to be. Contemporary theorising about the ethical dimensions of AI exhibits at least two problematic tendencies. The first is a default focus on concepts such as autonomy, justice, fairness and non-discrimination, explicable in terms of the entrenchment of a liberal morality of the sort central to much contemporary moral and political philosophy. The second is a focus on concepts that track the positive dimensions of moral practice and experience, such as virtue, flourishing, moral progress and the good life (Vallor, 2016). Taken together, the result is large-scale frameworks for the moral appraisal of new technologies that fail to systematically address the negative aspects of moral life (Jobin et al., 2019): one failing is an emphasis on virtues and excellences to the occlusion of vices and failings; another is an affirmation of the edifying potential of new technologies to the relative neglect of their corrupting possibilities (Coeckelbergh, 2020, p.59). We suggest that a more comprehensive assessment of the moral possibilities of new technologies in the future of work requires a more complex set of concepts and sensibilities, and that there are good empirical and theoretical reasons to consider the vicious and corrupting dimensions of new technologies. If so, exploring the negative moral possibilities of these technologies requires a greater sensitivity to what we will call technomoral vices.
AI & Society Network – Think slowly and fix things! This network, co-led with ethicist Dr Zoe Porter of the Institute for Safe Autonomy, is being established to coordinate and support thoughtful reflection on AI development across the University.
AI Futures – In collaboration with the Digital Creativity Labs, this project explored the opportunity spaces (and gaps) noted in living with data and AI. The project yielded several outputs including invited talks and journal articles exploring AI in science and AI missing narratives.
I have published in AI & Society, the International Journal of Child-Computer Interaction, the Journal of Empirical Research on Human Research Ethics, Theory and Research in Education, Studies in Higher Education, Higher Education Research and Development, Higher Education Quarterly and British Politics. I have also conducted a number of pieces of consultancy and written policy reports for the Royal Society, the European Commission, Research England and the Wellcome Trust.
Full publications list
Chubb, J., Cowling, P., & Reed, D. (2022). Expert views about missing AI narratives: is there an 'AI story crisis'? AI & Society: Knowledge, Culture and Communication.
Weinstein, N., Chubb, J., & Wilsdon, J. (2022). Supported or Stressed While Being Assessed? How Motivational Climates in UK University Workplaces Promote or Inhibit Researcher Well-Being. Higher Education Quarterly (accepted).
Chubb, J., Cowling, P., & Reed, D. (2021). Speeding up to keep up: exploring the use of AI in the research process. AI & Society, 1-19.
Kidd, I. J., Chubb, J., & Forstenzer, J. (2021). Epistemic corruption and the research impact agenda. Theory and Research in Education, 19(2), 148-167.
Chubb, J., Missaoui, S., Concannon, S., Maloney, L., & Walker, J. A. (2021). Interactive storytelling for children: A case-study of design and development considerations for ethical conversational AI. International Journal of Child-Computer Interaction, 100403.
Samuel, G., Chubb, J., & Derrick, G. (2021). Boundaries Between Research Ethics and Ethical Research Use in Artificial Intelligence Health Research. Journal of Empirical Research on Human Research Ethics, 15562646211002744.
Weinstein, N., Chubb, J. A., Haddock, G., & Wilsdon, J. R. (2021). A conducive environment? The role of need support in the higher education workplace and its effect on academics' experiences of research assessment in the UK. Higher Education Quarterly.
Chubb, J., & Derrick, G. E. (2020). The impact a-gender: gendered orientations towards research Impact and its evaluation. Palgrave Communications, 6(1), 1-11.
Ingram, C., Chubb, J., Boardman, C., & Ursu, M. (2020). Generating Real-World Impact from Academic Research: Experience Report from a University Impact Hub. IEEE/ACM 42nd International Conference on Software Engineering Workshops (ICSEW’20), May 23–29, 2020, Seoul, Republic of Korea. ACM, New York, NY, USA, 8 pages. [3 citations]
Reichard, B., Reed, M. S., Chubb, J., Hall, G., Jowett, L., Peart, A., & Whittle, A. (2020). Writing impact case studies: a comparative study of high-scoring and low-scoring case studies from REF2014. Palgrave Communications, 6(1), 1-17. [8 citations]
Watermeyer, R., & Chubb, J. (2019). Evaluating impact in the UK’s Research Excellence Framework (REF): liminality, looseness and new modalities of scholarly distinction. Studies in Higher Education, 44(9), 1554-1566.
Hancock, S., Wakeling, P., & Chubb, J. (2019). 21st Century PhDs: Why we need better methods of tracking doctoral access, experiences, and outcomes. Research on Research Institute.
Chubb, J., & Reed, M. S. (2018). The politics of research impact: academic perceptions of the implications for research funding, motivation and quality. British Politics, 13(3), 295-311. Special Edition, invited contribution.
Chubb, J., & Reed, M. (2017). Epistemic responsibility as an edifying force in academic research: investigating the moral challenges and opportunities of an impact agenda in the UK and Australia. Palgrave Communications, 3(1), 1-5.
Chubb, J., Watermeyer, R., & Wakeling, P. (2017). Fear and loathing in the academy? The role of emotion in response to an impact agenda in the UK and Australia. Higher Education Research & Development, 36(3), 555-568. Special Edition, invited contribution.
Chubb et al., 'Top paper' [181 citations]
Chubb, J., & Watermeyer, R. (2017). Artifice or integrity in the marketization of research impact? Investigating the moral economy of (pathways to) impact statements within research funding proposals in the UK and Australia. Studies in Higher Education, 42(12), 2360-
Baker et al., (2022). Do-it-together: Punk Methodologies for Researching the Heritage of Popular Music. Track 11 on Documentary film. In prep.
Chubb, J. (2014). How does the impact agenda fit with attitudes and ethics that motivate research? In P.M. Denicolo (ed), Success in Research: Achieving Impact in Research. (pp. 20 – 32). London: Sage.
Chubb, J. (2014). What skills are needed to be an impactful researcher? In P.M. Denicolo (ed), Success in Research: Achieving Impact in Research. (pp. 113-126). London: Sage.
Forthcoming invited book review of ‘Smoke & Mirrors: How Hype Obscures the Future and How to See Past It’ (2020).
XR Stories and SIGN (2022) Creative Futures. Written evidence submitted to the House of Lords Communications and Digital Committee. [link, when available]
A Taxonomy for AI in Science. The Royal Society. [link, when available]
Chubb, J.A. (2022). Industry Briefing: Conversational AI for Children, Screen Network. https://screen-network.org.uk/publication/industry-briefing-conversational-ai-for-children/
Šlosarčík, I., Meyer, N., & Chubb, J. (2020). Science diplomacy as a means to tackle infectious diseases: The case of Zika. In: Young, M., Flink, T., & Dall, E. (eds.) (2020). Science Diplomacy in the Making: Case-based insights from the S4D4C project.
Šlosarčík, I., Meyer, N., Chubb, J., (2020) Science diplomacy and Infectious diseases: Between national and European narratives - Using science diplomacy for addressing global challenges. European Commission.
Stilgoe, J., Stirling, A., Chubb, J., Montana, J., & Wilsdon., J (2018). A review of recent evidence on the governance of emerging science and technology. The Wellcome Trust. [3 citations]
Weinstein, N., Wilsdon, J., Chubb, J., Haddock, G., (2019). The Real Time REF Review: A Pilot Study to Examine the Feasibility of a Longitudinal Evaluation of Perceptions and Attitudes Towards REF 2021. Research England.
Weinstein, N., Wilsdon, J., Chubb, J., Haddock, G., (2019). The Real-Time REF Review. Full Working Paper. Research England. [16 citations]
Institute, RoRI; Hancock, S., Wakeling, P & Chubb, J., (2019). 21st Century PhDs: Why we need better methods of tracking doctoral access, experiences and outcomes. Research on Research Institute. Report.
Underpinning research assistance in NESTA’s Biomedical Bubble Report by Jones, R. & Wilsdon, J. (2018).
Chubb, J. A. Report prepared for the University of Sheffield: Policy Shops in Higher Education. HEIF funded project (2019).
Reviewing duties (2017-2023) AI and Society, New Media and Society, Humanities and Social Sciences Communication, Discover Artificial Intelligence, Responsible Innovation, Studies in Higher Education, Social Science & Medicine, Critical Studies in Education,
Palgrave Communications, More than Human Centred Design Conference June 2020, Higher Education Research and Development, Arts and Humanities in Higher Education, Oxford Press, Routledge books.
Editorial responsibilities - Humanities & Social Sciences Communications, Springer Nature.
Peer Review Committees – Wellcome Trust Early-Career Social Sciences Committee.
Citizenship - Research Staff Liaison Officer 2021- 23.
Public outreach – Exhibition curator, ‘Archive All Areas: People, Places, Memories & Music’ (2023).
Talks, blogs and media
Chubb, J.A. (2023) Exploring public attitudes to AI in music-making and gig-going. Article for Ensemble Magazine. The UK’s Association for Music Teachers. [link, when available]
Chubb, J.A. (January, 2022). The AI Story Crisis. York Talks, University of York.
Chubb & Cetin (2022). We need better AI imagery for better science communication https://blogs.lse.ac.uk/impactofsocialsciences/2022/07/25/we-need-better-ai-imagery-for-better-science-communication/
Chubb, J.A. (2022). AI, Narratives, Sound and Equality. https://blog.betterimagesofai.org/ai-narratives-sound-and-equality/
Chubb, J.A. (2022). Industry briefing. Conversational AI with Joi Polloi, BAFTA Award Winning Company.
Chubb, J. & Beer, D (2022). tl;dr – AI and the acceleration of research communication.
Chubb, J.A. (2021). Research Report delivered to technology company Joi Polloi - a literature review of ethical considerations for the design of conversational AI for children’s storytelling.
Chubb, & Maloney (2022). AI, what’s that sound? Stories and Sonic Framing of AI https://blog.betterimagesofai.org/ai-whats-that-sound-stories-and-sonic-framing-of-ai/
Chubb, J.A. et al. (2021). AI in research could be a rocket booster or a treadmill. Research Professional.
Chubb, J.A. Blog. 2021. ALEXA, Go Away! DC Labs website.
Chubb, J.A. Report. 2021 AI at a time of crisis. University of York website.
Chubb, J.A., Cowling, P. & Reed, D. 2021 Blog Everyday ethics - expert behaviour and AI.
Ethics of Conversational AI for children - York Festival of Ideas June 2021 online conference.
Conversational AI for children - Ethics and Technological Review - Department of Computer Science, University of York January 2021
Conversational AI for Children - Ethics and Technological Review - Department of Sociology, University of York December 2021
Good Practices in the use of Machine Learning and AI by Research Funding Organisations: A virtual RoRI workshop in three acts – 11, 18 and 25 January 2021.
The Future of AI: People and Data. Aesthetica Film Festival. November 2020.
The Future of AI and Humanity. Yornight, 8 February 2020, with Professor Peter Cowling https://www.york.ac.uk/news-and-events/events/yornight/2020/talks/future-ai/
Invited speaker, BBC Voice and AI, September 2020: responsible innovation and design ethics of the ‘Beeb’ voice technology.
Women in Technology. York. Talk, December 2020. AI and Ethics.
Impact: The Teenage Years. Invited talk at the Coimbra Group High-Level Seminar on Research Policy, Venice International University, San Servolo, Venice, 6-7 December 2018.
Australian Broadcasting Corporation (ABC News) - In the age of hyper-productivity and hustling, I’m embracing learning for learning’s sake.
Australian Broadcasting Corporation (ABC News) - Academics dramatise expected outcomes https://www.abc.net.au/news/2016-03-11/academics-dramatise-expected-study-outcomes-for-funding-study/7238694
Australian Broadcasting Corporation (ABC News) - Study finds academics embellishing on grants https://www.abc.net.au/radio/programs/am/study-finds-academics-embellishing-on-grant/7238954
LSE Impact Blog - Impact a-gender? calling out the gendered nature of research impact and assessment https://blogs.lse.ac.uk/impactofsocialsciences/2020/04/29/impact-a-gender-calling-out-the-gendered-nature-of-research-impact-and-assessment/
Nature Index - Steps to a top-scoring impact case study. 2018. https://www.natureindex.com/news-blog/steps-to-a-top-scoring-impact-case-study
WonkHE article - Why we need a real time REF Review to plan for 2027 https://wonkhe.com/blogs/why-we-need-a-real-time-ref-review-to-plan-for-2027/
WonkHE article - Re-evaluating REF: have reforms changed how researchers experience the exercise? 2019. https://wonkhe.com/blogs/re-evaluating-ref-have-reforms-changed-how-researchers-experience-the-exercise/
Times Higher Education - REF Review. https://www.timeshighereducation.com/news/ref-15-academics-survey-made-
Converge in Research Professional - REF Review. https://www.researchresearch.com/news/article/?articleId=1382031
The Conversation - Academics feel pressure to embellish. https://theconversation.com/academics-admit-feeling-pressure-to-embellish-possible-impact-of-research-56059
The Conversation - The value of knowledge. https://theconversation.com/academics-fear-the-value-of-knowledge-for-its-own-sake-is-diminishing-75341
Times Higher Education - Integrity and grants. https://www.timeshighereducation.com/news/academics-regularly-lie-to-get-
Regular blog on AI Futures for DC Labs http://www.digitalcreativity.ac.uk/blog
Yorgos Paschos - Assessing the (Sub) cultural Heritage Significance of Grassroots Music Venues.
Paul Ord - Algorithmic Hauntings: miscarriage and grief in online social media.