|19 April||Dr Miles Whittington, HYMS, York||
Modelling for All, participatory ABM project
Deep sleep is characterised by large, slow rhythmic changes in the electrical activity of the brain. This rhythmic activity takes many forms but the most common is the delta rhythm - a 1-4 Hz oscillation in cortical potential readily observable in EEG and MEG recordings. Current consensus is that delta is generated by interaction between the cortex and thalamus, with a dominant thalamic rhythm-generating circuit being well characterised. However, recent detailed analysis of delta rhythms in humans reveals discrepancies that do not fit with the idea of a purely thalamic generator. This presentation will show evidence for a local neocortical delta-generating neuronal circuit and demonstrate how it controls interactions between different layers, facilitating an unsupervised synaptic learning environment. Taking cues from precedented aberrations in cortical dynamics associated with deep sleep and learning disability in humans, evidence for a specific mechanism underlying disruption of this learning environment will be considered.
|15 March||Pirmin Nietlisbach||
Causes and consequences of genetic diversity: Investigating natural and sexual selection on heterozygosity
The importance of individual genetic diversity, i.e. heterozygosity, as a factor influencing individual fitness has long been recognized.
|22 February||Richard Cotton, Health & Safety Laboratory||
Blood Sweat and Urine
Richie Cotton is a data scientist who spent six years working at the Health and Safety Laboratory. In this talk he recounts his job swap with a chemist, and tells you what he learnt from the experience. The talk includes:
|1 February||Dr Chris Sutton, University of Bradford||
Molecular complexity of collagens – new opportunities for proteomics and chemometrics
As part of a breast cancer proteomics study, a number of collagens were identified, some of which were increased in tumours compared to normal breast tissue. The proteomics data revealed an extensive array of peptides which defined two-dimensional heterogeneity, in degree and location of hydroxyproline modification, within collagen structure. The data provided a census of hydroxyproline levels using novel definitions for the rate of site occupancy (r) for each peptide isomer and the total site occupancy (t) for each proline residue. Potential applications of the approach in the characterisation of collagens will be discussed.
|25 January||Professor Thomas McLeish, University of Durham||
Medieval Science: Uncovering meaning with an interdisciplinary methodology
A novel interdisciplinary project within the Institute of Medieval and Renaissance Studies at Durham University brings medieval scholars together with practising scientists to work together on important documents from the “13th century renaissance”.
This seminar focuses on two important works by Robert Grosseteste, Franciscan Master at Oxford and Bishop of Lincoln, from the period around 1220. De Luce and De Colore (“On Light” and “On Colour”) give us access to a remarkable medieval mind intent on exploring the scientific structure and function of light, already theologically central to his thought.
|11 January||Dr Oliver Craig, Archaeology, York||
Chemical and isotopic analysis of archaeological pottery – dealing with mixtures through Bayesian inference
Analysis of food residues in ceramic cooking pots is a major field in science-based archaeology and has led to some high-profile discoveries, such as the earliest evidence for dairying, evidence for Stone Age culinary practices and how hunter-gatherers used pottery. The state of the art involves comparing the stable carbon isotope ratios of the major saturated fatty acids (palmitic and stearic acid) extracted from archaeological pots with authentic modern reference fats. Recently we completed the analysis of over 200 pots from a site next to Stonehenge to look at how the people who gathered at this monument used pottery. Now in the final stages of the data analysis we have hit the problem of what to do when different products were mixed in pots (either at the same time or sequentially). This issue has received less consideration than you might expect given that the field is over 40 years old. We had the idea of using a Bayesian mixing model developed for interpreting stable isotope data of animal tissues in order to resolve mixtures of different dietary sources (Parnell et al. 2010). This model is also able to cope with concentration dependence of the sources, which makes it particularly appealing for our application. Whilst partially successful, the distribution of the source data means that the model itself might need to be re-designed in order to produce realistic probability density distributions. At the limits of my knowledge (of R and Bayesian models), I am looking for a collaborator in order to re-interpret the data in preparation for publication. Alternative approaches would also be welcome.
Parnell, A., Inger, R., Bearhop, S. & Jackson, A.L. (2010) Source partitioning using stable isotopes: coping with too much variation. PLoS ONE, 5(3): e9672.
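As a rough illustration of the kind of mixing model described above (not the Parnell et al. implementation, which is in R), here is a minimal two-source Bayesian mixing sketch with a Metropolis sampler; the source isotope values are invented for the example.

```python
import math
import random

def mixing_posterior(obs, src_means, src_sds, obs_sd=0.5,
                     n_iter=20000, seed=1):
    """Metropolis sampler for the proportion f of source 1 in a
    two-source mixture: each observation ~ N(f*mu1 + (1-f)*mu2, var),
    with source variances propagated into the mixture variance."""
    rng = random.Random(seed)
    (mu1, mu2), (sd1, sd2) = src_means, src_sds

    def log_lik(f):
        mean = f * mu1 + (1 - f) * mu2
        var = (f * sd1) ** 2 + ((1 - f) * sd2) ** 2 + obs_sd ** 2
        return sum(-0.5 * math.log(2 * math.pi * var)
                   - (x - mean) ** 2 / (2 * var) for x in obs)

    f = 0.5
    ll = log_lik(f)
    samples = []
    for _ in range(n_iter):
        # random-walk proposal, clamped to [0, 1] (implicit uniform prior)
        prop = min(1.0, max(0.0, f + rng.gauss(0, 0.05)))
        ll_prop = log_lik(prop)
        if ll_prop >= ll or rng.random() < math.exp(ll_prop - ll):
            f, ll = prop, ll_prop
        samples.append(f)
    return samples[n_iter // 2:]  # discard burn-in

# invented delta-13C values standing in for two reference fat sources
samples = mixing_posterior(obs=[-27.0, -26.5, -27.4],
                           src_means=(-29.0, -24.0), src_sds=(1.0, 1.0))
mean_f = sum(samples) / len(samples)
```

A concentration-dependent, multi-source version along the lines of Parnell et al. (2010) would replace the scalar f with a Dirichlet-distributed proportion vector.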
|14 December||Dr Simon Crouch, Health Sciences York||
The Epidemiology of Haematological Malignancies; Statistical Models for Complex Data
The haematological malignancies, cancers of the blood and lymph system, form a widely diverse class of conditions; from diseases that are asymptomatic to diseases that are rapidly fatal in the majority of patients. In addition, traditional classifications of the disease types, based largely on tumour cell morphology, are being reconsidered in the light of modern techniques for the analysis of constitutional and tumour genetics and of epigenetic factors. Such issues make the epidemiology of these diseases a complex problem; even basic parameters such as incidence and prevalence are poorly understood. In order to address these epidemiological issues the Haematological Malignancy Research Network (HMRN) was set up at the University of York in partnership with the Haematological Malignancy Diagnostic Service in Leeds. Covering the UK Cancer Networks of Yorkshire and of North-East Yorkshire and Humber, it accrues roughly 2,000 new diagnoses of haematological malignancies each year. The data held on each patient includes traditional epidemiological factors as well as diagnostic, treatment and response-to-treatment data. Essentially the entire post-diagnosis life course of the patient is available to us; pre-diagnosis data is also available through linkage to general practitioner and hospital health care records.
Such a richness of data promises much to the statistician, but also presents many technical challenges that stretch current methodology. In this talk, I will give a broad outline of the conditions themselves and the data that we hold on our patients and will then go on to discuss the analytical strategy that we are taking with this data.
|7 December||Paul Genever (Biology), Matthew Collins (Archaeology), Yvette Hancock (Physics), Roland Kroger (Physics), Paul O’Higgins (HYMS), Julie Wilson (Mathematics/Chemistry), Angelika Sebald (Chemistry)||
Understanding bone formation and degradation
Bone develops and is maintained through an interaction between mechanical forces and cellular and molecular signalling mechanisms. Bone adapts structurally and geometrically to mechanical loading. The mechanisms that determine stem cell fate, tissue remodelling, repair and regeneration are being used to generate 3D bone tissue constructs to treat human disease conditions (Genever). This can be linked to materials science, where Raman spectroscopy is used to study biomineralization, defects and interfaces (Kroger and Hancock). Solid state NMR has also been suggested as a method to study bone at the atomic level (Sebald). Mass spectrometry and mathematical modelling have been used to assess the level of degradation of collagen in bone, and the potential exists to relate this to mechanical performance (Collins and Wilson). At nano and micro scales the detailed structure of bone determines performance. At larger scales performance depends more on larger-scale bone tissue geometry, such as trabecular orientation, shape and size; at the level of whole bones, cortical thickness and overall bone size and shape matter more. O’Higgins is concerned with how large-scale bone form relates to mechanical performance. By combining these approaches to understanding bone development, form and function we will offer projects that examine the links between nano- and metre-scale bone structure in relation to the development and maintenance of mechanically competent bone. This is relevant to understanding how bone structure is adapted to lifestyles, and to issues such as fracture risk as bone architecture becomes disorganised with age or disease.
|30 November||Professor Péter Érdi, Hungarian Academy of Sciences||
Dynamics of Complex Systems
Dynamical systems theory has proved very efficient for building and testing predictive models in the physical sciences. What can the mental strategy of dynamical systems theory offer for finding the governing rules of complex biological and social systems? This question should not be labeled physicalism, but rather an interdisciplinary approach. As Newton famously remarked after his huge loss in the stock market: "I can calculate the motions of heavenly bodies, but not the madness of people." The theory of complex dynamic systems suggests that extreme events may be predicted by detecting their precursors, and that there are methodological similarities in analyzing and modeling different critical events occurring in physical, biological and social phenomena. There are initial promising results and many open problems. The following topics are discussed:
|23 November||Professor Richard Kerner, University of Paris VI||
Stochastic models of agglomeration and growth
Summary: We propose a set of models describing in a probabilistic way various processes of growth and agglomeration, ranging from ordered or disordered solid state growth, fullerene formation, glass transition to viral capsid assembly.
The talk is based on Richard's book: http://www.amazon.com/Models-Agglomeration-Transition-Richard-Kerner/dp/1860947565
|9 November||Damien Farine, University of Oxford||
Social and collective behaviour of mixed-species flocks
The choice of when and with whom to associate is a fundamental question in the study of animals that live in groups. When making such decisions, individuals are generally considered to trade off the competition costs and mutualistic benefits that arise in order to maximise their potential fitness. Mixed-species flocks, a ubiquitous case of multi-species group-living, are often described as ‘bird waves’ consisting of individuals from several species that maintain cohesion through time and space. I combine social network analysis and collective animal behaviour approaches on mixed-species flocks in the wild across multiple populations. In particular, I developed an automated system for studying the social responses of individuals to both conspecifics and heterospecifics. Together, these studies allow me to elucidate the social decisions of individuals in the context of multi-species group-living and to identify some important consequences that arise from variation in these decisions. I find that within-species variation in social network position can span the entire variation of individuals from all species present. Further, I find that these important differences can arise from intrinsic characteristics of individuals, such as dominance rank. I show that the resulting social position of individuals can be an important predictor of foraging success in heterogeneous landscapes. Finally, I present the remarkable collective behaviour that is exhibited by these mixed-species flocks and discuss the influence that the trade-off between competition and foraging benefits has on collective behaviour. Together, these results suggest that mixed-species associations are important and should perhaps be treated as much more direct extensions of single-species groups than previously considered.
Poster Session for new Research Students
|12 October||Nikolai Bode, Princeton University||
Context-dependent mechanisms for collective decision making?
Collective decision making is ubiquitous in animal groups and plays an important part in social animals' life histories. Here, we investigate information transfer mechanisms that dictate the timing of individual decisions within groups. We expose groups of fish to two distinct environmental stimuli that closely mimic real-life situations (discovery of a source of food and attack by a predator). Strong reactions at the individual level allow us to extract the time points at which individuals show a response to the stimulus. Subsequently, we use Bayesian methods to investigate the support of our data for a number of candidate mechanisms for individual decision making. We find that for both stimuli the response time of individuals is likely to be influenced by the response times of other group members. Interestingly, our data supports different mechanisms for the two stimuli, suggesting that the mechanisms for social decision making in animals could depend on the context.
Summer School Presentations
|27 July||Professor Peter Cowling, YCCSA||
Monte Carlo Tree Search - New Applications
Computerised searching through possible futures to help make good decisions now, requiring only a system model and a utility measure on final outcomes.
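As a toy illustration of the technique (not one of the applications in the talk), the following is a minimal UCT sketch for misère Nim, where players remove 1 to 3 counters and whoever takes the last counter loses.

```python
import math
import random

class Node:
    def __init__(self, n, player, parent=None, move=None):
        self.n, self.player = n, player      # counters left, player to move
        self.parent, self.move = parent, move
        self.children, self.visits, self.wins = [], 0, 0.0

def moves(n):
    return [m for m in (1, 2, 3) if m <= n]

def uct_best_move(n0, iters=3000, c=1.4, seed=0):
    """UCT search for misere Nim. Returns the most-visited root move."""
    rng = random.Random(seed)
    root = Node(n0, 0)
    for _ in range(iters):
        node = root
        # 1. selection: descend through fully expanded nodes via UCB1
        while node.children and len(node.children) == len(moves(node.n)):
            node = max(node.children,
                       key=lambda ch: ch.wins / ch.visits
                       + c * math.sqrt(math.log(node.visits) / ch.visits))
        # 2. expansion: add one untried move, if any remain
        tried = {ch.move for ch in node.children}
        untried = [m for m in moves(node.n) if m not in tried]
        if untried:
            m = rng.choice(untried)
            child = Node(node.n - m, 1 - node.player, node, m)
            node.children.append(child)
            node = child
        # 3. rollout: play the rest of the game at random
        n, player = node.n, node.player
        while n > 0:
            n -= rng.choice(moves(n))
            player = 1 - player
        winner = player   # the other player took the last counter and lost
        # 4. backpropagation: credit wins to the player who made each move
        while node is not None:
            node.visits += 1
            if node.parent is not None and winner == node.parent.player:
                node.wins += 1
            node = node.parent
    return max(root.children, key=lambda ch: ch.visits).move
```

With 4 counters the winning move is to take 3, leaving the opponent the losing last counter; the search converges on this without any game-specific knowledge beyond the simulator.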
|20 July||Professor Mat Evans, Department of Chemistry, University of York||
Challenges in Atmospheric Composition: complexity and simplicity
Understanding the composition of the atmosphere is scientifically complex and central to addressing some of society's "grand challenges" (climate change, air quality, food security, ecosystem protection). Physics, chemistry and biology all conspire to create this complexity and developing Earth System Models, which can be used to explore the inter-related nature of the different systems, is currently a globally active area of research. However, even with the world's fastest computers, the full expression of this complexity is impossible. In this talk I will discuss how some of these systems create complexity, how we can explore and visualise it and discuss ways that the complexity could be reduced without losing the underpinning science and understanding.
|6 July||Dr Matthew Simpson, Queensland University of Technology||
Velocity-jump models with crowding effects
Velocity jump processes are discrete random walk models that have many applications including the study of biological and ecological collective motion. In particular, velocity jump models are often used to represent a type of persistent motion, known as a “run and tumble”, which is exhibited by some isolated bacteria cells. All previous velocity jump processes are non-interacting, which means that crowding effects and agent-to-agent interactions are neglected. By neglecting these agent-to-agent interactions, traditional velocity jump models are only applicable to very dilute systems. Our work is motivated by the fact that many applications in cell biology, such as wound healing, cancer invasion and development, often involve tissues that are densely packed with cells where cell-to-cell contact and crowding effects can be important. To describe these kinds of high cell density problems using a velocity jump process we introduce three different classes of crowding interactions into a one-dimensional model. Simulation data and averaging arguments lead to a suite of continuum descriptions of the interacting velocity jump processes. We show that the resulting systems of hyperbolic partial differential equations predict the mean behaviour of the stochastic simulations very well.
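A minimal sketch of the idea, assuming a simple discrete-time, one-dimensional lattice version with hard-core exclusion (the model in the talk is more elaborate):

```python
import random

def velocity_jump_crowding(L=100, N=30, turn_rate=0.1, steps=200, seed=0):
    """1D lattice velocity-jump walk with hard-core exclusion: each agent
    carries velocity +1 or -1, reverses ('tumbles') with probability
    turn_rate per step, and a jump is aborted when the target site is
    already occupied (the crowding interaction)."""
    rng = random.Random(seed)
    pos = rng.sample(range(L), N)            # unique starting sites
    vel = [rng.choice((-1, 1)) for _ in range(N)]
    occupied = set(pos)
    for _ in range(steps):
        for i in rng.sample(range(N), N):    # random sequential update
            if rng.random() < turn_rate:
                vel[i] = -vel[i]             # run-and-tumble reversal
            target = (pos[i] + vel[i]) % L   # periodic boundary
            if target not in occupied:       # exclusion: abort if blocked
                occupied.discard(pos[i])
                occupied.add(target)
                pos[i] = target
    return pos, vel

pos, vel = velocity_jump_crowding()
```

Setting N small relative to L recovers the dilute, non-interacting regime; increasing N makes the exclusion rule bite, which is the effect the continuum descriptions in the talk are built to capture.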
|22 June||Dr Patrick Hogan, University of Sheffield||
Of honeybee swarms and brains: collective decision-making and the stop signal
Honeybee swarms and complex brains show many parallels in how they make decisions. In both, separate populations of units (bees or neurons) integrate noisy evidence for alternatives and, when one population exceeds a threshold, the alternative it represents is chosen. We show that a key feature of a brain, cross inhibition between the evidence-accumulating populations, also exists in a swarm as it chooses its nesting site. Nest-site scouts send inhibitory stop signals to other scouts producing waggle dances, causing them to cease dancing, and each scout targets scouts reporting sites other than her own. Using methods from statistical physics, we show how the collective behaviour of the swarm emerges from these observed interactions between individual scouts: cross inhibition between populations of scout bees increases the reliability of swarm decision-making by solving the problem of deadlock over equal sites. Recently published in the journal Science, this work illustrates the power of statistical physics for understanding social insect colonies as complex systems.
|15 June||Professor Natasha Jonoska, University of South Florida||
Describing Self-assembly of Nanostructures
There is an increased necessity for mathematical study of self-assembly in various phenomena, ranging from nano-scale structures, material design, crystals and bio-molecular cages such as viral capsids, to computing. We show an algebraic model for describing and characterizing nanostructures built from a set of molecular building blocks. A molecular building block is represented as a star-like graph with a central vertex and molecular bonding sites at the single-valent vertices. Such a graph represents a branched junction molecule whose arms contain "sticky ends" ready to connect with other molecular blocks. The algebraic approach connects the classical view of crystal dissection with a more modern system based on algebraic automata theory.
|25 May||Professor Kathleen Kiernan and John Hobcraft, Social Policy and Social Work, University of York||
What matters for Child Well Being in the Early Years?
|11 May||Dr Richard Walsh, Department of English, University of York||
What have stories got to do with systems? Narrative and emergence
The aim of this talk is to open up a dialogue between two interdisciplinary paradigms, complex systems analysis and narrative theory, which appear to be fundamentally incompatible. This incompatibility is interesting because both perspectives are concerned with ways of modelling temporal processes, and both are notable for the range of the fields in which they apply; the mismatch between them is therefore not a trivial or marginal issue. It is not an issue that lends itself to being resolved one way or the other, because these paradigms are not in competition with each other so much as locked together in mutual dependence. Beginning with some elementary observations on the concept of emergence, I shall try to show first of all that this is so, and secondly that it should not be considered an impasse, but an opportunity. I’ll suggest that there are several ways in which this encounter between concepts can be reciprocally illuminating precisely because it resists any synthesis: I’ll elaborate upon the potential consequences for narrative theory, and hint darkly (if with less confidence) at some of the possible implications for complex systems theory. I will also touch upon some of the broad methodological issues raised in the context of the practice of interdisciplinarity; and, given time, I’ll extrapolate further by sketching the impact this line of research can have upon science communication beyond academia and by referencing that grand theme (or old chestnut), the “two cultures” debate.
|27 April||Dr John Forrester, YCCSA||
Of Maps, Models and Common Heuristics ... oh and maybe the odd Ontology too!
In my previous life (before YCCSA) I dabbled more in mapping than in modelling. However, I was trying to do the same thing that I am now trying to do with models. My aspiration was then, and still is, to approach sufficiently similar ‘ontologies’ which might describe social-ecological decision-making networks. Such an ontology provides a description of how stakeholders think things are, and can then become the heuristic device around which different stakeholder inputs and discussions at different levels of governance can occur. If there are multiple or competing ontologies then the process can focus time and effort on understanding why those differences exist. In this talk I will show examples from my previous work where participatory mapping – and the maps produced as a result – has been used both as a dialectic (communication tool) and as a heuristic (aid to discussion). The talk will conclude by asking whether a model – similarly constructed to the way in which we created our maps – can become an ontology, allowing the model itself to become the focus of communication across the levels of governance. The talk will be illustrated by lots of pretty maps.
|9 March||Dr Bruce Edmonds, Manchester Metropolitan University||
Staging Abstraction using Chains of Models
The talk tackles a central dilemma for people trying to understand complex phenomena (particularly complex social phenomena). The underlying problem is that there is no reason to believe that a model of such phenomena that is adequate for our purposes will be simple enough for us to be able to understand. Any assumption to the contrary is wishful thinking. Thus we have to choose between models that are simple enough for us to check/analyse/rigorously understand but are at odds with what is observed (KISS models), OR models that are closer to the evidence of what is known to occur but that are too complex to check/analyse/rigorously understand (KIDS models). To summarise: the former have rigour, the latter relevance. The trouble is that good science requires BOTH rigour and relevance.
|24 February||Dr Richard Watson, University of Southampton||
A meta-dynamical systems view of adaptation
Grafen describes (somewhat bluntly) the fundamental relationship between the core explanandum in biology, namely design, and an optimisation process: “Adaptation is the centre of biology, adaptation is design, and maximizing fitness [by any meaningful definition of fitness] is what organisms are designed for”. But do all optimisation processes produce design? Non-trivial adaptation requires a non-trivial optimisation process.
|10 February||Dr Jon Pitchford, YCCSA||
Occam's razor and models of ecological complexity
An explosion of pollinator diversity on Earth accompanied the evolution of flowering plants, but surprisingly little is known about the mathematical and ecological “rules” that permit plants and their pollinators to co-exist. Could these rules help us understand similar patterns found where different components of a system work together, generating reciprocal benefits, from sub-cellular to social and economic networks? Previous attempts to answer this question, using complicated mathematics and supercomputers, disguised simpler and ecologically-meaningful patterns and mechanisms. In the models we present, simply counting the number of partners a species interacts with is a reliable predictor of biodiversity. This shifts the emphasis away from performing endless computations built on mathematical idealisations, and towards a more careful look at the biological detail structuring the natural world.
|27 January||Professor Hassan Ugail, University of Bradford||
Facial Emotion Analysis using Visual and Thermal Imaging
This talk will centre around our recent work on computer based human facial analysis for emotion detection and analysis. The talk will focus on how cues from both visual and thermal domain can be identified and a particular application in focus is the detection and analysis of potential deception. Discussions will also focus on the experimental setup through which sufficient data has been collected and analysed to determine the individual baseline which serves as the ground truth for interrogation. Further our current integrated setup for non-invasive lie detection will be outlined and future direction of this research will be discussed.
|13 January||Dr Oliver Ebenhoeh, University of Aberdeen||
The role of mixing entropy in carbohydrate metabolism
The vast diversity of carbohydrates is generated by carbohydrate-active enzymes (CAZymes) accepting many different substrates and catalyzing numerous reactions. This promiscuity is in stark contrast to most enzymes active in central metabolism, which are highly specific and catalyze exactly one or a very small number of reactions. The multitude of accepted substrates and catalyzed reactions makes CAZymes hard to characterize in classical enzymological terms. Equilibrium constants and Michaelis constants, which were developed for highly specific enzymes, have no straightforward analogue for CAZymes.
|16 December||Dr Thomas House, Warwick Mathematics Institute||
Modelling influenza in structured populations
Epidemic models are now routinely used to inform government policy on infectious diseases, particularly pandemic influenza. This talk will review the role of complex population structure in such models, showing how modern data-driven approaches and computationally intensive inference can be used to improve our policy recommendations.
|9 December||Dr Jamie Wood, YCCSA||
Speed distributions in collective motion: modelling them and what it led to
Computer models of collective motion are a popular area of research, but comparing them to empirical data is a task still in its infancy. Popularised in the theoretical literature by the physics community as spin systems with convection, they are a fascinating field of work. I will first describe an abstract modelling exercise that first highlighted the importance of the speed of individuals within a flock or shoal. I will then introduce a new modelling framework that we have devised in order to better capture these details, as well as the empirical work we have conducted to verify the principle of our approach. I will then discuss how our new model also captures effects seen in the most extensive empirical work to date before concluding with some new work that highlights the importance of underlying social preferences.
|25 November||Dr Julianne Halley, Wellcome Trust, Cambridge||
Self-organizing circuitry and emergent computation in mouse embryonic stem cells
Pluripotency is a cellular state of multiple options. Here, we highlight the potential for self-organization to contribute to stem cell fate computation. A new way of considering regulatory circuitry is presented that describes the expression of each transcription factor (TF) as a branching process that propagates through time, interacting and competing with others. In a single cell, the interactions between multiple branching processes generate a collective process called ‘critical-like self-organization’. We explain how this phenomenon provides a valid description of whole genome regulatory circuit dynamics. The hypothesis of exploratory stem cell decision-making proposes that critical-like self-organization (also called rapid self-organized criticality) provides the backbone for cell fate computation in regulative embryos and pluripotent stem cells. Unspecific amplification of TF expression is predicted to initiate this self-organizing circuitry, where cascades of gene expression propagate and may interact either synergistically or antagonistically. The emergent and highly dynamic circuitry is affected by various sources of selection pressure, such as the expression of TFs with disproportionate influence over other genes, and extrinsic biological and physical stimuli that differentially modulate particular gene expression cascades. Extrinsic conditions continuously trigger waves of transcription that ripple throughout regulatory networks on multiple spatiotemporal scales, providing the context within which circuitry self-organizes. In this framework, a distinction between instructive and selective mechanisms of fate determination is misleading because it is the 'interference pattern', rather than any single instructing or selecting factor, that is ultimately responsible for computing and directing cell fate. 
Using this framework, we consider whether the idea of a naïve ground state of pluripotency and that of a fluctuating transcriptome are compatible, and whether a ground state like that captured in vitro could exist in vivo.
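As a toy illustration of the branching-process picture of expression cascades (nothing here reflects real transcription-factor kinetics), a critical Galton-Watson cascade can be simulated in a few lines:

```python
import random

def expression_cascade(offspring_probs=(0.25, 0.5, 0.25),
                       max_events=10000, seed=0):
    """Toy Galton-Watson cascade: each 'expression event' triggers 0, 1
    or 2 further events with the given probabilities. The mean offspring
    number is 1, so the process sits at criticality and cascade sizes
    are heavy-tailed, loosely mimicking critical-like self-organization."""
    rng = random.Random(seed)
    active, total = 1, 0
    while active and total < max_events:
        total += 1
        active -= 1
        active += rng.choices((0, 1, 2), weights=offspring_probs)[0]
    return total

sizes = [expression_cascade(seed=s) for s in range(200)]
```

Tilting the offspring probabilities above or below mean 1 gives supercritical (runaway) or subcritical (quickly dying) cascades, which is one way to picture the selection pressures described above.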
Dr Hywel Williams, University of Exeter
Modelling coevolution of bacteria and bacteriophage in marine microbial ecosystems
Marine microbial ecosystems are complex adaptive systems in which higher-level processes (e.g. material/energetic flows) emerge from lower-level interactions of a huge diversity and abundance of organisms. A good example of this complexity is the interaction between bacteriophage (viruses that infect bacteria) and their unicellular hosts. Bacteria are of central importance in marine ecosystems, where they are responsible for a large proportion of primary production and nutrient cycling. Bacteriophage are a major cause of bacterial mortality and thereby also significantly influence ecosystem function, both directly through the material products of cell lysis and indirectly by their impacts on bacterial diversity and community structure. Here I present results from an evolutionary simulation model of bacteria and phage in an aquatic environment. The model represents individual bacteria and phage with simple 'genomes' coding for phenotypic traits governing infection. Model validation is performed by comparison with empirically tested models of bacteria and phage in chemostats. Experiments show rapid coevolutionary diversification of simulated bacteria and phage, with associated impacts on community-level productivity and resource use. Metagenomic signatures of phage predation are observed in the simulated bacteria that match those seen in real marine bacteria, supporting a 'constant diversity' theory of bacterial evolution.
Professor Charles Laughton, Pharmacy, University of Nottingham
From Chemistry to Biology via Biomolecular Simulation: where are we and where are we going?
Most experimental methods for studying the structure of biomolecules (e.g. proteins and nucleic acids) at the atomistic scale present a rather static picture of the molecule in question, frozen in space and time. But thermodynamics tells us that at normal temperatures, the thermal motions of such small objects are extremely significant. Molecular dynamics has a profound influence on molecular function, and computational biomolecular simulation gives us a framework to connect fundamental principles of physics and chemistry to the emergent behaviour of these molecules - i.e., biology. In my talk I will provide a brief introduction to biomolecular simulation techniques, illustrate some applications of the method, some successes and limitations, and discuss where future developments may take us.
Professor Gary Green, York Neuroimaging Centre
Identification of the (nonlinear) processes in the human brain
Non-invasive imaging of the human brain using magnetic resonance and magnetoencephalographic methods allows, respectively, the localisation of where and when the brain responds to external events. After an introduction to the use of these core technologies, I will discuss the problems that arise due to the large scale of the datasets that are generated and the way that Bayesian methods and Extreme Statistics can be exploited to help with the localisation of function. The harder problem of identifying the mechanisms that are involved in neural processing will be posed. As first-principled, bottom-up models of the brain remain impractical, it will be argued that the use of (nonlinear) model-free methods for the identification of the invariants of a dynamic system offers a practical way forward that allows the testing of specific hypotheses and the discrimination of both normal and abnormal processing modes.
Professor John Wright, Director Bradford Institute of Health Research and Professor Kate Pickett, Health Sciences, University of York
Born in Bradford Project
The Born in Bradford (BiB) study (www.borninbradford.nhs.uk) is a multi-ethnic birth cohort study of over 13,000 families that offers a unique opportunity for multidisciplinary medical research. The genetic, epigenetic, psychological and social determinants of health, development and well-being are tracked through bespoke data collection (biological, clinical & survey data) and linkage to primary and secondary care, public health and educational systems. Unlike most population cohort studies, BiB is embedded in clinical practice, offering exciting opportunities for applied research and intervention studies (Born in Bradford Collaborative Group. Born in Bradford, a cohort study of babies born in Bradford and their parents: protocol for recruitment phase; BMC Public Health 2008; 8:327). The complete cohort covers 60% of all infants born in Bradford in a 3-year window. BiB provides the platform for both cohort-wide studies (horizontal studies e.g., socio-spatial analysis) and nested studies of population subsets (vertical studies e.g., neuroimaging) and the opportunity for skills training in mathematics, statistics and computation (to support cross-cutting analyses). The BiB platform allows research questions relating to normal and pathological infant and child growth and development, the epidemiology of the South Asian British community, the evaluation and improvement of health systems and data collection, health inequalities, and public health interventions to be addressed.
TRANSIT Students Presentations
Professor Tim Newman, College of Life Sciences, University of Dundee
Embryogenesis and cancer progression: discrete theoretical methods applied to living systems
The discreteness of biological components, whether they be proteins in a cell, cells in a tissue, or individual organisms in a population, has important consequences for the dynamics of biological systems. Discreteness also presents significant challenges for computational biology, such as the issues of heterogeneity and stochasticity. I will discuss some examples relevant to cell biology, cancer progression, and embryo development, in which stochastic process theory, along with agent-based computer simulations, has helped provide new insights and understanding.
Dr Shubha Sathyendranath, Plymouth Marine Laboratory
Bio-optical controls in the pelagic ecosystem of the ocean
Light absorption by phytoplankton is at the heart of practically all biological and bio-geochemical processes in the pelagic ocean: it is the first step towards marine primary production by phytoplankton, and all organisms at higher trophic levels depend on phytoplankton directly or indirectly for food. Phytoplankton fix some 50 GT of carbon per year globally, and are a major player in the global carbon cycle. Absorption by phytoplankton modulates the vertical distribution of solar heating in the ocean, thus affecting mixed-layer dynamics, supply of nutrients to the surface, and air-sea exchange of greenhouse gases. As we move towards an Earth-System approach to climate-change studies, the key role of phytoplankton absorption ought not to be overlooked.
Dr Paul Schweinzer, Department of Economics, University of York
Applying Game Theory to the study of 'complex' social problems
|24 June||Dr Dan Franks, YCCSA||
The social lives of killer whales
I will talk about my recent exploits in analysing the social structure of killer whale societies (Orcas). For over 40 years orca associations have been observed and recorded, providing a rich dataset which we are now exploiting. I will discuss our new findings about the social lives of killer whales and, among other things, will be claiming that male killer whales are momma's boys. I will also tell of the problems the whales, and the whale observers, have suffered over the years.
|10 June||Dr Ville Mustonen, Wellcome Trust Sanger Institute||
Quantifying selection from allele frequency time-series data
When selection is acting on a large, genetically diverse population, regions containing beneficial alleles increase in frequency. This idea can be used to map trait loci by deep sequencing the pooled DNA from the population under selection at consecutive time points, and observing allele frequency changes. Recently, such an approach was used to map loci responsible for high temperature growth using a very large pool of random segregants from a cross between two phenotypically different yeast strains.
 Revealing the genetic structure of a trait by sequencing a population under selection. L. Parts, F.A. Cubillos, J. Warringer, K. Jain, F. Salinas, S.J. Bumpstead, M. Molin, A. Zia, J. Simpson, M.A. Quail, A. Moses, E.J. Louis, R. Durbin and G. Liti. Genome Res. 2011.
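As a toy illustration of the underlying idea (this is not the method of the paper cited above; the noise-free data and the simple haploid-selection assumption are invented for the example), directional selection makes the log-odds of an allele's frequency grow roughly linearly in time, so a selection coefficient can be read off as a regression slope:

```python
import math

def estimate_selection(freqs, dt=1.0):
    """Estimate a selection coefficient s from an allele-frequency series.

    Under simple directional (haploid) selection, logit(p_t) is linear in
    time with slope s, so we fit that slope by ordinary least squares.
    """
    times = [i * dt for i in range(len(freqs))]
    logits = [math.log(p / (1.0 - p)) for p in freqs]
    tbar = sum(times) / len(times)
    lbar = sum(logits) / len(logits)
    num = sum((t - tbar) * (l - lbar) for t, l in zip(times, logits))
    den = sum((t - tbar) ** 2 for t in times)
    return num / den

# Noise-free logistic trajectory generated with s = 0.5 per generation:
true_s, p0 = 0.5, 0.1
freqs = [1.0 / (1.0 + (1.0 - p0) / p0 * math.exp(-true_s * t))
         for t in range(10)]
print(round(estimate_selection(freqs), 6))  # recovers 0.5 on noise-free data
```

Real time-series data would also need to account for genetic drift and sampling noise, which is where the approach in the paper goes well beyond this sketch.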
|27 May 2011||YCCSA||Launch Event|
|13 May 2011||Dr Tom Stafford, University of Sheffield||
Inferring cognitive architectures from high-resolution behavioural data
I will give an overview of some of the work done in our lab, the Adaptive Behaviour Research Group (http://www.abrg.group.shef.ac.uk/) in the Department of Psychology, University of Sheffield. Across human, non-human animal, simulation and robotics platforms we investigate the neural circuits that allow intelligent behaviour, bringing to bear psychological, neuroscientific and computational perspectives. We are particularly interested in the action selection problem - that of deciding what to do next (and of doing it). This talk will focus on my own work on three paradigms where we have collected high-resolution behavioural data in humans - mistakes made by expert touch typists, eye movements during visual search, and a novel paradigm for investigating the learning of new motor skills. I will comment on how we analyse such data in order to make inferences about the underlying architecture of human decision making.
|8 April 2011||YCCSA||Risks and Strategy Planning Meeting|
|1 April||Dr Tobias Galla, University of Manchester||
Agent-based modelling of the nest-site choice by honeybee swarms
In a recent paper List, Elsholtz and Seeley [Phil. Trans. Roy. Soc. B. 364, 755 (2009)] have devised an agent-based model of the nest-choice dynamics in swarms of honeybees, and have concluded that both interdependence and independence are needed for the bees to reach a consensus on the best nest site. I here present a simplified version of the model which can be treated analytically with the tools of statistical physics and which largely has the same features as the original dynamics. Based on analytical approaches it is possible to characterize the co-ordination outcome exactly on the deterministic level, and to a good approximation if stochastic effects are taken into account, reducing the need for computer simulations on the agent-based level. In the second part of the talk I will discuss a spatial extension, and show that transient non-trivial patterns emerge before consensus is reached.
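The interplay of independence and interdependence described above can be caricatured in a few lines of agent-based simulation. This is a much-simplified toy, not the List-Elsholtz-Seeley model or Dr Galla's reduction of it; the function name and all parameter values are illustrative:

```python
import random

def swarm_sim(n_bees=100, quality=(0.3, 0.7), p_independent=0.2,
              steps=2000, seed=1):
    """Toy consensus model with two candidate nest sites.

    Each step, one uncommitted bee either scouts a random site on its own
    (independence: it commits with probability equal to the site quality)
    or copies the choice of a randomly met committed bee (interdependence).
    Returns the fraction of the swarm committed to each site."""
    rng = random.Random(seed)
    commitment = [None] * n_bees
    for _ in range(steps):
        i = rng.randrange(n_bees)
        if commitment[i] is not None:
            continue
        if rng.random() < p_independent:
            site = rng.randrange(2)          # independent assessment
            if rng.random() < quality[site]:
                commitment[i] = site
        else:
            j = rng.randrange(n_bees)        # social imitation
            if commitment[j] is not None:
                commitment[i] = commitment[j]
    return [commitment.count(s) / n_bees for s in (0, 1)]

print(swarm_sim())  # the better site (index 1) usually attracts the majority
```

With imitation alone the swarm can lock in on whichever site happened to be found first; the independent-assessment channel is what keeps the consensus biased towards the better site, echoing the paper's conclusion.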
|25 March 2011||YCCSA||
Introducing YCCSA new starters since October 2010
Stewart Aitken, Celina Wong, Alex Turner, Anna Grey, Huda Mohd Azmi, Daniel Moyo, Simon Croft, Jason Liu, Richard Williams, Luis Fuente, Zoe Cook, Richard Greaves
|11 March 2011||TRANSIT Seminar: Dr Tony Shannon, Clinical Review Board, Leeds NHS Trust||
An Information Revolution for the NHS
The National Health Service in England has been in existence for over 60 years and is generally well regarded by the population it serves. My experience of the difficult transformation of emergency care in the NHS suggests that it delivers a very good service, free at the point of care. As the NHS has developed in recent years, the importance of process improvement and information technology has been increasingly recognized. In a complex environment (such as A&E), compounded by financial limitations, I plan to present my views on how we can make better use of information and technology to help drive better care and better outcomes. (See more at www.frectal.com.)
|25 February 2011||Dr John Forrester, SEI/YCCSA||
What does an anthropologist do anyway: and why is there one in YCCSA?
Official definitions of anthropology include that it is the study of humankind "in its broadest and most inclusive sense" (RAI). So that leaves a rather broad remit! Other definitions of anthropology will be explored (ants may be mentioned); followed by a personal view of what it means to be both an applied anthropologist (applying the methods of the qualitative social sciences to the analysis of practical problems – a.k.a. 'making the world a better place') but also an academic [cultural] anthropologist concerned with creating theoretical models to help us understand why people do some of the apparently silly things they do.
|11 February 2011||Prof Jim Austin & Dr Tom Jackson, Department of Computer Science||
The youShare SaaS platform
Computer Science has been funded by HEFCE to pilot a collaborative web-based system, youShare, to share and run software and hold the data used by that software. The aim is to support research projects both during and after the project. The system will be provided through a web portal and be cloud based - research software will be provided as SaaS (software as a service). We are keen to help users get onto the system and gain their feedback on it. The talk will introduce youShare and invite questions on how it can be used by researchers to share their data and software and allow easy access to their results.
|28 January 2011||Professor Susan Stepney, Department of Computer Science and YCCSA, University of York||
Formalisms for Natural Computing
Classical computing has well-established formalisms for specifying, refining, composing, proving, and otherwise reasoning about computations.
Natural Computing uses novel kinds of substrates - such as chemicals, biomolecules, even slime moulds - to perform computations that don't conform to the classical model. Although these substrates can often be "tortured" to perform classical computation, this is not how they "naturally" compute. But our ability to exploit natural computation is hampered by a lack of corresponding formalisms: we need models for building, composing, and reasoning about programs that execute in these substrates.
What does, say, a slime mould programming language look like? Here I discuss some of the issues, from the perspective of a user who wants a principled programming approach for these systems.
|14 January 2011||Dr Adam Sampson, Abertay University||
Free-Range Code: open source in practice
Free and open source software is an important part of contemporary research. Making the software we develop available for others to use, modify and build upon can promote collaboration, improve code quality and aid public engagement -- but it's not always obvious how to develop a piece of research software into a successful, sustainable open source project. This seminar will discuss the practicalities of releasing, maintaining, and building a community around open source software from an academic perspective.
|10 December||Professor David Harvey, Professor of Agricultural Economics at Newcastle University||
A Conjecture on the Nature of Social Systems
Repeated calls for interdisciplinary research and communication, both within social science and between social, natural and physical scientists, as well as with the rest of the world, frequently meet a fundamental problem: there is no commonly accepted framework or common language within and through which to integrate the largely separate social understandings generated by our social sciences.
|26 November||Professor Jerzy Gorecki, Institute of Physical Chemistry University of Warsaw||
Information Processing with Chemistry
My talk will be concerned with a few ideas on applications of chemistry in the field of information processing. At the beginning I describe operations that can be performed by reactions proceeding in a homogeneous system. Next I consider spatio-temporal phenomena in a geometrically distributed chemical medium. I focus my attention on the excitable medium with the Ru-catalyzed Belousov-Zhabotinsky reaction and information coded in the presence of excitation pulses. I show that if such a system is composed of regions characterized by different excitability levels then it can perform non-trivial operations (logic gates, memory, counting). Therefore, for information processing with chemical reactions, the geometrical complexity of the medium is as important as the complexity of the reactions involved. The same operation can be executed by a simple chemical reaction proceeding in a complex environment as well as by a complex reaction network in a homogeneous system. Concluding the presentation I show how a geometrically complex medium with interesting information processing potential can be generated as the result of self-organisation under non-equilibrium conditions.
|12 November||Professor Les Firbank, Visiting Professor in Sustainable Agriculture Faculty of Biological Sciences, University of Leeds||
Optimising land use, top down, bottom up or no chance?
Rising concern about food security has caused a major shift in Government thinking; there is more to land use planning than green belts and built developments. It is also causing a significant refocus among the agricultural and conservation communities: we may see a new balance between agricultural production and the environment, while new economic tools are being developed to cost and value the delivery of ecosystem services such as carbon sequestration, water pollution control and flood control that could reshape how land is managed. I look at different patterns of land use, including national trends, medieval villages and visions for future cities. I also look at the drivers and constraints of changing land use, and their increasing complexity and dynamism. Land use science can be used to devise new landscapes to deliver more efficient land use. But will it be worth using, and if so, should it be used?
|29 October||Dr Thomas Anderson, National Oceanography Centre, University of Southampton||
Models and Muddles
Global climate is a classic example of a complex system, with many interacting parts and emergent behaviour. Predicting how this system responds to, for example, anthropogenic forcing represents a major challenge for modellers. Just how good are our climate models? Using a range of everyday models, I will articulate the pros and cons of modelling, distinguishing models from muddles. I will then focus in particular on modelling issues in the oceanographic realm, finishing with a few comments on the merits of the current generation of climate models.
|15 October||Dr James Marshall, University of Sheffield||
Optimal collective decision-making: from neurons to ant and honeybee colonies
The problem of how to compromise between speed and accuracy in decision-making faces organisms at many levels of biological complexity. Recent models of decision-making in the vertebrate brain have shown how simple neural models can implement statistically optimal decision-making processes that are able to minimise decision time for any desired error rate. Such models consider mutually inhibitory populations of neurons. The activation threshold required of these populations can be varied to emphasise speed, or accuracy, of decision-making. There are some striking similarities between these models and models of the decision-making processes in colonies of social insects, such as ants or honeybees, when searching for a new nest site. In this talk I will present stochastic models of collective behaviour in social insects that can be directly compared with corresponding neural models, and show how they may also implement statistically optimal decision-making in a similar manner to neural circuits.
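The threshold-controlled speed-accuracy trade-off described in this abstract can be illustrated with a minimal one-dimensional drift-diffusion sketch. This is a generic textbook-style model rather than Dr Marshall's own formulation, and the drift, noise and threshold values are arbitrary:

```python
import random

def ddm_trial(drift, threshold, noise=1.0, dt=0.01, rng=random):
    """One drift-diffusion trial: accumulate noisy evidence until it
    crosses +threshold (a 'correct' choice) or -threshold (an error).
    Returns (correct, decision_time)."""
    x, t = 0.0, 0.0
    while abs(x) < threshold:
        x += drift * dt + noise * (dt ** 0.5) * rng.gauss(0.0, 1.0)
        t += dt
    return x > 0, t

def speed_accuracy(threshold, n=1000, seed=0):
    """Accuracy and mean reaction time at a given decision threshold."""
    rng = random.Random(seed)
    trials = [ddm_trial(0.5, threshold, rng=rng) for _ in range(n)]
    accuracy = sum(correct for correct, _ in trials) / n
    mean_rt = sum(t for _, t in trials) / n
    return accuracy, mean_rt

# Raising the threshold trades speed for accuracy:
for a in (0.5, 1.0, 2.0):
    acc, rt = speed_accuracy(a)
    print(f"threshold={a}: accuracy={acc:.2f}, mean RT={rt:.2f}")
```

Raising the threshold makes decisions slower but more accurate; the models in the talk show how both neural populations and insect colonies can tune this same parameter.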
|23 September||Dr James Whitacre, University of Sheffield||
How synergistic relationships between robustness, adaptiveness, and complexity can arise through the presence of degeneracy
As a research fellow at the Australian Defence Force Academy (Artificial Life and Adaptive Robotics Lab) and now at the University of Birmingham (CERCIA Computational Intelligence Lab), I have explored new bio-inspired design principles (open access http://www.tbiomed.com/content/7/1/6) (open access http://www.tbiomed.com/content/7/1/20) for facilitating robust and adaptive behavior in complex systems. More generally, my research investigates how synergistic relationships between robustness, adaptiveness, and complexity can arise through the presence of degeneracy, i.e. the co-occurrence of functional redundancy and functional plasticity (www.scitopics.com/Degeneracy_in_Biology.html).
In my research, degeneracy principles have shown promise in dynamic optimization algorithm design, simulations of biological system behavior (e.g. simple genome:proteome models), complex scheduling problems and in support of strategic planning activities associated with future transportation fleet designs at the Australian Defence Science and Technology Organization. I have also recently collaborated with researchers that are applying these concepts to develop new therapeutic strategies for reducing the evolvability of aggressive cancers. Finally, I also provide consulting support (at Adaptive Intelligent Systems Pty Ltd) in planning and design of adaptable engineered systems.
|10 September||TRANSIT Summer Scholarship Student Presentations||
Bradley Kavanagh - Voter models on complex and dynamic networks
|3 September||Professor Miklos Szilagyi, University of Arizona, Tucson||
Emergence in N-person Games
N-person games and their agent-based simulation; social dilemmas; the role of personalities; games with crossing payoff functions; emergent behavior; demonstrations; applications.
|20 August||Dr Angelika Sebald and Dr Mattias Bechmann, Department of Chemistry, York and Dr Alastair Abbot, University of Auckland New Zealand||
NMR-Based Classical Implementation of the De-quantisation of Deutsch's Problem
Solution-state NMR in particular has been used for quite a number of years for physical implementations of quantum algorithms. We have previously used solution-state NMR experiments to implement classical logic gates in various different ways. Here we will give a work-in-progress report, covering (probably): 1) classical solution-state NMR implementations of the de-quantisation of Deutsch's problem (for n = 1, 2), using 1H NMR spectra; 2) a direct comparison of costs for quantum vs. classical implementations; 3) an extension of our work to n = 3, where we will initially concentrate on identifying possibilities for full exploitation of classical approaches - we cannot predict this week how far we will be able to pursue this over the next week.
|6 August||Dr Alexey Goltsov and Dr James Bown, Centre for Research in Informatics and Systems Pathology, University of Abertay Dundee||
Systems biology modelling of cell signalling networks to inform drug design in cancer
A key challenge in treating patients with cancer is that some patients are either immediately resistant to, or develop resistance to, the drugs used in clinical settings. This variation in sensitivity to drug treatment among patients arises from complexities in the drug target system. One of the more promising target systems is the suite of signalling networks that exist within the cell. These networks regulate cell functioning, such as growth and division, and govern the cell response to external stimuli generally and drug treatments in particular. Anticancer drugs target specific pathways within this network known to govern key functioning in aberrant cells. However, the cell's inherent complexity means that it is possible for that cell to invoke alternate pathways and feedback control mechanisms to effect resistance to external interventions, including the impact of drug treatment. Systems biology modelling offers a means to understand the mechanisms by which this resistance arises and so to address the insensitivity of signalling networks to drug intervention and restore the efficacy of anticancer therapy.
|23 July||Dr Ed Clark, York Centre for Complex Systems Analysis, York||
Horror Stories from Systems Biology
Over the last decade, the approach to understanding biology has shifted towards a systems biology perspective. One contributing factor in the shift is the advance of laboratory technology, allowing high-throughput experiments that produce vast quantities of data. The data from these experiments need to be stored and analysed in order to advance our understanding. I will focus on the state of network inference methodology used in biology and present a narrative overview of the DREAM competition - a competition for comparing state-of-the-art techniques in network inference. I will also touch on the storing and sorting of biological data in the ever-growing number of databases.
|26 June||Dr Igor Nikolic, Assistant Professor, Faculty of Technology, Policy and Management, Delft University of Technology||
Agent Based Modeling of Socio-Technical system evolution
Our human society consists of many intertwined Large Scale Socio-Technical Systems (LSSTS), such as infrastructures, industrial networks, the financial systems etc. Environmental pressures created by these systems on Earth’s carrying capacity are leading to exhaustion of natural resources, loss of habitats and biodiversity, and are causing a resource and climate crisis. To avoid this sustainability crisis, we urgently need to transform our production and consumption patterns. Given that we, as inhabitants of this planet, are part of a complex and integrated global system where and how should we begin this transformation? And how can we also ensure that our transformation efforts will lead to a sustainable world?
|11 June||Dr Sonia Mazzi, Department of Mathematics, University of York||
The role of Statistics in the "Forward-Inverse Problem" setup
I will illustrate with a study conducted to infer the age of fish from
|28 May||Dr Kiran Fernandes and Management Complex Systems Group||Management Complex Systems Research Show and Tell (Treehouse, Berrick Saul)|
|14 May||Professor Royston Goodacre, University of Manchester||
Investigating abiotic stresses on microbiological systems using metabolomics
Human pharmaceuticals are readily detected in waste water treatment plants, rivers and estuaries. Whilst levels are not yet high enough to cause immediate harm to aquatic life, it is widely acknowledged that there is insufficient information available to determine whether exposure to low levels of these substances over long periods of time is having an impact on the microbial ecology of these environments. In order to investigate the effect on the metabolic potential of the microbial community we have been adopting a metabolomics approach using various analytical platforms including vibrational spectroscopic approaches for generating spatial metabolic fingerprints, gas chromatography-mass spectrometry (GC-MS) for metabolic profiling and direct infusion mass spectrometry (DIMS) for lipid profiling. Analysis of environmentally relevant microbes and algae will be presented. We shall show that Propranolol had significant effects on the lipid components of Pseudomonas putida cells, and in particular large changes in phospholipid head groups in order to maintain correct membrane fluidity (so-called homeoviscous adaptation). In order to investigate this further, Fourier transform infrared (FT-IR) microspectroscopy was used to generate detailed metabolic fingerprinting maps from the alga Micrasterias hardyi. These chemical maps revealed dramatic effects on the distribution of various chemical species throughout the algae. This illustrates the additional power of spatial metabolic fingerprinting for investigating abiotic stresses on complex biological species.
|23 April||YCCSA 2015||A review of activities that we envisage running out of YCCSA over the next few years|
|9 April||Dr Elva Robinson, YCCSA, University of York||
Distributed Decisions: New Insights from Radio-tagged Ants
Ant colonies have been used as model systems for the study of self-organisation, and viewing ants as identical agents following simple rules has led to many insights into the emergence of complex behaviours. However, real biological ants are far from identical in behaviour. New advances in radio-frequency identification (RFID) technology now allow the exploration of ant behaviour at the individual level, providing unprecedented insights into distributed decision-making. I have addressed two areas of decision-making with this new technology: (1) collective decision-making during colony emigration; (2) task decisions in a changing environment. The first of these, collective decision-making during colony emigration, uses RFID microtransponder tags to identify the ants involved in collecting information about the environment, and to determine how their actions lead to the final colony-level decision. My results demonstrate that ants use a very simple threshold rule to make their individual decisions, and from these individual decisions emerges a sophisticated choice mechanism at the colony level. The second area of distributed decision-making which has benefitted from the use of RFID is ant colony task-allocation, and in particular, how tasks are robustly distributed between members of a colony in the face of changing environmental conditions. The use of RFID tags on worker ants allows simultaneous monitoring of a range of factors which could affect decision-making, including age, experience, spatial location, social interactions and fat reserves. My results demonstrate that individual ants base some task decisions on their own physiological state, but also utilise social cues. For non-specialist tasks, self-organisation also contributes, as movement patterns can cause emergent task allocation.
The combination of these simple mechanisms provides the colony as a whole with a responsive work-force, appropriately allocated across tasks, but flexible in response to changing environmental conditions.
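The threshold idea running through this abstract can be sketched with a generic textbook-style response-threshold model (this is not Dr Robinson's analysis, and the function name and numbers are illustrative): each ant applies an all-or-nothing rule against its own fixed threshold, and variation among thresholds yields a graded response at the colony level.

```python
import random

def colony_response(stimulus, n_ants=1000, lo=0.2, hi=0.8, seed=0):
    """Fraction of ants performing a task: each ant acts if and only if
    the shared stimulus level exceeds its own fixed threshold, drawn once
    per ant from a uniform distribution (individual variation)."""
    rng = random.Random(seed)
    thresholds = [rng.uniform(lo, hi) for _ in range(n_ants)]
    return sum(stimulus > th for th in thresholds) / n_ants

# All-or-nothing individual rules, graded colony-level response:
for s in (0.2, 0.5, 0.8):
    print(f"stimulus={s}: responding fraction={colony_response(s):.2f}")
```

The colony-level dose-response curve is smooth even though no individual ant behaves gradually, which is one way simple individual rules can add up to a responsive, flexibly allocated work-force.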
|12 March||Dr Sara Kalvala, Computational Biology, University of Warwick||
Myxobacteria as a model of complex behaviour
Much research amongst biologists concentrates on a few well-known model organisms, and amongst these the family of Myxobacteria provides an interesting platform for studying how relatively simple entities can communicate with each other to generate complex spatio-temporal behaviour. In this talk I hope to introduce you to some of the issues of interest to biologists, and to the strengths and limitations of computational models in improving understanding not only of Myxobacteria but also of multi-cellular coordination in general.
|12 February||Professor Tom Stoneham, Department of Philosophy, University of York||
Philosophical Debates about Time
In this talk I will explore some traditional philosophical debates about time, specifically presentism (only the present exists), determinism (the future is fixed by the present) and the open future (there is more than one possible future), and try to work out what is at stake in these issues and how we might provide solutions. My overall view is that time is rather less problematic than philosophers have made it out to be, but not because we can now turn to physics for answers to these questions.
|29 January||Dr Antony Masinton, Department of Archaeology, University of York||
Archaeology, Architecture and Game Engines: Ways of integrating diverse datasets in real time
The video game industry is a multi-billion pound business. Essential to this industry is the use of increasingly sophisticated systems of 3D visualisation and animation (game engines) which include realistic physics simulation, artificial intelligence and data-driven content to create virtual worlds. However, game engines are not just for games. The potential benefits of 3D game engines for archaeologists, architects and those who study architectural space are numerous. Not only can spaces which may no longer exist, or exist in altered form from their original, be experienced spatially in real time, but how such spaces guide experience and relationships within them may be simulated and analysed. Game engines can take datasets beyond simple visualisation to become an important research tool in their own right, testing theories and hypotheses of the social use of space, the effects of colour, light and sound, and even how buildings collapse. This talk will cover several archaeological and architectural uses of game engines in both modern practice and in the study of the past.
|15 January||Professor Alan Winfield, Electronic Engineering, University of the West of England||
Mimesis and Emergence in Robotic Collectives
Most research in robot imitation has been concerned with the problem of robots (primarily humanoid robots) learning skills from human demonstrators. Surprisingly little work has focussed on robot-robot imitation. However, mimesis in robot collectives would introduce social learning. Combining this with evolutionary learning would raise the intriguing possibility of studying gene-meme co-evolution in robots. This talk introduces imitation in robotics with reference to the current EPSRC project 'The Emergence of Artificial Culture in Robot Societies', with some initial results and a discussion of how we might track memes in robot collectives.
|27 November||Professor Carole Knibbe, Universite C. Bernard, Lyon||(Digital) Evolution in action: Robustness and evolvability in digital genomes and gene networks|
|13 November||Professor Karl Claxton, Department of Economics, University of York||Uncertainty and the value of evidence in health care decisions|
|6 November||Casper Macindoe||ArtScience: A beautiful cancer|
|16 October||David Lauder, European Project Office, University of York||European Funding and other useful funding streams|
|17 September||Professor Mustafa Khammash, Director of CDCC, University of California, Santa Barbara||
|4 September||TRANSIT Project Students:|
|Verena Fischer||A metabolic subsumption architecture|
|Tim Lucas||Collective Motion: How do we find birds of 'a feather'|
|Robert Foster||Investigating the effects of climate change on food crops|
|Peter Armitage||The Grey Model of immunology|
|Matthew Lang||Coupled dynamical systems|
|Miroslav Batchkarov||Developing a learning tool for network analysis|
|Jenny O'Hare||Assessing the biological effects of nanotechnology|
|Francesca Day||Simulating auxin canalisation in plants|
|14 August||Professor Irun Cohen, Department of Immunology, Weizmann Institute, Israel||A view of a Biology-Computer Science alliance from immunology|
|24 July||Dr Paul Christian, School of Chemistry, University of Manchester||Silver nanoparticles: some model materials for exposure|
|17 July||Professor Susan Stepney, Department of Computer Science, University of York||Statistics with Confidence|
|10 July||Dr Janine Illian, University of St Andrews||Spatial point processes - modelling spatial dependence|
|26 June||Professor David Howard, Department of Electronics, University of York||... towards Real Virtuality: the virtual cocoon|
|12 June||Dr Andy Hunt, Department of Electronics, University of York||Sonification - an alternative way of portraying Complex Systems|
|29 May||Professor Michael Fagan, University of Hull||Multiscale computer modelling of bone|
|15 May||Emma Uprichard, Department of Sociology, University of York||A Sociological approach to Complex Systems|
|1 May||Stella Jones-Devitt and Catherine Samei, University of York St John||Anarchy doesn't work unless you think about it: exploring the purpose of Critical Thinking in Higher Education|
|23 April||Professor Angela Brew, Macquarie University, Australia||Integrating Teaching and Learning: What do we know?|
|13 March||Dr Rosalind Allen, University of Edinburgh||Simulating the flipping of the bacteriophage lambda genetic switch|
|7 February||Dr Ambrose Field, Department of Music, University of York||Advanced digital modelling meets the fifteenth century|
|13 February||Dr Leo Caves, YCCSA, University of York||TRANSIT update, The Hub and Heslington East|
|30 January||Dr David Sumpter, University of Uppsala, Sweden||Decision making by groups of fish, birds and locusts|
|12 December||Dr Jamie Wood, YCCSA, University of York||Stochastic Modelling - What to do with that pesky Master equation|
|28 November||Professor Peter Lindsay, University of Queensland||Use of modelling to study non-linear systems: 2 examples|
|2 June||Dr Elva Robinson, School of Biological Sciences, Bristol||Individual and Collective Decisions: New Insights from Radio-tagged Ants|