
Past CHE Economic Evaluation Seminars 2008

13 November 2008

Title: Group consensus probability distributions: the textbook problem and e-participation
Speaker: Professor Simon French, Manchester Business School, University of Manchester

Abstract: In a paper written some twenty years ago, I distinguished three contexts in which one might wish to combine expert judgements of uncertainty: the expert problem, the group decision problem and the textbook problem.  Over the intervening years much has been written on the first two, both of which focus on a single decision context, but little on the third.  With many developments in internet technology, particularly in relation to interactivity and communication, the textbook problem is gaining in importance, since data and expert judgements can be made available over the web to be used by many different individuals to shape their own beliefs in many different contexts.  Moreover, applications such as web-based decision support, e-participation and e-democracy are making algorithmic ‘solutions’ to the group decision problem attractive, but we know that such solutions are, at best, rare and, at worst, illusory.

In this paper I briefly survey developments since my earlier paper, then turn to how expert judgement might be used within web-based group decision support, as well as in e-participation and e-democracy contexts.  The latter points to the growing importance of the textbook problem and suggests that Cooke’s principles for scientific reporting of expert judgement studies may need enhancing if such studies are to be used by a wider audience.

16 October 2008

Title: Accounting for structural uncertainty in decision models using model averaging
Speaker: Christopher Jackson, Research Statistician, MRC Biostatistics Unit, Institute of Public Health, University of Cambridge

Abstract: In health economic decision models, uncertainty about the choice of model structure is usually described informally, by presenting results under a series of alternative scenarios.  This work describes how model uncertainty can often be accounted for formally using model averaging.  Many structural choices can be assessed against available data, using statistical methods which express model adequacy as a trade-off between model fit and complexity.  These assessments provide weights which can be used to calculate model-averaged distributions of cost and effectiveness.

These methods are applied to the EVAR decision model, which compared two surgical techniques for repairing abdominal aortic aneurysms, and the ICD decision model, which compared implantable cardioverter defibrillators with anti-arrhythmic drugs for the prevention of cardiac arrhythmia.  The structural uncertainties considered are whether the incidence of certain events should depend on treatment or particular patient characteristics, and the choice of model for the dependence of mortality on age.  Both classical and fully Bayesian methods of model averaging are described.

Some other issues in structural uncertainty will also be discussed.  For example, expert judgements might be elicited about uncertainties which are not testable against hard data, such as how to extrapolate short-term evidence to a lifetime, and what sources of evidence to employ in the model.  Another open question is how to assess the choice of states or events to include in the typical Markov model for accumulation of cost and benefit.
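The abstract does not specify the weighting scheme, but one common instantiation of the fit-versus-complexity trade-off it describes is Akaike (AIC) weights. The sketch below is a minimal illustration with made-up AIC scores and incremental-QALY estimates, not the EVAR or ICD analyses themselves:

```python
import math

def akaike_weights(aics):
    """Turn AIC scores (fit penalised by complexity) into normalised model weights."""
    best = min(aics)
    raw = [math.exp(-(a - best) / 2.0) for a in aics]
    total = sum(raw)
    return [r / total for r in raw]

def model_average(estimates, weights):
    """Weighted average of per-model estimates (e.g. incremental QALYs)."""
    return sum(w * e for w, e in zip(weights, estimates))

# Hypothetical AICs and incremental-QALY estimates for three structural variants
aics = [210.4, 212.1, 215.0]
inc_qalys = [0.32, 0.29, 0.41]
w = akaike_weights(aics)
averaged = model_average(inc_qalys, w)
```

Applied to full simulated distributions rather than point estimates, the same weights yield the model-averaged distributions of cost and effectiveness the abstract describes.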

18 September 2008

Title: The social value of a QALY (SVQ) project.  Experiences of a UK (feasibility) study
Speaker: Rachel Baker, Lecturer in Health Economics, University of Newcastle

Abstract: In publicly funded health care systems, decisions must be made about how to allocate scarce resources.  In England and Wales the National Institute for Health and Clinical Excellence (NICE) issues guidance about the provision of treatments and services.  One of the central pieces of information used by NICE in its deliberations is the incremental cost per quality adjusted life year (QALY).  Some judgement is necessary, however, as to the value of additional QALYs, so that decisions can be made about the provision of health technologies.  Currently a cost per QALY range of £20,000 to £30,000 is applied [1], but there is no empirical basis for this value, and the use of threshold cost per QALY values has recently been questioned by the House of Commons Health Committee [2].

We conducted a feasibility study to investigate methods for eliciting willingness to pay for a QALY from members of the general public.  A survey was designed, piloted and administered to 400 respondents in face-to-face interviews.  I will present the results of this feasibility study and discuss both encouraging and discouraging methodological findings.  There is some variability in values using different methods of aggregation, which will also be discussed.  I will finish by reflecting on the use of such methods in future research.

  1. Rawlins M, Culyer A. National Institute for Clinical Excellence and its value judgements. BMJ 2004;329:224-227.
  2. House of Commons Health Committee. National Institute for Health and Clinical Excellence. First report of session 2007-08. Volume I.

28 August 2008

Title: The willingness to pay for reducing pain and disability
Speaker: Dr. Andy Chuck, Health Economist and Manager of the Decision Analytic Modeling Unit at the Institute of Health Economics in Alberta, Canada and Adjunct Assistant Professor, Department of Anesthesiology and Pain Medicine, University of Alberta, Canada.

Abstract: Objectives: We sought to identify chronic pain patients' preferences for levels of improvement in pain-related morbidity (PRM) by measuring their willingness to pay (WTP) for reducing their pain intensity and pain-related disability.

Methods: The study was a cross-sectional, non-randomized design. Participants were recruited from a tertiary multidisciplinary pain centre in Canada. A computer-administered discrete choice experiment was used to explore participants' WTP for various levels of improvement to PRM. Participants chose between two varying combinations of treatments which differed in terms of their level of improvement in pain intensity, level of improvement in pain-related disability and out-of-pocket monthly cost.

Results: The WTP to eliminate PRM entirely was $1,401 per month. Reduction in pain intensity was valued more highly than functional improvement. For every dollar an individual was WTP to improve his/her disability to the lowest severity (mild), s/he was WTP approximately $2 to reduce pain intensity to moderate and $3 to reduce pain intensity to mild. The potential return on investment in terms of health improvement gained was $336 per patient visit per year.

Conclusion: The morbidity associated with chronic pain is worth approximately $1,401 for every month in the chronic pain health state. From the patient's perspective, treatment and management strategies that focus on reducing pain intensity would have the greatest impact on improving health related quality of life. Valuing health improvement in monetary terms allows for direct monetary comparisons between the costs of chronic pain interventions and their associated health returns.

14 August 2008

Title: Comparison of mathematical and behavioural elicitation approaches: Application using a decision model for topical negative pressure (TNP) therapy for pressure ulcers
Speaker: Laura Bojke, Team for Economic Evaluation and Health Technology Assessment, Centre for Health Economics, and Marta Soares, York Trials Unit, Department of Health Sciences, University of York

Abstract: Purpose: There is little guidance on the conduct of elicitation to inform decision models. We explored two approaches: implicit synthesis using behavioural aggregation and explicit synthesis using mathematical elicitation. In addition, the performance of experts and elicitation approaches was assessed using calibration methods.

Methods: The elicitation exercise was designed to generate initial estimates for a model investigating topical negative pressure (TNP) therapy for pressure ulcers. Distributions were elicited for six parameters from seven experts using the histogram approach, with histograms synthesised using linear pooling. Four parameters, concerning the effectiveness of chronic wound treatments, had robust known values and were used to calibrate experts in the mathematical approach. The remaining two parameters, regarding the impact of TNP therapy, were unknown. For the behavioural method the same seven experts repeated the elicitation exercise, but using the nominal group approach. At the end of the elicitation, experts were asked for their views on the two approaches.

Results: There was some variation in elicited values between individuals and between elicitation approaches. The behavioural method produced irrational responses to two known parameters.

Values for the two unknown parameters were higher in the behavioural approach compared with the mathematical approach. Behavioural values suggest that almost 17% more patients will not have healed 6 months after treatment with TNP. In this study calibration appeared to have little impact on estimates of the unknown parameters.

There were mixed messages from experts regarding the ease of the two approaches.

Conclusions: There was no clearly appropriate approach in this small study. Given the ambiguity of experts' views regarding the two approaches, the choice must be governed by the extent to which each approach is fit for purpose, at least from a theoretical perspective.
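As a minimal sketch of the linear pooling mentioned in the Methods, the following combines hypothetical elicited histograms by weighted averaging; equal expert weights are assumed here, though calibration-based weights could be substituted:

```python
def linear_pool(histograms, weights=None):
    """Combine experts' elicited histograms over the same bins by weighted averaging."""
    n = len(histograms)
    if weights is None:
        weights = [1.0 / n] * n  # equal weights; calibration scores could replace these
    return [sum(w * h[i] for w, h in zip(weights, histograms))
            for i in range(len(histograms[0]))]

# Two hypothetical experts' beliefs about a healing probability, in four bins
expert_a = [0.1, 0.4, 0.4, 0.1]
expert_b = [0.0, 0.2, 0.5, 0.3]
pooled = linear_pool([expert_a, expert_b])
```

The pooled histogram remains a valid probability distribution because the weights sum to one.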

12 June 2008

Title: Germans & QALYs - misalliance or match made in heaven? The German discussion on cost-utility analysis
Speaker: Dr Wolf Rogowski, Research Associate, Institute of Health Economics and Health Care Management, University of Munich

Abstract: The "Act to promote competition of the statutory health insurance" recently introduced formal health economic evaluation for reimbursement decisions in Germany. The economic evaluations are conducted by the Institute for Quality and Efficiency in Health Care (IQWiG), an institution with similarities to NICE. Yet even if it seems straightforward to learn from the English experience and adopt a similar methodology, German health care decision makers seem reluctant to undertake cost per QALY modelling: past decisions have, to a large extent, been based on benefit in terms of condition-specific measures of health, and, rather than models, published clinical trials were used to establish medical benefit. Opponents of cost-utility analysis claim that cost per QALY measures conflict with German social legislation. The IQWiG therefore proposes a different approach to economic evaluation, based on condition-specific efficiency frontiers. This study summarizes the legal background and practice of reimbursement decision making in Germany, provides a critical summary of the IQWiG approach, and discusses options and constraints of using NICE-type decision-analytic models for economic evaluation in Germany.

19 May 2008

Title: The role of the joint distribution of potential outcomes in cost-effectiveness analysis
Speaker: Anirban Basu, PhD, Assistant Professor, Section of General Internal Medicine, Department of Medicine, University of Chicago

Abstract: Background: The Clinical Antipsychotic Trials of Intervention Effectiveness (CATIE) is a high-profile NIH-funded comparative effectiveness study whose results have been widely interpreted as demonstrating that older-generation antipsychotic drugs are the cost-effective first-line treatment in schizophrenia. We raise several concerns about the design of CATIE and its associated cost-effectiveness analysis that question this interpretation. We then perform an expected value of future research calculation which suggests that additional research comparing the effectiveness of antipsychotics would be of great value, but would require enormous sample sizes in order to answer these questions precisely.

Policy implications of results: One of our concerns, in light of these results, is whether we were asking the right comparative question in the first place. We discuss whether the intention-to-treat (ITT) principle is relevant for cost-effectiveness analysis of antipsychotic drugs, where biological heterogeneity is quite dominant. Consequently, we argue for moving away from the ITT principle and focusing instead on evaluating treatment algorithms and finding the optimal sequencing of these drugs.

Implications for methodological work: However, there remains a fundamental limitation in CEA methods for addressing such questions of sequencing. Traditional CEA based on modeling studies or observational studies, and also clinical trial outcomes analysis in health and medicine, often overlooks or ignores the true underlying covariances between the potential outcomes, which are necessary to set up optimal dynamic treatment algorithms. For example, in a randomized experiment, randomization ensures that the sample means and variances for outcomes in alternative treatment arms provide consistent estimates of the population means and variances. However, most traditional two-arm randomized trials provide no information about the population-level correlation in potential outcomes between alternative treatments. That is, if both treatment arms produce, say, a 25% response rate, then the treatments are considered equivalent without any attempt to establish whether the effect of each treatment occurs in the same 25% of the population. One must look to specific types of information, such as results from cross-over clinical trials and observational studies with potential proxies, to extract information on these correlations.

15 May 2008

Title: Modelling the cost-effectiveness of first, second, and third generation polychemotherapy regimens in women with early breast cancer who have differing prognoses
Speaker: Helen Campbell, Senior Researcher (MRC Special Training Fellow), Health Economics Research Centre, Division of Public Health and Primary Health Care, University of Oxford and David Epstein, Susan Griffin and Mark Sculpher, Team for Economic Evaluation and Health Technology Assessment, University of York.

Abstract: Background: Amongst women diagnosed with early breast cancer, the risk of recurrence following primary surgery can vary substantially.  Adjuvant chemotherapy works to prevent recurrence by destroying undetectable residual disease in the body.  First, second, and now third generation chemotherapy regimens have been developed for this purpose.  Randomised controlled trials have evaluated first generation chemotherapy relative to no chemotherapy, second generation chemotherapy relative to first generation chemotherapy, and third generation chemotherapy relative to second generation chemotherapy.  In this study we use data from three randomised trials of first, second, and third generation chemotherapy regimens to populate a Markov model and facilitate indirect comparisons of the costs and effects of three different types of chemotherapy plus a no treatment option.  By using a prognostic model to generate transition probabilities within the Markov model, it has been possible to perform these analyses for women with differing baseline prognoses.

Methods: A Markov model was constructed based upon a review of published health economic models in the field of early breast cancer.  The model includes health states for: disease free, local recurrence, metastatic recurrence, contralateral recurrence and death.  Transition probabilities are estimated using parametric survival models incorporating tumour characteristics such as number of positive lymph nodes, tumour grade, and tumour size.  The effects of each chemotherapy regimen on preventing recurrence are taken from three randomised controlled trials and are assumed to be additive on the log scale to facilitate previously untested comparisons.  Costs and utility decrements associated with chemotherapy treatment, its toxicity, and different types of recurrent disease are informed by a combination of trial data and the published literature.  Joint uncertainty around the model input parameters is examined using probabilistic sensitivity analysis.
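The log-scale additivity assumption means hazard ratios multiply along the chain of trials, allowing comparisons that were never tested head to head. A minimal sketch with hypothetical hazard ratios (not the trial estimates used in the model):

```python
import math

# Hypothetical hazard ratios for recurrence from three head-to-head comparisons
hr_gen1_vs_none = 0.80  # first generation vs no chemotherapy
hr_gen2_vs_gen1 = 0.85  # second vs first generation
hr_gen3_vs_gen2 = 0.90  # third vs second generation

def indirect_hr(*hazard_ratios):
    """Chain hazard ratios, assuming effects are additive on the log scale."""
    return math.exp(sum(math.log(hr) for hr in hazard_ratios))

# Third generation vs no chemotherapy, a comparison no trial has made directly
hr_gen3_vs_none = indirect_hr(hr_gen1_vs_none, hr_gen2_vs_gen1, hr_gen3_vs_gen2)
```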

Results: The net benefits of the chemotherapy options vary considerably between women with different risk factors.  For some groups, first or second generation chemotherapy is cost-effective, whereas for others it may actually result in a net health loss compared with no chemotherapy, as the small survival benefits afforded by chemotherapy are offset by the reduction in quality of life brought about by the treatment's toxicity.  Third generation chemotherapy is unlikely to be cost-effective in any patient group.  Various ways of presenting the results of comparisons between different generations of chemotherapy regimen for women with different combinations of risk factors and differing levels of baseline risk are explored.

Conclusions: Evaluating the cost-effectiveness of first, second, and third generation chemotherapy regimens in women with early breast cancer who have differing prognoses is valuable.  Thought does, however, need to be given to how best to present cost-effectiveness results when there are many possible combinations of risk factors that confer differing levels of baseline risk between different groups of women.

9 April 2008

Title: A value of information approach to optimal drug development decisions: the case of Erlotinib in lung cancer
Speaker: Emanuela Castelnuovo, Centre for Health Economics and Department of Economics and Related Studies, University of York

Abstract: A recent review of health research funding in the UK proposes to reform biomedical research prioritisation and aspects of licensing and regulation for new pharmaceutical drugs. This reform aims to foster innovative pharmaceutical products and to ensure that basic research is translated into health and economic benefits.  Methods are needed to ensure that research prioritisation reflects societal and commercial incentives to develop new drugs.

We propose a decision-analytic framework to assess the economic value of a new drug at the early stages of development, before regulatory trials are undertaken, and to characterise optimal go/no-go conditions based on the commercial and societal value of a drug under uncertainty, conditional on stylised drug licensing and reimbursement regulation. We illustrate the practical application of this method to value a development plan using the example of erlotinib. This chemotherapy agent was licensed in the US and Europe for second-line therapy of non-small cell lung cancer; however, it was recently denied adoption in the UK on grounds of poor cost-effectiveness compared with the currently preferred drug for this indication, docetaxel.

We explore a method to predict effectiveness and cost-effectiveness of erlotinib compared to docetaxel using historical data for the two drugs based on Phase II trials and Phase III trials for docetaxel, available at the time of development of erlotinib.  We then compare predictions with the actual regulatory history of erlotinib, and use these predictions to calculate the value of development and explore optimal development decisions for this drug.

13 March 2008

Title: Economic evaluation with a large decision space
Speaker: Dr Pelham Barton, Senior Lecturer, Department of Health Economics, University of Birmingham

Abstract: Methods for economic evaluation in health care are well developed for a choice between two clearly defined decision options. These methods can be extended to three or more options, but become increasingly unwieldy as the number of options increases. This talk considers two contexts in which this may occur: the first involving a single decision variable which is (in principle) continuous-valued, and the second involving a complex model for sequential treatments.

The first example uses a simple model involving once-in-a-lifetime screening for a hypothetical cancer. The decision to be made is the optimal age for such screening. Here we consider what happens to the cost-effectiveness acceptability frontier and the expected value of perfect information (EVPI) as the difference between allowable options is reduced. In the limit, the frontier collapses to zero whenever screening at some age is indicated, but the EVPI approaches a meaningful value.
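As a sketch of the EVPI calculation behind this example, the Monte Carlo version below uses hypothetical net-benefit simulations for two decision options; the gap between choosing the best option in each simulation and the option that is best on average is the per-person EVPI:

```python
import random

def evpi(net_benefit_samples):
    """Expected value of perfect information from simulated net benefits.
    net_benefit_samples: one list per simulation, one net benefit per option."""
    n = len(net_benefit_samples)
    # With perfect information we would pick the best option in each simulation...
    perfect = sum(max(sim) for sim in net_benefit_samples) / n
    # ...with current information we pick the option that is best on average.
    n_options = len(net_benefit_samples[0])
    means = [sum(sim[d] for sim in net_benefit_samples) / n for d in range(n_options)]
    return perfect - max(means)

random.seed(1)
# Hypothetical: two candidate screening ages, each with uncertain net benefit
sims = [[random.gauss(100, 30), random.gauss(95, 30)] for _ in range(5000)]
per_person_evpi = evpi(sims)
```

EVPI is never negative, and unlike the acceptability frontier it stays meaningful as the grid of allowable ages is refined, which is the limiting behaviour the talk describes.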

The second example uses the Birmingham Rheumatoid Arthritis Model, which was developed to inform the NICE Technology Appraisals process. This model considers lifetime management of rheumatoid arthritis through sequential use of various treatments. Considering only monotherapy, with 11 treatments there are over 100 million possible strategies to compare. This part of the talk will show how combinatorial optimisation heuristics from the field of Operational Research can be used to find good strategies in a way that is computationally feasible.
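As an illustration of the kind of heuristic involved (the talk does not specify which heuristics are used), a simple first-improvement local search over orderings might look like the sketch below, with a toy, made-up objective standing in for the model's net-benefit evaluation:

```python
import random

def local_search(treatments, objective, iterations=200, seed=0):
    """Greedy first-improvement search over treatment orderings; with 11 drugs,
    enumerating all sequences is infeasible, so heuristics explore instead."""
    rng = random.Random(seed)
    current = list(treatments)
    rng.shuffle(current)
    best_score = objective(current)
    for _ in range(iterations):
        i, j = rng.sample(range(len(current)), 2)  # propose swapping two positions
        candidate = current[:]
        candidate[i], candidate[j] = candidate[j], candidate[i]
        score = objective(candidate)
        if score > best_score:  # keep the swap only if it improves the sequence
            current, best_score = candidate, score
    return current, best_score

# Toy objective: hypothetical per-treatment payoffs, discounted by position in line
payoffs = {"A": 5.0, "B": 3.0, "C": 4.0, "D": 1.0}
def sequence_value(seq):
    return sum(payoffs[t] * (0.9 ** k) for k, t in enumerate(seq))

best_seq, best_score = local_search(list(payoffs), sequence_value)
```

With 11 treatments the real model's strategy space is vastly larger, but the same move-evaluate-accept loop applies; more sophisticated heuristics mainly differ in how moves are proposed and when worse moves are tolerated.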

20 February 2008

Title: Evaluating the use of payments to implement best practice in primary care
Speaker: Simon Walker, Research Fellow, Team for Economic Evaluation and Health Technology Assessment, Centre for Health Economics, University of York

Abstract: The use of financial incentives for physicians to increase the utilisation of 'appropriate' medical services in general practice raises questions about cost-effectiveness. These are essential to address in a budget-constrained health system where there are numerous ways in which limited resources can be used. These questions relate, firstly, to whether the services incentivised are themselves cost-effective; and secondly, to whether it is cost-effective to use financial incentives to increase the utilisation of these services.

The purpose of our research is to show how, through the value of implementation, the cost-effectiveness of payments aimed at improving the implementation rate of best practice in primary care can be assessed. Using our framework, based on individual patients' health outcomes and costs and the rate of implementation with and without payment, we examine the conditions that are required for the financial incentives to be considered cost-effective.
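Under one reading of this framework, an incentive is worthwhile when the extra uptake it buys delivers more monetised net health benefit than the payments cost. A minimal sketch with hypothetical numbers:

```python
def incentive_net_value(uptake_with, uptake_without, population,
                        net_benefit_per_patient, incentive_cost):
    """Population net value of paying for implementation: extra patients reached
    times the (monetised) net health benefit each gains, minus the payments."""
    extra_patients = (uptake_with - uptake_without) * population
    return extra_patients * net_benefit_per_patient - incentive_cost

# Hypothetical figures: uptake rises from 60% to 80% across 10,000 patients,
# each gaining £150 of net benefit, at an incentive cost of £250,000
net_value = incentive_net_value(0.80, 0.60, 10_000, 150.0, 250_000.0)
```

A positive result means the payments are cost-effective at the chosen threshold; the break-even condition follows directly by setting the expression to zero.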

This framework has then been used to assess the cost-effectiveness of the UK Quality and Outcomes Framework (QOF). Introduced in 2004 as part of the new GMS contract, the QOF seeks to secure higher quality primary care.  It operates through a system of financial incentives for General Practitioners (GPs).  To receive payments, GPs must achieve targets relating to a set of approximately 150 performance indicators that cover a range of clinical and managerial domains.

17 January 2008

Title: Using propensity score methods to analyse individual patient‑level cost‑effectiveness data from observational studies
Speaker: Dr Andrea Manca, Senior Research Fellow, Centre for Health Economics, University of York

Abstract: The methodology relating to the statistical analysis of individual patient-level cost-effectiveness data has evolved dramatically in the last ten years. This body of techniques has been developed and applied mainly in the context of the randomised clinical trial design. There are, however, many situations in which a trial is neither the most suitable nor the most efficient vehicle for such evaluations, and recourse to observational data may be the only alternative.  This paper discusses the ways in which propensity score methods could be used to assist in the analysis of observational individual patient-level cost-effectiveness data, with the objective of controlling for bias in the estimation of average cost-effectiveness. As a motivating example, we assessed the cost-effectiveness of CABG versus PTCA, one year post procedure, in a cohort of individuals who received the intervention within 365 days of their index admission for AMI. The data used for this paper were obtained from the Ontario Myocardial Infarction Database (OMID), linked with data from the Canadian Institute for Health Information (CIHI), the Ontario Health Insurance Plan (OHIP), the Ontario Drug Benefit (ODB) program, and the Ontario Registered Persons Database (RPDB).
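As a sketch of one propensity score method that could serve this purpose, inverse-probability-of-treatment weighting reweights observed costs so that treated and untreated groups resemble the full cohort. The propensity scores and costs below are hypothetical; in practice the scores would be estimated from patient covariates, e.g. by logistic regression:

```python
def ipw_means(records):
    """Inverse-probability-of-treatment weighted mean costs.
    records: (treated, propensity, cost) tuples, where propensity = P(treated |
    covariates) is taken as given here rather than estimated."""
    treated = [(c, 1.0 / p) for t, p, c in records if t == 1]
    control = [(c, 1.0 / (1.0 - p)) for t, p, c in records if t == 0]
    mean_t = sum(c * w for c, w in treated) / sum(w for _, w in treated)
    mean_c = sum(c * w for c, w in control) / sum(w for _, w in control)
    return mean_t, mean_c

# Hypothetical records: (received CABG?, propensity score, one-year cost)
data = [(1, 0.8, 12000.0), (1, 0.6, 11000.0), (1, 0.4, 10000.0),
        (0, 0.8, 9000.0), (0, 0.4, 7000.0), (0, 0.3, 6500.0)]
cabg_mean, ptca_mean = ipw_means(data)
incremental_cost = cabg_mean - ptca_mean
```

The same weighting applies to effectiveness outcomes, yielding bias-adjusted incremental cost-effectiveness estimates; matching or stratification on the propensity score are alternative routes to the same end.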

Who to contact

For more information on these seminars, contact: