Applications: Use of Information for Health Service Planning and Evaluation

Commissioning or planning health care begins with an assessment of need[1].

This requires information on:

  • Size and structure of underlying population
  • Areas of met and unmet need
  • Incidence and/or prevalence of disease(s) of interest
  • Effectiveness of interventions available
  • Relative cost-effectiveness of those interventions
  • Current services, including capacity, quality, effectiveness and efficiency
  • Prioritisation processes, including political priorities such as waiting times and service targets

Typically this would be in the context of an assessment of the population's overall health, which requires identifying and reviewing data sources such as:

  • Local data, such as a local specialised survey
  • Routine local and/or national statistics
  • Ad hoc data
  • Relevant published surveys
  • Qualitative data as well as quantitative sources
  • Trends in incidence and outcome

When data are not available directly, such as when future demand needs to be predicted or when estimating the impact of service re-design, the available information may be used to formulate the assumptions of a model, and to calibrate it (see Uses of Mathematical Modelling).

Issues to consider when using information for local health service planning include:

  • The applicability of national data to the local population and conditions. If only national data are available, when applied locally the data may need to be adjusted for the composition of the population such as age, ethnicity and deprivation
  • Many data sources cannot give any indication of who has the health condition of interest but is not accessing services.
  • National or regional data may mask inequalities at smaller geographical levels such as wards
  • Data are subject to random fluctuations. Small area data may have very small numbers. Indicators of precision, such as confidence intervals, should always be used in these cases.
  • Confidentiality issues with small numbers may mean the information is suppressed to prevent disclosure when the data source is used in combination with other sources.
  • How the data have been obtained.
  • For what purpose the data were originally collected.  If different from the new purpose, there may be biases within the data that hamper a correct assessment of the new purpose.  For example a clinician records patient information to help treat that individual, not for population based analyses.  This could result in inconsistent coding. Also, there are legal restrictions on using personal data for purposes other than those for which it was collected.
  • How timely the data are.
  • How complete the data are.
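Two of the points above — adjusting national data to the local population's composition, and attaching an indicator of precision to small-number counts — can be sketched concretely. The example below applies hypothetical national age-specific rates to a hypothetical local age structure (indirect standardisation) and uses Byar's approximation for a 95% confidence interval around the observed count; all figures are illustrative, not real data.

```python
# Sketch: applying national age-specific rates to a local population
# (indirect standardisation), with a confidence interval reflecting
# random fluctuation in small counts. All numbers are illustrative.
import math

# Hypothetical national incidence rates per 1,000 person-years, by age band
national_rate_per_1000 = {"0-14": 0.2, "15-44": 1.1, "45-64": 4.5, "65+": 12.0}

# Hypothetical local population counts for the same age bands
local_population = {"0-14": 18_000, "15-44": 52_000, "45-64": 30_000, "65+": 14_000}

# Expected local cases if national rates applied to the local age structure
expected = sum(national_rate_per_1000[band] / 1000 * local_population[band]
               for band in national_rate_per_1000)

# Suppose this many cases were actually observed locally (illustrative)
observed = 210

# Standardised ratio (observed / expected), as in SMR-style comparisons
ratio = observed / expected

# Byar's approximation for a 95% confidence interval around a Poisson count
z = 1.96
lower = observed * (1 - 1 / (9 * observed) - z / (3 * math.sqrt(observed))) ** 3
upper = (observed + 1) * (1 - 1 / (9 * (observed + 1))
                          + z / (3 * math.sqrt(observed + 1))) ** 3

print(f"Expected cases: {expected:.1f}")
print(f"Observed/expected ratio: {ratio:.2f} "
      f"(95% CI {lower / expected:.2f} to {upper / expected:.2f})")
```

A ratio whose confidence interval excludes 1 suggests the local experience genuinely differs from the national rates rather than reflecting chance; with very small counts the interval will be wide, which is exactly why such indicators of precision are needed.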

The process of evaluation and monitoring should be continuous[2], and lessons learned should inform future developments.  A crucial part of evaluating health care is assessing its quality.  (In the UK, 10% of inpatient episodes lead to unintended harm, around half of which are preventable[3].)

'Quality of care is the degree to which health services for individuals and populations increase the likelihood of desired health outcomes and are consistent with current professional knowledge' (Lohr, 1990)

Services can be evaluated using the Donabedian[4] framework of 'Structure - Process - Outcome'; all three elements are required to give an overall picture of the quality of a service.

Structure: the provision of facilities and staff available.  This could include:

  • numbers of hospital beds
  • ratio of doctors, nurses and other staff to patients and to each other
  • comparisons across different geographical regions
  • ease of access and opening hours to a clinic or screening service

Process: what is done for and to a patient or a population, and how well.  This could include:

  • methods by which patients are identified as suitable to be recorded on a disease register
  • how thoroughly the diagnostic criteria have been determined
  • validity and reliability of diagnostic tests
  • time taken from diagnosis to treatment
  • frequency of patient follow up
  • how people are recruited and involved in a process, e.g. a screening or immunisation programme or substance abuse treatment

Outcome: the effect of care on the health status of patients and populations.  This could include:

  • Did the patient get better?
  • Were there complications?
  • Did the patient feel satisfied with the service?
  • Has there been a recurrence?
  • Was there a reduction in incidence in a population?
  • Has coverage improved?
  • Has life expectancy increased?

Maxwell's dimensions of quality[5] take account of the population as well as patient care.  The dimensions include:

  • Access to services (for example, taking a population based approach, do some sub-groups find services easier to access than other sub-groups?  This could be about the physical location of services as well as different attitudes on seeking care)
  • Relevance to need (for the whole community)
  • Effectiveness (on an individual patient basis)
  • Equity (could the service in any way be made more fair?)
  • Acceptability (is this a procedure which many find too uncomfortable/embarrassing/painful to undertake and so avoid treatment?)
  • Efficiency and economy

Example of where structure, process and outcome framework could be applied: death of a patient due to maladministration of an anti-cancer drug.[2]

In this case, a drug which should only ever be administered into a vein was injected into the spine of a teenage patient.  The patient died, and the investigation found 40 system failures, including:

  • The syringes for the two different routes of injection were labelled in a confusing manner
  • No senior doctor was present
  • Two junior doctors were allowed to administer these drugs without adequate training
  • No formal induction programme was in place for junior doctors starting

Evaluations can lead to service re-design, which then needs to be monitored regularly; changes might include:

  • Clearer protocols
  • Formal training
  • Induction programmes
  • Redesign of equipment
  • Re-balancing of junior and senior staff

Information sources for evaluating health services
Evaluation involves comparison of the actual outcome of an intervention with the intended outcome. While it often involves an economic component, this is by no means the whole of evaluation.

Depending on what needs to be evaluated, the information sources listed in the planning section above will be useful to varying degrees.  For example, if patients' post-operative mortality rates are being measured and compared with other areas, Hospital Episode Statistics (HES) will be required, with linkage to ONS mortality data, since HES only captures mortality within hospital stays.
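The point of the linkage is that deaths after discharge would otherwise be invisible. A minimal sketch of the idea follows; the record layout, field names (patient_id, op_date) and data values are hypothetical, not the actual HES or ONS schemas.

```python
# Sketch: linking operative episodes to death registrations so that
# deaths after discharge count towards post-operative mortality.
# All field names and records are illustrative, not real HES/ONS data.
from datetime import date

# Illustrative operative episodes (one per patient)
episodes = [
    {"patient_id": "A", "op_date": date(2007, 3, 1), "died_in_hospital": False},
    {"patient_id": "B", "op_date": date(2007, 3, 5), "died_in_hospital": True},
    {"patient_id": "C", "op_date": date(2007, 3, 9), "died_in_hospital": False},
]

# Illustrative linked death registrations (patient_id -> date of death),
# standing in for ONS mortality data
deaths = {"B": date(2007, 3, 6), "C": date(2007, 3, 20)}

def died_within(episode, days=30):
    """True if the patient died within `days` of the operation,
    whether in hospital or after discharge."""
    dod = deaths.get(episode["patient_id"])
    return dod is not None and (dod - episode["op_date"]).days <= days

deaths_30d = sum(died_within(e) for e in episodes)
rate = deaths_30d / len(episodes)
print(f"30-day post-operative mortality: {deaths_30d}/{len(episodes)} = {rate:.0%}")
```

Here patient C dies eleven days after the operation but outside hospital, so an in-hospital count alone would report one death rather than two; the linkage step is what makes the comparison between areas fair.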

Patient satisfaction may be determined by means of a survey which captures qualitative information.

If access to a service is under-represented for one ethnic group compared with what would be expected for that population, then ethnicity needs to be captured consistently as part of an appropriate process (e.g. when it is asked for, how it is asked for, which ethnic group categories are used).  However, there are legal restrictions on collecting these data: hospitals may only collect them for admitted patients, not for outpatients or patients awaiting services.

Nationally, any health care system must have in place robust mechanisms for assuring the quality and safety of services provided to patients that enable comparisons across health care institutions and geographical regions.  This requires consistent methods of capturing and coding data.  Within the NHS, a framework for ensuring health care quality includes:

  • Clear national standards (NICE clinical guidelines, National Service Frameworks supported by the National Clinical Audit Programme)
  • Local Clinical governance mechanisms for implementing quality assurance programmes as well as patient safety and quality improvements
  • Inspection and audit programmes (Healthcare Commission)
  • National audits such as the confidential enquiries into Perioperative Deaths, Maternal deaths, Suicide and Homicide by People with Mental Illness

In addition to clinical audits and inspections, on-going monitoring of public health issues can be undertaken using a variety of websites and databases.

NHS providers routinely send data to commissioners through the Connecting for Health Secondary Uses Service (formerly via the NHS-wide clearing service).

Independent providers of services such as terminations of pregnancy should supply datasets detailing their activities to support their invoices, and these data are used to monitor progress against plans. These data include clinical, personal, and financial data, but patients are anonymised before transmission.

The incidence of communicable disease is monitored continuously by the Health Protection Agency, which is informed through the notifiable diseases process.

The London Health Observatory produces quarterly public health performance management reports for each PCT in London.  These monitor a range of indicators, such as uptake of screening services, immunisation uptake and participation in smoking cessation services, drawing on data from the Department of Health, Health Protection Agency and Compendium websites, as well as from the UNIFY database via NHSnet.  UNIFY stores data on local delivery plan targets, and all PCTs need to upload data to it on a regular basis. [accessed 30/11/2007]

The National Drug Treatment Agency collates and analyses data on a monthly basis that local drug action teams use to monitor their services against target.



[1] Pencheon D, Guest C, Melzer D, Gray JAM. Oxford Handbook of Public Health. Oxford University Press, 2003.
[2] Donaldson LJ, Donaldson RJ. Essential Public Health, 2nd Edition (Revised). Petroc Press, 2003.
[3] Vincent C, Neale G, Woloshynowych M. Adverse events in British hospitals: preliminary retrospective record review. BMJ 2001; 322: 517-19.
[4] Donabedian A. The quality of care: how can it be assessed? Journal of the American Medical Association 1988; 260: 1743-8.
[5] Maxwell R. Quality assessment in health. British Medical Journal 1984; 288.

© M Goodyear & N Malhotra 2007