Lessons from the English National Health Service new care models programme

Introduction:
Between 2015 and 2018, the National Health Service (NHS) in England invested over £300m in a New Care Models (NCM) programme to develop new approaches to delivering care to whole populations. Five new care models were prototyped by fifty "vanguards".
Two of these models, covering 23 vanguards, focused on integration and involved partnerships between general practice, community providers and major hospitals. The ambition was that these models, once developed and tested, would spread nationwide. The programme was supported by the NHS's largest ever investment in evaluation.

Methods:
The central evaluation team within the NHS developed an ambitious evaluation strategy to generate intelligence for many stakeholders, including national leaders and policymakers, vanguards themselves and other areas of the country.
A decentralised, multi-faceted approach departed from the traditional model of commissioning a single central evaluation. Each vanguard designed, commissioned and carried out its own local evaluation, generating a rich set of findings.
Complementing local evaluations was a range of national evaluation work including: statistical analysis of impact; provision of quarterly impact evaluation dashboards to vanguards; thematic studies of common interventions; and synthesis of the data and evidence being generated by the vanguard evaluations for a national evaluation report and various learning resources.
Results:
Common interventions included risk stratification, multidisciplinary teams, social prescribing and increasing the range of professionals working in primary care. Although new forms of contract between the vanguards and commissioners were envisaged, these proved challenging to establish. Successful vanguards went beyond individual interventions, focusing also on achieving system-wide change.
A key aim of the new care models was to reduce the rate of growth in emergency admissions. Over the three years of the programme, growth in the 23 vanguards was around 6% lower than in the rest of England.
National data collections provided little useful data on patient outcomes and quality of care. Local evaluations provided evidence in these areas, particularly about the positive effects of the vanguards on patient and staff experience.
The evaluation also identified common challenges to integration, including issues around workforce and information sharing.

Discussion and conclusion:
The slowing of the growth in emergency admissions that the vanguards achieved is likely to be due, at least in part, to changes made by the vanguards to their care models. System-wide changes (e.g. cultural) may also play a part, as the scale of the impact cannot be fully explained by the interventions themselves. The additional investment in the vanguards ended in March 2018. It is not yet clear whether the changes and results will be sustained.

Lessons learned and limitations:
The study produced valuable learning about evaluating integration programmes: balancing locally owned and commissioned research with the need to maintain quality and comparability; the importance of common descriptions of interventions and of ways of measuring their implementation and impacts; and the risks of focussing on a single primary outcome measure and the need for better patient-focused integration metrics.

Suggestions for future research:
The findings from the evaluation are shaping the implementation of integration across England.