TY - JOUR
AU - T. Ling
AU - N. Fahy
AU - J. Dawney
AB - There is global interest in integrated care, often associated with how to improve system efficiency, strengthen clinical and cost-effectiveness, avoid gaps in patient care, and improve patient experiences and outcomes, through improved coordination across services. Despite considerable activity in both delivering and evaluating integrated care, evaluations have not greatly helped to understand how to 'do' it better. Evaluations of integrated care have often arrived at similar conclusions, frequently including the generic finding that results are patchy and context dependent. In this article, we explore and discuss these challenges to evaluation, how these challenges are understood in recent key publications, and suggest an alternative perspective. We explore technical inadequacies of evaluations (concerning definitions, metrics, and timing) as well as deeper problems (such as integrated care being dynamic and relational, and operating across multiple, larger systems). In re-framing how to evaluate integrated care, we propose an approach that involves a recursive evaluation architecture. This draws on systems thinking. This approach also recognises that we can better understand evaluations of integrated care as co-producing knowledge and applying this to learning and adaptation.
AD - RAND Europe, Eastbrook, Shaftesbury Road, Cambridge CB2 8BF, United Kingdom. Electronic address: tling@randeurope.org.; RAND Europe, Eastbrook, Shaftesbury Road, Cambridge CB2 8BF, United Kingdom.
AN - 40897593
BT - Health Policy
C5 - General Literature
DA - Aug 20
DO - 10.1016/j.healthpol.2025.105418
DP - NLM
ET - 20250820
JF - Health Policy
LA - eng
N2 - There is global interest in integrated care, often associated with how to improve system efficiency, strengthen clinical and cost-effectiveness, avoid gaps in patient care, and improve patient experiences and outcomes, through improved coordination across services. Despite considerable activity in both delivering and evaluating integrated care, evaluations have not greatly helped to understand how to 'do' it better. Evaluations of integrated care have often arrived at similar conclusions, frequently including the generic finding that results are patchy and context dependent. In this article, we explore and discuss these challenges to evaluation, how these challenges are understood in recent key publications, and suggest an alternative perspective. We explore technical inadequacies of evaluations (concerning definitions, metrics, and timing) as well as deeper problems (such as integrated care being dynamic and relational, and operating across multiple, larger systems). In re-framing how to evaluate integrated care, we propose an approach that involves a recursive evaluation architecture. This draws on systems thinking. This approach also recognises that we can better understand evaluations of integrated care as co-producing knowledge and applying this to learning and adaptation.
PY - 2025
SN - 0168-8510
SP - 105418
ST - Reframing the evaluation of integrated care; examples from the NHS in England
T1 - Reframing the evaluation of integrated care; examples from the NHS in England
T2 - Health Policy
TI - Reframing the evaluation of integrated care; examples from the NHS in England
U1 - General Literature
U3 - 10.1016/j.healthpol.2025.105418
VO - 0168-8510
Y1 - 2025
ER -