
Research & theory

Indicators and Measurement Tools for Health Systems Integration: A Knowledge Synthesis

Authors:

Esther Suter, Faculty of Social Work, University of Calgary; and Workforce Research and Evaluation, Alberta Health Services (AHS), CA

Nelly D. Oelke, School of Nursing, University of British Columbia, Okanagan (UBCO), CA

Maria Alice Dias da Silva Lima, School of Nursing, Universidade Federal do Rio Grande do Sul (UFRGS), BR

Michelle Stiphout, Workforce Research and Evaluation, Alberta Health Services (AHS), CA

Robert Janke, University of British Columbia, Okanagan (UBCO), CA

Regina Rigatto Witt, School of Nursing, Universidade Federal do Rio Grande do Sul (UFRGS), BR

Cheryl Van Vliet-Brown, School of Nursing, University of British Columbia, Okanagan (UBCO), CA

Kaela Schill, School of Nursing, University of British Columbia, Okanagan (UBCO), CA

Mahnoush Rostami, Workforce Research and Evaluation, Alberta Health Services (AHS), CA

Shelanne Hepp, Workforce Research and Evaluation, Alberta Health Services (AHS), CA

Arden Birney, Workforce Research and Evaluation, Alberta Health Services (AHS), CA

Fatima Al-Roubaiai, British Columbia Patient Safety and Quality Council (BCPSQC), CA

Giselda Quintana Marques, School of Nursing, Universidade Federal do Rio Grande do Sul (UFRGS), BR

Abstract

Background: Despite far-reaching support for integrated care, conceptualizing and measuring it remains challenging. This knowledge synthesis aimed to identify indicator domains and tools to measure progress towards integrated care.

Methods: We used an established framework and a Delphi survey with integration experts to identify relevant measurement domains. For each domain, we searched and reviewed the literature for relevant tools. 

Findings: From 7,133 abstracts, we retrieved 114 unique tools. We found many quality tools to measure care coordination, patient engagement and team effectiveness/performance. In contrast, there were few tools in the domains of performance measurement and information systems, alignment of organizational goals and resource allocation. The search yielded 12 tools that measure overall integration or three or more indicator domains. 

Discussion: Our findings highlight a continued gap in tools to measure foundational components that support integrated care. In the absence of such targeted tools, “overall integration” tools may be useful for a broad assessment of the overall state of a system. 

Conclusions: Continued progress towards integrated care depends on our ability to evaluate the success of strategies across different levels and contexts. This study has identified 114 tools that measure integrated care across 16 domains, supporting efforts towards a unified measurement framework.

How to Cite: Suter E, Oelke ND, Dias da Silva Lima MA, Stiphout M, Janke R, Witt RR, et al. Indicators and Measurement Tools for Health Systems Integration: A Knowledge Synthesis. International Journal of Integrated Care. 2017; 17(6): 4. DOI: http://doi.org/10.5334/ijic.3931
Submitted on 11 Apr 2017; Accepted on 24 Oct 2017; Published on 13 Nov 2017

Background

Integrated care is considered a powerful cure for all that ails health systems in most developed economies: poor performance at increasing cost, fragmentation of services, and a lack of human resources to care for the aging population [1, 2, 3]. Kodner and Spreeuwenberg [4] define ‘Integrated Care’ as “a coherent set of methods and models on the funding, administrative, organizational, service delivery and clinical levels designed to create connectivity, alignment and collaboration within and between the cure and care sectors” (p. 3). In adopting this broad definition for our review, we take integrated care to refer either to the system as a whole or to individual components within the broader health system, and we use integrated care synonymously with health systems integration.

Despite far-reaching support for integrated care and evidence of promising outcomes [5, 6, 7, 8, 9], achieving integrated health systems remains challenging. This has been attributed to the ongoing conceptual ambiguity of integrated care and of what successful integration looks like in different contexts [2, 3, 10, 11, 12]. Continued progress towards integrated care will depend largely on our ability to contrast and compare the impact of strategies across different levels and contexts. However, the complex interplay of structures, processes and outcomes of integrated care is difficult to disentangle, hampering evaluation of progress [13, 14]. Beyond conceptual ambiguity, measuring integrated care is challenging because of a lack of tools to measure different aspects of integration and the inherent difficulty of tracking down existing tools within a dispersed body of literature [15, 16]. Being able to measure and evaluate the success of integration strategies in a consistent way is essential to effectively advance the design and implementation of an integrated health system [10].

The aim of this knowledge synthesis was to identify meaningful and relevant integration measurement domains and to search for and select appropriate instruments to measure these domains. The specific research questions were: 1) what are appropriate indicator domains for each of the 10 key integration principles identified in our previous work [17]? and 2) what measurement tools exist to measure these indicator domains?

Our review will contribute to the growing body of literature concerned with measuring progress towards fully integrated systems [2, 3, 10, 18, 19, 20] and will offer a useful resource to health system planners and decision-makers.

Theoretical Foundation for our Review

In earlier research, our team synthesized definitions and models for integrated care to encourage consolidation efforts [17]. We found more than 70 definitions and, not surprisingly, no ultimate integration model. We identified, however, ten key principles that cover multiple domains that collectively support integrated care. The key principles are: 1) comprehensive services across the continuum of care, 2) patient focus, 3) geographic coverage and rostering, 4) standardized care delivery through interprofessional teams, 5) performance management, 6) information technology, 7) organizational culture and leadership, 8) physician integration, 9) governance structure, and 10) financial management [17]. Using these key principles, integrated care is conceptualized as ten distinct areas that need to be addressed to successfully create connectivity, alignment and collaboration within and across care sectors. Others have uncovered similar constructs, confirming the importance of a range of structural and process elements at different levels to achieve integrated care and collectively advancing the field towards a unified conceptual framework [2, 12, 21, 22].

Our ten key principles have proven useful for decision-makers and service planners for designing integrated care models [23]. However, they are not always easy to measure due to their broad and abstract nature. To advance our previous work, the current systematic review aimed to identify domains and measurement instruments for each of the ten principles. We understand indicator domains to be measurable concepts that capture specific aspects of a key principle. For example, patient engagement would be a measurable indicator domain for the principle of patient focus. We defined measurement instruments as any measurement devices (questionnaires, rating scales, checklists, observation forms) that can be completed by researchers, administrators or participants to measure structures, processes or outcomes associated with an indicator domain such as patient engagement.

By using our key principles as a starting point, we offer a cohesive approach to measuring and evaluating a health system’s state of integration that is grounded in solid research.

Methods

The knowledge synthesis followed the methods outlined by Levac, Colquhoun & O’Brien [24] and consisted of three components: 1) a Delphi process to identify the most relevant indicator domains from the health provider, decision-maker, and researcher perspectives; 2) focus groups with patients to elicit their perspectives on the most relevant integration principles; and 3) a systematic review of tools for each identified indicator domain. In this paper, we report on the Delphi process and the review of measurement tools.

To enhance the global applicability of the work, we developed a partnership with researchers, decision-makers and policy makers in a large urban centre in southern Brazil (Rio Grande do Sul) and in Canada (Alberta and British Columbia). Both countries have publicly funded health systems, comparable funding priorities and a similar geography of large urban centres and rural communities. Furthermore, health systems integration is a priority in both countries. Brazilian research team members were actively involved in the study from the development of the proposal through data collection, analysis, and interpretation of the data. Guided by integrated knowledge translation principles [25], we engaged knowledge-users (decision-makers and policy-makers) from each jurisdiction throughout the process. The ethics boards of the three participating jurisdictions approved the research protocol.

Delphi survey

A modified Delphi method [26] was used to reach consensus on the most relevant integration indicator domains, using the 10 key principles identified in our previous work [17] as a starting point. Drawing on the literature, research team members generated a preliminary list of indicator domains for each of the 10 principles. From this list, a Delphi survey was developed in English and translated into Portuguese. We invited 39 integration experts, policy and decision-makers, and health care providers from Canada, Brazil, Europe and the United States to rate the fit and importance of each domain and to rank its priority. Potential participants were identified by research team members, through the literature, and through research databases of health researchers. Our research coordinator completed an extensive scan of potential panel experts through Google searches prior to finalizing the list of participants. The initial survey contained 21 indicator domains across the ten key principles. Participants ranked the appropriateness and relevance of each indicator on a scale from 1–5 (1 being most relevant/appropriate). During the first round, participants suggested additional domains, which were included in the second round. The goal was to achieve 75% agreement for inclusion (ratings of 1 and 2) or exclusion (ratings of 4 and 5) of indicator domains through several survey rounds.
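
To make the consensus rule concrete, here is a minimal sketch of how it could be operationalized, assuming each panellist submits one 1–5 rating per indicator domain per round; the function names and example data are illustrative, not taken from the study:

```python
from typing import Dict, List

# Consensus thresholds from the survey design: 75% of ratings of 1 or 2
# to include a domain, 75% of ratings of 4 or 5 to exclude it.
CONSENSUS = 0.75

def classify_domain(ratings: List[int]) -> str:
    """Classify one indicator domain from a round of 1-5 ratings."""
    n = len(ratings)
    include_share = sum(1 for r in ratings if r <= 2) / n
    exclude_share = sum(1 for r in ratings if r >= 4) / n
    if include_share >= CONSENSUS:
        return "include"
    if exclude_share >= CONSENSUS:
        return "exclude"
    return "no consensus"  # carried forward to the next round

def next_round(survey: Dict[str, List[int]]) -> Dict[str, str]:
    """Apply the rule to every domain rated in one Delphi round."""
    return {domain: classify_domain(r) for domain, r in survey.items()}

# Example: 17 panellists rating two hypothetical domains.
round_1 = {
    "patient engagement": [1, 1, 2, 1, 2, 1, 1, 2, 2, 1, 1, 2, 1, 3, 1, 2, 1],
    "rostering coverage": [2, 3, 4, 1, 5, 3, 2, 4, 3, 2, 5, 3, 4, 2, 3, 1, 4],
}
print(next_round(round_1))
# {'patient engagement': 'include', 'rostering coverage': 'no consensus'}
```

Domains classified as "no consensus" would be re-rated in the next round, mirroring the multi-round design described above.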

Searching, selecting and appraising relevant studies

We conducted independent, iterative searches for each indicator domain resulting from the Delphi process within the following broad disciplines: Health Sciences, Education and Management/Business, using the core bibliographic databases from these fields (Medline including the Cochrane database of systematic reviews, EMBASE, PsycINFO, CINAHL, ABI Inform, and Business Source Premier). Our research librarian assisted us in identifying search terms specific to each of the indicator domains. We conducted two additional searches for health systems integration and instrument/tool development. The domain-specific searches were then combined with the health systems and tool development searches to retrieve relevant articles on measurement tools. We completed an advanced Google search to find tools in the grey literature. Results were filtered by date and language, and the first 50 documents returned were screened. Research assistants also searched websites of relevant government agencies and research organizations (e.g., Institute for Healthcare Improvement), reference lists of included studies, and citations identified through forward citation searching using Web of Science for relevant tools. The librarian in Brazil used the LILACS database and included abstracts in English and Portuguese.
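
One plausible way to picture this combination step, assuming it behaved as a boolean AND across concept searches (the record IDs and result sets below are invented for illustration; the real combination was performed inside the bibliographic database interfaces):

```python
# Stand-ins for the record IDs returned by three separate concept searches.
domain_search = {"rec01", "rec02", "rec03", "rec07", "rec09"}  # e.g., care coordination terms
integration_search = {"rec02", "rec03", "rec05", "rec09"}      # health systems integration terms
tool_search = {"rec03", "rec04", "rec09"}                      # instrument/tool development terms

# Combining the searches narrows a domain's results to records that also
# concern health systems integration and measurement tool development.
candidates = domain_search & integration_search & tool_search
print(sorted(candidates))  # ['rec03', 'rec09']
```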

The research team developed, tested and refined inclusion and exclusion criteria for selecting studies. The key inclusion criteria were: 1) articles must include some kind of instrument to measure structures, processes or outcomes associated with one or more integration domains identified through the Delphi process; 2) instruments must be relevant to the health care context; 3) articles in English or Portuguese; and 4) publication between 1995 and 2014. In discussing the inclusion and exclusion criteria, we decided to exclude articles that focused on administrative data. Administrative data can be influenced by various components within and outside of health systems that may not necessarily be related to integration, and were thus beyond the scope of this knowledge synthesis. Instruments that measured integration aspects outside of the identified measurement domains, and instruments primarily focusing on clinical outcomes of integrated care (e.g., patient health outcomes), were also excluded. All research team members involved in rating abstracts participated in training sessions in which each individual rated the same 50 abstracts. Results were discussed during meetings, and criteria were clarified and refined as needed. Further rounds were conducted until the desired level of consistency was achieved. We then assigned two researchers to each indicator domain to read and rate abstracts; disagreements were resolved by a third researcher. We developed and tested a template to guide extraction of relevant information and adapted tools [27] to rate the relevancy and quality of articles. The data extraction table was organized around domains and focused, where possible, on the original article that described the instrument development. As a result, development articles older than 1995 were still included. Team members conducted audits for each indicator domain at the relevancy stage and extraction stage to ensure consistency.
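
The desired level of consistency is not quantified here, nor is the agreement statistic named; purely as an illustration, here is a sketch of simple percent agreement between two raters screening the same batch of abstracts (Cohen's kappa, which corrects for chance agreement, would be a stricter alternative):

```python
from typing import List

def percent_agreement(rater_a: List[bool], rater_b: List[bool]) -> float:
    """Share of abstracts where two raters made the same include/exclude call."""
    assert len(rater_a) == len(rater_b)
    matches = sum(a == b for a, b in zip(rater_a, rater_b))
    return matches / len(rater_a)

# Example: two raters screening the same ten abstracts
# (True = include for full-text review).
a = [True, False, True, True, False, False, True, False, True, True]
b = [True, False, False, True, False, False, True, True, True, True]
print(f"{percent_agreement(a, b):.0%}")  # 80%
```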

Two research team members in Brazil followed the same procedures to complete abstract screening and article selection for English and Portuguese abstracts identified through the LILACS database. Audits were also conducted as outlined above. Their findings were then integrated into the synthesis.

Figure 1 shows details of the number of abstracts screened, considered relevant/excluded, and included for full review.

Figure 1 

PRISMA flow chart.

Results

Delphi Results

Seventeen individuals participated in three rounds of Delphi surveys to identify priority indicator domains for measurement. In the first round, consensus was reached on 15 indicator domains (i.e., ≥ 75% of participants ranked them as either 1 or 2 for relevance and appropriateness). Participants suggested 36 additional domains. These were themed and merged where appropriate to produce 38 indicator domains for round 2. In round 2, the panel reached consensus on 29 indicator domains; the panel agreed that 16 of these indicator domains were relevant/appropriate; 13 were irrelevant/inappropriate; and no consensus was reached on nine indicator domains. These nine indicator domains were submitted to a third round. After three rounds, the panel reached consensus on 37 indicator domains; 16 were considered relevant and used for the systematic literature search for measurement instruments. Twenty-one indicator domains were considered irrelevant and removed. No consensus was reached for indicator domains for Principle 9 focusing on governance.

Systematic Review Results

The systematic review for the 16 indicator domains yielded a total of 7,133 abstracts. From those, we retrieved 114 unique tools that we considered relevant for measuring the state of integrated care.

Table 1 shows the review results for each of the 16 domains. Some domains were reviewed together given their common characteristics and search terms. Two tools were applicable to two domains. We added a domain, “Overall Integration”, to capture tools that reflected three or more domains.

Table 1

Number of abstracts screened and tools identified by domain.

| Domain | Total # abstracts screened | Total # full-text articles | Total # of tools² |
| --- | --- | --- | --- |
| Principle 1: Coordinated transitions in care across the continuum of care¹ (transferring care from one area to another) | 298 | 195 | 17 |
| Principle 1: Client care is coordinated between sectors and providers within the health system and with supporting services such as education and social services | 610 | 97 | 14 |
| Principle 2: Patient and/or family involvement in care planning for all patients | 569 | 128 | 34 |
| Principle 3: Primary care network structures in place (e.g., family health teams, primary care networks, GP Divisions, inner city PHCs) | 118 | 23 | 8 |
| Principle 4: Team effectiveness | 198 | 83 | 12 |
| Principle 4: Use of shared clinical pathways across the continuum of health care (e.g., diabetes, asthma care) and geography¹ / Individualization of care pathways for patients with co-morbidities | 957 | 229 | 7 |
| Principle 5: Performance measurement domains and tools in place¹ / Clinical outcomes being measured | 1657 | 99 | 2 |
| Principle 5: Data tracked and shared | 410 | 47 | 0 |
| Principle 6: Data (e.g., administrative, performance, clinical) tracked and shared with stakeholders¹ / Shared patient electronic charts across continuum of care accessible to patients | 315 | 107 | 1 |
| Principle 6: Data collected is used for service planning | 554 | 68 | 1 |
| Principle 7: Organizational goals and objectives aligned across sectors | 483 | 50 | 1 |
| Principle 8: Physician integration within care teams and across sectors | 560 | 53 | 6 |
| Principle 10: Attainment of goals and objectives are supported by funding and human resource allocation | 404 | 39 | 1 |
| Overall integration: tools that measure several constructs of integration | 0 | 87 | 12 |
| Total | 7133 | 1305 | 116 |

¹ Overlap in domains; screened together.

² The total (116) exceeds the number of unique tools (114) because two tools were appropriate for two domains.

Summary of tools

Appendix 1 provides details on the instruments for each indicator domain, including concepts measured, setting and sample tested, and psychometric properties where available. The majority of the instruments (94) were questionnaires; other types included checklists, toolkits, observational tools, and indicators. Ninety-two of the instruments were based on self-report; the other 22 were either completed by an external party or drew on data collected from multiple sources (e.g., both the patient and provider). Fifty-six of the instruments were completed by providers, 42 by patients, 10 by administrators, and six by either administrators and providers or patients and providers.

All but eight of the tools came from the peer-reviewed literature, and 110 of the instruments were developed in a healthcare setting. The remaining four instruments were developed for virtual teams or community settings, or did not specify a setting. A large number of tools were developed and tested with a specific population (e.g., mental health, pediatrics) but could potentially be adapted for use in the general population. Some of the tools have extensive psychometric properties while others require further testing. The appendix also lists the number of citations for each instrument to give a sense of a tool’s use.

Below, we provide more detail on the tools found under each domain.

Principle 1: Comprehensive Services across the Care Continuum

Coordinated transitions across the continuum of care

Coordinated care transitions was one of two indicator domains identified under principle one, comprehensive services across the care continuum. This domain aims to capture the adequacy and continuity of transitional care within and between acute care, primary care, and different community care services and settings. We found 17 instruments that measure continuity of transitional care across the care continuum. Most tools (n = 14) were developed for community/primary care settings [28, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41], one was developed in acute care [42], and two in both primary and acute care [43, 44]. The Care Transitions Measure (CTM, [43]) has a number of modifications [45, 46].

Many of the tools focus on processes and measure a range of aspects such as timeliness of information transfer, provider continuity, provider-patient interaction and transition planning or the quality of care transition more generally as experienced by the patient. A few tools measure structural components such as transition policies or existence of care plans.

Client care is coordinated between sectors and providers within the health system and with supporting services such as education and social services

The second indicator domain under principle one measures the coordination of client services across different sectors, e.g., health and social services coordination. The search yielded 14 instruments that measure intersectoral coordination along a continuum from loose linkages to close collaboration. Most instruments are questionnaires and were created or tested in a health care setting or with health-related outcomes. Intersectoral coordination is captured by variables such as: connections between partnering organizations [47, 48, 49, 50]; social networks [51, 52]; interagency linkages [53, 54, 55, 56]; depth of integration [57, 58]; and level of system integration and change [59]. Morrissey et al. 1994 [51] developed two instruments appropriate for this domain. Collectively these tools offer a meaningful way to assess the quality and strength of connections between service areas that cross the health and social care sector.

Principle 2: Patient Focus

Patient and/or family involvement in care planning for all patients

This indicator domain focuses on placing the patient and/or family at the centre of care and involving them in decision-making. It was the only domain under principle two, patient focus; however, it covered a broad topic area. Of all 16 indicator domains, patient and family involvement yielded the largest number of instruments.

We found 34 instruments that were all created or tested in a health care setting. The majority, 25 of the instruments, are completed by patients and/or families [31, 60, 61, 62, 63, 64, 65, 66, 67, 68, 69, 70, 71, 72, 73, 74, 75, 76, 77, 78, 79, 80, 81, 82, 83, 84]; the rest are completed by physicians or other health care professionals [85, 86, 87, 88, 89], or by both patients and physicians [90, 91, 92]. The 30-item Kim Alliance Scale (KAS) [72] was revised (KAS-R) to create a shorter, 16-item questionnaire with the same scales [93].

The instruments measure a range of structure, process and outcome areas, mainly from the patient/family perspective, such as: 1) patient experiences with care, such as administrative processes or customer service aspects; 2) patient satisfaction with various aspects of care, such as the doctor-patient consultation; 3) quality of care, often in relation to patient education and respect received; 4) family involvement in care as expressed, for example, by information received; 5) shared decision-making/involvement in decision-making as a way to participate in the care process; 6) satisfaction with the decision made; 7) communication, including style and preferences; and 8) level of empowerment and empathy. Most instruments contain items in several of these areas, allowing for a comprehensive assessment of the patient and family perspective.

Principle 3: Geographic Coverage and Rostering

Primary care network structures in place

Primary care network structures in place was the only indicator domain identified under principle three, geographic coverage and rostering. This domain recognizes that health systems integration cannot be achieved without well-developed primary care structures (such as integrated service delivery networks). We found eight questionnaires that measure general structural components [94, 95, 96, 97] or specific areas of primary care, such as the medical home [98, 99, 100], palliative care [101], and child services [102]. The Medical Home Index (MHI [100]) also has a short version [103, 104].

We highlight the Instrumento de Avaliação da Coordenação das RAS pela APS (COPAS) [97] because it is one of only two unique instruments we found through the search of the Brazilian database. Originally developed in Portuguese, the COPAS is a 78-item questionnaire to assess the coordination of integrated health service delivery networks in primary health care [97]. The COPAS has five dimensions: 1) population, 2) primary health care, 3) support systems, 4) logistic systems, and 5) management systems. The instrument has also been translated into English, as the Tool for Assessment of the Coordination of Integrated Health Service Delivery Networks by the Primary Health Care, and has been validated in a primary health care context [105].

Principle 4: Standardized Care Delivery through Interprofessional Teams

Team effectiveness

Team effectiveness was one of three indicator domains under principle four, standardized care delivery through interprofessional teams. Team effectiveness, including team performance, represents the effectiveness of interprofessional teams involved in integrated health systems. High-performing teams have been a prominent topic in organizational development for many years, and there is no shortage of instruments to measure various aspects of team performance. Recognizing that one type of health provider is rarely able to manage all aspects of care for complex patients, health care has increasingly embraced team-based care [12].

From the broader team literature, we identified 12 instruments we considered relevant to the integration context. Nine instruments were from the health care sector [106, 107, 108, 109, 110, 111, 112, 113, 114]; the two virtual team questionnaires were from sectors such as technology and agriculture [115, 116]; one instrument came from the grey literature [117]. Most of the 12 instruments were questionnaires; one instrument used observational methods to measure teamwork in a surgical setting [113]. Five instruments were part of larger and more in-depth questionnaires [106, 108, 110, 111, 115].

These instruments were specifically designed to assess interprofessional teams in health care or teams working in a virtual context. We included instruments that measure the effectiveness of virtual teams because this seemed relevant to integrated care, where services are often dispersed. The instruments measure team effectiveness as team members’ perception of their performance, overall team productivity, efficiency, and the ability of team members to complete their work assignments. Some instruments instead measure factors that contribute to team effectiveness, such as team cohesion, individual well-being and use of resources; others measure both.

Use of shared clinical pathways across the continuum of health care and geography; and Individualization of care pathways for patients with co-morbidities

These two indicator domains were analyzed together as they are similar concepts that could not be distinguished at the screening stage. They focus on whether and how shared clinical pathways are used across the continuum of healthcare and geography, and on the individualization of care pathways for patients with co-morbidities.

We found five relevant instruments for the shared clinical pathways domain [118, 119, 120, 121, 122] and three for the domain that measures individualization of care pathways [123, 124, 125]; none of them specifically included geography as a component. Four instruments are completed by healthcare management or physicians [118, 119, 121, 122] while two evaluate clinical pathways from the patient perspective [120, 123].

These instruments can assist with creating integrated care pathways [125] and evaluating the quality of care pathways and their impact on patient experience [119, 124, 126]. Shared care pathways are an effective mechanism to create consistency and continuity of care across team members [127]. These instruments will likely gain importance where care pathways cross organizations and care sectors.

Principle 5: Performance Management

Performance measurement indicators and tools in place and Clinical outcomes being measured

These two indicator domains could not be separated in the literature and were therefore analyzed together. Both aim to capture whether relevant structures and processes are in place for ongoing quality monitoring. Some authors have identified key aspects of successful performance measurement systems, including clear definitions and parameters for indicators, and appropriate feedback loops and mechanisms for reporting [128, 129, 130]. However, we found only two actual instruments. The Medical Home Index (MHI) [100] includes a number of themes that speak to data management and quality improvement structures. The second instrument, the Índice de Responsividade do Serviço (IRS) (Health Services Responsiveness Index – SRI), is available in Portuguese [131]. The 160-item questionnaire measures health system responsiveness to users’ expectations in two areas: 1) patient orientation, including components that influence patient satisfaction but are not directly connected with health care (agility, social support, facilities and choice); and 2) personal respect, including dignity, confidentiality, and autonomy.

Data tracked and shared with stakeholders

The third indicator domain under principle five, performance management, aims to measure whether data is being tracked and shared with stakeholders (e.g., clinicians, staff, policy makers, decision-makers) within health systems. We found no instruments that specifically measure this domain.

Principle 6: Information Systems

Shared information systems across sectors, Shared patient electronic charts across continuum of care accessible to patients and Data collected is used for service planning

These were the three indicator domains identified under principle six, information systems. The first two indicator domains capture whether information systems exist that are shared across the health system as well as with other sectors such as social services and justice, and whether such systems are accessible to patients. The only instrument we found for these domains was by Chou et al. 2010 [132]. It consists of structured and open-ended survey questions to evaluate an “internet-based wellness portal” in primary care. The portal provided patients electronic access to their personal health records and to resources such as educational content, secure messaging, appointment management, and prescription refills.

The third indicator domain under the information systems principle measures whether the data collected is used for service planning. The search yielded one instrument, a semi-structured telephone interview guide [133]. Questions focus on the types of data used most often, for what purpose and to what capacity, and why data is not being used [133].

Principle 7: Organizational Culture and Leadership

Organizational goals and objectives aligned across sectors

The only indicator domain identified under the principle of organizational culture and leadership aimed to assess whether organizational goals and objectives are aligned not only across the health care system but across sectors such as social services and education. We found one instrument for this indicator domain. The Organizational Culture Assessment Instrument (OCAI) is based on the Competing Values Framework, the dominant theoretical model for assessing organizational culture [134]. The OCAI consists of six items (dominant characteristics, organizational leadership, management of employees, organizational glue, strategic emphasis, and criteria of success), each with four alternatives that reflect four culture types (hierarchy culture, market culture, clan culture, and adhocracy culture). For each of the six items, 100 points are divided between the four culture alternatives; the scores are used to create an organizational culture profile and to determine cultural alignment, including leadership styles, across sectors.
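
As an illustration of this scoring arithmetic, here is a minimal sketch that assumes the common OCAI convention of averaging each culture type’s points across the six items; the point allocations below are invented:

```python
# Each row: one OCAI item; columns: points out of 100 allocated to the
# clan, adhocracy, market and hierarchy alternatives, in that order.
ITEMS = ["dominant characteristics", "organizational leadership",
         "management of employees", "organizational glue",
         "strategic emphasis", "criteria of success"]
CULTURES = ["clan", "adhocracy", "market", "hierarchy"]

responses = [
    [40, 20, 25, 15],
    [35, 25, 20, 20],
    [30, 20, 30, 20],
    [45, 15, 20, 20],
    [25, 30, 25, 20],
    [35, 20, 25, 20],
]

# Sanity check: every item's allocation must sum to 100 points.
assert all(sum(row) == 100 for row in responses)

# The culture profile is the mean allocation per culture type.
profile = {c: sum(row[i] for row in responses) / len(responses)
           for i, c in enumerate(CULTURES)}
print(profile)
# ≈ {'clan': 35.0, 'adhocracy': 21.7, 'market': 24.2, 'hierarchy': 19.2}
```

Comparing profiles computed for different organizations or sectors would then indicate how closely their cultures, and by extension their goals and leadership styles, align.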

Principle 8: Physician Integration

Physician integration within care teams and across sectors

Physician integration into the broader system, a prominent topic in the late nineties [17], continues to be an important integration issue [135]. For the physician integration indicator domain, the only indicator domain for principle eight, we specifically focused on instruments that measure integration between physicians and the health system and integration of physicians within a health care team. Instruments measuring collaboration among team members more generally were included in the team effectiveness indicator domain. Instruments measuring integration with patient and families were included in the patient focus indicator domain.

We found six relevant instruments [111 (2 instruments), 136, 137, 138, 139]. Newer instruments primarily measure physician integration in the context of provider collaboration (e.g., pharmacists, nurses) rather than physician integration into the broader health system. Given the strong role of physicians in primary care, it makes sense to strengthen and evaluate collaboration between physicians and other health providers for the continuous improvement of quality, safety, and the patient-provider experience [135]. Physician integration is essential to improving care delivery and service planning in the rapidly changing healthcare landscape [17, 135]. Some authors have highlighted the integrative function of primary care, arguing that primary care should be “…the starting point from where to improve and integrate care” [12, p.3].

Principle 10: Financial Management

Attainment of goals and objectives are supported by funding and human resource allocation

A single indicator domain was identified for the financial management principle. This indicator aims to capture whether there is alignment between organizational goals and objectives and how resources are used. The search yielded one instrument.

The questionnaire by Bradford et al. 2000 [140] measures how resources are allocated and how effective the allocation processes are. Resource allocation best-practice questions cover priority-setting methods such as needs assessments, grant-making such as targeted requests for funding, service monitoring, and outcome assessment.

Overall Integration Instruments

The final indicator domain, overall integration, includes instruments that measure health systems integration more generally or that measure three or more of the 16 indicator domains identified.

We found 12 instruments; ten questionnaires target patients, practitioners, managers/leaders, and staff [14, 141, 142, 143, 144, 145, 146, 147, 148, 149], and two instruments [1, 150] use a set of indicators to measure the degree of implementation of integration components. These tools capture some of the 16 indicator domains for which we were unable to find specific instruments. For example, the questionnaire by Gillies et al. 1993 [145] is well established and validated and measures perceived system integration across a number of dimensions, such as alignment of support services, organizational culture, strategic planning, quality assurance, information systems, financial management and resource allocation, and physician integration. This instrument was further developed into the integration scorecard by the same team [143].

Similarly, the Clinical Microsystem Assessment Tool [146] has 10 scales that align with many of the 10 key principles of health systems integration [17], such as culture, organizational support, patient focus, staff focus, interdependence of the care team, information and information technology, process improvement, and performance patterns. The Whole System Measures (WSM) instrument is noteworthy because it not only includes 13 indicators across multiple domains but also recommends measurement methods for each of them. It was developed by the Institute for Healthcare Improvement to promote the use of a “balanced set of system-level measures… to evaluate health systems overall performance” [150, p. 1].

Discussion

The aim of this knowledge synthesis was to identify meaningful and relevant integration indicator domains and to search for and select appropriate instruments to measure these domains. Building on our previous work [17], we used our ten key integration principles as the framework to prioritize measurement areas and select relevant tools.

Given the nature of the concepts under study, a substantial number of potential indicator domains could be generated for consideration. Delphi is a recognized technique to build consensus amongst experts [26]. In this study, panel members iteratively reviewed indicator domains. Through this process, we were successful in reaching consensus on 16 indicator domains considered highly relevant for measuring progress towards integrated care. The indicator domains span nine of the ten key principles for integration [17], confirming the enduring relevance of these principles. The group reached no agreement on indicator domains for governance. Conceptually, there was no doubt that governance was important, but participants found it difficult to identify measurable indicator domains. Overall, the Delphi process was useful as it helped to identify priority areas for inclusion of instruments in our systematic review.

The subsequent literature review revealed 114 unique instruments that measure various aspects of the 16 domains. The vast majority of the instruments were self-report questionnaires completed either by the health care provider or the patient. Although a popular choice because of their ease of implementation, self-report tools have well-recognized limitations, including response bias, recency effects and time pressures. Such reports also depend greatly on a subject’s ability to be insightful, accurate, and honest in their assessment [151]. A broader range of instruments is required to offer assessments of integrated care based on more varied data.

Over 50% of the instruments found measure care coordination across the continuum/sectors and patient and family involvement. The findings are consistent with other reviews that have uncovered a vast number of instruments related to the concepts of care coordination and patient-centred care [10, 20]. This is perhaps not surprising; these domains are the focus of many health care system reforms, as progress in these areas directly influences patient care and experience [152].

We found only 14 instruments for the nine indicator domains related to primary care network structures, performance monitoring, shared information systems, data used for service planning, and organizational alignment. Others have described these domains as functional, system, organizational or normative dimensions of integrated care and have attested to their significance for successful integration [12, 22]. The lack of measurement tools in these domains is consistent with findings from the literature. For example, in the comprehensive review by Bautista et al. [10], fewer than 8% of tools touched on these dimensions. Similarly, the findings from Lyngsø et al. 2014 [2] suggest that these remain poorly measured aspects of integrated care, pointing to an important evidence gap.

Lastly, we uncovered 12 instruments that measure multiple indicator domains. Some of these instruments simply measure three or more domains while others aim to capture overall integration of a system. Lyngsø et al. [2] have reflected on the benefits of instruments that measure several interdependent dimensions of a complex concept. These “overall integration” instruments are particularly useful for a broad assessment of the overall state of a system. In contrast, domain specific instruments allow measurement of targeted areas that may be the focus of integration strategies.

We assessed the quality of the articles included in the review but did not attempt to systematically evaluate the quality of the instruments. Based on a recent review [10] and our cursory analysis, we expect the quality to range widely. In particular, many of the instruments have not been tested for psychometric properties to support instrument validation. Future research should focus on developing, testing, and validating measures for all domains of health system integration. Similar observations have been made in other evolving fields that lack measures with robust psychometrics [153, 154]. Poor reporting may contribute to the quality gap, prompting some researchers to demand clear guidelines for the reporting of survey research [155]. There is no question that high-quality instruments are essential and should be the preferred choice for measurement purposes. However, instruments validated in one context may not necessarily be valid when applied to a different context [2]. Interestingly, the most frequently used instruments (as reflected by the number of Google citations) were not necessarily of higher quality. This may indicate that the choice of instrument is often influenced by other considerations, such as fit with the context and the strategy to be evaluated, or the length and ease of instrument completion.

This project was a partnership between Canada and Brazil. We hoped that expanding the scope beyond Canada would make the work more relevant and more universally applicable. The search of Portuguese databases yielded two unique instruments that were a valuable addition to this inventory. This instrument compilation contributes to the growing body of research concerned with measuring progress towards integrated care (e.g., 2, 3, 10, 18, 19, 20, 22). Collectively, these studies have uncovered several hundred instruments that measure various components of integrated care. While these inventories offer easy access to available instruments, they pose the formidable challenge of how to select the most appropriate instrument from a vast collection. Continued progress towards integrated care will depend largely on our ability to contrast and compare the success of strategies across different levels and contexts. This can only be achieved through a consolidated measurement approach. We support the call for a unified measurement framework, including recommendations on indicators and measurement instruments [13]. Being able to evaluate the success of integration strategies in a consistent way will ultimately lead to better health system design and improved health outcomes for patients.

Strengths and limitations of this study

Our review has a number of strengths and limitations. We used our previously established framework of the 10 key principles in combination with a consensus approach to select the measurement domains considered most relevant by integration experts. This helped guide the search and selection of instruments. In contrast to other reviews, we included a grey literature search. While the yield was not substantive, instruments published in the grey literature were easy to use as they tended to include user manuals. Easily accessible, user-friendly resources are essential to promote measurement. On the other hand, grey literature reports can easily be missed as they are not always well indexed or posted on easily accessible websites, leading to potentially important omissions. As is typical for these kinds of knowledge syntheses, finding the right search terms was challenging and required an iterative approach of searching and refining. Despite ongoing refinement of the search strategies, the literature searches returned a vast quantity of literature to examine. Also, the search only included literature up to 2014, so we may have missed some recent instruments. Finally, we had numerous people working on different indicator domains, which created opportunities for deviations in our processes. To mitigate these risks, we put in place checks and balances, including audits, tracking of decision-making, frequent discussions, and a review of the final report.

Conclusion

This study has identified over 100 unique instruments that measure 16 different indicator domains considered relevant for integrated care. The majority of instruments were self-report questionnaires that measure care coordination, patient and family involvement, and team effectiveness. In contrast, there were few tools in the domains of performance measurement and information systems, alignment of organizational goals and resource allocation. This remains an area for future research as these domains are relevant for the success of integrated care [10, 17, 22]. The search yielded 12 tools that measure overall integration or three or more indicator domains. In the absence of more targeted measures for some domains, these overall integration instruments fill an important gap.

Overall, there is a need to develop instruments other than questionnaires to use a broader range of data for measuring integration and integration outcomes. Indicators derived from administrative databases may fill an important gap here and have been the focus of some recent studies [18, 19]. Existing instruments would benefit from further psychometric testing and validation in a range of contexts to enhance applicability of the tools.

Additional File

The additional file for this article can be found as follows:

Appendix

Details of Instruments. DOI: https://doi.org/10.5334/ijic.3931.s1

Acknowledgements

This study (KRS 138203) was funded by the Canadian Institutes for Health Research.

Reviewers

Sanneke Schepman, PhD student at NIVEL and Policy Advisor, Health Labour Market, Stichting RegioPlus, The Netherlands.

Pim P. Valentijn, President, Integrated Care Evaluation (ICE), Essenburgh, Hierden, The Netherlands.

Competing Interest

The authors have no competing interests to declare.

References

  1. Hébert, R and Veil, A. Monitoring the degree of implementation of an integrated delivery system. International Journal of Integrated Care, 2004; 4(20): 1–7. 

  2. Lyngsø, AM, Godtfredsen, NS, Høst, D and Frølich, A. Instruments to assess integrated care: A systematic review. International Journal of Integrated Care, 2014; 14: 1–15. DOI: https://doi.org/10.5334/ijic.1184 

  3. Strandberg-Larsen, M and Krasnik, A. Measurement of integrated healthcare delivery: A systematic review of methods and future research directions. International Journal of Integrated Care, 2009; 9(4): 1568–4156. DOI: https://doi.org/10.5334/ijic.305 

  4. Kodner, D and Spreeuwenberg, C. Integrated care: Meaning, logic, applications, and implications—a discussion paper. International Journal of Integrated Care, 2002; 2: e12. DOI: https://doi.org/10.5334/ijic.67 

  5. Busse, R and Stahl, J. Integrated care experiences and outcomes in Germany, the Netherlands, and England. Health Affairs (Millwood), 2014; 33(9): 1549–1558. DOI: https://doi.org/10.1377/hlthaff.2014.0419 

  6. Goodwin, N, Smith, J, Davies, A, Perry, C, Rosen, R, Dixon, A, Dixon, J and Ham, C. Integrated care for patients and populations: Improving outcomes by working together. London: The King’s Fund, Nuffield Trust; 2012. [cited 2016 January 13]. Available from: https://www.kingsfund.org.uk/publications/integrated-care-patients-and-populations-improving-outcomes-working-together. 

  7. Nolte, E and Pitchforth, E. What is the evidence on the economic impacts of integrated care? Copenhagen: Regional Office for Europe of the World Health Organization; 2014. [cited 2017 January 13]. Available from: http://www.hiirc.org.nz/page/47762/what-is-the-evidence-on-the-economic-impacts/;jsessionid=E9BEE4FC8599C17A93F6022973194116?p=262&tag=coordinatedservices&tab=827&contentType=1033&section=13414. 

  8. Martinez-Gonzalez, NA, Berchtold, P, Ullman, K, Busato, A and Egger, M. Integrated care programmes for adults with chronic conditions: a meta-review. International Journal for Quality in Health Care, 2014; 26(5): 561–570. DOI: https://doi.org/10.1093/intqhc/mzu071 

  9. Øvretveit, J. Does clinical coordination improve quality and save money? London: Health Foundation; 2011. 

  10. Bautista, MC, Nurjono, M, Wei Lim, Y, Dessers, E and Vrijhoef, HJM. Instruments measuring integrated care: A systematic review of measurement properties. Milbank Quarterly, 2016; 94(4): 862–917. DOI: https://doi.org/10.1111/1468-0009.12233 

  11. Goodwin, N. Understanding integrated care: a complex process, a fundamental principle. International Journal of Integrated Care, 2013; 13: 1–2. DOI: https://doi.org/10.5334/ijic.1144 

  12. Valentijn, PP, Schepman, SM, Opheij, W and Bruijnzeels, MA. Understanding integrated care: a comprehensive conceptual framework based on the integrative functions of primary care. International Journal of Integrated Care, 2013; 13: 1–12. DOI: https://doi.org/10.5334/ijic.886 

  13. Nuno Solinis, R and Stein, KV. Measuring integrated care – The quest for disentangling a Gordian knot. International Journal of Integrated Care, 2016; 16(3): 1–3. DOI: https://doi.org/10.5334/ijic.2525 

  14. VanDeusen Lukas, C, Meterko, M, Lowcock, S, Donaldson-Parlier, R, Blakely, M, Davies, M and Petzel, R. Monitoring the progress of system Integration. Quality Management in Health Care, 2002; 10(2): 1–11. DOI: https://doi.org/10.1097/00019514-200210020-00004 

  15. Armitage, GD, Suter, ES, Oelke, ND and Adair, CE. Health systems integration: State of the evidence. International Journal of Integrated Care, 2009; 9(17): 1–11. DOI: https://doi.org/10.5334/ijic.316 

  16. de Jong, I and Jackson, C. An evaluation approach for a new paradigm – health care integration. Journal of Evaluation in Clinical Practice, 2001; 7(1): 71–79. DOI: https://doi.org/10.1046/j.1365-2753.2001.00285.x 

  17. Suter, E, Oelke, ND, Adair, CE and Armitage, GD. Ten key principles for successful health systems integration. Healthcare Quarterly, 2009; 13: 16–23. DOI: https://doi.org/10.12927/hcq.2009.21092 

  18. National Quality Forum. Endorsed measures for care coordination: Phase 3, April 29, 2014. Draft report for comment. Available at: http://www.apic.org/Resource_/TinyMceFileManager/Advocacy-PDFs/care_coordination_draft_report.pdf. 

  19. Raleigh, V, Bardsley, M, Smith, P, Wistow, G, Wittenberg, R, Erens, B and Mays, N. Integrated care and support pioneers: Indicators for measuring the quality of integrated care. Final report. London: Policy Innovation Research Unit; 2014. 

  20. Schultz, EM, Pineda, N, Lonhart, J, Davies, SM and McDonald, K. A systematic review of the care coordination measurement landscape. BMC Health Services Research, 2013; 13: 1–12. DOI: https://doi.org/10.1186/1472-6963-13-119 

  21. Singer, SJ, Burgers, J, Friedberg, M, Rosenthal, MB, Leape, L and Schneider, E. Defining and measuring integrated patient care: Promoting the next frontier in health care delivery. Medical Care Research and Review, 2011; 68(1): 112–127. DOI: https://doi.org/10.1177/1077558710371485 

  22. Valentijn, PP, Boesveld, IC, van der Klauw, DM, Ruwaard, D, Struijs, JN, Molema, JJW, Bruijnzeels, MA and Vrijhoef, HJM. Towards a taxonomy for integrated care: A mixed-methods study. International Journal of Integrated Care, 2015; 15: 1–18. DOI: https://doi.org/10.5334/ijic.1513 

  23. Suter, E and Armitage, G. Use of a knowledge synthesis by decision makers and planners to facilitate system level integration in a large Canadian provincial health authority. International Journal of Integrated Care, 2011; 11: 1–9. DOI: https://doi.org/10.5334/ijic.576 

  24. Levac, D, Colquhoun, H and O’Brien, KK. Scoping reviews: Advancing the methodology. Implementation Science, 2010; 5(1): 69–78. DOI: https://doi.org/10.1186/1748-5908-5-69 

  25. Canadian Institutes of Health Research. Guide to knowledge translation planning at CIHR: Integrated and End-of-Grant Approaches. 2012. Available from: http://www.cihr-irsc.gc.ca/e/documents/kt_lm_ktplan-en.pdf [8 March 2015]. 

  26. Hsu, C and Sandford, BA. The Delphi technique: Making sense of consensus. Practical Assessment, Research and Evaluation, 2007; 12(10): 1–8. 

  27. Hastings, SE, Armitage, GD, Mallinson, S, Jackson, K and Suter, E. Exploring the relationship between governance mechanisms in healthcare and health workforce outcomes: a systematic review. BMC Health Services Research, 2014; 14(1): 1. DOI: https://doi.org/10.1186/1472-6963-14-479 

  28. Bonomi, AE, Wagner, EH, Glasgow, RE and VonKorff, M. Assessment of chronic illness care (ACIC): A practical instrument to measure quality improvement. Health Services Research, 2002; 37(3): 791–820. DOI: https://doi.org/10.1111/1475-6773.00049 

  29. Durbin, J, Goering, P, Streiner, DL and Pink, G. Continuity of care: Validation of a new self-report measure for individuals using mental health services. The Journal of Behavioral Health Services and Research, 2004; 31(3): 279–96. DOI: https://doi.org/10.1097/00075484-200407000-00005 

  30. Farmanova, E, Grenier, J and Chomienne, MH. Pilot testing of a questionnaire for the evaluation of mental health services in family health team clinics in Ontario. Health Care Quarterly, 2013; 16(4): 61–67. DOI: https://doi.org/10.12927/hcq.2014.23657 

  31. King, S, Rosenbaum, P and King, G. Parents’ perceptions of caregiving: Development and validation of a measure of processes. Developmental Medicine and Child Neurology, 1996; 38: 757–772. DOI: https://doi.org/10.1111/j.1469-8749.1996.tb15110.x 

  32. Le Bas, J, King, R and Block, M. The impact of mental health service integration on systemic function: A staff perspective. Australian and New Zealand Journal of Psychiatry, 1998; 32: 666–672. DOI: https://doi.org/10.3109/00048679809113121 

  33. Lemmon, R and Shuff, IM. Effects of mental health centre staff turnover on HIV/AIDS service delivery integration. AIDS Care: Psychological and Socio-medical Aspects of AIDS/HIV, 2001; 13(5): 651–661. DOI: https://doi.org/10.1080/09540120120063278 

  34. Martz, K and Gerding, A. Perceptions of coordination of care between hospice and skilled nursing facility care providers. Journal of Hospice and Palliative Nursing, 2011; 13(4): 210–221. DOI: https://doi.org/10.1097/NJH.0b013e3182135ddd 

  35. McGuiness, C and Sibthorpe, B. Development and initial validation of a measure of coordination of health care. International Journal for Quality in Health Care, 2003; 15(4): 309–318. DOI: https://doi.org/10.1093/intqhc/mzg043 

  36. Safran, DG, Karp, M, Coltin, K, Chang, H, Li, A, Ogren, J and Rogers, WH. Measuring patients’ experiences with individual primary care physicians: Results of a statewide demonstration project. Journal of General Internal Medicine, 2006; 21: 13–21. DOI: https://doi.org/10.1111/j.1525-1497.2005.00311.x 

  37. Sawicki, GS, Lukens-Bull, K, Yin, X, Demats, N, Huang, IC, Livingood, W, Reiss, J and Wood, D. Measuring the transition readiness of youth with special healthcare needs: Validation of the TRAQ – Transition Readiness Assessment Questionnaire. Journal of Pediatric Psychology, 2009: 1–12. DOI: https://doi.org/10.1093/jpepsy/jsp128 

  38. Schaefer, JA, Cronkite, R and Ingudomnukul, E. Assessing continuity of care practices in substance use disorder treatment programs. Journal of Studies on Alcohol, 2004; 65(4): 513–520. DOI: https://doi.org/10.15288/jsa.2004.65.513 

  39. Tobon, JI, Reid, GJ and Goffin, RD. Continuity of care in children’s mental health: Development of a measure. Administration and Policy in Mental Health, 2014; 41(5): 668–686. DOI: https://doi.org/10.1007/s10488-013-0518-0 

  40. Center for Health Care Transition Improvement/Got Transition™. Health care transition resources: Current assessment of health care transition activities. 2014a. Available from: http://www.gottransition.org/resources/index.cfm [23 January 2017]. 

  41. Center for Health Care Transition Improvement/Got Transition™. Health care transition resources: Health care transition process measurement tool. 2014b. Available from: http://www.gottransition.org/resources/index.cfm [23 January 2017]. 

  42. Grimmer, K and Moss, J. The development, validity and application of a new instrument to assess the quality of discharge planning activities from the community perspective. International Journal for Quality in Health Care, 2001; 13(2): 109–116. DOI: https://doi.org/10.1093/intqhc/13.2.109 

  43. Coleman, EA, Smith, JD, Frank, JC, Eilertsen, TB, Thiare, JN and Kramer, AM. Development and testing of a measure designed to assess the quality of care transitions. International Journal of Integrated Care, 2002; 2(2): 1–9. DOI: https://doi.org/10.5334/ijic.60 

  44. Graetz, I, Reed, M, Shortell, SM, Rundall, TG, Bellows, J and Hsu, J. The next step towards making use meaningful: Electronic information exchange and care coordination across clinicians and delivery sites. Medical Care, 2014; 52(12): 1037–1041. DOI: https://doi.org/10.1097/MLR.0000000000000245 

  45. Coleman, EA, Mahoney, E and Parry, C. Assessing the quality of preparation for posthospital care from the patient’s perspective: The care transitions measure. Medical Care, 2005; 43(3): 246–255. DOI: https://doi.org/10.1097/00005650-200503000-00007 

  46. Parry, C, Mahoney, E, Chalmers, SA and Coleman, EA. Assessing the quality of transitional care: Further applications of the care transitions measure. Medical Care, 2008; 46(3): 317–322. DOI: https://doi.org/10.1097/MLR.0b013e3181589bdc 

  47. Conrad, DA, Cave, SH, Lucas, M, Harville, J, Shortell, SM, Bazzoli, GJ, et al. Community care networks: Linking vision to outcomes for community health improvement. Medical Care Research and Review, 2003; 60(4): 95S–129S. 

  48. Fletcher, BW, Lehman, WEK, Wexler, HK, Melnick, G, Taxman, FS and Young, DW. Measuring collaboration and integration activities in criminal justice and substance abuse treatment agencies. Drug and Alcohol Dependence, 2009; 103(1): S54–S64. 

  49. Singer, SJ, Friedberg, MW, Kiang, MV, Dunn, T and Kuhn, DM. Development and preliminary validation of the patient perceptions of integrated care survey. Medical Care Research and Review, 2012; 70(2): 143–164. DOI: https://doi.org/10.1177/1077558712465654 

  50. VicHealth. The partnerships analysis tool: For partners in health promotion. 2003. Available from: https://www.vichealth.vic.gov.au/. 

  51. Morrissey, JP, Calloway, M, Bartko, WT, Ridgely, MS, Goldman, HH and Paulson, RI. Local mental health authorities and service system change: Evidence from the Robert Wood Johnson Foundation Program on Chronic Mental Illness. The Milbank Quarterly, 1994; 72(1): 49–80. DOI: https://doi.org/10.2307/3350338 

  52. Pagliccia, N, Spiegel, J, Alegret, M, Bonet, M, Martinez, B and Yassi, A. Network analysis as a tool to assess the intersectoral management of health determinants at the local level: A report from an exploratory study of two Cuban municipalities. Social Science and Medicine, 2010; 71(2): 394–399. DOI: https://doi.org/10.1016/j.socscimed.2010.03.041 

  53. Amoroso, C, Proudfoot, J, Bubner, T, Jayasinghe, UW, Holton, C, Winstanley, J, Beilby, J, Harris, MF and the PracCap Research Team. Validation of an instrument to measure inter-organisational linkages in general practice. International Journal of Integrated Care, 2007; 7(3): 1–11. DOI: https://doi.org/10.5334/ijic.216 

  54. Meredith, LS, Eisenman, DP, Green, BL, Basurto-Davila, R, Cassells, A and Tobin, J. System factors affect the recognition and management of posttraumatic stress disorder by primary care clinicians. Medical Care, 2009; 47(6): 686–694. DOI: https://doi.org/10.1097/MLR.0b013e318190db5d 

  55. Morrissey, J, Calloway, M, Johnsen, M and Ullman, M. Service system performance and integration: A baseline profile of the ACCESS (Access to Community Care and Effective Services and Supports) demonstration sites. Psychiatric Services, 1997; 48(3): 374–380. DOI: https://doi.org/10.1176/ps.48.3.374 

  56. Tucker, S, Baldwin, R, Hughes, J, Benbow, S, Barker, A, Burns, A and Challis, D. Old age mental health services in England: Implementing the national service framework for older people. International Journal of Geriatric Psychiatry, 2007; 22(3): 211–217. DOI: https://doi.org/10.1002/gps.1662 

  57. Browne, G, Roberts, J, Gafni, A, Byrne, C, Kertyzia, J and Loney, P. Conceptualizing and validating the human services integration measure. International Journal of Integrated Care, 2004; 4(19): 1–9. DOI: https://doi.org/10.5334/ijic.98 

  58. Reilly, S, Challis, D, Burns, A and Hughes, J. Does integration really make a difference? A comparison of old age psychiatry services in England and Northern Ireland. International Journal of Geriatric Psychiatry, 2003; 18(10): 887–893. DOI: https://doi.org/10.1002/gps.942 

  59. Passalent, L, Kennedy, C, Warmington, K, Soever, L, Lundon, K, Shupak, R, Lineker, S and Schneider, R. System integration and clinical utilization of the advanced clinician practitioner in arthritis care (ACPAC) program–trained extended role practitioners in Ontario: A two-year, system-level evaluation. Healthcare Policy, 2013; 8(4): 56–70. DOI: https://doi.org/10.12927/hcpol.2013.23396 

  60. Arora, NK, Reeve, BB, Hays, RD, Clauser, SB and Oakley-Girvan, I. Assessment of quality of cancer-related follow-up care from the cancer survivor’s perspective. Journal of Clinical Oncology, 2011; 29(10): 1280–1289. DOI: https://doi.org/10.1200/JCO.2010.32.1554 

  61. Bennett, C, Graham, ID, Kristjansson, E, Kearing, SA, Clay, KF and O’Connor, AM. Validation of a preparation for decision making scale. Patient Education and Counseling, 2010; 78(1): 130–133. DOI: https://doi.org/10.1016/j.pec.2009.05.012 

  62. Damman, OC, Hendriks, M and Sixma, HJ. Towards more patient centred healthcare: A new consumer quality index instrument to assess patients’ experiences with breast care. European Journal of Cancer, 2009; 45(9): 1569–1577. DOI: https://doi.org/10.1016/j.ejca.2008.12.011 

  63. Deber, RB, Kraetschmer, N and Irvine, J. What role do patients wish to play in treatment decision making? Archives of Internal Medicine, 1996; 156(13): 1414–1420. DOI: https://doi.org/10.1001/archinte.1996.00440120070006 

  64. De Kok, M, Scholte, RW, Sixma, HJ, van der Weijden, T, Spijkers, KF, van de Velde, CJH, Roukema, JA, van der Ent, FW, Bell, AVRJ and von Meyenfeldt, MF. The patient’s perspective of the quality of breast cancer care. The development of an instrument to measure quality of care through focus groups and concept mapping with breast cancer patients. European Journal of Cancer, 2007; 43: 1257–1264. DOI: https://doi.org/10.1016/j.ejca.2007.03.012 

  65. Edwards, A, Elwyn, G, Hood, K, Robling, M, Atwell, C, Holmes-Rovner, M, Kinnersley, P, Houston, H and Russell, I. The development of COMRADE – a patient-based outcome measure to evaluate the effectiveness of risk communication and treatment decision making in consultations. Patient Education and Counseling, 2003; 50(3): 311–322. DOI: https://doi.org/10.1016/S0738-3991(03)00055-7 

  66. Farin, E, Gramm, L and Kosiol, D. Development of a questionnaire to assess communication preferences of patients with chronic illness. Patient Education and Counseling, 2011; 82: 81–88. DOI: https://doi.org/10.1016/j.pec.2010.02.011 

  67. Gagnon, M, Hebert, R, Dube, M and Dubois, MF. Development and validation of an instrument measuring individual empowerment in relation to personal health care: The health care empowerment questionnaire (HCEQ). American Journal of Health Promotion, 2006; 20(6): 429–435. DOI: https://doi.org/10.4278/0890-1171-20.6.429 

  68. Galassi, JP, Schanberg, R and Ware, WB. The patient reactions assessment: A brief measure of the quality of the patient-provider medical relationship. Psychological Assessment, 1992; 4(3): 346–351. DOI: https://doi.org/10.1037/1040-3590.4.3.346 

  69. Hays, RD, Shaul, JA, Williams, VS, Lubalin, JS, Harris-Kojetin, LD, Sweeny, SF and Cleary, PD. Psychometric properties of the CAHPS™ 1.0 survey measures. Medical Care, 1999; 37(3): MS22–MS31. DOI: https://doi.org/10.1097/00005650-199903001-00003 

  70. Holmes-Rovner, M, Kroll, J, Schmitt, N, Rovner, DR, Breer, ML, Rothert, ML, Padonu, G and Talarczyk, G. Patient satisfaction with health care decisions: The satisfaction with decision scale. Medical Decision Making, 1996; 16(1): 58–64. DOI: https://doi.org/10.1177/0272989X9601600114 

  71. Jenkinson, C, Coulter, A and Bruster, S. The Picker patient experience questionnaire: Development and validation using data from in-patient surveys in five countries. International Journal for Quality in Health Care, 2002; 14(5): 353–358. DOI: https://doi.org/10.1093/intqhc/14.5.353 

  72. Kim, SC, Boren, D and Solem, SL. The Kim alliance scale: Development and preliminary testing. Clinical Nursing Research, 2001; 10(3): 314–331. DOI: https://doi.org/10.1177/c10n3r7 

  73. Légaré, F, Kearing, S, Clay, K, Gagnon, S, D’Amours, D, Rousseau, M and O’Connor, A. Are you SURE? Assessing patient decisional conflict with a 4-item screening test. Canadian Family Physician, 2010; 56(8): e308–14. http://www.ncbi.nlm.nih.gov/pmc/articles/PMC2920798/pdf/056e308.pdf. 

  74. Lerman, CE, Brody, DS, Caputo, GC, Smith, DG, Lazaro, CG and Wolfson, HG. Patients’ perceived involvement in care scale: Relationship to attitudes about illness and medical care. Journal of General Internal Medicine, 1990; 5(1): 29–33. DOI: https://doi.org/10.1007/BF02602306 

  75. Little, P, Everitt, H, Williamson, I, Warner, G, Moore, M, Gould, C, Ferrie, K and Payne, S. Observational study of effect of patient centredness and positive approach on outcomes of general practice consultation. British Medical Journal, 2001; 323(7318): 908–911. DOI: https://doi.org/10.1136/bmj.323.7318.908 

  76. Marco, CA, Buderer, N and Thum, D. End-of-life care: Perspectives of family members of deceased patients. American Journal of Hospice and Palliative Medicine, 2005; 22(1): 26–31. DOI: https://doi.org/10.1177/104990910502200108 

  77. Martin, LR, DiMatteo, MR and Lepper, HS. Facilitation of patient involvement in care: Development and validation of a scale. Behavioral Medicine, 2001; 27(3): 111–120. DOI: https://doi.org/10.1080/08964280109595777 

  78. Meakin, R and Weinman, J. The Medical Interview Satisfaction Scale (MISS-21) adapted for British general use. Family Practice, 2002; 19(3): 257–263. DOI: https://doi.org/10.1093/fampra/19.3.257 

  79. Mercer, SW, Maxwell, M, Heaney, D and Watt, GCM. The consultation and relational empathy (CARE) measure: development and preliminary validation and reliability of an empathy-based consultation process measure. Family Practice, 2004; 21(6): 699–705. DOI: https://doi.org/10.1093/fampra/cmh621 

  80. O’Connor, AM. Validation of a decisional conflict scale. Medical Decision Making, 1995; 15(1): 25–30. DOI: https://doi.org/10.1177/0272989X9501500105 

  81. Safran, DG, Kosinski, M, Tarlov, AR, Rogers, WH, Taira, DA, Lieberman, N and Ware, JE. The primary care assessment survey: Tests of data quality and measurement performance. Medical Care, 1998; 36(5): 728–739. DOI: https://doi.org/10.1097/00005650-199805000-00012 

  82. Simon, D, Schorr, G, Wirtz, M, Vodermaier, A, Caspari, C, Neuner, B, Spies, C, et al. Development and first validation of the shared decision-making questionnaire (SDM-Q). Patient Education and Counseling, 2006; 63(3): 319–327. DOI: https://doi.org/10.1016/j.pec.2006.04.012 

  83. Sixma, HJ, Kerssens, JJ, Campen, CV and Peters, L. Quality of care from the patients’ perspective: From theoretical concept to a new measuring instrument. Health Expectations, 1998; 1: 82–95. DOI: https://doi.org/10.1046/j.1369-6513.1998.00004.x 

  84. Stewart, AL, Napoles-Springer, AM, Gregorich, SE and Santoyo-Olsson, J. Interpersonal processes of care survey: Patient-reported measures for diverse groups. Health Services Research, 2007; 42(3): 1235–1256. DOI: https://doi.org/10.1111/j.1475-6773.2006.00637.x 

  85. Ainsworth, F, Cowan, E and Trieschman, AE. Family centered group care practice: Model building. Child and Youth Care Forum, 1998; 27(1): 59–69. DOI: https://doi.org/10.1007/BF02589528 

  86. Campbell, C, Lockyer, J, Laidlaw, T and MacLeod, H. Assessment of a matched-pair instrument to examine doctor-patient communication skills in practising doctors. Medical Education, 2007; 41: 123–129. DOI: https://doi.org/10.1111/j.1365-2929.2006.02657.x 

  87. Elwyn, G, Edwards, A, Wensing, M, Hood, K, Atwell, C and Grol, R. Shared decision making: developing the OPTION scale for measuring patient involvement. Quality and Safety in Health Care, 2003; 12(2): 93–99. DOI: https://doi.org/10.1136/qhc.12.2.93 

  88. Elwyn, G, Tsulukidze, M, Edwards, A, Legare, F and Newcombe, R. Using a “talk” model of shared decision making to propose an observation-based measure: Observer OPTION5. Patient Education and Counseling, 2013; 93(2): 265–271. DOI: https://doi.org/10.1016/j.pec.2013.08.005 

  89. Heggland, LH, Mikkelsen, A, Ogaard, T and Hausken, K. Measuring patient participation in surgical treatment decision-making from healthcare professionals’ perspective. Journal of Clinical Nursing, 2012; 23(3–4): 482–491. DOI: https://doi.org/10.1111/jocn.12184 

  90. Agnew-Davies, R, Stiles, WB, Hardy, GE, Barkham, M and Shapiro, DA. Alliance structure assessed by the Agnew Relationship Measure (ARM). British Journal of Clinical Psychology, 1998; 37(2): 155–172. DOI: https://doi.org/10.1111/j.2044-8260.1998.tb01291.x 

  91. Cegala, DJ, Thoesen Coleman, M and Turner, JW. The development and partial assessment of the medical communication competence scale. Health Communication, 1998; 10(3): 261–288. DOI: https://doi.org/10.1207/s15327027hc1003_5 

  92. Shields, CG, Franks, P, Fiscella, K, Meldrum, S and Epstein, RM. Rochester Participatory Decision-Making Scale (RPAD): Reliability and validity. Annals of Family Medicine, 2005; 3(5): 436–442. DOI: https://doi.org/10.1370/afm.305 

  93. Kim, SC, Kim, S and Boren, D. The quality of therapeutic alliance between patient and provider predicts general satisfaction. Military Medicine, 2008; 173(1): 85–90. DOI: https://doi.org/10.7205/MILMED.173.1.85 

  94. Flocke, SA. Measuring attributes of primary care: Development of a new instrument. Journal of Family Practice, 1997; 45(1): 64–74. 

  95. Friedberg, MW, Safran, DG, Coltin, KL, Dresser, M and Schneider, EC. Readiness for the patient-centered medical home: Structural capabilities of Massachusetts Primary Care Practices. Journal of General Internal Medicine, 2008; 24(2): 162–169. DOI: https://doi.org/10.1007/s11606-008-0856-x 

  96. Rittenhouse, DR, Casalino, LP, Gillies, RR, Shortell, SM and Lau, B. Measuring the medical home infrastructure in large medical groups. Health Affairs, 2008; 27(5): 1246–1258. DOI: https://doi.org/10.1377/hlthaff.27.5.1246 

  97. Rodrigues, LBB, Leite, AC, Yamamura, M, Deon, KC and Arcencio, RA. Coordenação das redes de atenção à saúde pela atenção primária: validação semântica de um instrumento adaptado. [Coordination of primary healthcare networks: Semantic validation of an adapted instrument]. Cadernos de Saúde Pública, 2014; 30(7): 1384–1390. [in Portuguese]. DOI: https://doi.org/10.1590/0102-311X00137613 

  98. Birnberg, JM, Drum, ML, Huang, ES, Casalino, LP, Lewis, SE, Vable, AM, Tang, H, Quinn, M, Burnet, DL, Summerfelt, T and Chin, MH. Development of a Safety Net Medical Home Scale for Clinics. Journal of General Internal Medicine, 2011; 26(12): 1418–1425. DOI: https://doi.org/10.1007/s11606-011-1767-9 

  99. Center for Medical Home Improvement. The medical home index: Adult. 2008. [cited 24 January 2017]. Available from: http://depts.washington.edu/lend/links/CMHI-MHI-Adult-Primary-Care_Full-Version.pdf. 

  100. Cooley, WC, McAllister, JW, Sherrieb, K and Clark, RE. The medical home index: Development and validation of a new practice-level measure of implementation of the medical home model. Ambulatory Pediatrics, 2003; 3(4): 173–180. DOI: https://doi.org/10.1367/1539-4409(2003)003<0173:TMHIDA>2.0.CO;2 

  101. Nikbakht-Van, M, Pruyn, JFA and Van der Rijt, CCD. Function of local networks in palliative care: A Dutch view. Journal of Palliative Medicine, 2005; 8(4): 808–816. DOI: https://doi.org/10.1089/jpm.2005.8.808 

  102. Cassady, CE, Starfield, B, Hurtado, MP, Berk, RA, Nanda, JP and Friedenberg, LA. Measuring consumer experiences with primary care. Pediatrics, 2000; 105(4): 998–1003. 

  103. Center for Medical Home Improvement. The medical home index: Revised short form: Pediatric. 2006. [cited 24 January 2017]. Available from: https://medicalhomeinfo.aap.org/tools-resources/Documents/CMHI-MHI-Pediatric_Short-Version.pdf. 

  104. McDonald, KM, Schultz, E, Albin, L, Pineda, N, Lonhart, J, Sundaram, V, Smith-Spangler, C, Brustrom, J, Malcolm, E, Rohn, L and Davies, S. Care coordination measures atlas update, June 2014. US Department of Health & Human Services. Washington, DC: Agency for Healthcare Research and Quality; 2014. 

  105. Rodrigues, LBB, Benedita dos Santos, C, Leiko Takamatsu Goyata, S, Paschoal Popolin, M, Yamamura, M, Christiane Deon, K, Miguel Velez Lapao, L, et al. Assessment of the coordination of integrated health service delivery networks by the primary health care: COPAS questionnaire validation in the Brazilian context. BMC Family Practice, 2015; 16(87): 1–9. DOI: https://doi.org/10.1186/s12875-015-0299-5 

  106. Amundson, SJ. The impact of relational norms on the effectiveness of health and human service teams. The Health Care Manager, 2005; 24(3): 216–224. DOI: https://doi.org/10.1097/00126450-200507000-00005 

  107. Bateman, B, Wilson, FC and Bingham, D. Team effectiveness – development of an audit questionnaire. Journal of Management Development, 2002; 21(3): 215–226. DOI: https://doi.org/10.1108/02621710210420282 

  108. Cramm, JM and Nieboer, AP. Professionals’ views on interprofessional stroke team functioning. International Journal of Integrated Care, 2011; 11(3): 1–8. DOI: https://doi.org/10.5334/ijic.657 

  109. Schroder, C, Medves, J, Paterson, M, Byrnes, V, Chapman, C, O’Riordan, A, Pichora, D and Kelly, C. Development and pilot testing of the collaborative practice assessment tool. Journal of Interprofessional Care, 2011; 25(3): 189–195. DOI: https://doi.org/10.3109/13561820.2010.532620 

  110. Shortell, SM, Rousseau, DM, Gillies, RR, Devers, KJ and Simons, TL. Organizational assessment in intensive care units (ICUs): Construct development, reliability, and validity of the ICU nurse-physician questionnaire. Medical Care, 1991; 29(8): 709–726. DOI: https://doi.org/10.1097/00005650-199108000-00004 

  111. Smits, SJ, Falconer, JA, Herrin, J, Bowen, SE and Strasser, DC. Patient-focused rehabilitation team cohesiveness in Veterans Administration hospitals. Archives of Physical Medicine and Rehabilitation, 2003; 84(9): 1332–1338. DOI: https://doi.org/10.1016/S0003-9993(03)00197-7 

  112. Temkin-Greener, H, Gross, D, Kunitz, SJ and Mukamel, D. Measuring interdisciplinary team performance in a long-term care setting. Medical Care, 2004; 42(5): 472–481. DOI: https://doi.org/10.1097/01.mlr.0000124306.28397.e2 

  113. Undre, S, Healey, AN, Darzi, A and Vincent, CA. Observational assessment of surgical teamwork: A feasibility study. World Journal of Surgery, 2006; 30(10): 1774–1783. DOI: https://doi.org/10.1007/s00268-005-0488-9 

  114. Vinokur-Kaplan, D. Treatment teams that work (and those that don’t): An application of Hackman’s group effectiveness model to interdisciplinary teams in psychiatric hospitals. Journal of Applied Behavioral Science, 1995; 31(3): 303–327. DOI: https://doi.org/10.1177/0021886395313005 

  115. Lurey, JS and Raisinghani, MS. An empirical study of best practices in virtual teams. Information & Management, 2001; 38(8): 523–544. DOI: https://doi.org/10.1016/S0378-7206(01)00074-X 

  116. Staples, DS and Webster, J. Exploring traditional and virtual team members’ “best practices”: A social cognitive theory perspective. Small Group Research, 2007; 38(1): 60–97. DOI: https://doi.org/10.1177/1046496406296961 

  117. Hepburn, K, Tsukuda, RA and Fasser, C. Team skills scale, 1996. In: Siegler, EL, Hyer, K, Fulmer, T and Mezey, M (eds.), Geriatric interdisciplinary team training. New York: Springer Publishing Company; 1998: 264–265. 

  118. Ainsworth, J and Buchan, I. COCPIT: An instrument for integrated care pathway variance analysis. Quality of Life through Quality of Information, 2012; 180: 995–999. 

  119. Vanhaecht, K, De Witte, K, Depreitere, R, Van Zelm, R, De Bleser, L, Proost, K and Sermeus, W. Development and validation of a care process self-evaluation tool. Health Services Management Research, 2007; 20(3): 189–202. DOI: https://doi.org/10.1258/095148407781395964 

  120. Van Houdt, S, Heyrman, J, Vanhaecht, K, Sermeus, W and De Lepeleire, J. Care pathways to improve care co-ordination and quality between primary and hospital care for patients with radical prostatectomy: a quality improvement project. Quality in Primary Care, 2013; 21(3): 149–155. 

  121. Wagner, C, Thompson, CA, Arah, OA, Groene, O, Klazinga, N, Dersarkissian, M and Sunol, R. A checklist for patient safety rounds at the care pathway level. International Journal for Quality in Health Care, 2014; 26(1): 36–46. DOI: https://doi.org/10.1093/intqhc/mzu019 

  122. Whittle, CL, McDonald, PS, Dunn, L and de Luc, K. Developing the integrated care pathway appraisal instrument (ICPAT): a pilot study. Journal of Integrated Care Pathways, 2004; 8(2): 77–81. DOI: https://doi.org/10.1177/147322970400800207 

  123. Drewes, HW, de Jong-van Til, JT, Struijs, JN, Baan, CA, Tekle, FB, Meijboom, BR and Westert, GP. Measuring chronic care management experience of patients with diabetes: PACIC and PACIC+ validation. International Journal of Integrated Care, 2012; 12(1): 1–11. DOI: https://doi.org/10.5334/ijic.862 

  124. Glasgow, RE, Wagner, EH, Schaefer, J, Mahoney, LD, Reid, RJ and Greene, SM. Development and validation of the patient assessment of chronic illness care (PACIC). Medical Care, 2005; 43(5): 436–444. DOI: https://doi.org/10.1097/01.mlr.0000160375.47920.8c 

  125. Wagner, EH, Austin, BT and Von Korff, M. Organizing care for patients with chronic illness. The Milbank Quarterly, 1996; 74(4): 511–544. DOI: https://doi.org/10.2307/3350391 

  126. Whittle, C and Hewison, A. Integrated care pathways: pathways to change in health care? Journal of Health Organization and Management, 2007; 21(3): 297–306. DOI: https://doi.org/10.1108/14777260710751753 

  127. Guthrie, B, Saultz, JW, Freeman, GF and Haggerty, JL. Continuity of care matters. British Medical Journal, 2008; 337: a867. DOI: https://doi.org/10.1136/bmj.a867 

  128. Suter, E and Mallinson, S. Accountability for coordinated/integrated health services delivery. World Health Organization Regional Office for Europe; 2015, working paper. Available from: http://www.euro.who.int/__data/assets/pdf_file/0003/286149/Accountability_for_coordinated_integrated_health_services_delivery.pdf?ua=1. 

  129. Stewart, LJ and Greisler, D. Measuring primary care practice performance within an integrated delivery system: A case study. Journal of Healthcare Management, 2002; 47(4): 250–261. 

  130. Williams, JW and Manning, JS. Collaborative mental health and primary care for bipolar disorder. Journal of Psychiatric Practice, 2008; 14(2): 55–64. DOI: https://doi.org/10.1097/01.pra.0000320127.84175.20 

  131. Andrade, GRBD, Vaitsman, J and Farias, LO. Metodologia de elaboração do Índice de Responsividade do Serviço (IRS). [Methodology for developing a health service responsiveness index (SRI).] Cadernos de Saúde Pública, 2010; 26(3): 523–534. [in Portuguese]. DOI: https://doi.org/10.1590/S0102-311X2010000300010 

  132. Chou, AF, Nagykaldi, Z, Aspy, CB and Mold, JW. Promoting patient-centered preventive care using a wellness portal: Preliminary findings. Journal of Primary Care and Community Health, 2010; 1(2): 88–92. DOI: https://doi.org/10.1177/2150131910365358 

  133. Wilkinson, DL and McCarthy, M. Use of comparative data for integrated cancer services. BMC Health Services Research, 2007; 7(204): 204–211. DOI: https://doi.org/10.1186/1472-6963-7-204 

  134. Cameron, KS and Quinn, RE. Diagnosing and changing organizational culture: Based on the competing values framework. San Francisco: John Wiley & Sons; 2005. 

  135. Molden, MM, Brown, CL and Griffith, BF. At the heart of integration: Aligning physicians and administrators to create new value. Frontiers of Health Services Management, 2013; 29(4): 3. 

  136. Chesluk, BJ, Bernabeo, E, Hess, B, Lynn, LA, Reddy, S and Holmboe, ES. A new instrument to give hospitalists feedback to improve interprofessional teamwork and advance patient care. Health Affairs, 2012; 31(11): 2485–2492. DOI: https://doi.org/10.1377/hlthaff.2011.0611 

  137. Dynan, L, Bazzoli, GJ and Burns, LR. Assessing the extent of integration achieved through physician-hospital arrangements. Journal of Healthcare Management, 1998; 43(3): 242–243. 

  138. Milette, L, Hebert, R and Veil, A. Integrated service delivery networks for seniors: Early perceptions of family physicians. Canadian Family Physician, 2005; 51(8): 1104–1105. Available from: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC1479506/pdf/jCFP_v051_pg1105.pdf. 

  139. Southern, DM, Appleby, NJ and Young, D. Integration from the Australian GP’s perspective. Australian Family Physician, 2001; 30(2): 182–188. 

  140. Bradford, J, Honnold, J, Rives, ME and Hafford, K. The effectiveness of resource allocation methods used by RWCA title II consortia in Virginia. AIDS and Public Policy Journal, 2000; 15(1): 29–42. 

  141. Abendstern, M, Reilly, S, Hughes, J, Venables, D and Challis, D. Levels of integration and specialisation within professional community teams for people with dementia. International Journal of Geriatric Psychiatry, 2006; 21(1): 77–85. DOI: https://doi.org/10.1002/gps.1427 

  142. Bainbridge, D, Brazil, K, Krueger, P, Ploeg, J, Taniguchi, A and Damay, J. Measuring horizontal integration among health care providers in the community: an examination of a collaborative process within a palliative care network. Journal of Interprofessional Care, 2015; 29(3): 245–252. DOI: https://doi.org/10.3109/13561820.2014.984019 

  143. Devers, KJ, Shortell, SM, Gillies, RR, Anderson, DA, Mitchell, JB and Erickson, KLM. Implementing organized delivery systems: An integration scorecard. Health Care Management Review, 1994; 19(3): 7–20. DOI: https://doi.org/10.1097/00004010-199422000-00003 

  144. Friedman, EL, Chawla, N, Morris, PT, Castro, KM, Carrigan, AC, Prabhu Das, I and Clauser, SB. Assessing the development of multidisciplinary care: Experience of the National Cancer Institute Community Cancer Centers Program. Journal of Oncology Practice, 2015; 11(1): e36–e43. DOI: https://doi.org/10.1200/JOP.2014.001535 

  145. Gillies, RR, Shortell, SM, Anderson, DA, Mitchell, JB and Morgan, KL. Conceptualizing and measuring integration: Findings from the health systems integration study. Hospital and Health Services Administration, 1993; 38(4): 467–489. 

  146. Nelson, EC, Batalden, PB, Huber, TP, Mohr, JJ, Godfrey, MM, Headrick, LA and Wasson, JH. Microsystems in health care: Part 1. Learning from high performing front-line clinical units. The Joint Commission Journal on Quality Improvement, 2002; 28(9): 472–493. DOI: https://doi.org/10.1016/S1070-3241(02)28051-7 

  147. Ouwens, MMMTJ, Marres, HAM, Hermens, RRP, Hulscher, MME, van den Hoogen, FJA, Grol, RP and Wollersheim, HCH. Quality of integrated care for patients with head and neck cancer: Development and measurement of clinical indicators. Head & Neck, 2007; 29(4): 378–386. DOI: https://doi.org/10.1002/hed.20532 

  148. The MacColl Center for Health Care Innovation at Group Health Research Institute and Qualis Health. The Patient-Centered Medical Home Assessment Version 4.0. Seattle: Safety Net Medical Home Initiative; 2014 Sept. [cited 10 February 2017]. Available from: http://www.safetynetmedicalhome.org/sites/default/files/PCMH-A.pdf. 

  149. SAMHSA-HRSA Center for Integrated Health Solutions. Organizational Assessment Toolkit for Primary and Behavioral Health Care Integration. 2014. [cited 24 January 2017]. Available from: http://www.integration.samhsa.gov/operations-administration/OATI_Overview_FINAL.pdf. 

  150. Martin, LA, Nelson, EC, Lloyd, RC and Nolan, TW. Whole System Measures. IHI Innovation Series white paper. Cambridge, Massachusetts: Institute for Healthcare Improvement; 2007. [cited 24 January 2017]. Available from: http://www.ihi.org/resources/pages/ihiwhitepapers/wholesystemmeasureswhitepaper.aspx. 

  151. Paulhus, DL and Vazire, S. The self-report method. In: Handbook of research methods in personality psychology, 2007; 1: 224–239. 

  152. Luxford, K and Sutton, S. How does patient experience fit into the overall healthcare picture? Patient Experience Journal, 2014; 1(1): 20–27. 

  153. Clinton-McHarg, T, Yoong, SL, Tzelepis, F, Regan, T, Fielding, A, Skelton, E, Kingsland, M, Ying Ooi, J and Wolfenden, L. Psychometric properties of implementation measures for public health and community settings and mapping of constructs against the Consolidated Framework for Implementation Research: A systematic review. Implementation Science, 2016; 11: 148–170. DOI: https://doi.org/10.1186/s13012-016-0512-5 

  154. Squires, JE, Estabrooks, CA, O’Rourke, HM, Gustavsson, P, Newburn-Cook, C and Wallin, L. A systematic review of the psychometric properties of self-report research utilization measures used in healthcare. Implementation Science, 2011; 6: 83–101. DOI: https://doi.org/10.1186/1748-5908-6-83 

  155. Bennett, C, Khangura, S, Brehaut, J, Graham, ID, Moher, D, Potter, BK and Grimshaw, JM. Reporting guidelines for survey research: An analysis of published guidance and reporting practices. PLoS Medicine, 2011; 8(8): 1–11. DOI: https://doi.org/10.1371/journal.pmed.1001069 