International Journal of Integrated Care, 1 June 2001 - ISSN 1568-4156
Comment
IDN rankings and performance: a comment
Roice D. Luke, Ph.D., Professor, Department of Health Administration, Virginia Commonwealth University, Richmond, VA 23298-0203

Dr. Wan and colleagues have produced an impressive analysis of the performance of integrated health networks (IHNs, also referred to in the literature as integrated delivery networks, or IDNs) in the United States. Such studies are very much needed, especially given the limited research on IHN performance and the importance of this new organizational form within the U.S. health care system.

The authors examine the increasingly recognized SMG top 100 list in an attempt to determine whether a system's ranking (by SMG) as an integrated system correlates with system performance. At the heart of this study are two key assumptions: 1) that the SMG ranking methodology is valid and 2) that the entities included in the ranking are indeed integrated health networks. We briefly address each of these in this comment.

Validity of SMG scoring system

With regard to the first assumption, Dr. Wan and colleagues are the first to examine the SMG rankings empirically. And, not unexpectedly, they found no association between system ranking and performance.

It is important to note that once Modern Healthcare lent the SMG rankings a degree of notoriety by publishing the top 100 list annually, the rankings became a hot item for system marketers. Most of the top-rated IHNs now report their high rankings prominently on their web pages. Modern Healthcare has thus provided a kind of “institutional” validation of the scoring system on behalf of a practising community hungry for positive indicators of successful system building and integration.

One example within the data themselves signals possible validity problems. The top 100 rankings published in 2000 (we note that only the 1998 and 1999 rankings were used in the Wan study) included the UCSF Stanford Healthcare System. This system formed in 1997, when the University of California San Francisco health system merged with the Stanford University health system. From the very beginning, the merged entity experienced significant financial and organizational difficulties. The continuing problems were so severe that the partners voted just two years later, in 1999, to undo the merger. Despite these problems, and even after the merger had fallen apart, SMG still included the system in its top 100 ranking (at 64th). Admittedly, this is but one case. But it is emblematic of the problem inherent in promulgating a system for ranking performance that has not been subjected to external testing (until, of course, the Wan and colleagues study) and that appears, for all intents and purposes, to serve more as a marketing tool than as a resource designed for scholarship. Accordingly, it seems reasonable to suggest that, until further research validates the SMG scoring system, the research community should probably refrain from treating the rankings as indicators of relative levels of health system integration.

Does SMG measure IHNs?

This leads to the second assumption—that the entities ranked by SMG and examined by Wan and colleagues are actually IHNs. This assumption is not easy to verify, primarily because there are no validated databases or accepted definitions or measures of IHNs [6]. We note that the definition offered by Wan and colleagues is consistent with many others found in the literature: “Recent major trends in health care systems have been (1) to provide all elements of the care continuum from health insurance, outpatient and inpatient services to long-term health maintenance, and (2) to develop system-wide integration of administration, clinical care, information technology, and financing.” These two elements—(1) comprehensive service delivery, perhaps combined with the provision of health insurance, and (2) integration across those service elements—are generally considered to be central to the concept of an IHN. Given the well-recognized difficulties inherent in vertical integration [8], the requirement that IHNs combine provider and insurance businesses is often relaxed in practice. In the 1990s, the period in which great attention was given to IDN development, only a small number of systems successfully combined insurance and delivery. It is also common to assume that IHNs join physicians and hospitals into integrated systems. However, many of the hospital-based systems that ventured into this risky arena lost money as a result, and many have since sold off their physician practices [3, 4].

The 1990s experimentation with IHN formation did, however, produce a significant number of very large, highly complex, and often dominant health care systems. Most that formed in that period were hospital-based. And while few of these successfully integrated with managed care or physicians, most have been classified (through self-designations or labels assigned by data collection companies) as IHNs. The essential characteristics that qualify these as IHNs are their overall complexity (they often involve combinations of two or more hospitals, for example) and the fact that they combine service capacity at the local level. It is only locally that the integration of clinical resources and services is possible. Indeed, the very idea of clinical integration was first promulgated by those systems that had already developed substantial system capacity within local markets [1, 2, 5]. The 1990s thus became the period in which local system integration was fully conceptualized, as well as pursued strategically, by hospitals across the country.

The 1990s revolution in system formation changed the landscape of health care in the United States dramatically. A few statistics illustrate the consequences of system formation in that period. In 1989, just prior to the major restructuring, about 28 percent of urban hospital bed capacity was involved in local combinations of two or more hospitals [7]. By 2000, after the merger and acquisition movement had settled down, that percentage had more than doubled, to 59 percent [9]. The number of local urban hospital clusters reached around 465 in 2000, with the clusters ranging in size from two hospitals (52 percent of the clusters) to 18 hospitals.

As a result of the restructuring, local hospital markets became very concentrated (see figure 1). In 2000, for example, the average combined market share of the top four firms per urban market, by market size, was 99% for markets of 250,000 or less, 93% for markets between 250,000 and one million, and 73% for markets of one million or more [9]. The four-firm ratio for non-urban markets (where no more than one or two hospitals are likely to be present) commonly reaches 100%. Overall, therefore, the vast majority of markets are dominated by a small number of hospital firms, many of which are local clusters. It is thus very clear that the so-called IHN movement produced significant changes in the structure of urban health care markets.
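For reference, the four-firm concentration ratio cited above is the conventional measure of market concentration: the combined market share of the four largest firms in a market. In the notation below, $s_i$ denotes the share held by the $i$-th largest hospital firm (the specific share measure used in the Williamson Institute data, whether beds, admissions, or discharges, is an assumption left open here):

$$ CR_4 = \sum_{i=1}^{4} s_i, \qquad 0 \le CR_4 \le 100\% $$

A $CR_4$ approaching 100% thus indicates a market in which essentially all capacity is held by four or fewer firms, as in the small urban and non-urban markets described above.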

The question then is this: are these (and other local system types) tracked in the SMG database? Many of the local systems are in fact counted by SMG. For example, as pointed out in the Wan and colleagues paper, SMG lists the Henry Ford Health System as an IHN. Henry Ford is an excellent example of a local, multi-hospital, multi-service system. Unfortunately, in addition to such local systems, SMG also counts as IDNs a number of systems that should not be counted as such. Most critically, it includes as IHNs a number of multi-market, multi-hospital systems as well. Examining SMG's 2000 list of the top 100 IDNs, we note that nearly half (46%) of the so-called IHNs it ranked were actually multi-market systems. For example, Banner Healthcare, ranked 72nd on the list, is a large not-for-profit system that owns hospitals stretching from Arizona to North Dakota. Likewise, Intermountain Health Care, Sutter Health, Trinity Health, and others were so classified, but each reaches well beyond a single market. It is true that many of these run IDNs locally—for instance, a Banner cluster in Phoenix and one in Salt Lake City run by Intermountain. However, the local clusters within these systems are not separately identified by SMG. It is interesting to note that SMG does separately identify within its top 100 a cluster owned by Tenet, the second largest for-profit system in the country—Tenet of Fort Lauderdale. But this tends to be the exception rather than the rule in SMG's handling of multi-market companies. SMG also includes as IDNs a number of freestanding hospitals that have self-identified as IDNs.

A few other companies provide data on IHNs. In addition to the American Hospital Association, which is the most widely used source of data among health services researchers, several proprietary companies provide health care market data. Most of these formed in the 1990s to serve pharmaceutical and other supply and distribution companies. They include USLifeline (owned by Medical Distribution Solutions, Inc.), MDI, MCIC, and Dorenfest & Associates. Of these, only USLifeline distinguishes local clusters from multi-market system types. In addition, the Williamson Institute, located at Virginia Commonwealth University, has for years tracked both multi-market and local-market hospital combinations, each type separately. None of these sources, however, specifically tracks vertically integrated systems (those that combine hospital, physician, and other providers as well as insurance products). The vertical form, of course, constitutes the fullest expression of the IDN concept.

Conclusions

Wan and colleagues have provided an important service to the research community by examining whether the popular rankings created by SMG actually reflect degrees of integration that would be associated with levels of system performance. They found little evidence of any such association. We point out in this commentary that no serious testing of the validity of SMG's system for ranking IDNs has been conducted, and that SMG is not consistent in what it designates as an IDN in its database.

We simply lack a reliable, valid, and current database on the more recently formed IDNs. This is very unfortunate, as the IDN (or the various permutations of complex health care systems that have formed) represents the most significant new organizational form to have emerged in the hospital industry in a century. Given their size, complexity, and local power, as well as their newness as organizational forms, there is a critical need for research into these entities. We need to understand the organizational variations that exist among IDNs. We need to know what factors are associated with IDN structure and strategy. And we need to know whether combinations of health care capacity at the local market level do indeed improve quality, efficiency, and/or access to care beyond what is possible under traditional, unstructured “system” arrangements. In effect, we need more work of the kind conducted by Wan and associates, but, hopefully, research that draws on reliable and valid databases of local health systems.

References
1.
American Hospital Association. Overview: AHA's national reform strategy. Chicago: AHA; 1992.
2.
American Hospital Association. Section for health care systems. Renewing the U.S. Health Care System. Washington, DC: The Office of Constituency Sections; 1990.
3.
Burns LR, Morrissey MA, Alexander JA, Johnson V. Managed care and processes to integrate physician/hospitals. Health Care Management Review 1998; 23(4):70-80.
4.
Burns LR, Cacciamani J, Clement J, Aquino W. The fall of the house of AHERF: the Allegheny bankruptcy. Health Affairs (Millwood) 2000 Jan-Feb; 19(1):7-41. Review.
5.
Catholic Health Association. Setting relationships right: a working proposal for systemic healthcare reform. St. Louis: The Association; 1992.
6.
Luke RD, Begun JW. Have IHNs failed in healthcare? Frontiers of Health Services Management 2001 Summer; 17(4):45-50; discussion 51-4.
7.
Luke RD, Olden PC. Foundations of market restructuring: local hospital cluster and HMO infiltration. Medical Interface 1995 Sep; 8(9):71-5.
8.
Walston SL, Kimberly JR, Burns LR. Owned vertical integration and health care: promise and performance. Health Care Management Review 1996; 21(Winter):83-92.
9.
Williamson Institute. Database on strategic hospital alliances. Richmond, VA: VCU; 2000.
Figure 1