
(a) Except as otherwise provided under this part, a QIO may not use or disclose a beneficiary's confidential information without an authorization from the beneficiary. The QIO's use or disclosure must be consistent with the authorization.

(b) A valid authorization is a document that contains the following:

(1) A description of the information to be used or disclosed that identifies the information in a specific and meaningful fashion.

(2) The name or other specific identification of the QIO(s) and QIO point(s) of contact making the request to use or disclose the information.

(3) The name or other specific identification of the person(s), or class of persons, to whom the QIO(s) may disclose the information or allow the requested use.

(4) A description of each purpose of the requested use or disclosure. The statement “at the request of the individual” is a sufficient description of the purpose when an individual initiates the authorization and does not, or elects not to, provide a statement of purpose.

(5) An expiration date or an expiration event that relates to the beneficiary or the purpose of the use or disclosure. The statement “end of the QIO research study,” “none,” or similar language is sufficient if the authorization is for a use or disclosure of confidential information for QIO research, including for the creation and maintenance of a research database or research repository.

(6) Signature of the individual and date. If the authorization is signed by a beneficiary's representative, a description of such representative's authority to act for the beneficiary must also be provided.

(c) In addition to those items contained in paragraph (b) of this section, the authorization must contain statements adequate to place the individual on notice of all of the following:

(1) The individual's right to revoke the authorization in writing; and

(2) Any exceptions to the right to revoke and a description of how the individual may revoke the authorization;

(3) The ability or inability of the QIO to condition its review activities on the authorization, by stating either:

(i) That the QIO may not condition the review of complaints, appeals, or payment determinations, or any other QIO reviews or other tasks within the QIO's responsibility on whether the individual signs the authorization;

(ii) The consequences to the individual of a refusal to sign the authorization when the refusal will render the QIO unable to carry out an activity.

(4) The potential for information disclosed pursuant to the authorization to be subject to either appropriate or inappropriate redisclosure by a beneficiary, after which the information would no longer be protected by this subpart.

(d) The authorization must be written in plain language.

(e) If a QIO seeks an authorization from a beneficiary for a use or disclosure of confidential information, the QIO must provide the beneficiary with a copy of the signed authorization.

(f) A beneficiary may revoke an authorization provided under this section at any time, provided the revocation is in writing, except to the extent that the QIO has taken action in reliance upon the authorization.

[77 FR 68564, Nov. 15, 2012]
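For illustration only, the elements required by paragraphs (b) and (c) above can be read as a checklist. The following minimal Python sketch encodes that checklist with hypothetical names (BeneficiaryAuthorization, missing_elements, and the notice labels are inventions for this example, not terms defined by the regulation):

```python
from dataclasses import dataclass, field
from typing import List, Optional

# Hypothetical field names; the regulation itself does not define a data format.
@dataclass
class BeneficiaryAuthorization:
    information_description: Optional[str] = None   # (b)(1) description of the information
    requesting_qio: Optional[str] = None            # (b)(2) QIO and point(s) of contact
    recipients: Optional[str] = None                # (b)(3) persons or class of persons
    purpose: Optional[str] = None                   # (b)(4) "at the request of the individual" suffices
    expiration: Optional[str] = None                # (b)(5) date or event; "none" allowed for research
    signature_and_date: Optional[str] = None        # (b)(6) beneficiary or representative signature
    signed_by_representative: bool = False
    representative_authority: Optional[str] = None  # required only if a representative signs
    notice_statements: List[str] = field(default_factory=list)  # (c)(1)-(4) notices

REQUIRED_NOTICES = {"right to revoke", "how to revoke / exceptions",
                    "conditioning of QIO activities", "potential redisclosure"}

def missing_elements(auth: BeneficiaryAuthorization) -> List[str]:
    """Return labels for elements of paragraphs (b) and (c) that appear to be absent."""
    checks = {
        "(b)(1) description of information": auth.information_description,
        "(b)(2) QIO making the request": auth.requesting_qio,
        "(b)(3) recipients of the disclosure": auth.recipients,
        "(b)(4) purpose of use or disclosure": auth.purpose,
        "(b)(5) expiration date or event": auth.expiration,
        "(b)(6) signature and date": auth.signature_and_date,
    }
    missing = [label for label, value in checks.items() if not value]
    if auth.signed_by_representative and not auth.representative_authority:
        missing.append("(b)(6) description of representative's authority")
    for notice in REQUIRED_NOTICES - set(auth.notice_statements):
        missing.append(f"(c) notice: {notice}")
    return missing
```

Under these assumptions, an intake workflow could treat any authorization for which missing_elements() returns a non-empty list as incomplete.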



§ 402.300 Request for reinstatement.

(a) An excluded person may submit a written request for reinstatement to the initiating agency no sooner than 120 days prior to the terminal date of exclusion as specified in the notice of exclusion. The written request for reinstatement must include documentation demonstrating that the person has met the standards set forth in § 402.302. Obtaining or reactivating a Medicare provider number (or equivalent) does not constitute reinstatement.

(b) Upon receipt of a written request for reinstatement, the initiating agency may require the person to furnish additional, specific information, and authorization to obtain information from private health insurers, peer review organizations, and others as necessary to determine whether reinstatement is granted.

(c) Failure to submit a written request for reinstatement or to furnish the required information or authorization results in the continuation of the exclusion, unless the exclusion has been in effect for 5 years. In this case, reinstatement is automatic.

(d) If a period of exclusion is reduced on appeal (regardless of whether further appeal is pending), the excluded person may request and apply for reinstatement within 120 days of the expiration of the reduced exclusion period. A written request for the reinstatement includes the same standards as noted in paragraph (b) of this section.
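As a minimal sketch (not part of the rule text), the 120-day window in paragraph (a) can be computed from the terminal date of exclusion stated in the notice; the date below is hypothetical:

```python
from datetime import date, timedelta

def earliest_reinstatement_request(terminal_date: date) -> date:
    """Earliest day a written reinstatement request may be submitted:
    no sooner than 120 days before the terminal date of exclusion (paragraph (a))."""
    return terminal_date - timedelta(days=120)

# Hypothetical example: an exclusion whose terminal date is December 31, 2025.
print(earliest_reinstatement_request(date(2025, 12, 31)))  # 2025-09-02
```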



Address correspondence to Elizabeth H. Bradley, Ph.D., Associate Professor, Department of Epidemiology and Public Health, Yale School of Medicine, 60 College Street, New Haven, CT 06520-8034. Melissa D. A. Carlson, M.P.H., Ph.D. Candidate, is with the Department of Epidemiology and Public Health, Yale School of Medicine, New Haven. William T. Gallo, Ph.D., Associate Research Scientist, is with the Department of Epidemiology and Public Health, Yale University School of Medicine Program on Aging, New Haven. Jeanne Scinto, Ph.D., M.P.H., is Director of Quality Improvement, Saint Raphael Health Care System, New Haven. Miriam K. Campbell, Ph.D., M.P.H., Epidemiologist, is with the U.S. Department of Health and Human Services, Centers for Medicare and Medicaid Services, Division of Quality Improvement, Boston. Harlan M. Krumholz, M.D., Professor, is with the Department of Internal Medicine, Yale University School of Medicine, New Haven.

Objective: To describe the perceived impact of the Centers for Medicare and Medicaid Services Quality Improvement Organizations (QIOs) on quality of care for patients hospitalized with acute myocardial infarction, in the context of new efforts to work more collaboratively with hospitals in the pursuit of quality improvement.

Data Sources: Primary data collected from a national random sample of 105 hospital quality management directors interviewed between January and July 2002.

Study Design: We interviewed quality management directors concerning their interactions with the QIO, the helpfulness of QIO interventions and the degree to which they helped or hindered their hospital quality efforts, and their recommendations for improving QIO effectiveness.

Principal Findings: More than 90% of hospitals reported that their QIO had initiated specific interventions, the most common being the provision of educational materials, benchmark data, and hospital performance data. Many respondents (60%) rated most QIO interventions as helpful or very helpful, although only one-quarter of respondents believed quality of care would have been worse without the QIO interventions. To increase QIO efficacy, respondents recommended that QIOs appeal more directly to senior administration, target physicians (not just hospital employees), and enhance the perceived validity and timeliness of data used in quality indicators.

Conclusions: Our study demonstrates that the QIOs have overcome, to some degree, the previously adversarial and punitive roles of Peer Review Organizations with hospitals. The generally positive view among most hospital quality improvement directors concerning the QIO interventions suggests that QIOs are potentially poised to take a leading role in promoting quality of care. However, the full potential of QIOs will likely not be realized until QIOs are able to engender greater engagement from senior hospital administration and physicians.

Keywords: quality, quality improvement organizations, peer review organizations, acute myocardial infarction

Quality Improvement Organizations (QIOs), formerly called Peer Review Organizations (PROs), are the primary means by which the Centers for Medicare and Medicaid Services (CMS) promote quality of care for Medicare's 40 million beneficiaries. Fifty-three QIOs contract with CMS in the United States (CMS 2003). Direct funding to QIOs for the current 3-year contract is approximately $735 million (Sprague 2002), or about $18 per Medicare beneficiary.
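The per-beneficiary figure follows directly from the two numbers just cited; a quick check (Python, for illustration only):

```python
# Rough check of the per-beneficiary figure cited above.
contract_funding = 735_000_000   # approximate dollars over the 3-year contract (Sprague 2002)
beneficiaries = 40_000_000       # approximate number of Medicare beneficiaries
print(contract_funding / beneficiaries)   # 18.375, i.e., roughly $18 per beneficiary
```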

Despite the expansion of the QIO mandate and the substantial resources devoted to this effort, there are few published evaluations of the QIO program. Meanwhile, although the quality of care delivered to Medicare beneficiaries has improved significantly over time for many indicators (Jencks and Wilensky 1992; Ellerbeck et al. 1995), the improvement has been uneven across states (Jencks et al. 2000) and hospitals (Bradley, Herrin et al. 2004), and many patients still do not receive guideline-based care (McGlynn and Brook 2001). In assessing the QIO program, studies of national or statewide changes in quality indicators over time are helpful; however, one aspect of evaluation that is missing from the literature is the hospitals' views on the effectiveness of the QIOs in improving quality of care at their institutions. Such information is vital in assessing the value of the QIO program and identifying opportunities for its improvement.

Accordingly, we interviewed hospital quality management directors to assess their views of the impact of the QIO on quality of care for patients hospitalized with acute myocardial infarction (AMI). AMI provides an excellent opportunity to assess QIO efforts because it is a common reason for admission, it has common indicators by which to measure quality of care, and it was one of the six clinical areas highlighted by CMS for quality improvement initiatives undertaken by the QIOs (Health Care Financing Administration Request for Proposal 1999).

The roots of the QIO program date back to the 1970s with the creation of the Professional Standards Review Organization (PSRO) program. PSROs were charged with performing utilization reviews and studies to improve the quality of care for federally subsidized insurance programs, primarily Medicare and Medicaid (Jost 1989). In 1982, as part of the Tax Equity and Fiscal Responsibility Act, Congress replaced the PSRO program with the Utilization and Quality Control PRO Program (Jost 1989; Institute of Medicine 1990), now known as the QIO Program.

The PROs initially took an “inspect and detect” approach to assessing the quality of services. PROs were to: (1) assemble and analyze data on services provided to Medicare beneficiaries and (2) intervene when services had been provided unnecessarily, inappropriately, or with inadequate quality (Jost 1989). Under their contracts with the Health Care Financing Administration (HCFA, now CMS), the PROs were required to randomly sample a percentage of hospital records for case review. This case review focused on seven indicators of deficient care (General Accounting Office 1988; Jost 1989; Institute of Medicine 1990): inadequate discharge planning, medical instability of the patient at discharge, deaths, nosocomial infection, unscheduled return to surgery, trauma suffered in the hospital, and medication or treatment changes within 24 hours of discharge without adequate observation (Institute of Medicine 1990). If a hospital had too many deficiencies within sampled records, the PROs could do an intensified (100%) review of records to estimate the magnitude of the deficiency (Jost 1989). Physicians with persistent quality problems faced harsh penalties, including payment denial and recommended exclusion from participation in the Medicare program (Jost 1989).

The “inspect and detect” approach was problematic for several reasons. Many physicians and hospitals perceived the review process to be flawed, a view supported by studies showing poor reliability of peer assessments of quality of care (Berry 1992; Goldman 1992) and inconsistent PRO processes across states (Institute of Medicine 1990; Kellie and Kelly 1991). In addition, HCFA was unable to document any improvement in the quality of care delivered to Medicare beneficiaries as a result of the PRO “inspect and detect” efforts (General Accounting Office 1988; Audet and Scott 1993; Ellerbeck et al. 1995; Weinmann 1998). The PRO processes of retrospective auditing and potential penalties led to an acrimonious relationship between PROs and hospitals and their physicians (Jencks and Wilensky 1992; Nash 1992; Audet and Scott 1993; Weinmann 1998). Rather than fostering an environment for proactive improvement of the quality of care for patients, the inspection approach pitted regulators against hospitals and physicians, and did little to advance quality improvement initiatives (Weinmann 1998).

In 1992, HCFA dramatically changed the role of PROs with the implementation of the Health Care Quality Improvement Initiative (HCQII) (Jencks and Wilensky 1992). The HCQII changed the PRO program's approach to data collection, its quality of care evaluation criteria, and its role in implementing quality initiatives. First, the approach to data collection shifted from individual case reviews to reporting patterns of care delivered to beneficiaries (Audet and Scott 1993). PROs were to examine practice patterns at the institutional, regional, and national level, rather than uncover individual physician lapses in quality for punitive purposes (Audet and Scott 1993). Second, PROs were to evaluate quality of care using national, disease-specific guidelines, rather than local criteria (Jencks and Wilensky 1992). Third, instead of merely collecting and reporting data, PROs were to work collaboratively with hospitals as partners in the development and implementation of hospital quality improvement initiatives (Jencks and Wilensky 1992). These changes, designed to transform the PRO from adversary to partner, represented a dramatic shift in vision and potential impact of the PRO system on hospital quality. The first initiative for the HCQII was the implementation of the Cooperative Cardiovascular Project, which began in 1992 as a four-state pilot project to develop quality indicators for the care of AMI patients and support hospitals in their development of quality improvement initiatives (Sprague 2002). Quality indicators were applied using retrospective review of the medical records of Medicare patients discharged with a principal diagnosis of AMI over a 9-month period in four pilot states (Ellerbeck et al. 1995). PROs also intervened in these four states by facilitating and encouraging the adoption of quality improvement initiatives in AMI care. A follow-up study found significant improvement in the quality indicators and survival rates after AMI in the pilot states, as compared with the nonpilot states, suggesting significant success of the interventions aided by the PROs (Marciniak et al. 1998).

During the next two contract cycles (1996–1999 and 1999–2002), PROs implemented AMI quality initiatives in hospitals nationwide, and, by 1999, had broadened their scope to include national quality of care projects in breast cancer, diabetes, heart failure, pneumonia, and stroke (Sprague 2002). During the period 1998/99 to 2000/01, more than half of the states improved on 20 of the 22 quality of care indicators for these medical conditions (Jencks, Huff, and Cuerdon 2003). Consistent with the new collaborative style and broadening scope of the PRO program, it was officially renamed the QIO Program in 2001 (CMS Federal Register 2002). While this name change was largely symbolic, it was indicative of the expanding role of QIOs in quality improvement initiatives.

QIOs are currently operating within the seventh contract cycle with CMS and have been charged with quality improvement initiatives in numerous clinical areas and across a variety of health care settings. However, despite the expansion in the QIOs' responsibilities, there has been no published assessment of whether or not hospitals believe QIO interventions are improving the quality of care. This information would be helpful to define the extent of the QIOs' current success in assisting hospitals to improve care and to direct future efforts of the QIOs.

We conducted a cross-sectional study of acute care hospitals selected randomly from all acute care hospitals operating in 2001 and listed in the CMS's Online Survey Certification and Reporting System (OSCAR). We randomly ordered all acute care hospitals listed in OSCAR and contacted each in succession until our target sample size of about 100 interviews was achieved. This target was reached after attempting contact with 127 hospitals. Of these 127 hospitals, three were eliminated because they reported that they did not care for patients with AMI (n=2) or had closed their inpatient facility (n=1), leaving 124 hospitals eligible for the study. Of these, two could not be contacted after seven attempts, and 17 refused to participate in the survey, resulting in a final sample of 105 (response rate=105/124, or 85%). The primary reason given for refusal was not having enough time. Nonrespondents did not differ in geographical location from respondents. Hospitals were recruited for the study with a letter from a CMS official stating the purpose of the survey, the expected length of the interview, and the type of person to be interviewed. We then telephoned the hospital to identify the person most involved in quality improvement efforts for AMI at the hospital, typically the Director of Quality Management. Interviews were completed during January 2002–July 2002.
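A minimal sketch of the sequential-contact sampling design just described, assuming a hypothetical list of OSCAR hospital identifiers; the function and helper names (recruit_sample, is_eligible, completes_interview) are stand-ins for the screening and consent steps, not the authors' actual procedure or code:

```python
import random

def recruit_sample(hospital_ids, is_eligible, completes_interview,
                   target_interviews=100, seed=2002):
    """Randomly order the sampling frame, then contact hospitals in succession
    until roughly the target number of completed interviews is reached."""
    rng = random.Random(seed)
    frame = list(hospital_ids)
    rng.shuffle(frame)                      # random ordering of the OSCAR frame

    attempted, ineligible, interviewed = [], [], []
    for hospital in frame:
        if len(interviewed) >= target_interviews:
            break
        attempted.append(hospital)
        if not is_eligible(hospital):       # e.g., no AMI care or closed inpatient unit
            ineligible.append(hospital)
        elif completes_interview(hospital): # refusals and unreachable hospitals fall through
            interviewed.append(hospital)

    eligible_contacted = len(attempted) - len(ineligible)
    response_rate = len(interviewed) / eligible_contacted if eligible_contacted else 0.0
    return interviewed, response_rate
```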

The survey instrument assessed (1) the reported level of contact between the hospital quality department and the QIO (number of contacts with the QIO about improving overall quality of care and about improving AMI care in particular in the last year; number of people at the QIO with whom the hospital had contact about quality of care at least twice per year; preference for more, the same, or less contact with the QIO); (2) the reported QIO interventions related to AMI (from a list of 10 possible interventions and activities) that had been conducted in the 3 years prior to the survey, the perceived helpfulness of each, and the overall helpfulness of the QIO in improving quality of AMI care; and (3) whether the QIO interventions had improved the quality of AMI care at the hospital in the last 3 years (whether the respondent thought AMI care would have been different without the QIO interventions and, if so, whether AMI quality would have been a little worse, a lot worse, a little better, or a lot better without them). We also asked an open-ended question at the end of the survey, “How might the QIO have done better to improve the quality of AMI care at your hospital?” in order to elicit hospitals' recommendations for improving the QIO's effectiveness at their hospital. We tape-recorded all interviews, and responses to the open-ended questions were transcribed verbatim by a professional, independent transcriptionist.

Several respondent and hospital characteristics were also obtained via survey. We assessed the quality management director's background and training (clinical versus nonclinical), tenure at the hospital, tenure in the current position, and to whom they reported (senior management level or lower). We obtained hospital characteristics, including number of operating beds, ownership type, whether the hospital was part of a multi-hospital system, whether the hospital was an academic medical center, whether the hospital performed percutaneous coronary interventions and/or coronary artery bypass graft operations, and the hospital's geographic region. Survey questions are available from the authors upon request.

We described the reported level of contact with QIOs, the prevalence and helpfulness of QIO interventions, and the perceived impact of the QIOs on quality of care using means and frequency analyses. We assessed the bivariate associations between the dependent variables and the characteristics of the respondents and of the hospitals using χ2 and t-test statistics as appropriate. We employed multivariable analyses (multiple regression and logistic regression) to explore adjusted relationships between the dependent and independent variables, although the relatively small sample size limited the statistical power of these analyses in this primarily descriptive paper. We analyzed hospital bed size both as a continuous variable and, in separate analyses, as an ordered categorical variable indicating <50, 50–99, 100–199, 200–399, 400–599, and ≥600 beds. Our unit of analysis was the hospital; hence we did not weight the analyses by number of beds; however, we explicitly tested associations between hospital size and reported QIO contact and reported QIO impact on quality of AMI care.
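A hedged sketch of the analyses named above, assuming a pandas DataFrame with hypothetical column names (beds, for_profit, n_contacts_ami, perceived_impact); the χ2 test, t-test, and logistic regression mirror the methods described in the text but are not the authors' code:

```python
import pandas as pd
from scipy import stats
import statsmodels.formula.api as smf

def analyze(df: pd.DataFrame):
    """Bivariate and multivariable analyses on hypothetical columns:
    'beds' (int), 'for_profit' (0/1), 'n_contacts_ami' (int),
    'perceived_impact' (0/1: AMI care would have been worse without the QIO)."""
    df = df.copy()

    # Ordered bed-size categories used in the separate categorical analyses described above.
    df["bed_cat"] = pd.cut(df["beds"],
                           bins=[0, 49, 99, 199, 399, 599, float("inf")],
                           labels=["<50", "50-99", "100-199", "200-399", "400-599", ">=600"])

    # Bivariate tests, chosen to match the variable types.
    _, t_p = stats.ttest_ind(df.loc[df["for_profit"] == 1, "n_contacts_ami"],
                             df.loc[df["for_profit"] == 0, "n_contacts_ami"],
                             equal_var=False)
    chi2, chi_p, _, _ = stats.chi2_contingency(pd.crosstab(df["bed_cat"], df["perceived_impact"]))

    # Multivariable logistic regression for the binary impact outcome.
    logit = smf.logit("perceived_impact ~ beds + for_profit", data=df).fit(disp=False)
    return {"t_test_p": t_p, "chi2_p": chi_p, "logit_params": logit.params}
```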

We analyzed open-ended questions using the constant comparative method of qualitative data analysis (Strauss and Corbin 1998). We developed a code sheet “from the ground up” (Patton 2000) reflecting conceptual domains described by respondents. Using line-by-line review of the transcribed, open-ended data, we coded statements as illustrating distinct domains, each reflecting ways in which the QIO might have been more helpful. We used NUD-IST 4 to organize, code, and analyze open-ended data.

Table 1 displays sample hospital characteristics. Reflecting the profile of hospitals nationally, the majority of hospitals surveyed had fewer than 200 beds (68%) and were nonprofit (84%). Most of the respondents were trained as nurses (74%). The mean tenure of the respondents was 12.8 years at the hospital and 5.8 years in their current position.

Description of Hospital Sample (N=105)

                                                  n        % (n/N)
Hospital size
  <50 beds                                       23        22
  50–99 beds                                     22        21
  100–199 beds                                   26        25
  200–399 beds                                   23        22
  400–599 beds                                    5         5
  ≥600 beds                                       6         6
Reported ownership
  Nonprofit                                      88        84
  For-profit or government                       17        16
Part of multi-hospital system in last 3 years
  Yes                                            42        40
  No                                             63        60
Academic medical center (missing=1)
  Yes                                            13        13
  No                                             91        87
Geographic region
  Northeast                                      26        25
  South                                          34        32
  Midwest                                        29        28
  West                                           16        15
Cardiac services
  Performs PCI                                   28        27
  Performs CABG                                  26        25

The reported contact between the hospital and the QIO concerning quality of care in general, and quality of AMI care in particular, is shown in Table 2. The mean and median number of contacts regarding quality of AMI care were both 3 (standard deviation=4); however, respondents in 24 hospitals reported that they had no contact with the QIO about AMI quality, and about one-fifth (21%) could not name anyone at the QIO who was involved with quality efforts in AMI care. While the majority (73%) of quality management directors reported that the amount of contact with the QIO was appropriate, 27% preferred more contact with the QIO regarding improving quality of AMI care; none preferred less contact. Of the 24 respondents who indicated no contact with the QIO regarding quality of AMI care, 42% (n=10) reported that they would prefer more contact with the QIO, while 50% (n=12) preferred no contact.

Extent of Contact with Quality Improvement Organizations (QIOs) (N=105)

                                                            Mean (SD)    Median    Range
How many times in the last year did you have contact, including e-mail contact:
  With QIO about quality of care?                           10 (11)      15        (0, 40)
  With QIO about quality of AMI care?                        3 (4)        3        (0, 21)

                                                            n            %
Are you able to name anyone at the QIO involved with quality efforts in AMI care?
  Yes                                                       81           79
  No                                                        21           21
  Missing                                                    3
Preferred interaction with QIO regarding quality of AMI care:
  More contact preferred                                    28           27
  About as much contact as you have now                     74           73
  Less contact preferred                                     0            0
  Missing                                                    3
Are there additional ways in which you would like to interact with the QIO concerning quality of AMI care?
  Yes                                                       28           30
  No                                                        64           70
  Missing                                                   13

Respondents from for-profit hospitals, compared with those from nonprofit or government-owned hospitals, reported having more frequent contact with QIOs regarding quality of AMI care (mean contacts per year=6 and 3, respectively; t-test p-value=.04). In addition, having shorter tenure in one's current position was associated with having more contact with the QIO regarding quality of AMI care (p-value=.04). The background and training of the respondent, his/her tenure as a hospital employee, and whether he/she reported to senior management were not significantly associated with reported or desired frequency of contact with the QIO in either bivariate or multivariable analyses. Similarly, the number of contacts with the QIO did not differ by any hospital characteristic measured, including number of beds, geographical region, being an academic medical center, or presence of cardiac facilities (p-values >.10).

In describing the contact that respondents had with QIOs, several respondents noted that the QIO staff had been particularly approachable, highlighting the importance of the interpersonal relationships quality management staff built with their QIO representatives. For example, respondents reported:

When [the QIO representative] visits, she's nonjudgmental and she really helps us to see different ways to look at charting, and improves our education. … Those ladies [from the QIO] are highly professional and extremely knowledgeable; they have the appropriate skills and skill levels to assess and evaluate and give us highly credible recommendations.

Quality management directors reported diverse interventions and activities performed by the QIO related to improving AMI care quality. Table 3 displays the reported prevalence of each of the 10 interventions or activities queried in the survey and the perceived helpfulness of each intervention. The most commonly reported QIO interventions were the provision of educational materials, benchmark data, and hospital performance data. These were each reported by more than 75% of the respondents. More than 70% of the respondents reported that the QIOs conducted educational programs, and more than half reported that QIOs collected, not merely disseminated, performance data. Fewer reported that the QIOs helped develop quality indicators (40/105 or 38.1%), trained staff in quality improvement methods (36/105 or 34.3%), conducted corrective action (13/105 or 12.4%), participated in quality improvement teams (4/105 or 3.8%), or participated on other hospital committees (1/105 or 1.0%).

Quality Improvement Organization (QIO) Interventions and Their Perceived Helpfulness in Improving Acute Myocardial Infarction (AMI) Care

Intervention                            n/105    Not Very Helpful (%)    In the Middle (%)    Very Helpful (%)    Mean Rating
Provided educational materials             99                       7                   28                  65            3.8
Provided benchmark data                    93                       5                   24                  71            3.9
Provided hospital performance data         82                      10                   29                  61            3.8
Conducted educational programs             75                       8                   32                  60            3.6
Collected performance data                 61                      10                   23                  67            3.8
Helped develop quality indicators          40                       2                   13                  85            4.2
Trained staff in QI methods                36                       9                   34                  57            3.7
Conducted corrective action                13                       8                   38                  54            3.7
Participated in QI teams                    4                      25                    0                  75            3.8
Participated on other committees            1                     100                    0                   0              –

Overall, how helpful has the QIO been in improving AMI care at your hospital?
                                        n/105     %
  Not at all helpful=1                     11    11
  2                                        17    16
  3                                        45    43
  4                                        23    22
  Extremely helpful=5                       8     8
  Missing                                   1

The majority of respondents found each of the QIO interventions, except one, to be “very helpful” in improving AMI care at their hospitals. Activities that were reported to be “very helpful” by the greatest percentage of respondents were: the development of quality indicators (85%), participation in quality improvement (QI) teams (75%), and provision of benchmark data (71%). Illustrating the particular usefulness of providing benchmark data, respondents stated:

During the last three years the way that [the QIO] has been most useful to us is by providing data and benchmarking. We've been able to take that to medical section meetings and to leadership, and it has underscored the importance of protocols, especially medication and process protocols for AMI patients. … By providing benchmark data, [the QIO] tells us how we compare to other institutions in the state and the region. It's very helpful to compare against what's going on in the community. It offers a forum for people to share ideas and strategies that people have implemented in their facilities to build upon successes.

Table 4 displays the reported impact of the QIOs on hospital performance in AMI care. About one-third of the respondents (n=32 of the 99 respondents to this question, or 32%) believed that their hospital's performance in AMI care would have been different if the QIO had not intervened. Of these 32 respondents, the vast majority (n=23 of the 32 respondents, or 72%) believed that AMI care would have been a little worse without the QIO interventions. An additional four respondents (13% of the 32 who reported the QIO affected AMI care) thought AMI care would have been a lot worse without the QIO interventions. Perceptions about whether AMI quality of care would have been different without the QIO were not significantly correlated with any of the respondent or hospital characteristics measured.

Perceived Impact of Quality Improvement Organizations (QIO) on Acute Myocardial Infarction (AMI) Care at Respondent Hospitals

                                        n     %
Do you think performance in AMI care would be different at your hospital if the QIO had not intervened? (n=105)
  Yes                                  32    32
  No                                   39    39
  Don't know                           28    28
  Missing                               6
If the QIO had not intervened, how do you think performance in AMI care would have been? Would it have been: (n=32)
  A lot worse                           4    13
  A little worse                       23    72
  About the same                        0     0
  A little better                       1     3
  A lot better                          0     0
  Don't know                            4    13

Responses to the open-ended questions illustrated the ways in which quality management directors believed the QIO interventions positively influenced the quality of AMI care at their hospitals. For instance, several described that the QIOs provided the clout to motivate quality improvement and indicated that the QIO had given them credibility in pursuing quality improvement efforts in the hospital. For example, one respondent stated:

They [QIOs] have a big hammer and our doctors respect their hammer. So they do what they're supposed to do. They sometimes don't do things until they are hit over the head with it. … I think [the QIO] has put more credibility into how important this is for our physicians because the QIO is respected. The word “QIO” still does strike some fear. For those [hospital staff] who maybe wouldn't have been as concerned or as involved, [having] the CMS backing really did add to the importance of it. … I think that just generally, as in anything, the idea of someone collecting data and presenting it to you—certainly the first thing [is] the Hawthorne effect; somebody is looking. So I think they served as a conscience and provoked us to do things that we normally wouldn't do.

In contrast, almost 40% of hospital quality management directors (39 of the 99 respondents to this question) reported that the QIO had not affected quality of AMI care either positively or negatively (Table 4). Open-ended responses helped explain why this might be the case. Several respondents suggested that the QIO had not influenced quality of care because the hospital had already been conducting quality improvement efforts in AMI care long before the QIO became involved. The following quotation illustrates this point:

We, as a hospital corporation, were on board with this before the QIO came around. It's something that we've been working on as a corporation for some time, and it wasn't that [the QIO] wasn't helpful; it's just we were on board with this before they were. … A lot of it we could work on, we did work on, but we were doing it before the QIO even pointed anything out.

Additionally, some quality management directors believed it was not possible for QIOs to improve quality regardless of their interventions. These respondents viewed the role of the QIO as one of data collection and reporting only. For example, one respondent remarked:

I think the QIOs are helpful to gather information, but I don't think [they have] one single thing to do with improving care. [It's] kind of philosophical. … I see their role as a passive one. Actually I think we improve the care; I don't think the QIO improves the care.

Recommendation #1: Improve Validity of the Data

Respondents related that QIO data were not always perceived to be valid, and some described substantive problems with data quality. Quality management directors noted that, in order to be engaged in quality improvement efforts, physicians had to believe that the data used to measure and report quality were valid; without this credibility, the QIO data would not change physician behavior and, in fact, could hinder improvement efforts in the hospital. For instance, respondents described:

There were errors in the [QIO] data because of the definitions that were used, so the data were not accurate. Our medical staff just went into a tailspin; it took a long time for them to trust the whole program, not just our QIO. The [medical staff] developed some trust again with the QIO, but … it took a long time. … Some of the things that we have to put a “no” for make it look like we missed the criteria, when actually we are working with our tertiary centers a lot of the times on those patients that we are transferring. Sometimes they don't want the aspirin or they don't want the clot buster type thing … sometimes those particular numbers are misleading.

Recommendation #2: Improve Timeliness of Data

In addition to the perceived validity of the data, quality management directors highlighted the importance of making sure that the data were timely. They indicated that the data were not meaningful unless they reflected recent cases; data that were a year or several months old were perceived as not helpful. Respondents stated:

The hardest part, even though QIO has been very helpful, is getting physicians to buy in to the data being relevant and useful. … I think the one limitation is the data tend to come back a little slowly. We like what the QIO does, but we are looking for a little more timeliness in the turnaround. I mean, they'll collect data now, and it will be several months before we see it again. So we will make three or four changes in the meantime, and it's hard to correlate our current performance with what it was.

Recommendation #3: Appeal to Physicians Directly

Appealing more directly to physicians was viewed by many respondents as a crucial next step for QIOs to enhance their impact on quality of care. Respondents reported that QIO interventions were too often targeted at hospital quality management staff rather than at physicians, who were most responsible for the quality of AMI care. The following quotation reflects this theme:

It would be nice to have more involvement [from the QIO] on a personal level with our physicians. … The only other thing [the QIO] might have done better was if they had spoken directly to the physicians themselves. … Go to the physicians directly. We can monitor all those indicators, but it's in the physicians' power. If they don't prescribe the aspirin at discharge it's not the hospital's [fault]. We monitor that, but there are things that the hospital can't control.

Recommendation #4: Appeal to Senior Administration More Effectively

Similarly, quality management directors expressed their wish that the QIOs might have had greater impact at the CEO (chief executive officer) and senior management levels of hospitals. Illustrating this concern, one respondent said:

I think they need to maybe work more with upper administration. My biggest struggle was I didn't have really good physician buy-in. I don't think they understood the importance of the project. So if they could do better with hitting the CEOs on how important this is, it could trickle down and do better.

Quality management directors were generally positive about the role of the QIO in supporting their quality improvement efforts. The most commonly reported interventions included provision of educational materials, benchmark data, and hospital performance data, and many hospitals also reported that QIOs helped develop quality indicators for the hospital. Each of these interventions was rated as helpful or very helpful by more than 60% of hospital quality management directors. About one-quarter of hospitals reported that quality of AMI care would have been worse without the QIO interventions. On the other hand, many hospitals did not think the care would have been any different without the QIO efforts, or were undecided about the impact of the QIO on quality of AMI care. Many believed the hospital was taking steps to improve AMI care prior to the QIO initiatives; therefore, while they viewed the QIO as helpful in supporting improvement efforts, they did not view the QIO as the genesis of such efforts.

During a time when the PRO system was consciously shifting from an “inspect and detect” approach to a proactive, quality improvement approach for promoting hospital quality, our findings identify successes of this new approach and also illuminate the challenges ahead. The QIOs seem to have made a successful transition from adversary to partner with hospitals in promoting quality. This is evidenced by the frequent and positive contacts between the QIO and the hospital quality management staff, the many QIO interventions, and the prevalent view by hospitals that these interventions are helpful.

At the same time, quality management directors identified several opportunities for QIOs to strengthen their influence on quality of care in hospitals. Respondents emphasized enhancing the perceived validity of the data, rather than the objective validity, which was not criticized. The importance of how the data are perceived is consistent with previous studies of data feedback and its effectiveness in changing clinical practice (Thomson O'Brien et al. 2000; Bradley, Holmboe et al. 2004). Many quality management directors believed that the QIOs could be more effective in getting the attention of physicians and senior management. Interventions were generally described as being pitched at quality management and nursing staff, who then found it difficult to enlist the support of physicians or senior management. Clinical leadership and organizational support are key ingredients to successful performance improvement (Lomas et al. 1991; Soumerai et al. 1998; Bradley, Holmboe et al. 2001; Holmboe et al. 2003) and QIOs could be more effective in engaging physicians and senior management in such efforts.

The respondents in this study reflected a national sample of quality management directors from diverse geographical regions and from acute care hospitals that differed in size, ownership type, and teaching status. However, the modest sample size limited our statistical power to detect differences by geographical region and by hospital or respondent characteristics. We interviewed quality management directors because they are typically the most informed about the influence of QIO interventions on hospital care; however, we were unable to obtain the views of physicians and senior managers regarding the influence of QIOs, and their views may have differed from the perceptions of the quality managers. For instance, medical staff members may have been less positive than quality management staff about the impact of the QIOs on hospital quality; however, we cannot determine this from the present study. Further, we did not assess the QIOs' perceptions of their own role in hospitals' improvement efforts. The opinions of hospital staff are only one part of a comprehensive evaluation of QIO activity. These findings can, however, reveal a previously unstudied aspect of QIO performance: their perceived influence on hospital quality of care as seen by those with whom QIOs most commonly interact in the hospital setting.

Our study demonstrates that the QIOs have overcome to some degree the previously adversarial and punitive roles of PROs with hospitals. The generally positive view among most hospital quality management directors concerning the QIO interventions suggests that QIOs are potentially poised to take a leading role in promoting quality of care. However, to fulfill this leadership role effectively, QIOs will need to find ways to integrate physicians and hospital senior management more fully into QIO quality improvement initiatives. With greater involvement of these groups, QIO efforts can augment existing hospital efforts to improve quality and can expand their ability to promote positive change in health care delivery.

Qualidigm prepared this material under contract number 500-99-CT02 modification #7 with the Centers for Medicare and Medicaid Services. The contents presented do not necessarily reflect CMS policy. Pub. # 6SOW-(AMICASPRO, POPS-2003). This work was also supported in part by a grant from the Claude D. Pepper Older Americans Independence Center at Yale (#P30AG21342).

  • Audet AM, Scott HD. “The Uniform Clinical Data Set: An Evaluation of the Proposed National Database for Medicare's Quality Review Program.” Annals of Internal Medicine. 1993;119(12):1209–13.
  • Berry CC. “The κ Statistic.” Journal of the American Medical Association. 1992;268(18):2513.
  • Bradley EH, Herrin J, Mattera JA, Holmboe E, Wang Y, Frederick P, Roumanis SA, Radford MJ, Krumholz HM. “Hospital-Level Performance Improvement: Beta-Blocker Use after Acute Myocardial Infarction.” Medical Care. 2004;42:591–9.
  • Bradley EH, Holmboe E, Mattera J, Roumanis S, Radford MJ, Krumholz HM. “Using Data Feedback to Change Physician Behavior: The Case of Beta-Blocker Use after Myocardial Infarction.” Quality and Safety in Health Care. 2004;13(1):26–31.
  • Bradley EH, Holmboe E, Mattera JA, Roumanis SA, Radford MJ, Krumholz HM. “A Qualitative Study of Increasing Beta-Blocker Use after Myocardial Infarction: Why Do Some Hospitals Succeed?” Journal of the American Medical Association. 2001;285(20):2604–11.
  • Centers for Medicare and Medicaid Services. Federal Register, May 24, 2002, Volume 67, Number 101, Rules and Regulations, Pages 36539–36540 [accessed on July 24, 2003]. Available at http://www.gpoaccess.gov.
  • Centers for Medicare and Medicaid Services. “Quality Improvement Organizations” [accessed on July 2, 2003]. Available at http://www.cms.hhs.gov/qio/
  • Ellerbeck EF, Jencks SF, Radford MJ, Kresowik TF, Craig AS, Gold JA, Krumholz HM, Vogel RA. “Quality of Care for Medicare Patients with Acute Myocardial Infarction: A Four-State Pilot Study from the Cooperative Cardiovascular Project.” Journal of the American Medical Association. 1995;273(19):1509–14.
  • General Accounting Office. Medicare PROs: Extreme Variation in Organizational Structure and Activities. Washington, DC: General Accounting Office; 1988. Publication no. PEMD-89-7FS.
  • Goldman RL. “The Reliability of Peer Assessments of Quality of Care.” Journal of the American Medical Association. 1992;267(7):958–60.
  • Health Care Financing Administration. Request for Proposal, RFP No. HCFA-99-001/ELH, March 1, 1999.
  • Holmboe E, Bradley EH, Mattera JA, Roumanis SA, Radford MJ, Krumholz HM. “Characteristics of Physician Leaders Working to Improve the Quality of Care in Acute Myocardial Infarction.” Joint Commission Journal on Quality and Safety. 2003;29:289–96.
  • Institute of Medicine. Medicare: A Strategy for Quality Assurance. Lohr KN, editor. Washington, DC: National Academy Press; 1990.
  • Jencks SF, Cuerdon T, Burwen DR, Fleming PM, Houck AE, Kussmaul DS, Nilasena DL, Ordin B, Arday DR. “Quality of Medical Care Delivered to Medicare Beneficiaries: A Profile at State and National Levels.” Journal of the American Medical Association. 2000;284(13):1670–6.
  • Jencks SF, Huff ED, Cuerdon T. “Change in the Quality of Care Delivered to Medicare Beneficiaries, 1998–1999 to 2000–2001.” Journal of the American Medical Association. 2003;289(3):305–12.
  • Jencks SF, Wilensky GR. “The Health Care Quality Improvement Initiative. A New Approach to Quality Assurance in Medicare.” Journal of the American Medical Association. 1992;268(7):900–3.
  • Jost TS. “Medicare Peer Review Organizations.” Quality Assurance in Health Care. 1989;1(4):235–48.
  • Kellie SE, Kelly JT. “Medicare Peer Review Organization Preprocedure Review Criteria. An Analysis of Criteria for Three Procedures.” Journal of the American Medical Association. 1991;265(10):1265–70.
  • Lomas J, Enkin M, Anderson GM, Hannah WJ, Vayda E, Singer J. “Opinion Leaders vs Audit and Feedback to Implement Practice Guidelines. Delivery after Previous Cesarean Section.” Journal of the American Medical Association. 1991;265(17):2202–7.
  • Marciniak TA, Ellerbeck EF, Radford MJ, Kresowik TF, Gold JA, Krumholz HM, Kiefe CI, Allman RM, Vogel RA, Jencks SF. “Improving the Quality of Care for Medicare Patients with Acute Myocardial Infarction: Results from the Cooperative Cardiovascular Project.” Journal of the American Medical Association. 1998;279(17):1351–7.
  • McGlynn EA, Brook RH. “Keeping Quality on the Policy Agenda.” Health Affairs. 2001;20(3):82–90.
  • Nash DB. “Is the Quality Cart before the Horse?” Journal of the American Medical Association. 1992;268(7):917–18.
  • Patton M. Qualitative Research and Evaluation Methods. 3d ed. Thousand Oaks, CA: Sage Publications; 2000.
  • Soumerai SB, McLaughlin TJ, Gurwitz JH, Guadagnoli E, Hauptman PJ, Borbas C, Morris N, McLaughlin B, Gao X, Willison DJ, Asinger R, Gobel F. “Effect of Local Medical Opinion Leaders on Quality of Care for Acute Myocardial Infarction: A Randomized Controlled Trial.” Journal of the American Medical Association. 1998;279(17):1358–63.
  • Sprague L. “Contracting for Quality: Medicare's Quality Improvement Organizations.” National Health Policy Forum Issue Brief, no. 774, June 3, 2002.
  • Strauss A, Corbin J. Basics of Qualitative Research: Techniques and Procedures for Developing Grounded Theory. 2d ed. Thousand Oaks, CA: Sage Publications; 1998.
  • Thomson O'Brien MA, Oxman AD, Davis DA, Haynes RB, Freemantle N, Harvey EL. “Audit and Feedback: Effects on Professional Practice and Health Care Outcomes.” Cochrane Database of Systematic Reviews. 2000;(2):CD000259.
  • Weinmann C. “Quality Improvement in Health Care: A Brief History of the Medicare Peer Review Organization (PRO) Initiative.” Evaluation and the Health Professions. 1998;21(4):413–8.