A new study from the University of Minnesota supports what MN Community Measurement has believed since our inception: that measuring and publicly reporting health care quality information leads to higher-quality care for patients.
“Our organization was founded on the notion that public reporting drives improvement,” said MNCM President Jim Chase. “While there is considerable anecdotal and circumstantial evidence to support it, this study provides peer-reviewed analysis that illustrates the critical role that measurement and public reporting have in improving the health of Minnesotans.”
The study, published in the March 2015 issue of the International Journal of Health Economics and Management, evaluated results from MNCM’s Optimal Diabetes Care measure for 617 clinics whose data were publicly reported between 2007 and 2012. Collection and reporting of health care quality metrics by clinics was voluntary until 2010, when Minnesota’s Statewide Quality Reporting and Measurement System was created and reporting became mandatory. Results from 2011 onward reflect mandatory reporting.
Researchers evaluated clinical quality improvement over time, convergence among reporting clinics, and the persistence of provider quality from year to year.
While all clinics that publicly reported results showed “substantial quality improvement” between 2008 and 2012, clinics that reported for the longest periods of time improved the most. For example, the study’s earliest-reporting clinics saw their quality scores rise over the five-year period from an average of 69 percent to 82 percent of patients with diabetes achieving treatment goals. Researchers noted that clinics that began reporting later followed the same general trajectory of improvement, but the longest-reporting clinics “always maintain higher quality levels.”
Additionally, the study noted that newly reporting clinics always began with poorer average results than clinics that had reported quality metrics previously. As a result, “consumers can infer, on average, that non-reporting clinics are likely to have poorer quality than reporting clinics,” according to the study.
Researchers also found that all reporting clinics tended to converge toward a common high level of quality over time. This is “primarily because lower-performing entrants increase their quality scores, not because longer reporting cohorts have declining scores in later years,” the researchers explained.
Finally, the study demonstrated that provider quality rankings change little from year to year. This is important because most publicly reported metrics reflect the previous year’s patient outcomes. Since quality tends to persist across time, consumers can generally assume that a clinic with a high quality ranking based on last year’s patients is still providing high-quality care this year. Thus, “publicly-reported measures can inform consumers in choice of clinics, even though they represent measured quality for a previous time period.”
MNCM publicly reports the results of more than 70 cost, quality and patient experience measures on more than 1,400 clinics, 535 medical groups and 140 hospitals in Minnesota and neighboring communities on our consumer-focused website MNHealthScores.org.