6 October 2009

Associations Between Structural Capabilities of Primary Care Practices and Performance on Selected Quality Measures

Publication: Annals of Internal Medicine
Volume 151, Number 7

Abstract

Background:

Recent proposals to reform primary care have encouraged physician practices to adopt such structural capabilities as performance feedback and electronic health records. Whether practices with these capabilities have higher performance on measures of primary care quality is unknown.

Objective:

To measure associations between structural capabilities of primary care practices and performance on commonly used quality measures.

Design:

Cross-sectional analysis.

Setting:

Massachusetts.

Participants:

412 primary care practices.

Measurements:

During 2007, 1 physician from each participating primary care practice (median size, 4 physicians) was surveyed about structural capabilities of the practice (responses representing 308 practices were obtained). Data on practice structural capabilities were linked to multipayer performance data on 13 Healthcare Effectiveness Data and Information Set (HEDIS) process measures in 4 clinical areas: screening, diabetes, depression, and overuse.

Results:

Frequently used multifunctional electronic health records were associated with higher performance on 5 HEDIS measures (3 in screening and 2 in diabetes), with statistically significant differences in performance ranging from 3.1 to 7.6 percentage points. Frequent meetings to discuss quality were associated with higher performance on 3 measures of diabetes care (differences ranging from 2.3 to 3.1 percentage points). Physician awareness of patient experience ratings was associated with higher performance on screening for breast cancer and cervical cancer (1.9 and 2.2 percentage points, respectively). No other structural capabilities were associated with performance on more than 1 measure. No capabilities were associated with performance on depression care or overuse.

Limitation:

Structural capabilities of primary care practices were assessed by physician survey.

Conclusion:

Among the investigated structural capabilities of primary care practices, electronic health records were associated with higher performance across multiple HEDIS measures. Overall, the modest magnitude and limited number of associations between structural capabilities and clinical performance suggest the importance of continuing to measure the processes and outcomes of care for patients.

Primary Funding Source:

The Commonwealth Fund.

Context

The patient-centered medical home has garnered attention as a way to improve the quality of primary care. To be designated as a patient-centered medical home, practices need to have certain structural features, such as electronic health records, enhanced access, and ongoing performance feedback.

Contribution

This study of 412 practices examined associations between structural features and quality measures related to diabetes, depression, prevention, and overuse of selected tests and treatments. Use of more sophisticated electronic health records was associated with better screening and diabetes care, but other associations between structural capabilities and quality were weaker or absent.

Implication

Structural features of practices do not necessarily translate into better care.
—The Editors
Influential studies (1) have documented problems with the quality of health care in the United States, including gaps in the delivery of preventive and chronic disease care (2, 3). However, solutions to quality problems have been less clear (4). To spur quality improvement, primary care practices are being encouraged to invest in structural capabilities, such as physician performance feedback, reminders, language interpreter services, expanded practice hours, and electronic health records (5–10). These and other structural capabilities have recently been used to form the standards that will qualify practices for enhanced payments in medical home demonstration projects (11–14). In addition, recent federal legislation has created new incentives for physician practices to adopt electronic health records by 2014 (15).
Whether practices adopting these structural capabilities will provide higher-quality care is unclear. For example, the association between electronic health records and performance on quality measures has been inconsistent (2, 16, 17). Other structural capabilities have been evaluated in large groups with many physicians, but most primary care physicians deliver care in practices with fewer physicians (2, 18, 19). Identifying key relationships between practices' structural capabilities and their performance on measures of clinical quality could help to prioritize the structural capabilities encouraged by quality improvement efforts. By using data from a novel statewide multipayer performance reporting system, we sought to determine whether primary care practices with selected structural capabilities have higher performance on widely used clinical quality measures than practices lacking these capabilities.

Methods

We obtained performance data on Healthcare Effectiveness Data and Information Set (HEDIS) measures of primary care quality from Massachusetts Health Quality Partners (MHQP). In addition, we administered a survey to assess 13 structural capabilities of Massachusetts primary care practices. Combining these 2 data sources at the practice level, we tested the hypothesis that use of each structural capability would be associated with higher HEDIS measure performance.

Survey Sample of Physician Practices

We defined a practice to include at least 2 physicians as well as nurses and other staff providing care at a single address. Massachusetts Health Quality Partners maintains a statewide physician directory that includes a roster of physicians at each practice and, if applicable, the practice's affiliation with 1 of the 9 large physician networks in the state (20). These networks contract with payers on behalf of their affiliated practices and may also provide management and infrastructure resources to these practices. Network and practice leaders review the MHQP directory regularly, updating it as practice affiliations evolve.
Massachusetts Health Quality Partners aggregates performance measure data from the 5 largest commercial health plans in Massachusetts, which cover nearly 4 million enrollees and contract with more than 90% of the state's primary care physicians. Each physician contributing HEDIS measure observations is designated as a primary care physician by at least 1 of the health plans. We defined practice size as the number of physicians at each practice who contributed at least 1 observation on a HEDIS measure reflecting a clinical service that a patient aged 18 years or older was eligible to receive during 2005.
The 2005 MHQP physician directory (the most recent available during construction of the study sample) included 729 practices with at least 2 physicians. We excluded 220 practices that had no HEDIS measure with 20 or more denominator observations. We excluded 19 practices providing only pediatric care, which we confirmed by telephone calls.
We chose to survey practicing physicians because they are best situated to report on capabilities that directly affect their delivery of care, and research suggests that this approach can be used to measure important aspects of physician practices (21). To increase the likelihood that a survey respondent would accurately describe the practice of interest, we took 2 precautions (22). First, we excluded physicians providing care in more than 1 practice (6.6% of all physicians), which eliminated 13 practices for which there was no physician exclusive to the practice. Second, because physicians who infrequently deliver primary care may be unreliable reporters of structural capabilities (for example, whether a practice has meetings on quality), we developed an empirical algorithm to identify and exclude physicians providing clinical care 1 or fewer days per week. By using a validation set of approximately 40 physicians whose volume of visits was known to us, we calculated the total HEDIS denominator observations for each physician (across all measures) and identified the threshold of HEDIS denominator observations that adequately discriminated low-volume primary care clinicians from others. On the basis of this threshold, we excluded 62 practices consisting only of clinicians practicing primary care 1 or fewer days per week, yielding the study cohort of 415 practices.
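To illustrate this exclusion step, a minimal sketch of the threshold search follows, assuming a validation set of physicians with known low-volume status and totals of HEDIS denominator observations. The discrimination criterion is not specified in the text, so plain classification accuracy is used here purely for illustration.

```python
# Hypothetical sketch of the low-volume threshold search described above.
# Input: (total HEDIS denominator observations, known low-volume status)
# for each physician in the validation set. The discrimination criterion
# used in the study is not stated; simple accuracy is assumed here.
from typing import List, Tuple

def pick_denominator_threshold(validation: List[Tuple[int, bool]]) -> int:
    candidates = sorted({count for count, _ in validation})
    best_threshold, best_accuracy = 0, -1.0
    for threshold in candidates:
        # Classify a physician as low-volume if total denominator
        # observations fall at or below the candidate threshold.
        correct = sum((count <= threshold) == is_low for count, is_low in validation)
        accuracy = correct / len(validation)
        if accuracy > best_accuracy:
            best_threshold, best_accuracy = threshold, accuracy
    return best_threshold
```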
We then sampled 1 physician at random from each practice and confirmed each sampled physician's practice address against data from the Massachusetts Board of Registration in Medicine. We resolved discrepancies between the MHQP physician directory and Board of Registration addresses by calling physicians' offices. Of the 13 sampled physicians no longer at the practice of interest, 10 were replaced by an alternate physician selected at random from the same practice. We excluded 3 practices with no eligible alternate physician. The final study sample consisted of 412 sampled physicians, each verified as representing the intended practice.

Clinical Performance Data

We used performance data on all 13 available HEDIS process measures of adult primary care quality in 4 clinical areas: screening (for breast cancer, cervical cancer, colorectal cancer, and chlamydia among women aged 21 to 26 years), diabetes (hemoglobin A1c testing, eye examinations, cholesterol testing, nephropathy monitoring), depression (acute-phase contacts, acute-phase treatment, continuation-phase treatment), and avoidance of overuse (use of imaging studies for low back pain, avoidance of antibiotic treatment in adults with acute bronchitis). Appendix Table 1 lists measure definitions. These performance data reflected care delivered during 2007.
Appendix Table 1. HEDIS Service Definitions
Each of the 5 participating health plans reported performance data aggregated at the physician level for each HEDIS measure. These performance data included a denominator (the number of the physician's patients eligible for inclusion in the measure) and a numerator (the number of eligible patients who received the care specified by the measure). To create practice-level performance results for each HEDIS measure, we separately summed the numerators and denominators across all the physicians within a practice and divided the aggregate numerator by the aggregate denominator. On each measure, a higher score reflected higher measured quality. Because the database lacked unique patient identifiers, we could not identify how often a single patient generated multiple HEDIS observations (for example, a patient with diabetes eligible for inclusion in all 4 diabetes measures). However, all of the denominator observations for a given patient were assigned to the primary care physician for the patient, as identified by the health plan reporting data on the patient's care.
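For illustration, a minimal sketch of this aggregation step follows; field names such as practice_id and numerator are hypothetical and not taken from the study database.

```python
# Minimal sketch of practice-level aggregation: sum physician-level numerators
# and denominators within each practice for each HEDIS measure, then divide
# the aggregate numerator by the aggregate denominator.
from collections import defaultdict

def practice_level_rates(rows):
    totals = defaultdict(lambda: [0, 0])  # (practice_id, measure) -> [num, den]
    for row in rows:
        key = (row["practice_id"], row["measure"])
        totals[key][0] += row["numerator"]
        totals[key][1] += row["denominator"]
    # On every measure, a higher rate reflects higher measured quality.
    return {key: num / den for key, (num, den) in totals.items() if den > 0}
```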

Structural Capabilities: Survey Instrument

Our survey was designed to ask a practicing physician about the specific structural capabilities of the practice (Appendix Figure). Relying on published literature and existing surveys on physician group characteristics (16, 23–25), we identified 13 capabilities likely to be important to high-quality care that also could be accurately reported by practicing physicians. We grouped these 13 capabilities into 4 domains: patient assistance and reminders (assistance of patient self-management, system for contacting patients for preventive services, and paper-based physician reminder systems), electronic health records (frequently used multifunctional electronic health records), culture of quality (physician awareness of performance on quality and patient experience, new initiatives on quality and patient experience, frequent meetings on quality performance, and presence of a leader for quality improvement), and enhanced access (language interpreters, providers' spoken languages, and regular appointment hours on weekends). The sophistication of electronic health records varies, so the survey assessed 6 specific functionalities in 4 of the core areas identified by the Institute of Medicine: results management (laboratory and radiology results), communication and connectivity (notes from consultants), health information and data (problem and medication lists), and decision support (electronic reminders) (26). Finally, the survey asked about the presence of trainees (medical students, residents, or fellows) and specialist physicians (other than primary care) at the practices.
To improve validity, we revised the survey instrument after cognitive testing with local physician volunteers. In addition, the survey was completed by physicians from a validation set of 4 practices for which we conducted site visits in early 2007. These written survey responses agreed with our site visit observations for all of the structural capabilities included in our analysis.

Survey Administration

We conducted the survey by mail from May 2007 to October 2007, sending each survey with a letter introducing the study. We used 4 follow-up mailings along with telephone outreach to encourage nonrespondents to complete the survey. Item nonresponse rates were less than 5% for all included survey questions. This study was approved by the Human Subjects Committee of the Harvard School of Public Health.

Statistical Analysis

Most of the survey questions asked about the presence or absence of a structural capability and included “don't know” as a response option. We reasoned that a capability was unlikely to affect the quality of care if a respondent was unaware of its existence at the practice, so “no” and “don't know” responses were combined in the analysis. For items with multiple response categories (for example, frequency of meetings), we created dichotomous variables based on the median of the response distribution. For the 6 items addressing electronic record functionality, we created a composite “all-or-none” measure, classifying a practice as having a “frequently used multifunctional electronic health record” only if the respondent reported availability of all 6 functionalities and used the electronic health record “usually” or “always” during a typical clinical session. To check for nonresponse bias, we used Wilcoxon rank-sum, Fisher exact, and t tests as appropriate to compare the following characteristics of practices with and without survey responses: median size, rate of network affiliation, and mean scores on all 13 HEDIS measures.
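A small sketch of these coding rules follows, assuming hypothetical response field names; the six functionality items and the "usually" or "always" use criterion follow the text above.

```python
# Illustrative coding of survey responses; field names are hypothetical.
EHR_FUNCTIONS = [
    "lab_results", "radiology_results",   # results management
    "consultant_notes",                   # communication and connectivity
    "problem_list", "medication_list",    # health information and data
    "electronic_reminders",               # decision support
]

def has_capability(response: str) -> bool:
    # "No" and "don't know" responses are combined as absence of the capability.
    return response == "yes"

def multifunctional_ehr(response: dict) -> bool:
    # "All-or-none" composite: every functionality reported as available and
    # the record used "usually" or "always" during a typical clinical session.
    all_functions = all(has_capability(response.get(f, "no")) for f in EHR_FUNCTIONS)
    frequent_use = response.get("ehr_use_frequency") in {"usually", "always"}
    return all_functions and frequent_use
```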
We constructed hierarchical linear probability regression models to assess the bivariate relationships between practice use of each structural capability and performance on each HEDIS measure. In each model, the dependent variable was binary, indicating for each eligible patient the performance or nonperformance of the service described by the investigated HEDIS measure. The independent variables included a dummy variable representing the presence or absence of the investigated structural capability, dummy variables indicating the health plan reporting each HEDIS observation (to account for potential health plan influences on reported performance), and random intercepts for the practices. To adjust for potential practice-level confounders of the association between structural capabilities and HEDIS performance, we fitted the hierarchical linear probability regression models again, adding the following independent variables: practice size (number of physicians), dummy variables indicating affiliation with each network (or no network), and dummy variables indicating practice teaching status and multispecialty status.
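The original models were fitted in SAS; purely as a sketch of the model structure (binary outcome, capability indicator, health-plan dummies, practice random intercepts, and practice-level covariates in the adjusted version), an analogous specification in Python's statsmodels might look like the following, with hypothetical column names.

```python
# Sketch of the hierarchical linear probability models, assuming a patient-level
# data frame with one row per eligible HEDIS observation. Column names are
# hypothetical; the study's analysis was performed in SAS 9.1.3.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("hedis_observations.csv")  # hypothetical input

# Unadjusted model: capability indicator plus health-plan dummies,
# with a random intercept for each practice.
unadjusted = smf.mixedlm(
    "received_service ~ has_capability + C(health_plan)",
    data=df,
    groups=df["practice_id"],
).fit()

# Adjusted model: add practice size, network affiliation, teaching status,
# and multispecialty status as practice-level fixed effects.
adjusted = smf.mixedlm(
    "received_service ~ has_capability + C(health_plan) + n_physicians"
    " + C(network) + teaching + multispecialty",
    data=df,
    groups=df["practice_id"],
).fit()

print(unadjusted.summary())
print(adjusted.summary())
```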
We used linear probability models so that coefficient estimates could be directly interpreted as expected performance differences between otherwise similar practices with and without each structural capability. The predicted probabilities from these linear mixed models fell within the bounds of probability values (between 0 and 1). To confirm the stability of statistical inferences, we repeated the analyses by using logistic hierarchical models, with substantively similar results. To address the problem of multiple comparisons, we calculated, separately for unadjusted and adjusted analyses, the critical P values that would limit the false discovery rate (the expected proportion of type I errors among statistically significant results) to no more than 5%. Confidence intervals were calculated to correspond to these critical P values (27). All statistical analyses were performed with SAS software, version 9.1.3 (SAS Institute, Cary, North Carolina).
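For readers unfamiliar with this step, a minimal sketch of the Benjamini-Hochberg critical value calculation (reference 27) follows; the P values shown are hypothetical.

```python
# Benjamini-Hochberg step-up procedure: find the largest P value p_(k) such
# that p_(k) <= (k / m) * q; every P value at or below it is declared
# statistically significant while controlling the false discovery rate at q.
def bh_critical_p(p_values, q=0.05):
    m = len(p_values)
    critical = 0.0
    for rank, p in enumerate(sorted(p_values), start=1):
        if p <= q * rank / m:
            critical = p
    return critical

# Hypothetical example: the critical P value here is 0.02.
print(bh_critical_p([0.001, 0.008, 0.02, 0.09, 0.35], q=0.05))
```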

Role of the Funding Source

Our study was supported by The Commonwealth Fund. Dr. Friedberg was supported by a National Research Service Award from the Health Resources and Services Administration. The funding sources had no role in the study design, analyses, or decision to submit the manuscript for publication.

Results

We received completed surveys for 308 practices (response rate, 75%). Three of these practices underwent reorganizations during 2007 and could not be linked to 2007 HEDIS performance data. After we excluded these 3 practices, the final study sample consisted of 305 practices. Responding and nonresponding practices did not statistically significantly differ in the numbers of physicians, proportion with network affiliation, or performance on any of the 13 HEDIS measures (data not shown). Mean HEDIS performance rates ranged from 18% for avoidance of antibiotic treatment in adults with acute bronchitis to 89% for hemoglobin A1c testing in diabetes (Table 1).
Table 1. Mean HEDIS Measure Scores of Primary Care Practices Included in the Study
Among responding practices, the prevalence of structural capabilities ranged from 25% (practice regularly open to provide care on weekends) to 89% (respondent aware of results on measures of clinical quality) (Table 2). Only 33% of practices reported frequently used multifunctional electronic health records, although systems with rudimentary functionalities were more common. The median practice size was 4 physicians (range, 2 to 74 physicians); 64% of practices were network-affiliated, 47% were teaching practices, and 46% were multispecialty practices.
Table 2. Prevalence of Structural Capabilities Among Primary Care Practices
In unadjusted analysis, frequently used multifunctional electronic health records were significantly associated with higher performance on 5 of the 13 investigated HEDIS measures, all of them measures of screening or diabetes care (Appendix Table 2). Among the remaining structural capabilities, only physician awareness of patient experience ratings and frequent meetings to discuss quality were associated with higher performance on multiple HEDIS measures (3 measures each), and these associations were likewise limited to measures of screening and diabetes care. Paper-based reminders to physicians were associated with lower performance on 2 measures (chlamydia screening and eye examinations in patients with diabetes). Among the potentially confounding practice characteristics, teaching (of students, residents, or fellows) was associated with higher performance on 2 measures (cervical cancer screening and acute contacts in depression). Practice size, network affiliation, and multispecialty status were not associated with performance on any measure.
Appendix Table 2. Unadjusted Associations Between the Structural Capabilities of Physician Practices and Performance on HEDIS Measures of Primary Care Quality
After adjustment for confounders, frequently used multifunctional electronic health records remained significantly associated (P less than the critical value of 0.007) with higher performance on 3 measures of screening (breast cancer, colorectal cancer, and chlamydia) and 2 measures of diabetes care (eye examinations and nephropathy monitoring) (Table 3). Statistically significant expected differences in HEDIS performance scores between practices with and without such electronic records ranged from 3.1 percentage points (nephropathy monitoring) to 7.6 percentage points (chlamydia screening). Associations between specific electronic record functionalities and higher HEDIS performance were more common for advanced functionalities, such as medication and problem lists, than for more rudimentary functionalities (Appendix Table 3).
Table 3. Adjusted Associations Between the Structural Capabilities of Physician Practices and Performance on HEDIS Measures of Primary Care Quality
Appendix Table 3. Adjusted Associations Between the Specific Electronic Health Record Functionalities and Performance on HEDIS Measures of Primary Care Quality
The findings for other structural capabilities were mixed. Systems to contact patients for preventive services were associated with higher performance on nephropathy monitoring, whereas paper-based reminders to physicians were associated with lower performance on chlamydia screening and eye examinations in patients with diabetes. Frequent meetings to discuss quality were associated with higher performance on 3 measures of diabetes care: eye examinations, cholesterol screening, and nephropathy monitoring. Physician awareness of clinical quality performance was associated with higher performance on eye examinations in patients with diabetes but not other measures. Awareness of patient experience ratings was associated with higher performance on both breast and cervical cancer screening measures.
As in the unadjusted analyses, none of the structural capabilities was associated with higher performance on measures of depression care or overuse. Having on-site language interpreters was associated with lower performance on 2 depression measures (acute treatment and continuation treatment).

Discussion

To our knowledge, our study is among the first to investigate in a statewide sample of primary care practices whether selected structural capabilities are associated with higher performance on widely used HEDIS measures of primary care quality. Overall, few of the structural capabilities were associated with performance on more than a few quality measures, and observed performance differences were modest. Nearly all statistically significant performance differences were observed for measures of screening and diabetes care, suggesting that practices may target performance in these clinical areas when investing in structural capabilities. Of note, measures of screening and diabetes management are among the clinical indicators most frequently included in pay-for-performance programs (28).
Compared with other investigated structural capabilities, frequently used multifunctional electronic health records were associated with the largest performance differences on the broadest range of HEDIS measures. Some capabilities, such as physician awareness of performance results and frequent meetings to discuss quality, were associated with smaller differences in performance. Other capabilities, such as having a leader for clinical quality, were not associated with higher performance on any measure.
The association of paper-based physician reminders with lower performance on 2 measures was somewhat counterintuitive. Use of paper-based systems may reflect other practice features that impede optimal care, or the electronic reminders included in advanced electronic health records may constitute a more potent intervention. Relative to physician awareness of clinical quality performance, awareness of patient experience results was more consistently associated with higher HEDIS performance, perhaps reflecting a practice culture more highly attuned to performance feedback. However, nearly all respondents reported awareness of clinical quality performance, limiting our ability to detect performance differences associated with this capability. The negative relationship between on-site language interpreters and depression performance was unexpected and may reflect unmeasured differences in patient case mix (29).
Our findings are relevant to “medical home” demonstration projects. Current medical home standards rely on structural assessment (11, 12), and the goals of medical home demonstrations include improving the quality of primary care. Although the specific capabilities defining the medical home are still evolving, some of the capabilities we evaluated are common among medical home descriptions (11–14, 30–32). Our results may guide expectations about the magnitude of clinical performance improvements that might be observed in medical home interventions.
Use of electronic health records has been proposed as a requirement for practices to achieve the highest levels of medical home designation, and recent federal legislation calls for new incentives for physicians who make “meaningful use” of electronic health records (11–13, 15). Our results suggest that electronic health records may have stronger and wider-ranging associations with higher-quality care than other structural capabilities, but only if they include advanced functionalities. One study (17) that found no association between use of electronic health records and quality of care did not assess electronic record functionalities and did not include the HEDIS measures for which we found associations with performance. Electronic health records with advanced features are uncommon at the national level, and our results suggest that increasing their adoption may enable performance improvements in important areas of preventive care and chronic disease management (33).
Our study has limitations. Our data are from Massachusetts, where HEDIS performance rates and the prevalence of structural capabilities may exceed the national average, reducing our power to detect associations. We surveyed a single primary care physician selected at random from each practice (who may or may not have had a leadership role), which constrained our ability to assess such practice features as financial incentives, staffing mix, and team care. Measures of outcomes, patient experience, and costs of care were not available to us. The cross-sectional study design does not allow causal attribution of performance differences to structural capabilities.
Until recently, efforts to improve primary care quality have focused on external incentives (for example, public reporting and pay-for-performance) based on measures of clinical processes and outcomes (5, 28, 34, 35). However, the popularity of medical home proposals and passage of legislation encouraging electronic health record adoption may signal a renewed emphasis on practices' structural capabilities (15, 36, 37). Our results suggest that among the capabilities of primary care practices, advanced electronic record systems may have the strongest relationship to better clinical process performance. However, the modest overall magnitude and limited scope of relationships between practice structure and clinical performance highlight the continued importance of measuring the processes and outcomes of care delivered to patients.

Supplemental Material

Appendix Figure. Survey Items Contributing Data on Structural Capabilities of Primary Care Practice Sites.

References

1. McGlynn EA, Asch SM, Adams J, Keesey J, Hicks J, DeCristofaro A, et al. The quality of health care delivered to adults in the United States. N Engl J Med. 2003;348:2635-45. [PMID: 12826639]
2. Pham HH, Schrag D, Hargraves JL, Bach PB. Delivery of preventive services to older adults by primary care physicians. JAMA. 2005;294:473-81. [PMID: 16046654]
3. Kerr EA, McGlynn EA, Adams J, Keesey J, Asch SM. Profiling the quality of care in twelve communities: results from the CQI study. Health Aff (Millwood). 2004;23:247-56. [PMID: 15160823]
4. Shortell SM, Rundall TG, Hsu J. Improving patient care by linking evidence-based medicine and evidence-based management. JAMA. 2007;298:673-6. [PMID: 17684190]
5. Institute of Medicine. Performance Measurement: Accelerating Improvement. Washington, DC: National Academies Pr; 2006.
6. Institute of Medicine. Crossing the Quality Chasm: a New Health System for the 21st Century. Washington, DC: National Academies Pr; 2001.
7. Institute of Medicine. Unequal Treatment: Confronting Racial and Ethnic Disparities in Health Care. Washington, DC: National Academies Pr; 2002.
8. Murray M, Berwick DM. Advanced access: reducing waiting and delays in primary care. JAMA. 2003;289:1035-40. [PMID: 12597760]
9. Bates DW, Ebell M, Gotlieb E, Zapp J, Mullins HC. A proposal for electronic medical records in U.S. primary care. J Am Med Inform Assoc. 2003;10:1-10. [PMID: 12531012]
10. Bodenheimer T, Grumbach K. Electronic technology: a spark to revitalize primary care? JAMA. 2003;290:259-64. [PMID: 12851283]
11. National Committee for Quality Assurance. Standards and guidelines for physician practice connections–patient-centered medical home (PPC-PCMH). Washington, DC: National Committee for Quality Assurance; 2008.
12. American Medical Association/Specialty Society RVS Update Committee. Medicare medical home demonstration project recommendations. 2008. Accessed at www.ama-assn.org/ama1/pub/upload/mm/380/medicalhomerecommend.pdf on 11 August 2009.
13. Tax Relief and Health Care Act of 2006, Pub. L. No. 109-432, 120 Stat. 2922 (2006).
14. Iglehart JK. No place like home—testing a new model of care delivery. N Engl J Med. 2008;359:1200-2. [PMID: 18799555]
15. Steinbrook R. Health care and the American Recovery and Reinvestment Act. N Engl J Med. 2009;360:1057-60. [PMID: 19224738]
16. Mehrotra A, Epstein AM, Rosenthal MB. Do integrated medical groups provide higher-quality medical care than individual practice associations? Ann Intern Med. 2006;145:826-33. [PMID: 17146067]
17. Linder JA, Ma J, Bates DW, Middleton B, Stafford RS. Electronic health record use and the quality of ambulatory care in the United States. Arch Intern Med. 2007;167:1400-5. [PMID: 17620534]
18. Casalino LP, Devers KJ, Lake TK, Reed M, Stoddard JJ. Benefits of and barriers to large medical group practice in the United States. Arch Intern Med. 2003;163:1958-64. [PMID: 12963570]
19. Solberg LI, Asche SE, Pawlson LG, Scholle SH, Shih SC. Practice systems are associated with high-quality care for diabetes. Am J Manag Care. 2008;14:85-92. [PMID: 18269304]
20. Friedberg MW, Coltin KL, Pearson SD, Kleinman KP, Zheng J, Singer JA, et al. Does affiliation of physician groups with one another produce higher quality primary care? J Gen Intern Med. 2007;22:1385-92. [PMID: 17594130]
21. Simon SR, Kaushal R, Cleary PD, Jenter CA, Volk LA, Poon EG, et al. Correlates of electronic health record adoption in office practices: a statewide survey. J Am Med Inform Assoc. 2007;14:110-7. [PMID: 17068351]
22. Friedberg MW, Safran DG, Coltin KL, Dresser M, Schneider EC. Readiness for the Patient-Centered Medical Home: structural capabilities of Massachusetts primary care practices. J Gen Intern Med. 2009;24:162-9. [PMID: 19050977]
23. Casalino L, Gillies RR, Shortell SM, Schmittdiel JA, Bodenheimer T, Robinson JC, et al. External incentives, information technology, and organized processes to improve health care quality for patients with chronic diseases. JAMA. 2003;289:434-41. [PMID: 12533122]
24. Audet AM, Doty MM, Shamasdin J, Schoenbaum SC. Measure, learn, and improve: physicians' involvement in quality improvement. Health Aff (Millwood). 2005;24:843-53. [PMID: 15886180]
25. Shortell SM, Marsteller JA, Lin M, Pearson ML, Wu SY, Mendel P, et al. The role of perceived team effectiveness in improving chronic illness care. Med Care. 2004;42:1040-8. [PMID: 15586830]
26. Institute of Medicine. Key Capabilities of an Electronic Health Record System. Washington, DC: National Academies Pr; 2003.
27. Benjamini Y, Hochberg Y. Controlling the false discovery rate: a practical and powerful approach to multiple testing. J R Stat Soc [Ser B]. 1995;57:289-300.
28. Rosenthal MB, Landon BE, Normand SL, Frank RG, Epstein AM. Pay for performance in commercial HMOs. N Engl J Med. 2006;355:1895-902. [PMID: 17079763]
29. Sentell T, Shumway M, Snowden L. Access to mental health treatment by English language proficiency and race/ethnicity. J Gen Intern Med. 2007;22 Suppl 2:289-93. [PMID: 17957413]
30. American Academy of Family Physicians, American Academy of Pediatrics, American College of Physicians, American Osteopathic Association. Joint Principles of the Patient-Centered Medical Home. 2007. Accessed at www.medicalhomeinfo.org/Joint%20Statement.pdf on 11 August 2009.
31. American Academy of Family Physicians. State Medical Home Legislation. 2009. Accessed at www.trendtrack.com/texis/app/viewrpt?event=483e340d37b on 11 August 2009.
32. American College of Physicians. The Advanced Medical Home: a Patient-Centered, Physician-Guided Model of Health Care. Philadelphia: American Coll of Physicians; 2006. Accessed at www.acponline.org/advocacy/events/state_of_healthcare/statehc06_5.pdf on 11 August 2009.
33. DesRoches CM, Campbell EG, Rao SR, Donelan K, Ferris TG, Jha A, et al. Electronic health records in ambulatory care—a national survey of physicians. N Engl J Med. 2008;359:50-60. [PMID: 18565855]
34. Epstein AM. Paying for performance in the United States and abroad [Editorial]. N Engl J Med. 2006;355:406-8. [PMID: 16870921]
35. Rosenthal MB. Beyond pay for performance—emerging models of provider-payment reform. N Engl J Med. 2008;359:1197-200. [PMID: 18799554]
36. American College of Physicians. Patient-Centered Medical Home: State-by-State Demonstration List. 2009. Accessed at www.acponline.org/running_practice/pcmh/demonstrations/locations.htm on 11 August 2009.
37. Fisher ES. Building a medical neighborhood for the medical home. N Engl J Med. 2008;359:1202-5. [PMID: 18799556]
38. National Committee for Quality Assurance. HEDIS Volume 2: Technical Specifications. Washington, DC: National Committee for Quality Assurance; 2006.

Published In

Annals of Internal Medicine
Volume 151, Number 7, 6 October 2009
Pages: 456-463

History

Published online: 6 October 2009
Published in issue: 6 October 2009

Authors

Affiliations

Mark W. Friedberg, MD, MPP
Kathryn L. Coltin, MPH
Dana Gelb Safran, ScD
Marguerite Dresser, MS
Alan M. Zaslavsky, PhD
Eric C. Schneider, MD, MSc
From Brigham and Women's Hospital, Harvard Medical School, and Blue Cross/Blue Shield of Massachusetts, Boston; Harvard Pilgrim Health Care, Wellesley; and Massachusetts Health Quality Partners, Watertown, Massachusetts.
Acknowledgment: The authors thank Katherine Howitt, MA, for invaluable assistance in fielding the survey; Elaine Kirshenbaum of the Massachusetts Medical Society, for advice regarding development of the physician survey; and Arnold Epstein, MD, MA, for helpful comments on an earlier draft of this paper.
Grant Support: By The Commonwealth Fund. Dr. Friedberg was supported by a National Research Service Award from the Health Resources and Services Administration (5 T32 HP11001 20).
Disclosures: None disclosed.
Reproducible Research Statement: Study protocol: Physician practice survey instrument available from Dr. Friedberg (e-mail, [email protected]). Statistical code: Available from Dr. Friedberg (e-mail, [email protected]). Data set: Not available.
Corresponding Author: Eric C. Schneider, MD, MSc, RAND Boston, 20 Park Plaza, 7th Floor, Suite 720, Boston, MA 02116; e-mail, [email protected].
Current Author Addresses: Dr. Friedberg: RAND, 1776 Main Street, PO Box 2138, Santa Monica, CA 90407-2138.
Ms. Coltin: Harvard Pilgrim Health Care, 93 Worcester Street, Wellesley, MA 02481.
Dr. Safran: Blue Cross Blue Shield of Massachusetts, 401 Park Drive, Boston, MA 02215.
Ms. Dresser: Massachusetts Health Quality Partners, 100 Talcott Avenue, Watertown, MA 02472.
Dr. Zaslavsky: Harvard Medical School, Department of Health Care Policy, 180 Longwood Avenue, Boston, MA 02115.
Dr. Schneider: RAND Boston, 20 Park Plaza, 7th Floor, Suite 720, Boston, MA 02116.
Author Contributions: Conception and design: M.W. Friedberg, K.L. Coltin, D.G. Safran, E.C. Schneider.
Analysis and interpretation of the data: M.W. Friedberg, K.L. Coltin, D.G. Safran, M. Dresser, A.M. Zaslavsky, E.C. Schneider.
Drafting of the article: M.W. Friedberg, D.G. Safran, M. Dresser, A.M. Zaslavsky, E.C. Schneider.
Critical revision of the article for important intellectual content: K.L. Coltin, D.G. Safran, A.M. Zaslavsky, E.C. Schneider.
Final approval of the article: D.G. Safran, A.M. Zaslavsky, E.C. Schneider.
Statistical expertise: A.M. Zaslavsky.
Obtaining of funding: E.C. Schneider.
Administrative, technical, or logistic support: K.L. Coltin, E.C. Schneider.
Collection and assembly of data: M.W. Friedberg, M. Dresser, E.C. Schneider.

Citation

Mark W. Friedberg, Kathryn L. Coltin, Dana Gelb Safran, et al. Associations Between Structural Capabilities of Primary Care Practices and Performance on Selected Quality Measures. Ann Intern Med. 2009;151:456-463. [Epub 6 October 2009]. doi:10.7326/0003-4819-151-7-200910060-00006
