AHRQ Research Studies
Research Studies is a compilation of published research articles funded by AHRQ or authored by AHRQ researchers.
Results
1 to 25 of 40 Research Studies Displayed
Callejo-Black A, Wagner DV, Ramanujam K
A systematic review of external validity in pediatric integrated primary care trials.
This study used the RE-AIM (Reach, Effectiveness, Adoption, Implementation, and Maintenance) framework to conduct a systematic review of external validity reporting in integrated primary care (IPC) interventions for mental health concerns. A literature search was conducted to identify relevant literature from 1998 to 2018 reporting on open, randomized, or quasi-randomized trials of IPC interventions that targeted child (ages 0-18 years) psychological symptoms. The authors included 39 publications describing 25 studies in the review. Publications rarely reported indicators of external validity, including the representativeness of participants (12%), the rate of adoption by clinics or providers (16%), the cost of implementation (8%), or evidence of maintenance (16%). Few studies included key pragmatic factors such as cost or organizational change processes related to implementation and maintenance.
AHRQ-funded; HS022981.
Citation: Callejo-Black A, Wagner DV, Ramanujam K.
A systematic review of external validity in pediatric integrated primary care trials.
J Pediatr Psychol 2020 Oct 1;45(9):1039-52. doi: 10.1093/jpepsy/jsaa068.
Keywords: Children/Adolescents, Primary Care, Behavioral Health, Healthcare Delivery, Evidence-Based Practice, Health Services Research (HSR), Research Methodologies
Cuthel A, Rogers E, Daniel F
Barriers and facilitators in the recruitment and retention of more than 250 small independent primary care practices for EvidenceNOW.
This study examined barriers and facilitators in the recruitment and retention of small independent practices (SIPs) to participate in research studies. The authors used qualitative data from the HealthyHearts New York City program, part of the EvidenceNOW initiative. This randomized controlled trial took place from 2015 through 2018 across the 5 boroughs of New York City. A total of 257 SIPs (<5 full-time clinicians) were originally recruited. The three main factors that facilitated rapid recruitment were: 1) a prior well-established relationship with the local health department; 2) alignment of project goals with practice priorities; and 3) appropriate monetary incentives. The authors identify specific strategies that enhance recruitment of SIPs and fill gaps in knowledge about factors that influence retention.
AHRQ-funded; HS023922.
Citation: Cuthel A, Rogers E, Daniel F.
Barriers and facilitators in the recruitment and retention of more than 250 small independent primary care practices for EvidenceNOW.
Am J Med Qual 2020 Sep/Oct;35(5):388-96. doi: 10.1177/1062860619893422.
Keywords: Primary Care, Evidence-Based Practice, Patient-Centered Outcomes Research, Research Methodologies
Landes SJ, Kerns SEU, Pilar MR
Proceedings of the Fifth Biennial Conference of the Society for Implementation Research Collaboration (SIRC) 2019: where the rubber meets the road: the intersection of research, policy, and practice - part 1.
This paper offers a compilation of the abstracts of the oral and poster presentations from the 2019 Society for Implementation Research Collaboration (SIRC) Conference, entitled “Where the Rubber Meets the Road: The Intersection of Research, Policy, and Practice,” held in Seattle September 12-14, 2019. The society evolved from an NIMH-funded conference grant and is now an international society. The conference included 432 attendees. Highlights of the conference are described.
AHRQ-funded; HS025632.
Citation: Landes SJ, Kerns SEU, Pilar MR.
Proceedings of the Fifth Biennial Conference of the Society for Implementation Research Collaboration (SIRC) 2019: where the rubber meets the road: the intersection of research, policy, and practice - part 1.
Implement Sci 2020 Sep 30;15(Suppl 3):76. doi: 10.1186/s13012-020-01034-7.
Keywords: Implementation, Health Services Research (HSR), Evidence-Based Practice, Policy, Research Methodologies
Riggs K, Richman J, Kertesz S
Trial design for ineffectiveness research: a mixed-methods survey.
High-quality research demonstrating a lack of effectiveness may facilitate the 'de-adoption' of ineffective health services. However, there has been little debate on the optimal design for ineffectiveness research, that is, studies exploring the research hypothesis that an intervention is ineffective. The aim of this study was to explore investigators' preferences for trial design for ineffectiveness research. The investigators conducted a mixed-methods online survey with principal investigators identified from ClinicalTrials.gov.
AHRQ-funded; HS023009.
Citation: Riggs K, Richman J, Kertesz S.
Trial design for ineffectiveness research: a mixed-methods survey.
BMJ Evid Based Med 2020 Aug;25(4):143-44. doi: 10.1136/bmjebm-2019-111276.
Keywords: Research Methodologies, Comparative Effectiveness, Evidence-Based Practice
Lin D, Lapen K, Sherer MV
A systematic review of contouring guidelines in radiation oncology: analysis of frequency, methodology, and delivery of consensus recommendations.
Clinical trials have described variation in radiation therapy plan quality, of which contour delineation is a key component, and linked this to inferior patient outcomes. In response, consensus guidelines have been developed to standardize contour delineation. This investigation assessed trends in contouring guidelines and examined the methodologies used to generate and deliver recommendations. The investigators concluded that this review highlighted an increase in consensus contouring recommendations over time.
AHRQ-funded; HS026881.
Citation: Lin D, Lapen K, Sherer MV.
A systematic review of contouring guidelines in radiation oncology: analysis of frequency, methodology, and delivery of consensus recommendations.
Int J Radiat Oncol Biol Phys 2020 Jul 15;107(4):827-35. doi: 10.1016/j.ijrobp.2020.04.011.
Keywords: Guidelines, Evidence-Based Practice, Research Methodologies
Thomas LE, Yang S, Wojdyla D
Matching with time-dependent treatments: a review and look forward.
Observational studies of treatment effects attempt to mimic a randomized experiment by balancing the covariate distribution in treated and control groups, thus removing biases related to measured confounders. In this paper, the authors define a class of longitudinal matching methods and provide a review of existing variations, with guidance regarding study design, execution, and analysis. They identify avenues for future research and highlight the relevance of this methodology to high-quality comparative effectiveness studies in the era of big data.
AHRQ-funded; HS24310.
Citation: Thomas LE, Yang S, Wojdyla D.
Matching with time-dependent treatments: a review and look forward.
Stat Med 2020 Jul;39(17):2350-70. doi: 10.1002/sim.8533.
Keywords: Research Methodologies, Evidence-Based Practice, Comparative Effectiveness
Tsou AY, Treadwell JR, Erinoff E
Machine learning for screening prioritization in systematic reviews: comparative performance of Abstrackr and EPPI-Reviewer.
Improving the speed of systematic review (SR) development is key to supporting evidence-based medicine. Machine learning tools that semi-automate citation screening might improve efficiency. Few studies have assessed use of screening prioritization functionality or compared two tools head to head. In this project, the investigators compared the performance of two machine-learning tools for potential use in citation screening.
AHRQ-funded; HS025859.
Citation: Tsou AY, Treadwell JR, Erinoff E.
Machine learning for screening prioritization in systematic reviews: comparative performance of Abstrackr and EPPI-Reviewer.
Syst Rev 2020 Apr 2;9(1):73. doi: 10.1186/s13643-020-01324-7.
Keywords: Health Services Research (HSR), Research Methodologies, Evidence-Based Practice, Patient-Centered Outcomes Research
Krist AH, Barry MJ, Wolff TA
AHRQ Author: Wolff TA, Fan TM
Evolution of the U.S. Preventive Services Task Force's methods.
In this commentary on an article appearing in the same issue, the authors stated that the methods used by the USPSTF deliberately set a high bar for making evidence-based recommendations. They indicated that consumers of preventive service guidelines need to know concretely what is known and unknown and further need confidence that what is being recommended is not influenced by economic or political pressures or by professional opinion with a limited evidence basis.
AHRQ-authored.
Citation: Krist AH, Barry MJ, Wolff TA.
Evolution of the U.S. Preventive Services Task Force's methods.
Am J Prev Med 2020 Mar;58(3):332-35. doi: 10.1016/j.amepre.2019.11.003.
Keywords: U.S. Preventive Services Task Force (USPSTF), Guidelines, Evidence-Based Practice, Prevention, Research Methodologies
Byham-Gray LD, Peters EN, Rothpletz-Puglia P
Patient-centered model for protein-energy wasting: stakeholder deliberative panels.
Integrating the patient's voice into research prioritization is essential for solving problems that patients care the most about in terms of health, symptom management, and survival. In this study, the investigators used deliberative processes for adapting the existing model of protein-energy wasting (PEW) to one that included stakeholder priorities, addressing gaps from the initial concept.
AHRQ-funded; HS023434.
Citation: Byham-Gray LD, Peters EN, Rothpletz-Puglia P.
Patient-centered model for protein-energy wasting: stakeholder deliberative panels.
J Ren Nutr 2020 Mar;30(2):137-44. doi: 10.1053/j.jrn.2019.06.001.
Keywords: Patient-Centered Healthcare, Patient-Centered Outcomes Research, Evidence-Based Practice, Patient and Family Engagement, Research Methodologies
Gaynes BN, Lux L, Gartlehner G
Defining treatment-resistant depression.
The authors conducted a review for the Centers for Medicare & Medicaid Services and AHRQ to clarify how experts and investigators have defined treatment-resistant depression (TRD) and to review systematically how well this definition comports with TRD definitions in clinical trials through July 5, 2019. They found that no consensus definition existed for TRD. While depressive outcomes and clinical global impressions were commonly measured, functional impairment and quality-of-life tools were rarely used. They recommend stronger approaches to designing and conducting TRD research in order to foster better evidence to translate into clearer guidelines for treating patients with TRD.
AHRQ-funded; 290201500011I.
Citation: Gaynes BN, Lux L, Gartlehner G.
Defining treatment-resistant depression.
Depress Anxiety 2020 Feb;37(2):134-45. doi: 10.1002/da.22968.
Keywords: Depression, Behavioral Health, Evidence-Based Practice, Implementation, Research Methodologies
Lin L, Shi L, Chu H
The magnitude of small-study effects in the Cochrane Database of Systematic Reviews: an empirical study of nearly 30 000 meta-analyses.
The authors’ goal was to provide rules of thumb for interpreting measures that quantify the magnitude of small-study effects. They used six measures to evaluate small-study effects in 29,932 meta-analyses from the Cochrane Database of Systematic Reviews. They presented the empirical distributions of the six measures and proposed a rough guide to interpreting their magnitude. They suggested that their proposed rules of thumb may help evidence users grade the certainty in evidence as impacted by small-study effects.
AHRQ-funded; HS024743.
Citation: Lin L, Shi L, Chu H.
The magnitude of small-study effects in the Cochrane Database of Systematic Reviews: an empirical study of nearly 30 000 meta-analyses.
BMJ Evid Based Med 2020 Feb;25(1):27-32. doi: 10.1136/bmjebm-2019-111191.
Keywords: Research Methodologies, Evidence-Based Practice
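The six measures evaluated in the study above are not named in this summary. One widely used quantification of small-study effects is an Egger-style regression intercept, sketched below in plain Python as an illustrative stand-in (an assumption, not necessarily one of the paper's six measures):

```python
def egger_intercept(effects, ses):
    """Egger-style regression intercept for funnel-plot asymmetry.

    Regress each study's standardized effect (effect / SE) on its
    precision (1 / SE); an intercept far from zero suggests that
    smaller studies report systematically different effects.
    """
    z = [y / s for y, s in zip(effects, ses)]   # standardized effects
    x = [1.0 / s for s in ses]                  # precisions
    n = len(x)
    mx, mz = sum(x) / n, sum(z) / n
    slope = (sum((xi - mx) * (zi - mz) for xi, zi in zip(x, z))
             / sum((xi - mx) ** 2 for xi in x))
    return mz - slope * mx                      # ordinary least-squares intercept
```

For symmetric data the intercept sits near zero; when the higher-SE (smaller) studies report larger effects, it drifts positive.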
Lin L
Quantifying and presenting overall evidence in network meta-analysis.
This article classified treatment networks into three types under different assumptions: networks with each treatment comparison's edge width proportional to the corresponding number of studies, to the sample size, or to the precision. In addition, three new measures (the effective number of studies, the effective sample size, and the effective precision) were proposed to preliminarily quantify the overall evidence gained in network meta-analysis.
AHRQ-funded; HS024743.
Citation: Lin L.
Quantifying and presenting overall evidence in network meta-analysis.
Stat Med 2018 Dec 10;37(28):4114-25. doi: 10.1002/sim.7905.
Keywords: Evidence-Based Practice, Research Methodologies
Ishimine P, Adelgais K, Barata I
Executive summary: the 2018 Academic Emergency Medicine Consensus Conference: Aligning the Pediatric Emergency Medicine Research Agenda to Reduce Health Outcome Gaps.
Emergency care providers share a compelling interest in developing an effective patient-centered, outcomes-based research agenda that can decrease variability in pediatric outcomes. The 2018 Academic Emergency Medicine Consensus Conference "Aligning the Pediatric Emergency Medicine Research Agenda to Reduce Health Outcome Gaps (AEMCC)" aimed to fulfill this role. This paper discusses the conference which convened major thought leaders and stakeholders to introduce a research, scholarship, and innovation agenda for pediatric emergency care specifically to reduce health outcome gaps.
AHRQ-funded; HS026101.
Citation: Ishimine P, Adelgais K, Barata I.
Executive summary: the 2018 Academic Emergency Medicine Consensus Conference: Aligning the Pediatric Emergency Medicine Research Agenda to Reduce Health Outcome Gaps.
Acad Emerg Med 2018 Dec;25(12):1317-26. doi: 10.1111/acem.13667.
Keywords: Implementation, Evidence-Based Practice, Patient-Centered Outcomes Research, Children/Adolescents, Emergency Department, Outcomes, Research Methodologies
Marshall IJ, Noel-Storr A, Kuiper J
Machine learning for identifying randomized controlled trials: an evaluation and practitioner's guide.
The purpose of this study was to evaluate machine learning models for RCT classification. Models were evaluated on an external dataset. The authors demonstrate that machine learning approaches discriminate between RCTs and non-RCTs better than traditional database search filters. They also provide practical guidance on the role of machine learning in systematic reviews, rapid reviews, and clinical question answering, as well as open-source software.
AHRQ-funded; HS025024.
Citation: Marshall IJ, Noel-Storr A, Kuiper J.
Machine learning for identifying randomized controlled trials: an evaluation and practitioner's guide.
Res Synth Methods 2018 Dec;9(4):602-14. doi: 10.1002/jrsm.1287.
Keywords: Evidence-Based Practice, Health Services Research (HSR), Research Methodologies
Ma X, Lin L, Qu Z
Performance of between-study heterogeneity measures in the Cochrane Library.
The growth in comparative effectiveness research and evidence-based medicine has increased attention to systematic reviews and meta-analyses. Assessing heterogeneity is critical for performing a meta-analysis and interpreting results. This article evaluates two heterogeneity measures. To evaluate these measures' performance empirically, the investigators applied them to 20,599 meta-analyses in the Cochrane Library.
AHRQ-funded; HS024743.
Citation: Ma X, Lin L, Qu Z.
Performance of between-study heterogeneity measures in the Cochrane Library.
Epidemiology 2018 Nov;29(6):821-24. doi: 10.1097/ede.0000000000000857.
Keywords: Comparative Effectiveness, Evidence-Based Practice, Patient-Centered Outcomes Research, Research Methodologies
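The two heterogeneity measures evaluated in the study above are not named in this summary; Cochran's Q and the derived I^2 statistic are the standard pair, so the sketch below uses those as an assumed illustration rather than the paper's exact implementation:

```python
def q_and_i_squared(effects, variances):
    """Cochran's Q and I^2 for a fixed-effect meta-analysis.

    Q is the weighted sum of squared deviations of study effects from
    the pooled effect; I^2 estimates the percentage of total variation
    across studies due to heterogeneity rather than chance.
    """
    w = [1.0 / v for v in variances]            # inverse-variance weights
    pooled = sum(wi * y for wi, y in zip(w, effects)) / sum(w)
    q = sum(wi * (y - pooled) ** 2 for wi, y in zip(w, effects))
    df = len(effects) - 1
    i2 = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0
    return q, i2
```

Identical study effects give Q = 0 and I^2 = 0; widely dispersed effects push I^2 toward 100.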
Eder MM, Evans E, Funes M
Defining and measuring community engagement and community-engaged research: clinical and translational science institutional practices.
The institutions that comprise the Clinical and Translational Science Award (CTSA) consortium and the National Center for Advancing Translational Sciences continue to explore and develop community-engaged research strategies and to study the role of community-academic partnerships in advancing the science of community engagement. The purpose of this study was to explore CTSA institutions in relation to an Institute of Medicine recommendation that community engagement occur in all stages of translational research and be defined and evaluated consistently.
AHRQ-funded; HS020518.
Citation: Eder MM, Evans E, Funes M.
Defining and measuring community engagement and community-engaged research: clinical and translational science institutional practices.
Prog Community Health Partnersh 2018;12(2):145-56. doi: 10.1353/cpr.2018.0034.
Keywords: Community Partnerships, Evidence-Based Practice, Research Methodologies, Implementation
Tricco AC, Lillie E, Zarin W
AHRQ Author: Chang C
PRISMA Extension for Scoping Reviews (PRISMA-ScR): checklist and explanation.
This article presents the PRISMA-ScR (Preferred Reporting Items for Systematic reviews and Meta-Analyses extension for Scoping Reviews) checklist and explanation. The checklist was developed by a 24-member expert panel and 2 research leads following published guidance from the EQUATOR (Enhancing the QUAlity and Transparency Of health Research) Network.
AHRQ-authored.
Citation: Tricco AC, Lillie E, Zarin W.
PRISMA Extension for Scoping Reviews (PRISMA-ScR): checklist and explanation.
Ann Intern Med 2018 Oct 2;169(7):467-73. doi: 10.7326/m18-0850.
Keywords: Evidence-Based Practice, Research Methodologies
Chang S
Scoping reviews and systematic reviews: is it an either/or question?
This editorial comments on a paper by Tricco et al. (2018), published in the Annals of Internal Medicine and entitled “PRISMA extension for scoping reviews (PRISMA-ScR): checklist and explanation.”
AHRQ-authored.
Citation: Chang S.
Scoping reviews and systematic reviews: is it an either/or question?
Ann Intern Med 2018 Oct 2;169(7):502-03. doi: 10.7326/m18-2205.
Keywords: Evidence-Based Practice, Research Methodologies
Chou R, Baker WL, Banez LL
Agency for Healthcare Research and Quality Evidence-based Practice Center methods provide guidance on prioritization and selection of harms in systematic reviews.
A workgroup of methodologists from Evidence-based Practice Centers (EPCs) developed consensus-based guidance on selection and prioritization of harms in systematic reviews. Ten recommendations were developed on selection and prioritization of harms, including routinely focusing on serious as well as less serious but frequent or bothersome harms; and routinely engaging stakeholders and using literature searches and other data sources to identify important harms.
AHRQ-authored; AHRQ-funded; HS022998.
Citation: Chou R, Baker WL, Banez LL.
Agency for Healthcare Research and Quality Evidence-based Practice Center methods provide guidance on prioritization and selection of harms in systematic reviews.
J Clin Epidemiol 2018 Jun;98:98-104. doi: 10.1016/j.jclinepi.2018.01.007.
Keywords: Adverse Events, Evidence-Based Practice, Patient-Centered Outcomes Research, Research Methodologies
Villani J, Ngo-Metzger Q, Vincent IS
AHRQ Author: Ngo-Metzger Q
Sources of funding for research in evidence reviews that inform recommendations of the US Preventive Services Task Force.
This study characterizes the sources of funding for the scientific evidence base used by the USPSTF. One or more funding sources were identified for 79 percent of the 1,650 research articles. Government agencies provided support for 931 articles (56 percent). The remaining support came from nonprofits or universities (530 articles, 32 percent) and industry (282 articles, 17 percent). The sources of funding varied by recommendation topic.
AHRQ-authored.
Citation: Villani J, Ngo-Metzger Q, Vincent IS.
Sources of funding for research in evidence reviews that inform recommendations of the US Preventive Services Task Force.
JAMA 2018 May 22;319(20):2132-33. doi: 10.1001/jama.2018.5404.
Keywords: Evidence-Based Practice, Guidelines, Prevention, Research Methodologies, U.S. Preventive Services Task Force (USPSTF)
Viswanathan M, Patnode CD, Berkman ND
AHRQ Author: Chang S
Recommendations for assessing the risk of bias in systematic reviews of health-care interventions.
Risk-of-bias assessment is a central component of systematic reviews, but little conclusive empirical evidence exists on the validity of such assessments. In the context of such uncertainty, the investigators present pragmatic recommendations that promote transparency and reproducibility in processes, address methodological advances in the risk-of-bias assessment, and can be applied consistently across review topics.
AHRQ-authored; AHRQ-funded; 290201500011I; 290201500001I; 290201500005I; 290201500006I; 290201500013I; 290201500008I; 290201500009I.
Citation: Viswanathan M, Patnode CD, Berkman ND.
Recommendations for assessing the risk of bias in systematic reviews of health-care interventions.
J Clin Epidemiol 2018 May;97:26-34. doi: 10.1016/j.jclinepi.2017.12.004.
Keywords: Evidence-Based Practice, Research Methodologies
Adam GP, Springs S, Trikalinos T
AHRQ Author: Berliner E
Does information from ClinicalTrials.gov increase transparency and reduce bias? Results from a five-report case series.
The researchers investigated whether information in ClinicalTrials.gov would impact the conclusions of five ongoing systematic reviews. Of the 173 total ClinicalTrials.gov records identified across the five projects, between 11 and 43 percent did not have an associated publication. In the 14 percent of records that contained results, the new data provided in the ClinicalTrials.gov records did not change the results or conclusions of the reviews.
AHRQ-authored; AHRQ-funded; HS022998.
Citation: Adam GP, Springs S, Trikalinos T.
Does information from ClinicalTrials.gov increase transparency and reduce bias? Results from a five-report case series.
Syst Rev 2018 Apr 16;7(1):59. doi: 10.1186/s13643-018-0726-5.
Keywords: Evidence-Based Practice, Patient-Centered Outcomes Research, Research Methodologies
Armstrong MJ, Mullins CD, Gronseth GS
Impact of patient involvement on clinical practice guideline development: a parallel group study.
The aim of this study was to investigate the effect of patient and public involvement (PPI) on guideline question formation and validate a conceptual model of patient and public contributions to guidelines. The qualitative analysis of the discussions occurring during guideline question development demonstrated key differences in group conduct and validated the proposed conceptual model of patient and public contributions to guidelines.
AHRQ-funded; HS024159; HS022135.
Citation: Armstrong MJ, Mullins CD, Gronseth GS.
Impact of patient involvement on clinical practice guideline development: a parallel group study.
Implement Sci 2018 Apr 16;13(1):55. doi: 10.1186/s13012-018-0745-6.
Keywords: Evidence-Based Practice, Guidelines, Patient and Family Engagement, Patient-Centered Outcomes Research, Research Methodologies
Dunn AG, Coiera E, Bourgeois FT
Unreported links between trial registrations and published articles were identified using document similarity measures in a cross-sectional analysis of ClinicalTrials.gov.
Trial registries can be used to measure reporting biases and support systematic reviews, but 45% of registrations do not provide a link to the article reporting on the trial. The investigators evaluated the use of document similarity methods to identify unreported links between ClinicalTrials.gov and PubMed. The investigators found that document similarity methods can assist in the identification of unreported links between trial registrations and corresponding articles.
AHRQ-funded; HS024798.
Citation: Dunn AG, Coiera E, Bourgeois FT.
Unreported links between trial registrations and published articles were identified using document similarity measures in a cross-sectional analysis of ClinicalTrials.gov.
J Clin Epidemiol 2018 Mar;95:94-101. doi: 10.1016/j.jclinepi.2017.12.007.
Keywords: Research Methodologies, Evidence-Based Practice
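The document similarity measures used in the study above are not detailed in this summary; the canonical baseline for comparing a registration's text with a candidate article is TF-IDF cosine similarity, sketched generically below (the paper's exact features and matching thresholds are assumptions here):

```python
import math
from collections import Counter

def tfidf_cosine(doc_a, doc_b, corpus):
    """Cosine similarity between two documents under TF-IDF weighting.

    `corpus` is the list of all documents used to estimate inverse
    document frequency. A registration/article pair scoring above a
    tuned threshold would be flagged as a candidate unreported link.
    """
    n = len(corpus)
    df = Counter()
    for doc in corpus:
        df.update(set(doc.lower().split()))      # document frequency per term
    idf = {t: math.log(n / df[t]) for t in df}

    def vec(doc):
        tf = Counter(doc.lower().split())
        return {t: c * idf.get(t, 0.0) for t, c in tf.items()}

    va, vb = vec(doc_a), vec(doc_b)
    dot = sum(va[t] * vb.get(t, 0.0) for t in va)
    norm = (math.sqrt(sum(x * x for x in va.values()))
            * math.sqrt(sum(x * x for x in vb.values())))
    return dot / norm if norm else 0.0
```

A document scores 1.0 against itself, and terms that appear in every document (IDF of zero) contribute nothing, so matches are driven by distinctive vocabulary.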
Surian D, Dunn AG, Orenstein L
A shared latent space matrix factorisation method for recommending new trial evidence for systematic review updates.
The purpose of this study was to evaluate a new method to partially automate the identification of trial registrations that may be relevant for systematic review updates. After identifying 179 systematic reviews of drug interventions for type 2 diabetes, the researchers tested a matrix factorization approach that ranks relevant trial registrations for each review. Text from the trial registrations was also used as features. Both approaches were tested on a holdout set of the newest trials. The authors conclude that the matrix factorization approach was useful in ranking trial registrations and could be used as part of a semi-automated pipeline.
AHRQ-funded; HS024798.
Citation: Surian D, Dunn AG, Orenstein L.
A shared latent space matrix factorisation method for recommending new trial evidence for systematic review updates.
J Biomed Inform 2018 Mar;79:32-40. doi: 10.1016/j.jbi.2018.01.008.
Keywords: Evidence-Based Practice, Health Services Research (HSR), Research Methodologies
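A bare-bones version of matrix factorisation for ranking review-trial pairs can be sketched as follows. The paper's shared latent space additionally folds registration text into the factorisation, which this toy omits; all names, data, and hyperparameters below are illustrative:

```python
import random

def factorise(R, k=2, steps=500, lr=0.05, reg=0.01, seed=0):
    """Rank-k factorisation of a review-trial relevance matrix by SGD.

    R: dict mapping (review, trial) -> 1.0 for known relevant pairs.
    Returns latent vectors U (reviews) and V (trials); the dot product
    of U[r] and V[t] scores how likely trial t is relevant to review r.
    """
    rng = random.Random(seed)
    reviews = {r for r, _ in R}
    trials = {t for _, t in R}
    U = {r: [rng.gauss(0, 0.1) for _ in range(k)] for r in reviews}
    V = {t: [rng.gauss(0, 0.1) for _ in range(k)] for t in trials}
    for _ in range(steps):
        for (r, t), val in R.items():
            err = val - sum(u * v for u, v in zip(U[r], V[t]))
            for i in range(k):                   # gradient step with L2 penalty
                U[r][i] += lr * (err * V[t][i] - reg * U[r][i])
                V[t][i] += lr * (err * U[r][i] - reg * V[t][i])
    return U, V
```

Unobserved (review, trial) pairs can then be ranked by their predicted scores, which is the recommendation step the study automates.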