How to Conduct a Meta-Analysis Study
Conducting a meta-analysis requires a systematic, five-step process: developing a clear research hypothesis with a registered protocol, implementing a comprehensive literature search and study selection, performing rigorous data extraction and quality assessment, conducting statistical synthesis with appropriate heterogeneity evaluation, and formulating conclusions in a structured report. 1
Step 1: Develop a Clear Research Hypothesis and Protocol
The foundation of any meta-analysis is a precise research question formulated using the PICO framework (Population, Intervention, Comparator, Outcome), which specifies the target patient population, intervention or exposure, comparator, outcome definition, relevant time frame, and optimal study designs. 1
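For illustration only, the PICO elements can be pinned down as explicit, pre-specified fields before any searching begins. The condition, thresholds, and wording below are hypothetical placeholders, not drawn from any particular review:

```python
# Hypothetical PICO specification; the values only illustrate the structure.
pico = {
    "population":    "adults aged 40-75 with type 2 diabetes",
    "intervention":  "structured aerobic exercise programme of at least 12 weeks",
    "comparator":    "usual care without a supervised exercise component",
    "outcome":       "change in HbA1c (%) from baseline",
    "time_frame":    "studies published 2000-2024 with follow-up of at least 12 weeks",
    "study_designs": ["randomized controlled trial"],
}
```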
Protocol Development and Registration
- Develop a comprehensive protocol following the 17-item PRISMA-P (Preferred Reporting Items for Systematic Review and Meta-Analysis Protocols) checklist before beginning your literature search. 1
- Register your protocol prospectively to serve as your roadmap and minimize concerns about methodological rigor. 1
- Define explicit inclusion and exclusion criteria with sufficient detail, as vague criteria introduce unnecessary subjectivity during screening and can undermine evidence quality. 1
- Specify your analytic plan, including methods for harmonizing exposure/outcome units, effect estimate scales, weighting schemes (fixed vs. random effects), and approaches for dose-response analysis or meta-regression; a sketch of such a plan follows this list. 1
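The analytic plan can also be recorded in machine-readable form alongside the registered protocol. The sketch below is a hypothetical specification; the field names and values are illustrative assumptions, not a standard schema:

```python
# Hypothetical pre-specified analysis plan; field names are illustrative, not a standard.
analysis_plan = {
    "effect_measure": "risk ratio",                 # scale on which effects are pooled
    "unit_harmonization": "convert all exposures to mg/day",
    "model": "random_effects",                      # weighting scheme chosen a priori
    "tau2_estimator": "DerSimonian-Laird",          # between-study variance estimator
    "heterogeneity_metrics": ["Cochran_Q", "I2"],
    "subgroup_analyses": ["study design", "dose category"],
    "meta_regression_covariates": ["mean age", "follow-up length"],
    "dose_response": "restricted cubic spline with pre-specified knots",
}
```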
Common Pitfall: An ill-defined research question (such as failing to specify a comparator) leads to inappropriate study inclusion, substantial heterogeneity, and inability to draw meaningful conclusions. 1
Consider Statistical Power Requirements
- Aim to include at least 17-20 experiments in coordinate-based meta-analyses to ensure sufficient power to detect smaller effects and prevent results from being driven by single experiments. 1
- Balance the trade-off between number of included studies (power) and their quality/homogeneity—more studies may increase heterogeneity or include lower quality evidence. 1
- Recognize that the required sample size depends on the expected effect size; smaller samples may suffice for strong expected effects, but analyses expecting small-to-medium effects with fewer experiments should be interpreted cautiously (see the power sketch after this list). 1
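The 17-20 figure above applies to coordinate-based (neuroimaging) meta-analyses. For a conventional effect-size meta-analysis, a rough closed-form power approximation shows how the number of studies, within-study variance, and between-study heterogeneity interact. This is a sketch, not taken from the source: it assumes a two-sided z-test on the pooled random-effects estimate, equal within-study variances, and hypothetical numbers.

```python
# Approximate power of a random-effects meta-analysis to detect a true mean
# effect `delta` (sketch; assumes equal within-study variances and a z-test).
from scipy.stats import norm

def meta_power(k, delta, within_var, tau2, alpha=0.05):
    pooled_se = ((within_var + tau2) / k) ** 0.5   # SE of the pooled estimate
    z_crit = norm.ppf(1 - alpha / 2)
    z_effect = delta / pooled_se
    return norm.cdf(z_effect - z_crit) + norm.cdf(-z_effect - z_crit)

# Hypothetical example: small-to-medium effect (d = 0.3), moderate heterogeneity.
for k in (5, 10, 20):
    print(k, round(meta_power(k, delta=0.3, within_var=0.04, tau2=0.02), 2))
```

As the loop illustrates, adding studies shrinks the pooled standard error and raises power, but between-study heterogeneity (tau²) partly offsets that gain.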
Step 2: Implement Systematic Literature Search and Study Selection
Comprehensive Literature Search
- Conduct thorough searches across multiple databases, including PubMed, Web of Science, and Google Scholar, using keyword combinations that restrict the search to the relevant experiments, study types, and populations. 1
- Perform reference tracing in identified articles and review articles to complete the literature search. 1
- Document everything meticulously: record search engines used, keywords, date boundaries, number of papers identified, number excluded, and reasons for rejection. 1
- Create a PRISMA flow diagram that graphically illustrates this information, as many journals require one for publication; a sketch of a machine-checkable search log appears below. 1
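A minimal sketch of such a record, with hypothetical counts and exclusion reasons, that can be checked for internal consistency before the flow diagram is drawn:

```python
# Hypothetical search log for a PRISMA flow diagram; all counts are placeholders.
search_log = {
    "databases": ["PubMed", "Web of Science", "Google Scholar"],
    "search_date": "2024-06-30",
    "records_identified": 1240,
    "duplicates_removed": 310,
    "records_screened": 930,
    "excluded_at_screening": {"wrong population": 410, "no comparator": 220, "wrong outcome": 180},
    "full_texts_assessed": 120,
    "excluded_at_full_text": {"no usable effect estimate": 70, "overlapping cohort": 20},
    "studies_included": 30,
}

# Consistency checks: each stage should account for every record.
assert search_log["records_identified"] - search_log["duplicates_removed"] == search_log["records_screened"]
assert search_log["records_screened"] - sum(search_log["excluded_at_screening"].values()) == search_log["full_texts_assessed"]
assert search_log["full_texts_assessed"] - sum(search_log["excluded_at_full_text"].values()) == search_log["studies_included"]
```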
Study Selection and Screening
- Screen citations and full manuscripts systematically with at least two authors independently to confirm eligibility against pre-specified inclusion/exclusion criteria. 1
- Ensure each publication provides the minimum required information: for coordinate-based meta-analyses, this includes x/y/z coordinates in standard space (MNI or Talairach), sample size, confirmation of whole-brain analysis without small-volume corrections, and reporting of both increases and decreases. 1
- Contact authors when necessary to clarify standard space used or confirm whole-brain analysis was conducted. 1
Step 3: Data Extraction and Quality Assessment
Systematic Data Extraction
- Have at least two authors extract data elements independently to minimize bias and errors. 1
- Create detailed tables documenting all information extracted from each included experiment; this overview helps identify aggregation criteria and plan subgroup analyses. 1
- Extract results exactly as reported in original studies, including effect sizes, confidence intervals, and statistical measures. 1
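When a study reports only an effect estimate with its 95% confidence interval, the standard error needed later for pooling can be recovered at extraction time. A minimal sketch, assuming a ratio measure pooled on the log scale with a symmetric interval; the reported values are hypothetical:

```python
# Recover the log-scale standard error of a ratio measure from its reported 95% CI.
# Hypothetical reported values: RR = 0.82 (95% CI 0.70 to 0.96).
from math import log

def log_se_from_ci(lower, upper, z=1.96):
    """SE of the log effect, assuming the CI is symmetric on the log scale."""
    return (log(upper) - log(lower)) / (2 * z)

rr, ci_lower, ci_upper = 0.82, 0.70, 0.96
log_rr = log(rr)
se = log_se_from_ci(ci_lower, ci_upper)
print(f"log(RR) = {log_rr:.3f}, SE = {se:.3f}")   # these feed directly into pooling
```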
Quality Assessment and Bias Evaluation
- Conduct qualitative assessment of individual studies (blinded to results when possible), categorizing them as unacceptable (dropped from analysis), acceptable, or good based on methodological quality. 2
- Assess risk of bias using standardized tools appropriate for your study designs. 1
- Describe in plain terms how design or execution flaws could bias results, explaining the reasoning behind these judgments. 1
- Consider stratifying or weighting studies according to preset quality criteria during analysis. 2
Critical Consideration: Never combine results from observational cohorts and randomized intervention trials without careful justification, as differences in study designs, follow-up duration, and exposure definitions can lead to misleading findings. 1
Step 4: Statistical Analysis and Evidence Synthesis
Decide Whether Quantitative Synthesis is Appropriate
Do not automatically perform meta-analysis just because you have multiple studies—quantitative synthesis is inappropriate when studies use different exposure assessment methods, have different endpoints, or employ incompatible study designs. 1
- Explain why pooled estimates would be useful to decision makers if proceeding with meta-analysis. 1
- Recognize that while only two studies are technically needed to compute a pooled estimate, the question of whether the data should be statistically combined at all supersedes downstream considerations. 1
Perform Statistical Analysis
- Assess statistical heterogeneity using established tools (I², Cochran's Q statistic) before pooling results (see the sketch after this list). 1, 2
- Use fixed-effects models when there is homogeneity among combined studies; use random-effects models when heterogeneity exists (which results in wider confidence limits and more conservative estimates). 3
- Create forest plots for visual inspection of results and heterogeneity patterns. 1
- Conduct predefined subgroup analyses to examine robustness of findings and potential effect modifications. 1
- Perform meta-regression to examine sources of heterogeneity when appropriate. 1
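A minimal sketch of the core calculations named in this list: inverse-variance fixed-effect pooling, Cochran's Q and I², and a DerSimonian-Laird random-effects estimate. The input effects are hypothetical log risk ratios; in practice a dedicated meta-analysis package (for example, `metafor` in R) would normally be used.

```python
# Generic inverse-variance pooling with heterogeneity statistics (sketch only).
import numpy as np
from scipy.stats import chi2

y  = np.array([-0.20, -0.35, 0.05, -0.15, -0.40])   # hypothetical study effects (log RR)
se = np.array([ 0.10,  0.15, 0.12,  0.08,  0.20])   # their standard errors

# Fixed-effect (inverse-variance) pooled estimate
w_fe = 1 / se**2
theta_fe = np.sum(w_fe * y) / np.sum(w_fe)

# Cochran's Q, its p-value, and I²
Q = np.sum(w_fe * (y - theta_fe) ** 2)
df = len(y) - 1
p_Q = chi2.sf(Q, df)
I2 = max(0.0, (Q - df) / Q) * 100

# DerSimonian-Laird between-study variance and random-effects pooled estimate
C = np.sum(w_fe) - np.sum(w_fe**2) / np.sum(w_fe)
tau2 = max(0.0, (Q - df) / C)
w_re = 1 / (se**2 + tau2)
theta_re = np.sum(w_re * y) / np.sum(w_re)
se_re = np.sqrt(1 / np.sum(w_re))

print(f"Q = {Q:.2f} (p = {p_Q:.3f}), I² = {I2:.0f}%, tau² = {tau2:.4f}")
print(f"Fixed effect:   {theta_fe:.3f}")
print(f"Random effects: {theta_re:.3f} "
      f"(95% CI {theta_re - 1.96*se_re:.3f} to {theta_re + 1.96*se_re:.3f})")
```

Consistent with the bullet on model choice above, whenever tau² > 0 the random-effects weights are more even across studies and the resulting confidence interval is wider than the fixed-effect one.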
Assess Publication Bias and Quality of Evidence
- Evaluate publication bias using funnel plots and appropriate statistical tests (a sketch of one such test follows this list), recognizing that studies with negative results are less likely to be published. 1, 3, 4
- Assess the certainty of the pooled evidence using appropriate tools (such as GRADE). 1
- Examine the impact of the individual studies' funding sources on their findings, and report the funding source of the meta-analysis itself. 1
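One widely used statistical check for funnel-plot asymmetry is Egger's regression test (named here as a common choice, not prescribed by the source): regress each study's standardized effect on its precision and test whether the intercept differs from zero. A minimal sketch with hypothetical inputs; note that `scipy.stats.linregress` exposes `intercept_stderr` only in recent SciPy releases, and that the test has low power when few studies are available.

```python
# Egger's regression test for small-study effects (sketch; hypothetical inputs).
import numpy as np
from scipy.stats import linregress, t

y  = np.array([-0.20, -0.35, 0.05, -0.15, -0.40, -0.30, 0.10, -0.25])  # effects
se = np.array([ 0.10,  0.15, 0.12,  0.08,  0.20,  0.18, 0.11,  0.09])  # standard errors

std_effect = y / se          # standardized effects
precision  = 1 / se          # precisions

fit = linregress(precision, std_effect)
t_stat = fit.intercept / fit.intercept_stderr
p_value = 2 * t.sf(abs(t_stat), df=len(y) - 2)

print(f"Egger intercept = {fit.intercept:.2f}, p = {p_value:.3f}")
# A small p-value suggests funnel-plot asymmetry, which may (but need not)
# reflect publication bias.
```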
Step 5: Formulate Conclusions and Report Results
Interpret Results in Context
- Formulate conclusions informed by both results and quality of evidence, not results alone. 1
- Describe limitations of both original studies and the meta-analysis methodology. 1
- Consider how the findings align with evidence from other study designs on the topic to provide broader context. 1
- Provide general interpretation in context of other evidence and implications for future research. 1
Prepare Structured Report
- Follow PRISMA reporting guidelines for systematic reviews and meta-analyses. 1
- Include structured sections: title, abstract, executive summary, introduction with rationale and objectives, methods, results, and discussion. 1
- Present summary tables of study characteristics and results for transparency. 1
Final Caveat: Meta-analysis cannot correct the shortcomings of existing studies or data. When its potential pitfalls are recognized and addressed, it provides an objective quantitative synthesis that increases precision and the power to detect relationships, but its results must still be understood and assessed critically. 3, 5