The metawars


Science  21 Sep 2018:
Vol. 361, Issue 6408, pp. 1184-1188
DOI: 10.1126/science.361.6408.1184


  • Resolving Bias in Meta-Analyses of the Link Between Exposure to Violent Video Games and Aggression
    • Brad J. Bushman, Professor of Communication and Psychology, The Ohio State University

    In his article in Science, Jop de Vrieze makes an interesting point that the biases of a particular researcher might influence the conclusions of a meta-analysis. As an example, he notes that Craig Anderson and Brad Bushman believe that violent video games increase aggression, and their meta-analyses show a link. In contrast, Christopher Ferguson believes that violent video games have no effect on aggression, and his meta-analyses show no link.

    There are two ways to combat this problem. One way is to compare the results from researchers who take sides in a debate with the results of other researchers. A recent meta-analysis did just that (Greitemeyer & Mügge, 2014; see Table 3). It compared the results from the Anderson/Bushman lab, the results from the Ferguson lab, and the results from other labs. The authors found that the results from the 19 studies conducted in the Anderson/Bushman lab (average r = .20, 95% CI: .15, .25) agree with the results from the 58 studies conducted in other labs (average r = .18, 95% CI: .16, .21). The results from the 8 studies conducted in the Ferguson lab disagree with the others (average r = .03, 95% CI: -.05, .10).
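    As an illustration only, the fixed-effect averaging behind such pooled correlations can be sketched with Fisher's r-to-z transform. The per-study values below are hypothetical, not the data from the meta-analysis discussed above:

```python
import math

def fisher_z(r):
    """Fisher r-to-z transform."""
    return 0.5 * math.log((1 + r) / (1 - r))

def z_to_r(z):
    """Inverse Fisher transform."""
    return math.tanh(z)

def pool_correlations(studies):
    """Fixed-effect pooled correlation with a 95% CI.

    `studies` is a list of (r, n) pairs; each z is weighted by n - 3,
    the inverse of the sampling variance of Fisher's z.
    """
    weights = [n - 3 for _, n in studies]
    zs = [fisher_z(r) for r, _ in studies]
    z_bar = sum(w * z for w, z in zip(weights, zs)) / sum(weights)
    se = 1 / math.sqrt(sum(weights))
    lo, hi = z_bar - 1.96 * se, z_bar + 1.96 * se
    return z_to_r(z_bar), (z_to_r(lo), z_to_r(hi))

# Hypothetical per-study correlations and sample sizes:
studies = [(0.22, 120), (0.18, 300), (0.20, 80)]
r_pooled, (ci_lo, ci_hi) = pool_correlations(studies)
```

    Non-overlapping confidence intervals between labs, as in the comparison above, are what signal that one lab's results genuinely disagree with the others.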

    A second way to combat the potential bias that those with a vested interest could introduce into a meta-analysis is to have a non-biased party conduct it. This is exactly what the American Psychological Association (2015) did when it conducted a meta-analysis on the link between expos...

    Competing Interests: None declared.
  • Intelligence should be embedded in meta-analysis tools to detect publication bias and skewed data and to avoid misleading conclusions

    Jop de Vrieze wrote an article entitled “The metawars” (1). Researchers must be cautious before relying on data and statistics (2). A major problem in meta-analysis lies in the potential for publication bias and skewed data. If a meta-analysis is restricted to research with positive results, the validity of the entire endeavor is compromised (2). With biased or skewed data, a meta-analysis may reach misleading conclusions.
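    One common tool-level check for publication bias is an Egger-style small-study regression. A minimal sketch, with hypothetical effect sizes and standard errors (none of these numbers come from a real meta-analysis), might look like this:

```python
def egger_intercept(effects, ses):
    """Egger-style small-study check (sketch): regress each
    standardized effect (effect / SE) on its precision (1 / SE)
    by ordinary least squares. An intercept far from zero signals
    funnel-plot asymmetry, a common footprint of publication bias.
    """
    y = [e / s for e, s in zip(effects, ses)]
    x = [1 / s for s in ses]
    n = len(x)
    x_bar, y_bar = sum(x) / n, sum(y) / n
    slope = (sum((xi - x_bar) * (yi - y_bar) for xi, yi in zip(x, y))
             / sum((xi - x_bar) ** 2 for xi in x))
    return y_bar - slope * x_bar

# Hypothetical standard errors (small SE = large study) and effects:
ses = [0.05, 0.10, 0.15, 0.20]
symmetric = [0.20, 0.21, 0.19, 0.20]  # effects unrelated to study size
inflated = [0.20, 0.25, 0.35, 0.45]   # small studies report larger effects

intercept_sym = egger_intercept(symmetric, ses)        # near zero
intercept_biased = egger_intercept(inflated, ses)      # clearly positive
```

    A tool that ran a check like this automatically, and flagged the asymmetric case, would spare users from needing to know the technique themselves.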

    In a normal distribution, the mean and the median are the same number, while in a skewed distribution they differ. The mean is the average: add up all the numbers and divide by how many there are. The median is the middle value in the sorted list. Comparing the two, or visualizing the data, therefore enables us to detect and identify skewed data.
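    This mean-versus-median check is easy to see in practice. A short sketch with Python's standard library, using made-up numbers for illustration:

```python
from statistics import mean, median

# A roughly symmetric sample: mean and median coincide.
symmetric = [4, 5, 5, 6, 6, 7, 7, 8]

# The same sample with one extreme value: the mean is dragged
# upward while the median barely moves -- a quick skew signal.
skewed = symmetric + [60]

print(mean(symmetric), median(symmetric))  # both 6
print(mean(skewed), median(skewed))        # mean jumps to 12, median stays 6
```

    A large gap between mean and median is exactly the kind of signal an intelligent meta-analysis tool could surface automatically.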

    According to Wikipedia, statistical bias is a feature of a statistical technique or of its results whereby the expected value of the results differs from the true underlying quantitative parameter being estimated. To avoid misleading conclusions from a meta-analysis, the following biases must be eliminated from the data: funding bias, reporting bias, analytical bias, exclusion bias, attrition bias, recall bias, and observer bias.

    We cannot assume that all users have enough knowledge of meta-analysis to avoid bias and skewed data. Therefore, intelligence in meta-analysis tools should be emb...

    Competing Interests: None declared.