Policy Forum | Science and Society

Building an evidence base for stakeholder engagement


Science  10 Aug 2018:
Vol. 361, Issue 6402, pp. 554-556
DOI: 10.1126/science.aat8429

eLetters is an online forum for ongoing peer review. Submission of eLetters is open to all. eLetters are not edited, proofread, or indexed. Please read our Terms of Service before submitting your own eLetter.


  • Cumulative evidence for stakeholder engagement
    • Thomas Dietz, Professor, Michigan State University
    • Other Contributors:
      • Paul C Stern, President, Social and Environmental Research Institute

    Lavery argues that science grounded in a solid evidential base for community and stakeholder engagement (CSE) will have improved quality and ethical grounding and greater relevance to the needs of interested and affected parties (1). CSE is important, inter alia, when communities are the subjects of research, when their cooperation is needed to conduct research, and when the results of research affect them. As Lavery notes, fragmentation of existing literatures constrains the cumulative knowledge required for effective CSE, with research in one domain of CSE, e.g. public health interventions, not integrated with research in other domains.

    Cumulative understanding can benefit from long traditions of research identifying and testing design principles for CSE around risk and environmental assessments, management of common-pool resources, and the introduction of emerging technologies (2-4). A decade ago, the U.S. National Academies reviewed ~1000 studies of environmental public participation and concluded that "When done well, public participation improves the quality and legitimacy of decisions and builds the capacity of all involved to engage in the policy process" (5). The report identified eleven diagnostic questions and fifteen design recommendations to guide CSE and serve as hypotheses for further research. It recommended linking scientific analysis with public deliberation in iterated, co-designed processes drawing on multiple forms of expertise including i...

    Competing Interests: None declared.
  • Evaluation criteria must be defined before funding to build an evidence base

    The article "Building an evidence base for stakeholder engagement" emphasizes the importance of substantive community and stakeholder engagement (CSE) (1). Evaluation criteria play a key role in verifying such evidence: based on the evidence provided, we must determine whether a funded project has succeeded. These criteria must therefore be clearly defined before a new project is funded. Ambiguous evaluation criteria degrade the quality of evaluation, which ultimately damages the reputations of funding organizations, and the greater the loss of reputation, the fewer the donations. Many funding organizations, including NIH, have used impact-factor-based methods to evaluate funded projects. However, editorials note that reviews receive more citations than original research papers (2,3,4,5), and review authors may receive credit for discoveries to which they contributed nothing (5). This means the impact factor is neither suitable nor fair for evaluating funded projects (2). We need fair evaluation criteria to build an evidence base.

    References:
    1. J. V. Lavery, "Building an evidence base for stakeholder engagement," Science 361, 554-556 (2018).
    2. Y. Takefuji,...

    Competing Interests: None declared.
