By Jonathan A. Walter, UC Davis Center for Watershed Sciences
Research results can depend not only on the data themselves, but on how they are analyzed.1 This is importantly different from the familiar problem of stakeholders with different interests interpreting the same results differently, perhaps (consciously or subconsciously) motivated by their stake in the outcome.
A recent study, “Same data, different analysts: variation in effect sizes due to analytical decisions in ecology and evolutionary biology,” highlights concerns about how we draw conclusions from scientific studies and how science can inform policy. “Many-analyst” studies like this one aim to investigate whether scientific results are reproducible, and what factors influence their reproducibility. When given the same data and asked to answer the same question, researchers’ choices (e.g., what method to use, how to filter and aggregate data) can produce substantially different qualitative conclusions (e.g., the direction of the effect of a variable x on another variable y) and quantitative conclusions (e.g., the numerical estimate of the effect). Analysts’ choices about study design and how to conduct an analysis have been dubbed “researcher degrees of freedom,” by analogy with the statistical concept of degrees of freedom, which influences the conclusions drawn from statistical tests.2 Moreover, peer review, a primary mechanism of quality assurance for scientific studies, did not reliably identify aberrant results. Similar studies in other fields report similar findings, so these implications seem likely to extend to many fields.

What do researcher degrees of freedom mean for how science informs policy? Even as studies like this reveal its foibles—science is, after all, a human enterprise—science remains critical for understanding the inner workings of our world. Though policymakers must also grapple with other concerns, I assume3 that scientific understanding of how things work is valuable for crafting policy.
A central implication is that answers supported by multiple investigations might be more trustworthy. In the study above, despite the wide variability from analyst decisions, most analyses converged toward a similar answer. This suggests that consistency across independent investigations—often employing different methods of data collection and analysis—should strengthen confidence, offering a roadmap for overcoming uncertainty.
Coherence across multiple lines of evidence strengthens scientific conclusions with which to guide policy, but pervasive incentives can work against developing these multiple lines. Scientists receive their greatest professional rewards for new discoveries, not replicating or confirming earlier findings—who, for example, wants to fund studies to rediscover things we (think we) know already? However, the value of multiple studies for strengthening certainty suggests that scientists ought to revisit earlier findings, and be supported for doing so, more often.
Although multiple studies pointing toward the same answer should generally increase confidence in that answer, we should at the same time be cautious of the potential for long-entrenched ideas and ways of conducting research to impede accurate understanding4. Bluntly put, the majority answer isn’t the right answer if based on a shared error.
Collaborative synthesis science is one way to strengthen consensus and to understand the roots of disparities between different studies and approaches, leading to more robust science. In the realm of California water, contemporary models of collaborative synthesis include the CVPIA Science Integration Team and subgroups, Interagency Ecological Program Project Work Teams, and working groups at the National Center for Ecological Analysis and Synthesis. At its best, this approach brings together cooperative teams with diverse perspectives and expertise to achieve highly innovative solutions to research problems. By integrating different domains of knowledge and research approaches, these solutions are also generally beyond the abilities of a single investigator. Continuing and expanding support for collaborative synthesis science may be critical to solving some of our thorniest water problems. But collaborative synthesis science is not without challenges. Openness around data, software, and other intellectual property is essential, but establishing needed trust can be challenging when group members have different interests. Additionally, participation in synthesis groups has often been an unfunded mandate, restricting the contributions of individual researchers and limiting the scope and pace of the groups’ work. Improving support for participation in synthesis groups5 could yield faster progress toward more ambitious objectives and enable broader participation.
For problems where we expect the scientific or political nature of the question to yield calls for multiple studies—sometimes sponsored by interested parties—with divergent results, it seems desirable to establish protocols for digesting (and perhaps drawing conclusions from) multiple studies. Simple vote-counting approaches (e.g., 7 studies say this, 3 say that, so what the 7 say must be right) seem inadequate; for example, if those 7 studies all come from the same research group or use the same approach, perhaps they should not count the same as 7 studies by different groups using distinct approaches. This is a challenging problem needing further thought.
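The concern about non-independent studies can be made concrete with a toy calculation. The numbers below are hypothetical, and dividing each study’s vote by the size of its methodological cluster is just one of many possible weighting schemes, offered only to show how a naive tally and an independence-aware tally can disagree:

```python
# Illustrative sketch (hypothetical data): why a naive vote count can
# mislead when studies are not independent. Each study reports a
# direction of effect ("up"/"down") and the research approach it used.
from collections import Counter

studies = [
    # 7 studies find "up", but all share the same approach (A)
    *[{"direction": "up", "approach": "A"} for _ in range(7)],
    # 3 studies find "down", each using a distinct approach
    {"direction": "down", "approach": "B"},
    {"direction": "down", "approach": "C"},
    {"direction": "down", "approach": "D"},
]

# Naive vote count: one study, one vote.
naive = Counter(s["direction"] for s in studies)

# Down-weighted count: each study's vote is divided by the number of
# studies sharing its approach, so a cluster of near-duplicate
# analyses counts roughly like one independent line of evidence.
approach_sizes = Counter(s["approach"] for s in studies)
weighted = Counter()
for s in studies:
    weighted[s["direction"]] += 1 / approach_sizes[s["approach"]]

print(naive)     # naive tally: "up" wins 7 votes to 3
print(weighted)  # weighted tally: "down" wins 3.0 to 1.0
```

Under the naive count, the seven methodologically identical studies carry the day; under the down-weighted count, they collapse to roughly one line of evidence and the three independent approaches prevail. Real evidence synthesis would of course need a principled way to judge how independent studies actually are.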
While it is concerning that seemingly innocuous choices in the scientific process can have important bearing on research results, by recognizing this and compensating for it—for example by seeking coherent and confirmatory results, including through collaborative synthesis science—we can achieve robust scientific evidence to support wise and sound policy.
Footnotes
1Possibly this is unsurprising to those familiar with the line, popularized by Mark Twain, “There are three kinds of lies: lies, damned lies, and statistics.” But since nearly every result in modern ecology and environmental science is underpinned by statistical analysis, it’s impractical to be so nihilistic.
2Degrees of freedom are the number of independent pieces of information that were included in the calculation of an estimate. For further reading, see https://medium.com/@dlectus/degrees-of-freedom-simply-explained-a96cafa3b39f.
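As a toy numeric illustration of this definition (hypothetical numbers): once the mean of n observations is fixed, only n - 1 of the deviations from it can vary freely, which is why the sample variance divides by n - 1 rather than n:

```python
# Toy illustration: with the mean fixed, the last deviation is
# determined by the others, leaving n - 1 "free" pieces of information.
data = [2.0, 4.0, 6.0, 8.0, 10.0]
n = len(data)
mean = sum(data) / n                 # 6.0
devs = [x - mean for x in data]
# The deviations always sum to zero, so knowing any n - 1 of them
# pins down the last one: the estimate has n - 1 degrees of freedom.
assert abs(sum(devs)) < 1e-9
sample_variance = sum(d * d for d in devs) / (n - 1)  # divide by df
print(sample_variance)  # 10.0
```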
3Perhaps somewhat utopian, but hopefully uncontroversial. Maybe naïve.
4Relevant here is the quip attributed to the physicist Max Planck that “science advances one funeral at a time,” meaning that new ideas often gain acceptance not by convincing the old guard, but by waiting for them to die and a new generation to embrace new concepts. On a humorous note, I knew a geology professor who went to his grave in 2013 refusing to accept plate tectonics. The origin of modern plate tectonics is attributed to Alfred Wegener’s 1912 theory of continental drift; plate tectonics became widely accepted as scientific fact only in the 1960s.
5Not without its challenges, including the different funding and employment models at universities, government agencies, consultancies, and non-profit organizations, all of which have seats at the table in California water issues.
Jonathan Walter is a Senior Researcher and quantitative ecologist at the Center for Watershed Sciences, who works on issues relating to the stability and resilience of aquatic ecosystems and organisms.
Further Reading
Gould, E., Fraser, H.S., Parker, T.H. et al. (2025) Same data, different analysts: variation in effect sizes due to analytical decisions in ecology and evolutionary biology. BMC Biol 23, 35. https://doi.org/10.1186/s12915-024-02101-x.
Dementyey, A. (2021) Degrees of freedom (explained simply). Medium. https://medium.com/@dlectus/degrees-of-freedom-simply-explained-a96cafa3b39f.
Fox, J. (2024). Quantifying “researcher degrees of freedom” in ecology: comments on the new(ish) pre-print by Gould et al. Dynamic Ecology. https://dynamicecology.wordpress.com/2024/02/14/quantifying-researcher-degrees-of-freedom-in-ecology-comments-on-the-newish-preprint-by-gould-et-al/.
Wikipedia (2025) Replication Crisis. https://en.wikipedia.org/?curid=44984325.