This response was generated by ChatGPT (o3) using the provided sources.

How do you distinguish good science from propaganda?

Good science and propaganda can both wear the clothes of expertise, cite “studies,” and invoke authority. Yet they differ profoundly in motive, method, and the way results are communicated. The tools for telling them apart are not technical training so much as a set of practical questions that anyone can ask.

  1. Clarify the Goals
    • Science seeks to discover how the world works; its conclusions are always provisional and open to revision.
    • Propaganda seeks to persuade or mobilize; the message is fixed in advance, and evidence is selected, or invented, to serve it. McKenna calls this manipulative mode “Big-P” Propaganda, aimed at shaping opinion, and distinguishes it from ordinary persuasive communication that still leaves room for doubt and debate [2].

  2. Examine Method, Not Message
    a. Falsifiability. If a claim cannot in principle be proven wrong, it is outside science [5].
    b. Replication. Independent researchers must be able to reproduce the result. Attempts to reproduce the Stanford Prison Experiment failed and helped expose how heavily it had been staged [3]; the broader wave of large-scale replication failures is itself a warning sign about how results were produced [7].
    c. Transparency. Good science shares data, code, and protocols so critics can scrutinize them. Troublingly, internal NIH documents stressed finding “trusted messengers” rather than releasing more data on COVID studies—an inversion of transparency [1].

  3. Look for the Norms of Science (CUDOS)
    • Communalism—data are communal property
    • Universalism—claims are judged independently of the speaker
    • Disinterestedness—authors gain no special benefit from a particular outcome
    • Organized Skepticism—critical peer review is welcomed [6]

    Propaganda violates these norms: it is proprietary, identity-based (“believe the scientist”), interested (political or financial), and hostile to dissent.

  4. Watch the Rhetoric
    • Certainty and Moralization. Statements framed as “the science is settled” on complex, evolving topics signal politics, not empiricism. Lysenko denounced geneticists as saboteurs and promised bumper crops—certainty masking ideological coercion [4].
    • Emotional Appeals. Heavy reliance on fear, disgust, or virtue to carry the argument, rather than statistics and confidence intervals, points to propaganda.
    • Cherry-picking. Are dissenting data mentioned and addressed, or ignored?

  5. Follow the Incentives
    • Conflicts of interest (funding, political backing, career stakes) do not automatically invalidate a study, but undisclosed or denied conflicts are warning signs.
    • Institutional pressure. Soviet agronomists conformed to Lysenkoism because dissent meant prison [4]; psychologists celebrated and taught the Stanford Prison Experiment for decades partly because its dramatic story was so citable [3]. An atmosphere in which careers hinge on affirming a narrative breeds propaganda.

  6. Look for Independent Corroboration
    • Multiple lines of evidence (epidemiological, mechanistic, randomized) that converge strengthen reliability.
    • Diverse authorship—different labs, funding sources, and countries—reduces the risk of coordinated messaging.

  7. Heuristic Checklist for Readers (a minimal scoring sketch follows the list)
    1. Is the primary literature accessible, or are you only offered press releases?
    2. Are the methods and raw data posted?
    3. Have other groups replicated the result?
    4. Does the author acknowledge limitations and uncertainty?
    5. Is disagreement allowed without personal attacks?
    6. Who benefits if the claim is accepted?
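
    The checklist above can also be applied mechanically. The sketch below is a minimal illustration, not drawn from the sources: the class name, field names, and example values are hypothetical, and it simply records each question as a yes/no answer and counts the items that fail.

```python
# Minimal sketch: encode the reader checklist as yes/no answers and count
# the items that fail. Field names and the example values are hypothetical
# illustrations, not an established or validated instrument.

from dataclasses import dataclass


@dataclass
class ClaimChecklist:
    primary_literature_accessible: bool  # 1. primary papers, not just press releases
    methods_and_data_posted: bool        # 2. methods and raw data are available
    independently_replicated: bool       # 3. other groups reproduced the result
    limitations_acknowledged: bool       # 4. uncertainty and caveats are stated
    dissent_tolerated: bool              # 5. disagreement handled without personal attacks
    beneficiaries_disclosed: bool        # 6. it is clear who benefits if the claim is accepted

    def red_flags(self) -> list[str]:
        """Names of the checklist items that failed."""
        return [name for name, passed in vars(self).items() if not passed]


# Example: a claim known only from press releases, with no posted data
# and no independent replication.
claim = ClaimChecklist(
    primary_literature_accessible=False,
    methods_and_data_posted=False,
    independently_replicated=False,
    limitations_acknowledged=True,
    dissent_tolerated=True,
    beneficiaries_disclosed=False,
)

flags = claim.red_flags()
print(f"{len(flags)} of 6 checks failed: {', '.join(flags)}")
```

    The count is only a prompt; the value of the exercise is being forced to answer each question explicitly before accepting a claim.
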
  8. Case-Study Contrast
    • Good Science: Psychology’s replication crisis could be diagnosed and confronted because data and protocols were (sometimes) available; open criticism then drove reforms such as preregistration and larger samples [7].
    • Propaganda Masquerading as Science: Lysenko’s agronomy banned Mendelian genetics, suppressed critics, and reported imaginary yield increases, leading to crop failures [4]. The goal was political loyalty, not agronomic truth.

  9. Practical Bottom Line
    A single heuristic (“trust experts” or “trust contrarians”) is unreliable. Instead, cultivate procedural trust: trust the practices—open data, falsifiable hypotheses, peer scrutiny, and tolerance for being proven wrong. When those practices are absent and the communication leans on authority, certainty, and emotional urgency, you are probably looking at propaganda instead of science.

Sources
[1] Unsafe Science. “NIH Staff Revolt Promotes Propaganda.” The author argues that internal NIH messaging emphasized finding “trusted messengers” rather than strengthening evidence transparency. https://unsafescience.substack.com/p/nih-staff-revolt-promotes-propaganda

[2] RBN McKenna. “Two Ways of Thinking About Propaganda.” Distinguishes neutral persuasion from manipulative “Big-P” Propaganda and notes warning signs. https://rbnmckenna86.substack.com/p/two-ways-of-thinking-about-propaganda

[3] Vox. “The Stanford Prison Experiment Was Massively Influential. We Just Learned It Was a Fraud.” Explains missing data, theatrical coaching, and the inability to replicate—hallmarks of bad science edging into propaganda. https://www.vox.com/2018/6/13/17449118/stanford-prison-experiment-fraud-psychology-replication

[4] Encyclopedia.com. “The Disastrous Effects of Lysenkoism on Soviet Agriculture.” Illustrates how ideological propaganda dressed as science devastated Soviet crops. https://www.encyclopedia.com/science/encyclopedias-almanacs-transcripts-and-maps/disastrous-effects-lysenkoism-soviet-agriculture

[5] Popper, Karl. The Logic of Scientific Discovery (1959). Introduces falsifiability as the demarcation criterion between science and non-science.

[6] Merton, Robert K. “The Normative Structure of Science” (1942), reprinted in The Sociology of Science (1973). Defines CUDOS norms that good science follows.

[7] National Academies of Sciences, Engineering, and Medicine. Reproducibility and Replicability in Science (2019). Describes replication as a cornerstone of trustworthy research.