In today’s world, information is power; it is decisive. Decisions are shaped by the quality of the information we consume and trust. Accurate data can guide effective action; misinformation can mislead entire communities. Yet the challenge of our time is not scarcity but overload. We scroll through headlines daily, watch debates, forward messages, and consume audiovisual content. But not everything that appears convincing is scientifically valid. Not everything that is loudly repeated is true. And not every number represents evidence.

But how do we tell the difference?

Fact, Opinion, or Evidence

A fact can be verified as true or false through observation, measurement, or reliable sources. “Earth orbits the Sun” is a fact: it holds regardless of personal belief or agreement. An opinion is an interpretation shaped by personal experiences, beliefs, and biases, no matter how passionate or persuasive. Evidence is more demanding. It is not a single data point or dramatic event, but an accumulation of systematic analyses of information, connecting facts into reliable conclusions.

The trouble begins when these distinctions collapse. A single extreme rainfall event is treated as “proof” of climate breakdown; a cooler season becomes “proof” that it is exaggerated. A viral anecdote outweighs years of clinical trials. A graph without context becomes ammunition in ideological battles.

Beyond the Sensational

This is not a new problem. In 1610, Galileo Galilei, drawing on careful telescope observations (the phases of Venus, the movements of Jupiter’s moons, and sunspots), presented compelling evidence for heliocentrism.
Yet his conclusions were dismissed because they challenged prevailing beliefs. This is not a one-off; there have been many cases before and after Galileo in which strong evidence and facts were questioned or rejected, reminding us that the struggle between truth and belief persists today.

During the Covid pandemic, misinformation spread almost as rapidly as the virus itself: unverified claims, miracle cures, conspiracy theories, and misleading statistics flooded social media. Anecdotes were passed off as evidence, and decontextualised videos went viral. When recommendations evolved with new data, it was not inconsistency; it was the scientific method at work. Swinging between alarmism and denial, this infodemic led to vaccine hesitancy and anti-vaccine protests, panic hoarding of food, toilet paper, masks, and sanitisers, and the ingestion of “miracle preventatives” such as bleach, disinfectants, and garlic. It deepened stigma, strained health systems, and eroded trust in science and public health institutions, ultimately worsening outcomes.

Science is cautious. It speaks in probabilities and margins of error. It revises conclusions when new information emerges. But in mass culture, “uncertainty” becomes equivalent to weakness, while “certainty” spreads rapidly, no matter how baseless.

Claims about cow excrement curing disease, peacocks reproducing through tears, vaccines causing autism, or race determining immunity are not fringe; they are entering classrooms, discourse, and even textbooks. Some go further, branding meat-eaters as immoral or linking LGBTQ+ identities with pathology, turning prejudice into “knowledge.”

This is not a harmless distortion. It breeds stigma, fear, and bad decisions, because misinformation doesn’t just misinform; it shapes behaviour. And sometimes, behaviour is the difference between containment and collapse. Unfortunately, the tension between evidence and belief continues to intensify.
Part of the confusion lies in how we consume information. Social media algorithms do not reward evidence; they reward engagement. Simplified narratives outperform complex systems analysis. And in this chaos, we forget that science does not promise simplicity. It promises rigour.

Evidence Matters

When policies on infrastructure, disaster preparedness, climate adaptation, financial investments, or health systems are shaped by perception or opinion rather than proof, the consequences are real. Floods worsened where ecological data was ignored. Health systems were strained where epidemiological warnings were dismissed. Resources are misallocated when short-term narratives override long-term research.

The crisis is not just misinformation, but also a lack of scientific literacy. Scientific temper is about cultivating habits of mind: curiosity, scepticism, logical reasoning, openness to revision, and respect for evidence. It is about understanding not just what science concludes but, more importantly, how it arrives at those conclusions.

It begins with simple practices.

Read beyond headlines and thumbnails. These are often designed to attract attention. Read the full article before forming a judgment.

Examine the source. Credible journalism and scientific reporting cite data, name experts, and link to reports. Anonymous posts, cropped screenshots, and forwarded messages without attribution deserve caution.

Scrutinise the numbers. Without context, a table or graph alone is not evidence.

Ask questions. Can this claim be independently verified? Is this an interpretation or data? What evidence supports it? Is it based on one example or a large body of research?

Notice your reactions. If a piece of content immediately provokes anger, fear, or triumph, pause and be cautious. Strong emotion is not proof.

Check the date and context. Old images, videos, or reports are frequently reshared as current events.
Verify when and where the information originated.

Distinguish anecdotes from trends. One person’s experience does not automatically represent a larger reality.

Finally, cross-verify. Use a search engine freely; important claims are usually reported by multiple credible outlets. And before forwarding, take a moment to reflect.

A historian examining archival sources, a journalist verifying information, a farmer observing seasonal shifts, a voter evaluating campaign promises: all can exercise scientific temper. It is about the method, not the profession. Students can be trained in formal methodology, but critical thinking, scepticism, and evidence-based reasoning are universal skills. In a democracy increasingly driven by data, the ability to question information is not optional. It is essential.

Facts do not become less true when inconvenient. Evidence does not weaken when it is unpopular. Opinion does not transform into truth through repetition.

In an era overflowing with information, perhaps the most radical act is restraint. To be able to pause, search, and think. To be able to differentiate between fact, opinion, and evidence. This may be the most important scientific experiment of our time.

The writers are researchers at WOTR-Centre for Resilience Studies