In theory, scientific studies aim to uncover truths through careful, unbiased methods.

In practice, marketing often sneaks into the process, bending data to highlight what sells, not what’s most accurate.

Flashy statistics and selective storytelling can overshadow important context (like how a study was designed, who funded it, what limitations were ignored, etc.). Instead of receiving clear, evidence-based insights, consumers are frequently fed research-shaped messaging that’s actually engineered to persuade and profit.

In other words…

the “science” you see in ads or product claims may only be a polished fragment of the full picture.

IF YOU ARE AN INFLUENCER: Before you tell your followers to “do their own research,” consider what true research actually entails.

There’s a world of difference between responsibly analyzing peer-reviewed scientific studies and simply browsing marketing-driven content that looks like objective data. Many brands strategically present statistics that highlight their product’s strengths, shaping perception through carefully curated “evidence.”

If you, as an influencer, can’t distinguish between these subtle marketing tactics and genuine scientific inquiry, you risk leading your audience the wrong way. The majority of people don’t have the time, knowledge or resources to dissect a complex study. Telling your audience to “do their own research” may sound empowering, but in reality, it could leave them vulnerable to misinformation and confusion. Instead, consider guiding them toward credible sources or experts who can help interpret the data meaningfully.

How to critically read and understand research

Reading and interpreting research is about more than reading a results section or headline. It involves understanding the study’s design, the methods used and the context in which results are presented.

Here is a small part of what I learned at a research-reading working group:

First, you have to know the types of studies.

Retrospective studies (looking back at something): These look back at past data, searching for patterns and correlations after outcomes have already occurred. While valuable, retrospective studies can be more prone to certain biases because they rely on existing records.

Prospective studies (forward-looking): These follow groups into the future, measuring variables and outcomes over time, which can help establish cause-and-effect relationships more clearly.

Cross-sectional studies: These capture a “snapshot” at a single point in time. They’re useful for understanding prevalence or associations at a given moment, but not for uncovering causal links.

Statistical vs. scientific significance:

A “statistically significant” result means there’s a low probability the finding is due to random chance. Yet statistical significance does not always translate into real-world or “scientific” significance. Just because a treatment group saw a 5% improvement doesn’t mean it’s a game-changer for public health or clinical practice. Context matters: how large is the effect in absolute terms, and is it meaningful in a real-world setting?
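To see this gap concretely, here is a toy sketch (with made-up numbers) of a standard two-proportion z-test: with a large enough sample, even a 1-percentage-point difference between groups clears the conventional significance cutoff (z > 1.96), despite being a modest effect in real-world terms.

```python
import math

def two_proportion_z(p1, n1, p2, n2):
    """Two-proportion z-test: z statistic for the difference p2 - p1."""
    p_pool = (p1 * n1 + p2 * n2) / (n1 + n2)          # pooled proportion
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n1 + 1 / n2))  # standard error
    return (p2 - p1) / se

# Hypothetical trial: 50% vs 51% success, 100,000 people per group.
z = two_proportion_z(0.50, 100_000, 0.51, 100_000)
print(round(z, 2))  # ≈ 4.47, far above 1.96 — "statistically significant"
```

The z statistic is large purely because the sample is huge; whether a 1-point improvement matters clinically is a separate question the p-value cannot answer.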

Absolute vs. relative measures:

Relative measures: Saying something doubles your risk sounds alarming. But if the baseline risk is extremely low – from 1 in 10,000 to 2 in 10,000 – this dramatic-sounding “100% increase” may still be a very minimal actual increase.

Absolute measures: By focusing on the absolute change, you see the true, tangible difference. It’s the difference between being swayed by a marketing claim (“doubles your protection!”) and understanding that the actual benefit might still be quite small.
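The 1-in-10,000 example above can be checked with a few lines of arithmetic (hypothetical numbers, just to illustrate how the two framings diverge):

```python
baseline = 1 / 10_000   # risk without the product: 1 in 10,000
new_risk = 2 / 10_000   # risk "doubled": 2 in 10,000

# Relative change: sounds dramatic in a headline.
relative_increase = (new_risk - baseline) / baseline   # 1.0, i.e. "100% increase!"

# Absolute change: the tangible difference a person actually experiences.
absolute_increase = new_risk - baseline                # one extra case per 10,000

print(f"Relative: {relative_increase:.0%}")
print(f"Absolute: {absolute_increase:.4%}")
```

The same data yields “100%” or “0.01 percentage points” depending on the framing, which is exactly why marketers prefer the relative number.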

And there are countless other details I can’t even begin to explain, because I simply don’t have the necessary expertise.

Back to marketing & science:

Marketers know how to present data in a way that captures attention and creates persuasive narratives, often emphasizing relative changes or eye-catching percentages while downplaying context and limitations. They might highlight selective aspects of a study that support their product, glossing over weaker evidence or contrary results. This doesn’t mean all marketing-related data is deceptive, BUT it does mean you need to maintain a skeptical and informed stance.

Real research involves critical thinking, expertise and careful interpretation – not just glancing at a stat-laden infographic or a single study’s summary. If you’re not trained to evaluate sources for bias, understand methodology and properly interpret results, be cautious in encouraging your followers to “do their own research” independently. A better approach might be to collaborate with qualified experts or direct your audience toward reputable institutions and peer-reviewed publications.