Lies, Damned Lies, and Statistics

15. August 2024


«Lies, damned lies, and statistics» is a phrase describing the persuasive power of statistics to bolster weak arguments. 

It is also sometimes invoked to cast doubt on the statistics an opponent uses to prove a point.

Last night I watched a startup pitch. The founders presented a slide with statistics showing the effectiveness of their solution to a particular problem, and the first thing that came to my mind was exactly this phrase.

In statistics, there are several techniques (sometimes referred to as «tricks») that can be used to manipulate data or present results in a way that supports a particular point of view. 

While these methods can be used for legitimate analysis, they can also be misused to mislead or deceive.  

When you validate a business case or an investment opportunity, you should be aware of these tricks, which is why I have collected the most common ones for you.

1. Cherry-Picking Data

Selecting only the data that supports a particular conclusion while ignoring data that contradicts it.

Example: A study might report only the time periods where a particular stock performed well, ignoring periods of poor performance.
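A toy Python sketch (with made-up returns) shows why cherry-picking always flatters: if you scan every 12-month window of a stock's history and report only the best one, that window can never look worse than the full record.

```python
import random

random.seed(42)

# Hypothetical monthly returns for a stock over 5 years (pure toy data).
returns = [random.gauss(0.0, 0.05) for _ in range(60)]

full_avg = sum(returns) / len(returns)

# Cherry-pick: scan every 12-month window and keep only the best one.
best_window = max(
    sum(returns[i:i + 12]) / 12 for i in range(len(returns) - 11)
)

print(f"Full-period average monthly return: {full_avg:+.2%}")
print(f"Best cherry-picked 12-month window: {best_window:+.2%}")
# The best window is mathematically guaranteed to beat the full average.
assert best_window >= full_avg
```

The guarantee holds because the non-overlapping windows together cover the whole period, so their averages average out to the full-period figure, and the maximum over all windows can only sit above that.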

2. P-Hacking

Manipulating data or testing multiple hypotheses until a statistically significant result is found, often by increasing the number of tests without proper correction.

Example: Running many different statistical tests on a dataset and only reporting the ones that give a p-value below 0.05.
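A minimal simulation makes the mechanism concrete: test 200 "hypotheses" on a perfectly fair coin, where every null hypothesis is true by construction, and roughly 5% of the tests will still come out "significant" at p < 0.05. The exact binomial p-value is computed from scratch with the standard library.

```python
import math
import random

def binom_two_sided_p(heads, n, p=0.5):
    """Exact two-sided binomial p-value: total probability of all
    outcomes at least as unlikely as the observed head count."""
    probs = [math.comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1)]
    observed = probs[heads]
    return sum(pr for pr in probs if pr <= observed + 1e-12)

random.seed(1)
n_flips, n_tests, alpha = 100, 200, 0.05

# 200 experiments on a fair coin -- every null hypothesis is true.
p_values = []
for _ in range(n_tests):
    heads = sum(random.random() < 0.5 for _ in range(n_flips))
    p_values.append(binom_two_sided_p(heads, n_flips))

false_positives = sum(p < alpha for p in p_values)
print(f"{false_positives} of {n_tests} tests 'significant' at p < {alpha}")

# A Bonferroni correction (alpha / number of tests) removes almost all of them.
corrected = sum(p < alpha / n_tests for p in p_values)
print(f"{corrected} survive the Bonferroni correction")
```

This is exactly why multiple-testing corrections exist: without them, running enough tests guarantees a "discovery".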

3. Misleading Graphs

Presenting data in a graph with a misleading scale, axis manipulation, or selective data points to exaggerate or downplay trends.

Example: Using a y-axis that starts at a non-zero value to exaggerate differences between groups.
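No plotting library is needed to see the distortion; the arithmetic alone shows it. With made-up values of 98 and 100, a zero-based axis draws the bars about 2% apart, while an axis starting at 95 makes the taller bar appear 67% taller.

```python
group_a, group_b = 98.0, 100.0   # the real values differ by about 2%

def visual_ratio(a, b, axis_start):
    """How many times taller bar b is drawn than bar a
    when the y-axis starts at axis_start instead of zero."""
    return (b - axis_start) / (a - axis_start)

honest = visual_ratio(group_a, group_b, axis_start=0)
truncated = visual_ratio(group_a, group_b, axis_start=95)

print(f"Axis starting at 0:  bar B looks {honest:.2f}x as tall as bar A")
print(f"Axis starting at 95: bar B looks {truncated:.2f}x as tall as bar A")
```

The data never changed; only the axis did. That is the entire trick.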

4. Overgeneralization

Drawing broad conclusions from a small or unrepresentative sample.

Example: Conducting a survey in one city and generalizing the results to the entire country.

5. Omitting the Baseline

Failing to provide a baseline or control group for comparison, making the results seem more significant than they are.

Example: Reporting that a treatment led to a 50% improvement without mentioning that a placebo led to a 45% improvement.

6. Selective Reporting of Outcomes

Reporting only positive outcomes while ignoring negative or neutral results.

Example: A drug trial that only reports the successful outcomes while ignoring cases where the drug had no effect or caused harm.

7. Data Dredging

Analyzing large volumes of data in search of any statistically significant relationship, often without a prior hypothesis.

Example: Examining multiple variables in a dataset until any two variables show a correlation, then presenting this as meaningful without further validation.
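A short sketch shows how easy the "discovery" is: generate 30 variables of pure noise, scan all 435 pairs, and report the strongest correlation. With no real relationship anywhere in the data, the winning pair still looks impressively correlated.

```python
import random
from itertools import combinations

def pearson(x, y):
    """Pearson correlation coefficient, computed from scratch."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

random.seed(7)
# 30 variables of pure noise, 25 observations each -- no real relationships.
data = [[random.gauss(0, 1) for _ in range(25)] for _ in range(30)]

# Dredge: scan all 435 pairs and keep only the strongest correlation.
best = max(abs(pearson(a, b)) for a, b in combinations(data, 2))
print(f"Strongest 'discovered' correlation in pure noise: |r| = {best:.2f}")
```

Presented on its own, that single pair would look like a finding; presented honestly as the best of 435 attempts, it is obviously noise.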

8. Ignoring Confounding Variables

Failing to account for variables that could influence the results, leading to spurious conclusions.

Example: Claiming that ice cream sales cause drowning deaths without accounting for the confounding variable of temperature (both increase during summer).
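The ice cream example can be simulated directly. In this toy model both series are driven by temperature alone, yet their raw correlation is strong; once the linear dependence on temperature is removed from each series (a simple partial-correlation sketch via residuals), the relationship essentially vanishes.

```python
import random

def pearson(x, y):
    """Pearson correlation coefficient, computed from scratch."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

def residuals(y, x):
    """What is left of y after removing its linear dependence on x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    beta = (sum((a - mx) * (b - my) for a, b in zip(x, y))
            / sum((a - mx) ** 2 for a in x))
    return [b - (my + beta * (a - mx)) for a, b in zip(x, y)]

random.seed(3)
n = 500
temperature = [random.gauss(20, 8) for _ in range(n)]
# Both series depend on temperature alone -- neither causes the other.
ice_cream_sales = [2.0 * t + random.gauss(0, 5) for t in temperature]
drownings = [0.5 * t + random.gauss(0, 3) for t in temperature]

raw = pearson(ice_cream_sales, drownings)
partial = pearson(residuals(ice_cream_sales, temperature),
                  residuals(drownings, temperature))
print(f"Raw correlation:             r = {raw:.2f}")
print(f"Controlling for temperature: r = {partial:.2f}")
```

The raw number would support a headline; the controlled number tells the truth.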

9. Manipulating the Sample Size

Choosing a sample size that is either too small to detect a real effect or so large that even trivial effects become statistically significant.

Example: Conducting a survey with only a few participants and claiming the results are representative of the entire population.

10. Misinterpreting Statistical Significance

Confusing statistical significance with practical significance or misrepresenting what a p-value actually indicates.

Example: Claiming that a treatment is effective based on a p-value below 0.05 without discussing the actual effect size or its practical implications.
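A worked example with made-up numbers: a difference of 0.05 points on a scale with a standard deviation of 15 is practically meaningless, yet with a million participants per group it is comfortably "statistically significant". The two-sample z-test is computed by hand from the normal approximation.

```python
import math

# Two groups with a tiny real difference but an enormous sample.
n = 1_000_000
mean_a, mean_b = 100.00, 100.05   # a 0.05-point difference
sd = 15.0

# Two-sample z statistic and two-sided p-value (normal approximation).
se = math.sqrt(sd**2 / n + sd**2 / n)
z = (mean_b - mean_a) / se
p = math.erfc(z / math.sqrt(2))   # two-sided tail probability

cohens_d = (mean_b - mean_a) / sd   # standardized effect size
print(f"p-value   = {p:.4f}   (statistically significant)")
print(f"Cohen's d = {cohens_d:.4f}   (a negligible effect in practice)")
```

The p-value answers "is the effect exactly zero?", not "does the effect matter?" — with enough data, even a trivial effect clears the significance bar.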

11. Simpson’s Paradox

Aggregating data without considering subgroups, which can lead to contradictory conclusions when the data is disaggregated.

Example: A treatment might seem effective in the overall population but ineffective or even harmful when broken down by specific demographic groups.
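The paradox is easiest to believe when you run the numbers. In this illustrative dataset (counts patterned on the well-known kidney-stone example), treatment A wins inside every subgroup yet loses in the aggregate, because A was given mostly to the harder cases.

```python
# Illustrative counts: (successes, total) per treatment and subgroup.
groups = {
    "small stones": {"A": (81, 87),   "B": (234, 270)},
    "large stones": {"A": (192, 263), "B": (55, 80)},
}

def rate(successes, total):
    return successes / total

for name, g in groups.items():
    ra, rb = rate(*g["A"]), rate(*g["B"])
    print(f"{name}: A = {ra:.0%}, B = {rb:.0%}")
    assert ra > rb                    # A is better in each subgroup...

tot_a = [sum(x) for x in zip(*(g["A"] for g in groups.values()))]
tot_b = [sum(x) for x in zip(*(g["B"] for g in groups.values()))]
print(f"overall: A = {rate(*tot_a):.0%}, B = {rate(*tot_b):.0%}")
assert rate(*tot_a) < rate(*tot_b)    # ...yet worse in the aggregate.
```

Whoever controls the level of aggregation controls the conclusion, which is why the subgroup breakdown always deserves a look.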

12. Non-Comparative Metrics

Presenting data without proper context, such as not comparing it to a relevant benchmark.

Example: Reporting that a company’s profits increased by 20% without mentioning that its competitors’ profits increased by 50%.

13. Double Dipping

Using the same data twice in a way that inflates the significance of the findings.

Example: Reporting an outcome as both a primary and secondary result, thus artificially increasing the perceived importance of the data.

14. Using Relative vs. Absolute Risk

Emphasizing relative risk instead of absolute risk to make a finding seem more significant.

Example: Saying a drug reduces the risk of disease by 50% (relative risk) when the absolute risk reduction is from 2% to 1%.
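The same (hypothetical) trial result, framed three ways: the relative figure sounds dramatic, the absolute figure is modest, and the number needed to treat makes the practical cost explicit.

```python
# The same hypothetical result, framed three ways.
risk_control = 0.02   # 2% get the disease without the drug
risk_treated = 0.01   # 1% get it with the drug

relative_risk_reduction = (risk_control - risk_treated) / risk_control
absolute_risk_reduction = risk_control - risk_treated
number_needed_to_treat = 1 / absolute_risk_reduction

print(f"Relative risk reduction: {relative_risk_reduction:.0%}")  # "50% lower risk!"
print(f"Absolute risk reduction: {absolute_risk_reduction:.1%}")  # 1 percentage point
print(f"Patients treated per case prevented: {number_needed_to_treat:.0f}")
```

"Cuts your risk in half" and "100 people must take the drug to prevent one case" describe the identical data; only the framing differs.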

These techniques can be powerful when used correctly, but they can also be deceptive if not used with care and transparency. 

Ethical statistical practice involves full disclosure of methods, careful interpretation of results, and avoiding the intentional misuse of these tricks.

 