- Bias

Exponential Bias Automated

The first rule of any technology used in a business is that automation applied to an efficient operation will magnify the efficiency. The second is that automation applied to an inefficient operation will magnify the inefficiency. - Bill Gates
“Data is never this raw, truthful input and never neutral. It is information that has been collected in certain ways by certain actors and institutions for certain reasons.” - Catherine D’Ignazio, Assistant Professor at Massachusetts Institute of Technology (MIT)

What is Bias?

“the degree to which a reference value deviates from the truth”
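This definition can be made concrete: an estimator's bias is the gap between the average of its estimates and the true (reference) value. A minimal sketch with illustrative numbers (the blood-pressure scenario is hypothetical):

```python
# Bias of an estimator: the systematic gap between the average
# estimate and the true (reference) value.
true_value = 120.0  # illustrative ground truth (e.g., true systolic BP)

# Repeated measurements from a miscalibrated instrument or model
measurements = [124.1, 123.8, 124.5, 123.9, 124.2]

mean_estimate = sum(measurements) / len(measurements)
bias = mean_estimate - true_value

print(f"mean estimate: {mean_estimate:.2f}")
print(f"bias: {bias:.2f}")  # positive bias: systematic over-estimation
```

Note that bias is a property of the procedure, not of any single measurement: individual readings scatter, but the average drifts away from the truth in one direction.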

Vicious Cycle - Automating Bias


71% of the YouTube videos that volunteers reported as regrettable, including COVID-19 misinformation, were recommended by the platform's own algorithm

A crowdsourced investigation into YouTube's recommendation algorithm

"When it's actively suggesting that people watch content that violates YouTube's policies, the algorithm seems to be working at odds with the platform's stated aims, their own community guidelines, and the goal of making the platform a safe place for people." - Brandi Geurkink, Mozilla's Senior Manager of Advocacy

Dissecting racial bias in an algorithm used to manage the health of populations

A health care algorithm affecting millions is biased against black patients
A health care algorithm makes black patients substantially less likely than their white counterparts to receive important medical treatment - @colinlecher
We show that a widely used algorithm, typical of this industry-wide approach and affecting millions of patients, exhibits significant racial bias.
Remedying this disparity would increase the percentage of Black patients receiving additional help from 17.7% to 46.5%.
The bias arises because the algorithm predicts health care costs rather than illness.
Unequal access to care means that we spend less money caring for Black patients than for White patients.
Thus, despite health care cost appearing to be an effective proxy for health by some measures of predictive accuracy, large racial biases arise.
Cost Is A Reasonable Proxy For Health, But It’s A Biased One
We suggest that the choice of convenient, seemingly effective proxies for ground truth can be an important source of algorithmic bias in many contexts.
We must change the data we feed the algorithm—specifically, the labels we give it. Because labels are the key determinant of both predictive equality and predictive bias, careful choice can allow us to enjoy the benefits of algorithmic predictions while minimizing their risks - Obermeyer

Bias in Sepsis Prediction - Unfortunately Prevalent

The Epic Sepsis Model (ESM), a proprietary sepsis prediction model, is implemented at hundreds of US hospitals. The ESM’s ability to identify patients with sepsis has not been adequately evaluated despite widespread use.
27,697 patients who had 38,455 hospitalizations met inclusion criteria.
Low Sensitivity: Sepsis occurred in 2,552 hospitalizations (7%), yet the ESM failed to identify 1,709 (67%) of these patients; it identified only 183 (7%) of the patients with sepsis who had not received timely antibiotics.
Alert Fatigue (Low Specificity): The ESM falsely flagged 6,971 of all 38,455 hospitalized patients (18%), creating a large burden of alert fatigue.
Conclusions and Relevance: This external validation cohort study suggests that the ESM has poor discrimination and calibration in predicting the onset of sepsis. The widespread adoption of the ESM despite its poor performance raises fundamental concerns about sepsis management on a national level.
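As a sanity check, the study's headline figures can be recomputed directly from the counts quoted above:

```python
# Recomputing the Epic Sepsis Model's headline metrics from the
# counts reported in the external validation study.
hospitalizations = 38_455
sepsis_cases = 2_552        # hospitalizations with sepsis (~7%)
missed_by_esm = 1_709       # sepsis cases the ESM never flagged
false_alerts = 6_971        # flagged patients who never developed sepsis

detected = sepsis_cases - missed_by_esm
sensitivity = detected / sepsis_cases               # ~33%
miss_rate = missed_by_esm / sepsis_cases            # ~67%
false_alert_rate = false_alerts / hospitalizations  # ~18%

print(f"sensitivity:      {sensitivity:.0%}")
print(f"miss rate:        {miss_rate:.0%}")
print(f"false alert rate: {false_alert_rate:.0%}")
```

A model that misses two-thirds of true cases while alerting on nearly a fifth of all admissions fails in both directions at once, which is exactly the combination that breeds alert fatigue.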
The REAL Problem: The algorithm used billing data for sepsis to define which patients had sepsis, not the measure of sepsis that researchers would ordinarily use.
We need to stop using billing, cost, and other proxy data for algorithm development.
We need to independently evaluate and report real-world results before implementing algorithms in medicine.

Double Standard in Medicine

"I don’t know how bad this is yet, but I think we’re going to keep uncovering a bunch of cases where algorithms are biased and possibly doing harm.” - Heather Mattie, Harvard University
A clear double standard in medicine: While health care institutions carefully scrutinize clinical trials, no such process is in place to test algorithms commonly used to guide care for millions of people.

Algorithmic Bias Cheat Sheet

Step 1: Inventory Algorithms

Step 1A: Talk to relevant stakeholders about how and when algorithms are used: Create a list of algorithms within your organization; consider broad definitions of algorithms and ask open-ended questions.
Step 1B: Designate a ‘steward’ to maintain and update the inventory: Choose a person to be responsible for keeping the inventory current, in consultation with a diverse group.

Step 2: Screen for Bias

Step 2A: Articulate the ideal target (what the algorithm should be predicting) vs. the actual target (what it is actually predicting): Consider whether there is a mismatch that can cause bias.
Step 2B: Analyze and interrogate bias: Choose comparison groups (e.g., race), and perform some basic checks of how well the algorithm predicts its actual target. Then, investigate how label choice might create bias in how well the algorithm predicts its ideal target.
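Step 2B can be sketched as a simple group-wise audit: at the same risk score, compare how each group fares on the actual target (e.g., cost) and on a measure of the ideal target (e.g., illness burden). All names and records below are hypothetical:

```python
from statistics import mean

# Hypothetical audit records: each patient has a model risk_score, the
# actual label the model was trained on (cost), and a better measure of
# the ideal target (number of active chronic conditions).
patients = [
    {"group": "A", "risk_score": 0.8, "cost": 9500, "chronic_conditions": 3},
    {"group": "A", "risk_score": 0.8, "cost": 9200, "chronic_conditions": 4},
    {"group": "B", "risk_score": 0.8, "cost": 9400, "chronic_conditions": 6},
    {"group": "B", "risk_score": 0.8, "cost": 9100, "chronic_conditions": 7},
]

def audit(patients, score):
    """At a fixed risk score, summarize actual vs. ideal target by group."""
    at_score = [p for p in patients if p["risk_score"] == score]
    summary = {}
    for g in sorted({p["group"] for p in at_score}):
        rows = [p for p in at_score if p["group"] == g]
        summary[g] = (
            mean(r["cost"] for r in rows),
            mean(r["chronic_conditions"] for r in rows),
        )
        print(f"group {g}: mean cost {summary[g][0]:.0f}, "
              f"mean chronic conditions {summary[g][1]:.1f}")
    return summary

summary = audit(patients, 0.8)
# Similar costs but very different illness burden at the same risk score
# is the signature of label-choice bias (Obermeyer et al.).
```

In a real audit the comparison would be run across the full range of risk scores with properly adjudicated clinical outcomes, but the shape of the check is the same.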

Step 3: Retrain Biased Algorithms (or Throw Them Out)

Step 3A: Try retraining the model on a label closer to the ideal target: Assess possible mitigations to label-choice bias by comparing results between different labels.
Step 3B: Consider alternative options (if necessary): If you are unable to improve or retrain the algorithm, consider other possible solutions. If data is the problem, such as a non-representative dataset or no variables that match the ideal target, consider collecting new data.
Step 3C: Consider suspending or discontinuing the use of the algorithm (if necessary): If you are unable to improve the algorithm and/or its inputs, pause the use of the algorithm until you find a solution — or discontinue its use altogether.
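Step 3A can be sketched by applying the same screening rule under two candidate labels and comparing who gets flagged; a simple threshold stands in for the model, and all data and thresholds below are hypothetical:

```python
# Hypothetical: the same screening rule applied under two candidate
# labels -- cost (a biased proxy) vs. chronic conditions (closer to
# the ideal target of illness burden).
patients = [
    {"group": "A", "cost": 9800, "chronic_conditions": 2},
    {"group": "A", "cost": 7200, "chronic_conditions": 1},
    {"group": "B", "cost": 6900, "chronic_conditions": 5},
    {"group": "B", "cost": 6400, "chronic_conditions": 4},
]

def flagged_for_help(patients, label, threshold):
    """Select patients whose label value exceeds the threshold."""
    return [p for p in patients if p[label] > threshold]

by_cost = flagged_for_help(patients, "cost", 7000)
by_illness = flagged_for_help(patients, "chronic_conditions", 3)

print("flagged by cost label:   ", [p["group"] for p in by_cost])
print("flagged by illness label:", [p["group"] for p in by_illness])
# Swapping the label changes which groups receive extra help --
# compare selection rates by group before deploying either label.
```

The point of the exercise is that the model architecture never changed; only the label did, and that alone redirected the intervention.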

Step 4: Set Up Structures to Prevent Future Bias

Step 4A: Implement best practices for organizations working with algorithms: Under the aegis of the steward and a diverse team, conduct recurring audits and ensure rigorous documentation of current and future models.

Further Reading

A Living Pocketbook of Neurology
Applied, Concise, Practical, Up-to-date, Mobile-friendly, peer-reviewed & Open-Access Pocketbook of Neurology and related clinical specialties
Unleash the Digital Healer in You! >> 🌐 https://junaidkalia.com <<
Learn Digital Health, AI-in-Healthcare, ValueBased Care & More!
🎯 Follow Junaid Kalia MD [Editor-in-Chief]
"If anyone saved a life, it would be as if one saved the life of all mankind"

An AINeuroCare Academy Project

Coaching Clinicians in Digital Healthcare

💡 Learn

🎥 YouTube
🎓 Academy
📝 Blog

👨‍⚕️ Connect

🕊️ Twitter

Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License

Disclaimer: For educational purposes only. You must NOT rely on the information on this website as an alternative to medical advice from your healthcare provider.