Ethical AI in Healthcare
My Approach to AI Strategy for Healthcare
╔════════════════════════════════════════════════════════════╗
║                A LIFE IN MISMATCHED PASTELS                ║
║     Every color misaligns just enough to make meaning      ║
║        Looks accidental, until you see the pattern         ║
╚════════════════════════════════════════════════════════════╝
FDF2F8 ████████████████████████████████████████████████████████████
FFE0F5 ████████  2022: Lab coat meets unexpected pastels  █████████
D4FFE4 ████████  Where pink met mint met purpose  █████████████████
Name: Cazzy A.
Current Role: Head of Data @ FoXX Health
Background: Quality Control Scientist → Ethical AI
Trajectory: Former lab scientist who traded pipettes for Python
Education:
- MS Data Science (University of Denver)
- BS Integrative Biology & Chemistry (OSU Cascades)
- AI in Healthcare Certificate (Johns Hopkins, 2025)
Mission: Building AI that addresses healthcare inequities for women
Specialty: Pattern discovery in distribution tails & bias detection
Philosophy: Every model must be validated, evidence-based & production-ready
Approach: Effectiveness + Attractiveness + Impact = Excellence
# My ideal palette: Mismatched pastels that shouldn't work but do

I started in a lab coat, where I learned that good science means obsessing over validation and reproducibility. Turns out, those habits translate pretty well to machine learning. I'm here to help teams build ethical & sustainable systems. I like the weird edges of data: tails, drift, the places fairness breaks. I bridge research and production. I say "no" when the data can't support the claim. Bring me the messy dataset you've been avoiding; I'll tell you what the tails are saying, and we'll make it useful together. If you care less about hype and more about calibration curves, we'll get along. I like turning messy data into useful, fair systems: models that explain themselves, pass their audits, and still look good in a dashboard. If you're curious about outliers, tail behavior, and pushing code that doesn't quietly exclude half the population, say hi.
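
Since calibration curves keep coming up, here's the kind of quick check I mean: a minimal sketch with scikit-learn on synthetic data. `y_true` and `y_prob` stand in for whatever held-out labels and predicted probabilities you actually have, and the bin count is just a starting point, not a recommendation.

```python
# Quick calibration check: do predicted probabilities match observed frequencies?
# Synthetic stand-ins; swap y_true / y_prob for your own held-out predictions.
import numpy as np
from sklearn.calibration import calibration_curve

rng = np.random.default_rng(42)
y_prob = rng.uniform(0, 1, size=5_000)          # model scores (synthetic here)
y_true = rng.binomial(1, y_prob * 0.8 + 0.1)    # outcomes from a deliberately skewed truth

frac_pos, mean_pred = calibration_curve(y_true, y_prob, n_bins=10)

for pred, obs in zip(mean_pred, frac_pos):
    gap = obs - pred
    flag = "  <-- look here" if abs(gap) > 0.05 else ""
    print(f"predicted {pred:0.2f} | observed {obs:0.2f} | gap {gap:+0.2f}{flag}")
```

Bins where the observed frequency drifts away from the predicted probability are where I start asking questions.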
D4FFE4 ██████████████████████████████████████████████████████
93C5FD ████████  2019: Data Science emerges  ████████████████
FFCCE5 ████████  30% reduction in errors  ███████████████████
A7F3D0 ██████████████████████████████████████████████████████
FFCCE5 ████████  2024: Lead Data Scientist  █████████████████
93C5FD ████████  Built ML platform from scratch  ████████████
def career_acceleration():
    """
    The gradient mismatches intentionally.
    Knowledge compounds in unexpected colors.
    """
    timeline = {
        "2024_Q1": "Lead Data Scientist",
        "2024_Q2": "Architected frameworks",
        "2024_Q3": "AI Engineer",
        "2025": "Head of Data",
        "gradient": "exponential",
        "palette": "Always mismatched",
    }
    return "Where patterns emerge from chaos"
%%{init: {'theme': 'base'}}%%
graph TD
A[Dataset] -->|r = 0.06| B[Everyone: No Pattern]
A -->|But...| C[Me: Check the Extremes]
C -->|Top 5%| D[r = 0.85!]
C -->|Bottom 5%| E[r = 0.85!]
D --> F[PATTERN HIDDEN IN EXTREMES]
E --> F
F --> G[Being evaluated for drug safety & financial risk]
style A fill:#FFE0F5,stroke:#D4FFE4
style C fill:#D4FFE4,stroke:#93C5FD
style D fill:#93C5FD,stroke:#FFCCE5
style F fill:#A7F3D0,stroke:#E6E0FF
style G fill:#34D399,stroke:#FFE0F5
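The diagram above is the pattern I keep chasing: a near-zero overall correlation that hides strong structure in the extremes. Below is a small, self-contained sketch of that check on synthetic data; the 5% cutoffs, the variable names, and the way the signal is planted in the tails are all illustrative, not the actual drug-safety or financial-risk analysis.

```python
# Overall correlation vs. correlation inside the tails (synthetic demonstration).
# The signal is deliberately constructed to live only in the extremes of x.
import numpy as np

rng = np.random.default_rng(7)
n = 50_000
x = rng.normal(size=n)

lo, hi = np.quantile(x, [0.05, 0.95])
bottom, top = x <= lo, x >= hi

y = rng.normal(size=n)  # the bulk of y looks like pure noise
y[top] = 3 * (x[top] - x[top].mean()) + rng.normal(scale=0.3, size=top.sum())
y[bottom] = 3 * (x[bottom] - x[bottom].mean()) + rng.normal(scale=0.3, size=bottom.sum())

def corr(a, b):
    """Pearson correlation coefficient via numpy."""
    return np.corrcoef(a, b)[0, 1]

print(f"overall    r = {corr(x, y):+0.2f}")              # looks like 'no pattern'
print(f"top 5%     r = {corr(x[top], y[top]):+0.2f}")     # strong structure
print(f"bottom 5%  r = {corr(x[bottom], y[bottom]):+0.2f}")
```

The headline correlation says "nothing here"; splitting by tail says the opposite, which is exactly the kind of disagreement worth investigating before anyone writes "no effect" in a report.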
- The stuff that actually ships to production
- Where math meets aesthetics
- Because models need homes too
- Because learning new languages keeps me curious

From measuring chemical reactions to measuring algorithmic bias.
Comprehensive technical portfolio showcasing systems architecture, ML engineering, and infrastructure expertise
Full stack engineering excellence with production-grade implementations
THE GRADIENT OF HARM:
███████████████████████████  Clinical trials exclude women
██████████████████████████  8/10 drugs affect women differently
██████████████████████████  50% higher misdiagnosis rate
████████████████████████████  Real people harmed daily
MY INTERVENTION (IN MISMATCHED PASTELS):
████████████████████████████  Detect bias (pink on mint)
██████████████████████████  Balance data (blue on blush)
██████████████████████████  Fair models (lavender on sage)
███████████████████████████  Healthcare for all (in every shade)
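
To make "detect bias" concrete: the first question I ask of a clinical model is whether its error rates differ by group. Here's a minimal sketch of that audit; the column names (`sex`, `y_true`, `y_pred`), the toy data, and the metrics chosen are illustrative assumptions, not a fixed recipe.

```python
# Subgroup error audit: does the model miss positive cases for one group more often?
# Hypothetical schema (sex / y_true / y_pred); swap in your own columns.
import pandas as pd

def subgroup_report(df: pd.DataFrame, group_col: str = "sex") -> pd.DataFrame:
    """Per-group false-negative rate and positive-prediction rate."""
    rows = []
    for group, g in df.groupby(group_col):
        positives = g[g["y_true"] == 1]
        rows.append({
            group_col: group,
            "n": len(g),
            "false_negative_rate": (positives["y_pred"] == 0).mean() if len(positives) else float("nan"),
            "positive_prediction_rate": (g["y_pred"] == 1).mean(),
        })
    return pd.DataFrame(rows).set_index(group_col)

# Tiny synthetic example where the model quietly under-flags one group.
df = pd.DataFrame({
    "sex":    ["F"] * 6 + ["M"] * 6,
    "y_true": [1, 1, 1, 0, 0, 0, 1, 1, 1, 0, 0, 0],
    "y_pred": [0, 0, 1, 0, 0, 0, 1, 1, 1, 0, 1, 0],
})
print(subgroup_report(df))
```

A large gap in false-negative rates between groups is the "everyone is a 70kg male" problem showing up in numbers; balancing the data or reweighting the loss comes later, but the audit comes first.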
I'm always interested in conversations about pattern discovery, ethical AI, or why medical algorithms think everyone is a 70kg male. Also happy to discuss career transitions, the beauty of well-documented code, or why pastel color schemes are objectively superior.

Hidden patterns in data • Building fair AI systems • Healthcare innovation
█████████████████████████████████  Curious about patterns?
████████████████████████████████  Interested in fairness?
████████████████████████████████  Want to build together?
████████████████████████████████████  Let's make AI fair
╔════════════════════════════════════════════════════════╗
║                                                        ║
║  COMPRESSION COMPLETE:                                 ║
║                                                        ║
║  Career = ∫(Pink → Blue → Mint → Purpose) dt           ║
║                                                        ║
║  Every color combination intentionally unexpected      ║
║  Pink on mint, blue on blush, lavender on sage         ║
║  Mismatched but never unintentional                    ║
║                                                        ║
║  I find patterns in noise                              ║
║  I fix bias in algorithms                              ║
║  I do it all in mismatched pastels                     ║
║                                                        ║
║  Because different is powerful                         ║
║  And unexpected is memorable                           ║
║                                                        ║
╚════════════════════════════════════════════════════════╝
%%{init: {'theme': 'base', 'themeVariables': {'primaryColor':'#FFE0F5','fontSize':'14px'}}}%%
pie title Where My Code Lives
"Python (Data Science)" : 45
"Python (ML/AI)" : 30
"JavaScript (Viz)" : 15
"R (Stats)" : 8
"Shell (Automation)" : 2
I write code like I used to write lab reports: obsessively documented, thoroughly tested, and with enough comments that future-me won't hate past-me.
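
A toy example of what that looks like in practice; `zscore` is a made-up helper for illustration, not code lifted from any project above.

```python
# What "documented and tested" means here (toy example, not project code).
def zscore(values: list[float]) -> list[float]:
    """Standardize values to mean 0 and standard deviation 1.

    Raises:
        ValueError: if the input is empty or has zero variance,
            because silently returning NaNs is how bias hides.
    """
    if not values:
        raise ValueError("zscore() needs at least one value")
    mean = sum(values) / len(values)
    var = sum((v - mean) ** 2 for v in values) / len(values)
    if var == 0:
        raise ValueError("zscore() is undefined for constant input")
    std = var ** 0.5
    return [(v - mean) / std for v in values]


def test_zscore_centers_and_scales():
    out = zscore([2.0, 4.0, 6.0])
    assert abs(sum(out)) < 1e-9                          # mean ~ 0
    assert abs(sum(v * v for v in out) / 3 - 1) < 1e-9   # variance ~ 1
```

The test runs under pytest, and the docstring says exactly which failure modes the function refuses to hide.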
