When Your Health App Becomes Your Worst Playlist

Your Spotify algorithm got you obsessed with that one annoying song. Your healthcare algorithm just got you prescribed a medication that costs 10 times more than the generic alternative. Guess which mistake actually matters?
If you've been listening to Spotify lately, chances are you've heard "Cold Heart" by Elton John and Dua Lipa more times than you'd care to admit. That's no accident. Through programmes like Discovery Mode, Spotify allows artists and labels to accept reduced royalty rates in exchange for algorithmic preference. The song doesn't dominate your playlists only because it's universally beloved; it's also there because someone paid for the privilege.
Now imagine Sarah, a 35-year-old marketing professional, starting her day. She skips "Cold Heart" for the third time this week on Spotify, then opens her health app to log her persistent headaches. The app's AI suggests she might have migraines and recommends a consultation with a neurologist it partners with, who charges €180 per session instead of the €80 specialist covered by her insurance. It also suggests a premium supplement blend for €45 monthly, rather than the €4 magnesium tablets that contain the same active ingredient. Sarah has just encountered healthcare's version of "health hit singles": expensive treatments that become algorithmic favourites not through superior efficacy, but through superior promotional budgets.
The streaming symptoms economy
Sarah's experience reveals how healthcare recommendation systems have quietly adopted the music industry's playbook. Just as record labels pay for playlist placement, pharmaceutical companies, supplement manufacturers, and medical device makers now pay health apps for algorithmic preference. The difference? A bad song recommendation wastes three minutes. A compromised healthcare recommendation can waste lives.
Consider Sarah's journey through her health app ecosystem. Her symptom checker, influenced by partnerships with diagnostic companies, steers her towards expensive imaging rather than a simple elimination diet. Her medication reminder app, subsidised by pharmaceutical companies, consistently suggests brand-name drugs over generics. Even her fitness tracker, eager to maintain premium partnerships, recommends costly personal training sessions over free exercise programmes that might be equally effective.
This creates what researchers call "healthcare echo chambers": algorithmic feedback loops where patients get trapped in expensive treatment cycles. The app learns that Sarah is willing to pay for premium options, so it continues recommending them. Soon, Sarah's healthcare journey resembles an endless scroll of premium recommendations, designed to keep her consuming rather than healing.
The promotional prescription problem
The ethical implications run deeper than individual choice. Should healthcare algorithms be required to disclose their "promotional rates" like Spotify's Discovery Mode? Currently, most health apps operate as black boxes, with no transparency about which companies pay for algorithmic preference or how these financial relationships influence patient care decisions.
Unlike entertainment algorithms, healthcare recommendations carry fiduciary responsibilities. When Spotify's algorithm promotes a mediocre song, the worst outcome is mild annoyance. When a health app's algorithm promotes a suboptimal treatment because of financial incentives, the consequences can be severe: delayed diagnoses, medication errors, or chronic conditions that persist rather than resolve. This isn't just about individual apps going rogue. Insurance companies could embed cost preferences into recommendation algorithms, subtly steering patients towards cheaper treatments regardless of efficacy. Medical device manufacturers could pay to ensure their products appear first in recommendation engines. The entire healthcare recommendation ecosystem could become, in essence, a sophisticated payola system where patient welfare competes with profit margins.
Reclaiming algorithmic independence
The solution isn't to abandon recommendation systems entirely; they offer genuine benefits for personalised care and improved access. Instead, we need to fundamentally reshape how these systems operate. Healthcare organisations can start by conducting algorithmic audits this week. Create a simple spreadsheet tracking every algorithm-influenced decision in your patient journey, from triage to treatment to follow-up. Count how many touchpoints involve financial incentives from third parties. The number will likely surprise you.
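For teams who prefer a script to a spreadsheet, the audit tally can be sketched in a few lines of Python. Everything here, the touchpoint names, stages, and sponsorship flags, is a hypothetical illustration, not data from any real app:

```python
# Minimal sketch of the algorithmic audit described above.
# All touchpoints below are hypothetical examples.

from dataclasses import dataclass

@dataclass
class Touchpoint:
    stage: str           # e.g. "triage", "treatment", "follow-up"
    recommendation: str  # what the algorithm suggested
    sponsored: bool      # does a third party pay for this placement?

journey = [
    Touchpoint("triage", "symptom checker steers towards imaging", sponsored=True),
    Touchpoint("treatment", "brand-name drug suggested in reminder app", sponsored=True),
    Touchpoint("treatment", "generic alternative listed", sponsored=False),
    Touchpoint("follow-up", "premium coaching upsell from fitness tracker", sponsored=True),
]

# Count how many algorithm-influenced touchpoints carry financial incentives.
sponsored_count = sum(t.sponsored for t in journey)
print(f"{sponsored_count} of {len(journey)} touchpoints involve "
      f"third-party financial incentives")
```

Even this toy version makes the point: the ratio of sponsored to unsponsored touchpoints is a number an organisation can track over time.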
More ambitiously, healthcare systems should embrace open-source alternatives: community-developed recommendation systems without commercial influence. These platforms allow objective evaluation of treatment options based purely on clinical evidence and patient-specific factors, not promotional budgets.
For individual patients like Sarah, the path forward involves developing "recommendation literacy": learning to recognise and question algorithmic suggestions. Before following any health app recommendation, ask three critical questions: Who benefits financially from this suggestion? What's the most cost-effective alternative? Would my doctor recommend this without the app's influence?
The playlist we actually need
We can continue down the path of commercialised recommendations, where treatments go viral based on marketing budgets rather than medical merit. Or we can build systems that truly serve patient welfare, creating algorithms that prioritise healing over revenue.
Sarah deserves healthcare recommendations as transparent and accountable as her music streaming choices. Actually, she deserves better, because while she can skip a song she doesn't like, she can't skip the consequences of algorithmic manipulation in her medical care.
The technology exists to create ethical, effective healthcare recommendation systems. The question isn't whether we can build them, but whether we have the collective will to demand them. Your health app should be your most trusted advisor, not your most expensive playlist.