You're already training your replacement

A gig worker in Brooklyn straps an iPhone to her chest and films herself folding laundry. She holds each sock up to the camera, pauses for the frame, then drops it in the basket. DoorDash pays her $15 an hour for this. The footage goes straight to companies building humanoid robots. She knows what she's doing: teaching a machine to do her work without her.
Three weeks ago, 8 million couriers across the US got the same offer through DoorDash's new Tasks app. Film yourself washing dishes, cooking, walking through a park. Get paid. The data trains AI models and robots. The transaction is explicit. The ethics are visible. Everyone can see the deal.
Healthcare's version of this is identical in structure and invisible in practice.
Every time a radiologist corrects an AI's flagged finding, she's annotating training data. Every time a pathologist labels a suspicious cluster on a digital slide, he's building the model that learns from his eye. Every time a nurse overrides a clinical decision support alert, the system logs it and learns. None of them gets $15 an hour for it. Most don't even know it's happening.
Then a hospital CEO says the quiet part out loud. Mitchell Katz, head of NYC Health + Hospitals, told a Crain's panel he's ready to replace radiologists with AI. Today. Only the regulations stand in his way.
The backlash was swift. Radiologists called it dangerous and naive. Stanford researchers found that frontier AI imaging models hallucinate findings, confidently reporting on X-rays they never actually analysed. And the sharpest counter came from Dr. Waseem Ullah on MedPage Today: replace the CEO first. Administrative personnel in US hospitals grew 3,200% between 1975 and 2010; physicians grew 150%. If AI is ready for the reading room, surely it's ready for the boardroom.
The data supports a calmer reading. BCG's latest analysis shows 50 to 55% of US jobs will be reshaped by AI in the next two to three years. Not replaced: reshaped. The distinction matters. Most of these roles will remain, but what people do in them will change. When AI lowers the cost of reading a scan, demand for imaging expands: more scans get read, not fewer radiologists hired. NVIDIA's Jensen Huang made the same point: AI was supposed to kill radiology first. Instead, the field grew.
The question isn't whether AI will reshape your role. It's who controls the reshaping: the person who understands the problem, or the administrator who sees your salary as the problem.
DoorDash made the transaction explicit: film your work, train the machine, get paid. Healthcare buries the same transaction under compliance workflows and system logs. But there's a third option. Instead of passively training a system that someone else deploys, build the tool yourself.
That's not a fantasy. A father whose daughter was diagnosed with Type 1 diabetes assembled a team of game developers and physician advisors and built a gamified management app. Dr. Robert Pearl argues on the Fixing Healthcare podcast that physician-built AI tools are the most practical path to higher-quality, lower-cost care. The pattern is consistent: the best clinical tools come from people who live the problem, not from people who sign the procurement contract.
If you're a clinician in Belgium, there's now a day for exactly this. Care and Code is a clinical build day on 3 October where healthcare professionals build their own AI-powered care tools in a single afternoon. No programming experience needed. Just a problem worth solving.
You're already training your replacement. The question is whether you'll be the one who builds your upgrade.
💥 May this inspire you to build the tool you've been correcting from the sidelines.