Theoretical Fault Lines: On Precision, Care, and the Quiet Drift Toward Automation
Let me walk you through a conversation I had recently, one that started with CyberKnife, of all things, and ended somewhere much deeper. If you've ever felt the coldness of modern healthcare, or wondered why everything feels more mechanical than it used to, you might find yourself nodding along.
It began with a simple observation: sometimes it's cheaper to self-pay for medical care than to carry insurance. And yet, even when you walk in with your card ready, even when you've prepaid, you're treated like an exception, like you're stepping outside the script. Meanwhile, insured patients glide through the system as if they're on rails.
If you've ever felt that weird tension of "I'm doing the responsible thing, so why does it feel like I'm being processed?", you're not alone.
As a kid, I remember healthcare feeling warmer. Maybe it was because I could hide behind my mother while she handled the paperwork. Maybe it was because the adults around me had more time, more presence, more humanity to give. But somewhere along the way, the warmth drained out. Now even doctors, the people we expect to be the emotional center of the room, feel rushed, cold, and exhausted.
And here's the part that hits hardest: sometimes I catch myself feeling more compassion for the doctor than for myself. I walk in needing help, but I end up trying not to burden them. That's how upside-down the system has become.
Then there's the bed shortage. If you've had a family member who needed a hospital recently, you've probably seen it firsthand. It's not your imagination: there really are fewer staffed beds than there used to be. Not because the beds disappeared, but because the nurses did. Burnout, trauma, impossible workloads, low pay, and years of understaffing pushed them out. A bed without a nurse is just furniture, and that's how you end up driving miles to find a hospital that can take your grandmother.
And when humans are overwhelmed, the instinctive response, especially from executives, is to automate.
On paper, it sounds clean: let robots handle the charting, the monitoring, the repetitive tasks. Let AI fill the gaps. But here's the fault line: once you automate the visible parts of nursing, someone will inevitably ask, "Why not automate the rest?" And that's where things start to drift toward a quiet dystopia.
Because a robot can shadow a nurse, but it only learns the visible tasks. It doesn't learn intuition. It doesn't learn presence. It doesn't learn the subtle "something's off" that comes from thousands of lived moments. LLMs can sound warm, but they don't feel anything. They predict patterns; they don't carry responsibility.
I learned this the hard way in a completely different context. I trusted an LLM too much on a technical project, the ConnectWise automation we built, and for a moment I believed the whole thing was impossible. I stepped away, came back the next day, and realized the architecture was not only possible, it was nearly complete. The model sounded confident, but it was wrong. My own intuition was the missing piece.
That's the danger in healthcare too. Not that robots will replace nurses, but that humans will start deferring to automation in places where human judgment is irreplaceable. The real dystopia isn't machines taking over; it's humans being forced to behave like machines because the system is collapsing around them.
No one can predict the future. But we can see the shape of the risks. Automation can help, but only if it removes bureaucracy so humans can do the human parts. Only if it's introduced thoughtfully, not as a shortcut. Only if it strengthens the system instead of hollowing it out.
These are the theoretical fault lines: the cracks beneath the surface where technology, humanity, and pressure intersect. And if we don't map them now, we'll stumble into them later.