AI Has No Empathy
I read this headline about AI in end-of-life care. Can it make those decisions easier?
After my initial horror, I sat with the question for a while, looking back on my own experiences. The short answer is, of course, no. It’s no because people aren’t math problems. Healthcare isn’t an equation to be solved. And the worst, most difficult moments of someone’s life—and someone’s last moments—should not be left to a hallucinating, pattern-recognition machine.
AI is good at pattern recognition. But patterns aren’t people. They’re not someone’s heartbeat. They’re not their first date. First kiss. Favorite Christmas. They’re not the person they’d call at 3am and say come get me, no questions asked.
AI has no empathy, no compassion. And especially when someone is dying, you need both. In abundance. I know, because I’ve been there. And I can say how much worse the situation would’ve been if there had been no social worker, just some kind of push-button survey and a coldly spat-out answer.
Compassion matters. On the awful, terrible, no good, worst days—and the rest of the days, too. I’ve seen, in recent weeks, so much vitriol and a lack of compassion that it’s turned my stomach. Compassion is more productive than nearly anything else. And when people start to lose that, the line ends nowhere good.
Thinking back on my mother’s death, it was hard. Harder than I care to look at right now. But it was also filled with kind strangers. Compassionate, deeply feeling humans who eased the bad things in a way that I will always be grateful for. Imagining them gone from the equation is weirdly gutting.
Months back, at the end of my dog’s life, the decisions were hard. I was a mess. I’m still a bit of a mess, to be honest. But never did I wish for someone else to make the choice for me—or something else. Because he was my best buddy and my responsibility, and loving him meant making that awful choice for him. Love means doing the hard thing even when it’s the opposite of what you want, because it’s what someone else needs.
A machine can’t step in and evaluate, discern, or assess. AI can simply compile. And while we are absolutely a compilation of the past, our memories and our love, our losses and our wins, you can’t sift through that in a meaningful way without compassion and love and affection.
A machine can’t give you the best choice, because best is subjective. Almost no decision is made without feelings, and feelings are beyond AI. To anyone who posits it’d make a choice easier, I’d wonder at the guilt left in the wake of letting a machine decide anything. I cannot help but remember Stanislav Petrov, although the situation is wildly different.
People matter. Compassion matters. The heart matters. If we lose that—if we give it away—we are damning ourselves. And I don’t know about you, but absolutely not.