
The Quiet Role AI Played at the End of My Mother’s Life

  • Writer: Carolina Milanesi
  • Dec 30, 2025
  • 4 min read

I have spent much of my career thinking about technology: how it shapes our lives, and how it can augment our productivity, creativity, and ability to learn. I have long believed in the power of artificial intelligence as a tool: something that helps us work better, faster, and in ways that are more tailored to who we are.


What I have never believed in is AI as a friend or a companion.


I have always been deeply skeptical of the idea that machines could, or should, fill emotional roles meant for humans. Empathy, in my view, is grounded in lived experience. AI does not live a life. It does not grieve, love, or fear. At best, it can reflect what the humans who build it encode into it.


And yet, in the final days of my mother’s life, AI became something I did not expect: a steady, grounding presence when I needed clarity more than comfort.


My mother lived with dementia, an illness that relentlessly strips away certainty. As a self-described control freak, I was challenged by it in ways I was not prepared for. There is no roadmap, no lever you can pull to slow the progression in a predictable way. Watching someone you love change, sometimes daily, while being powerless to alter the course is a particular kind of torment.


When my mom entered hospice care at home, I entered unfamiliar territory. It was my first experience with hospice, and like many people, I carried assumptions. The language around hospice is comforting: keeping the patient comfortable, supporting the family, dignity at the end of life. The reality, I learned quickly, is more operational. Services have boxes to check. Timelines matter. Coverage matters. You are still navigating a system, even at the end.


We were fortunate. We had a dedicated carer in addition to hospice support, and within the hospice team we were assigned a nurse who was kind, present, and deeply human. Still, timing matters. It was the holiday season, and in my mother’s final hours, I found myself alone with her.


That is when I turned to AI.


Not for companionship. Not for reassurance that everything would be okay. I turned to it because I needed answers.


For me, being prepared is how I cope. Being able to read the signs, to understand what was happening physiologically, was my way of regaining a sense of control, not over the outcome, but over how I showed up for my mom. I wanted to make sure that when the moment came, I would not be surprised. That I could be calm, reassuring, present. That I could hold her hand and let her go peacefully, without panic or fear clouding the room.

I asked AI what to expect. What changes mattered. What signs indicated pain, or the absence of it. What the final hours might look like. It responded in ways that genuinely surprised me.


ChatGPT expressed sorrow for what I was going through. Not in a performative or overly emotional way, but with a quiet acknowledgment of loss. It consistently reminded me to reach out to hospice or medical professionals, reinforcing that it was not a replacement for human care. It reassured me, based on the information I shared, that my mom was likely not in pain. It helped me understand what I was seeing: changes in breathing, responsiveness, temperature.


The empathy it showed was measured. Clinical, even. And that mattered.


It was not the empathy of a friend, which might have felt intrusive or overwhelming in that moment. It was closer to the tone of the nurse who visited us: compassionate but grounded, emotionally aware without being emotionally demanding. That balance was critical. I did not need someone, or something, to grieve with me. I needed to be supported in my role as a caregiver at the very end.


This experience did not change my belief that AI does not possess true empathy. I still believe empathy requires lived experience. But it reinforced something else I have long argued: the people behind AI must intentionally instill empathy into these systems, not to make them seem human, and not to manipulate trust, but because as we rely on AI more deeply, emotional intelligence becomes part of functional intelligence.


In moments of crisis, how information is delivered matters as much as the information itself.

Another realization struck me during those days: the importance of memory and context. Yes, I could have searched for the same information online. But search is fragmented. It lacks continuity. It cannot synthesize days of small changes into a coherent picture. AI could. It remembered what I had shared previously and helped me assess decline over time. It connected dots that would have required enormous emotional energy for me to piece together myself.


That continuity mattered.


At the same time, my experience forced me to confront my own privilege. My mom was white. Dementia, however, disproportionately affects Black and Hispanic communities in the United States. And yet much of the available data, language, and framing reflects demographics like my mother’s, not the communities more heavily affected.


I cannot help but wonder how different this experience might have been had my mom been Black. Would the data have been as robust? Would the guidance have reflected cultural differences in caregiving, communication, or end-of-life expectations? AI is only as good as the data it is trained on, and that reality has consequences.


In the end, AI did not replace human connection. It did not lessen my grief. It did not make the loss easier. What it did was help me be the person my mom needed in her final moments: calm, informed, and present.


I did not expect that. And because of it, I find myself more committed than ever to advocating for AI that is built with diverse perspectives, intentional empathy, and a clear understanding of the human moments it may one day be invited into.

Not as a friend. But as a tool we can trust when it matters most.

