
When Systems Fall Short, Patients Scaffold: How AI Helped Me Navigate Cancer

by Alvina Nadeem, ing., PPCC, ProSci CP

Hi, I’m Alvina. I’m a mechanical engineer by training, a change management consultant by profession, a certified coach, a systems thinker, and a mother of two. I help people and organizations redesign how they work. But nothing prepared me for the redesign I’d be forced to make in my own life.

In March 2023, I was diagnosed with stage 1 high-grade serous ovarian cancer. I didn’t see it coming. But strangely, I was more prepared than most, not because of a checklist or a spreadsheet, but because of an unexpected partner: ChatGPT.

This isn’t a story about tech. It’s a story about survival in the cracks of a system. It’s about what happens when healthcare can’t keep pace with complexity, and the patient has to fill the gap. It’s about how AI became a cognitive scaffold when the map ran out. Because when systems fall short, patients improvise. And we do it because we’re hardwired to survive. But what scaffolds us when we’re the ones holding everything together?

Before the Diagnosis: Listening, Doubting, Searching

It all started in February of 2023. First came fatigue. Then a sudden drop in appetite. Subtle, almost ignorable changes, but something felt off.

I didn’t rush to the doctor. I didn’t panic. But I did start taking mental notes, tiny stickies in my brain that said, “This isn’t normal for me.” That “for me” part is crucial. Because it wasn’t just about recognizing symptoms, it was about knowing what feeling good in my own body actually felt like, and having the confidence to trust that knowledge. Many people don’t realize how much they’ve normalized discomfort. When you're told to push through or downplay your pain, your baseline shifts. You start confusing resilience with silence. But I had a reference point. I knew what my normal was. And this wasn’t it.

If you recall, ChatGPT had become publicly available in November 2022, and I was already experimenting with it for work when I started to feel a little off. My curiosity made me wonder: what if, instead of Googling what I was feeling, I asked ChatGPT? Google felt like a fear factory: endless links, worst-case scenarios, and information overload. ChatGPT felt different. It was structured, calm, and nonjudgmental, like a thinking partner. I asked it things like: “Why am I so tired? Could it be hormonal? What else should I track this week?” It didn’t give me yes/no answers. It gave me possibilities. It connected the dots. It nudged me to keep observing.


When I think back, what really sends shivers down my spine is that I had also spoken to an older cousin about my symptoms around that time.

“It’s probably perimenopause,” she had said, lovingly. I wanted to believe her, but AI had already signalled otherwise: This might be serious. You should get checked. That contrast was critical. One voice soothed. The other sharpened. I chose the one that made me act.

A few days later, I had an ultrasound. It revealed an 8 cm mass on my left ovary—one that would double in size by surgery three weeks later.

Diagnosis: AI as Anchor in a Flood

Hearing “ovarian cancer” felt like being hit with concrete. But even in the chaos, I wasn’t starting from scratch.

I had a timeline of symptoms. I had a basic grasp of lab language. I had already practiced asking hard questions. All because AI helped me build that foundation.

I asked it:

“What subtype is this tumour?”

“Should I push for genetic testing?”

“What questions should I ask at my first oncology appointment?”

It helped decode jargon in test results and chemotherapy plans. It helped me prepare:

“Which meds are most likely to cause nausea or fatigue?”

“What are common coping strategies for these side effects?”

It helped me translate culture into care:

“What South Asian foods are gentle on digestion during chemo?”

“Can you adapt this Western cancer nutrition recipe into something daal-inspired?”

This wasn’t trivial. Culturally relevant care can make the difference between following through on nutrition recommendations or not. Not everyone can, or wants to, survive on chicken broth and crackers.

AI also helped me carry emotional weight that the system didn’t catch. It helped me process questions that often go unspoken:

“How do I talk to my kids about this?”

“Why do I feel angry at people who are trying to help me?”


Yes, I had support. My husband stepped up in every possible way. That man deserves an award for how he showed up for our family, and for me. But even the most loving partner has a breaking point, and I didn’t want to push him to it. There were things I might have kept in. But I didn’t have to—because at 2 am, when I was spiralling, I knew I didn’t have to wake him up. I had another option. AI was there. Limitless. Nonjudgmental. Always ready to listen. It made me feel less like a burden. No one said I was one, but ask anyone who’s gone through something like this. That fear creeps in. Not because others make us feel helpless, but because we’re not used to being helpless. It wasn’t therapy. But it was therapeutic.

Post-Treatment: Rebuilding with Scaffolds, Not Scripts

People assume survivorship is the finish line. It’s not. It’s the beginning of something uncharted. When treatment ends, and the healthcare system steps back, you are made to feel as if you're “done.” You’re supposed to bounce back. Be grateful. Be normal. But I wasn’t. I was in full surgical menopause at 36. My energy was unpredictable. My thoughts were foggy. My identity was in pieces. This wasn’t a return. It was a redesign. And again, AI helped. It helped me:

Build new routines around fatigue and family life.

Create language for needs I hadn’t fully articulated to myself.

Rediscover purpose without reverting to the person I was before.

AI became the external brain I could borrow when mine felt foggy. It gave me structure to think through pain, confusion, identity shifts, and reintegration. It wasn’t always profound. I didn’t need it to be. Sometimes it helped me plan meals. Sometimes it helped me reframe thoughts. But it was always there. And that mattered.

What the Healthcare System Missed: The Gaps You Can’t See Without Asking

As someone navigating the healthcare system, I had many advantages: a science education, access, insurance, and digital literacy. And I still almost missed the signs, and I wasn’t always told all of my options. A lot of those gaps I filled myself, through self-advocacy and curiosity. That tells me the problem isn’t individual vigilance.


As an engineer, I can name structural fragility when I see it. But I want to make it clear: I am not saying that healthcare fails us maliciously. It fails because its design is outdated.

It was built for volume, not nuance. It rewards clarity and wasn’t built for ambiguity. It listens better to patients who speak its language, and flattens those who don’t. I don’t blame the providers. They’re not the architects of this system. They’re suffering under its weight, too. AI didn’t replace my care team. It helped me show up better to it. But if I had to work this hard to be heard, what’s happening to everyone else?

An Example of Cultural Blind Spots

Here’s one example. At no point during post-treatment did anyone mention increased cardiovascular risk. But something in me, maybe instinct, maybe engineering intuition, flagged it. So I asked ChatGPT:

“Are there specific health risks for South Asian women in surgical menopause after chemotherapy for ovarian cancer?” The answer was striking.

South Asian women are at higher risk for insulin resistance, cardiovascular disease, and type 2 diabetes—risks that are further amplified by chemotherapy and surgical menopause. These conditions often emerge at lower BMIs than in Western populations and go undetected due to Eurocentric baselines in survivorship care. That information didn’t come from my care team. It came from a prompt, one that AI answered neutrally, accessibly, and without judgment. Was it perfect? No. But it opened the door for me to ask better questions and advocate for care that accounted for my biology, not a generalized default.

This is where the notion of “bias” in AI deserves a second look. People worry that AI will reinforce structural biases, and it can. But so do the systems we already trust. Doctors are trained on data that’s historically limited and demographically narrow.

AI can’t see me. It can’t judge me based on what I wear, how I speak, or what my skin looks like when I enter the room. That’s a limitation. But it’s also a kind of freedom.


The Bigger Picture: AI Literacy as a Public Health Imperative

Let me be clear: this isn’t a tech endorsement. I’m not saying ChatGPT should replace your doctor, nor would I ever! I’m saying that, like any tool, it reflects the intention and literacy of the person using it. And that’s the real risk: not that people will use AI, but that they’ll use it badly or be too afraid to use it at all. The truth is, the genie is out of the bottle. Patients are already using AI. Some are cautious. Some are reckless. Some don’t tell their providers at all, for fear of being dismissed.

That’s what makes this moment urgent. We need AI literacy on both sides of the conversation. Not just to protect people, but to empower them. Not to replace providers, but to allow patients to participate more meaningfully in their own care. Because the worst-case scenario isn’t that patients use AI. It’s that they use it in secret. And when that happens, trust erodes, safety erodes, and opportunities for collaboration vanish.

Not the Hero. Just a Mirror.

AI didn’t save me. It didn’t diagnose me. It didn’t cure me. But it helped me think clearly in a fog. It helped me speak up when I was scared. It helped me prepare when the system didn’t have time. It helped me design a life when survivorship felt like a blank page.

AI wasn’t the main character. I was. But it was the tool that helped me stay whole in a system that often asks patients to fragment themselves just to be seen. When systems fall short, patients scaffold. Let’s make sure they’re not doing it alone.

Alvina Nadeem is a mechanical engineer, certified coach, and change management expert with over 15 years of experience. She specializes in project and change management, delivering sustainable, inclusive solutions that address complex challenges across organizations, communities, and industries. Certified in EQ-i 2.0/EQ 360 and ProSci Change Management, Alvina combines technical expertise, emotional intelligence, and coaching to guide organizations through transformation. She integrates a systems-focused engineering background with a deep understanding of human dynamics, ensuring that solutions are both technically sound and people-centered. Following her ovarian cancer diagnosis in 2023, Alvina shifted her focus toward healthcare transformation, patient engagement, and system efficiencies.
