
It’s impossible to scroll LinkedIn, attend a conference, or even have a coffee chat in our industry without the topic of AI popping up. The buzz is deafening — and, to be fair, much of it is warranted. AI tools are already transforming parts of the research process, from automating transcription to identifying themes at lightning speed. I’ve experimented with them myself and I’m not going to argue that they don’t have a place.
But here’s where I pause (and take a deep breath). In all the excitement, I’ve noticed a subtle — and dangerous — undercurrent: the idea that AI might soon replace the role of the qualitative researcher altogether. That somehow, algorithms can “listen” as well as we can.
Spoiler alert: yeah, right — they can’t!
Not when it comes to one of the most essential skills in our craft — empathy.
Why Empathy Still Rules
Empathy is not just about “being nice” or “understanding someone’s point of view.” It’s about sensing what’s not being said, recognizing when a respondent’s tone shifts, and creating a safe environment that invites disclosure. In healthcare research — my wheelhouse for over two decades — this matters more than anything.
When a patient is sharing their fears about a new diagnosis, or a caregiver is explaining what it’s really like to manage a loved one’s condition, there’s an emotional undercurrent that is impossible to script. The moderator’s role isn’t simply to record these words or rush through their discussion guide; it’s to meet them with the right response in the moment — a pause, a nod, a follow-up question that says I heard you, and I understand this is important.
AI can transcribe that sentence perfectly.
AI can even suggest a follow-up question based on keywords.
But AI cannot feel the tension in the room. It cannot choose to soften its voice, change its body language, or sit in shared silence to give the participant space to gather their thoughts.
That’s not a knock on AI — it’s just reality. Empathy is human. And. It. Matters. For. Effective. Qualitative. Research.
The Subtle Art of Reading Between the Lines
A lot of what makes qualitative research valuable happens in the white space — the moments between the answers.
Recently, I moderated a session with an oncology patient. The patient described her medication routine in great detail, smiling the entire time. On paper (or in an AI transcript), it would look like everything was going smoothly. But I noticed something: every time she mentioned the evening dose, she avoided eye contact.
That subtle shift told me there was more to the story. When I gently circled back, she admitted she often skipped that dose because the side effects made her nauseated — but she didn’t want to disappoint her doctor. That insight changed the entire conversation for the client.
No algorithm would have picked up on that moment.
Why? Because it wasn’t about the words — it was about the way she delivered them.
Trust is Earned, Not Programmed
Building rapport is another area where humans still have the edge. Trust doesn’t happen just because someone has agreed to participate in a research session. Especially in healthcare, many participants are sharing deeply personal information that they might not even tell close friends.
When I walk into a room (virtual or in-person), I’m not just there to run through a discussion guide. I’m there to earn trust. That might mean starting with light conversation, adjusting my language to match the participant’s comfort level, or sharing a small piece of my own humanity so they feel it’s safe to share theirs.
AI can be trained on polite scripts and friendly tone markers. But trust is about genuine human connection — and humans can sense the difference between “I’m asking this because I have to” and “I’m asking this because I truly care about your answer.”
The Danger of Over-Automation
Here’s the part that concerns me most about the current AI buzz: the temptation to over-automate qualitative research in the name of speed and cost savings. Yes, AI can handle certain tasks faster and cheaper — but when we start cutting the human element out of conversations that are fundamentally about people, we risk flattening the very insights we’re trying to uncover.
Think of it this way: if your goal is to understand a complex emotional journey — like how a patient decides whether to start a new treatment — you need to explore fears, motivations, and personal trade-offs. That’s not something you can capture fully by running sentiment analysis on a transcript.
In fact, the more emotionally loaded the topic, the more important it is to have a skilled human in the room.
Humans and AI: Better Together
This isn’t an either/or argument. The smartest approach is to use AI to free human moderators to do what we do best. Let AI handle the tedious tasks — coding open-ends, summarizing large data sets, even suggesting potential probes. That’s where it shines.
Then let humans step in where nuance, empathy, and rapport matter most.
In my own work, I’ve found AI to be a helpful assistant, not a competitor. I’ll use it to surface initial themes, but I always go back to the recordings, the facial expressions, the pauses, the moments of hesitation. Those are where the real insights often live.
Why This Matters Now
The industry is at a crossroads. Clients are being told they can get “qualitative insights” faster, cheaper, and without the hassle of human scheduling. It’s tempting — I get it. But I believe it’s our responsibility, as professional moderators, to educate clients about what’s lost when you remove the human element.
Especially in healthcare, we’re not just gathering opinions. We’re holding space for stories that are often tender, complex, and life-shaping. Those stories deserve more than just efficient capture — they deserve to be truly heard.
AI can give you the data.
Human moderators can give you the truth.
And in my experience, those are not always the same thing.
About the Author
Judithe Andre is a strategic qualitative researcher, certified health coach, and founder of Verbal Clue Research. With over 20 years of experience in healthcare, pharma, and wellness, she blends rigorous methodology with a deep commitment to human connection — uncovering the emotional drivers and hidden needs that shape behavior and decision-making.