Blame Is the Easiest Story to Tell

Earlier this week, I had a follow-up visit with my dental surgeon. We were talking about a cracked tooth I’d been dealing with since last year when he mentioned, casually, that he’d been reading my book.

He complimented the book as a whole and said he liked my writing style. Then he paused and asked, “What about that Henri guy in Chapter 1?”

I asked him what he meant.

In so many words, he explained that Henri had been paid to participate in a clinical study, that he had lied to the study staff about his health, and that he died as a result.

Then came his analogy.

“If someone is taking Coumadin even though I’ve told them not to, and they start bleeding out on the table while I’m operating, whose fault is that?”

Before I could respond, he added something else - that a woman in my book, Leticia, must have been “crazy” to travel from Toronto to Montreal to participate in a study in the first place.

I left feeling unsettled, not because of malice in his tone, but because of how familiar the logic sounded.

It’s a story we often reach for when harm occurs in research:

someone broke the rules,

someone made a bad choice,

someone assumed the risk.

Case closed.

Henri was seventy-five years old when he participated in a clinical trial at a private test facility in Montreal. He did it to help one of his children financially. He had a known heart condition, stabilized by medication. During the study, he missed multiple doses of that medication while confined at the facility. He experienced flu-like symptoms, dizziness, sweating, and hot flashes - all known side effects of the study drug. He returned to the facility repeatedly in the days following dosing, where blood samples were taken but no physical exams or vital signs were assessed.

Two hours after his final study visit, Henri collapsed at home and died. The research facility quickly declared his death unrelated to the study and discouraged Henri’s wife from requesting an autopsy.

The simple version of the story is that Henri “lied.”

The fuller version is that multiple warning signs were visible, yet no meaningful safeguards were in place to detect or respond to them.

In research ethics, systems are not built on the assumption that research participants will behave perfectly. They are built precisely because humans are imperfect.

People forget.

People misunderstand.

People minimize symptoms.

People conceal information out of fear of being excluded, losing compensation, or disappointing authority figures.

That is why ethical research requires careful screening, monitoring, follow-up, and governance.

Responsibility does not end with a signature on a consent form.

Leticia’s story reveals another dimension of this same logic.

She was a recent immigrant to Canada, recruited into a clinical trial that required over a month of confinement in a Montreal facility - far from her home near Toronto. Research participants were bused in and bused back out after the trial concluded. Payment was backloaded, meaning they would only receive compensation if they completed the entire study.

During confinement, Leticia was assaulted by another participant in a communal bathroom.

When she reported it, she was told someone would come to see her.

No one did.

Hours later, terrified and crying, she called the only number listed on her consent form - the research ethics board’s inquiry line. She begged to go home.

When I relayed this part of the story to my dental surgeon, he frowned.

“Why didn’t she just call the police?” he asked. “Instead she called an ethics board?”

I explained that she had done exactly what most people would do first - she reported the assault to the study staff.

They were the ones responsible for her safety.

They were the ones in control of the environment.

They were the ones who should have intervened.

They were the ones who should have called the police.

Leticia had waited for hours after reporting what happened. Frightened and isolated, she had turned to the only number listed on her consent form - the place she had been told to call if her rights or welfare were at risk.

Once again, the focus shifted away from the system that failed her and back onto the individual who was harmed.

The analogy my surgeon offered - the patient who continues taking Coumadin before surgery - is revealing.

In clinical care, we do not rely solely on trust.

We ask about medications.

We verify histories.

We review charts.

We order tests.

We postpone procedures when risk is unclear.

If a patient bleeds out during surgery, we don’t simply ask whether they followed instructions. We ask what safeguards were in place.

Research ethics demands the same level of care - often more - because the risks are, by definition, not yet fully known. Research participants are contributing to knowledge, not receiving treatment.

Yet when harm occurs in research, the narrative so often shifts to personal blame.

He lied.

She was reckless.

They knew the risks.

It is a convenient story.

It absolves systems of responsibility.

What troubles me most about this conversation is not that a healthcare professional expressed these views.

It’s that they are so widely held.

Most ethical erosion does not happen through cruelty.

It happens through normalization.

Through the quiet belief that informed consent transfers responsibility.

Through the assumption that payment justifies risk.

Through the idea that vulnerability is a personal failing rather than a condition that demands protection.

Henri and Leticia were not statistics.

They were human beings navigating financial pressure, trust in institutions, and systems that prioritized efficiency over care.

Their stories are not about bad choices.

They are about what happens when safeguards weaken and accountability drifts.

Blame is the easiest story to tell.

But it is rarely the truthful one.
