In vials and clinical trials, attention to ethical details

Q&A: Medical research tilts toward transparency and caution – not always to the public’s advantage, surprisingly.
Brian Donohue

[Editors’ note: This is the fifth in a series of seven articles about bioethics. Q&A’s include UW experts discussing the beginning of life, end of life/futility, clinical consultation, pain care, research and teaching. An overview started the series.]

Dr. Ben Wilfond directs the Treuman Katz Center for Pediatric Bioethics at Seattle Children’s Research Institute and is chief of the Division of Bioethics in the UW School of Medicine’s Department of Pediatrics.

Q: When I say ethics and medical research, is there a historic case that leaps to mind? Do you have a go-to example?

A: Many historic cases, and current cases as well. I’m pausing because there are a lot of answers. (Pulls out a big textbook on clinical research ethics.) It’s a huge area with tons of issues.

Q: Henrietta Lacks? Tuskegee?

[Photo: Ben Wilfond in his office. “Consent is not the biggest issue in research ethics. I think it’s really about how we evaluate the risks and benefits to justify doing the research in the first place,” Wilfond said. Credit: Brian Donohue]

A: Those are examples of scandals. They’re important historically, in terms of how our system of ethics regulation developed and the concerns underlying it, but many of the really interesting and complex cases are novel as they emerge, and we have to sort through them. Most of our time, and this is why I love research ethics, is spent thinking hard about new issues.

For example, there’s the cystic fibrosis newborn screening trial done in Wisconsin in the 80s and 90s. It’s a case that illustrates the importance of research when we don’t know the right clinical decision.

Some states thought that once the technology became available to screen newborns for cystic fibrosis, it should be performed on every infant. Other states said we don’t have the evidence yet to know if it’s helpful. In the 1980s, we didn’t know whether diagnosing an infant with cystic fibrosis, rather than waiting for them to develop symptoms naturally within a year or two of birth, would improve their outcome.

In Wisconsin they did a randomized clinical trial to see if screening made a difference. Half of the babies were screened and had results reported at 6 weeks of age; the other half didn’t have the results reported until they were 4 years of age. This was 650,000 newborns over nine years, so there wasn’t actually an informed consent form; it was part of a brochure that was given out by the state as part of the screening.

Q: So the information was available, but who knows if you saw it?

[Photo: poster for an ethics conference. Wisconsin’s cystic fibrosis screening in the 1990s spurred ethical debate, said Wilfond, whose office has a poster from a conference that discussed questions raised by the study.]

A: Right, that’s one issue. Also, was it appropriate to randomize people? Was it appropriate to withhold this type of information for four years?

The main advantage of knowing earlier turned out to be nutritional: We could start nutritional interventions sooner and improve growth. We also learned, inadvertently, that some of the children in the screened group acquired a bacterium called Pseudomonas because, at the time, we didn’t know they could catch it from being around other kids who had it. Now we know that, and we’re very careful about infection control.

So the study is a great example of how research is important: We learn things that we weren’t expecting. Part of it is that informed consent is neither necessary nor sufficient for ethical research. Some research can still be ethical without informed consent, and even if you get informed consent, it doesn’t mean that the study is good. The other thing is that, even in the best of circumstances, unexpected things can happen.

Q: Why hasn’t medical science solved the informed-consent issue?

A: One problem is that researchers constantly come up with new things that they think are important to let subjects know about. The assumption is that everyone wants to know about everything: Transparency is the default. So if you want to be in a study, you have to read a 20-page consent form. And studies have actually shown that the longer the consent form, the less likely people are to remember details. There’s an inverse relationship between disclosure and comprehension.

Other studies show that people make up their mind right away whether they want to be involved in a trial, so the fact that some detail is included in a consent form doesn’t necessarily help people make decisions. We haven’t figured out the best way to help people make decisions about whether being in a research study is a good thing for them.

Q: What’s the main message of a consent form?

A: One of my favorite consent forms is on my wall here. It’s a brochure titled “Making the Decision that’s Right for Your Family.” That to me is the message for any study: It’s right for somebody but not for somebody else. It even gives reasons not to participate in the study.

Consent is not the biggest issue in research ethics. I think it’s really about how we evaluate the risks and benefits to justify doing the research in the first place.

[Photo: pamphlet related to anthrax testing in children. A 2013 U.S. report concluded that the government would have to take multiple steps before anthrax vaccine trials with children could be ethically considered.]

I just came back from a meeting with the FDA that discussed whether it’s appropriate to test an anthrax vaccine in children … out of concern about a terrorist attack. There is an anthrax vaccine that we know is safe in adults, but we have no idea whether it is safe or effective in children. So the question is, do we wait until there’s an anthrax attack and give the untested vaccine to kids and see what happens, or do we do this research now, where we vaccinate kids to see what their immune response is? These kids are otherwise healthy, so the risk of the research can’t be justified by any benefit to them, because there is no benefit to them. But there is benefit to people in the future.
That’s one essential dilemma of research: It puts people at risk now for the benefit of others down the road.

Q: What about cancer drugs in clinical trials?

A: In chemotherapy, there are Phase 1, Phase 2 and Phase 3 trials. In Phase 1 trials, we’re mostly testing how high a dose we can give before you get really sick and maybe die. There really are very few expectations of benefit at that stage: Hundreds of people die from their disease, and maybe a few people live a little longer.

Q: Patients with inoperable cancer are desperate. Aren’t they liable to say yes to any risk? Should the same threshold of consent apply to them?

A: Some patients will say damn the torpedoes, but I think we have a higher obligation to explain to them what they’re getting into. If there’s ever a place where I think informed consent is critical, it’s in Phase 1 cancer research.

Q: Talk about risk and potential benefit. Striking that balance sounds mind-boggling.

A: When there are studies that even the investigators and IRBs think are controversial, there’s a worry that the participants will overestimate benefits and underestimate risks.

There was a study of gene therapy for a disease that was not fatal but caused a fair amount of illness, and it was debatable whether gene therapy was an appropriate clinical approach to develop. But these researchers thought it was, so while the study was being done they asked us in bioethics to talk to families about why they wanted to be involved.

In one family, the woman said, “I think about the risks all the time. I worry about them. But I think it’s important and good for us to be able to contribute to this study.” She understood the risk and was very altruistic.

[Photo: a gene-sequencing lab. Every generation brings new ethical research dilemmas; genome sequencing (shown here at a UW Medicine lab) to advance personalized medicine is today’s issue, Wilfond said. Credit: Clare McLean]

Another family said, “We have no choice. I’m concerned about my son and I believe this is the best thing for him.”

That’s not how we want people to think about it but it shows that people in the same circumstances look at things very differently. They make decisions that are right for them even though, from our perspective, maybe they aren’t comprehending everything.

Q: If one patient says, “Yes, this research is useful to me now,” doesn’t that help researchers’ rationale?

A: Researchers do research because they are committed to improving healthcare down the line. A fundamental underlying tension of research is that, while its purpose is to improve medical care in the future, it has the potential to take advantage of people in the present day.

So part of our goal is to figure out how not to take too much advantage of patients – making sure they get good information so they know what they’re getting into, and ensuring that the risks are not so great that participating is a bad decision. And the research itself must truly have value.

Q: Is that caution a sign of progress?

A: It’s a good question. There’s no doubt that, over the decades, study participants have been exploited and taken advantage of by biomedical research. But today, I think we’re in danger of limiting research that could be helpful because of our concerns that it takes advantage of people. That trend could undermine our ability to take care of people in the future.

[Photo: Nurse Sarah Marcet gives Wilfond, a board-certified pediatric pulmonologist, an update on an infant in UW Medical Center’s neonatal intensive care unit. Credit: Brian Donohue]

Q: Talk about the influence of the academic pressure to publish, the desire for tenure – other forces at work that might motivate ethical lapses.

A: Of course there are academic pressures; if you don’t do your job, you’ll lose your job. But I think pressure to publish is probably similar to pressures exerted in any other industry. There are people who misbehave, but I think they’re far in the minority.

Q: Most breaches and lapses are entirely unintentional?

A: No, I wouldn’t say that. There’s the responsible conduct of research – are researchers following the scientific method, are they falsifying or plagiarizing? Some people are intentionally deceptive, but lapses can also happen because researchers lack the awareness to recognize or navigate gray areas.

Here at UW, we have a biomedical research integrity program that helps researchers be aware of pressures that might tempt them to think about cutting a corner.

One example we address in this program is, imagine you are reviewing a colleague’s paper or grant proposal. What if you read something in it that gives you an idea for your own research? How do you decide what to do with that information? What is your obligation with that review? You might think, “I don’t want them to publish because I want to publish my paper.”

So professionalism is certainly a component of ethics, but I think professionalism is more about how you relate to people. These issues are not equivalent to the ethical dilemma of respecting a parent’s wish to maintain or withdraw life support.

Q: If you think of today vs. 25 years ago, are ethical issues harder to address now because so many stakeholders have opinions about what is right, or easier because so much knowledge has been brought to bear on these topics?

A: Some things are easier because they have been thought about longer, and over time we have figured out how to approach them. But novel issues always come up and force us to rethink. Genomics research wasn’t even a question years ago, and now we’re grappling with it.

I think there has been progress related to informed consent. We do a better job of letting people know what’s involved in research. Probably the biggest change is that we pay better attention to what participants think. Years ago, we only paid attention to what we thought was right; now there’s a much greater appreciation for what patients think, for their ethical judgments. Now we work with the community to help decide what is ethical. That is a significant change.

For instance, a lot of hospitals now routinely collect samples from patients’ treatments to use in medical research. Here, patients are asked if they want to opt in, but in other places, like Nashville, they have to opt out to have their specimens excluded. Why is Nashville doing it differently? Well, they spent a lot of time talking with local people, and there’s great community appreciation for Vanderbilt University. Our system reflects that people in Seattle would get irritated if they were opted in by default; they would look at it as if they were being taken advantage of.

Q: Different approaches are OK because they represent individual communities?

A: Right. Both positions are ethically defensible because, in each case, the default reflects what the community prefers, and people still have the opportunity to choose otherwise.

Q: What have I neglected to ask?

A: In the same way that there’s a clinical ethics consultation service, we offer a research ethics consult service through the UW system. We are available if researchers or IRBs want input on a study. We can be involved at any stage, from study design to sharing of results.