Combating misinformation in a crisis: Lessons from Deepwater Horizon

Tiny Matters

On April 20, 2010, a drilling rig called Deepwater Horizon exploded, capsizing 36 hours later. Eleven workers were killed and, over the next 87 days, more than 100 million gallons of oil spilled into the Gulf of Mexico in what the EPA has called the largest marine oil spill in history. With public distrust of the companies responsible mounting, scientists had to find a way to study the spill and communicate what they found. So when faced with a crisis of this magnitude, when the stakes are so high, how do you dispel misinformation and effectively communicate what you know?

Transcript of this Episode

Sam Jones: Hi everyone! Sam here. Popping in before the episode starts to ask for a favor. Deboki and I are so grateful you’re listening and it has been a ton of fun seeing the number of listeners grow so much over the last year and a half since Tiny Matters launched. We’d love to grow even more, and the way we do that is by getting the word out. So send us to your friends, family, even enemies if they like science! Rating and reviewing us on Apple Podcasts, Spotify, wherever you’re listening, is also incredibly helpful. And if you wanna support us even more, we have Tiny Matters coffee mugs! We’ve left a link in the episode description. OK, on to the episode.

Deboki Chakravarti: On April 20, 2010, a drilling rig called Deepwater Horizon was operating around 41 miles off the coast of Louisiana, in the Gulf of Mexico. The company operating the rig, Transocean, had been hired by the oil company BP, and by that point, the crew had drilled into the bedrock and sealed a metal tube in place with cement.

Sam: But when the crew prepared to move Deepwater Horizon away from the well so that a smaller rig could connect there instead, something went wrong: gaseous hydrocarbons had somehow entered the well, expanding and increasing the pressure in the pipe until they escaped onto the rig, where something—heat from equipment, or maybe a spark—ignited them.

Deboki: That night, Deepwater Horizon exploded, capsizing 36 hours later. Eleven workers were killed, and over the next 87 days, over 100 million gallons of oil spilled into the surrounding areas in what the U.S. Environmental Protection Agency has called the largest marine oil spill in history.

Sam: With millions of gallons of oil spilling into the Gulf and public distrust of the companies responsible mounting, scientists had to find a way to study the spill and communicate what they found.

So when faced with a crisis of this size, how do scientists approach their work? And just as importantly, how do they explain what they know—and don’t know—when the stakes are so high?

Welcome to Tiny Matters, I’m Sam Jones. And I’m joined by my co-host Deboki Chakravarti.

Deboki: Today on Tiny Matters, Sam and I are going to be talking about science communication in the face of crisis. This is the second of a two-parter, and in our last episode we focused on the science underlying Hurricane Katrina and the more recent Ohio train derailment, as well as other factors that made those events so disastrous.

Now we’re going to focus on the lessons one scientist learned during the Deepwater Horizon oil spill.

Chris Reddy: So, in theory, I'm a chemical oceanographer, but in reality, I'm just a chemist whose beaker is the ocean. And for the most part, I study oil spills and other sources of pollution, not necessarily from a toxicity perspective, but how does nature fight back against all these uninvited guests, whether it's oil or plastic? And how can we learn from that to make better chemicals in the future, or respond to spills in a more informed manner?

Sam: That’s Chris Reddy, a senior scientist at Woods Hole Oceanographic Institution in the Department of Marine Chemistry and Geochemistry, and he’s spent quite a bit of time studying the Deepwater Horizon oil spill.

Chris: I ended up studying it for about seven or eight years. I got involved relatively early and was involved with estimates of what the flow rate was, because it wasn't just a crude oil spill, it was crude oil and natural gas. So I studied, you know, how fast it was coming out of the seafloor, what happens to the oil when it does, since some of it doesn't reach the surface. And then, you know, on a longer scale, how fast the oil broke down, by whom and by what, and what were the conditions that were driving it.

Sam: Deboki and I talked to Chris because when it comes to science communication in a crisis, he wrote the book—literally, he just published a book called Science Communication in a Crisis, where he uses his experience as a scientist during the Deepwater Horizon disaster as a way to explain what he’s learned about science communication in those high pressure situations.

And one of the things Deboki and I learned quickly when reading the book is that Chris is very open about the mistakes he’s made talking to the media.

Deboki: In 1996, as Chris was finishing up his PhD, a tank barge called North Cape was carrying around 3.9 million gallons of home heating oil when it ran aground at Moonstone Beach in Rhode Island. Chris’ advisor told him not to get involved, in part because he didn’t have media training, and working on this oil spill was sure to draw attention. But Chris’ curiosity won out, and he started gathering samples and looking around.

Chris: I was a very jerky, very naive, extremely unsupported, overly confident chemist. But, you know, long story short, I had made a finding from this North Cape oil spill. I was on the front page, you know, everybody was excited. The State of Rhode Island's gonna get millions of dollars more because of my finding. And a local reporter, Peter Lord, who's now passed away, interviewed me. And apparently when he was interviewing me, I had said something like, well, the reason why I found it was that I was a smarter, better scientist than the government scientists.

That didn't make the article. And about 10 years later, after I had worked on a couple oil spills and had a little bit of appreciation for science communication, I decided to teach a course with a friend of mine who was a science writer. One of my guest speakers was Peter Lord. He shows up and he goes, whatever you do, don't do what Chris did in his first interview with me, and then he recited all of my mistakes. And then he said, “But I didn't run them because I thought it wasn't part of the story, and I didn't wanna hurt this guy's career.” And to me, his kindness and his humanity toward me are just such a strong reminder that, you know, the media isn't the enemy of science.

But it took me 10 years to find out. I lived 10 years thinking that I did a good interview. <laugh>. Come to find out I bombed it. I just so happened to have a really good writer, you know?

Sam with Chris: So obviously, based on those stories, you weren't going into the Deepwater Horizon oil spill without any experience with the media. But why do you think that Deepwater Horizon was a particularly formative experience for understanding the challenges of science communication during an environmental crisis?

Chris: There weren't a lot of active oil spill scientists in academia before Deepwater Horizon. I happened to be one. There were a few others out there, and I had some media exposure. So arguably I was extremely well positioned. I think I did okay, but I still made a lot of mistakes. But I think the reason why, you know, Deepwater Horizon became a big science communication challenge was it was so long, right? I mean, it was an 87-day release, and then there was another two months of oil on the surface.

And I think personally, the reason why there were a lot of science challenges was it gave enough time for other scientists to get into the mix. You know, a lot of environmental crises and other crises are fast, right? So it's hard to even get into the game. Nobody knows your phone number, right? But with an 87-day release, a lot of folks can get into play. You get a lot more people in the media involved. You get a lot bigger audience.

Deboki with Chris: And one of the things you talk about as a recurring challenge was the amount of public distrust coming with all of that information. And that goes in a lot of directions towards the government, towards BP, towards the scientists. Can you talk a little bit about how that distrust built, and did it end up factoring into the choices that you made when it came to communication?

Chris: There were a lot of problems that happened during Deepwater Horizon. You know, it happened in the Gulf of Mexico, and the New Orleans area, you know, was home base. And folks were still very upset about the aftermath of Katrina and how the government response went.

I think you put that into play on top of the fact that you have a very fast moving but difficult-to-understand situation. And I think that it also highlighted this idea that, you know, science is not a house of cards, right? This idea of one scientist said this, and the government scientist said this, and, you know, it's some type of shootout or duel from Hamilton, but that's just not the way science works. It's so much more incremental, you know? And when you try to accelerate science in an area where there's a lot of uncertainty, it is not a good mix. Science was not built for speed.

And we live in a world where we can pick up our phone and in four seconds find out, you know, how many people were at the Red Sox game yesterday, right? So we get certainty and speed. And science is neither certain nor fast. And so you have an impatient world that wants certainty. And you have scientists who are uncomfortable giving facts because they're afraid of what their colleagues are going to think. Or, on the flip side, they're being too certain and too over the top. And when you have these places where you have either no knowledge or conflicting knowledge, that's where other problems get to come into play.

You have an expert going on TV who lists all the things they don't know. Well, we don't know this, we don't know that, we don’t know this. So you have a scientist who's saying, we don't know, in part because they're not talking to the audience of the show. They're talking to their colleagues, to let them know that they're not being hacks. But the end result is you have scientists saying, I don't know. And then you have an audience who's hungry for something to know, and that's when you have an opportunity for misinformation to grow. And so my advice to scientists is always to lead with what we know. You'd be surprised by how much we do know. And be mindful that your audience is whoever's listening to you, and don't worry about being critiqued by your colleagues.

Sam: Of course, Chris was not at Deepwater Horizon just to talk about the spill. As he mentioned, he was there to study the area, which ended up teaching him surprising lessons about how oil degrades.

Chris: I have a great chemistry story. It's so much fun to talk on a podcast with chemists. No disrespect to other podcasts.

Up until Deepwater Horizon in April 2010, the prevailing wisdom was that sunlight broke down a very small fraction of hydrocarbons, the ones that would absorb light, you know, aromatics. And it took a long time, so it was kind of a non-player, right? It was just a small percentage. I walked into Deepwater Horizon thinking that myself. And, you know, a couple years later I'm working with this woman Catherine Carmichael, who's a tech in my lab…

Deboki: Catherine was analyzing oil samples exposed to sunlight with a technique called gas chromatography and found that the amount of hydrocarbons in the sample was around half of what they’d expected. They thought it might be a mistake, but when she repeated the experiment, she got the same result.

Chris: And come to find out, what we had learned, you know, years and years later, was that about 50% of the oil that reached the surface during Deepwater Horizon was photochemically changed…

I walked into Deepwater Horizon thinking that photochemistry on oil was an insignificant non-player acting on a small fraction of compounds present. And what we learned was that 50% of the oil that reached the surface from Deepwater Horizon was changed by sunlight within days, to the point where it gained oxygen and changed its chemical behavior, to the point at which the dispersants that were added were less effective on the sunburnt oil than on non-sunburnt oil. And you know, I'd been studying oil spills for 15 years. I never would've believed that. I was lucky to be part of a couple teams who were both sniffing around that topic, but in our case, we tripped over it because I doubted Catherine's math.

Sam: Chris mentioned dispersants there. Dispersants are chemicals that, when mixed with oil, break it into smaller droplets that are more dilute and more easily degraded. Dispersants would end up being a big part of the response to Deepwater Horizon, and they posed a communication challenge that was magnified by the need to implement a solution quickly. This might sound similar to the conversations around vaccines during the pandemic. And in fact, when we chatted with Chris, he drew a comparison between dispersants and the COVID vaccine rollout.

Chris: During Deepwater Horizon, they used 1.6 million gallons of dispersants. About half of it was dropped onto the surface, and about another half was injected at the wellhead on the seafloor, the first time that had ever been done, in part because they thought it would improve the air quality. It is a hot topic. Most people want to talk about the 1.6 million gallons of dispersants being used by the responders rather than the 160 million gallons of oil.

Sam with Chris: Why do you think that is?

Chris: I think people have a problem with, you know, using a chemical to solve a chemical problem. I wanna be very careful here. There are a lot of analogs between using dispersants and vaccines. Personally, based on the science that's in hand, I think that dispersants worked.

The disconnect between using dispersants and the vaccines, and this is where I would've done Deepwater Horizon much differently, was they did not put in a monitoring program to gauge the efficacy and whether they needed to make course corrections, whether using the dispersants worked or not. They did a study, it looked like it got a little bit better, and they just went for it.

Now granted, doing science in a crisis is not ideal, but that is the disconnect: there is very little science available about using dispersants during Deepwater Horizon. And because of that, it still festers along. Compare that to the vaccine, where a tremendous amount of work was done to monitor the efficacy, with follow-ups and all the science, because the stakes were so high.

Deboki with Chris: Do you think there's an opportunity to at least create some set of data that could help people in the future?

Chris: The good thing about Deepwater Horizon was that, as part of a settlement, BP put forth $500 million for research, and there are a lot of research funds out there, and I think they've made some great strides in recognizing some of these shortfalls and what needs to be done. But at the end of the day, nobody gets to plan, right? And the clock is ticking. You know, if you're a responder and you only have two boats and you have three problem areas, but you also have the opportunity to maybe run a monitoring program, what do you do? Where do you build in the good science to monitor something that might be useful now or later, or do you rely on past experience and past knowledge when the clock is ticking?

Deboki: One of the things I took from reading Chris’ book and talking to him was an understanding of the logistics that surround these crises and that scientists have to navigate.

Chris: I think the one thing we have to keep in mind is that Deepwater Horizon, as bad as it looked, was not as bad as what folks predicted. And it points to nature's resilience—not by any means giving a free pass to the oil spill. But, more importantly, I think it's a matter of logistics and infrastructure. Deepwater Horizon was not as bad as what was playing out because you had access to boats, helicopters, and, believe it or not, hotels and restaurants. Right? You had the capacity to support an army of responders, to get them in place and keep this bad thing from getting worse. And, you know, that's the difference between this and the same type of event in a remote area, which would've been much, much worse.

Sam: And for Chris, as someone who has been talking to people about oil spills for a while now, that makes it all the more important for scientists to be involved with their communities before a crisis happens.

Chris: Former Commandant of the United States Coast Guard Thad Allen had a saying during Deepwater Horizon, though it might be earlier: “You don't exchange business cards in a crisis.” And it couldn't be more true. The best outcome during Deepwater Horizon, and this was said by Jane Lubchenco, who was then the administrator of NOAA, and she wrote a really nice piece about it with Marcia McNutt, who is now at the National Academy of Sciences, was that scientists who had a preexisting relationship with the government ended up having the best outcome in terms of transfer of knowledge.

Now that's challenging when you're on a national scale, right? And that's why I recommend scientists all start local. You know, meet with your fire department, meet with your police, meet with whomever you think you might cross paths with, if anything you do would be of value to them.

I also think that on a local scale, people will trust you more. Um, and I think you can be more effective because you're mindful of the culture and the value system of your neighbors. And so this sense of building relationships and starting local, I think, is a very powerful approach and will almost always yield a better outcome.

Sam with Chris: There were many takeaways from your book, but one of the key ones, I think, was understanding that there are so many different stakeholders and they all have different goals. So you have a fast moving situation, the stakes are high, and different goals can make messaging, to the public and to each other, you know, between stakeholders, really difficult. So for you as a scientist, how do you define success when it comes to communication in these settings?

Chris: I would hope that if I was communicating science now, I would provide the media, and the people whose lives or livelihoods are touched, some information that would at least give them some sense of knowledge. You know, they can say: there are folks who know this, they think about it, and this is what this person said, and I'll take that over something on Reddit. Right? That's a net victory there.

Obviously, if I was speaking to the responders and they were asking my advice about whatever the spill was, success would be giving them some information that was actionable, that they actually could use my insight and, again, keep a bad thing from getting worse. That certainly would be success.

But that's for me, right? My life doesn't revolve around whether or not the fishery gets shut down in the Gulf of Mexico. I mean, you can have a member of the public who's interested in science or in pollution living in Seattle. That's a very different public from, say, a household where one spouse works as a fisherman and the other works at the fish processing plant in Louisiana. And that's a completely different, you know, messaging and need, because the outcome is different. Maybe the person in Seattle is listening out of curiosity or interest in the environment. But an oil spill in your backyard is your life and your quality of life. You know, you can't fish and you can't work at the fish processing plant. You have two people at home who are trying to figure out whether, you know, they should be going on vacation or when they're going to eat next. The stakes are significantly higher for a scientist trying to inject knowledge in there to help these folks out as best we can. I mean, no scientist is gonna save the day with a cape, but obviously trying to help folks get through what matters to them the most is a victory.

Sam: Alrighty. Tiny show and tell. I think I'm first today.

Deboki: Cool. Do it.

Sam: So as we were kicking off this recording, we were talking about how a lot of the things that we've been covering lately have been a little bit depressing. I'd like to think that there was some hope that came from this episode with Chris, but either way, I wanted to bring something hopeful for my tiny show and tell. So my tiny show and tell today is about how the first gene therapy for children with Duchenne muscular dystrophy has been approved by the US FDA, which is really exciting news. Gene therapy, generally speaking, is where you modify the genes in a person's own cells so that they no longer carry the disease version of a gene and instead have the normally functioning one, which ideally will help cure the disease. It's so cool to just think about the fact that we can actually go in and change the DNA in people's cells so that they are cured of a disease. Incredible.

So Duchenne muscular dystrophy is caused by mutations in the dystrophin gene, which normally makes this large protein that acts kind of like a shock absorber that keeps people's muscles intact. Patients who have this disease don't have that shock absorption, so every time their muscles contract, they actually become damaged, which is so horrible. The disease becomes deadly once the most important muscles, like our heart muscles and the muscles that control breathing, start to deteriorate because of this lack of shock absorption.

This is a pretty rare disease, affecting about six out of every 100,000 people in Europe and North America, and it mostly affects males. What scientists have done is create a shortened form of the dystrophin gene, essentially the form it should be in, then pack it into a harmless virus and deliver it to muscle cells in these kids so that their muscle shock absorption can happen normally and they will no longer develop all of these horrible symptoms of the disease. Whenever we talk about disease, not all of it is happy, because the disease exists, but the fact that this is now available for really young kids, so four- and five-year-olds, I'm sure provides a lot of hope for the families and kids who are dealing with this.

Deboki:
That's really exciting. My grad school work was on engineering T-cells for cancer therapy, so I always get really excited about any kind of gene therapy that gets to go forward, because it takes so long to develop these therapies. A lot of the technology is still, relatively speaking, young. There's been so much work that's built up to it, but its use in therapies is relatively new given how long we've been making medicines, and so it's so exciting to see it be used and to see it actually helping people.

Sam: Yeah.

Deboki:
Well, I don't know that mine is good news, but I do like to think of this as something that will make you, at least you, Sam, happy. I hope. Because I was in the car listening to The Economist's news podcast and they started telling this story, and as soon as I heard it, I got very excited. A lot of times for tiny show and tell, it's a last minute thing: what are the cool science articles I've read lately? But this one, I heard it and was like, "This is the thing that I'm going to tell Sam about," because it was about the import of body parts for medical students to learn from in England.

Sam:
Oh my gosh. Yeah. This is totally something I'd want to learn about.

Deboki: Yeah, so for those of you who are newer listeners, we have a whole episode on Body Farms that I think is basically the reason Sam started this podcast, because she really wanted to get to explore it. It was an interesting episode to work on; we learned a lot about body farms, how they work, and a lot of the ethics that go into them. And that was part of why I found this article really interesting, because it's basically about the availability of bodies for medical students to learn from in Britain. One of the things they were talking about is that there are actually computer generated simulations now to help medical students learn. And so those are helpful. But there's also just a very fundamental difference between dealing with something that is computer generated and actually dealing with a real body.

A common experience is for students to basically have to learn not to faint when they're dealing with a real body. And that is something you can only get with an actual body to learn from. But there are also, obviously, lots of ethical issues around this. One of the big differences between Britain and America is the consent that you can give when you are donating your body. In America, if you have died, someone else can consent to donating your body, but Britain only allows first person consent, so you have to be of sound mind while donating your own body, and nobody else can donate it for you. On the one hand, that definitely creates a regimented ethical guideline. But on the other hand, it also means that there are ultimately fewer bodies to study from. And so that creates challenges.

It also creates this weird situation for the doctors themselves, where they end up having to learn a lot from bodies that are imported, and there's kind of an uncomfortable feeling around that. The article is about this national repository center at Nottingham City Hospital that was created in 2011, which operates similarly to a lot of the American companies that help distribute body parts, but keeps it in England, and it actually loans body parts out. It was really interesting reading in this article about these complicated ethical frameworks and what their effect actually is, both for the people who are donating bodies and for the people who are hoping to use them. And the title of the article is "How Much is a Human Head," because it gets into the pricing schemes for these different body parts.

Sam:
Yeah. The exchange of money for this kind of stuff is so weird. I understand that it takes money to ship and store and employ people to help with that, but then, yeah, it just feels so complicated. I'm excited to read this because I feel like there's 10 different ethical issues that popped into my mind.

Deboki: So one of the things that was interesting about this National Repository Center is that they don't sell the body parts. They technically just loan them out. So when surgeons are done with them, the body parts return to the center, which will eventually cremate them; I think they have a sort of process there. But yeah, I feel like anytime I read about this, it's supposed to be uncomfortable. I don't think we should come out of it feeling like this is a comfortable thing, totally chill. I think it should always feel uncomfortable, because sometimes you just need something to feel icky to make sure that it's going right. But it's still so hard to know what it would ultimately look like for all of us to feel all right.

Sam: And yeah, if people are interested in these anthropological research facilities, which in slang are called body farms, these are not black market body harvesting or whatever. These are legit research facilities. And I think we had a really wonderful conversation with some of the people who work at anthropological research facilities, these body farms, talking about ownership of your own body and ethics and consent and all of these different things. I feel like I need to read that article and send it to one of the scientists we talked with for the Body Farms episode and see what they think of this, because, yeah, that's so weird. If I donated my body to science and then ultimately someone made a profit off of it by storing it in a facility, well, I'd be dead, but I'd be pissed. I don't know. I'd be haunting that facility like crazy.

Deboki:
Thanks for tuning in to this week’s episode of Tiny Matters, a production of the American Chemical Society. This week’s script was written by me. And it was edited by Sam, who is also our executive producer, as well as by Michael David. It was fact-checked by Michelle Boucher. The Tiny Matters theme and episode sound design are by Michael Simonelli and the Charts & Leisure team. Our artwork was created by Derek Bressler.

Sam: Thanks so much to Chris Reddy for joining us, we’ve left a link to where you can pick up his book Science Communication in a Crisis: An Insider’s Guide. If you have thoughts, questions, ideas about future Tiny Matters episodes, send us an email at tinymatters@acs.org. And if you’d like to support us, pick up a Tiny Matters coffee mug! We’ve left a link in the episode description. You can find me on social at samjscience.

Deboki:
And you can find me at okidoki_boki. See you next time.
