
They turn to Facebook and YouTube to find a cure for cancer — and get sucked into a world of bogus medicine

June 25, 2019 at 3:10 p.m. EDT
(Cameron Cottrill for The Washington Post)

Mari pressed kale leaves through the juicer, preparing the smoothie that she believed had saved her life.

“I’m a cancer-killer, girl,” Mari told her niece, who stood next to her in the kitchen. The pair were filming themselves for a YouTube video.

Mari said she was in remission from a dangerous form of cancer, and the video was meant as a testimony to what she believed was the power of the “lemon ginger blast.” In went some cucumber, some apple, some bok choy, a whole habanero pepper.

While she pressed, she preached.

“I’m telling you, it’s anti-cancer,” Mari said. “It’ll kill your cancer cells.”

The video, first uploaded in 2016, remains on YouTube, but an “important update” is attached to its description, written a year later by Liz, the niece.

Mari’s cancer had returned, the note said, and she had died.

When Mari’s cancer came back, Liz wrote, her aunt opted to do chemotherapy. Her smoothie recipe remains online, with 506,000 views and counting. “I will not take down her videos,” wrote Liz, who declined to comment for this story, in the description of a follow-up video, “as they continue to help people.”

I found Mari’s videos without looking for them last fall, when a search for a smoothie recipe opened up an algorithmic tunnel to videos that claimed to know the secret to curing cancer. These tunnels, forged by Google searches and Facebook recommendations, connect relatively staid health and nutrition advice to fringe theories, false claims and miracle juices.

But the web of false, misleading and potentially dangerous cancer “cures” and conspiracy theories isn’t just there for those who stumble into it accidentally. More often it ensnares people who are reeling from bad news and groping for answers.

“People with a new cancer diagnosis are often feeling vulnerable and scared,” said Renee DiResta, a researcher who studies disinformation. The treatments for cancer, especially chemotherapy — which targets cancerous cells but can also kill or damage healthy ones — can come with significant, unpleasant side effects. Facing the horrors of such a diagnosis and treatment, some people start searching for information and community online.

What they find can be quite disturbing to medical professionals: home remedies that purport to cure diseases with baking soda, frankincense, silver particles.

Google and Facebook have promised to crack down on health misinformation in recent months, as links between anti-vaccine conspiracy theories and measles outbreaks in the United States became major news. But bogus health information cannot be eradicated from the Web with a shock of chlorine. Health conspiracy theories and false cures have polluted social media for years, abetted by companies that have been more focused on building out the plumbing than on keeping the pipes clean of misinformation.

Internet companies have long argued that Section 230 of the Communications Decency Act gives them the autonomy to moderate their platforms for abusive and harmful content, while shielding them from legal liability over those moderation decisions — and what their users post. Increased attention on how tech companies moderate themselves has led to calls from lawmakers on both sides of the aisle to weaken, or revoke, this immunity. Tech companies have warned that doing so would limit their ability to remove hateful content.

For now, tech companies respond to reports of harmful content on their own terms, and at their own pace. The result, in the case of health misinformation? A long period during which seekers, ushered by algorithms, found themselves immersed in wells of dubious advice and conspiracy thinking. They soaked in the wisdom they found there, and carried it into their own networks by the bucketful. In this way, the proliferation of bogus medical science in the Internet age resembles a public-health crisis: The harm can be hard to calculate, and remedies cannot undo the damage already done.

As recently as late April, searching “cure for cancer” on YouTube (turning on “incognito mode” so that my prior search history wouldn’t skew the results) surfaced several troubling results: The sixth video, with more than 1.4 million views, claimed that baking soda could cure cancer. The eighth was an interview with self-described cancer expert Leonard Coldwell, in which Coldwell explains that every cancer can be cured in weeks with a special diet that “alkalizes” the body, a claim that has been debunked by scientists. The video has more than 7 million views. (In an emailed statement to The Washington Post, a spokeswoman for Coldwell identifying herself as “Danielle” claimed that Coldwell, who no longer treats patients, had the “Highest Cancer Patient Cure Rate in the world,” and boasted that Coldwell remained popular despite being “the most blocked Cancer Patient Expert in the world.”)

YouTube is trying to plug the holes that lead to videos like the Coldwell interview. When I ran the “cure for cancer” search again, in May, YouTube’s search results told a completely different story. The baking soda and Coldwell videos were still online but no longer appeared among the top pages of results. Instead, most of the top results came from major cancer research centers.


I asked YouTube about the change, which occurred just before we reached out to the company for comment on this story. I was told YouTube has started to treat search results for different types of topics differently: When its algorithms decide a search query is related to news or information-gathering on a topic like cancer, they will attempt to populate the results with more authoritative sources. The company said it is working with experts on certain health-related topics to improve results.

Even as YouTube patches “cure for cancer,” medical misinformation remains available and popular in other ways. People who are susceptible to cancer misinformation aren’t just typing keywords into YouTube. They’re also turning to fellow travelers who followed the same algorithmic tunnels to the same wells, where community members who have never met in person swap folk remedies and discuss the untrustworthiness of cancer doctors and pharmaceutical companies.

It’s tempting to think of medical misinformation as a technological problem in need of a technological solution, but that’s only part of it. The social media age has made humans part of the infrastructure of the Internet. And when it comes to medical information, it’s not just algorithms that direct online seekers who are trying to figure out how to cope with a bad diagnosis. It’s also other people.

For those facing a battle with a terrifying illness, hopeful anecdotes can be powerful. Anecdotes can turn seekers into believers, who can turn other seekers into believers. And on Facebook, those anecdotes continue to attract large audiences.

Even as Facebook works to limit the reach of anti-vaccine chatter, other medical misinformation is thriving — including bogus cancer cures. The boundaries between false medical beliefs are permeable: If you believe baking soda can cure cancer, you might also believe that the measles vaccine causes autism. (It doesn’t.) Behind each “alternative” theory of cures and causes lurks a deep suspicion of doctors, drug sellers and especially chemotherapy.

On Facebook, I easily found groups devoted to sharing “natural” cures for cancer, where people who have cancer diagnoses, or care for someone who does, asked other group members for ideas for how to cure it. “Cancer Cures & Natural Healing Research Group” has just under 100,000 members. I joined the closed group in February, identifying myself as a Washington Post journalist to the administrators.

The administrator for that group initially agreed to speak with me in private messages. But then I was blocked from the group and the administrator’s personal Facebook page. (The administrator did not return a follow-up email seeking comment.)

Facebook’s algorithms then began suggesting other groups I might like to join: “Alternative Cancer Treatments” (7,000 members), “Colloidal Silver Success Stories” (9,000 members) and “Natural healing + foods” (more than 100,000 members). I requested access to some of those groups, too, and several admitted me. People in the groups would ask one another for cancer-fighting advice. Some would be told to use baking soda or frankincense.

Rather than remove the groups, Facebook’s strategy to limit health misinformation centers on making it harder to join them unknowingly. Facebook said in an emailed statement that it “will alert group members by showing Related Articles” for any post already deemed false by Facebook’s third-party fact-checkers, for instance.

Facebook is in the process of experimenting with how to address health misinformation beyond vaccines. One possibility might be alerting users who are invited to join a group that it has circulated debunked hoaxes.

To this point, it’s been up to users to steer their peers toward or away from bad health advice. In one Facebook group, in February, a parent asked for advice on how to cure a child’s strep throat without antibiotics. The responses were split: Some told the parent to stop messing around and go to the doctor for antibiotics; others recommended colloidal silver and hydrogen peroxide. The National Capital Poison Center notes that even food-grade hydrogen peroxide “should never be taken internally” unless extremely diluted, and that its use as an alternative therapy is “not based on scientific evidence.”

The world of alternative medicine seekers has its own celebrities. The names are like pass phrases. Post a question about natural cancer treatments in the right Facebook group, and you’ll get the names of supposed success stories that the pharmaceutical industry doesn’t want you to know about, and the instruction to “do your own research” into their stories.

“CHRIS BEAT CANCER! Look it up,” one Facebook user advised on a discussion thread.

So I did. The first Google result, when I ran the search in mid-May, was Chris Wark’s website, where Wark sells access to his method for $147. Below that, Google also suggested a few specific videos from Wark, promoting his “cancer fighting salad” and a lecture on how he beat cancer with “diet.”

Joanna Tackett, a spokeswoman for Wark, said in an email that Wark is not a doctor and does not provide medical advice, and that he has given free access to the paid program to hundreds of thousands of people.

Misinformation experts worry about “data voids,” created by the way information gets indexed online. If the only people discussing and looking up a particular term or phrase are those advocating a certain view, people searching that phrase would be shown information that supports that view.

“You can easily dominate search results for a term when you’ve created the term and only in-groups use it,” DiResta said. As social media companies identify and crack down on one search term, she said, 20 more might be rising in interest to take its place.

A Google spokesman said the company has worked to improve the accuracy of results for general health-related queries, but for very specific searches, such as “Chris beat cancer,” the system is designed to return “results from a diverse range of sources to help you form your own opinion — some of these provide information about the book, while others provide critiques.”

Google users searching for cancer information more generally might end up being served ads that promote dubious treatments, even if those sources don’t show up in the search results. In a search for “cure for cancer,” in May under incognito mode, the first result was an ad: “Stage four cancer survivor | thanks to natural cancer cures.” The ad promoted a cancer clinic in Mexico that appears to use unconventional treatments. Another Google spokesman said that ads promoting miracle cures are against the company’s rules, and that “if we find ads that violate our policies we remove them.” The ad was removed after I flagged it in an email.

For some searches, the results will be a tug-of-war between the believers and debunkers. Searching “Chris beat cancer” did not reveal a total data void. Google did show me two results challenging Wark’s claims about beating cancer with a healthy diet — but only after links to his YouTube channel, his website, the salad video, the lecture and Wark’s book. (When I ran the search later, after asking Google about the results, the challenges appeared a bit more prominently.)

One of the challengers is David Gorski, a surgical oncologist at Wayne State University School of Medicine who runs a blog, called Science-Based Medicine, devoted to debunking medical bunk.


Gorski dove down the cancer conspiracy theory tunnels a decade ago, armed with science and determined to stanch as much misinformation as he could. Back then, he said, “all there really was were websites, blogs and discussion boards that were privately maintained. Their reach was nowhere near what Facebook came to be.”

Now, Gorski faces not only a more forceful tide of misinformation, but also intense blowback from those who have responded to his work. Accusations of wrongdoing from holistic healing sites and a wave of negative ratings on his Vitals.com profile have tainted the results of Google searches for his name.

Gorski’s debunking of Wark’s story was simple. Wark, who says he had surgery for his Stage 3 colon cancer but refused chemotherapy after, had a 64 percent chance of surviving five years with surgery alone, Gorski said. To get that figure, the oncologist used a tool called Adjuvant Online, which helps doctors assess the risks and benefits of potential therapies designed to prevent the recurrence of cancer after it is treated. The database used clinical trial data from a wide range of studies.

“Attempts to discredit me because I had surgery give far too much weight to my personal story, and miss the larger message. . . . People have healed all types and stages of cancer holistically (against the odds),” Wark said in a statement. “As a patient advocate, I am highly critical of the cancer industry and pharmaceutical industry,” he added, before saying that “I do not tell patients not to do the treatment.”

Surgery was the recommended primary treatment for Wark’s cancer, Gorski said. Chemotherapy is a secondary measure, meant to help prevent the cancer from coming back. Wark’s decision to forgo the post-surgery chemo was a risk, but by then the odds were in his favor.

After I was kicked out of the “Cancer Cures and Natural Healing Research Group,” I joined the similarly named “Natural Healing & Cancer Cures Research Group,” a closed Facebook group with more than 40,000 members. (Again I identified myself as a journalist while joining the group.)

That’s where I saw a post by Beth Anne Rekowski, who said her sister was sick with Stage 3 lung cancer.

“She and I both agree,” Rekowski wrote, “NO chemo or radiation.”

I called Rekowski to find out why.

Cancer has haunted Rekowski for much of her adult life. She wanted to be a nurse, but when her young son got cancer, she dropped out of nursing school. When he died in 1992, at age 4, Rekowski felt like she had died with him. Grief became activism, and she started raising money for charities that helped pay for cancer research. Later, her brother-in-law and her father were diagnosed. They died, too.

Rekowski had health issues of her own, and on the advice of a friend, she visited two naturopathic doctors. When their practices closed (Rekowski blames Big Pharma), she started seeking out remedies online. She found Facebook groups full of them.

“I used Facebook health groups just full-speed ahead,” Rekowski told me in a March phone interview from her mother’s home, where she is a full-time caretaker, “and I couldn’t believe what resources were on there. People, you know, to help other people because they’ve been there.” The Facebook groups were Rekowski’s lifeline.


She came to believe that chemotherapy, not cancer, had killed her son, father and brother-in-law. “Talk about parental guilt and remorse,” Rekowski told me.

Now, every sick relative is a chance for redemption. She advised her sister against chemotherapy or radiation to treat her lung cancer. Rekowski wanted her to use “naturals” and “immunotherapy” instead. And so she turned to her lifeline: the other members of the Natural Healing & Cancer Cures Research Group.

“If you can please give me a list,” she wrote in a post on the Facebook group, “in order of urgency and priority, of what you feel is imperative for nutrition, immune boosting, cancer killing, and whatever else you feel my sister needs.”

The responses flooded in by the dozens.

Salt water baths. 4 times a day.

B17 vitamin. . .CBD Oil full spectrum.

Add Wheatgrass juice to your sister’s diet.

Rekowski’s sister trusted her doctors: She did chemotherapy. Then she got an infection in her lungs, according to Rekowski, and in March her doctors said it was time to enter hospice. But Rekowski still had hope. She said she convinced her sister to wait on hospice, and went to a health food store that evening and bought a small fortune’s worth of essential oils.

She believed she could heal her sister’s lungs with fenugreek, licorice root, peppermint oil and oregano oil. Once her sister’s lungs were better, Rekowski believed, she could get to work curing her cancer.

In May, Rekowski wanted me to know that she believed her sister was a miracle. Once the infection had subsided, she texted to say the doctors had offered to start her sister on chemotherapy again. “My sister declined,” she wrote, and decided to continue with “natural supplements and natural oils.”

When Rekowski and I talked about health, it sometimes felt like we were talking about faith. The story she tells about her sister’s illness is meant as a parable about how chemotherapy can kill. She saw me, and the readers of this article, as potential converts.

I told Rekowski that I believed the groups she depended on exploited people’s desire for hope in the face of a bleak prognosis. When there are no options left, it’s powerful to find a community that tells you otherwise, even if those options turn out to be ineffective or even harmful.

But each time I challenged her with a counterpoint, Rekowski waved it away. The government was covering up evidence that supported her views, she told me. The treatments she found on Facebook worked for her, she believed, and that was all the proof she needed.

As a surgical oncologist, Gorski sees the effect that medical misinformation can have on the body. A couple of times a year, he says, he’ll treat “patients with neglected cancers, who try to treat their cancers naturally” before turning to medicine. The tumors have become “nasty ulcerating masses.” Even for patients with terminal diagnoses, traditional medicine can offer palliative care that can manage the pain and may be covered by health insurance.

After years of allowing health misinformation to spread, social media companies are beginning to treat the problem as best they can. They didn’t create cancer conspiracy theories, but experts like Gorski have observed how they made the problem worse. “It’s just way more concentrated and effective,” he said. “You go on Facebook and type in ‘alternative cancer cures’ and you’ll find stuff real fast.”

Correction: An earlier version of this story misspelled Beth Anne Rekowski’s name.
