An hour-long video filled with COVID-19 conspiracy theories has been watched more than half a million times on Facebook and YouTube – prompting both firms to take it down. The 67-minute clip, which focuses on lockdown in Ireland, features a conversation between right-wing vlogger Dave Cullen and anti-EU activist Dolores Cahill. Throughout the interview, Cahill claims lockdown is unnecessary, touts the "miracle drug" hydroxychloroquine as an effective treatment for COVID-19, and claims recovered patients will have immunity for life. Ireland has slowly started loosening lockdown restrictions following 25,000 reported cases and around 1,500 deaths linked to COVID-19. Visit Business Insider's homepage for more stories.
A video filled with misleading claims about COVID-19 was watched more than half a million times on Facebook and YouTube over the last week, prompting both firms to remove the clip. The hour-long video was uploaded by Dave Cullen, a right-wing vlogger and activist based in Ireland, best known for his "Computing Forever" YouTube channel, in which he regularly rails against political correctness, online censorship, and "wokeness". In the clip, Cullen and Prof Dolores Cahill, chairperson of the Irish Freedom Party, a fringe political outfit with ties to the alt right, make a series of inflammatory and unsubstantiated claims about COVID-19.

Ireland started easing COVID-19 restrictions on Monday, following weeks of strict lockdown. The country has recorded around 25,000 confirmed cases and close to 1,500 deaths linked to the virus.

Cahill, a professor at University College Dublin, appears to have a credible background in virology and disease transmission, which she details extensively in the clip's opening 10 minutes, while failing to mention her own political affiliations or aspirations. Insisting social distancing in Ireland is unnecessary, Cahill tells viewers the country should exit lockdown completely "within the next week or 10 days", adding that she would "be happy to take responsibility for those actions and be held to account". Cahill goes on to claim that those who have recovered from COVID-19 are immune for life, that a combination of vitamins C, D, and zinc will stop most people from developing symptoms, and that hydroxychloroquine will effectively cure victims of the virus. In the latter half of the video, Cahill's pronouncements become increasingly political, with calls for "an inquiry into the media and the politicians" in Ireland and a suggestion that the country's national broadcaster RTE should have its licence fee revoked.

Almost all of the claims made by Cahill and Cullen in the video have either been debunked or remain the subject of intense scientific research, such as the effectiveness of hydroxychloroquine in treating COVID-19. Despite being touted by US President Donald Trump as a miracle drug, hydroxychloroquine has shown little to no benefit against the virus in trials. Two observational studies, published in the New England Journal of Medicine and the Journal of the American Medical Association, found that among thousands of hospitalized coronavirus patients, those who got the drug did no better or worse than those who didn't get it. The JAMA study also found that those who received hydroxychloroquine combined with the antibiotic azithromycin had a higher rate of cardiac arrest. Additionally, while some initial studies saw promising results from the drug, experts warned that those studies were "limited by their low quality, often enrolling tiny groups of patients or lacking a control group to compare the results against."

There is limited evidence that vitamin supplements, taken to boost an individual's immune system, could help fight off infection, but the results are far from conclusive. At the same time, there is no reliable evidence that a recovered COVID-19 patient will be "immune for life". In a statement released in April, the World Health Organization said there was "currently no evidence that people who have recovered from Covid-19 and have antibodies are protected from a second infection".
Although it is likely that recovered patients will develop some degree of immunity, there is no consensus on how long it would last.

Cullen's video, which accrued more than 500,000 views across Facebook and YouTube, raises questions over the tech firms' approaches to misinformation. Facebook launched its "COVID-19 Information Hub" earlier this year, compiling guidance from reliable sources – such as the CDC and WHO – and placing it at the top of every user's news feed. Meanwhile, YouTube claims to have been manually reviewing and removing thousands of videos that spread dangerous or misleading coronavirus information. The company has not made clear whether Cullen's video met that threshold.

Apparently aware that his video was likely to be removed, Cullen told viewers to "download and reupload this video everywhere," adding: "Please do." At the time of writing, at least one other version of the video remained live on YouTube, uploaded by Kerry Baldwin, whose channel focuses "on the philosophical thought of liberty". An unknown number of Facebook users have posted alternative links to the video on BitChute, a YouTube alternative known for accommodating right-wing vloggers.

A Facebook spokesperson told Business Insider: "We have removed this video for violating our stringent harmful misinformation policies. We are taking aggressive steps to stop misinformation and harmful content from spreading on our platforms and have removed hundreds of thousands of pieces of content, both proactively and following user reports."

A YouTube spokesperson said: "We're committed to providing timely and helpful information at this critical time, including raising authoritative content, reducing the spread of harmful misinformation and showing information panels, using WHO data and information from local health authorities, to help combat misinformation."

Business Insider approached Cullen and Cahill for comment.
More like this
84% of medical misinformation on Facebook is never tagged with a warning, and is viewed billions of times, report says
84% of medical misinformation posts on Facebook were left up with no warning label, despite the platform's policy to counter bogus claims, an investigation found. Non-profit organization Avaaz said 16% of the posts it examined that had been fact-checked were labelled, while the other 84% were not. It said bogus claims were viewed some 3.8 billion times, with an especially large audience as the coronavirus pandemic first began to spread. Facebook said Avaaz's report did "not reflect the steps we've taken" to counter misinformation, and argued that it was working to circulate "credible health information." Visit Business Insider's homepage for more stories.

A large majority of Facebook posts containing medical misinformation — 84% — were left online with no labels or warnings, according to a report by the group Avaaz. It said that its survey of bogus medical claims and advice on the platform found that only 16% of posts were given a label highlighting their contents as untrue, unproven, or harmful. The other 84% were not.

The report by Avaaz, a US non-profit, looked at 174 pieces of content that were fact-checked by a credible third party and found to contain health misinformation. Avaaz found that many posts, some of which reached millions of people, managed to avoid being labelled by Facebook by reposting content from other pages or translating it into other languages.

Avaaz found that in the last year misinformation about health has been viewed 3.8 billion times on Facebook across at least five countries — the US, the UK, France, Germany, and Italy. It said the volume peaked in April, as the coronavirus pandemic spread quickly around the world. The group concluded that Facebook poses a "major threat" to public health. It found that content from the 10 biggest websites for spreading health misinformation had almost four times as many Facebook views as content from 10 leading health bodies, like the World Health Organization (WHO) and the US Centers for Disease Control and Prevention (CDC).

The group said that Facebook should put independent and fact-checked corrections alongside the misinformation on the platform, which it said could reduce people's belief in the misinformation by an average of almost 50%. It also said Facebook should alter its algorithm to reduce the reach of misinformation by 80%. Avaaz said: "Facebook has yet to effectively apply these solutions at the scale and sophistication needed to defeat this infodemic, despite repeated calls from doctors and health experts to do so."

The coronavirus pandemic has put Facebook under a new spotlight, as false and misleading information about the virus, its source, vaccines, cures, and the role played by major figures spreads across social media platforms. Facebook has taken down conspiratorial videos, given warnings to people who may have spread misinformation, and removed anti-lockdown event pages. But Avaaz said the prevalence of misinformation despite these measures shows that "even the most ambitious among Facebook's strategies are falling short of what is needed to effectively protect society." It called its investigation "one of the first to measure the extent to which Facebook's efforts to combat vaccine and health misinformation on its platform have been successful, both before and during its biggest test yet: the coronavirus pandemic."
Facebook rebuffed Avaaz's findings in a statement to the BBC, saying they did "not reflect the steps we've taken" to counter misinformation.

"We share Avaaz's goal of limiting misinformation," the company said. "Thanks to our global network of fact-checkers, from April to June, we applied warning labels to 98 million pieces of Covid-19 misinformation and removed seven million pieces of content that could lead to imminent harm.

"We've directed over two billion people to resources from health authorities and when someone tries to share a link about COVID-19, we show them a pop-up to connect them with credible health information."
Post included video in which Trump wrongly said that children were ‘almost immune’ from illness

Facebook has removed a post from Donald Trump’s page for spreading false information about the coronavirus, a first for the social media company, which has been harshly criticized for repeatedly allowing the president to break its content rules.

The post included video of Trump falsely asserting that children were “almost immune from Covid-19” during an appearance on Fox News. There is evidence to suggest that children who contract Covid-19 generally experience milder symptoms than adults do. However, they are not immune, and some children have become severely ill or died from the disease.
Facebook and Twitter are cracking down on videos that promote conspiracy theories and unproven COVID cures, but researchers say that removing already viral misinformation can backfire and make things worse (FB, TWTR)
Social media platforms have enacted bans on disinformation related to COVID-19 and other issues, but researchers say that banning content after it's already gone viral can do more harm than good. For example, the platforms recently banned a viral video of doctors urging COVID-19 treatment with hydroxychloroquine, which federal agencies have called ineffective and dangerous. That ban prompted news coverage and charges by conspiracy theorists that the video contained truth being suppressed by authorities. Social media platforms say they are addressing disinformation as quickly as they can. Visit Business Insider's homepage for more stories.

Disinformation campaigns can rocket to virality by capitalizing on the very bans that social media companies have enacted to address them, researchers and analysts say. A key case in point is the "America's Frontline Doctors" video posted on July 27, in which a white-coated group urged treatment of COVID-19 with hydroxychloroquine, which federal agencies have called ineffective and potentially dangerous. The video went viral after President Trump tweeted it, and the platforms removed it that day. But posts about it spiked July 28, the day after Twitter banned it, researchers say. Why? Viral posts about a video that was suddenly unavailable piqued interest even more, they say.

The social media companies' bans are "stopping viral disinformation at a very high rate of engagement — once they have already been established," says Annie Klomhaus, cofounder and chief operating officer of the Austin-based internet and social media research firm Yonder. By cutting off the content after it has already gained so much steam, the bans end up putting the disinformation in a media spotlight. In the case of the "Frontline Doctors" video and the controversial drug, its proponents can then claim the ban is part of a suppression campaign connected to the government, furthering their conspiracy theory.

The video spread through a formula that has proven incredibly effective for disseminating other messages (or disinformation), too. Such groups pass around videos and other disinformation in private groups, which Facebook doesn't closely monitor. Once the content has momentum, users post it to Twitter and seek to engage large influential accounts that share the same ideology. At that point, it is difficult for social media companies to take any action that doesn't exacerbate the issue, in part because of intense coverage by traditional media, which will write up both the fact that disinformation has gone viral and its removal.

"The companies are in a really hard place," Klomhaus says. "They're trying to do the right thing, but addressing something that is already viral is a really hard problem."

Paul Barrett, deputy director of the NYU Stern Center for Business and Human Rights, who has authored recent widely cited research on social media disinformation, agrees. "By the time platforms even notice the existence of such a video, it's often gone viral, and millions of people have seen it, possibly being misled on important issues such as the effectiveness of supposed medical cures," he said. "What's more, the very act of taking down such content can feed into conspiracy theories that the material is being suppressed by malign interests."

Twitter has said its moderators take action swiftly when disinformation is discovered.
In the case of the "Frontline Doctors" video, a Twitter spokesperson says, "Tweets with the video were in violation of our COVID-19 misinformation policy. We are taking action in line with our policy." A Facebook spokesperson said, "It took us several hours to enforce against the video and we're doing a review to understand why this took longer than it should have." The company said it has removed more than 7 million pieces of content on Facebook and Instagram for violating its policy against sharing COVID-19 misinformation. Some familiar social media influencers helped to make the "Frontline Doctors" video go viral. The video picked up momentum in private Facebook groups, then made the jump to Twitter, where the right-wing youth group Turning Point USA amplified it. (That group has opposed masks and social distancing despite its cofounder's death from COVID-19.) The far-right blog Breitbart News picked the story up, as did several programs on a favorite news source of the president, Fox News. The video reached millions when it was tweeted by President Trump and his son, Don Jr. This led to bans by Twitter, Facebook, and YouTube. Those bans were widely covered by news agencies, and the disinformation campaign reached its peak of 120,000 social media posts about the drug the day after Twitter banned the video. In the week before, there were 18,000 posts about the drug, according to researchers at Yonder. This may have been exactly what the groups promoting the video wanted. One of the main promoters of the video appears to confirm that view. A doctor thrust into the spotlight by the video, Stella Immanuel, posted on Twitter that her religious ministry – which says some health issues are caused by people having sex dreams about "demons" – benefited from the TV coverage brought about by the social media ban. "CNN, MSNBC etc are doing free commercials on our deliverance ministry," she said in a tweet on July 28, the day after Twitter banned the video. Woah CNN, MSNBC etc are doing free commercials on our deliverance ministry. Fire Power is main stream. Thank you CNN and let me know when y'all need some of them demons cast out of you. I will gladly oblige. You will feel a lot better. Keep up the good work. #cnn #MSNBC — Stella Immanuel MD (@stella_immanuel) July 29, 2020 Immanuel did not respond to several requests for comment. The White House did not immediately respond to a request for comment. When asked about his sharing of the video last week, President Trump said, "[Immanuel] said that she's had tremendous success with hundreds of different patients, and I thought her voice was an important voice, but I know nothing about her." The group is backed by Tea Party Patriots, a conservative group that has supported protests against lockdown measures. The New York Times reported that the group posted the video to its YouTube channel on July 27 before it went viral. Klomhaus of Yonder notes that a similar hydroxychloroquine disinformation campaign followed this path in April, and more disinformation campaigns are likely to exploit this process, especially as the pursuit of a coronavirus cure continues, along with the upcoming election. "As a vaccine comes closer to coming out, this narrative will probably continue," Klomhaus said. "If it follows the previous pattern of recurring in a few months, that would put this kind of viral politicalization of the virus squarely right in front of the election." 
For example, there are many conspiracy theories about Microsoft founder Bill Gates and COVID-19 that have no basis in fact and are spreading in similar ways on social media, Klomhaus says.

Barrett of NYU says social media platforms must address the holes in their techniques for tackling disinformation, because the problem is not going away in the lead-up to November's election. "Unfortunately, the platforms have no choice but to improve their technical and human content moderation methods and press ahead with removing content that is dangerous to users," he said. "The platforms cannot just throw up their hands and say the problem has no solution."