YouTube is the world’s largest source of free video streaming, and lately its recommendation system has been promoting fake cancer cure videos, according to an article published by the BBC.
After watching 80 videos in over 10 languages, the BBC contacted medical experts, led by Professor Justin Stebbing from Imperial College London, to debunk the videos’ claims. Many of the videos claim that eating foods such as vegetables or drinking donkey’s milk can cure cancer.
In a video featuring Professor Stebbing, he states that there is not sufficient data to support the videos’ claims.
In fact, baking soda is commonly promoted as a “cure” in these videos, but Stebbing states that in reality it can actually be harmful.
The videos themselves claim that cancer cures can be found in juicing, donkey’s milk, baking soda, vegetables and exotic foods.
YouTube places advertisements on videos as a source of income for both the content creator and itself. With the rise of these cancer cure videos, YouTube stands to profit from people watching them; however, after being contacted by the BBC, YouTube removed advertisements from these types of videos.
Yet only English-language videos have had their advertisements removed so far.
Companies like Grammarly and several universities have requested that their advertisements be removed from these types of videos and have contacted YouTube about resolving the issue.
The main issue with these advertisements is that they are not specifically tailored to the videos they appear on.
An algorithm places advertisements on videos based on viewers’ preferences, so an ad appearing on a video that contains false claims can reflect poorly on the advertiser.
Additionally, the algorithm that promotes these types of videos does not consider the credibility of the video creators. To address this, YouTube has banned misleading videos.
The BBC has also contacted some of the people who created these videos. In response, two creators removed their videos, while two others did not respond. One of the removed videos had 1.4 million views before being taken down, illustrating the reach of this content.
The main danger of these videos is not just the misinformation itself but the possibility of people choosing these supposed YouTube cures over professional cancer treatment.
Cancer is a serious condition, and misinformation about it should not be taken lightly.
A solution proposed by Hai Fung, a professor at Georgia Southern University, was for medical professionals to publish informative videos accessible to the general public. These educational videos could clear up misconceptions among people searching for cancer cures.
According to an article in Digiday, it was estimated that in March 2015 there were 31.8 million YouTube users aged 18 to 24 and around 19.4 million users aged 65 and older.
Some studies have shown a correlation between age and cancer: the risk of cancer increases as we grow older because of the accumulation of damage over time, according to an article from Cancer Research UK.
If young, healthy YouTube viewers retain this misinformation as they grow older, it could lead to grave consequences in the future.
Although misinformation might be nothing new for YouTube, it is crucial that its spread be stopped as soon as possible. In response, YouTube has stated that it will work to stop the spread of misinformation and uphold its community guidelines.
A similar rise in anti-vaccine misinformation has led other content creators to respond with more reliable information. For instance, the channel Kurzgesagt, with the help of medical professionals, responded to anti-vaccine claims with a video on the effects of vaccines to clarify the controversial issue.
As information becomes easier to mismanage and spread, it is important to check how credible a source is, especially on YouTube.