If YouTube’s Algorithm Were an Editor, Trump’s Campaign Would Hire Them

AdiCo
Dec 3, 2020

A lot has been written about the dismissal of facts and reality by President Trump and his allies, and about their relentless effort to push “alternative facts” in order to create a content ecosystem claiming that Trump won the election but has been prevented from securing the presidency by collusion among the media, Democrats, RINOs (“Republicans in name only”) and, according to one theory, the late Venezuelan dictator Hugo Chavez. An analysis of the top 10,000 links shared in Facebook public groups over the past week (Nov 15–22, 2020) highlights how polarized the information ecosystem is. Interestingly, the polarization runs between mainstream media on one side and, on the other, more fringe outlets and YouTube channels that dominate the ecosystem made up of Trump supporters and conservative Facebook groups.

Right-wing outlets such as Fox News, The Daily Caller and The Daily Wire are sidelined on this map, while content from more fringe domains such as National File (1,945 shares), Gateway Pundit (1,877 shares), The Epoch Times (1,786 shares) and Newsmax (1,352 shares) is shared more widely. That’s not to say that Fox, The Daily Caller or Daily Wire articles about these matters are not successful on Facebook in general. Quite the opposite: they are doing pretty well. However, this map indicates that they are shared less in such public groups, perhaps because they are less radical in their point of view and headlines.
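For readers who want to reproduce this kind of per-domain tally, here is a minimal sketch using only Python’s standard library. It assumes a CSV export of public-group posts (for example, from CrowdTangle) with a “Link” column; the file name and column name are hypothetical placeholders, not a documented schema.

```python
import csv
from collections import Counter
from urllib.parse import urlparse

# Hypothetical export of posts from public Facebook groups (e.g. CrowdTangle).
# The file name and the "Link" column are assumptions for illustration.
EXPORT_FILE = "public_group_posts_nov15-22.csv"

share_counts = Counter()  # number of posts linking to each domain
with open(EXPORT_FILE, newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        link = (row.get("Link") or "").strip()
        if not link:
            continue
        domain = urlparse(link).netloc.lower()
        if domain.startswith("www."):
            domain = domain[4:]
        if domain:
            share_counts[domain] += 1

# Top 20 most-shared domains, analogous to the per-outlet counts above.
for domain, shares in share_counts.most_common(20):
    print(f"{domain:30s} {shares}")
```

Counting posts per domain in this way yields figures like the share counts above; an interaction-weighted variant would simply sum an engagement column instead of counting rows.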

Many shared articles from these far-right fringe websites have focused on the latest statements made by Sidney Powell (who pushed forward the Dominion conspiracy theory) and promises she has made about releasing election fraud evidence. Top groups where shares originated on the right included pro-Trump, far-right and conservative groups such as Kayleigh McEnany Fan Club (418k members), wexitmovement.com (250k members) and Team Trump 2020 (50k members). Given that Powell has been a prominent voice in these posts, Facebook has flagged and taken down some of them.

On the left, top groups included I watch Rachel Maddow and MSNBC (92k members), Dump Trump (33k members) and Resist Trump (35k members), which shared links from prominent mainstream media outlets such as the Washington Post (1,943 shares), the NYTimes (985 shares) and CNN (934 shares).

This map, perhaps unsurprisingly, is very similar to a map of “Covid vaccine”-related articles generated in August: two polarized clusters, where one links to credible mainstream media sources such as CNN, the NYTimes and the Washington Post, and the other is dominated by YouTube, fringe and anti-vax websites, and the Kremlin-funded RT.com.

It’s YouTube Again

One challenge that stands out in both maps is the prominence of YouTube as a source of disinformation on Facebook (and other platforms). This issue resurfaced recently when journalists pointed to the success of voter-fraud-allegation videos on YouTube. YouTube then suspended an OANN channel from its site for a week for spreading Covid-19 misinformation. Clearly, this looks and feels like an attempt to appease criticism rather than a proactive initiative to address the issues.

As the map above shows, YouTube videos from Newsmax, OANN, EpochTimes/NTD and other conspiracy channels are circulating and shared widely within Facebook communities. Popular YouTube URLs in this map include an OANN feature, “Retired Air Force General blows whistle on CIA vote hacking”, an OANN Dominion-related segment shared by Donald Trump’s official YouTube account, and a segment featuring an interview with attorney Lin Wood. These and other conspiratorial videos gained hundreds of thousands to millions of views in days.

YouTube’s problem extends beyond misinformation disseminated through other networks. To its credit, YouTube has taken a more aggressive approach over the past two years (since the Parkland shooting and the ensuing conspiracy videos) to prioritize videos from credible sources on current events and controversial issues. But it often seems to be caught off guard, especially when the “data voids” are so big. The 2020 elections are a great example. It is not every day that a sitting U.S. President uses disinformation as part of an effort to overturn election results and erode trust in U.S. election systems. Since election day, Trump and his allies have pushed multiple conspiracy theories while failing to provide substantive evidence that would ground the theories in reality. Yet random searches on YouTube for the phrases and words that headline those theories reveal that, if the YouTube algorithm were an editor, they would probably work for Trump’s White House TV.

For example, a search for “Dominion” (in an incognito browser) yielded a mix of results: newer results, including challenges to Dominion’s credibility (from Fox News) and allegations of conspiracy (from OANN and The Epoch Times), plus several older videos that show how election systems can be hacked. Though some of these videos are three years old, they gained most of their views in November 2020, emphasizing how the platform is used as a repository for archival content that fits a certain narrative. Since we’re all now expected to “do our own research”, YouTube provides us with exactly the content that would fail us. There is no mention anywhere on the site that CISA has called this the most secure election in American history.

Search “Dominion” on YouTube, 11/27/2020

Looking at the most viewed “Dominion” videos over the past month, the results consist mainly of OANN and NTD network (an Epoch Times-produced channel) segments. None of the videos covers the theories by calling them out for what they are: baseless and conspiratorial. In fact, for those who are utilizing YouTube to spread disinformation, one of the great advantages is the ability to ignore the context: a user clicking on one of these videos might notice the YouTube disclosure that “The AP has called the Presidential Race for Joe Biden” and that “Robust safeguards help ensure the integrity of the elections”. But viewers will not see anything about the fact that this conspiracy has been debunked and thrown out of court, and that its main propagator, Sidney Powell, was even removed from Trump’s legal team. In the alternative reality enabled by YouTube, and brought to you by sources such as The Epoch Times, OANN and Newsmax, Sidney Powell is still uncovering the corruption and lies by Dominion that supposedly made this win possible for Joe Biden. Further, the post-election disinformation storm gave The Epoch Times an opportunity to launch two new channels, Front Page and Eye Opener, which have continued to construct alternative realities.
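To show what this kind of review of “Dominion” videos could look like programmatically, here is a minimal sketch against the public YouTube Data API v3 (not YouTube’s internal ranking): it searches for videos published in the past month ordered by view count, then pulls each video’s channel, publish date and view count. The API key placeholder, the date window and the result limit are assumptions for illustration.

```python
import requests

API_KEY = "YOUR_API_KEY"  # assumption: a YouTube Data API v3 key is available
BASE = "https://www.googleapis.com/youtube/v3"

# Search for recent "Dominion" videos, ordered by view count.
search = requests.get(f"{BASE}/search", params={
    "part": "snippet",
    "q": "Dominion",
    "type": "video",
    "order": "viewCount",
    "publishedAfter": "2020-10-27T00:00:00Z",  # "past month" relative to 11/27/2020
    "maxResults": 25,
    "key": API_KEY,
}).json()

video_ids = [item["id"]["videoId"] for item in search.get("items", [])]

# Fetch channel, publish date and view count for each result.
videos = requests.get(f"{BASE}/videos", params={
    "part": "snippet,statistics",
    "id": ",".join(video_ids),
    "key": API_KEY,
}).json()

for v in videos.get("items", []):
    snip, stats = v["snippet"], v["statistics"]
    print(snip["channelTitle"], snip["publishedAt"],
          stats.get("viewCount"), snip["title"], sep=" | ")
```

Repeating the same query over time (and from different regions) would be one way for researchers to track how quickly, if at all, authoritative sources displace conspiratorial ones for a given search term.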

Can YouTube continue to wait for the next media request before taking laser-focused (and perhaps symbolic) action against a particular channel or video? Yes. But the expected changes in the U.S. political environment and the growing trend among advertisers toward stricter standards suggest it could actually work to YouTube’s benefit to be more proactive. Some options might include:

  1. Identify “data voids” and mitigate them in “real time”: it is a game of whack-a-mole, but also fairly basic to implement (a minimal sketch of this kind of monitoring appears after this list). A more hands-on approach, with a bit more human and machine brain power, could help activate a quick response. Sometimes it seems YouTube and other Silicon Valley actors fail to acknowledge the gravity of the issues on the line. In 2020, it is a threat to our public health and to U.S. democracy. It is hard to imagine more serious scenarios: if this doesn’t shake them, what will?
  2. Better contextualization: one of the advantages of YouTube is that creators can contextualize their videos however they want. But this also provides a great tool for escaping reality and facts, which, in many cases, is a dangerous path. During this election, YouTube made some effort and placed a disclosure with information from the AP under election-related videos. Perhaps it should do more to give context. It could apply the “Redirect Method” in an attempt to give viewers a better sense of reality. Importantly, at least part of this contextualization should be in the video itself (before, during or after), since many viewers see videos embedded on other websites. For some, it won’t be enough. But it could keep others from falling down the rabbit hole.
  3. Provide independent researchers and journalists with access to more YouTube data: media articles criticizing YouTube’s handling of misinformation often rely on “number of views” and “engagement on social media” as their measurements. Those benchmarks are important, but they are also the only freely available ones (and the latter only courtesy of Facebook/CrowdTangle). Cross-platform influence might be very different from how we perceive it, given that our data is partial. YouTube could do much more for researchers, which, in turn, would help the platform better identify and remove or limit disinformation efforts. Providing researchers with tools that help identify emerging narratives, sources of traffic to misinforming videos, and trending footage and channels might help not just society at large but YouTube itself in more efficiently detecting misinformation and disinformation efforts.
    There is an inherent tension between social media platforms on one side and the journalists and communication researchers who cover them on the other, especially as legacy media sees social media as a competitor. The history of attempts to collaborate is not all positive (see Facebook’s “Social Science One” experience as an example), but considering the level of threats and the growing political pressure, YouTube can and should do more to serve its users, creators and advertisers.
  4. Finally, the company could benefit from creating tools to identify synthetic and doctored content (for itself, and for users and researchers). When the next deceptive Joe Biden video emerges, it will be prepared.
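As a rough illustration of option 1, below is a minimal sketch of how a data-void check could work. Everything in it is an assumption for illustration: the allow-list of authoritative channels, the monitored queries and the example results are placeholders, and in practice the per-query results would come from YouTube’s own search (or from the Data API, as in the earlier sketch).

```python
# Hypothetical allow-list; a real deployment would curate and localize this.
AUTHORITATIVE = {"Associated Press", "Reuters", "PBS NewsHour", "C-SPAN"}

def find_data_voids(results_by_query: dict[str, list[str]],
                    authoritative: set[str]) -> list[str]:
    """Return the queries whose top search results contain no channel
    from the authoritative allow-list (i.e., likely data voids)."""
    return [
        query
        for query, channels in results_by_query.items()
        if not any(channel in authoritative for channel in channels)
    ]

# Illustrative input only: top-result channel names per monitored query.
example_results = {
    "dominion voting machines": ["OANN", "NTD", "The Epoch Times"],
    "election results certified": ["Associated Press", "C-SPAN", "OANN"],
}

for query in find_data_voids(example_results, AUTHORITATIVE):
    print("Possible data void, route to review:", query)
```

Queries flagged this way could then be routed to human review, paired with authoritative context panels, or used to trigger the kind of “Redirect Method”-style interventions described in option 2.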
