FOR IMMEDIATE RELEASE
March 25th, 2021 at 4pm ET
Press contact: [email protected]
The following statement was submitted to the House Energy and Commerce Committee for their hearing about the misinformation and disinformation plaguing online platforms:
Workers across Alphabet have previously organized against the company’s continued refusal to take meaningful action to stop the proliferation of hate, harassment, incitement of violence, and harmful misinformation on YouTube and other Alphabet-operated platforms, and have done so without good-faith engagement from leadership.
Alphabet is responsible for directly contributing to harmful misinformation campaigns that fuel fascist, white nationalist and hateful movements that perpetrate violence in the United States and around the world. While much attention has been paid to YouTube and other online platforms’ role in radicalizing white supremacists, this hearing is also an opportunity to illuminate how these technologies contribute to dangerous disinformation movements including QAnon, “Patriot” militias, and anti-vaccine advocacy. (1), (2).
Alphabet has demonstrated a continued policy of reactive, selective and insufficient enforcement of its guidelines against disinformation and hate. As a union that fights for, and welcomes the contributions of, every worker in Alphabet, we find it abhorrent that systems to which we have dedicated our work continue to profit from the hate and disinformation that harms so many of these same workers. (3), (4), (5), (6), (7), (8).
Online misinformation can have real and dire consequences offline, ranging from targeted violence to vaccine hesitancy. Misinformation on Alphabet products facilitates the proliferation of hate-filled conspiracy theories, like QAnon, which has repeatedly led to incidents of targeted violence and been deemed a domestic terror threat by the FBI. Beyond physical violence, the spread of conspiracy theories and misinformation online contributes to diffuse harms that affect those across the US and abroad, such as public health violations, lower vaccination rates, and decreased democratic participation. (11), (12), (13).
Misinformation on Alphabet products can also facilitate the undermining of democracy, as was the case with virulent election-related misinformation in the United States and elsewhere in 2020. While some key disinformation peddlers were retroactively deplatformed—after other companies decided to act first—there remains an entire ecosystem of disinformation influencers and publishers who have profited from attacking the foundations of democracy on YouTube and Google Ads infrastructure. The processes behind these decisions, and those processes’ influence on community guidelines and authoritative source lists, are not transparent, which makes it more difficult for researchers at Alphabet and elsewhere to solve these problems. (10).
From election misinformation to anti-Semitic conspiracies, Alphabet’s incremental and reactive policies have proven to be a lose-lose approach, angering free-speech advocates while allowing harmful movements to fuel atrocities and insurrections before they are penalized. Failing to crack down on misinformation and disinformation proactively has led to harassment, threats, injuries, and death—real trauma and real pain for so many people (9).
YouTube has said it welcomes peer-reviewed research on disinformation, but it needs to engage directly with academics and outside experts, including sharing data on removed videos, in order to credibly foster more understanding of the problem (14).
Alphabet workers built these systems and we know how to make them better—the company has a variety of tools to mitigate misinformation at its disposal that we believe ought to be deployed beyond retroactive removals and ranking. We invite Alphabet’s leadership to tackle these issues and others collaboratively with our union. We are ready and eager to share our perspectives and proposals constructively with Alphabet leadership, on this and other issues of the public good.
One thing we will not be, however, is silent.