Cicilline Remarks on COVID-19 Social Media Disinformation

Jun 16, 2020

PAWTUCKET – House Antitrust Subcommittee Chairman David N. Cicilline (RI-01) took part in a panel discussion on COVID-19 social media disinformation at George Washington University’s Institute for Data, Democracy & Politics this morning.

The text of Cicilline’s opening remarks, as delivered, is embedded below.


David N. Cicilline
Remarks As Delivered
June 16, 2020

Thank you, Frank.

Good morning, and thank you for having me at today’s important conversation on the real-world effects of social media disinformation on people’s health and safety.

I’m thrilled to be here with such an impressive group of panelists today, including Speaker Pelosi, Chairwoman Schakowsky, Senator Murray, and several of my other colleagues. 

This important event also touches on several of the topics that have been actively discussed as part of hearings before the International Grand Committee, and I’m happy to join Damian Collins and several of the other members of the IGC for today’s conversation as well. 

As we all know, Facebook, Google, and other social platforms are designed to keep users on their platforms, whatever the cost. Because disinformation, propaganda, and hateful speech are good for engagement, they are good for business. That’s why these companies will not change unless their business models change as well.  

Before the outbreak of the COVID-19 pandemic, we knew that this business model magnifies extremist content online. There has been a spike in hate crimes worldwide due to misinformation, propaganda, and mass conspiracy theories on social media and in search results. 

Facebook and Google host or enable countless pages, targeted advertisements, and suggested content dedicated to conspiracy theories and calls to violence—content that often violates Facebook’s and Google’s stated policies, which these firms are often too lax in enforcing.

For example, it was only after the white supremacist rally in Charlottesville that Google and Facebook removed links to the Daily Stormer—a neo-Nazi website—for violating their terms of service.  

This should not have been a close call or required national attention to get right. 

And there are still plenty of instances in which Google and Facebook do not get it right, allowing harmful content to proliferate—all while these platforms continue to profit from it. 

In the COVID-19 era, disinformation online about the virus is costing lives every day.

As a group of 2,000 leading health experts around the world noted in a recent letter led by Avaaz—and as Doctors Duncan Maru and Jasmine Weiss will discuss on today’s panel—false claims about coronavirus have “spread faster than the virus itself” online. 

Doctors, nurses, and other healthcare professionals are fighting “twin pandemics” against the spread of the virus and the spread of misinformation online. 

What’s more, the companies that facilitate and profit from this spread of misinformation have not done nearly enough to save lives by countering it. 

And even when companies do attempt to police misinformation—such as Facebook removing hundreds of thousands of posts about false cures—they are still unsuccessful, suggesting that this misinformation is either too profitable to eliminate completely or that these platforms are simply too massive to manage.

This practice must end. 

Dominant platforms should be accountable for the ways they spread and amplify harmful content online. Moreover, we should question whether companies that serve as increasingly essential communication networks should have business models that profit from this harmful content in the first place. 

Whether it’s changing a business model that creates powerful incentives to turn a blind eye to this content, restricting the ability of platforms to harvest people’s data to serve behavioral advertisements, or removing liability protections that shield platforms from accountability, it is critical that policymakers around the world confront this issue in a serious and meaningful way.

These are enormous problems with serious harms that are not confined by geopolitical borders.

Before I close, I just wanted to take a moment to talk about the role of the International Grand Committee in examining these issues.

Last year, I was proud to join the IGC for its hearing in Dublin, Ireland, on “Disinformation and Fake News.”  

While we had planned to hold the next meeting of the Committee in Washington, D.C. this summer, that is unfortunately not possible due to the pandemic. 

However, I have recently conferred with the chairs of the prior hearings—Damian Collins, Bob Zimmer, and Hildegarde Naughton—and we agree that it is essential to continue to build international consensus on these issues.

With that in mind, we plan to hold a virtual meeting of the Committee in the early fall as a continuation of these efforts.

So thank you again for including me in this important discussion. 

In closing, I look forward to participating in today’s conversation, and I’m proud to join Doctors Maru and Weiss on this panel.