Last week Facebook CEO Mark Zuckerberg and Twitter CEO Jack Dorsey came before the US Senate Judiciary Committee for a hearing on censorship, suppression, and the 2020 election. Senators on both sides scrutinized how the platforms moderate content, especially in light of both companies blocking or limiting distribution of the New York Post’s article about Hunter Biden.

Senators universally acknowledged the immense power of social media and the need to change Section 230. Section 230, part of the Communications Decency Act of 1996, gives internet platforms different rights than traditional media outlets. It shields platforms such as Facebook and Twitter from liability for content posted on their sites; responsibility rests solely with the individual who posts it. It also allows platforms to restrict user access or material considered to be “obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable.”

Some senators raised concerns that these platforms exercise biased editorial control while avoiding the legal restrictions and consequences that traditional publishers face. Others raised concerns that these companies empower ill-intentioned people and allow hate speech, propaganda, and misinformation to flourish.

The US Congress will have to weigh carefully how to bind some of the world’s most influential and far-reaching companies. It is no small feat to write regulation that protects free speech, preserves privacy, and prevents virtual extremist ideas from turning into real-world harm.

Although the matters discussed during the hearing primarily concerned the election and the events surrounding it, Sen. Graham (R-S.C.) and Sen. Blumenthal (D-Conn.) also briefly emphasized Twitter’s and Facebook’s responsibility to block and report sexual exploitation. Sen. Ernst (R-Iowa) further pressed the witnesses on the matter, pointing out that child sexual abuse material (CSAM) is “present on nearly every single social media platform that exists.”

Both CEOs responded that combating exploitation is of paramount importance and that their increasingly sophisticated methods for finding “bad actors” rely on activity patterns rather than the content of an individual’s posts. Ideally, this allows privacy to be respected while the platforms are moderated and predators are targeted.

As with most laws, reforming Section 230 will take time. Yet US federal law already requires that any CSAM be reported to the National Center for Missing & Exploited Children. Facebook and Twitter, in cooperation with authorities, must maintain zero tolerance for CSAM. Given their enormous influence, it is essential that these companies also encourage other tech giants to comply with the laws that already exist.