Social media’s regulation reckoning has arrived
It is hard to believe we are only two weeks into 2021. News headlines that once would have defined a 12-month period are now an everyday occurrence. Yet last week’s insurrection at the Capitol will be seen as one of the defining moments of the year. The event itself was significant, but its aftermath will be equally so: specifically, its impact on the tolerance of disinformation online, and on social media platforms in particular.

The suspension of President Trump’s social media accounts has renewed debate over the role of social media companies: are they platforms or publishers? Matt Hancock seemed to favour the latter, arguing that by banning Trump, social media platforms have started taking “editorial decisions”. With the insurrection at the Capitol we have passed the point of no return, and the debate over how to regulate social media platforms is heating up.

To date, private companies have failed to take responsibility and adequately deal with illegal and harmful content on their platforms. The recent introduction of tagging posts containing disinformation, such as Trump’s repeated false claims that the US Election was rigged, was an important first step. But it is too little, too late.

Private companies should no longer be left to regulate themselves – this is the domain of democratically elected governments – and last week’s events highlight that measures need to be tightened. Some would argue that tightening measures on social media platforms violates freedom of speech. But should tougher limits be in place in order to prevent illegality? Governments have placed limits on our behaviour in society to prevent illegality for centuries, so why not online too?

Consistency is key

The approach needs to be consistent. Angela Merkel’s criticism of Trump’s ban earlier this week highlighted the disparate nature of social media regulation between Europe and the US. In Europe, the laws are much stricter and regulators have the power to define what constitutes legal or illegal content online.

Social media platforms have evaded the legal obligations imposed on news organisations for too long. At the very least, they must be held to account for removing illegal and harmful content quickly – within 24 hours – and be penalised for failing to do so.

This must be applied consistently across jurisdictions, and the definitions of what constitutes “illegal” and “harmful” content should be uniform too. This will ensure that everyone has the same access to freedom of speech online, and that illegal and harmful content is removed in a consistent fashion.

In 2021, social media should face its regulation reckoning. And it cannot come soon enough.
