
Landmark report targets new laws for online safety

Online Safety Bill could give tech giants increased responsibility for content on their platforms, including scams and fraud

New 'landmark' recommendations have been published today that aim to hold tech giants such as Google and Facebook more responsible for content on their sites.

MPs and Peers on the Joint Committee on the draft Online Safety Bill, chaired by Damian Collins MP, have issued recommendations that would tighten proposed regulation of internet service providers and would bring an end, they claim, to the internet being the 'land of the lawless.'

The Online Safety Bill aims to tackle a wide range of harms - including fraud, scams, racist abuse, child abuse, and content that promotes self-harm or violence against women - and will be put to Parliament for debate in 2022.


We're campaigning to make tech giants take responsibility for scams, dangerous products and fake reviews online. Find out more, and add your name to our petition.


What is the Online Safety Bill?

There have long been concerns that the internet, and the online platforms we use on it, have evolved at such a pace that existing laws are no longer sufficient to regulate them.

The draft Online Safety Bill is the government's attempt to achieve its aspiration of making the UK the safest place in the world to be online. It was first published in May, but was met with widespread criticism that it did not go far enough to protect people online - in particular, that it failed to properly bring online scams and fraud within its scope.

Which? joined a coalition of 17 organisations, including the Money and Mental Health Policy Institute, UK Finance, City of London Police, MoneySavingExpert and Age UK, to campaign for greater protections.

Today's report makes recommendations for how the draft Bill can be improved, and attempts - in the words of Damian Collins MP - to ensure that 'What's illegal offline should be regulated online.'

'A lack of regulation online has left too many people vulnerable to abuse, fraud, violence and in some cases even loss of life,' he said in a press release.

'The companies are clearly responsible for services they have designed and profit from, and need to be held to account for the decisions they make.' 

What does the report recommend?

The report states that big tech has 'failed its chance to self-regulate' and that Ofcom - the regulator - should be given new powers to investigate, audit and fine companies to hold them accountable.

The report is wide-ranging, covering a number of serious harms, but when it comes to consumers the most important recommendation is that online companies would be compelled to proactively tackle fraudulent content and harmful advertising, such as scam adverts - something Which? has campaigned for.

Paid-for advertising can be used by fraudsters and criminals to trick unsuspecting consumers. A Which? survey of 2,000 people last year found almost one in ten people may have fallen victim to scam adverts on social media, while 42% said they didn't know how to identify a scam advert.

Under the recommendations, online platforms would be held accountable for such illegal content.

What to do if you've been scammed, or have come across a scam

This is a vital addition. Online scams are growing and cause real harm - Action Fraud figures show that £1.7 billion was lost to scams in the last year, much of it originating online. Scams lead to psychological, as well as financial, harm. Which? has repeatedly exposed how easy it is for scammers to set up fake adverts, and the devastating impact scams can have on victims.

Research we commissioned in October showed that being a scam victim is associated with lower life satisfaction equivalent in financial terms to £2,509 per year. Scaled up to include the 3.7 million fraud incidents reported in 2019-20, our research indicated that online fraud alone amounts to a collective £7.2bn drop in perceived wellbeing among victims, who suffer lower levels of happiness and higher levels of anxiety.

The report also recommends that individual users should be able to make complaints to an ombudsman when platforms fail to comply with the new law, and that internet service providers designate an official who would be made liable for any 'repeated and systemic failings that result in a significant risk of serious harm to users.'

As well as covering fraud and scams, the report recommends a number of other new laws, such as making cyberflashing - the practice of sending unsolicited nude images - illegal, and making tech companies responsible for illegal content that encourages self-harm.

What happens next?

Rocio Concha, Which? Director of Policy and Advocacy, said it was 'absolutely right' that the recommendations aim to tackle fraudulent paid-for advertising. But what is now crucial is that the government accepts the recommendations - something Which? is calling for.

'[The committee] has acted on the overwhelming evidence it's received of the devastating financial and emotional impact on innocent scam victims, when online platforms with some of the most sophisticated technology in the world fail to protect them,' she said.

'It's positive that the committee has recommended that online platforms will need to prevent fraudulent paid-for adverts appearing in the first place, as opposed to just adopting reactive measures,' Concha added.

'If the government is serious about cracking down on the epidemic of fraud, it must act on these recommendations and use the Online Safety Bill to finally require online platforms to take responsibility and prevent fraudulent paid-for adverts appearing on their sites.'