Scams | May 20 2022

Online Safety Bill may not be 'fully operational' until 2024

Carnegie UK Trust associate Maeve Walsh. [Photo: Pimfa panel session]

Carnegie UK Trust associate Maeve Walsh said the bill, which is effectively a framework aiming to protect internet users from scams and hold tech giants to account, is not likely to be operational for another two years.

In March, it came one step closer to becoming law after being introduced to parliament, nearly a year after the first draft was published.

However, speaking at the Pimfa Financial Crime conference earlier this week, Walsh argued the bill was still some time away.

“The main bill is in the House of Commons now. ... Evidence sessions start next week.


“They will then go through a series of line-by-line scrutiny sessions, which run all the way through to June, and then the bill is likely to come back for report stage and third reading at the end of July, before summer recess.

"Then it is into the Lords in the autumn and the same process happens there.”

She explained that the bill was expected to complete its passage by the end of the year or potentially early next year, political circumstances notwithstanding.

“But because so much of the bill does depend on secondary legislation and codes of practice, that detail probably won't be available and the regime won't be fully operational probably until 2024,” she said.

“Given where it started, with these policy origins in May 2017, through to 2024 is quite a long time for something that's obviously a priority area.”

The online safety bill requires social media platforms, search engines and other apps and websites to protect children and tackle illegal activity, all while maintaining freedom of speech.

Ofcom, the industry regulator, will be given the power to fine companies up to 10 per cent of their annual global turnover if they do not comply with the laws. 

Walsh said: “Companies need to put in place effective and proportionate mitigation plans, and the focus on risk assessment is designed to create a systemic approach, ensuring that it is not primarily focused on content and content takedown, which is a problematic area in a number of aspects of the design and operation of services.

“How the algorithms work, how content is promoted, and how the decisions that companies have made about how users interact with their platforms all affect the way that harms materialise online.”

Loopholes

In March, the government published draft legislation stating social media firms would be responsible for preventing paid-for fraudulent adverts on their platforms.

In an amendment to the draft bill, search engines and platforms which host user-generated content, video-sharing or live streaming will have a duty of care to protect users from fraud committed by other users.

These companies will be required to prevent paid-for fraud ads on their platforms, whether the ads are controlled by the platform itself or by an advertising intermediary.

The amendment came after pressure from a coalition of consumer groups, charities and financial services industry bodies, who have been calling for the government to increase the scope of the bill to include paid-for adverts.


Another panellist, UK Finance principal Lee Crouch, said the fight against fraud and economic crime was constant.

“If you take the social media platforms and you take search engines, they're treated differently in the way that they should deal with performance gaps,” he said. “Now, our argument would be that when you do that, you incentivise criminals to look at the weaker link and go there.

“What we would like to see is a more holistic approach, if you will, one that treats these search engines and social media platforms similarly in order to try and close down any current loopholes.”

Walsh agreed, explaining that the duty on social media platforms, in relation to paid-for advertising in particular, is to take measures to prevent users from seeing that type of material and to act as soon as they can to mitigate the risk of it.

“This is where it does go into individual bits of content, but obviously then within a risk assessment that looks at the whole service.

"With search engines, there is an issue that they are treated differently throughout, and in relation to some of the other harms the bill seeks to address, that may be appropriate.

"I suspect they've translated that across here without identifying the fact that advertising through search engines is a huge issue.”

Meanwhile, Money and Mental Health’s head of external affairs Brian Semple said it would be essential to have collaboration between different sectors and regulators.

“It's a big step forward to have the arrangements that are set out in the online safety bill but it would be a big mistake to see that as a job done. One of the things that we know, unfortunately, is that scammers will find ways to do this kind of crime.

“They're often very well-resourced criminal gangs who are doing this kind of thing and they will find new technology and new platforms and regulation will need to be responsive to those developments.”

One way this can be done, explained Semple, is by having Ofcom work alongside the other regulators, as well as Action Fraud and the National Cyber Security Centre, to share data and insights into new types of scams and fraud.

A spokesperson for the Department for Digital, Culture, Media and Sport, said: "We know how quickly technology changes and have designed our world-leading online safety bill in a way which means it can take into account new and emerging harms. 

"We have also strengthened it to make sure social media sites and search engines must prevent paid-for scam adverts and protect users from them. Companies which fail in their responsibilities will face huge fines."

Updates

The bill has been updated a number of times since the first draft was published in May 2021.

Changes include bringing paid-for scam adverts on social media and search engines into scope, after intense pressure from a coalition of consumer groups, charities and financial services industry bodies.

The bill has also been updated to bring forward the time within which executives would be liable for prosecution.

Under the updated bill, executives whose companies fail to co-operate with Ofcom’s information requests could face prosecution or jail time within two months of the bill becoming law, rather than within two years as in the previous draft.

The government has also launched a consultation on proposals to tighten the rules for the online advertising industry. This includes tougher rules and sanctions for harmful or misleading adverts, or those for illegal activities such as weapons sales.

Influencers failing to declare they have been paid to promote products on social media platforms could also be subject to stronger penalties.

FTAdviser understands that the DCMS will be taking a phased approach to bringing duties under the bill into effect.

Ofcom’s initial focus will be on illegal content, so that the most serious harms can be addressed as soon as possible. The first codes of practice are expected to be submitted to the Secretary of State around 12 months after Royal Assent.

After parliamentary approval, the codes will be published and active enforcement of duties concerning illegal content will commence.

sonia.rach@ft.com

What do you think about the issues raised by this story? Email us on FTAletters@ft.com to let us know