Scams | Sep 23 2021

Online safety bill could not stop scam ads, says Google

(AP Photo/Matt Rourke)

Amanda Storey, Google’s director of trust and safety, spoke before the Treasury Committee in Westminster yesterday (September 22), alongside representatives from Facebook, Amazon and eBay.

“In terms of the online safety bill, it's designed for user generated content,” she explained. “And when you look at the traits of user generated content versus a scam, they are quite different.”

Storey explained: “With user generated content, someone looking at a policy and the content can make a pretty clear decision about whether that content is violative or not. 

“With a scam, you're looking at one signal. And actually that can't necessarily tell you whether it's a scam or not. 

“You need to look at the actor, the behaviour and the piece of content itself to ultimately make a decision about whether it's a scam. So, the techniques for user generated content versus the scams are quite different.

“The online safety bill is not necessarily targeted in the way it would need to be to be efficient at tackling online scams. So I do think that needs to be considered.”

Julian Knight, chairman of the all-party parliamentary group for new and advanced technologies, hit back at Storey, accusing Google of distinguishing between user generated content and paid-for online scams because the latter is simply “too much hard work”.

“To paraphrase what you said was effectively ‘this is too much hard work’,” said Knight. “Do you not owe a duty of care to your users?”

Storey replied: “We absolutely do. We take a huge responsibility to make sure you're protected on our services. That's literally what my team does every day.”

The Google executive later clarified that her firm agrees with the bill, which recommends automated tools to help tackle user-generated content, but added that automation alone is not enough.

“Automated means alone are not going to be sufficient for tackling scams,” she said.

“It really does require the kind of sustained and strong collaboration that we've started to engage in with the online fraud steering group, with Stop Scams UK. 

“You have to have the signals from all sorts of different players across the value chain to actually identify something as a scam.”

Knight questioned whether big tech firms really were committed to “strong collaboration”, dismissing initiatives like the ones Storey cited as mere “talking shops” if the firms themselves, such as Google and Facebook, do not share data with each other.

“If you wanted to persuade legislators not to bring about legislation that applies to your industries, to give you a softer, lighter touch regulation, then you need to show that each of these four huge players that are with us today are actually working together,” said Knight.

He then proceeded to ask the firms how much information they share with each other. All firms agreed they “share information with law enforcement”, but not with each other.

On why these firms don’t share data with each other, Allison Lucas, Facebook’s content policy director, said: “The concern that we would have would be privacy. So anything that we can share in a private, safe way we would do so through the online steering group.”

Knight hit back: “So you’re more concerned about their privacy than people being robbed of their life savings?”

He added: “I can't see why, frankly, it isn't beholden on you to actually put in place systems in order to flag across platforms. 

"Because you're asking us not to regulate you. Well, therefore, you need to show that you do everything in self regulation. I don't really buy privacy, frankly, as remotely sustainable in this context.”

The committee also asked Google and Facebook about the advertising spend racked up by the Financial Conduct Authority, to the tune of £600,000 in Google’s case, to warn consumers against harmful ads.

Google and Facebook were quizzed on their progress in reimbursing the UK’s financial watchdog, but neither representative could answer the question, so both were asked to write to the committee instead.

Debbie Barton, financial crime prevention expert at Quilter, said: “With so many politicians and organisations saying exactly the same thing – that the Online Safety Bill must include paid-for advertising and cloned websites – the onus was really on the tech companies to explain exactly why this change is unnecessary.

"But what was abundantly clear from the evidence presented to the Treasury Committee is that the tech companies didn’t really have an answer to why advert scams and cloned website scams shouldn’t be included in the bill alongside user-generated content.

“On the day that UK Finance reported a 30 per cent increase in fraud in the UK, consumers across the UK would have been looking for assurances from the tech companies that they are serious about eradicating the threat of online fraud – and working together with other tech companies to share intelligence on known scammers.

"But it’s clear that no such coordination is taking place."

ruby.hinchliffe@ft.com