Google has hit out at suggestions that fraud committed through paid-for advertisements should be included in the much-debated online safety bill, saying the bill is not "targeted" enough to "efficiently tackle" these types of online scams if they were brought within its scope.
Amanda Storey, Google’s director of trust and safety, spoke before the Treasury Committee in Westminster yesterday (September 22), alongside representatives from Facebook, Amazon and eBay.
“In terms of the online safety bill, it's designed for user generated content,” she explained. “And when you look at the traits of user generated content versus a scam, they are quite different.”
Storey explained: “With user generated content, someone looking at a policy and the content can make a pretty clear decision about whether that content is violative or not.
“With a scam, you're looking at one signal. And actually that can't necessarily tell you whether it's a scam or not.
“You need to look at the actor, the behaviour and the piece of content itself to ultimately make a decision about whether it's a scam. So, the techniques for user generated content versus the scams are quite different.
“The online safety bill is not necessarily targeted in the way it would need to be to be efficient at tackling online scams. So I do think that needs to be considered.”
Julian Knight, chairman of the all-party parliamentary group for new and advanced technologies, hit back at Storey, accusing Google of distinguishing between user generated content and paid-for online scams because the latter is simply “too much hard work”.
“To paraphrase what you said was effectively ‘this is too much hard work’,” said Knight. “Do you not owe a duty of care to your users?”
Storey replied: “We absolutely do. We take a huge responsibility to make sure you're protected on our services. That's literally what my team does every day.”
The Google executive later clarified that her firm supports the bill, which recommends automated systems to help tackle harmful user-generated content, but added that automation alone isn't enough.
“Automated means alone are not going to be sufficient for tackling scams,” she said.
“It really does require the kind of sustained and strong collaboration that we've started to engage in with the online fraud steering group, with Stop Scams UK.
“You have to have the signals from all sorts of different players across the value chain to actually identify something as a scam.”
Knight questioned whether big tech firms really were in favour of "strong collaboration", dismissing initiatives like the ones Storey cited as mere "talking shops" if the firms themselves, such as Google and Facebook, don't share data with each other.
“If you wanted to persuade legislators not to bring about legislation that applies to your industries to give you a softer, lighter touch regulation, then you need to show that each of these four huge players that are with us today are actually working together,” said Knight.
He then asked the firms how much information they share with one another. All four agreed they “share information with law enforcement”, but not with each other.