Pressure will be placed on social media platforms after Christchurch shooting
Source: https://www.stuff.co.nz/news/111362741/-

2019-03-18 01:20:16

Social media companies will come under increasing pressure from the public and the Government in the wake of the Christchurch terror attack.

The Government's Cabinet meeting on Monday is expected to focus mostly on gun law, but it is understood the Government is also keen to call on social networks to do more to fight radicalisation in the wake of the mosque shootings.

This could include a call to share more data directly with intelligence agencies.

The alleged terrorist Brenton Tarrant livestreamed the attack for 17 minutes on Facebook. It was downloaded and re-shared by many.

Facebook has confirmed that there were 1.5 million attempted re-uploads of the video in the first 24 hours after the attack, with 1.2 million being stopped automatically. YouTube also struggled to contain the spread of the video on its platform.

But the focus on social media will be wider than simply the livestream. Tarrant was not on any Australian or New Zealand watch-lists ahead of the attack.

Former Prime Minister Helen Clark, who is close to the current Prime Minister, told TVNZ social media platforms could be to blame as well as security services.

"If this man or these men were active on social media with hate speech, one would frankly expect that to be picked up, not only by our own services but frankly also by social media platforms," Clark said.

"Social media platforms have been very slow to respond in closing down hate speech and accounts, how much was this man active in the two years leading up to it that we're told he was planning it?

"I think this will add to all the calls around the world for more effective regulations of social media platforms. From their performance to date, self-regulation isn't cutting it."

Prime Minister Jacinda Ardern has spoken to Facebook chief operating officer Sheryl Sandberg since the attack. It's understood a strong statement concerning social media could come as soon as Monday.

Facebook's global policy manager Monika Bickert and counterterrorism policy manager Brian Fishman put out a brief statement in 2017 concerning the efforts Facebook makes to counter terrorism.

In the statement the pair say Facebook removes content from terrorists when it becomes aware of them and that "in the rare cases when we uncover evidence of imminent harm, we promptly inform authorities".

They also shared details of how artificial intelligence was being used to augment human efforts - although at the time those efforts were focused on Islamist terrorism.

When law enforcement agencies request information with a warrant, Facebook complies to the extent that it can - although end-to-end encrypted messaging on apps it owns, such as WhatsApp and (on an opt-in basis) Messenger, generally cannot be decrypted by anyone, including Facebook itself.

New Zealand does not currently have any legal mass-surveillance programmes that would hoover up every message sent in the country, as the United States does.

A GCSB proposal to do so under "Project Speargun" was eventually ditched by the last Government.

But social networks do pick up some of this information themselves already in order to target advertising. It's understood the Government may ask social networks to share more of this information directly with security agencies in order to increase awareness of radicalisation.

Facebook has been asked for comment.

YouTube has come under fire for showing users increasingly radical content through its recommendation engine. It has pledged to fix this issue multiple times, but an investigation by the Wall Street Journal in 2018 found extreme content was still being recommended.

A spokeswoman for YouTube said hate speech had no place on the platform.

"Hate speech and content that promotes violence have no place on YouTube. Over the last few years we have heavily invested in human review teams and smart technology that helps us quickly detect, review, and remove this type of content. We have thousands of people around the world who review and counter abuse of our platforms and we encourage users to flag any videos that they believe violate our guidelines."

A report on YouTube's content moderation found that 73 per cent of videos that are removed from the platform are gone before anyone has seen them. YouTube employs around 10,000 people to address violent content.

The Association of New Zealand Advertisers and the Commercial Communications Council have also pushed for changes in the light of the attack.

"Advertising funds social media. Businesses are already asking if they wish to be associated with social media platforms unable or unwilling to take responsibility for content on those sites. The events in Christchurch raise the question, if the site owners can target consumers with advertising in microseconds, why can't the same technology be applied to prevent this kind of content being streamed live?" the two asked in a press release on Monday.

"ANZA and the Comms Council encourage all advertisers to recognise they have choice where their advertising dollars are spent, and carefully consider, with their agency partners, where their ads appear."
