Facebook, Google and Twitter have had staff across time zones working around the clock, alongside their artificial intelligence systems, to track down and remove the video.
Completely removing the content is not easy, and the difficulty goes to the core of what these platforms were set up to do: give people an unrestrained voice to share their views. The gunman's various social media accounts were removed, and the technology giants were also proactively searching for accounts being set up in his name, to prevent impersonation and further spreading.
The technology being used to track down the video takes a visual footprint of the footage, meaning that specific version can be banked and blocked across a platform. The problem is that the video is being downloaded, edited, then uploaded and shared again, creating a new visual footprint that the likes of Facebook, YouTube and Twitter need to track down and block.
Each edit can alter the colour or add watermarks or captions, making the task challenging. Facebook, YouTube and Twitter have removed and blocked hundreds of different versions of the video.
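The companies have not disclosed exactly how their matching systems work, but the "visual footprint" described above is broadly analogous to perceptual hashing. The sketch below, written in Python with the Pillow imaging library, shows a minimal average-hash of a single video frame and a bit-difference comparison between an original and a re-uploaded copy; the frame filenames and the match threshold are illustrative assumptions, not details any of the platforms have published.

```python
# Minimal perceptual-hash sketch (average hash) for a single video frame.
# Illustrative only: the platforms' actual matching systems are not public.
from PIL import Image  # pip install Pillow

def average_hash(path, hash_size=8):
    """Shrink the frame, grayscale it, and record which pixels are
    brighter than the frame's mean brightness as a bit string."""
    img = Image.open(path).convert("L").resize((hash_size, hash_size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    return "".join("1" if p > mean else "0" for p in pixels)

def hamming_distance(hash_a, hash_b):
    """Count differing bits; a small distance suggests the same footage."""
    return sum(a != b for a, b in zip(hash_a, hash_b))

if __name__ == "__main__":
    # Hypothetical frame grabs from an original upload and a re-edited copy.
    original = average_hash("frame_original.png")
    reupload = average_hash("frame_reuploaded.png")
    distance = hamming_distance(original, reupload)
    # The threshold of 5 bits is an arbitrary example value.
    print("bits differing:", distance, "match" if distance <= 5 else "no match")
```

Under this kind of scheme, a straight re-encode of the clip stays close to the banked hash, but cropping, watermarks, captions or colour shifts flip enough bits that an edited copy can slip past a block list built from earlier versions, which is why each new version has to be tracked down and banked separately.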
Essentially, it’s a game of whack-a-mole.
“We are deeply saddened by the shootings in Christchurch on Friday,” a Twitter spokesman said.
“Twitter has rigorous processes and a dedicated team in place for managing exigent and emergency situations such as this. We also cooperate with law enforcement to facilitate their investigations as required.”
Facebook director of policy in Australia and New Zealand Mia Garlick said: “We continue to work around the clock to remove violating content from our site using a combination of technology and people. In the first 24 hours, we removed 1.5 million videos of the attack globally, of which over 1.2 million were blocked at upload. Out of respect for the people affected by this tragedy and the concerns of local authorities, we’re also removing all edited versions of the video that do not show graphic content.”
As Peter Kafka noted in Recode, quoting a 2017 post in which Facebook boss Mark Zuckerberg addressed the spread of misinformation by Russians on the platform, these services put no roadblocks in front of hate speech, or now footage of a terrorist attack, before it is actually published.
“We don’t check what people say before they say it, and frankly, I don’t think society should want us to. Freedom means you don’t have to ask for permission first, and by default, you can say what you want,” Kafka quoted from a Zuckerberg response in 2017.