Terrorism is an attack on open societies, and addressing the threat posed by violence is a critical challenge for us all. Google and YouTube are committed to being part of the solution, working with government, law enforcement and civil society groups to tackle the problem of violent extremism online. Thousands of people around the world review and counter abuse of Google's platforms. Google's engineers have developed image-matching technology to prevent re-uploads of known terrorist content, and the company has invested in systems that use content-based signals to help identify new videos for removal. It has also built partnerships with expert groups, counter-extremism agencies and other technology companies to help inform and strengthen efforts to counter terrorism.
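The post does not describe how the image-matching works, but the general idea of fingerprinting known content and checking new uploads against those fingerprints can be sketched briefly. The snippet below is a minimal, hypothetical illustration using a simple perceptual "difference hash" of video frames; the function names, hash choice and distance threshold are assumptions for illustration, not Google's actual implementation.

```python
# Hypothetical sketch: flagging re-uploads by comparing a perceptual
# "difference hash" (dHash) of video frames against a database of hashes
# of known terrorist imagery. Illustrative only, not Google's system.
from PIL import Image  # pip install Pillow


def dhash(image_path: str, hash_size: int = 8) -> int:
    """Compute a 64-bit difference hash of an image frame."""
    img = Image.open(image_path).convert("L").resize((hash_size + 1, hash_size))
    pixels = list(img.getdata())
    bits = 0
    for row in range(hash_size):
        for col in range(hash_size):
            left = pixels[row * (hash_size + 1) + col]
            right = pixels[row * (hash_size + 1) + col + 1]
            bits = (bits << 1) | (1 if left > right else 0)
    return bits


def is_known_content(frame_hash: int, known_hashes: set[int], max_distance: int = 5) -> bool:
    """Flag a frame if its hash is within a small Hamming distance of any known hash."""
    return any(bin(frame_hash ^ known).count("1") <= max_distance for known in known_hashes)
```

In a setup like this, near-duplicate frames survive re-encoding and minor edits because the hash captures coarse brightness gradients rather than exact pixel values.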
Building on this work, Google is taking four additional steps.
First, Google is increasing its use of technology to help identify extremist and terrorism-related videos. Video analysis models have found and assessed more than 50 per cent of the terrorism-related content removed over the past six months. Google is now devoting more engineering resources to applying its most advanced machine learning research to train new "content classifiers" that quickly identify and remove extremist and terrorism-related content.
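The post does not detail what these "content classifiers" look like. As a toy illustration of the general shape of such a pipeline, the sketch below trains a text model on placeholder video metadata using scikit-learn; the data, features and model choice are assumptions, and a real system would also draw on video and audio signals.

```python
# Hypothetical sketch of a "content classifier": a text model over video
# metadata (title + description) that scores uploads for policy review.
# Placeholder data and labels; not Google's actual models or features.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Toy training data: metadata strings labelled 1 (policy-violating) or 0 (benign).
texts = [
    "join the fight propaganda clip",
    "cute cat compilation",
    "martyrdom recruitment speech",
    "guitar tutorial for beginners",
]
labels = [1, 0, 1, 0]

classifier = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
classifier.fit(texts, labels)

# Score a new upload; high-scoring videos would be queued for human review.
score = classifier.predict_proba(["recruitment video for the fight"])[0][1]
print(f"review priority score: {score:.2f}")
```

In practice such a score would feed a review queue rather than trigger automatic removal, which is consistent with the role of human experts described in the next step.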
Second, Google will greatly increase the number of independent experts in YouTube's Trusted Flagger programme. Machines can help identify problematic videos, but human experts still play a role in nuanced decisions about the line between violent propaganda and religious or newsworthy speech. While many user flags can be inaccurate, Trusted Flagger reports are accurate more than 90 per cent of the time. Google will also expand its work with counter-extremist groups to help identify content that may be used to radicalise and recruit extremists.
Third, Google will be taking a tougher stance on videos that do not clearly violate its policies, for example videos that contain inflammatory religious or supremacist content. In future these will appear behind an interstitial warning and will not be monetised, recommended or eligible for comments or user endorsements. That means these videos will have less engagement and will be harder to find.
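This "limited state" amounts to a bundle of feature restrictions applied to borderline videos. The sketch below is a hypothetical way of representing that policy outcome in code; the field and function names are invented for the example and are not YouTube's API.

```python
# Hypothetical sketch of the limited state described above: a borderline video
# stays up but loses monetisation, recommendations, comments and endorsements,
# and is shown behind an interstitial warning. Names are illustrative only.
from dataclasses import dataclass


@dataclass(frozen=True)
class VideoTreatment:
    show_interstitial: bool
    monetised: bool
    recommended: bool
    comments_enabled: bool
    endorsements_enabled: bool


def treatment_for(review_outcome: str) -> VideoTreatment:
    if review_outcome == "violates_policy":
        raise ValueError("policy-violating videos are removed, not limited")
    if review_outcome == "borderline":  # inflammatory but not policy-violating
        return VideoTreatment(True, False, False, False, False)
    return VideoTreatment(False, True, True, True, True)  # ordinary video
```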
Finally, YouTube will expand its role in counter-radicalisation efforts. Google is working with Jigsaw to implement the "Redirect Method" more broadly across Europe. This promising approach harnesses the power of targeted online advertising to reach potential Isis recruits, and redirects them towards anti-terrorist videos that can change their minds about joining. In previous deployments of this system, potential recruits have clicked through on the ads at an unusually high rate, and watched over half a million minutes of video content that debunks terrorist recruiting messages.
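The Redirect Method is described only at a high level: ads targeted at search terms that suggest extremist intent point users towards counter-narrative videos. The sketch below is a hypothetical illustration of that keyword-to-playlist redirection; the keywords, playlist URL and matching rule are placeholders, not the real campaign configuration.

```python
# Hypothetical sketch of the Redirect Method's core idea: queries that suggest
# extremist intent trigger ads pointing to curated counter-narrative playlists.
# Keywords and the playlist URL are placeholders, not real campaign data.
REDIRECT_KEYWORDS = {"join isis", "hijrah travel", "caliphate recruitment"}
COUNTER_NARRATIVE_PLAYLIST = "https://youtube.com/playlist?list=PLACEHOLDER"


def ad_for_query(query: str) -> str | None:
    """Return a counter-narrative ad target if the query matches a redirect keyword."""
    normalised = query.lower().strip()
    if any(keyword in normalised for keyword in REDIRECT_KEYWORDS):
        return COUNTER_NARRATIVE_PLAYLIST
    return None  # ordinary queries are not targeted
```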
