
Software unveiled to tackle online extremism, violence

A software tool unveiled Friday aims to help online firms quickly find and eliminate extremist content used to spread and incite violence and attacks.

The Counter Extremism Project, a nongovernmental group based in Washington, proposed that its software be used in a system similar to the one used to prevent the spread of online child pornography.

The software was developed by Dartmouth College computer scientist Hany Farid, who also worked on the PhotoDNA system now widely used by Internet companies to stop the spread of content showing sexual exploitation or pornography involving children.

But social media firms have yet to commit to using the tool for extremist content, and some are skeptical about it, according to an industry source.

The announcement comes amid growing concerns about radical jihadists using social networks to spread violent and gruesome content and to recruit people for attacks.

"We think this is the technological solution to combat online extremism," said Mark Wallace, chief executive of the organization that includes former diplomats and public officials from the United States and other countries.

The group proposed the creation of an independent "National Office for Reporting Extremism" that would operate in a similar fashion to the child pornography center -- identifying and flagging certain content to enable online firms to automatically remove it.

This system, if adopted by Internet firms, "would go a long way to making sure that online extremism is no longer pervasive," Wallace said during a conference call with journalists.

He said it could be useful in stopping the "viral" spread of videos of beheadings and killings such as those produced by the Islamic State group.

Wallace said he expects "robust debate" on what is acceptable content but added that "I think we could agree that the beheading videos, the drowning videos, the torture videos... should be removed."

Farid, who also spoke on the call, said he believes the new system would be an effective tool for companies that must now manually review each complaint about objectionable content.

"We are simply developing a technology that allows companies to accurately and effectively enforce their terms of service," Farid said.

"They do it anyway, but it`s slow."

Farid said he developed the software with a grant from Microsoft, and that he and the Counter Extremism Project would work to provide it to online companies.

The system is based on "robust hashing" -- computing digital signatures of text, images, audio and video that can be tracked, enabling platforms to identify flagged content and stop it from being posted or reposted.
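To illustrate the general idea, here is a minimal sketch of robust (perceptual) hashing for images: a compact signature that changes little under re-encoding or resizing, so re-uploads of flagged material can be matched by bit distance rather than exact file equality. This is only an illustrative average-hash example under assumed file names and thresholds, not the PhotoDNA or Counter Extremism Project algorithm.

```python
# Illustrative robust-hashing sketch (average hash), not the actual CEP/PhotoDNA system.
from PIL import Image  # requires the Pillow library

def average_hash(path, hash_size=8):
    """Return a 64-bit perceptual signature of an image."""
    # Shrink and convert to grayscale so minor edits barely change the hash.
    img = Image.open(path).convert("L").resize((hash_size, hash_size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    # Each bit records whether a pixel is brighter than the image mean.
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming_distance(h1, h2):
    """Count differing bits between two signatures."""
    return bin(h1 ^ h2).count("1")

# Hypothetical usage: compare an upload against a database of flagged hashes.
# The file names and the distance threshold of 5 are assumptions for the example.
flagged_hashes = {average_hash("flagged_frame.jpg")}
upload_hash = average_hash("new_upload_frame.jpg")
if any(hamming_distance(upload_hash, h) <= 5 for h in flagged_hashes):
    print("Likely re-upload of flagged content; route for review or removal.")
```

A small Hamming distance between signatures indicates near-duplicate content, which is what lets such a system catch reposts that have been slightly edited or recompressed.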

"The technology has been developed, it has been tested and we are in the final stages of engineering to get it ready for deployment," Farid said. "We`re talking about a matter of months."

Social networks have long stressed that they will help legitimate investigations of crimes and attacks, but they have resisted efforts to police or censor the vast amounts of content flowing through them.

But governments in the United States, France and elsewhere have been pressing online firms to do more to curb extremist content.

And a lawsuit filed on behalf of a victim in the 2015 Paris attacks seeks to hold Facebook, Google and Twitter liable for the violence.

"Without defendants Twitter, Facebook and Google (YouTube), the explosive growth of (the Islamic State group) over the last few years into the most-feared terrorist group in the world would not have been possible," said the lawsuit filed by the family of Nohemi Gonzalez, killed in Paris.

A tech industry representative, who asked not to be identified, said social media firms had concerns, including about privacy and the effectiveness of the tool.

"Child pornography is very different from extremist content," according to the source, summarizing tech firms` views.

"Participants were understandably concerned about which governments would be able to impose their definition of terrorist. Saudi Arabia? Russia? China?" the industry representative said.

Discussions were held earlier this year in a conference call, the source said, but no agreement was reached with the tech firms before Friday's announcement.

"That tells us that they weren`t able to build a consensus among the companies, and that CEP is more interested in grandstanding for press coverage than actually getting something done," the source said.

"They can`t accuse the tech companies of treason and then expect to get invited over for dinner the next day."
