Facebook has created a One Strike Policy to prevent the live streaming and sharing of violent content. The company made this decision after an attacker live-streamed violence in Christchurch, New Zealand. Facebook's VP of Integrity, Guy Rosen, said that users who break the rules by misusing Facebook's live streaming feature will be banned. If a user broadcasts violent content live on Facebook, they will not be able to use the feature again.
Rosen said that the terrorist attack in Christchurch was live-streamed in March, and the video was shared by many users. The company's aim, he said, is to restrict such abuse of its services so that people cannot use Facebook to spread hatred. To this end, the One Strike Policy will be implemented: any user who breaks the conditions set out in it will have their account or features banned.
Facebook deleted videos of New Zealand attack
Rosen said that if a user shares a link to a statement by a terrorist organization, that will also violate the policy, and their account will be banned. Facebook deleted videos of the New Zealand attack from its platform, but many people shared edited versions of them, which remains a challenge for the company.
Facebook is using Artificial Intelligence
In Australia and New Zealand, Facebook is also using Artificial Intelligence to identify hate groups and remove them from its platform. Such groups will not be able to use any of Facebook's services. Facebook had previously announced that it would ban white nationalism and white separatism, as well as content supporting them. This ban will take effect on Facebook and Instagram next week.
Facebook has also contracted with three US universities to improve its photo and video analysis techniques, spending $7.5 million (about Rs 51 crore) on the effort.