In a statement on its website, the company said the feature will roll out globally in the coming weeks to users who have the latest version of the app. Rumors can spread when users find themselves added to group chats without their permission.

Read: In India’s Upcoming Elections, Bollywood Wages a Battle for Hearts and Minds
The latest changes aren’t WhatsApp’s first effort to fight abuse of its platform. After a rumor circulating in India’s Assam state led to mob violence in June, WhatsApp announced forwarding restrictions, which limit the number of times a user can forward individual messages.
Other social media companies have taken similar steps. Facebook recently announced that it would act to curb the spread of false information on its platform, removing fake accounts and hiring outside fact-checking organizations to help with content moderation.

The group chat restrictions aren’t the only move the company has made to fight misinformation in recent days. In India, where national elections begin on April 11, the company recently launched a fact-checking service.

In a bid to “limit abuse,” the Facebook-owned messaging app WhatsApp introduced a new privacy feature Wednesday that lets users control who can add them to groups.

The “checkpoint” allows users to submit messages, videos and photos to have their veracity checked, according to the BBC.

Social media has been used to spread false information and exacerbate sectarian tensions with deadly results in the country, which has more than 200 million WhatsApp users.