Instagram is updating its policy for deleting accounts that post inappropriate content, the Facebook-owned company announced on Thursday.
“These changes will help us quickly detect and remove accounts that repeatedly violate our policies,” Instagram wrote on its official blog.
Moving forward, the company says it will delete accounts that commit multiple violations within a short window of time. Previously, Instagram did not factor timeframe into its decisions about account violations.
The company says the change will allow it to enforce its policies “more consistently,” as well as “hold people accountable for what they post on Instagram.”
Additionally, Instagram has added a new in-app notification that will warn users if their account is at risk of being deleted. The same notification will also allow users to appeal the company’s decision. Before expanding its scope, Instagram plans to limit appeals to content that violates the company’s policies on nudity and pornography, bullying and harassment, hate speech, drug sales, and counter-terrorism.
Earlier in the week, Instagram announced it is expanding its test of hiding like counts to six more countries.