Instagram aims to remove all offensive comments

By BBC

Instagram has launched a new system that aims to prevent offensive comments from being posted on photos shared on the platform.

The social network says it is a step towards keeping Instagram a “safe place for self-expression”.

The company hopes the filter will help stop bullying and abuse on the platform.

In May 2017, Instagram was rated the worst social media platform for young people’s mental health.

Instagram is now making visible efforts to change that perception.

“We know that toxic comments create negativity and discourage people from sharing. But it’s also a hard problem to solve,” the company says in an official statement.

Of the millions of comments left every day, the statement adds, “only a small fraction of these are inappropriate.”

The company is now employing people to train computer systems to identify abusive comments; those systems will then monitor comments in various languages.

“The training process has taken several months, and has involved the review of more than 2 million comments,” says Michelle Napchan, Instagram’s head of public policy.

“We know that this isn’t easy. It’s not just the words themselves — context is incredibly important to determining what’s an offensive comment.”

She says the systems will look not only at the words used in comments but also at their sequence and combinations, as well as the relationship between the person posting the comment and the person who shared the photo.
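Instagram has not published technical details, but the description above, word features plus their sequences and combinations plus a commenter–poster relationship signal, maps onto a conventional text-classification setup. The sketch below is purely illustrative: the data, the "follows_poster" feature, and the model choice are all invented for demonstration and are not Instagram's implementation.

```python
# Illustrative only: a toy classifier combining word n-gram features
# (individual words, their sequences and combinations) with a simple
# commenter/poster relationship feature. All data here is invented.
import numpy as np
from scipy.sparse import hstack, csr_matrix
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression

# Hypothetical labelled comments (1 = offensive, 0 = acceptable).
comments = ["great shot", "you are an idiot", "love this", "nobody likes you"]
labels = [0, 1, 0, 1]
# Hypothetical relationship signal: 1 if the commenter follows the poster.
follows_poster = [[1], [0], [1], [0]]

# Unigrams and bigrams capture single words plus short word sequences.
vectorizer = CountVectorizer(ngram_range=(1, 2))
text_features = vectorizer.fit_transform(comments)

# Combine the text features with the relationship feature.
X = hstack([text_features, csr_matrix(follows_poster)])

model = LogisticRegression()
model.fit(X, labels)

# Score a new comment from a commenter who follows the poster.
new = hstack([vectorizer.transform(["you are great"]), csr_matrix([[1]])])
print(model.predict(new))
```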

Instagram previously used a similar system to help remove spam comments from photos and says most people will not notice anything different in their Instagram feed.
