
Facebook Beefs Up Tools for Keeping Group Chats Civil

Artificial intelligence against online conflicts.
by Agence France Presse
3 days ago

SAN FRANCISCO -- Facebook on Wednesday beefed up automated tools to assist group moderators striving to keep exchanges civil in a time of clashing viewpoints.

The leading social network, hammered by critics over vitriolic content in news feeds, has played up groups as enclaves where people with conflicting views on hot-button issues can bond over shared interests, from music to hobbies to parenthood.

"Some of these groups are millions of people," Facebook engineering vice president Tom Alison said while briefing AFP on new tools for moderators.

More than 1.8 billion people use groups every month, and there are more than 70 million administrators working to "keep conversations healthy" in the forums, according to Alison.

While much content is not public in groups, Facebook has been looking at ways to control hateful and abusive content in these forums.


Automated systems at Facebook already check for posts in groups that violate the social network's rules about what content is acceptable.


A new "Admin Assist" feature lets moderators set criteria for what is considered acceptable in the group, then have posts or comments automatically checked for violations.

Moderators can also employ software to eliminate comments with links to unwanted content, slow down heated conversations to let heads cool, or require that people be members of the social network or the group for a while before joining in conversations.


"What these tools do is automate things admins did manually, but not expose anything they didn't have access to before," Alison said.

Facebook is testing artificial intelligence that can watch for indications of conversations getting nasty, perhaps with rapid-fire replies or controversial content involved, then alert moderators.

"The AI looks at things associated with threads that have conflicts," Alison said.

"Some admins welcome people having debates; others don't want contentious conversations."

A group called Dads With Daughters, run by the nonprofit Fathering Together, is among those given early access to test the tools, according to moderator Brian Anderson.


The online community for fathers sharing advice, resources, and pride in raising daughters has more than 127,000 members.

"In the beginning there was really nothing, you just put rules up there and what a group is about," Anderson said of moderation in the early days.

Tools added by Facebook have reduced the number of moderators needed at the group, according to Anderson.

Firm stands by moderators against "toxic masculine tropes," such as toting shotguns to protect daughters from suitors, were credited with helping establish the tenor of the community.

"You can tell the groups that are really putting in the effort to keep it a civil space versus posts that just get nasty right away."
