New Twitter Tool to Check Spam in Periscope Broadcasts
Twitter is on a reform spree. Close on the heels of its announcement that it will make it easier for people to express themselves within 140-character tweets, the microblogging site introduced a new feature on Tuesday.
The new comment moderation tool for Periscope broadcasts gives viewers the ability to report and vote on comments in real time, helping to combat spam and abuse.
Periscope is a Twitter-owned live video app that lets anyone broadcast live to a global audience and lets viewers interact in real time.
Designed to be transparent, live, and community-led, the new reporting tool gives people watching the broadcast the ability to report comments they find inappropriate as they appear on the screen.
A small group of randomly selected live viewers then votes on whether they agree with the reporter’s assessment. Those found to be sending spammy or abusive comments are temporarily barred from commenting within the broadcast.
“We want our community to feel comfortable when broadcasting,” said Kayvon Beykpour, Periscope CEO and co-founder.
The new reporting tool is accessible by tapping a comment within a broadcast. People can report a comment as spam, as abuse, or for other reasons (e.g., the comment is in a foreign language).
Comments flagged as spam or abuse are then put to a jury of randomly selected live viewers, who are asked to judge whether the comment is abuse or spam, whether it looks okay, or whether they’re not sure.
Periscope acknowledges that live viewers are best suited to ascertain what constitutes abuse, given that the context of the broadcast shapes the tone of the comments. Regardless of the outcome, the person who reported the comment won’t see comments from the person they reported for the rest of the broadcast.
People whose comments are deemed spam or abuse are temporarily barred from commenting within the broadcast. Repeat offenses result in commenting being disabled for that person for the remainder of the broadcast.
Periscope has designed the feature to be transparent, so that everyone involved (the reporter, the moderators, and the commenter) understands the outcome, and lightweight, so that they can participate with minimal disruption to their viewing experience.
This system works in tandem with other safety tools already in place: the ability to report ongoing harassment or abuse, block and remove people from broadcasts, and restrict comments to mutual followers.
People will have the choice to disable the comment moderation tool in their broadcasts or choose not to participate as moderators.
The update is starting to roll out this week on iOS and Android devices.