Could the days of toxic comments finally be over? Is the Internet about to become a pleasant, polite place where opinions are always offered with respect and talkbacks are actually worth reading? If Google has its way, that might be exactly what’s about to happen.
The company’s Jigsaw division is beginning to roll out Perspective, an API powered by artificial intelligence that “reads” comments as they’re typed. Each comment is scored for “toxicity,” a characteristic Jigsaw defines as the “likelihood that this comment will make someone leave the conversation.” Jigsaw claims that simply telling users their comment is likely to put people off can change what they write. And if commenters aren’t bothered by their toxicity score, publishers can set their own toxicity threshold: a sliding scale that filters out the most toxic comments, balancing free speech against the preference for a pleasant atmosphere.
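To see how a publisher might put that score to work, here is a minimal sketch of calling Perspective’s AnalyzeComment endpoint and filtering comments against a site-chosen toxicity threshold. It assumes you have been granted API access and hold a key; the endpoint, attribute name, and response fields follow the publicly documented API, but the threshold value and helper names are examples to verify rather than a drop-in integration.

```python
# Minimal sketch: score a comment with the Perspective API and hide it if it
# exceeds a publisher-chosen toxicity threshold. Assumes you have requested
# API access and hold a valid key; endpoint and field names follow the public
# Perspective documentation but should be verified before use.
import requests

API_KEY = "YOUR_API_KEY"  # placeholder; obtained via Jigsaw's access request
ANALYZE_URL = (
    "https://commentanalyzer.googleapis.com/v1alpha1/comments:analyze"
    f"?key={API_KEY}"
)

TOXICITY_THRESHOLD = 0.8  # example "sliding scale" value chosen by the site


def toxicity_score(comment_text: str) -> float:
    """Return Perspective's TOXICITY probability (0.0 to 1.0) for a comment."""
    payload = {
        "comment": {"text": comment_text},
        "requestedAttributes": {"TOXICITY": {}},
    }
    response = requests.post(ANALYZE_URL, json=payload, timeout=10)
    response.raise_for_status()
    data = response.json()
    return data["attributeScores"]["TOXICITY"]["summaryScore"]["value"]


def should_publish(comment_text: str) -> bool:
    """Publish only comments at or below the site's toxicity threshold."""
    return toxicity_score(comment_text) <= TOXICITY_THRESHOLD
```

Lowering `TOXICITY_THRESHOLD` makes the filter stricter; raising it lets more combative comments through, which is exactly the trade-off the sliding scale is meant to expose.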
The technology is still undergoing testing but is being employed at the websites of The New York Times, The Economist, and The Guardian.
One immediate result should be an easier time for the comment moderators at those publications. The Times says it employs fourteen people on its community desk to check comments as they come in. They review around 11,000 comments every day, weeding out “personal attacks, obscenity, vulgarity, profanity (including expletives and letters followed by dashes), commercial promotion, impersonations, name-calling, incoherence and SHOUTING.” The Times offers a test where you can see for yourself how hard that is to do quickly and consistently. (It’s hard.)
As you build your own private social network, though, that’s exactly the job you’ll have to take on. Jigsaw is allowing other publishers to request API access, but there’s no indication that the system will be universally available any time soon. Until that happens, it will remain your role to weed out commenters who drive away members and kill conversations.
To some extent, you’ll depend on the members of your community to do that. You’ll need them to block and report problem users so that you can issue warnings and bans. But as the manager of your private social network, you also have a vital role to play. One of the smartest aspects of Perspective isn’t the artificial intelligence that can look beyond keywords to assess the tone of a comment; it’s the recognition that different sites will have different thresholds of toxicity. A political site might tolerate insults more readily than a fashion site, for example. It will be up to you to make clear to your community where you feel the threshold lies, so that members know what they can and can’t say.
PeepSo’s WordFilter add-on will help. Install PeepSo, add WordFilter, and use your blog to explain the rules and keep your members engaged in conversation.
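WordFilter works on a simpler principle than Perspective: it matches comments against a list of words you’ve decided to ban. Purely as an illustration of that idea (the word list, masking behaviour, and function name below are hypothetical, not WordFilter’s actual settings or code), a keyword filter can be sketched like this:

```python
import re

# Hypothetical sketch of keyword-based filtering, the general approach a tool
# like WordFilter takes. The banned-word list and the masking behaviour are
# illustrative examples, not the add-on's real configuration.
BANNED_WORDS = {"idiot", "moron"}  # example entries chosen by the site owner

_pattern = re.compile(
    r"\b(" + "|".join(re.escape(word) for word in BANNED_WORDS) + r")\b",
    re.IGNORECASE,
)


def filter_comment(text: str) -> str:
    """Mask banned words with asterisks before a comment is displayed."""
    return _pattern.sub(lambda match: "*" * len(match.group()), text)


print(filter_comment("Don't be an idiot."))  # -> "Don't be an *****."
```

Unlike Perspective, a list like this only catches the exact words you specify, which is why explaining your rules on your blog matters just as much as installing the add-on.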