As smartphones bring social media to every corner of the globe, companies like Facebook are struggling to stop hate speech and other problematic content, unable to keep up with a plethora of languages on their services.
Reuters tech reporter Paresh Dave has the story: (SOUNDBITE)(ENGLISH) REUTERS TECH REPORTER PARESH DAVE, SAYING: “The issue right now is that Facebook offers its services in more than 100 languages.
That means menus, settings and typing options are available in those more than 100 languages, but the content rules, the community standards, are only translated into about 40 of them, and that was as of early March.
Users are expected to follow those standards and flag posts that break them. But that’s hard to do if they’re not aware of the rules in the first place.
To deal with the mounting challenge, Facebook has focused on creating automated tools to monitor content.
But some work in fewer than 20 languages.
A Reuters report found last year that hate speech on Facebook that helped foster ethnic cleansing in Myanmar went unchecked, in part because, until last year, Facebook had insufficient tools and moderators for the local language in Myanmar.
Human rights officials fear Facebook is moving too slowly again.
In parts of Africa, Facebook is trying to address the issue: (SOUNDBITE)(ENGLISH) REUTERS TECH REPORTER PARESH DAVE, SAYING: “They just opened their first content moderation site or are actually in the process of opening it in sub-Saharan Africa.
And that site will specialize in languages like Somali and a few others.
And so trying to bring that cultural context and cultural expertise locally to content moderation is another step that Facebook is taking to try to deal with this issue.” And the stakes are only growing.
Countries including Australia, Singapore and the UK are now threatening harsh new regulations if companies fail to promptly remove objectionable posts...violations punishable by steep fines or even jail time for executives.