June 7, 2020, 3:03 p.m.
Social media: moderating content more difficult during the pandemic
The Canadian Press
TORONTO — Many companies have seen work grind to a halt because of the COVID-19 crisis, but for Chris Priebe, it is the opposite.
He is the owner of Two Hat, a Kelowna, British Columbia-based company that moderates content using artificial intelligence.
The company has never been busier, helping clients such as Nintendo Switch, Habbo, Rovio and Supercell sift through billions of comments and conversations. The goal: to quickly identify and remove anything that could harm users.
“We processed 60 billion last month. Previously, it was 30 billion. That’s how bad the coronavirus is. It’s at least double the normal volume,” Priebe said in April, before monthly processing volumes reached 90 billion.
“Some platforms are facing 15 times the volume. That doesn’t mean their revenues have increased 15-fold, or that they can afford to hire more people.”
Companies such as Facebook, Instagram, Twitter, YouTube and Google are contending with a shortage of content moderators, leading to delays in removing harmful posts.
An unprecedented number of people are spending more time at home on their favourite platforms. That is straining servers and turning email services, social networks and comment sections into the Wild West.
The situation has heightened concerns about misinformation and the likelihood that users will encounter hateful, pornographic or violent content.
“Some people are already rather dissatisfied with content moderation as it is, and now you add this pandemic on top of it.”
An Ottawa professor agrees.
“There is a huge increase in bullying behaviour and related issues,” says Suzie Dunn, who specializes in the intersection of technology, gender and the law at the University of Ottawa.
“Some people might not be able to work from home on certain things they would be working on in the office,” adds Kevin Chan, head of public policy at Facebook Canada. “They review potentially private and sensitive content. We need to ensure it is handled securely and privately, as it should be.”
Twitter is relying on automation to help review the comments most likely to cause harm.
“We are working to ensure the consistency of our systems, but they can miss the context our teams provide. That can lead to mistakes,” the company said in a blog post. “We will not permanently suspend accounts based solely on our automated systems.”
Regardless of how moderation is done, some things will always slip through the net, even more so during a pandemic, said Ms. Dunn.
“No system is perfect.”