Screenings at ACMI from October 19
Interview with directors Hans Block and Moritz Riesewieck and review of The Cleaners.
Hidden away in developing countries are teams sitting in crowded, dark rooms, sworn to secrecy as they watch every video and click through every image uploaded to social media. Standing behind them are security guards ensuring they don’t break protocol. Once they leave the office, they are monitored to make sure they don’t talk about their job, or even hint at what they do, online. The giant social media networks hire these companies to decide whether the image you just uploaded is worthy of staying online. The workers can’t tell anyone they do this job, and they most certainly can’t explain what they have to watch.
Hans Block, writer and director of The Cleaners, thinks this sounds like the script of a Hollywood blockbuster: “The hidden exploitation of thousands of young workers in the developing world and the silencing of critical thinking in the digital space”.
When it comes to the online world, there is a lot most people don’t know about. There’s also a whole lot that people know about but don’t want to fix. There are parts of the web that harbour criminals and, worse, there are people in this world who will pay for horrible, disturbing and highly illegal videos and images.
With some 300 million photos uploaded to Facebook every day, you might ask yourself how illegal and disturbing images are filtered out so that they don’t end up in your feed. This was the question the Berlin duo Hans Block and Moritz Riesewieck had in mind after seeing a disturbing video.
“…thousands of users around the world discovered in their Facebook newsfeed a video of a small girl being raped by an older man. Before the video was deleted from Facebook, it had been shared 16,000 times, and liked 4,000 times,” Block said.
With this in mind, Block and his partner Moritz Riesewieck, who work under the label “Laokoon”, decided to find out exactly how this hidden world operates in a space where there is supposed to be a great deal of transparency.
Block explains that Laokoon, in Greek mythology, was the only person to warn everyone of the true nature of the Trojan Horse, and that they wanted to follow in his footsteps by exposing the Trojan Horses of our time.
“The Cleaners, it’s the double bottom of the safe Internet which billions of users experience every day. It appears as a matter of course but actually comes at a high price.
Thousands of secret workers sort out what we are not supposed to see. But what appears on their screens, what should be withheld from us? It seemed very important to us to show that this actually exists,” Block says.
On one hand, you have a group of people looking at pictures and watching videos of torture, hardcore pornography and the rape of innocent underage victims. This would be a highly stressful and damaging job for any individual. But on the other hand, “these young people determine what can be seen in our digital world and what’s not”, said Riesewieck.
Consider that some of these decisions relate to politicians or public figures whose corruption the public has a right to know about. How are these “cleaners” able to make such decisions for an entire online world?
While researching The Cleaners, Riesewieck explains, a meeting with media scientist Sarah T. Roberts revealed that all these decisions are made by people, not algorithms, as most would expect.
“We really wanted to get in contact with these workers doing that kind of job, because for us this is an interesting work on several levels,” Riesewieck said.
It was assumed that most of this work was outsourced to developing countries, but getting in contact with these individuals in Manila ended up being the hardest part of making the film for Block and Riesewieck. It’s a secretive and very well-hidden industry: not only do the social media companies do their best to keep it a secret, the third-party companies take the secrecy just as seriously.
“The companies use codewords to hide which companies they are working for. Workers are never allowed to say that they work for Facebook. Whenever they are asked, they have to say that they are working for the ‘Honeybadger Project’,” Riesewieck explains.
The Cleaners delves into the lives of five moderators as they explain exactly what they see and how it affects their lives and the lives of the people around them. They also go on to explain how they view their jobs and what they feel is the purpose of their position.
Some moderators take their jobs very seriously, putting pressure on themselves as they feel they have to prevent war, suicide and bullying.
They are made to memorise information so they can make quick decisions as they click through vast numbers of images and videos. The United States Department of Homeland Security has these teams remember some 27 terrorist groups, whose flags they need to be able to identify in order to block them from uploading anything.
But what makes this story interesting is that major online companies like Google and Facebook deny using these services. They are able to hide this information by using third parties in developing countries who are contractually obliged to keep the partnership secret.
Block said the workers must keep their job secret or face being sued. They aren’t allowed to talk to strangers, and they are monitored by security firms who also screen their social media accounts.
“It took us quite a while to get in touch with the workers. We collected all the information to get a bigger picture about what is going on there. But when we eventually did it in collaboration with a network of locals, we were surprised how proud many of them were to do this job,” Block said.
These moderators feel that without them social media would be a complete mess. Yet many of them silently suffer from the impact the work has on their mental health, said Block.
“After a very long period of research, in which we built up a relationship of trust with the workers, we worked together with our protagonists of the film to find out how to make a film about this work,” Block explained.
One of the moderators tells the story of some of the more graphic videos she has had to watch. In one case she witnessed a six-year-old girl in a cube perform fellatio on an older male. She expressed to her team leader that she couldn’t look at these images anymore. She was told to just do it because it’s her job.
This is, for the most part, how these moderators are treated. In turn, they very quickly develop symptoms “…as a result of their daily work [that are] similar to the post-traumatic stress disorder soldiers suffer from.
Is it any wonder content moderators who see rape videos and other kinds of sexual violence for 8–10 hours per day are not interested in sex when they come home at the end of the day? Is it any wonder content moderators who have seen thousands of beheadings cannot trust other humans anymore, lose all their social relationships, and develop sleeping or eating disorders? Is it any wonder there is a seriously increased suicide rate among content moderators when they have to handle all kinds of self-harm videos?” Riesewieck said.
Riesewieck goes on to explain that psychologists from Manila and Berlin said what makes the situation even worse is that traumatised people aren’t allowed to verbalise their experiences.
With so much secrecy surrounding the subject matter, there’s no doubt that a film exposing not only the control of information reaching the public but also the effect it has on the individuals controlling that information would attract some backlash.
Block explains that the online giants made no attempt at contact regarding the film, even when offered the final cut.
“From the very beginning, we wanted to connect with the executives of the big social media companies. We contacted several dozen people and never got an answer. We asked for a public statement and we even offered to include this statement in the film. Again, there was no answer. Intransparency is unfortunately still one of the main characteristics of these companies,” said Block.
The only actual contact they had was with YouTube, said Riesewieck, when the trailer was blocked for violating community guidelines. “Unfortunately, we didn’t receive a detailed explanation. Facebook has also tried several times to prevent the trailer from being uploaded on the platform,” he said.
Despite the trailer being blocked, The Cleaners is now making its way around the festival circuit gaining a great deal of interest. It comes at a critical time when social media sites are being held more and more accountable for their actions.
While some countries continue to control the information available to their citizens, there are also a number of individuals deciding what information the rest of the world will see. And these decisions are based on their own beliefs and values, not to mention whatever the current directive from the social media outlets asks of them.
What you have in the end are individuals filtering horrific images, work that causes mental health issues to the point of suicide. Because this is a secret industry, there are no repercussions for the online giants. And because the work is done in developing countries, little is being done to ensure the safety of these individuals.
The Cleaners is a horrible yet eye-opening look at a seedy world none of us will ever see, thanks to the help of these individuals. But there is also a whole lot of information you will never know existed, withheld, possibly, by the most trusted individuals or companies in the world.
What, and who, can you trust is the question directors Block and Riesewieck set out to answer. But they find themselves realising that the people behind the scenes are doing a lot more controlling, at dire personal sacrifice.
The Cleaners doesn’t feel like it’s out to change this convoluted industry. So the question is: will the documentary do anything other than inform the users of the world wide web? Will opening this information up to a wider audience make these large corporations accountable and force change? Not only to stop these individuals from having to deal with the horrid material that makes its way online, but also to stop them from making decisions about what information is shared on our behalf.