Can ethical algorithms exist?

We’re re-upping our commitment to Design Justice and want to publish more about our Design Justice Reading Circle. In a recent, very timely meeting, we discussed the intersection of government regulation and tech companies. The topic was spurred by the tragic mosque attacks in New Zealand and their seeming birth from Internet forum culture, a point well articulated by The Daily podcast from March 18, 2019. To prepare for discussion, we asked folks to listen to that episode of The Daily, read “YouTube’s Recommendation Algorithm Has a Dark Side” by Zeynep Tufekci from Scientific American’s April 2019 issue, and watch the “Content Moderation and Free Speech” episode of Patriot Act with Hasan Minhaj (available on Netflix).

During our discussion, we surfaced common themes and asked some big questions. Tufekci’s article explicitly calls out the “dark side” of YouTube’s recommendation algorithm, where engagement comes from recommendations of “wild claims, as well as hate speech and outrage peddling.” In The Daily, the hosts discuss how the mosque attacks seemed almost “born from the Internet”: the shooter called on people to follow a YouTuber and live-streamed the attacks. Given that this content is not only available but actively pushed for more engagement, and that it can push people toward violence, what can be done?

In Hasan Minhaj’s episode, he reframes social media companies from content platforms to ad companies (right around 15:30), looking to engage more people and, thus, maximize ad sales. This reframe shifts our perception of Facebook’s ideals from connecting people with interesting and useful social content to the less lofty goal of keeping people’s eyeballs on the site for as long as possible. Viewed through that business lens, it seems futile to believe that Facebook, or any other company, would meaningfully moderate its content. Inflammatory content is devoured and shared rapidly, which increases engagement and boosts the company’s profits. If the name of the game is to increase participation and drive ad sales, then content moderation runs counter to the goals of the company.
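To make that tension concrete, here is a toy sketch in Python. Everything in it is hypothetical (the post names, the engagement numbers, the `harm_score` signal); it is an illustration of the incentive structure, not any platform’s actual ranking code. Rank a feed purely by predicted engagement and outrage-bait floats to the top; add a penalty for harmful content and the very metric the business optimizes goes down.

```python
from dataclasses import dataclass

@dataclass
class Post:
    title: str
    predicted_engagement: float  # e.g., expected watch time, arbitrary units
    harm_score: float            # hypothetical moderation signal, 0.0 to 1.0

posts = [
    Post("Calm policy explainer", predicted_engagement=2.0, harm_score=0.0),
    Post("Outrage-bait conspiracy", predicted_engagement=9.5, harm_score=0.9),
    Post("Cute animal video", predicted_engagement=5.0, harm_score=0.0),
]

def rank(posts, harm_penalty=0.0):
    """Order posts by predicted engagement minus a weighted harm penalty."""
    return sorted(
        posts,
        key=lambda p: p.predicted_engagement - harm_penalty * p.harm_score,
        reverse=True,
    )

# Pure engagement ranking: the inflammatory post wins.
print([p.title for p in rank(posts)])
# -> ['Outrage-bait conspiracy', 'Cute animal video', 'Calm policy explainer']

# With a strong harm penalty it drops to the bottom, and so does total
# "engagement", which is exactly the profit conflict described above.
print([p.title for p in rank(posts, harm_penalty=10.0)])
# -> ['Cute animal video', 'Calm policy explainer', 'Outrage-bait conspiracy']
```

The numbers are made up, but the structure is the point: any `harm_penalty` greater than zero is, by construction, a tax on the engagement metric, which is why moderation keeps landing on the wrong side of the ledger.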

Since tech companies conveniently oscillate between being curators of user-generated content and publishers of content, they reap the benefits of unclear regulations. Who, then, takes responsibility for hate messages, for brainwashing, for the planning of a white supremacist march, or for a video of a child being beaten being watched thousands of times? Or even simply for the addictive behavior these companies encourage in users?

This raises the question:

Is there a way to structure tech companies so that they can make humane choices even when those choices hurt their earning potential? Should all tech companies be B Corps with a mission to save humanity?

Now, some may say that innovation and tech companies are not changing human behavior (though there is a community that believes otherwise) and that what we’re seeing on the Internet has always been part of human nature. We can counter that empathy, kindness, and a moral compass are also part of human nature, and that it’s up to us as researchers and designers to choose which parts we want to augment.

Another question we pondered: who determines right and wrong? Is it better to have a company, the government, or an AI moderate content? Can we ever really escape our biases? Can AI ever become smart enough to understand the nuances of censorship and freedom of speech?
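To see why “just let AI moderate it” is harder than it sounds, here is a deliberately naive sketch of automated moderation, again hypothetical rather than a real system: a filter that matches phrases cannot tell incitement apart from a news report quoting it or from counter-speech condemning it.

```python
# A deliberately naive keyword filter; hypothetical, not a real system.
BLOCKLIST = {"burn it down"}

def naive_moderate(text: str) -> bool:
    """Return True if the text should be removed."""
    lowered = text.lower()
    return any(phrase in lowered for phrase in BLOCKLIST)

samples = [
    "Let's burn it down tonight.",                          # incitement
    'Protesters chanted "burn it down," the report says.',  # news quoting it
    "Anyone telling you to burn it down should be banned.", # counter-speech
]

for s in samples:
    print(naive_moderate(s), "-", s)
# All three print True: context, intent, and speaker are invisible to the
# filter, which is the nuance problem in a nutshell.
```

Real systems use learned classifiers rather than keyword lists, but the underlying problem survives: the line between reporting, quoting, satire, and endorsement lives in context that the text alone often does not carry.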

For this particular session, the questions and thoughts seemed more important than rushing to solutions. We also feel we’re only seeing the tip of the iceberg, and we welcome conversation, additional questions, and resources.

Cover Photo by Thought Catalog on Unsplash, Thumbnail Photo by Marc Schäfer on Unsplash