Weekly Outstanding Work

The W.O.W. Award

Welcome to our WOW (Weekly Outstanding Work) award, showcasing some of the best articles published on Impaakt.com.

If you want to make an impact, you can join our community of writers by getting Certified, or you can rate the impacts described in our analyses. By rating, you'll be shaping the company impact scores that tell the rest of the world how sustainable a business is!

On to the winner this week...

"Facebook’s racially biased algorithm contributes to systemic racism in the US."

Written by: Paushali Bhattacharya

Company: Facebook
SDG 10: Reduced Inequalities



Feedback from the reviewing team:

"Tackling the behemoth that is Facebook, this engaging piece eloquently unpicks the concerning racial bias detected within the social platform. Very good writing."


Read the full impact analysis below, and make sure you log in or sign up to rate the impact the topic of this analysis has on the environment and society.

"Facebook’s racially biased algorithm contributes to systemic racism in the US."


Systemic racism is a form of racism that has become "normal" within a social structure and is therefore often overlooked [4]. A 2017 survey indicated the presence of systemic racism on social media when it revealed that 40% of people in the US would ignore racist posts on social media [5].

About 4.75 billion posts are shared by Facebook users daily [7]. From July to September 2018 alone, Facebook removed 2.9 million pieces of content for violating its hate speech rules, more than half of which were identified by its algorithms [3]. However, Facebook acknowledges that its systems frequently make mistakes in flagging content [3].


Facebook’s algorithm is designed to defend all races and genders equally in an effort to apply consistent standards worldwide [1]. In the name of neutrality, Facebook’s gender- and color-blind policies treat attacks on white people or men in exactly the same way as comments about Black people or women, without taking into account the historical context of racism and sexism [2].




In 2019, researchers at Facebook discovered that a new set of rules proposed for content moderation on Facebook's subsidiary, Instagram, was 50% more likely to automatically disable the accounts of users whose activity indicated that they were Black than those whose activity suggested that they were white [2]. However, rather than changing the faulty rules, the company discouraged the researchers from conducting further research into its inherent racial bias [2].

In 2019, there were 44.1 million African-American residents in the US [6]. A 2019 survey suggests that 70% of Black American adults use Facebook, while 43% use Instagram [3]. Reports from Black Facebook users in 2019 suggest that Facebook's content moderation systems are biased against people of color: they disproportionately stifle the voices of marginalized groups and rarely lead to action on repeated reports of hate speech targeting Black users [3].

Facebook’s inherent racial bias is adding to the problem of systemic racism in the US.




Sources



Make sure you sign in or sign up and rate the impact described on the planet and society.