Plus-Size Comedian Celeste Barber’s Post Was Flagged For Nudity But The Same Post By A Supermodel Was Fine. Why So Discriminatory?

Social media was sold to us as a platform that enables freedom of expression and equality. We signed up on Instagram, and today a majority of us find ourselves scrolling through our feeds, lying in bed, assuming we are tasting that freedom. We believe we can express ourselves through our photo grids and consume the content we choose. But is it really that way? Recently, Australian comedian Celeste Barber called out the platform for its body-shaming practices, and it only gets murkier from here.

The plus-size comedian had posted a semi-nude picture parodying a picture of Victoria’s Secret model Candice Swanepoel. She captioned her picture, “When you finally sit down and your kid asks for a drink.”

Instagram users found it hilarious, and possibly relatable, and wanted to share it, except the platform flagged it as a violation of its “community guidelines on nudity or sexual activity.” Of course, Swanepoel’s image remained shareable and somehow didn’t violate any guidelines whatsoever.

https://www.instagram.com/p/CGXGDWOHl02/?

The platform’s biased policies were called out by Celeste Barber, who wrote on her story, “Hey Instagram, sort out your body-shaming standards, guys. It’s 2020. Catch up.” According to Facebook, Instagram flags content based on user reports and Artificial Intelligence (AI). Without any editorial discretion, though, the platform has been criticised for discriminating against marginalised groups including people of colour, plus-size people and the LGBTQ+ community.

“Instagram emailed me an apology, bless, saying it was an error and that they are working to fix it. They also said that even the people that tried to share my post will be able to,” Celeste Barber said. Instagram released a statement that read, “This shouldn’t be happening and we are committed to addressing any inequality on our platforms. We expect to update our breast covering policies very soon, to make sure all body types are treated fairly.”

https://www.instagram.com/p/CGhYOBmn9XP/

While this “error” has been fixed for now, Celeste Barber’s isn’t a standalone incident of Instagram’s discriminatory algorithm coming to the forefront. Instagram’s community guidelines seem straightforward. “Photos of post-mastectomy scarring and women actively breastfeeding” are cool, but “content that show sexual intercourse, genitals, and close-ups of fully-nude buttocks” isn’t okay. However, what Instagram flags as sexualised content comes laced with all sorts of biases.

Semi-nude photos of big women are considered “sexual”, while similar or more revealing images of thin silhouettes are okay. Black women are moderated and white women aren’t. In fact, researchers believe that Instagram shadow bans certain sections of users and restricts the reach of their content. Playboy’s feed is full of thin, white women, and those images are pretty much all still up.

Salty, a digital platform, conducted research to delve deeper into this problem, and its findings show that social media is nothing like the equal space it was sold as. “Risque content featuring thin, cis white women seems less censored than content featuring plus-sized, black, queer women – and cis white men appear to have a free pass to behave and post in any way they please, regardless of the harm they inflict. There are few options for stopping harassment on the platform. Female Founders tell us they are banned from advertising femme focused products, while men’s products in the same categories can. We know that this is happening around us, there is a myriad of personal stories and experiences shared online, but it is hard to find definitive, data-driven, evidence that this is a systemic problem on Instagram. So we decided to get some,” the Salty report reads.

Last year, a few Salty ads featuring “fully clothed BIPOC, disabled, plus-sized and trans women” were flagged as “promoting escorting services” by Instagram. What even? “We collected data from our community to better tell the story of the way these algorithms affect us, and formulate recommendations to make FB/ Instagram a safer place for women, trans and non binary people,” Salty states.

https://www.instagram.com/p/CDRUNf9gOvt/?

The report identified the demographic groups that suffered the most due to Instagram’s biased censorship. “Salty distributed a survey to its followers on Instagram and via newsletter. As a community for and by women, nonbinary people, and queer folx, Salty’s following is heavily composed of these demographic groups,” the report says.

Talking about being shadow banned, a user told Salty, “My posts have been banned from showing up in hashtags because I used a hashtag associated with pornography. So my account has been labeled as pornographic and no longer shows up in searches. I use tags such as body positive, body image, HAES, black girl magic.” The report also said that plus-sized women were often flagged for posting sexual content or nudity.

In fact, while Instagram’s policies are meant to discourage racism, the report revealed that it was often the victims of racism who suffered, not the attackers. Victims of racism had their content deleted or shadow banned, possibly after being reported by trolls. Similarly, pages with feminist content also reported being shadow banned.

“The high incidence of accounts that were reinstated after deletion shows a high rate of false flagging for users in our data set,” the Salty report reveals. Clearly, Instagram’s AI is glitching, and like Barber said, it’s 2020; by now, the platform should have had things in place.

ALSO READ: AI Transforms Pictures Of Clothed Women Into Nudes That Are Realistic. Nope, The Women Don’t Know Their Pictures Are Being Used

We are way past entertaining a social media platform like Instagram having an algorithm that discriminates on the basis of colour, body type and identity. Without editorial oversight, Instagram’s AI is not helping the community. Instead, it is becoming a digitised version of the toxic, regressive offline world we are supposed to have left behind. Today, we are talking about and encouraging body positivity. The world is roaring against racism, with #BlackLivesMatter becoming a global movement. Even a developing country like India has fought and decriminalised same-sex relationships. Instagram’s algorithm needs to step up! And we don’t need more women to be body-shamed for that to happen.

ALSO READ: Bipasha Basu Drops Truth Bombs On Colourism, Body shaming And Stereotypes. We Love Her Candid Nature

Akanksha Narang
