Dear Mr. Zuckerberg,

Since launching our underwear brand last year we have consistently had our content blocked for violating Meta’s policies.

This is because your algorithms deem our content — women in underwear — to be sexual content.

For the last year we have justified this censorship to ourselves; at the end of the day, it's intended to keep us safe, right?

We even got creative, shooting our underwear on fruit instead of women. Like so many brands, we have to be inventive to work around these limitations.

But the thing is, Mr. Zuckerberg, we want to show our products on real bodies.

We want to show people of all shapes and sizes how our products fit. And the way Meta's algorithm is configured does not allow us to do this.

We're hyper-aware that deeming women in underwear sexual content is not an ideology you thought up, but one that is deeply rooted in society. Indeed, we have faced similar treatment on YouTube and TikTok.

We're just a small brand, with an even smaller marketing budget, standing in front of one of the world's most powerful conglomerates asking for change.

We live in a world where there's too much focus on how we look, and I'm sure you'll agree it's time to prioritise how we feel.

We firmly believe, Mr. Zuckerberg, that the solution does not lie with us, but with you. Meta actively leading on these issues could be revolutionary, and we are certain it would be rewarded for doing so.

We would love the opportunity to continue this conversation and work collaboratively towards a solution that supports creators, small brands and activists alike in using your platforms as intended: to connect, collaborate and ignite change in a world that is so desperately seeking it.

You have our number.


Amanda & Katie

Pantee Founders




Sarcasm aside, there’s a real issue here: our society, especially our social media platforms, automatically sexualise images of women. And it’s got to stop. 

On a daily basis, our social media ads are blocked because we show our products in use — images of people in our ridiculously comfortable, sustainable underwear. The carefully programmed algorithms “see” skin — the edge of a bum cheek, the curve of a boob — and block our ads. 

It’s not just a Pantee problem either; many other brands experience the same issues. A report by the Center for Intimacy Justice found that Facebook has rejected ads from over 60 companies for products and services related to menopause, periods, breastfeeding workshops and consent education. 

This is not sexual content; it’s helpful content, and it’s being censored. Social media has an incredible influence on how we see the world, and images of normal bodies and the products associated with them are being excluded. Just because a product or service is for women, doesn’t mean it should automatically be considered sexual.

It’s an industry-wide issue, and it’s not just the algorithm bots to blame; women and their bodies are called out as inherently sexual in platform policies too. 

Facebook’s adult content policy lists examples of “sexually suggestive content”, calling out cleavage and “images focused on individual body parts… even if not explicitly sexual in nature”. 

YouTube’s policy puts “clothed breasts” in the same category as videos containing masturbation or bestiality; even new kid on the block TikTok specifically bans “female nipples or areola” in their community guidelines.

Instagram loosened its rules slightly after a brilliant, passionate letter from model Nyome Nicholas-Williams pointed out its censorship bias against plus-size, non-white bodies, but it still bans nudity, with a few exceptions for breastfeeding and post-mastectomy scarring.

What’s scary is that these social media policies are just the tip of the iceberg — they reflect the world they operate in. This attitude towards images of women points to something much wider, more fundamental, more concerning: a culture of sexualising women by default, reducing them to what’s on the surface. 

You see it across all types of media and industries, whether it's reams of articles on Theresa May's clothes whilst she was Prime Minister or TV interviews where Serena Williams is asked why she isn't smiling enough instead of about the match.

We’re trained to think about and comment on how women look first. As well as overlooking women’s skills and personality, this sends the message that it’s not about how a woman feels or even how she perceives herself; it’s about how she’s perceived by other people. 

That was traditionally the case for women's underwear. Ads typically featured slim women in lacy red bras, popping up around Valentine's Day, and focused very much on the person who'd be looking at the underwear rather than the one wearing it.

The Facebook algorithms seem to work from this old-school stereotype. Fearful of "smut", they've taken the easy route: showing skin is bad, especially women's skin. It sounds like something out of The Handmaid's Tale, not real life in 2022.

It also creates a vicious circle: images of women in underwear get banned from social media, so the only place you'd end up seeing these types of images is adult websites. That cements the idea that a woman in underwear is sexual, existing for someone else's pleasure, and the cycle begins again.

So how do we break it? Awareness and representation. 

We need images of underwear that aren’t all lace and red hearts. We need to see images of different body types. We need to be able to reach and talk to women about the products designed for them.

We all have a role to play. Let’s stop commenting first on women’s bodies and how they look. Let’s stop sexualising women and images of women.

Just as individuals need to take personal responsibility for not automatically sexualising women, so do social media platforms that not only reflect but shape the world we live in. 

We’re calling on Facebook, Instagram, TikTok and YouTube to take a long, hard look at the assumptions behind their algorithms and policies. Instead of just updating them so they’re fit for 2022, why not lead real change in the way we see and talk about women and create policies fit for the future?