Woke jargon and censorship

The power of language on the internet

Scrolling on social media lately, it seems an entirely new vocabulary has arisen. Of course, it has long been acknowledged that internet use can shape the words we use, especially in recent years, as slang, memes, and inside jokes rapidly cycle in and out of trend. However, what started out as a means of avoiding demonetization on platforms like TikTok and Instagram has now become its own distinct type of jargon.

Because of community guidelines on social media, creators who wish to discuss serious or potentially offensive topics often censor particular words in the subtitles of their videos to avoid demonetization and shadowbanning by the algorithm. This type of censorship is often referred to as ‘algospeak.’ If you’re on social media, you’ve likely encountered words like ‘unalived’ instead of ‘killed’ or ‘murdered,’ ‘seggs’ instead of ‘sex,’ or ‘sewerslide’ instead of ‘suicide.’ Sometimes, emojis are even used to indicate certain topics rather than the word itself, like the grape emoji for sexual assault or the corn emoji when referring to porn. The automated moderation systems on social media often limit the reach of queer creators, so they resort to using ‘le dollar bean’ (le$bian) or the rainbow emoji when discussing LGBTQ+ issues.

While using algospeak can help raise awareness about important issues despite censorship, it is important to note that this means of evading content moderation can also let discriminatory and harmful content slip through the cracks. Many alt-right content creators use a particular vocabulary to dog-whistle their beliefs. The same evasion tactics can also give a platform to content glorifying and promoting eating disorders and suicidal ideation. For example, content from the pro-ana community encourages behaviours associated with eating disorders and dissuades viewers from seeking help and recovery. Using algospeak to discuss serious mental health issues can create a false impression that these issues aren’t as severe as they are, and incorrect terminology and euphemisms can perpetuate stigma surrounding mental health. Social media has dramatically altered our discourse on mental health and political correctness. Now, with the rise of conservatism, even among Gen Z, the outlets for hateful and discriminatory speech to gain traction online have become vast.

Language constantly evolves over time, but with this development, new vocabulary as self-expression can also become a form of resistance. For example, the use of the word ‘cisgender,’ which remains politically controversial, normalizes identifying as transgender. It also prevents the erasure of the trans experience by insisting that the opposite of transgender isn’t ‘normal,’ as many alt-right creators claim. Recently, the word ‘unalive’ appeared on a plaque in Seattle’s Museum of Pop Culture detailing Kurt Cobain’s death, to many people’s dismay. Though the plaque has since been changed, this use of an online term in a formal setting was shocking to many. The museum’s curator explained that the term was chosen out of respect for those struggling with mental health. However, that it was used at all in this setting reflects how powerful social media is in dictating the vocabulary we use, especially when it comes to showing sensitivity towards more serious topics.

Furthermore, algospeak can provide a form of resistance against the stifling of voices in the fight against genocide, such as the use of the watermelon emoji to ensure that content raising support for Palestine isn’t shadowbanned. Many creators in and outside of Palestine use emojis and censored words not only to raise awareness, but also to raise funds for victims of the genocide. These videos gain traction because social media users attempt to outsmart the algorithm: commenters type random sentences about trending topics, such as Labubus and Dubai chocolate, so that the content gets pushed to a wider audience. Using trending words and algospeak to resist the silencing of Palestinian voices on social media, and to expose what Western media fails to, shows clearly that this form of vocabulary can act as resistance.

Words have power. Institutions run on language. And altering the language of the majority with a new vocabulary engenders systemic change, for better or for worse. Systemically altering language and limiting accessible outlets for expression are tools of dehumanization and oppression. However, social media can be wielded against this through the rapid spread of new vocabulary. Changing the ways we speak fosters genuine change in the ways we think. And when systems must evolve to match, it is this change that makes the difference.