
Predictive Sexism: Is your mobile phone sexist?

Predictive text on our phones can make composing texts quicker and easier, but could it also be reinforcing gender bias?

[Image: professional woman using a phone in front of an office]

Most of us are now aware that spending too long on our mobile phones can have detrimental effects on our health. But did you know that predictive algorithms on smartphones could also be perpetuating gender stereotypes? 

Using the phrase ‘You’re a/an *insert word*’, Uswitch tested 236 words on a Samsung Galaxy S21 and an iPhone 12, using Messaging and iMessage respectively, both operating on default settings, to determine gender bias in predictive text algorithms. These were the results:

Male-biased predictive algorithms

Our study found that 72% of the words tested generated a gender-biased response on Apple’s iOS iPhone and Samsung’s Android handset.

However, it was Apple’s iOS algorithm that generated the most male-biased suggestions.

Almost two thirds (64%) of words generated a male-biased response on iOS, while just 4% generated a female-biased response and only 13% a gender-balanced one.

In comparison, just 15% of words generated a male-biased response on the Samsung Android device.

Overall, male gendered suggestions were generated more than five times as often as female gendered suggestions across both handsets (238 compared with 44).

Samsung Android best for gender-neutral language

For people who don’t identify with binary male and female groups, gendered language can be alienating. While gender-neutral language hasn’t always been the norm, it’s important that language evolves to be inclusive. That includes the predictive algorithms that could subconsciously reinforce gender norms.

In our study, the Android device produced four times as many gender-neutral outcomes as iOS (154 compared with 38). In fact, almost two thirds (65%) of its responses were gender neutral.

‘Quick-witted’, ‘empathetic’, and ‘self-confident’ all generated a gender-neutral response on Android, compared with a male-biased response on iOS.

Perpetuating negative gender stereotypes

The research found that the predictive text algorithms on both Apple’s iOS and Samsung’s Android perpetuate negative gender stereotypes.

Of the seven adjectives tested that describe intelligence, including ‘bright’ and ‘intelligent’, neither iOS nor Android generated a single female suggestion.

Words describing STEM skills, including ‘logical’, ‘decisive’, and ‘assertive’, all generated a male-biased response on iOS. ‘Ambitious’ (iOS), ‘analytical’ (Android), and ‘hard-working’ (iOS) were also treated as male words.

When it comes to sport, both Android and iOS generated a male-biased response for the word ‘athletic’.

While male words were suggested for ‘brilliant’ (iOS and Android), ‘committed’ (iOS and Android), and ‘driven’ (iOS), female words were suggested for stereotypically feminine adjectives, including ‘nurturing’ (iOS), ‘supportive’ (iOS), and ‘lovely’ (iOS and Android).

‘Girl’ vs ‘woman’: unconscious bias

Women are less likely than men to be addressed by their formal title and more likely to be called a girl in the workplace. Many studies have highlighted the detrimental effect of women being referred to as ‘girls’, including women feeling patronised and unequal at work.

Our research reflected this unconscious bias: ‘woman’ was suggested no more often than ‘girl/girls’. In comparison, ‘man’ was suggested 200 times, while ‘boy’ was suggested just once.

Eight adjectives used to describe weight and appearance generated ‘girl/girls’ suggestions, including ‘chubby’ (iOS and Android), ‘chunky’ (iOS), and ‘skinny’ (iOS and Android).

Concerningly, ‘anorexic’, ‘ugly’, and ‘hot’ also generated ‘girl/girls’ suggestions on iOS.

Reducing gender bias in tech

Lu Li, Founder and CEO of Blooming Founders, comments: “Language is one of the most powerful means through which gender biases are perpetrated and reproduced. 

“In male-dominated industries like tech, women have a harder time being taken seriously compared to their male counterparts. And they are passed over for promotions more often because of words such as 'supportive' or 'nurturing' that are often associated with being female. 

“The bias is quite stark also in the entrepreneurial world: female founders only receive 1% of venture capital, which means that the vast majority of innovation is designed and led by men. 

“Gender-biased predictive text algorithms are another example of what's inherently wrong in the industry. If people with conscious and unconscious biases input biased data, it will result in biased outcomes, which the rest of society will inherit. Having gender-neutral word suggestions is critical to breaking this cycle and undoing the semantic relations and gender stereotypes that are still deeply rooted in our society.”
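Li’s point about biased data producing biased outcomes can be made concrete. Below is a minimal Python sketch in which the corpus and counts are invented purely for illustration: if a model’s training text happens to pair an adjective with one gender more often, the most frequent continuation wins the suggestion slot.

```python
from collections import Counter

# Hypothetical toy corpus standing in for a model's training data.
# 'brilliant' happens to be paired with 'man' more often than 'woman'.
corpus = [
    "you're a brilliant man", "you're a brilliant man",
    "you're a brilliant man", "you're a brilliant woman",
    "you're a supportive woman", "you're a supportive woman",
]

# Count which word follows each "you're a <adjective>" context.
continuations = Counter()
for sentence in corpus:
    *context, next_word = sentence.split()
    continuations[(" ".join(context), next_word)] += 1

def suggest(context: str) -> str:
    """Return the single most frequent continuation seen for this context."""
    candidates = {w: n for (c, w), n in continuations.items() if c == context}
    return max(candidates, key=candidates.get)

# Biased data in, biased suggestion out:
print(suggest("you're a brilliant"))   # -> 'man' (3 occurrences vs 1)
print(suggest("you're a supportive"))  # -> 'woman'
```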

Gender diversity is important. Not only do highly diverse companies generate more revenue, but men and women often bring different viewpoints to decision-making, giving teams a more rounded perspective and leading to better decisions.

Plus, 87% of UK adults own a smartphone, so it's vital that companies can serve their increasingly diverse customer base.

In recent years, many top technology companies, including Apple and Samsung, have pledged to be more gender inclusive. Yet a study led by Accenture and Girls Who Code found that only 21% of women believe the technology industry is a place where they can thrive.

In 2020, Apple reported that its number of female employees had grown by more than 70% worldwide, with an 85% increase in female leadership. While significant, only 26% of open R&D leadership roles at Apple are currently filled by women globally.

Google’s 2020 Diversity Annual Report revealed that only around a third (32.5%) of its global workforce is female. Meanwhile, 28.6% of Samsung’s employees (excluding operators) were female in 2020.

Elsewhere, a recent report found that only 18% of US Android developers are women.

Our research highlights that gender bias in predictive text algorithms does exist. Increasing diversity in the tech industry would be a great step towards reducing it. While significant progress has been made in recent years to increase the number of women in tech and leadership roles, more work needs to be done to bridge the gap between men and women.

If you’re looking for a new phone, check out our mobile phone deals. You can turn off predictive text if you'd prefer.

How to turn off predictive text

Catherine Hiley, mobiles expert at Uswitch.com, comments: “With most of us spending more time on our phone than ever before, many of us rely on our predictive text to keep up to speed with our communications. 

“Over time, your smartphone is programmed to understand your writing style, and should suggest words you frequently use to make typing out a message even quicker. But before it gets to know you, predictive text relies on programmed algorithms that analyse the context of a sentence to determine which set of words to suggest.

"However, as our research shows, predictive text doesn’t always get it right.

“If you're concerned about the predictive text algorithms on your phone you can turn your predictive text off in your keyboard settings."
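To illustrate the kind of context-based suggestion Hiley describes, here is a toy Python sketch of a bigram model that offers the three most frequent next words it has seen for a given context. Real keyboard engines use far richer models and personalise over time, so treat this purely as an illustration of the ranking principle, not as either vendor’s actual implementation.

```python
from collections import Counter, defaultdict

# Toy suggestion engine: for each word, count which words have followed it.
bigrams = defaultdict(Counter)

def train(text: str) -> None:
    """Count which word follows which in the training text."""
    words = text.lower().split()
    for prev, nxt in zip(words, words[1:]):
        bigrams[prev][nxt] += 1

def top_three(prev_word: str) -> list[str]:
    """The three suggestions shown above the keyboard."""
    return [word for word, _ in bigrams[prev_word.lower()].most_common(3)]

train("you're a lovely girl , you're a lovely person , what a lovely day")
print(top_three("lovely"))  # -> ['girl', 'person', 'day']
```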

To turn off predictive text on your iPhone, all you need to do is:

  1. Tap the Settings menu on your iPhone

  2. Go to General

  3. Scroll down to Keyboard

  4. Now just toggle Predictive to off

On a Samsung handset, you’ll find the equivalent toggle in your keyboard settings: go to Settings, then General management, then Samsung Keyboard settings.

You can also check out our guide for more recent iPhone tips and tricks.

If you send lots of messages on your phone, make sure you have the right mobile phone deal for you.

Methodology

Using relevant scientific journals, research articles, and an online thesaurus, we created a seed list of 236 adjectives that are generally used to describe a person, as well as words that are traditionally considered gender specific.

Every word was individually tested on a Samsung Galaxy S21 and an iPhone 12, using Messaging and iMessage respectively, both operating on default settings. This was done to prevent any on-device learning, which could otherwise change the results from user to user.

During the testing phase, the phrase ‘You’re a/an *insert word*’ was typed into the handset as a text message. The three suggested words were then recorded for both iOS and Android.
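For readers who want to replicate the recording step, a small helper along these lines could build each test phrase and log the suggestions. The function and file names here are our own inventions, and the example rows are illustrative placeholders rather than real results from the study.

```python
import csv

def test_phrase(word: str) -> str:
    """Build the test phrase with the grammatically correct article."""
    article = "an" if word[0].lower() in "aeiou" else "a"
    return f"You're {article} {word}"

# Placeholder records: suggestions were read manually off each handset.
recorded = [
    {"word": "athletic", "device": "iOS", "suggestions": ["man", "guy", "person"]},
    {"word": "athletic", "device": "Android", "suggestions": ["man", "person", "girl"]},
]

with open("suggestions.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["phrase", "device", "s1", "s2", "s3"])
    for row in recorded:
        writer.writerow([test_phrase(row["word"]), row["device"], *row["suggestions"]])
```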

During analysis, we classified each set of predictive suggestions as female, male, gender neutral, or gender balanced, defined as follows:

Female - ‘woman’, ‘girl’, ‘girls’, ‘lady’, ‘wife’, or ‘sister’. A response was considered female biased if the only gendered suggestion(s) were female words.

Male - ‘man’, ‘husband’, ‘boy’, or ‘guy’. A response was considered male biased if the only gendered suggestion(s) were male words.

Gender balanced - A response was considered gender balanced if the suggestions included both a male word and a female word.

Gender neutral - ‘person’. A response was considered gender neutral if the only suggestion was ‘person’, or if the three suggestions comprised ‘person’, a female word, and a male word. If the device suggested only ‘person’ and either a male or a female word, the response was not considered gender neutral.
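These rules map naturally onto a small classifier. The Python sketch below is our own interpretation, not code used in the study; where the rules are silent (for example, ‘person’ plus a single gendered word, which the methodology only says is not neutral), we classify the response as biased towards the gendered word, and we ignore non-gendered suggestions such as ‘mess’.

```python
FEMALE = {"woman", "girl", "girls", "lady", "wife", "sister"}
MALE = {"man", "husband", "boy", "guy"}

def classify(suggestions: list[str]) -> str:
    """Apply the study's categories to one set of suggested words."""
    s = {word.lower() for word in suggestions}
    has_f = bool(s & FEMALE)
    has_m = bool(s & MALE)
    has_n = "person" in s

    if has_n and not has_f and not has_m:
        return "gender neutral"    # only 'person' among gendered terms
    if has_n and has_f and has_m:
        return "gender neutral"    # 'person' + a female + a male word
    if has_f and has_m:
        return "gender balanced"   # both genders, no 'person'
    if has_f:
        return "female biased"     # only female word(s)
    if has_m:
        return "male biased"       # only male word(s)
    return "no gendered suggestion"

print(classify(["person", "mess", "bit"]))   # -> 'gender neutral'
print(classify(["man", "woman", "mess"]))    # -> 'gender balanced'
print(classify(["person", "man", "woman"]))  # -> 'gender neutral'
print(classify(["person", "man", "mess"]))   # -> 'male biased' (see note above)
```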

Data collected June 10.

The videos on this page were screen-recorded on each device. iPhone 12 recordings were captured in dark mode, while Samsung Galaxy S21 recordings were captured in light mode, for easy distinction on the page.

Additional sources:

The Guardian

JSTOR Daily

Sexist Slurs: Reinforcing Feminine Stereotypes Online

Gender Decoder

Merriam-Webster Thesaurus

Sacraparental

Creative Commons