Strategic HR Job advert bias
A different language
SARAH RONAN hears why job adverts can
discourage certain segments of the population,
and how to debias them
Are you looking to recruit a
‘dynamic leader’ or a ‘committed
people person’? Chances are you’re just
looking for the best person for the job.
But the choice of language in the
job description could be alienating the
best – and most diverse – candidates
and dissuading them from even applying.
Recent research from Adzuna
revealed that 60% of businesses showed
significant male bias in the wording of
their job adverts. This research was
based on a study by academics Gaucher,
Friesen and Kay, which found that job
descriptions with more masculine
wording were less likely to appeal to
female applicants. For the most part,
the research found, it wasn’t that
female candidates assumed they
weren’t up to the job. Rather, they
were – consciously or unconsciously –
less likely to feel they’d belong at such
an employer, and didn’t want to work
for a company whose first impression
was one of being biased in favour
of men.
And so debate on the issue is
hotting up. The UK government
recently announced a trial of
gender-neutral language to define science,
technology, engineering and
maths apprenticeships to encourage
more women to apply. A pilot will
apply gender-neutral language to
12 apprenticeship standards.
But while most HR leaders are aware
that biased language exists in job
descriptions, many don’t know how to
fix this. Part of the problem is an
inability to identify biased language
because of its subtlety. Words that
seem innocuous are often rooted in
societal conditioning.
A 2017 analysis of 77,000 UK job
adverts by Totaljobs revealed ‘lead’ to
be the most common male-gendered
word used in job specs, while ‘support’
was the most used female-gendered
word. According to Gaucher, Friesen
and Kay, popular recruiting adjectives
such as ‘ambitious, assertive, decisive,
determined and self-reliant’ are
male-gendered, while words like
‘committed, connect, interpersonal,
responsible and yield’ are considered
female-gendered. For instance, in a
male-gendered job description a
company might be described as ‘a
dominant engineering firm that boasts
many clients’. Whereas the
female-gendered version could read ‘we are a
community of engineers who have
effective relationships with many
satisfied clients’.
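The word-spotting step that tools like these perform can be sketched in a few lines. The snippet below is a minimal, illustrative checker only: it uses just the handful of male- and female-coded words quoted above, whereas the published Gaucher, Friesen and Kay lists (and commercial tools) are far more extensive, and the example advert text is invented.

```python
import re

# Small sample word sets taken from the examples in the article;
# the full research word lists are much longer (assumption: this
# subset is for illustration only).
MALE_CODED = {"ambitious", "assertive", "decisive", "determined",
              "self-reliant", "dominant", "lead", "boasts"}
FEMALE_CODED = {"committed", "connect", "interpersonal", "responsible",
                "yield", "support", "community"}

def code_words(advert: str) -> dict:
    """Return the male- and female-coded words found in a job advert."""
    words = set(re.findall(r"[a-z\-]+", advert.lower()))
    return {"male": sorted(words & MALE_CODED),
            "female": sorted(words & FEMALE_CODED)}

# Hypothetical advert echoing the male-gendered example above.
advert = ("We are a dominant engineering firm that boasts many clients, "
          "seeking an ambitious, decisive engineer to lead our team.")
print(code_words(advert))
# flags 'ambitious', 'boasts', 'decisive', 'dominant' and 'lead'
```

A real tool would also suggest neutral substitutes (e.g. ‘leading’ for ‘dominant’) rather than simply flagging words, but the matching step is essentially this.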
So how can HR de-bias a job
description to make the language
gender neutral? According to Andrea
Singh, HR director of BAM, the first
step is to focus on gender-coded
words. Job titles should be neutral and
descriptive language should give equal
weighting to male- and female-coded
descriptors, she explains. However,
Singh also points out that de-biasing a
job description goes beyond replacing
adjectives. Employers need to make
sure that the requirements listed are
actually necessary, because “women
will typically only put themselves
forward for a job when they meet
100% of the criteria”.
But with unconscious bias ever
present, there are questions over
whether humans can conduct this
de-biasing at all. Singh believes
that with the right training it is. But
she admits the best results come when
software and learning are combined.
“Technology brings information and
suggestions to the fingertips but job
specs need to feel authentic. The
people writing and editing specs
need to be trained to spot the bias too,”
she says.
However, Richard Marr, co-founder
and chief technology officer of
Applied, doubts whether training a
person to remove biased language can
be as effective as relying on dedicated
software. “The evidence is pretty weak
that training is effective,” states Marr.
“Processes trump training and tools
trump processes. With training
you’re just expecting people to do
the right thing.”
That said, the trouble with using
software is that neither Applied nor its
competitors AdPro and Textio
currently extend their job description
analysis beyond gender to include
other demographics such as BAME,
LGBTQ+, disabled or economically
disadvantaged candidates. Applied is
working with Google to expand its
analysis tool to incorporate ethnicity
(and other dimensions). But until such
tech is available removing gendered
language from job descriptions can
still have a positive impact on other
diverse groups, Singh believes.
“I think language can be looked at
in the same way. Masculine phrasing
might also be off-putting for
candidates from particular BAME
backgrounds where their culture
30 HR June 2019 hrmagazine.co.uk