Writing about people

There are a number of principles to follow when writing about people. These principles apply to identity, including disability, race, class, gender, and sexuality.

Principles#


The goal of all UX content is to be understandable to everyone, not just to the people paying for something or working in a certain industry, and to accommodate the many ways that people use products. Think and write by centering the person you’re writing to or about in a way that’s compassionate, inclusive, and respectful. Work to grasp the perspective of the most historically underinvested, and avoid “othering” groups of people. Methods like co-design and UX research can help.

Keep the following principles in mind when writing:

Use neutral, precise, relevant descriptions#

Only include personal qualities if they’re relevant and important. Write what you mean, then look back at what you wrote and consider whom you’re centering with your words. Doing this can reveal whom you’re leaving out. What’s the sentiment behind your words? Are you othering (viewing or treating someone as intrinsically different from and alien to yourself) certain groups?

Avoid terms associated with bigotry and violence#

Some terms should never be used in any context; the tables in the sections below list examples. Be cautious, too, of appropriating terms from marginalized communities. In this guide, we say “historically underinvested communities” to point out the main way that these groups of people have been marginalized around the world.

Be specific and kind#

Be on the lookout for proxy questions and statements, which appeal to generalizations and stereotypes. For example, saying “just buy more storage” is a proxy statement on economic status. Communicate from a place of equality, not condescension, and think about the worst-case interpretation of your words. Clear intent excludes fewer people and reduces bias.

Account for machine learning and AI#

When collecting data, first consider whether the information is actually needed; if it is, communicate why. Allow for both common and custom responses, self-identification, multiple selections, and the option to opt out of responding. Artificial intelligence learns only from the information we provide, so our inherent biases can easily end up encoded in training data. If content allows for variable or AI-provided information, consider how that may affect any copy. A sketch of this approach to data collection follows.
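For product teams, one way to encode this principle in a data-collection schema might look like the following minimal TypeScript sketch. All names here (DemographicQuestion, whyWeAsk, and so on) are illustrative assumptions, not a real Adobe or Spectrum API.

```typescript
// Illustrative sketch: a demographic question that allows common and
// custom responses, multiple selections, and opting out, and that
// records why the data is collected. All names are hypothetical.
interface DemographicQuestion {
  id: string;
  label: string;
  whyWeAsk: string;       // communicate why the information is needed
  options: string[];      // common responses
  allowCustom: boolean;   // free-text self-identification
  allowMultiple: boolean; // multiple selections
  allowOptOut: boolean;   // always offer "Prefer not to answer"
}

const pronounsQuestion: DemographicQuestion = {
  id: "pronouns",
  label: "What are your pronouns?",
  whyWeAsk: "Used only to address you correctly in shared documents.",
  options: ["she/her/hers", "he/him/his", "they/them/theirs"],
  allowCustom: true,
  allowMultiple: true,
  allowOptOut: true,
};
```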

Writing about disability#


Use neutral, precise, relevant descriptions#

Person-first language centers the person, not their qualities, by using those qualities as modifiers: “Design Adobe apps for people who use assistive technology.” Identity-first language, which some communities and individuals prefer, instead highlights the disability: “Design Adobe apps for deaf people.” No group universally prefers one over the other, so when you’re writing about someone, ask them how they want to be identified. Avoid euphemisms, like “differently abled,” and descriptors used as nouns, like “the disabled” or “the blind.”


Avoid words associated with bigotry and violence#

Some common phrases that imply negativity, such as “crazy” or “lame,” originated as slurs against people with disabilities. Never imply that a person is “suffering” from a disability or is a “victim” of a condition. Avoid appropriating terms from the disability community.

Preferred | Avoid
Ridiculous, incompetent, bad, unpredictable | Dumb, lame, tone-deaf, crazy
Keyanna is autistic. / Keyanna has autism, which affects the way they use Photoshop. | Keyanna uses Photoshop differently because she is suffering from autism.
Placeholder variable | Dummy variable
Amir uses a wheelchair. | Amir is confined to a wheelchair.
Unavailable, locked, turned off, deactivated | Grayed out
Coherence check | Sanity check
Organize your files | Be OCD about your files

Be specific and kind#

With imagery and language, avoid implying that a person has to look a certain way, be a certain size, or have a certain cognitive ability to do something. Depict more types of people as typical.

Preferred | Avoid
This tutorial teaches cropping and usually takes 5 minutes. | Follow this fast, easy tutorial.
This feature works best when you zoom out to 75%. | This feature isn’t for the vision-impaired.

Account for machine learning and AI#

Enter metadata with caution. For example, don’t tag a photograph of a child with words such as “crazy” or “weird.”

[Image: incorrect usage for machine learning and AI. A photograph with a metadata field labeled “Tags” contains the tag “Crazy.”]
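One way a team might operationalize this caution is to screen user-entered tags against a maintained avoid-list before they reach training data. The sketch below is a hypothetical illustration, not a Spectrum feature; the term list and function names are assumptions.

```typescript
// Hypothetical sketch: flag metadata tags that match an avoid-list so a
// writer can choose a more precise term before the tag is saved.
const avoidTerms = new Set(["crazy", "lame", "dumb", "weird"]);

function reviewTags(tags: string[]): { accepted: string[]; flagged: string[] } {
  const accepted: string[] = [];
  const flagged: string[] = [];
  for (const tag of tags) {
    (avoidTerms.has(tag.trim().toLowerCase()) ? flagged : accepted).push(tag);
  }
  return { accepted, flagged };
}

reviewTags(["portrait", "crazy", "outdoor"]);
// => { accepted: ["portrait", "outdoor"], flagged: ["crazy"] }
```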

Writing about race and class#


Use neutral, precise, relevant descriptions#

Let’s say you’re writing a persona. Don’t simply say that this person is a Southeast Asian woman and expect that to be a sufficient description — doing this is irresponsible UX design. Race is only pertinent to biographical and announcement-related content that involves significant, groundbreaking, or historical events, such as President Barack Obama being elected the first Black U.S. president. Capitalize nationalities, peoples, races (other than white, per the AP Stylebook), and tribes.

Avoid terms associated with bigotry, oppression, and violence#

Many phrases in American English stem from language around slavery. Adobe avoids using software terms such as “whitelist,” “blacklist,” “master,” and “slave.” Don’t use terms assigning value to racial characteristics, such as “dark pattern.”

Preferred | Avoid
Shared domains, approved people, targeted sites, allowlist (for coding constructs); for contextual clarity, use the format “(result in past participle form) (object)” | Whitelist
Blocked users, prohibited IP addresses, blocklist (for coding constructs); for contextual clarity, use the format “(result in past participle form) (object)” | Blacklist
Legacy | Grandfather clause
Futile undertaking, a project destined to fail | Death march
Primary, main, source (e.g., “main track”) | “Master” descriptors
Primary/secondary | Master/slave
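In code, the preferred terms from this table translate directly into identifier names. A minimal sketch, with all identifiers assumed for illustration:

```typescript
// Use "allowed"/"blocked" and "primary"/"secondary" naming instead of
// "whitelist"/"blacklist" and "master"/"slave". Identifiers are illustrative.
const allowedDomains = new Set(["adobe.com", "behance.net"]); // not a "whitelist"
const blockedIpAddresses = new Set(["203.0.113.7"]);          // not a "blacklist"

interface DatabaseTopology {
  primary: string;       // not "master"
  secondaries: string[]; // not "slaves"
}

function isDomainAllowed(domain: string): boolean {
  return allowedDomains.has(domain.toLowerCase());
}
```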

Avoid phrases rooted in belittling someone’s command of a language#

Many idioms and phrases in everyday speech are based on belittling people, or fictional characters, who communicate in English as a foreign language.

Preferred | Avoid
Welcome back | Long time no see
2-step process | Wax on, wax off
Sorry, something went wrong | No can do

Depict more types of people as typical#

Which of our users is the default, or belongs in the center? Throughout history, white-skinned people from Western cultures have been treated as the “standard,” while everyone else was pushed to the margins. Make diversity typical.

Preferred | Avoid
Dark brown, beige, tan, peach, etc. | Skin, nude (referring to a color swatch)
Critics | Peanut gallery
More names from non-white cultures (e.g., Ayesha, Ibrahim, Vignesh, Quynh) in a product or experience | Using only culturally white names (e.g., John, Bill, Karen, Amy) in a product or experience
Need more help? Contact us. | This app is so simple, anyone can use it.
[Image: correct usage. A coach mark introducing a new feature, “Introducing coediting,” shows 4 avatars of a diverse group of coeditors: a Black woman, an Asian man, a white man, and an Indian woman. The featured coeditor is Ayesha Wilson.]
[Image: incorrect usage. The same coach mark shows a non-diverse group of coeditors: 3 white men and 1 white woman. The featured coeditor is John Smith.]
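When demos, personas, or onboarding flows need placeholder people, the same principle can be applied programmatically by drawing from a culturally diverse name pool. This is a hypothetical sketch; the pool reuses names that appear elsewhere in this guide.

```typescript
// Illustrative sketch: pick placeholder names from a diverse pool rather
// than defaulting to "John Smith" in every mockup.
const sampleNames = [
  "Ayesha Wilson", "Ibrahim", "Vignesh", "Quynh",
  "Keyanna", "Amir", "Alejandra", "Jing", "Jamal",
];

function samplePlaceholderName(): string {
  return sampleNames[Math.floor(Math.random() * sampleNames.length)];
}
```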

Avoid appropriating words from historically underinvested communities#

Metaphors are all around us. When we borrow words or expressions that are important to someone’s identity, we downplay their importance and meaning. We can be descriptive and clear without appropriating language.

Preferred | Avoid
Built-in | Native
Meeting | Pow wow, circle the wagons
Vision statement, strategic statement, value proposition | Zen statement, Zen garden
Role model, kindred spirit | Spirit animal
Guide | Sherpa
Authority, expert | Guru, ninja

Writing about gender and sexuality#


Use neutral, precise, relevant descriptions#

Misogyny (particularly against women who are Black, trans, or both) is baked into society, and gender stereotypes are everywhere. The language we use can subtly reinforce or dramatically defy the patriarchy. Replace words like “wife,” “businessman,” or “waitress” with neutral alternatives. If you don’t know a person’s pronouns, make the phrase plural and use “they” rather than “he” or “she.” Singular “they” is also accepted, although the verb stays plural (e.g., “they are” = “that person is”).

Preferred | Avoid
Server | Waitress
Businessperson | Businessman
Flight attendant | Stewardess
They can fill out this form. | He/she can fill out this form.
A group of people, a group of women | Guys, girls, ladies
Parents | Moms
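In UI strings, the simplest way to honor this table is to write templates around the person’s name and singular “they,” so copy never has to guess a gender. A hypothetical sketch; the function name and string are assumptions, not Spectrum copy:

```typescript
// Illustrative sketch: a gender-neutral message template.
function invitationNotice(name: string): string {
  // Preferred: "They can fill out this form." Avoid: "He/she can..."
  return `${name} was invited. They can fill out this form.`;
}

invitationNotice("Jamal");
// => "Jamal was invited. They can fill out this form."
```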

Avoid terms associated with bigotry and violence#

To emphasize every person’s humanity rather than alienating folks who aren’t cisgender and heterosexual, remember that gender and sexuality descriptors are modifiers, not nouns (e.g., “transgender woman” rather than “a transgender,” “bisexual person” rather than “a bisexual”). A person’s pronouns are not opinion or preference, even if they may change over time. View Spectrum’s guidelines on pronouns.

Preferred | Avoid
Trans women | Trans-women
A transgender man | A transman
Transgender people, trans people | Transgendered people, transgenders, the transgendered, transexuals
Saadi is cis | Saadi is CIS
Alejandra, a lesbian woman | Alejandra is a lesbian
Jing, a woman / Jing, a non-binary person | Jing is a non-binary
What are your pronouns? | What are your preferred pronouns?
Jamal’s pronouns are he/him/his. | Jamal prefers he/him pronouns.
Intense, impassioned | Hysterical

Be specific and kind#

Avoid conflating sex (male/female) with gender (man/woman). Data collection and forms, while useful to product builders, can feel intrusive when asking about gender. For example, do you have any real reason to ask for a person’s gender after they download your app? When you really do need the information, allow for both common and custom responses, self-identification, multiple selections, and the option to opt out of responding. Avoid asking proxy questions (e.g., asking for someone’s gender when the information that is actually needed is their bike size).

Preferred | Avoid
Custom | Other
Bike frame size: 54-55 cm / 56-57 cm / 58 cm | Gender: M/F/Prefer not to say
Gender: Male / Female / Prefer to self-describe / Prefer not to answer | Gender: M/F/Prefer not to say
[Image: preferred usage. A picker labeled “Gender” shows the value “Female”; its menu options are Male, Female, Custom, and Prefer not to answer.]
[Image: incorrect usage. A picker labeled “Gender” shows the value “Female”; its menu options are Male, Female, and Other.]
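Modeled in code, the preferred picker above distinguishes self-description and declining to answer from the listed options. This is a minimal sketch under assumed names, not a real form API:

```typescript
// Illustrative sketch: a gender response type that supports common options,
// free-text self-description, and opting out.
type GenderResponse =
  | { kind: "option"; value: "Male" | "Female" }
  | { kind: "selfDescribed"; value: string } // "Prefer to self-describe"
  | { kind: "declined" };                    // "Prefer not to answer"

function parseGenderResponse(selection: string, freeText?: string): GenderResponse {
  if (selection === "Prefer not to answer") return { kind: "declined" };
  if (selection === "Prefer to self-describe") {
    return { kind: "selfDescribed", value: freeText ?? "" };
  }
  return { kind: "option", value: selection as "Male" | "Female" };
}
```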

Account for machine learning and AI#

Don’t assume that everyone is heterosexual, or teach artificial intelligence that this is the norm. It wasn’t until June 2018 that the World Health Organization (WHO) declassified being transgender as a mental illness, so machine learning and AI can still perpetuate biases that we as humans have unlearned. Also, models trained to classify more than 2 genders are rare, so anytime a person’s face is added, AI assigns them to one of those 2 categories. Don’t use AI or machine learning to guess genders based on image recognition, text analysis, or anything else. A sketch of one way to enforce this in a data model follows.
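One way to make that rule hard to break is to encode “self-reported only” in the data model itself, so no code path can populate gender from a classifier. A hypothetical sketch with assumed names:

```typescript
// Illustrative sketch: gender data carries a provenance field whose only
// legal value is "self-reported"; an inference path simply does not exist.
type GenderSource = "self-reported"; // deliberately no "inferred" variant

interface ProfileGender {
  value: string;        // the person's own words
  source: GenderSource; // the compiler rejects anything not self-reported
}

function setGenderFromClassifier(): never {
  throw new Error("Do not infer gender with ML. Ask the person instead.");
}
```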