We all harbor biases — subconsciously, at least. We may automatically associate men with law enforcement work, for example, or women with children and family. In the workplace, these biases can affect managers' hiring and promotion decisions.

So when Pete Sinclair, who's chief of operations at the cybersecurity firm RedSeal, realized that — like many other Silicon Valley companies — his company had very few female engineers and few employees who weren't white, Chinese or Indian, he wanted to do something about it.

"I was trying to figure out, 'How do I expand my employment base to include those under-represented groups?' Because if we do appeal to those, we'll have more candidates to hire from," he says.

Sinclair figured the company was either turning off or turning down these minorities, so he turned to another software startup called Unitive, which helps companies develop job postings that attract a range of candidates, and helps structure job interviews to focus on specific qualifications and mitigate the effect of interviewers' biases.

Companies often err by using phrases like "fast-paced" and "work hard, play hard," which telegraph "mainstream male," says Unitive CEO Laura Mather. Instead, she encourages firms to use terms like "support" and "teamwork," which tend to attract minorities, in job descriptions.

Such adjustments seem to have worked for RedSeal: Sinclair says job applications shot up 30 percent, and the percentage of women among the company's three dozen engineers has doubled.

"Our last hire was a Middle Eastern woman who would've frankly, in the past, never applied for the job much less gotten hired, just because she didn't fit the mold of people we hired," he says. "And she's turned out to be one of our top team members."

Sinclair says the motivation to diversify wasn't altruism. His company competes with Facebook and Google for talent, so it had to look off the beaten path and draw from a more diverse pool.

The idea that everyone makes automatic, subconscious associations about people is not new. But recently companies — especially tech firms — have been trying to reduce the impact of such biases in the workplace.

Unitive's Mather says companies realize groupthink is harmful to the bottom line.

And research shows that "getting in different perspectives into your company makes your company more innovative, more profitable, more productive," Mather says. "All kinds of really great things happen when you stop making decisions based on how much you like the person's personality."

Unitive's software is based on social science research, including work by Anthony Greenwald, a psychologist at the University of Washington who developed the seminal Implicit Association Test in the 1990s. The test measures how easy or difficult it is for test-takers to associate words like "good" and "bad" with images of Caucasians or African-Americans.

Greenwald has tested various words and race associations on himself. "I produced a result that could only be described as my having relatively strong association of white with pleasant and black with unpleasant," he says. "That was something I didn't know I had in my head, and that just grabbed me."

No matter how many times Greenwald took the test, or how he tried to game it, he couldn't get rid of that result. He was disturbed, and also fascinated. Research indicates that unconscious biases tend to stay constant, he says, making them very hard to address within organizations.

"People who are claiming that they can train away implicit biases," he adds, are "making those claims, I think, without evidence."

So rather than trying to get rid of biases, Greenwald and other experts advocate mitigating their effects. Companies could remove identifying information from resumes, for example, or conduct highly structured job interviews in which candidates are asked the same questions and scored on the same criteria.

Some organizations are trying such methods.

GapJumpers, for example, is a startup that helps companies vet tech talent through blind auditions, which test for skills relevant to the job. That allows companies to avoid asking for a resume, which might include clues to a person's race or gender, says Heidi Walker, a spokeswoman.

Plus, Walker says, "That allows the company to actually see how a candidate will approach and develop solutions on the job." And, she adds, half their applicants are women.

Still, unconscious biases can affect all sorts of workplace behavior and decision-making, so addressing them can be a challenge.

A year and a half ago, cloud-computing company VMware started training managers to identify their own unconscious biases, then began tracking the hiring, retention and promotion of women, who make up a fifth of its workforce. The company also analyzed whether biases had seeped into employee evaluations.

It's been an eye-opening process, says Betsy Sutter, VMWare's chief people officer. "We have more work to do. A lot more work to do."

Copyright 2015 NPR. To see more, visit http://www.npr.org/.

Transcript

ROBERT SIEGEL, HOST:

When it comes to the issue of discrimination, the focus is often on explicit forms of it - racial profiling, hate crimes or the gender pay gap. But psychologists say unconscious bias probably influences our everyday lives more. NPR's Yuki Noguchi reports that some companies, especially tech firms, are trying to prevent that from affecting hiring and promotion decisions.

YUKI NOGUCHI, BYLINE: Pete Sinclair is chief of operations at RedSeal, a cybersecurity firm that, like many other Silicon Valley companies, had very few female engineers and few employees who weren't white, Chinese or Indian.

PETE SINCLAIR: So I was trying to figure out, how do I expand my employment base to include those underrepresented groups? Because if we do appeal to those, we will have more candidates from which to hire from.

NOGUCHI: Sinclair figured the company was turning off or turning down these minorities. He turned to another software startup called Unitive, which helps flag words with male associations in job postings and helps structure interviews to focus on specific qualifications. Sinclair says job applications shot up 30 percent and, so far, the percentage of women among the company's three dozen engineers has doubled.

SINCLAIR: Our last hire was a Middle Eastern woman who would've never, frankly, in the past have even applied for the job much less gotten hired and she's turned out to be one of our top team members.

NOGUCHI: Sinclair says the motivation wasn't altruism. His company competes with Facebook and Google for talent so it had to look off the beaten path and draw from a more diverse pool.

The idea that everyone makes automatic, subconscious associations about people based on their socialization is not new, but only recently are companies focusing on developing ways to reduce its impact in the workplace. GapJumpers, for example, is a startup that helps companies vet tech talent through blind auditions. The auditions are tests of skills relevant to the jobs a person is applying for. Heidi Walker, a spokeswoman, says by using tests they avoid asking for a resume which might include clues to a person's race or gender.

HEIDI WALKER: That allows the company to actually see how a candidate will approach and develop solutions on the job.

NOGUCHI: And she says half their applicants are women.

Laura Mather is founder and CEO of Unitive. She says companies often err by using phrases like fast-paced and work hard, play hard, which telegraph mainstream male. Instead, she encourages use of terms like support and teamwork in job descriptions, which tend to attract minorities. Mather says companies realize groupthink is harmful to the bottom line.

LAURA MATHER: Getting in different perspectives into your company, the research shows, make your company more innovative, more profitable, more productive. All kinds of really great things happen when you stop making decisions based on how much you like the person's personality.

NOGUCHI: Unitive's software is based on social science research, including from psychologist Anthony Greenwald, who developed the seminal Implicit Association Test in the 1990s after testing various words and race associations on himself.

ANTHONY GREENWALD: I produced a result that could only be described as my having a relatively strong association of white with pleasant and black with unpleasant. That was something I didn't know I had in my head, and that just grabbed me.

NOGUCHI: No matter how many times Greenwald took the test or how he tried to game it, he couldn't get rid of that result. He says unconscious bias tends to stay constant.

GREENWALD: They haven't gone away in me. I know that they're still here. And people who are claiming that they can train away implicit biases are, I think, making those claims without evidence.

NOGUCHI: Greenwald and other experts advocate instead mitigating their effect - removing identifying information from resumes, for example, or doing structured job interviews where candidates are asked the same questions and scored on the same criteria.

A year and a half ago, cloud computing company VMware started training managers to identify their own unconscious biases, then started tracking their hiring, retention and promotion of women, who make up a fifth of its workforce. Betsy Sutter is VMware's chief people officer. She says they also analyzed employee evaluations.

BETSY SUTTER: There's some language used more or less with women versus men, and that's some pretty interesting results, but that's also been kind of eye-opening.

NOGUCHI: What have you noticed?

SUTTER: That we have more work to do - that we have a lot more work to do.

NOGUCHI: Understanding unconscious bias is one thing, she says; addressing it can be a challenge. Yuki Noguchi, NPR News, Washington. Transcript provided by NPR, Copyright NPR.
