By: Jen DuBois

Unconscious Bias – What Is It and Why Is It Bad for Data Science?

Bias, specifically “unconscious bias,” has become the new “B” word in tech recruitment and hiring because it works against diversity.

A growing body of evidence, such as the McKinsey findings depicted below, along with other expert opinions, identifies many benefits of diversity in data science teams.

[Image: McKinsey research findings on the benefits of diversity]

Companies are also under social and legal pressure to increase diversity, so they’re rushing to recruit “Diversity Managers” tasked with implementing plans to promote diversity and inclusion within the organization.

Despite all this, statistics show that tech and data science-driven companies lack employee diversity. The data visualization below, created by Information is Beautiful, demonstrates this (for the most up-to-date data, follow this link).

With the exception of Amazon and perhaps Apple, tech companies still have a ways to go.

The Benefits of Diversity for Data Science Teams

Quite a bit of research demonstrates the cultural, performance, and financial value of promoting diverse teams. Like the previously mentioned McKinsey study, a 2018 BCG study found that increasing the diversity of teams leads to more and better innovation and improved financial performance.

Companies like Intuit attest to the benefits of diversity in data science, as Intuit’s Director of Technical Programs Kavita Sangwan expressed,

“Delivering relevant, seamless, and high-value experiences requires a deep understanding of customer needs and perspectives… It’s why we place such importance on diversity and inclusion.”

People often think of data science skills as technical skills such as computational mathematics, engineering, and data analysis. However, soft skills are equally important, if not more so.

Being a team player, a collaborator, and a good communicator allows data scientists to operate better in a team setting, interacting with product managers, businesspeople, and other technical staff across a wide variety of projects. It’s hard to find all of these qualities in just one person or one type of person.

To build all of these skills into a data science team, you need talent diversity.

Sure, you might be comfortable hiring from the same three university statistics and computer science degree programs because the graduates’ skills are predictable and the model feels safe to you and your HR department.

But how much diversity of thinking and skills does that approach bring to your organization and your data science teams?

What is Unconscious Bias?

The lack of employee diversity in data science and tech has in large part been attributed to unconscious bias in the hiring process.

Here’s the Kirwan Institute for the Study of Race and Ethnicity at Ohio State University’s definition of unconscious bias:

“the attitudes or stereotypes that affect our understanding, actions, and decisions in an unconscious manner… these biases, which encompass both favorable and unfavorable assessments, are activated involuntarily and without an individual’s awareness or intentional control.”

Unconscious bias is not just the racial or gender bias that lurks below our conscious thoughts of others. It’s also that “gut feel” that so many hiring managers trust and potentially use to overrule any “red flags”.

Many other unconscious biases come in the form of judgements based on height, weight, hair color, clothing style, manner of speech, surname, physical appearance, university degree, introversion versus extroversion, or eye contact (or the lack thereof).

It’s important to understand that unconscious bias developed in our brains to help our ancestors make quick groupings and decisions for survival, based on previous experience and (un)familiarity.

So it’s good for survival, but maybe it’s not so good for promoting openness to new and diverse experiences or for recruiting different types of people to a data science team. Unconscious bias tells interviewers to surround themselves with people whose backgrounds are similar to their own and who don’t cause discomfort.

Why Unconscious Bias Exists in Data Science Recruitment

This is why unconscious bias is especially prevalent in the field of data science. Over the years, the majority of data scientists and software engineers doing the hiring in the tech industry have simply been hiring people who are like them. Of course there are exceptions to this generalization, but research abounds to support this notion.

Imagine your stereotypical employee working in tech or data science. Roughly 75% of data scientists and analytics team managers are male, and a large proportion are white or Asian. Many have a master’s degree or higher in a specific field such as statistics, a requirement that may be out of reach for women raising a family or for minorities who cannot access this level of education.

A typical data science interviewer, even if they are open to people of all backgrounds, might subconsciously expect a well-educated white or Asian male to walk into the room, because that is their experience (remember our ancestors) of who a data scientist is. Likewise, HR may unwittingly screen out candidates who don’t meet this profile in an attempt to send hiring managers the candidates they know those managers will be comfortable with.

If the candidate who comes to the interview does not fit this common profile, unconscious bias can immediately creep into the interviewer’s thoughts. Their brain naturally says, “Wait, this person does not fit my previous experience working with data scientists. Can they be grouped into my team?”

Further feeding this unconscious bias is the fact that if the interviewer is left to come up with their own line of questioning, they may find that they do not have that much in common with, say, someone born and raised in India.

The primitive instinct that is unconscious bias tells them, “This person is not familiar to me and does not fit my grouping of ‘data scientists’.”

Unconscious Bias in Tech Skills Assessment

When interviewing job candidates, technical managers want to believe that their judgement of a person’s skills is a good enough predictor of a candidate’s potential to succeed. After all, the hiring manager is an experienced expert in a highly technical field.

Technical hiring managers assume that they are most capable of assessing a candidate’s skills in whatever way they feel is best: asking technical questions and stumpers, giving a candidate scenarios and problem-solving situations, and/or asking them to present their technical skills in a variety of ways, such as a take-home challenge.

It’s natural for hiring managers to want to interact with a candidate directly in this way to satisfy their personal need for “proof” of the skills to do the job and of cultural fit.

However, when we examine the data on the accuracy of face-to-face and unstructured interviews, it becomes clear that this approach isn’t always a good predictor of candidate success.

Structured Interviews – The Key to Reducing Unconscious Bias in Data Science Recruitment

Numerous studies have shown that structured interviews can be up to 81% more accurate at identifying the right candidate than unstructured ones.

That’s a huge improvement in hiring outcomes, especially with respect to costly data science hires!

The reasons for this have been widely studied and summarized by organizational psychologists.

Primarily, unstructured interviews give the interviewer the freedom to explore details about the job candidate as they see fit. In academic research terms, hiring managers have been found to have a propensity for “sensemaking”: they believe that they can make sense out of whatever the interviewee says.

Add this sensemaking philosophy to a highly technical interview on the wide variety of subjects that data science touches upon, and you have a recipe for unconscious bias.

The fact is that while unstructured interviews provide a way for hiring managers to form relationships organically with candidates through conversation and check for cultural fit, research shows these types of interviews are not a good method of assessing talent.

Even for data science skill assessment, unstructured interviews frequently depend on common interests or “chemistry” between the candidate and the interviewer, and most technical people conducting interviews have no formal training in interviewing for hiring.

How can companies make sure that candidates are being assessed confidently and fairly for data science and engineering skills and capabilities?

We’re not proposing that companies do away with face-to-face interviews by any means. However, we are suggesting they add structure and objectivity to their technical recruitment and interview process.

Skills Assessments Can Help Reduce the Impact of Unconscious Bias

There are several ways to introduce structure into the interview process. Objective, standardized skills assessments, and the candidate data they produce, are one way to bring structure and transparency to the evaluation of technical skills.

These assessments can help Human Resources and hiring managers ensure that job candidates are competent in the exact technical skills required for a job role and that they also have the capability to grow their skill set in the role.

Assessments also allow hiring managers to compare apples to apples when every candidate is given the same assessment. By contrast, an ad hoc set of technical or personal questions, or a data challenge presented in person, opens up the assessment of a candidate’s technical skills to other types of judgement and “noise,” such as physical appearance or shyness.
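To make the “apples to apples” idea concrete, here is a minimal sketch in Python. The candidate names, skill categories, scores, and weights are purely illustrative assumptions of ours, not QuantHub’s actual scoring method; the point is simply that when every candidate completes the same assessment, their results can be compared on one common rubric.

```python
# Hypothetical example: every candidate completes the *same* assessment,
# so their per-skill scores live on a common scale and can be compared directly.
candidates = {
    "Candidate A": {"statistics": 78, "python": 85, "sql": 62},
    "Candidate B": {"statistics": 91, "python": 70, "sql": 88},
    "Candidate C": {"statistics": 66, "python": 94, "sql": 75},
}

# Weights reflect what the role actually requires (illustrative values only).
role_weights = {"statistics": 0.5, "python": 0.3, "sql": 0.2}

def weighted_score(skill_scores, weights):
    """Combine per-skill scores into a single role-specific score."""
    return sum(skill_scores[skill] * weight for skill, weight in weights.items())

# Rank candidates by the same rubric, independent of interview "chemistry".
ranking = sorted(
    candidates.items(),
    key=lambda item: weighted_score(item[1], role_weights),
    reverse=True,
)

for name, scores in ranking:
    print(f"{name}: {weighted_score(scores, role_weights):.1f}")
```

Because every candidate answered the same questions, the differences in these scores reflect differences in demonstrated skill rather than differences in which questions happened to be asked or how the conversation felt.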

A bonus of standardized skills assessments is that they provide a snapshot of the relative strengths and weaknesses of the candidate who is eventually hired. This snapshot provides a baseline for targeted investment in that employee’s future learning and development, which can aid in the retention of a more diverse set of employees, especially those from minority groups who tend to score lower in certain skill areas.

Finally, a structured and repeatable skills-vetting process makes it easier for everyone involved to fill positions over time and increases confidence in decision making. This is important for the field of data science, where talent is still in short supply.

Conclusion

At QuantHub, we’re paying close attention to this issue of unconscious bias and what diversity and HR managers are trying to do about it. It’s an issue our customers – data and analytics leaders – are struggling with when working with their HR departments to fill data science and data engineering roles. We’d like to help.

Between the shortage of data talent and the clear need for diversity and inclusion in the industry to promote innovation and a data-driven culture, companies need to cast a wider net when recruiting.

They also need to start reviewing existing hiring practices that rest on factors contributing to unconscious bias, such as “gut feel”, unstructured conversations with random questioning, and adherence to rigid screening requirements and strict job descriptions.

Skill assessments are one way to address these issues.

Interested in knowing more about QuantHub’s role in reducing bias? Check out our diversity scholarships and blogs about women in data science or read a summary of our platform’s bias audit.
