How AI Increases Female Representation in Tech

It really is possible for AI to help increase female representation in tech.

Female representation in tech

Gender diversity in the tech industry has been a major issue for over two decades now.

The numbers don’t add up. The EEOC reports that women currently make up roughly 56% of the overall workforce. However, only about 28% of proprietary software jobs are currently held by women.

Why is it important to have women in tech?

Various reports, including one from Catalyst and one from McKinsey, have shown that companies with more female leadership tend to outperform both their market and their rivals. An additional study, from the Peterson Institute for International Economics, showed that companies with 30% or more women in leadership outperformed rivals by an average of 6% in net profit margin.

Women are often associated with empathetic leadership. This is not true of all women, of course, and many men are also empathetic. But if we can generally associate empathy with female leadership, we see a compelling reason to recruit more women into technology. Five of the ten most empathetic technology companies are also among the fastest growing, averaging about 23.3% growth per year compared to a weighted average of 5.2% across all technology companies.

Why is diversity in tech seemingly so far behind?

This is often framed as a “pipeline problem,” and that may be part of it. According to Girls Who Code, 74% of young women (high-school age and younger) express interest in STEM courses and career paths, but by the time college course decisions need to be made, only 18% choose STEM or computer science pathways. (And that figure has actually dropped: in the 1980s, women earned 37% of computer science degrees.)

How do we change the interview?

The best approach is to use a structured interview format. This ensures a consistent, repeatable process where candidates and their skills can be directly compared. Interviews have evolved over time, but they are still largely subjective, and analyses of technical interview outcomes across large data sets show the process can be fairly arbitrary.

One additional concern: arbitrary, unstructured interviews mean quality candidates can get turned down. As noted on Aline Lerner’s interviewing.io, which lets recruiters anonymously practice technical interviewing with engineers, women are 7x more likely than men to stop practicing and interviewing after 4+ rejections. So the more women are interviewed and passed over, the harder the diversity-in-tech problem becomes to solve.

Using a structured interview format where all candidates receive the same questions in the same order, and are evaluated with the same metrics, is part of the solution. To create structure that actually drives results, you need data. Companies need to embrace the data side of their hiring funnels. Think about who in your organization has been successful (and unsuccessful) in specific roles. What attributes or data points (performance reviews, KPI measures) do you have on them? Eventually, a picture may emerge of what an ideal candidate for X-role looks like.
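The core idea of a structured interview can be sketched in a few lines of code: every candidate is rated on the same rubric dimensions, so scores become directly comparable. The rubric and candidate ratings below are hypothetical, for illustration only.

```python
# Minimal structured-interview scorecard sketch (hypothetical rubric and data).
RUBRIC = ["problem_solving", "code_quality", "communication"]

def score_candidate(ratings):
    """Average rubric ratings (1-5); every candidate is rated on the
    same dimensions, in the same order, using the same scale."""
    return sum(ratings[dim] for dim in RUBRIC) / len(RUBRIC)

candidates = {
    "A": {"problem_solving": 4, "code_quality": 3, "communication": 5},
    "B": {"problem_solving": 5, "code_quality": 4, "communication": 4},
}

# Because the metrics are identical, a ranking is meaningful.
ranked = sorted(candidates, key=lambda c: score_candidate(candidates[c]), reverse=True)
print(ranked)  # -> ['B', 'A']
```

The point is not the arithmetic but the constraint: an interviewer cannot score a dimension that isn’t on the shared rubric, which is what keeps the process repeatable.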

How can AI/tech help?

An AI technique called sentiment analysis can identify exclusionary language (e.g., aggressive, competitive, brilliant) that research has found may turn off certain groups of candidates.

For example, studies by researchers at the University of Waterloo have found that job postings using adjectives like “aggressive” and “competitive” attract fewer female candidates.

New research on job ads found women were less interested in applying to roles described with brilliance-related language than with dedication-related language such as “passionate about the job.” The research suggests more inclusive alternatives, such as “committed” and “responsible,” to attract a more diverse candidate pool.
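In its simplest form, this kind of screening is a lexicon lookup over the posting text. The word list and suggested replacements below are a tiny assumption-laden sample; real tools (and the research above) draw on much larger lexicons.

```python
import re

# Tiny illustrative lexicon; entries and suggestions are assumptions
# for this sketch, not the actual research word lists.
FLAGGED = {
    "aggressive": "proactive",
    "competitive": "collaborative",
    "brilliant": "committed",
}

def flag_exclusionary(posting):
    """Return (flagged word, suggested alternative) pairs found in a posting."""
    words = re.findall(r"[a-z]+", posting.lower())
    return [(w, FLAGGED[w]) for w in words if w in FLAGGED]

posting = "We want a brilliant, competitive engineer."
print(flag_exclusionary(posting))
# -> [('brilliant', 'committed'), ('competitive', 'collaborative')]
```

Commercial tools layer sentiment models on top of this, but even a word-list pass like the one above surfaces most of the adjectives the research flags.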

Despite some reports to the contrary, AI can actually help reduce bias. Biases related to demographics such as race, gender, and age can be triggered by information on a resume such as the candidate’s name and the dates they’ve held previous positions.

Throughout 2019, AI adoption for diversity will increase as AI can be programmed to avoid unconscious biases by ignoring demographic information when screening and shortlisting candidates.

Furthermore, technology like AI can be tested for bias by checking the demographic breakdown of the applicants it sources and screens. If the AI tool is disproportionately excluding a group of people, this oversight can be corrected through human intervention.
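One standard way to run that check is the EEOC’s “four-fifths rule”: if any group’s selection rate falls below 80% of the highest group’s rate, the tool may be having an adverse impact and warrants human review. The applicant counts below are made up for illustration.

```python
def selection_rates(applicants, passed):
    """Selection rate per group: candidates passed / candidates screened."""
    return {g: passed[g] / applicants[g] for g in applicants}

def adverse_impact(applicants, passed, threshold=0.8):
    """Flag groups whose selection rate is below `threshold` (the
    four-fifths rule) times the highest group's rate."""
    rates = selection_rates(applicants, passed)
    best = max(rates.values())
    return {g: r / best < threshold for g, r in rates.items()}

applicants = {"women": 200, "men": 300}
passed = {"women": 30, "men": 90}

print(adverse_impact(applicants, passed))
# -> {'women': True, 'men': False}
```

Here women pass at 15% versus 30% for men, a ratio of 0.5, well under the 0.8 threshold, so this hypothetical tool would be flagged for correction.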

So what’s the solution? Start small: set a quantifiable, realistic hiring goal for improving your gender representation, such as increasing female hires in management positions by 10% within six months. Then map out a strategy to get there by leveraging de-biasing software and opening up your sourcing pools.

What else have you seen about getting more women into tech roles, and how can tech itself help?

 


Olivia Folick


Digital Marketing Manager at Ideal
Olivia is a Bachelor of Commerce graduate from the Smith School of Business at Queen’s University with a deep passion for marketing, fashion, sports, and analytics. Recently moving from Vancouver to Toronto, Olivia has left the tree-hugging west-coast culture to explore new career opportunities within AI and technology.
