New research identifies a lack of diversity in the talent pool driving AI development in Europe. Without diversity, AI systems run the risk of bias.

The report was compiled by LinkedIn’s ‘Economic Graph’ team. Among the key takeaways, researchers found that half of Europe’s AI workers are based in just three countries (the UK, Germany, and France), and that the majority work in the IT and research sectors.

The argument can be made that, to facilitate an AI-driven economy, this talent should now be spread more evenly across the whole economy. The report also finds that only 16% of AI workers in the EU are women.

Geographic divide

One of the report’s main findings is an East-West divide in the distribution of AI talent. While the countries of Central and Eastern Europe produce AI graduates, the data shows that these graduates are migrating to Western Europe. Over half of the region’s AI workers are based in the UK, France and Germany.

There are also deficiencies in talent distribution from a North-South perspective. Southern European countries like Spain and Italy score well for clustering of academic AI talent. However, that talent is not filtering through from academic AI projects into the private sector within the region.

Gender imbalance

A gender imbalance in the workforce has been an ongoing issue in the tech sector overall, and that shortcoming is also evident in AI. LinkedIn researchers found that only 16% of workers in the sector are women.

Other research backs up these findings. The State of European Tech, published last week, demonstrates a lack of diversity in the broader tech context.

In Spain and Portugal, 92% of the funding for tech-based projects went to all-male teams in 2019, and only one woman CTO was identified among the 119 companies surveyed in the region. There are some exceptions, however: for example, 40% of IBM’s data & AI elite team are women.

Only 16% of AI developers are women, finds LinkedIn’s ‘Economic Graph’. Image: Annie Spratt/Unsplash.

AI bias

Bias in AI can have knock-on effects that skew the effectiveness of the technology. An algorithm trained on unrepresentative data can produce systematically poor results for the people that data under-represents.

“Diverse representation among emerging technology workers is crucial for the sector and especially important for AI products given the potential for bias”

LinkedIn report: ‘AI Talent in the European Labour Market’

As the report puts it, “diverse representation among emerging technology workers is crucial for the sector and especially important for AI products given the potential for bias”.

When AI-based systems are devised by staff drawn from a less diverse talent pool, the range of perspectives and approaches brought to problems narrows. Only certain problems get worked on, or they are approached with a particular bias, so that certain possibilities and outcomes are overlooked.
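To make the mechanism concrete, here is a deliberately simplified toy sketch (not from the report, and not a real AI system): a trivial model that just learns the most common label in its training data. When one group dominates that data, the model looks accurate overall while failing entirely for the under-represented group.

```python
# Toy illustration of sampling bias: a "majority label" model trained
# on data where group A vastly outnumbers group B.
from collections import Counter

# Hypothetical training data as (group, label) pairs: 90% group A.
train = [("A", 1)] * 90 + [("B", 0)] * 10

# The model learns nothing except the single most common label overall.
majority_label = Counter(label for _, label in train).most_common(1)[0][0]

# Evaluate per group, assuming group A examples are mostly label 1
# and group B examples are mostly label 0.
test_a = [("A", 1)] * 50
test_b = [("B", 0)] * 50

acc_a = sum(majority_label == y for _, y in test_a) / len(test_a)
acc_b = sum(majority_label == y for _, y in test_b) / len(test_b)

print(f"accuracy on group A: {acc_a:.0%}")  # 100%
print(f"accuracy on group B: {acc_b:.0%}")  # 0%
```

Real models are far more sophisticated, but the failure mode scales: optimising for aggregate accuracy on skewed data quietly sacrifices the groups the data leaves out, which is exactly the blind spot a more diverse team is better placed to catch.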

A case of inequality?

One reason for these disparities is that AI talent is simply in short supply. In the case of gender equality in tech, the issue is often not direct discrimination during the hiring process; the reality is that too few female applicants have filtered through into the sector in the first place.

Fewer women enter the educational pathways that prepare for a career in the tech sector. This has led to programmes such as ‘Girls Who Code’, a nonprofit whose aim is to close the gender gap in technology.

It is often debated whether men have simply been more inclined than women to pursue an education and career in tech. Some argue the disparity may simply reflect the pursuit of different interests.

But even beyond potential gender inequality, AI bias is a recognised issue. For this reason, there is a clear need to prioritise diversity in the development of AI. Increasing women’s participation would make AI design less likely to produce skewed results.