Voice Recognition ‘Sexist’


Delip Rao, CEO and co-founder of the start-up R7 Speech Sciences, has brought back into the spotlight the issue that voice recognition systems struggle more with female voices than with male voices.

Not New

The issue has been known about for some time, and it has been brought into sharper focus by the popularity of voice-activated digital assistants such as Apple’s Siri, Amazon’s Alexa and Google Home.

Why?

According to linguistics experts, the key problem is that women tend to have higher-pitched voices than men, and they also tend to speak more quietly and sound more “breathy” when they talk.

With speech, for example, the mean fundamental frequency (mean F0) is the average value around which a speaker’s vocal tones are spread. The mean F0 for men is around 120 Hz, but for women it is much higher, at around 200 Hz.
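As a rough illustration of what a mean F0 measurement involves, the sketch below estimates pitch frame by frame using a simple autocorrelation method. It runs on a synthetic 200 Hz tone standing in for a typical female voice; the sample rate, frame length and pitch range are illustrative assumptions, not details from any particular voice recognition product.

```python
import numpy as np

def estimate_f0(frame, sr, fmin=75.0, fmax=400.0):
    """Estimate the fundamental frequency of one frame via autocorrelation."""
    frame = frame - frame.mean()
    corr = np.correlate(frame, frame, mode="full")[len(frame) - 1:]
    # Only consider lags that correspond to plausible speaking pitches.
    lag_min, lag_max = int(sr / fmax), int(sr / fmin)
    lag = lag_min + np.argmax(corr[lag_min:lag_max])
    return sr / lag

sr = 16_000                              # assumed sample rate
t = np.arange(sr) / sr
voice = np.sin(2 * np.pi * 200 * t)      # synthetic 200 Hz tone (typical female mean F0)

frame_len = 1024
frames = voice[: len(voice) // frame_len * frame_len].reshape(-1, frame_len)
mean_f0 = np.mean([estimate_f0(f, sr) for f in frames])
print(f"Mean F0 ~ {mean_f0:.1f} Hz")     # roughly 200 Hz, vs roughly 120 Hz for a typical male voice
```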

MFCCs

Another problem for voice recognition systems comes when they try to convert words and sounds into MFCCs (Mel-frequency cepstral coefficients). Women’s voices are known to produce a less robust acoustic signal, one that is more easily masked by background noise. Together, these two challenges make things more difficult for speech recognition systems.
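For illustration, MFCCs can be computed in a few lines with the widely used librosa library in Python. The sketch below is a minimal example; the file path, sample rate and number of coefficients are assumptions chosen for the example, not the settings of any particular assistant.

```python
import librosa

# Load a mono recording (the path and 16 kHz sample rate are assumed for illustration).
y, sr = librosa.load("sample_utterance.wav", sr=16_000)

# Compute 13 Mel-frequency cepstral coefficients per frame, the kind of compact
# spectral representation that speech recognisers typically work from.
mfccs = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=13)

print(mfccs.shape)  # (13, number_of_frames)
```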

Lack of Diverse Training

Since speech recognition systems also rely on an AI element, they require training to become used to recognising certain vocal characteristics. Linguistics experts therefore believe that a lack of diverse training examples of women’s speech may be a contributing factor to the problems encountered by current voice recognition systems.

Gender Biases As A Result

Some commentators are therefore predicting that gender bias problems with voice recognition systems could worsen if these issues are not tackled.

Experts have pointed out the importance of training systems on equal proportions of male and female speech, to avoid them becoming very good at recognising male voices and very poor at recognising female ones.

Ethnic Mix

The same experts have also highlighted potential biases based on ethnicity if voice recognition systems are not trained on a wide mix of ethnicities as well as genders.
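As a simple sketch of the kind of balancing the experts describe, the snippet below downsamples each group in a labelled collection of training clips so that every group contributes the same number of examples; the same group key could just as easily be speaker ethnicity as gender. The data structure and field names are hypothetical, chosen only to illustrate the idea.

```python
import random
from collections import defaultdict

def balance_by_group(samples, group_key, seed=0):
    """Downsample so every group (e.g. speaker gender or ethnicity)
    contributes the same number of training examples."""
    groups = defaultdict(list)
    for sample in samples:
        groups[sample[group_key]].append(sample)

    smallest = min(len(items) for items in groups.values())
    rng = random.Random(seed)

    balanced = []
    for items in groups.values():
        balanced.extend(rng.sample(items, smallest))
    rng.shuffle(balanced)
    return balanced

# Hypothetical training clips labelled with speaker gender.
clips = (
    [{"path": f"m_{i}.wav", "gender": "male"} for i in range(900)]
    + [{"path": f"f_{i}.wav", "gender": "female"} for i in range(300)]
)
balanced = balance_by_group(clips, "gender")
print(len(balanced))  # 600: 300 male clips + 300 female clips
```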

What Does This Mean For Your Business?

With digital assistants now appearing in workplace computer systems (e.g. Alexa for Business), and with AI bots being used, for example, to handle customer service systems with a voice element, it’s important that women and/or certain ethnic groups are not put at a disadvantage when using these systems.

The problem is now well known, and companies should therefore be taking action to make sure that voice recognition systems work well for all demographics and deliver equality as part of their value.
