Cortana, Siri, Alexa... why do intelligent assistants have women's voices?


Cortana, Siri, Alexa: most intelligent personal assistants have a voice, a name and a female personality. It is a choice with no technical basis.

Ask Siri about its gender and the intelligent assistant will answer that it has none, like most voice assistants on the market. Yet its feminine identity is beyond doubt, given its name, even though it is also available with a male voice. Among the newcomers on the market is Carrefour's Léa, which will soon help customers manage their shopping lists. There are also, already, Alexa (Amazon) and Cortana (Microsoft), while Google Assistant is, for the moment, only available in France with a female voice.

Is this because it would be easier to develop a female voice? Not at all; it is quite the opposite. "It is more difficult to create a female voice than a male one: the vocal frequency is generally higher, more acute, more variable; it is more complex to manage," explains Christophe Couvreur, Vice President of Nuance, in charge of speech synthesis at the company. Nuance is known for providing the basis of Siri's voice, as well as voice assistants for many banks, car manufacturers, smartphone makers and telecom operators.



Requests from Nuance's customers nevertheless tend to favor female voices, with the exception of Domino's Pizza and a few specific sectors. It depends on the intended function and the effect sought on the user. If your intelligent assistant has a general service role and must always seem to be listening to you, it will be given a female voice, according to Nuance. If it is expected to give targeted advice on money matters, such as a home loan, it will be infused with more testosterone. Are you starting to think these are misogynistic stereotypes? You are right: clichés to which we are all unconsciously enslaved, because we expect, first and foremost, to be served... by women.

We all expect to be served by a female voice

One study serves as the reference on the subject among the tech giants: that of Clifford Nass, a professor of communication at Stanford University who has since passed away, and who showed in 1997 that we react differently to a synthetic voice depending on its gender. In his experiments, male voices were perceived as inspiring more respect and competence, while female voices were received poorly when they showed authority but positively when they were compassionate and sociable. And that still applies today.

It is very difficult to escape these clichés, simply because we live in a society where women are the majority in the caring professions, where we more often give orders to women because they occupy more junior positions, or because they look after the care and education of children. With all the fantasies that go along with that, from the flight attendant to the maid by way of the nurse. But that is not the only reason. "Voice assistants are developed by a homogeneous group of mostly white, heterosexual developers who see no problem in choosing female voices, because that is simply what they want to hear," observes Isabelle Collet, a computer scientist by training and a specialist in gender issues in education.

Tech giants reinforce stereotypes

When Microsoft is asked about the reasons that led it to this choice for Cortana, the answer comes quickly: "After extensive research, we found that a woman's voice was a better match for what our users expect." But it is the only company to honestly own up to its choice: Apple did not deign to answer us, and Google preferred to offer a rather thin excuse. "The female voice was already available on the search engine," we were told. That does not eliminate the problem, since the decision was made at some point to opt for a female voice. Hardly surprising, given the study cited above: tech companies are not philanthropists, and they go with whatever is most effective to make their product work with users. There is no question of unsettling those users by going against their preconceptions.

The problem is that, by generalizing the practice, the Silicon Valley giants make things worse. "Giving voice assistants women's voices is like a caricature. We are serving up stereotypes that already exist, so there is inevitably a risk of reinforcing them," says Isabelle Collet. This alarms the Centre Hubertine Auclert, which works for equality between women and men and which last year organized a symposium entitled "Beauty and the Bot: is artificial intelligence sexist?".

This pitfall does not apply only to gender. "The projection of stereotypes onto voices also extends to age and social category. Intelligent personal assistants like Siri sound around 20 years old to give an impression of dynamism, while those of banks sound closer to 30 to inspire confidence," explains Christophe Couvreur. And of course, there is no question of giving them any regional or social accent other than that of each country's dominant group.


Follow the emoji model

Does the solution lie in developing a neutral voice, which is quite possible today? That is what Nuance attempted at the request of Aldebaran, when the company developed the Nao and Pepper robots. All it took was to blend a woman's voice with a man's. But for Isabelle Collet, this is not a solution: "Nao and Pepper are companion robots. It cannot work in all situations. A neutral voice runs against commercial logic, and our unconscious is reassured when it can identify a female or male voice when it is being sold something."
The model to follow would be that of emoji. At the beginning of their popularization, around 2011, there were only figures of white, heterosexual men, presented as universal. Over the past three years, emoji catalogs have diversified, displaying skin of every color, every age and various sexual orientations. The ideal would be for no voice to be imposed by default when you buy a speaker or use a smart assistant for the first time, and for each of us to be able to choose the voice that suits us. That would be a first step in fighting stereotypes. Provided, however, that society evolves at the same time: as long as women's place in society does not change, we will continue to expect voice assistants to conform to the stereotypes that surround us.

