How AI could be using our voices against us
Voice control gadgets – such as Amazon’s Alexa, Google’s Home or Apple’s HomePod – are becoming increasingly popular, but people should pause for thought about advances in machine learning that could lead to applications understanding different emotions in speech.
The CEO of Google, Sundar Pichai, recently said that 20% of the company’s searches are initiated by voice via mobile phones. And, at the end of 2017, analysis of the US market suggested that a total of 44m Amazon Alexa and Google Home devices had been sold.
The technology has increasingly impressive abilities to recognize words, but – as an expert on acoustics – it is clear to me that verbal communication is far more complex. How things are said can be just as important as the words themselves. When someone says “I’m alright”, the tone of their voice might tell you their mood is the opposite of what they claim.
Voice control gadgets, also known as smart speakers or virtual assistants, can be frustrating to use because they only pay attention to the words, and mostly ignore how speech is expressed. Tech giants hope that the next frontier for devices such as the Amazon Echo will be to detect how a person is feeling from their voice, making interactions more natural.
The human voice can give away information about who that person is, where they come from and how they are feeling. When a stranger talks, people immediately pick up on their accent and intonation and make assumptions about their class, background, and education.
If voice control gadgets pick up on such information, speech interfaces could be improved. But it’s worth remaining wary of unintended consequences. The technology relies on machine learning – a branch of artificial intelligence in which algorithms learn statistical patterns from reams of data – and so its behavior is not entirely predictable.
Is the future smart or dumb?
Research shows that the speech examples used to train a machine learning application are likely to introduce bias. Such problems with the technology have already been evident in popular tools such as Google Translate.
When used, for example, to translate the Turkish phrases “o bir doktor” and “o bir hemşire” into English, Google’s service returns the results “he is a doctor” and “she is a nurse”. But “o” is a gender-neutral third-person pronoun in Turkish. The presumption that a doctor is male and a nurse is female reflects cultural prejudices and the skewed gender distribution in the medical profession.
Google Translate picked up a human cultural bias that was in the data the algorithms were trained on, and the end result is a sexist translation system.
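The mechanism behind this kind of bias can be sketched with a toy example. The counts and word pairs below are invented for illustration – real translation systems are far more sophisticated – but the statistical shortcut is the same: when forced to pick a gendered pronoun for a gender-neutral one, a model trained on skewed data simply picks the pronoun it has seen most often with that occupation.

```python
from collections import Counter

# Hypothetical "training corpus": occupations paired with the pronoun
# that appeared alongside them. The skewed counts are made up to
# illustrate bias, not drawn from any real dataset.
corpus = (
    [("doctor", "he")] * 90 + [("doctor", "she")] * 10 +
    [("nurse", "she")] * 85 + [("nurse", "he")] * 15
)

def pronoun_counts(corpus):
    """Tally how often each pronoun co-occurs with each occupation."""
    counts = {}
    for occupation, pronoun in corpus:
        counts.setdefault(occupation, Counter())[pronoun] += 1
    return counts

def translate_neutral_pronoun(occupation, counts):
    """Resolve a gender-neutral pronoun (like Turkish 'o') by choosing
    the pronoun most frequent with this occupation in the training data."""
    return counts[occupation].most_common(1)[0][0]

counts = pronoun_counts(corpus)
print(translate_neutral_pronoun("doctor", counts))  # prints "he"
print(translate_neutral_pronoun("nurse", counts))   # prints "she"
```

Nothing in the code is "sexist" by design; the biased output falls straight out of the skewed frequencies in the data, which is exactly why the behavior of such systems is hard to predict in advance.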
What do you think about the future of voice control gadgets?
Share your views in the comments!