
Midterm: Voice-Controlled Devices


Voice-Controlled Technology

Just 10 years ago we couldn’t have imagined turning on the lights, opening the trash can, or playing music simply by asking Google or Siri to do so. Technology has truly come a very long way; however, there is still so much to be done to improve how these devices work in our lives.

Many people today own a Google Home, an Alexa, or some other voice-controlled device (like a smartphone!). The voice control features on our phones alone have already proven very useful. They offer significant ease of access for everyone, but especially for people living with an impairment. For example, someone with impaired vision, or who is blind, would definitely have a harder time using certain technologies we have today. Yes, there was life before things like the Google Home, Amazon Alexa, and other voice-controlled devices, but it was definitely a little more challenging to get around. These features can also be very useful in an emergency. Let’s say someone is unable to use their hands for whatever reason, and the only way for them to call 911 is to ask Siri to do it. In a situation like that, voice commands can literally be life saving.


How does language as a salient social identity come into play with voice-controlled devices? In America alone, 54% of adults have used voice commands, and 20% of them use them daily. This statistic does not even account for younger generations, who are surely using them at a much higher rate. Voice commands have become nearly imperative to how we interact with technology today, and even with each other. As an example, I am currently using voice commands to write this blog. I have an iPhone, and the voice-to-text function lets me compose without having to type out everything I want to write and express. Of course, that is not absolutely necessary for me, since I am also able to type, but when my hands are full, or honestly when I’m just feeling lazy, I can ask my phone to text someone without lifting a finger. This has become very useful and practical; however, not everyone finds these features equally accessible or easy to use. In the past couple of years, it has been revealed that voice-controlled devices have trouble understanding certain languages and accents. Here’s an example:

About two years ago, my family got a Google Home Mini. I found it fascinating and really cool to have such control with just my voice. My mother, however, tried to use our Google Home and did not like it at all. She was born and raised in Ecuador, a country whose primary language is Spanish. Although she has been in this country for almost 30 years now, she still has a pretty thick accent when she speaks English. If she spoke to someone fluent in English, I promise you they would have no trouble understanding her. However, when she spoke to the Google Home, it could not understand what she was requesting, even when she was asking for something as simple as a song or a timer while she was cooking. “I am speaking clearly and slowly, and I don’t understand why Google still doesn’t understand me until I repeat myself like ten times!” Naturally, she was upset to find that she wasn’t being understood because of her accent, so she just stopped using it for the longest time.

The Washington Post wrote about this, calling it “the accent gap.” Even within America, there are people with many different accents all over the country; some are thicker than others and cause us to pronounce certain words differently. Perhaps someone in New York pronounces the word “aunt” differently than someone in Michigan. Now, when we speak to each other in person, we understand each other; however, when speaking to a voice-controlled device, the understanding is not quite the same. As I mentioned earlier, my mother has an accent when she speaks English, yet in person she is easy to understand. So why can’t Siri or Google understand her? When she spoke to our Google Home, it would always say, “Sorry, I don’t know how to help you with that.”

This isn’t an issue affecting only people with Spanish accents; it happens to people all over the world, with all kinds of accents. It turns out that Amazon’s Alexa and Google Home were having quite a bit of trouble understanding British accents as well. So, in order to be understood, someone would have to completely change their voice and accent, and that isn’t a reasonable thing to expect.

According to the Washington Post, Google Homes and Amazon Alexas were having issues understanding people in different regions of the United States, even though they were all from English-speaking backgrounds. “People with Southern accents, for instance, were 3% less likely to get accurate responses from a Google Home device than those with Western accents, and Alexa understood Midwest accents 2% less than those from along the East Coast.” Although that was back in 2018 and things have improved since, this is an issue I have personally seen firsthand, and it is still very prevalent. I do not think I have a very thick accent, but sometimes when I switch between English and Spanish, I have the tiniest bit of an accent “left over,” which causes me not to be understood. Let’s say I am requesting a Spanish song from my Google Home: it usually takes me a minimum of three tries before I give up and just play it from my phone.
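Those percentage-point gaps sound small, but a little arithmetic shows why they matter: a few points shaved off a high accuracy rate can mean a large jump in how often a user hears “Sorry, I don’t know how to help you with that.” Here is a minimal sketch; the 95% baseline accuracy is my own illustrative assumption (the article only gives the gaps), while the 3-point and 2-point drops come from the Post’s figures.

```python
def relative_error_increase(baseline_accuracy: float, gap_points: float) -> float:
    """How much more often recognition FAILS for the lower-scoring accent group.

    baseline_accuracy: accuracy (%) for the better-served accent group
    gap_points: percentage points of accuracy lost by the other group
    Returns the relative increase (%) in the error rate.
    """
    baseline_error = 100.0 - baseline_accuracy          # e.g. 95% accurate -> 5% errors
    gapped_error = baseline_error + gap_points          # losing 3 points -> 8% errors
    return (gapped_error - baseline_error) / baseline_error * 100.0

# Hypothetical 95% baseline with the Post's reported gaps:
print(relative_error_increase(95.0, 3.0))  # Southern vs. Western on Google Home -> 60.0
print(relative_error_increase(95.0, 2.0))  # Midwest vs. East Coast on Alexa -> 40.0
```

In other words, under this assumed baseline, a “mere” 3-point accuracy gap means Southern-accented users would run into errors 60% more often than Western-accented ones, which matches my experience of needing three tries for one song.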

This issue has improved a little in the past couple of years but is still very prevalent. It is definitely hard to get A.I. to understand every accent, but fortunately it is learning with time. The Google Home has a feature that lets the user train the device on their own voice, so that it can better understand voice commands from that speaker.

