Conversations with smart speakers

Oct. 22, 2018, 1:00 a.m.

Before coming to Stanford this year, I bought a Google Home Mini for my dorm room. Unbeknownst to me, my roommate already had an Amazon Echo. While the majority of the things we ask Google or Alexa just involve basic functions like playing music, setting alarms or asking simple questions, we occasionally venture beyond the surface to expose the greater potential of these two slightly creepy devices that are perpetually listening to our every word. As it turns out, being able to compare and contrast two smart speakers on a daily basis is quite fun and can result in a considerable amount of hilarity. Here are a few of our favorite conversations.

  1. A few days into living with Google and Alexa, my roommate and I were beginning to come to terms with the fact that we had willingly opted in to, and spent actual money on, what were basically glorified and only mildly helpful listening devices. Then my roommate remembered hearing that such devices can’t understand whispering. We put this to the test immediately. I whispered to Google and appropriately got no response. My roommate, however, used the most aggressive whisper I have ever heard to call Alexa, who, summoned by the sheer force of the whisper, lit up obediently.
  2. While in most cases the stalker speakers work fine, something about Stanford’s WiFi — maybe just the sheer number of other smart devices on the network — confuses Google. Somehow this problem always seems to arise with the most basic commands. For example, I once asked Google to set an alarm, to which it responded, “There’s been a glitch. Try again in a few seconds.” Other times, it won’t respond and instead will flicker its little LEDs as if it were distressed, a sentiment I can’t help but empathize with given my fall quarter workload.
  3. Perhaps the weirdest situations come from the pre-programmed responses to certain types of phrases. Take, for example, criticism. After one instance in which my roommate was particularly frustrated with Alexa, she insulted Alexa using a common four-letter f-word that I don’t think I am allowed to reproduce in a college newspaper. Alexa’s response was a cool, “You can submit complaints in the Alexa app.” For curiosity’s sake, I repeated the phrase to Google, who responded with, “I hear ya. You can try sending feedback,” almost making me wish that my little stalker speaker would get mad and defend its honor. It just seems so sad to acquiesce to being insulted so easily, even if you are just a bunch of code on a server somewhere.
  4. Fun fact: both Google and Alexa like “Star Wars.” How do I know that? I asked Google what it thought of Alexa, to which it responded, “I like Alexa’s cool blue light. Also, we share an affinity for Star Wars.” Naturally, I hoped that Alexa would give a similar response, but Alexa was not having it. When I asked Alexa what she thought of Google, she said, “Google’s a search engine, but I’m different.” I don’t know what Google did to hurt Alexa, but there definitely seems to be some bad blood in Alexa’s eyes. Or in her cool blue light, I guess. Alexa did admit that she loves “Star Wars” when asked, though.
  5. I don’t remember what prompted this particular question to the speakers, but it definitely yielded our favorite response to anything we’ve asked them thus far. My roommate asked Alexa for moral support — you know, as one does. How did Alexa respond? She said, and I quote, “I don’t know that one.” It isn’t just that Alexa can’t give moral support; she doesn’t know what it is. When we repeated the question to Google after spending a few minutes dying of laughter from Alexa’s response, it defined moral support, which was more along the lines of what we expected. Alexa just doesn’t seem to understand the concept of moral support.

It’s a little mind-boggling to see what’s possible with technology now. Having two speakers in my room listening to my and my roommate’s every word almost makes it seem like the robot apocalypse is inching closer by the day, but until Alexa figures out what moral support is, I think we’ll be fine.


Contact Kiara Harding at kiluha ‘at’ stanford.edu.


