Hearing Voices

May 24 · Issue #75
This week's news about voice computing apps, delivered directly to your inbox.

“When Alexa runs your home, Amazon tracks you in more ways than you might want.”
- The Washington Post

“Share our data (this newsletter) with your friends!”
- Me (I know I said this week’s quote would be better than last week’s. I am aware that I have not kept my promise)

Did someone forward you this newsletter? You can subscribe at hearingvoices.xyz

Happy almost Memorial Day. I’m away this week, so I’m just sending this very quick update:
What's Up In Voice
> Don’t read “Your smart speaker is always listening.” Just kidding, you can read it, but I want to spend a moment analyzing why these types of privacy analyses are flawed. In the case of this article, the headline is genuinely misleading, as are the many headlines that sprang up after it was publicized that smart speakers use real humans to help debug voice commands and train the voice recognition to do the correct thing. Maybe it’s possible for someone to hack into your smart speaker and listen in on your conversations, but that’s not what these articles are about. The headlines range from the relatively benign “Smart speakers are everywhere – and they’re listening to more than you think,” which actually has a thoughtful analysis, to far more alarmist ones. Which brings me to:
> Do read “Alexa has been eavesdropping on you this whole time.” This is not from BuzzFeed; it’s from The Washington Post (which, helpfully, provides a link to see what your Alexa device has recorded; your Alexa history is located here). While the headline is alarmist, the analysis is excellent, if you actually read the whole thing.
At first it seems unreasonable that “Alexa has been eavesdropping on you this whole time” was written in The Washington Post. However, let’s look at its content specifically:
“Any personal data that’s collected can and will be used against us. An obvious place to begin: Alexa, stop recording us.”
However, earlier in the piece, the author writes:
“Alexa keeps a record of what it hears every time an Echo speaker activates. It’s supposed to record only with a “wake word” — “Alexa!” — but anyone with one of these devices knows they go rogue. I counted dozens of times when mine recorded without a legitimate prompt. (Amazon says it has improved the accuracy of “Alexa” as a wake word by 50 percent over the past year.)”
Computer scientists will recognize that these two statements are at odds with one another: the way to make the Alexa wake word more accurate is to improve the model that identifies the wake word. That requires a dataset, which means recording both successful and failed wake attempts and then labeling that data. So what the author is proposing is a catch-22.
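To make that catch-22 concrete, here is a minimal, purely illustrative sketch in Python (this is not Amazon’s actual pipeline; every name in it, from Activation to build_training_set, is hypothetical) of why a wake-word model only improves if recorded activations are kept and human-labeled. The false activations the author wants deleted are exactly the negative examples a retrained model would learn from.

```python
# Illustrative sketch only: why improving a wake-word model depends on
# keeping and labeling recorded activations. All names are hypothetical.
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class Activation:
    audio_id: str        # pointer to the stored audio clip
    model_score: float   # confidence of the deployed wake-word model
    human_label: bool    # annotator's judgment: was "Alexa" actually said?

def build_training_set(activations: List[Activation]) -> Tuple[List[Activation], List[Activation]]:
    """Split human-reviewed activations into examples for retraining the model."""
    true_wakes = [a for a in activations if a.human_label]       # correct triggers
    false_wakes = [a for a in activations if not a.human_label]  # "went rogue" triggers
    return true_wakes, false_wakes

# Hypothetical reviewed data: the false wakes are precisely the recordings
# that would not exist if the device never kept them.
reviewed = [
    Activation("clip-001", 0.97, True),
    Activation("clip-002", 0.62, False),  # TV said something Alexa-like
    Activation("clip-003", 0.55, False),
]

true_wakes, false_wakes = build_training_set(reviewed)
print(f"{len(true_wakes)} confirmed wakes, {len(false_wakes)} false wakes to learn from")
```

The point isn’t the code itself; it’s that without the stored clips there is nothing for annotators to label and nothing for the next version of the model to train on.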
However, the author acknowledges this:
Noah Goodman, an associate professor of computer science and psychology at Stanford University, told me it’s true that AI needs data to get smarter.
And finally, the author provides this metaphor, which is actually pretty awesome:
Think of “Downton Abbey”: In those days, rich families could have human helpers who were using their intelligence to observe and learn their habits, and make their lives easier. Breakfast was always served exactly at the specified time. But the residents knew to be careful about what they let the staff see and hear.
It’s interesting to think about our smart assistants as a sort of helper around whom we need to behave differently because they are inherently flawed. Human helpers may decide to share our information even when we don’t want them to, and software companies are a sort of organism run by humans, similarly flawed: they may use our data in a way we didn’t intend, or share it in some other way.
If we’re going to start proposing ways to address data privacy, we need to acknowledge why the data is needed to make our technology better, and then decide whether that tradeoff is worth it. While the analysis is thoughtful, calling the article “Your smart speaker is always listening” buries the nuance.
Synthetic Reality
Speaking of AI, and on a more fun note, Betaworks Studios (Betaworks’ club for builders) is hosting a Summit on Synthetic Reality! (Synthetic Media is a subset of Synthetic Reality…something they’ll explain at the event with a market analysis & market map.) This is a paid event, but Hearing Voices subscribers can get a discount with the code FRENDER20 (ok, so it’s actually a discount code for friends of mine, not necessarily of this newsletter, but we’re friends, right?!).

Have a great weekend, friend.
– Matt Hrt.mn
You can follow me on Twitter @matthartman
Did someone forward you this newsletter? You can subscribe at hearingvoices.xyz

The Future Will Be Synthesized