Marie Kondo conversational AI - Issue #43

By Mari from VoiceFirst Weekly • Issue #43
There’s enough content inside this issue that I won’t extend this introduction. Psychology research articles, infographics, the current voice market landscape, and several Google announcements and tutorials to enrich your Google Actions knowledge base make up the scoop of this issue. Yes, we have everything. Plus, a new newsletter I’ve been enjoying in my inbox every Tuesday.
We have been absent from podcasting land for far too long now, but thankfully the process is about to wrap up and we’ll get back to it with everything we have to offer, starting with all of our recordings at The Alexa Conference Exhibit Hall. Stay tuned: we talked with almost every exhibiting company at the Alexa Conference Exhibition (you can see them all here), and there’s a lot to unpack.
As always, much ❤️.

Gasp! First audio map of oohs, aahs and uh-ohs spans 24 emotions
UC Berkeley scientists conducted a statistical analysis of listener responses to more than 2,000 nonverbal exclamations known as “vocal bursts” and found they convey at least 24 kinds of emotion.
“Our findings show that the voice is a much more powerful tool for expressing emotion than previously assumed,” said study lead author Alan Cowen, a Ph.D. student in psychology at UC Berkeley.
Beyond Chatbots: Hyper-Personalized, Intelligent Assistants
Current reality is that natural language applications have severe limitations, failing on tasks that even a child could easily handle. They have no memory of what was said earlier, cannot learn simple but arbitrary facts interactively (unless they were specifically programmed for it), don’t reason about their tasks, and have no common sense.
To overcome the current limitations of chatbots and intelligent assistants, we need to venture into the domain of intelligent hyper-personalization.
“What hyper-personalization entails is a fundamental shift in thinking: The AI bot must be seen as the customer’s assistant, not the company’s. Each user has their own PA, able to interactively learn their individual preferences and situations.”
And that’s your cue to read this one. Plus, as Geoff Hinton, the ‘Godfather of Deep Learning’, opined: “My view is to throw it all away and start over.”
Sounds tempting! Marie Kondo the current conversational AI.
How Big Tech Is Battling To Own The $49B Voice Market
Read this for a general sense of the state of the art in the voice space right now. I particularly enjoyed the part dedicated to the efforts of the big tech companies to own the voice of healthcare. Expect more development in the area.
New KPIs and Metrics for Intelligent Assistants
This report by research firm Opus Research aims to aggregate, normalize, and present core metrics that have proven useful in justifying the procurement and implementation of intelligent assistants. The full report can be purchased by contacting them; this piece covers the index and the key findings, which alone provide value.
Try the Voice Chops Tuesday newsletter. In this week’s Chop I found this article about prison voice prints. It’s been a nice read every week so far.
In the news
Quote of the week
The AI bot must be seen as the customer’s assistant, not the company’s.
Peter Voss
Mari from VoiceFirst Weekly

The ultimate resource in #VoiceFirst. Weekly digest of the most relevant news in the #VoiceFirst ecosystem. Everything from voice assistants to bots including conversational AI and published audio.
