Open Voice #109: Voicecast & Lunch with NPO learnings, 5 Levels of Conversational AI, Massive AI and "The stationary smart speaker space was a mirage"

By Maarten & Sam • Issue #109
Hi {{first_name}},
Every week we collect the developments in the Voice domain into a single list of things you shouldn’t have missed. But before we dive into the news, here’s more on the other channels we work on.
🏝️ Holiday schedule
Maarten will be off for the next few weeks. No worries, everything will continue as usual. We haven’t missed a Monday since we started, and since there are two of us, there’s no need to change the schedule for now 😄
🎧 Voicecast
Check out the latest episode of the Voicecast with Sarah van der Land, Senior Digital Innovation Advisor at NPO, on the research findings from the voice projects the Dutch broadcasters have been running. Later today you can listen to this episode in your browser, or better yet: subscribe on iTunes or Spotify.
🥪 VoiceLunchNL: More NPO
This Wednesday from 12 to 1, Sarah will also be our guest at the #voicelunchNL. We’ll have a bit more time to hear her story about the NPO voice learnings and, even better, to discuss her findings at the informal lunch table.
**Join via Zoom**
Now, on with the news!

🤴🏼 Azeem Azhar: Massive AI
In this week’s newsletter, Azeem shares a bit on the mega AI models being trained. Awesome progress, but is it what we want? The platforms took over the ad business and are more powerful than many nation states. UK initiatives like this one are important but lag behind movements like Massive AI.
Google releases a language model with 600 billion parameters that can be efficiently trained to achieve “superior quality” translations from 100 languages into English.

Crazy growth in the complexity of these language models. The Allen Institute’s ELMo clocked in at 93.6m parameters in February 2018. Two years later, Microsoft astounded us with Turing-NLG, and that weighed in at 17bn parameters. Four months on, Google has pushed the scale a further 36 times. In less than two years, the parameter size of the biggest language models has increased 6,000 fold.

Google chucked 2048 optimised TPU-cores for four days to train this model, which is a lot. There is some evidence of diminishing returns, but these more complex models are also capable of quite sophisticated activities. (Check this accessible video of OpenAI’s GPT-3 language model writing pretty decent Python code.) My take is that we’re just nudging the start of this s-curve of size for these models. The processors are getting more powerful, and cheaper, and the returns from improved performance are as great as ever. Throw in a pinch of joshy rivalry between research groups, focusing on size rather than elegance, and it’s a recipe for unchecked growth.
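The growth figures in the quote above are easy to sanity-check with a bit of arithmetic. This is just a back-of-the-envelope sketch: the parameter counts come from the quote itself, while the variable names are ours.

```python
# Parameter counts as quoted in the newsletter above.
elmo = 93.6e6     # Allen Institute's ELMo, February 2018
turing_nlg = 17e9 # Microsoft's Turing-NLG, two years later
google = 600e9    # Google's 600bn-parameter translation model, four months on

# "pushed the scale a further 36 times" -> roughly 600bn / 17bn
print(f"Google vs Turing-NLG: ~{google / turing_nlg:.0f}x")

# "increased 6,000 fold" in under two years -> roughly 600bn / 93.6m
print(f"Google vs ELMo: ~{google / elmo:,.0f}x")
```

Both quoted figures check out as round numbers: about 35x over Turing-NLG and about 6,400x over ELMo.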
Tom Hewitson
"You never needed anybody’s approval to build a website, or to invent a new use for the internet... To accelerate progress on conversational AI we need the same ingredients: open source tools and the ability to try new things without asking for permission." 👈 This https://t.co/G0MniKYSzp
🎙️ More audio tips
What’s in your playlist for this summer? Any suggestions to share? Reply and let me know; I’m happy to include them in the list. Maarten suggests listening to Jon Stine from the Open Voice Network (a confusingly similar name, but we’re not related). Jon gave an interview on The Artificial Podcast about closing the user trust gap. Listen to it here.
We hope you enjoyed the links we collected. That’s it for this week! We’ll be back next Monday with a new edition of the Open Voice newsletter.
Cheers,
— Maarten & Sam
If you don't want these updates anymore, please unsubscribe here.
If you were forwarded this newsletter and you like it, you can subscribe here.
Powered by Revue