
VUX design 2.0


Voice assistants should change how they interact with you based on how you interact with them and the context you’re in.

For example:

🎧 If I’m walking with my headphones on and I ask Siri, “What was the Boro score at the weekend?”, then I just want the score. Nothing more. Did they win or lose?

🔊 If I’m at home and I ask the same question to my smart speaker, then I have a little more time. You can give me some stats, tell me who scored, etc.

📺 If I’m in the living room and I ask the same question to Alexa on my TV, then I might want to see some highlights, watch some punditry, etc.

This is VUX design 2.0.
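There’s no single API for this today, so purely to make the idea concrete, here’s a minimal TypeScript sketch of the pattern: the same question answered at different depths depending on an assumed device context. The context values and result fields are invented for the illustration; they don’t come from Siri, Alexa or Google Assistant.

```typescript
// A minimal sketch of context-aware response depth. The context values and
// result shape are made up for illustration; no assistant exposes exactly this.
type DeviceContext = "headphones" | "smart speaker" | "tv";

interface ScoreResult {
  headline: string;       // e.g. "Boro won 2-1 at the weekend"
  detail: string;         // scorers, key stats
  highlightsUrl: string;  // video highlights, for devices with a screen
}

function respond(context: DeviceContext, score: ScoreResult): string {
  if (context === "headphones") {
    // On the move: just the score, nothing more.
    return score.headline;
  }
  if (context === "smart speaker") {
    // At home with a little more time: add some detail.
    return `${score.headline}. ${score.detail}`;
  }
  // On a TV: the answer can point at richer, visual content.
  return `${score.headline}. Want to see the highlights? ${score.highlightsUrl}`;
}
```

The point isn’t the code, it’s that the response strategy is chosen from the user’s context rather than being one fixed answer.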

So does anyone have any examples of this happening across the main voice assistants or have you included interactions like this in your experiences?

Creating world-class interactive audio with Jonathan Myers and Dave Grossman


Jonathan Myers and Dave Grossman, founders of Earplay, join us to share how they create world-leading interactive stories. read more

All about voice UX research with Chris Geison


For the first episode of 2019, Dustin and I chat to long-time fan of the show, hardcore Wu-Tang fan and resident UX research expert at AnswerLab, Chris Geison, about the ins and outs of UX research for voice. read more

Emotional intelligence with Sina Kahen



This week, we’re joined by VAICE co-founder Sina Kahen to discuss the importance of emotional intelligence and how you can design EQ into your voice experiences using the six ‘first date’ principles. read more

All about Alpha Voice with Bryan Colligan


This week, we’re finding out how content creators can have their podcasts and YouTube content indexed and searchable on voice, with Bryan Colligan of Alpha Voice. read more

All about Voysis and the GUI to VUI transition with Brian Colcord


Today we’re taking a close look at the Voysis platform and discussing transitioning from GUI to VUI design with VP of Design, Brian Colcord.

We’ve covered plenty of voice first design and development on this podcast. Well, that’s what the podcast is, so we’re bound to! But most of what we’ve discussed has been voice assistant or smart speaker focused. We haven’t covered much voice first application in the browser and on mobile, until now.

Mic check

You’ll have noticed the little mic symbol popping up on a number of websites lately. It’s in the Google search bar, it’s on websites such as EchoSim, and Spotify is trialling it too. When you press that mic symbol, it enables your mic on whatever device you’re using and lets you speak your search term.
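For context on the browser mechanics (a general illustration, not a description of Voysis’s own integration): that mic button typically just triggers speech capture and hands the transcript to a search function. Here’s a rough sketch using the standard Web Speech API, where the element IDs and the runSearch hand-off are invented for the example.

```typescript
// Rough sketch: a mic button that captures a spoken search term in the browser
// via the Web Speech API, then hands the transcript to the site's search.
// "mic-button", "search-input" and runSearch() are invented for this example.
const SpeechRecognitionImpl =
  (window as any).SpeechRecognition || (window as any).webkitSpeechRecognition;

const micButton = document.getElementById("mic-button") as HTMLButtonElement;
const searchInput = document.getElementById("search-input") as HTMLInputElement;

micButton.addEventListener("click", () => {
  const recognition = new SpeechRecognitionImpl();
  recognition.lang = "en-GB";

  recognition.onresult = (event: any) => {
    // Take the top transcription candidate and treat it as the search term.
    const transcript: string = event.results[0][0].transcript;
    searchInput.value = transcript;
    // runSearch(transcript); // whatever performs the search on your site
  };

  recognition.start(); // prompts the browser's mic permission on first use
});
```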

Next time you see that mic, you could be looking at the entry point to Voysis.

On a lot of websites, that mic may well just feed your query into the site’s standard search tool. With Voysis, its engine performs the search for you using its voice tech stack.

That means that you can perform more elaborate searches that most search engines would struggle with. For example:

“Show me Nike Air Max trainers, size 8, in black, under $150”

Most search engines would freak out at this, but not Voysis. That’s what it does.
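The episode notes don’t go into Voysis’s internals, but to illustrate why that sentence defeats a keyword search: a voice-native engine has to resolve it into structured filters it can run against the product catalogue. Something like the following shape, with field names invented purely for the sketch.

```typescript
// Illustrative only: the kind of structured query a voice search engine might
// extract from "Show me Nike Air Max trainers, size 8, in black, under $150".
// These field names are invented; they are not Voysis's API.
interface ProductQuery {
  brand?: string;
  product?: string;
  size?: number;
  colour?: string;
  maxPrice?: number; // in the shopper's currency
}

const parsed: ProductQuery = {
  brand: "Nike",
  product: "Air Max trainers",
  size: 8,
  colour: "black",
  maxPrice: 150,
};

// A keyword engine sees one long string; a voice-native engine can run
// `parsed` as filters against the product catalogue.
console.log(parsed);
```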

Of course, it’s more than an ecommerce search tool, as we’ll find out during this episode.

In this episode

We discuss how approaches to new technology seem to wrongly follow a reincarnation route: turning print into web using the same principles that govern print, turning online into mobile using the same principles that govern the web, and then taking the practices and principles of GUI and transferring them to VUI. We touch on why simply moving your app to voice is the wrong approach.

We also discuss:

  • Voysis – what it is and what it does
  • Getting sophisticated with searches
  • Designing purely for voice vs multi-modal
  • The challenge of ecommerce with a zero UI
  • The nuance between GUI assistants and voice-only assistants
  • How multi-modal voice experiences can help the shopping experience
  • Making the transition from GUI to VUI
  • The similarities between moving from web to mobile and from mobile to voice – (when moving to mobile, you had to think about gestures and smaller screens)
  • Error states and points of delight
  • The difference between designing for voice and designing for a screen
  • Testing for voice
  • Understanding voice first ergonomics

Our Guest

Brian Colcord, VP of Design at Voysis, is a world-leading designer, a cool, calm and collected speaker, and a passionate sneakerhead.

After designing the early versions of the JoinMe brand markings and UI, he was recruited by LogMeIn and went on to be one of the first designers to work on the Apple Watch prior to its release.

Brian has made the transition from GUI to VUI design and shares with us his passion for voice, how he made the transition, what he learned and how you can do it too.

About Voysis

Voysis is a Dublin-based voice technology company that believes voice interactions can be as natural as human ones and is working intently to give brands the capability to have natural language interactions with customers.

Links

Check out the Voysis website
Follow Voysis on Twitter
Read the Voysis blog
Join Brian on LinkedIn
Follow Brian on Twitter
Listen to the AI in industry podcast with Voysis CEO, Peter Cahill
Read Brian’s post, You’re already a voice designer, you just don’t know it yet


My first 30 days as a VUI designer with Ilana Shalowitz and Brian Bauman


Today, we’re getting into detail about what it’s like to be a full-time VUI designer. We’re discussing the details of the role, the day to day duties and the skillsets that are important to succeed in designing voice user interfaces. read more

Hearing voices: a strategic view of the voice first ecosystem with Matt Hartman


We’re getting into detail on the voice first ecosystem; the opportunities, challenges and future, with curator of the Hearing Voices newsletter, Matt Hartman.

This week, Dustin and I are joined by Matt Hartman, partner at Betaworks, curator of the Hearing Voices newsletter and creator of the Wiffy Alexa Skill.

In this episode, we’re discussing:

  • All about Betaworks
  • A strategic vision of the voice first scene
  • Changing user behaviour
  • On-demand interfaces
  • Friction and psychological friction
  • How context influences your design interface
  • The 2 types of companies that’ll get built on voice platforms
  • Differences between GUI and VUI design
  • Voice camp
  • The Wiffy Alexa Skill
  • Lessons learned building your first Alexa Skill
  • Text message on-boarding
  • Challenges in the voice space

Our Guest, Matt Hartman

Matt Hartman has been with Betaworks for the past 4 years and handles the investment side of the company. Matt spends his days with his ear to the ground, meeting company founders and entrepreneurs, searching for the next big investment opportunities.

Matt Hartman, partner at Betaworks

Paying attention to trends in user behaviour and searching for the next new wave of technology that will change the way people communicate has led Matt and Betaworks to focus on the voice first space.

Matt has developed immense knowledge of and passion for voice, and is a true visionary. He understands the current state of play in the voice first space, thinks like a designer, and brings a unique perspective on the voice scene: the voice first ecosystem, voice strategy, user behaviour trends, challenges and the future of the industry.

Matt curates the Hearing Voices newsletter to share his reading with the rest of the voice space and created the Wiffy Alexa Skill, which lets you ask Alexa for the Wi-Fi password. It’s one of the few Skills that receive the fabled Alexa Developer Rewards.

Betaworks

Betaworks is a startup platform that builds products like bit.ly, Chartbeat and GIPHY. It invests in companies like Tumblr, Kickstarter and Medium and has recently turned its attention to audio and voice first platforms such as Anchor, Breaker and Gimlet.

As part of voice camp in 2017, Betaworks invested in a host of voice first companies including Jovo, who featured on episode 5 of the VUX World podcast, as well as Spoken Layer, Shine and John Done, which conversational AI guru, Jeff Smith (episode 4), was involved in.


All about Mycroft with Joshua Montgomery, Steve Penrod and Derick Schweppe


This week, we’re joined by the Mycroft AI team, and we’re getting deep into designing and developing on the open source alternative to Amazon Alexa and Google Assistant. read more

A deep dive into cross-platform voice development with Jan König




Find out all about the Jovo framework that lets you create Alexa Skills and Google Assistant apps at the same time, using the same code! read more