New Research Shows Which Digital Assistants Actually Know Stuff

According to a report from Edison Research, more than 51 million Americans now own a “smart speaker” like the Amazon Echo or Google Home. These voice-activated devices are being adopted faster than smartphones were a decade ago. And speaking of smartphones, they have digital assistants onboard too, including Apple’s Siri, Google Assistant, and Microsoft’s Cortana (also accessible on Xbox and other devices).

All told, we are positively SURROUNDED by digital assistants, each practically begging to help us learn and be more productive. But can they actually accomplish that, consistently?

Digital assistants are about as useful as a James Harden beard trimmer if they can’t answer the questions we want answered, right? This is why I am shocked, awed, and in love with the second annual study from Stone Temple that exhaustively tested which digital assistants are best at answering questions.

I recently interviewed Stone Temple CEO Eric Enge about the study to learn how they conducted it and what they found. My full interview is below, and it’s worth the watch. Highlights follow.

How to Test Digital Assistants

Turns out, there are no shortcuts for figuring out which digital assistant can actually assist. Eric’s team at Stone Temple methodically asked 4,942 questions of each of five assistants: Alexa, Siri, Google Assistant on a smartphone, Google Assistant on the Google Home, and Microsoft’s Cortana running on the Harman Kardon Invoke speaker. Yes, that’s 24,710 separate queries! This took a LOT of labor.

For each question, the team noted whether the response was accurate or inaccurate. They also noted when the assistant didn’t understand the inquiry, and whether the response was delivered verbally by the device, pulled from a database, or sourced from the web.

Which Is the Best Digital Assistant?

According to the research, the best performer in 2018 is Google Assistant on a smartphone. This may not be a huge shock, given that Google has access to an unfathomable array of information and routinely handles billions of user queries. This digital assistant attempted to answer almost 80 percent of all questions, producing far fewer of the frustrating “I don’t understand what you mean” replies than its rivals.

And, among questions answered, Google’s accuracy rate exceeded 90 percent.

In comparison, Cortana attempted to answer slightly more than 60 percent of the questions, with Alexa at slightly more than half, and Siri at just over 40 percent.

When the assistants offered answers, accuracy rates were more closely grouped. Google Assistant on a smartphone leads at more than 95 percent, with Google Assistant on the Google Home and Microsoft’s Cortana close behind. Alexa tops 80 percent, and even Siri gets it right about 80 percent of the time (when it actually has an answer at all).

Sometimes, the answers provided by a digital assistant are flat-out wrong. This is most likely to occur with Alexa and Siri: each gave more than 160 incorrect answers, compared to fewer than 40 for Google and Microsoft. Note, however, that Google and Microsoft each own an enormous search engine, which likely helps their answer matching considerably.
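Because the attempt rate and the accuracy rate use different denominators, they are easy to mix up. Here is a minimal sketch in Python, using illustrative tallies rather than Stone Temple’s raw data, of how the two figures relate:

```python
# A minimal sketch of the study's two headline metrics. The attempt
# rate is measured against ALL questions, while the accuracy rate is
# measured only against the questions an assistant actually attempted.
# The tallies below are illustrative, not Stone Temple's raw data.

TOTAL_QUESTIONS = 4942  # questions asked of each assistant in the study

# Hypothetical tallies, roughly in the ballpark of the cited
# Google-Assistant-on-smartphone figures.
attempted = 3900  # questions the assistant tried to answer
correct = 3700    # attempted answers judged accurate

attempt_rate = attempted / TOTAL_QUESTIONS  # ~0.79, "almost 80 percent"
accuracy_rate = correct / attempted         # ~0.95, the conditional rate

print(f"Attempt rate:  {attempt_rate:.1%}")   # Attempt rate:  78.9%
print(f"Accuracy rate: {accuracy_rate:.1%}")  # Accuracy rate: 94.9%
```

This is why an assistant can post an accuracy rate above 90 percent even while declining to answer a fifth of what it is asked.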

We Ask Digital Assistants Pretty Dumb Stuff (Today)

Today, in these early days, the questions we ask our digital assistants are fairly basic and banal. (That was NOT the case in Stone Temple’s test, where many of the nearly 5,000 questions were tricky.) Most of us primarily use these devices to check the weather, learn sports scores, retrieve general knowledge, or set timers.

In our conversation, Eric and I discussed this situation, and we believe it to be temporary: a snapshot in time. As humans get more comfortable with voice-activated queries and replies, our use of these digital assistants will become more nuanced and complex.

To my eye, this mirrors the early days of search engines, when people typically typed very short search strings into Lycos and its peers. As comfort with online search grew, and the quality of search results improved, we began using longer and longer queries.

Over time, these digital assistants will improve, and our use of them will correspondingly become more comprehensive.

Voice Is a Huge Content Marketing Opportunity

In addition to their digital assistants study, Eric and his team have also created “skills” for Alexa and Google Assistant that let you ask those assistants questions about search engine optimization and get answers served up from Stone Temple. On Alexa, they even have an SEO quiz you can take instantly. Brilliant!
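If you’re curious what one of these voice skills looks like under the hood, here is a minimal sketch of a custom Alexa skill handler, assuming the ask-sdk-core Python package. The intent name “SeoQuestionIntent,” the “topic” slot, and the canned answer lookup are hypothetical illustrations, not Stone Temple’s actual skill code:

```python
# A minimal sketch of a custom Alexa skill handler, assuming the
# ask-sdk-core Python package. The intent "SeoQuestionIntent", the
# "topic" slot, and the ANSWERS lookup are hypothetical illustrations.
from ask_sdk_core.skill_builder import SkillBuilder
from ask_sdk_core.dispatch_components import AbstractRequestHandler
from ask_sdk_core.handler_input import HandlerInput
from ask_sdk_core.utils import is_intent_name
from ask_sdk_model import Response

# Hypothetical mini knowledge base mapping SEO topics to spoken answers.
ANSWERS = {
    "nofollow": ("A nofollow attribute tells search engines not to pass "
                 "link equity through a specific link."),
}

class SeoQuestionHandler(AbstractRequestHandler):
    """Answers utterances like 'Ask <skill name> about nofollow.'"""

    def can_handle(self, handler_input: HandlerInput) -> bool:
        return is_intent_name("SeoQuestionIntent")(handler_input)

    def handle(self, handler_input: HandlerInput) -> Response:
        # Pull the user's topic out of the intent's slot, if present.
        slots = handler_input.request_envelope.request.intent.slots or {}
        topic_slot = slots.get("topic")
        topic = (topic_slot.value or "").lower() if topic_slot else ""
        speech = ANSWERS.get(topic, "I don't have an answer for that yet.")
        return handler_input.response_builder.speak(speech).response

sb = SkillBuilder()
sb.add_request_handler(SeoQuestionHandler())

# Entry point when the skill is hosted on AWS Lambda.
lambda_handler = sb.lambda_handler()
```

The same handler pattern scales to a full knowledge base; a production skill like Stone Temple’s presumably draws its answers from a much larger library of published SEO content.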

Eric reports that the company is getting visibility and usage from this voice-activated advice. He said:

“On the Google Assistant, they have a mode called implicit queries, and if you check the box when you set up your device that you want that, somebody can ask Google a question without invoking our specific actions. They might just say, ‘How do you implement a no-follow tag?’ Google might come back and say, ‘Stone Temple has an answer for that, do you want to hear it?'”

To date, Eric says more than 1,000 people have interacted with the Stone Temple SEO advice via implicit queries on Google Assistant.

Impact of Digital Assistant Data on Traditional SEO Rankings

I’m fascinated by Eric’s foray into voice-activated SEO advice and want to work on some of my own. “Alexa, ask Jay Baer about tequila!”

Given that Google and Microsoft have major stakes in the digital assistants battle, I wondered whether being a “source” of information for those devices (as Stone Temple is for SEO information) might “bleed over” and positively influence search rankings on Google and Bing. I asked Eric about that, and he replied:

“No evidence of benefits to date, and I think it’s too early for that to have happened at this point. But it definitely isn’t going to hurt, and if you’re delivering reliable information, and people are asking for you to give them answers, that’s a topic authority signal that search engines could mine.”

Grab a copy of Stone Temple’s personal digital assistants study, and start thinking about your own foray into voice-activated knowledge. And if you can, take a few minutes to watch my interview with Eric above, or read the transcript below. Good stuff in there.

Transcript

Jay Baer:

One of the things you have in the study (and again, it’s called Rating the Smarts of Digital Personal Assistants in 2018; you can get it on the Stone Temple website, stonetemple.com) is a list of the question types, not necessarily the ones you asked in the study, although you mention those as well, but what people ask of these assistants generally speaking. It shows that a lot of the questions today are somewhat banal: “What’s the weather going to be tomorrow?” I’m certainly guilty of that. I use my Alexa for that all the time, even though I’ve got multiple other ways to check tomorrow’s weather; it’s just easier. Do you feel like, over time, as humans become more comfortable with this technology, perhaps more trusting of it, the types of questions we ask will change?
