
Report: Google Assistant Beats Rivals When It Comes To Questions Answered And Overall Accuracy

Back in January, Stone Temple Consulting released a virtual assistant consumer survey showing that a majority of respondents wanted assistants to provide “answers” rather than conventional search results. The company has now released a follow-up study measuring the relative accuracy of the four major assistants.

The study compared results for “5,000 different questions about everyday factual knowledge” across Google Home, Alexa, Cortana and Siri, using traditional Google search results as the baseline for accuracy. The table below shows the study’s top-line results.

(Apple is reportedly “finalizing” its Amazon Echo competitor.)

Here’s Stone Temple Consulting’s summary of the outcome:

Google still has the clear lead in terms of overall smarts with both Google search and the Google Assistant on Google Home. Cortana is pressing quite hard to close the gap, and has made great strides in the last three years. Alexa and Siri both face the limitation of not being able to leverage a full crawl of the web to supplement their knowledge bases. It will be interesting to see how they both address that challenge.

One interesting observation in the report concerns featured snippets. Cortana integrated more featured snippets than any of the other assistants, including Google Home, although Google search itself surfaced more. Siri and Alexa lagged far behind in this category, even as both look to third parties to deliver “answers” and transactional capabilities.

There’s a good deal more discussion of both the results and the study’s methodology on the Stone Temple blog.

Source – Greg Sterling
