Give or take our own minor frustrations, the Google Assistant is generally accepted as the best voice assistant on the market today. While public perception plays a critical role, it’s also important to back that feeling up with numbers. For the third year in a row, the Google Assistant has been tested and shown to answer more queries correctly than both Siri and Alexa.
Every year, Loup Ventures performs an in-depth test of the major voice assistants available on mobile phones. This year, Google Assistant, Siri, and Alexa were each graded on two measures: how well they understood what was said, and whether they gave a “correct” answer. To do this, Loup Ventures asks each voice assistant the same 800 queries, broken down into five categories.
Just like last year, the Google Assistant scored a perfect 100% on the ability to at least understand what was said, while Siri and Alexa followed closely behind at 99.8% and 99.9%, respectively. It surely won’t be long before we no longer need to question whether a voice assistant can understand you perfectly. Google itself has been making strides toward extending that ability to larger groups of people, with efforts like Project Euphonia.
Where this competition really becomes fierce is in whether the digital assistant gives you the correct answer. The Google Assistant has visibly improved over the last year, scoring an impressive 92.9%, about 7 percentage points higher than in the last test. However, the biggest improvement was seen by Alexa, which jumped from 61.4% to 79.8%, an impressive feat for a third-party app.
Looking at the breakdown by category, it’s clear that, even on a Pixel phone, the Google Assistant can’t control your device as well as Siri can on iOS devices.
Google Assistant was the top performer in four of the five categories but fell short of Siri in the Command category again. Siri continues to prove more useful with phone-related functions like calling, texting, emailing, calendar, and music. Both Siri and Google Assistant, which are baked into the OS of the phone, far outperformed Alexa in the Command section.
While there’s a clear trend toward assistants soon answering 100% of queries correctly, Loup cautions that their tests only cover the primary use cases for a digital assistant. Outside of these core uses, digital assistants are “not generally intelligent.”