Loup Ventures recently tested the three major digital assistants. Google Assistant, Amazon Alexa, and Apple Siri were each asked the same 800 questions and evaluated on their ability to understand each question and answer it correctly. The results? Google Assistant understood all 800 questions and answered 93% correctly. Siri understood 99.8% of the questions and answered correctly 83.1% of the time. Alexa nearly matched Google Assistant's perfect comprehension with a score of 99.9%, but it fared worst on accuracy, scoring 79.8%.
All three digital assistants improved on their scores from last year, when Google Assistant answered 86% of the questions correctly, Siri was right 79% of the time, and Alexa gave the correct answer 61% of the time. The report, written by analysts Gene Munster and Will Thompson, notes an important factor: this test, like last year's, measured how the digital assistants respond on a smartphone rather than a smart speaker. That matters because the presence of a screen can change how an assistant responds compared to a speaker. Because the test was run on phones, the questions could be shorter, and the screen allowed the assistants to answer some questions without having to announce the answer aloud.
Google Assistant is the best digital assistant
The questions were divided into five categories: Local, Commerce, Navigation, Information, and Command. Google Assistant earned the best score in each category except Command. That last group of questions covered phone-related tasks such as email, text messages, calendar, and music, and it was led by Siri, which beat Google Assistant 93% to 86%. Siri also came in second in the Local category ("Where is the nearest bookstore?") and in Navigation ("How do I get downtown on the subway?"), while Alexa took second place in Information ("What time do the Yankees play tonight?"). Alexa, given its Amazon lineage, might have been the favorite in the Commerce category, but it still failed to beat Google Assistant there. Alexa also faces the disadvantage of running as an app rather than as a native function of the operating system, which held it back in the Command category.
Loup Ventures conducted the test on an iPhone running iOS 12.4 and a Pixel XL running Android 9 Pie; Alexa was tested through its iOS app. Interestingly, although Alexa finished last in accuracy, Amazon's digital assistant improved by 18 percentage points over the last 13 months, the best gain of the three. Google Assistant's score rose 7 percentage points over the same period, while Siri improved by 5 percentage points.
Since the first test, Google Assistant and Siri have improved most in the Commerce category. Alexa, which was not part of the first test, also saw its biggest improvement over the last 13 months in Commerce.
(Via: Loup Ventures)