In the same way that retailers have learned to craft metadata for search engine optimization, they now need to arrange data effectively for voice searches, which are more conversational and involve more words than text searches.

Christel Grizaut Billault, vice president of marketing, Akeneo


The Amazon Echo has been projected to sell 60 million units in 2022, a ten-fold increase from 2016. Many Americans would not be surprised to hear that. From their tablets, smartphones, and computers, they have eagerly embraced digital voice-search assistants like Amazon Alexa, Apple's Siri, and Microsoft's Cortana to do everything from getting directions to researching products.

As households change their search behavior, shifting from typing keywords to using voice queries, making purchases with voice assistants is becoming part of people's everyday routines. According to a recent study by NPR and Edison Research, 57 percent of those surveyed have ordered items using smart speakers, and 26 percent have used them regularly to add to shopping lists.

Now businesses are realizing they need to adjust their product data to fit Internet searches made through voice assistants. In the same way that retailers have learned to craft metadata for search engine optimization, they now need to arrange data effectively for voice searches, which are more conversational and involve more words than text searches.


As retailers evaluate their product data, there are a few areas they should pay particular attention to when composing product information:

Understand the Context

Remember that voice shoppers often multitask while making a query. Unlike text searches, where shoppers must have their hands on keypads and eyes on a screen, voice assistants allow shoppers to be doing something else, like making dinner.

Suppose someone preparing for a dinner party asks a voice assistant, “What’s the best red wine to pair with ribeye steak?” A smart retailer would include, in its wine product descriptions, usage information about the types of food that pair well with each product.

A shopper could also be driving, working out or doing laundry while making a voice query. When sorting clothes, a consumer might ask a voice assistant, “Which products are best at getting grass stains out of jeans?” Detergent makers would want to make this information available in their product details. As a general rule, it’s good to think in advance about what situations may occur during voice queries so you can craft product details that will specifically fit those scenarios.
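As a rough sketch of this idea, a product feed could carry usage context alongside the basics so a voice assistant has something to match against. The field names below (`pairs_with`, `usage_scenarios`) and the matching logic are illustrative assumptions, not any particular platform's schema:

```python
# Sketch of a product record enriched with usage context for voice queries.
# Field names like "pairs_with" and "usage_scenarios" are illustrative only.

wine = {
    "name": "Example Cabernet Sauvignon 2019",
    "category": "red wine",
    "description": "A full-bodied Cabernet with firm tannins.",
    # Usage context an assistant could surface for a question like
    # "What's the best red wine to pair with ribeye steak?"
    "pairs_with": ["ribeye steak", "grilled lamb", "aged cheddar"],
    "usage_scenarios": ["dinner party", "steak night"],
}

def matches_query(product: dict, query: str) -> bool:
    """Naive check: does any pairing attribute appear in the spoken query?"""
    q = query.lower()
    return any(food in q for food in product["pairs_with"])

print(matches_query(wine, "What's the best red wine to pair with ribeye steak?"))
```

A real retrieval system would use far richer matching, but the point stands: the pairing information has to exist in the product data before any assistant can surface it.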

Use Conversational Language

People speak to voice assistants in everyday conversational language just as they would if talking to a friend. This is quite different from the concise, to-the-point language used when typing a query into a traditional search engine. For instance, when you do a standard Google keyword search, you might just write “best restaurants in Los Angeles.” But when using Amazon Alexa, you might say, “Alexa, where can I find the best restaurants in Los Angeles?”


Because of this, website content and product data need to be structured to meet natural-language machine-learning requirements. These types of data updates allow shoppers to ask common questions in conversational tones, using words and phrases that are much longer than text searches: voice queries typically run six to 10 words in length and usually start with who, how, what, where, why and when.
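One common way to structure page content around conversational questions is schema.org FAQPage markup embedded as JSON-LD, which the example below builds in Python. The question and answer text are hypothetical; the `@type` and `mainEntity` properties are part of the real schema.org vocabulary:

```python
import json

# Minimal schema.org FAQPage markup, built as a Python dict and serialized
# to JSON-LD. The question/answer content here is illustrative only.
faq = {
    "@context": "https://schema.org",
    "@type": "FAQPage",
    "mainEntity": [{
        "@type": "Question",
        "name": "Where can I find the best restaurants in Los Angeles?",
        "acceptedAnswer": {
            "@type": "Answer",
            "text": "Our Los Angeles guide lists top-rated restaurants "
                    "by neighborhood and cuisine.",
        },
    }],
}

# A page would embed this inside <script type="application/ld+json">...</script>.
print(json.dumps(faq, indent=2))
```

Phrasing the `name` field as a full spoken question, rather than a terse keyword string, is what aligns the markup with the longer, conversational queries voice assistants pass along.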

Optimize Product Descriptions for Different Stages of the Decision-Making Process

The choice of words used in questions to voice assistants offers incredible insights into the customers’ purchase intent. If a shopper asks, “What is the difference between the iPhone X and Samsung Galaxy S9?” that probably means that person is still in the research phase and is looking for a comparison across technical specifications and attributes. But a person who asks, “Where can I find an iPhone X?” is likely ready to pull out her wallet at any moment.

Retailers can use these types of language distinctions to create highly targeted product descriptions that reach shoppers at each stage of the decision-making process. You could even add a “Why You Should Buy” section to your product listing to convince those who haven’t yet made up their minds.
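The word-choice distinction above can be sketched as a crude heuristic. A production system would use a trained natural-language model; the cue phrases here are illustrative assumptions:

```python
# Crude intent heuristic for the distinction described above: comparison
# phrasing suggests research, "where can I find/buy" suggests readiness
# to purchase. Cue lists are illustrative, not exhaustive.

RESEARCH_CUES = ("what is the difference", "which is better", "compare")
PURCHASE_CUES = ("where can i find", "where can i buy", "order")

def classify_intent(query: str) -> str:
    q = query.lower()
    if any(cue in q for cue in RESEARCH_CUES):
        return "research"
    if any(cue in q for cue in PURCHASE_CUES):
        return "ready-to-buy"
    return "unknown"

print(classify_intent("What is the difference between the iPhone X and Samsung Galaxy S9?"))
print(classify_intent("Where can I find an iPhone X?"))
```

A retailer could route "research" queries to comparison content and "ready-to-buy" queries to availability and pricing details.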


The advent of voice assistants offers retailers a new channel for selling, and therefore another place that requires comprehensive, up-to-date, and optimized product information. This is where a dedicated PIM (product information management) tool can come in particularly handy. These tools tailor and optimize product information for different channels, such as websites, print catalogs and data feeds for partners. They can help make sure that customers searching the Internet through voice assistants get the information they need when they need it.
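The channel-tailoring idea behind a PIM tool can be sketched as one master product record rendered differently per channel. The function and field names below are hypothetical, not any specific vendor's API:

```python
# Sketch of per-channel rendering from a single master product record.
# Field and function names are illustrative, not a real PIM vendor's API.

master = {
    "name": "All-Purpose Laundry Detergent",
    "short_description": "Tough on stains.",
    "long_description": "Removes grass, grease, and wine stains in cold water.",
    # Direct, conversational answers suit a voice channel.
    "voice_answers": {"grass stains": "Yes, it lifts grass stains from denim."},
}

def render(product: dict, channel: str) -> str:
    if channel == "print_catalog":
        return product["short_description"]   # space-constrained
    if channel == "website":
        return product["long_description"]    # room for detail
    if channel == "voice":
        return next(iter(product["voice_answers"].values()))
    raise ValueError(f"unknown channel: {channel}")

print(render(master, "voice"))
```

Keeping one authoritative record and deriving each channel's copy from it is what prevents the voice channel from drifting out of date as product details change.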

Failing to account for the rise in voice-conducted Internet searches could result in needlessly lost sales. Fifty percent of all Internet searches will be conducted by voice by 2020, according to estimates by Comscore. Make sure your business doesn’t miss this stream of revenue by taking action now to update your product information to fit the new technology.

Based in France, Akeneo provides open source product information management software.
