Friday, March 3, 2017

How Artificial Intelligence Is Quietly Changing How You Shop Online - TIME Business

Posted: 01 Mar 2017 05:00 AM PST

The writer William Gibson once said that the future is here, just not evenly distributed. That was the case with the World Wide Web 20 years ago, when some business models – notably e-commerce and new media – took off faster than others. Now a similar trend is happening with artificial intelligence, or AI.
The promise of AI has seemingly been just on the horizon for years, with little evidence of change in the lives of most consumers. A few years ago, buzzwords like “big data” hinted at the potential but ended up generating little actual impact. That’s now changing, thanks to advances in AI like deep learning, in which software programs learn to perform sometimes complex tasks without active oversight from humans.
Deep learning algorithms have been powering self-driving cars and making quick progress in tasks like facial recognition. Now these innovations are beginning to find their way into the daily lives of consumers as well.
“For retail companies that want to compete and differentiate their sales from competitors, retail is a hotbed of analytics and machine learning,” says John Bates, a product manager with Adobe Marketing Cloud, which offers machine learning services in e-commerce and other industries. As with the early Web, travel and entertainment are also making early use of machine learning, according to Bates.
A spate of recent experiments and announcements underscore the trend in e-commerce. One notable example is Pinterest Lens, a Shazam-like service that conducts visual searches based on items in the everyday world. Just point your camera at, say, a piece of furniture or item of clothing and Lens can help you find it online.
Lens builds on earlier Pinterest innovations like Related Pins and Guided Search — both based on the idea that you sometimes don’t know what you’re looking for until you see it — as well as a visual search tool that can find similar images inside the billions of pins that Pinterest has collected. Related pins are served up according to a similarity score that Pinterest’s algorithms assign between images.
Lens extends that earlier search tool beyond images to objects in the real world. Image-detection programs identify an object, and visual search digs up similar images, making it easier to buy a coveted item online. The potential for this kind of product-search innovation is interesting: you can search for things that won’t fit in a standard search box, and more tightly connect things found offline with those found online.
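In broad strokes, this kind of visual search can be sketched as two steps: embed each image with a pretrained network, then rank catalog images by cosine similarity to the query photo. The Python sketch below only illustrates that general technique, it is not Pinterest’s actual pipeline; the choice of ResNet-50 as the feature extractor and the file names are assumptions.

    # Minimal visual-similarity sketch (illustrative only, not Pinterest's system).
    # Assumes a pretrained CNN as a feature extractor and a small in-memory catalog.
    import torch
    import torchvision.models as models
    import torchvision.transforms as T
    from PIL import Image

    # Pretrained ResNet with its classification head removed, used as an embedder.
    backbone = models.resnet50(pretrained=True)
    backbone.fc = torch.nn.Identity()   # keep the 2048-d pooled features
    backbone.eval()

    preprocess = T.Compose([
        T.Resize(256), T.CenterCrop(224), T.ToTensor(),
        T.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
    ])

    def embed(path):
        """Return an L2-normalized embedding for one image file."""
        img = preprocess(Image.open(path).convert("RGB")).unsqueeze(0)
        with torch.no_grad():
            vec = backbone(img).squeeze(0)
        return vec / vec.norm()

    def most_similar(query_path, catalog_paths, top_k=5):
        """Rank catalog images by cosine similarity to the query photo."""
        query = embed(query_path)
        scored = [(p, float(torch.dot(query, embed(p)))) for p in catalog_paths]
        return sorted(scored, key=lambda pair: pair[1], reverse=True)[:top_k]

    # Example: point the "camera" (a photo) at a chair and search a tiny catalog.
    # print(most_similar("snapshot_of_chair.jpg", ["sofa.jpg", "armchair.jpg", "lamp.jpg"]))

At the scale of billions of pins, the catalog embeddings would be precomputed and queried through an approximate nearest-neighbor index rather than scored one by one as above.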
“For shopping specifically, improvements to online discovery means new ways to find products you’re interested in but may not have the words for,” says Andrew Zhai, an engineer working on Pinterest’s visual search. “Visual discovery gives people a way to discover new brands and ways of styling that they never knew existed.”
Other e-commerce sites are also adopting deep learning to help shoppers more easily find what they seek. Gilt deploys it to search for similar items of clothing with different features like a longer sleeve or a different cut. Etsy bought Blackbird Technologies last fall to apply the firm’s image-recognition and natural-language processing to its search function.
And notably, Amazon is planning to use the AI technology it offers through Amazon Web Services in its new Amazon Go grocery stores. The company is operating only one store in Seattle, but Chief Financial Officer Brian Olsavsky said during a February earnings call that “it’s using some of the same technologies you would see in self-driving cars: computer vision, sensor fusion, deep learning.”
Adobe, meanwhile, is taking things a step further by letting people create images of their desired products. Working with researchers at UC Berkeley, Adobe developed an image-editing tool that can turn a crude sketch of a handbag or shoe into a photorealistic and easily tweakable image. The tool draws on a large database of related images to turn sketches into pictures.
Adobe’s marketing tools are also incorporating deep learning into offerings for retailers, using AI to predict customer behavior. Shoppers can choose to receive suggestions based on their shopping lists – a belt that matches a pair of pants, painting supplies to go with a can of paint, a wine paired with a dinner recipe. Programs can subtly nudge people along when they are making a big-ticket purchase online but are not ready to hit the buy button.
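To make the idea of list-based suggestions concrete, here is a toy sketch: a hand-written table of complementary items and a lookup over the current shopping list. The pairings and function names are invented for illustration; tools like Adobe’s learn such relationships from behavioral data rather than a fixed table.

    # Toy complement-suggestion sketch (illustrative only; the pairing table is
    # invented and this is not how Adobe's marketing tools are implemented).
    COMPLEMENTS = {
        "pants": ["belt"],
        "paint": ["brushes", "painter's tape", "drop cloth"],
        "dinner recipe": ["wine pairing"],
    }

    def suggest(shopping_list, complements=COMPLEMENTS):
        """Suggest add-on items for things already on the shopping list."""
        suggestions = []
        for item in shopping_list:
            for extra in complements.get(item, []):
                if extra not in shopping_list and extra not in suggestions:
                    suggestions.append(extra)
        return suggestions

    print(suggest(["pants", "paint"]))
    # ['belt', 'brushes', "painter's tape", 'drop cloth']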
Subtlety is a key part of these AI-powered marketing tools. People can grow alienated if they feel retailers are snooping on their behavior, or if a suggestion comes across as a hard sell. Adobe’s AI draws on past behavior as well as trial and error to learn how to make a gentle nudge without being too pushy.
“That’s a bit of the art and science behind deep learning,” says Bates. “But that’s where a lot of these signals can be built into the algorithms. If it creates an unnatural signal or puts someone off, it can be built into the training itself.”
While deep learning is becoming a part of the retail experience, it’s happening in fits and starts, as Facebook found with chatbots. Touted as a tool that could automate customer-service functions and deepen human engagement, chatbots were added to Facebook Messenger, with more than 11,000 of them available last year. But last week, Facebook scaled back its chatbot ambitions after they clocked a 70% failure rate.
As with the early days of the Web, there remains much work to do before deep learning is seamlessly integrated into the daily lives of consumers. Still, things are a lot farther along than many expected even a few years ago. And that suggests Silicon Valley may again be ready to change how we shop.
