The Rise of the Machines

“Skynet begins to learn at a geometric rate. It becomes self-aware at 2:14 a.m. Eastern time, August 29th. In a panic, they try to pull the plug.”

While we may still be a few years away from the robot-filled world depicted in Terminator 2, artificial intelligence (AI) is already a part of our everyday life in other, subtler ways. In search, for instance, it already plays a vital role for both Google and Microsoft.

Artificial Intelligence

Did you know that 1 out of 5 questions asked on Google has never been asked before? Since Google processes over 2 trillion search queries each year, that means there are more than 400 billion search queries entered each year for which Google has never before served up search results. That is an almost inconceivable amount of data for Google's algorithms and engineers to process. With each passing day, it becomes more difficult to accurately interpret this vast amount of new data and serve relevant results. Enter artificial intelligence. While many company boards may still consider a presentation on the benefits of AI a futuristic distraction, AI has been part of the standard toolset at Google since October 2015.

Google began working with AI by applying a deep learning approach to reinvent its search activity. This approach uses deep neural networks that approximate the web of neurons in the human brain, expanding a machine's capacity for learning tasks. The resulting system, nicknamed “RankBrain,” uses AI to process incoming queries so the computers can understand and respond to them.
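RankBrain's internals have never been made public, but the basic idea of a neural network scoring query-page relevance can be sketched in a few lines. The following is a minimal, illustrative toy, not Google's method: the feature vectors, weights, and the `relevance_score` function are all invented for the example.

```python
import math
import random

def relevance_score(query_vec, page_vec, w_hidden, w_out):
    """Score a query-page pair with a tiny two-layer feedforward network.

    The input is the concatenation of simple numeric feature vectors for
    the query and the page; the output is a relevance score in (0, 1).
    """
    x = query_vec + page_vec
    # Hidden layer: weighted sums passed through tanh activations,
    # loosely mimicking the web of neurons described above.
    hidden = [math.tanh(sum(w * xi for w, xi in zip(row, x))) for row in w_hidden]
    # Output layer: a single logistic unit squashes the score into (0, 1).
    z = sum(w * h for w, h in zip(w_out, hidden))
    return 1.0 / (1.0 + math.exp(-z))

# Randomly initialized weights stand in for the learned parameters a
# real system would acquire from training on billions of queries.
random.seed(42)
w_hidden = [[random.uniform(-1, 1) for _ in range(6)] for _ in range(4)]
w_out = [random.uniform(-1, 1) for _ in range(4)]

score = relevance_score([0.2, 0.9, 0.1], [0.8, 0.3, 0.5], w_hidden, w_out)
print(round(score, 3))  # a value between 0 and 1
```

In a production system the weights would be learned from historical click and ranking data rather than drawn at random; the point here is only the shape of the computation.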

In the early stages, RankBrain certainly lived up to its hype. In an experiment that pitted the engineers against the new system, RankBrain had an 80% success rate (humans had 70%) when tasked with identifying which sample pages Google’s search engine technology would rank on top. Today, Facebook and Microsoft have also incorporated similar AI systems, with Facebook filtering the news feeds that comprise users’ homepages and Microsoft expanding the capabilities of its search engine.

Virtual Reality

While AI and machine learning are creating rapid, behind-the-scenes changes in search, the question remains whether consumer interaction with search will also take a “futuristic” twist. All signs indicate that we are not far away from such a development. The holographic shark that appears to eat Marty in Back to the Future Part II is now a very real possibility; and while holographic sharks may not seem directly related to search, the connection becomes more obvious when you think about the opportunity for consumers to interact with search using holograms via tools like Microsoft’s HoloLens. (Expect Google Glass to be resurrected with technology similar to what HoloLens is using.)

Voice Search

AI can integrate with technology like HoloLens in a number of ways, including using deep learning to identify voice commands directed toward Android phones and to recognize images of faces posted on Google+. Microsoft already uses AI-based neural networks in a new Skype tool that instantly translates one language into another. Voice search and voice communication are evolving rapidly across a variety of platforms, including Cortana, Amazon’s Echo, and Google Home. There’s no doubt that we are moving toward a scenario in which our communication with machines can be driven completely via a verbal interface. It’s only a matter of time until we are regularly interacting with search results using a voice-activated, holographic interface that allows us to swipe through PLAs (product listing ads) or sponsored product results while walking down the street or standing in line at the coffee shop. Taking the concept one step further, is it all that improbable to think that these machines will evolve to a point where they can read our minds and serve relevant search results by tapping into our neurons? Supercomputers can already mimic the networks of neurons inside our heads, creating systems capable not only of analyzing information but also of learning.

The question we have to ask ourselves is whether we are going to need to start using AI in our everyday agency lives or allow ourselves to be replaced by machines. Of course, we won’t all be replaced by machines, but it’s very possible – even probable – that we will soon see an increase in digital assistants that become part of our everyday search life. Think of the digital assistants that most of us already use every day: Microsoft’s Cortana, Apple’s Siri, and Google Now are all voice-search enabled and growing smarter with every interaction. Staying with voice search for a moment, it is predicted that voice search will make up 50% of all search activity by the year 2020. Because voice search is naturally more conversational and uses natural language, AI is the perfect tool with which to better understand consumer intent. The longer query strings associated with voice search (as compared to text) provide richer user data. In addition, AI will be able to understand the different nuances of conversational tone, allowing us to identify different intent signals and more accurately assess where the consumer is in the customer journey.
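To make the idea of intent signals concrete, here is a deliberately simplistic sketch of how longer, conversational voice queries carry cues that short text queries lack. The phrase lists, labels, and the `intent_signal` function are all illustrative assumptions, not any real platform's classifier; real systems would use learned language models rather than keyword matching.

```python
# Hypothetical cue phrases mapping conversational wording to a coarse
# funnel stage. These lists are invented for illustration only.
PURCHASE_CUES = ("where can i buy", "best price", "deals on", "order")
RESEARCH_CUES = ("what is", "how does", "difference between", "reviews of")

def intent_signal(query: str) -> str:
    """Classify a query into a coarse customer-journey stage from phrase cues."""
    q = query.lower()
    if any(cue in q for cue in PURCHASE_CUES):
        return "ready-to-buy"
    if any(cue in q for cue in RESEARCH_CUES):
        return "researching"
    return "unknown"

print(intent_signal("Where can I buy waterproof hiking boots near me"))  # ready-to-buy
print(intent_signal("What is the difference between trail and road shoes"))  # researching
```

Note how the conversational phrasing, not just the keywords, does the work: a terse text query like “hiking boots” yields no signal, while the spoken version reveals where the consumer is in the journey.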

And those are just the tip-of-the-iceberg possibilities of one aspect of AI. It really is a brave new world filled with exciting new opportunities for consumers and advertisers alike.