Alexa, Cortana and IBM Executives Envision the Future of AI

During a CES panel, moderator Avram Piltch, Tom’s Guide editorial director, asked three executives to depict what artificial intelligence will look like in 2023: Al Lindsay, Amazon vice president of Alexa Engine software; Cameron Clayton, IBM general manager of Watson’s Content and IoT platform and formerly chief executive of The Weather Channel; and Andrew Shuman, corporate vice president of Microsoft’s AI and Research Group. Clayton summed it up best. “It’s going to be ubiquitous,” he said. “All connected systems will have AI integrated into them somehow, someway.”

But he continued with a caveat. “We have to be careful about our own human frailties,” he said. “Laziness is mine. Do we become co-dependent on our phones, addicted to our phones? As we move forward, ubiquity and dependence is something to look at.”

Shuman agreed. “In core software you’ll expect much more intelligence,” he said. “In particular, what we’re most excited about is having the context of the physical world and the person himself.” Lindsay agreed: “We’ll see a lot of context.”


“It’s taken a lot of time to become a two-year-old,” Lindsay said. “In five years, will the two-year-old become a seven-year-old? We have a lot of work to do to get there. On the pure science side, it’s a lot of hand-held supervised training. We need to see great advances in unsupervised methods so you’ll also see that feedback in real time.” He also noted that “multi-modality is also in its infancy.”

Clayton revealed that IBM is building a cloud, from scratch, specifically for AI. “Most other clouds are adapted or trying to adapt to AI,” he said. “Whether it’s on-prem, in the public cloud or a hybrid solution — even mainframe computing — all will be handling huge workloads for AI.” Lindsay agreed: “Big Data needs big memory, big storage.”

“I don’t know if you’ll ever get to the level of accuracy that you want,” he said. “We’ll see how Moore’s Law works out.”

As to the uses of AI, Shuman noted what he sees today and how that informs the future. “Some very smart people are working incredibly hard to get you to click on things,” he said. “We all have to be thoughtful about how we design our experiences and how we consume them.”

How close are we to a sci-fi sentient AI, or what futurist Ray Kurzweil calls the Singularity? “We’ve pre-conceived what it would be like, since we gathered around campfires,” said Shuman. “There are so many steps along the way before we get to that end stage. To be able to teach AI instead of programming it to learn will be a milestone. There’s a lot to learn along that journey.”

Clayton said that IBM believes “AI systems should make recommendations and not decisions.” “By putting the AI system as a recommendation to a person, the movie version of AI gets stopped by people,” he said. “That’s a big deal for us.”

In the next five years, said Lindsay, we’ll see a tremendous number of advances in modalities, with the integration of gestures and context coming into play. “Agents will start to feel more natural and you’ll be able to accomplish more,” he said. Clayton had advice for people “worried or concerned” about the AI-powered future.

“Learn to code now,” he said. “If you know how to code, you can help change the outcome the way you want it to be. It’s that simple.”