Artificial Intelligence at CES 2018: Expect More of the Same

If measured in press impressions, 2017 has most definitely been the “Year of AI.” But looking past the hype, a few things are clear: 1) progress in actual machine intelligence capability has been slow and fragmented; 2) applied AI is still the domain of fewer than 20 companies; and 3) still, machine learning (not AI) is being deployed across enterprise domains in numerous business sectors and creating big value. And since it will take another year or two for current advances in machine learning to trickle down to the consumer sector, we’re not really expecting much of a breakthrough in AI, or even machine learning, at CES 2018.

Specifically, when looking at the current state of artificial intelligence:

1. Progress in actual machine intelligence capability has been slow, fragmented, almost 100 percent confined to one narrow domain (deep learning), and has remained almost exclusively in the lab.

2. Applied AI is still the domain of fewer than 20 companies (mostly within the walls of Google, Microsoft, Amazon and Facebook … which have the most data to conduct large-scale deep learning-type work).

3. Still, machine learning (not AI) is being deployed across an increasing number of enterprise domains in areas such as health, financial services, media, aerospace and defense … and creating big-time — yet massively unsexy — value.


At CES next month, it’s more likely that we’ll see a broader and more ubiquitous presence of some very basic forms of machine intelligence throughout the consumer electronics landscape, from toys to assistants and of course “ambient experiences.”

So … more of the same “kind of cool, but mostly useless” stuff, especially in the CES startup jungle, where early-stage investments have been flowing at high volume … and often in total blindness.

That said, we’ll be taking a close look at the following:

  • Further innovation in deep learning hardware, especially from Nvidia.
  • Deeper integration of learning and personalization into devices to create seamless and organic consumer experiences, especially in retail.
  • Some early-stage AI- and machine learning-enabled “smart cities” and related IoT ecosystems as potential foundations for “ambient experiences” (those bear a lot of promise, and will particularly capture our attention).
  • Alexa, Alexa, Alexa.
  • Some progress in self-driving car performance and user experience.
  • Some forward motion in applied machine learning for gaming and toys.

But just like last year, we’re also hoping to see early signs of any foundational capacity to semantically and intelligently merge diverse datasets and ontologies (for example, the firehose of multiple IoT devices) into a central application capable of representing knowledge from atomic-level data.

… Or as we like to call it: machine understanding. There will be no “Year of AI,” in consumer electronics or anywhere else, before that happens.
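To make that concrete with a toy sketch (in Python, with device schemas invented purely for illustration): two hypothetical smart-home devices report the same room temperature under different field names and units, and a thin adapter layer normalizes both into shared (entity, property, value) triples, the kind of atomic-level representation a “machine understanding” layer would have to build on.

    # Toy sketch of semantic merging: two hypothetical IoT payloads,
    # one shared vocabulary. All schemas here are invented for illustration.

    SHARED_PROPERTY = "temperature_celsius"

    def from_thermostat(payload):
        # Hypothetical device A reports {"tempF": 72.5, "room": "kitchen"}
        return ("room:" + payload["room"], SHARED_PROPERTY,
                (payload["tempF"] - 32) * 5 / 9)  # Fahrenheit -> Celsius

    def from_sensor_hub(payload):
        # Hypothetical device B reports {"t_c": 22.1, "zone": "kitchen"}
        return ("room:" + payload["zone"], SHARED_PROPERTY, payload["t_c"])

    # Both readings land in one shared representation, queryable as a whole.
    triples = [
        from_thermostat({"tempF": 72.5, "room": "kitchen"}),
        from_sensor_hub({"t_c": 22.1, "zone": "kitchen"}),
    ]
    for entity, prop, value in triples:
        print(f"{entity}  {prop} = {value:.1f}")

The hard part, of course, is doing this automatically and at scale across thousands of device vocabularies, rather than hand-writing an adapter for each one.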

NOTE: We’re referring to “AI” here in the literal sense: applications that both learn from input data AND act with autonomous agency in any known (or unknown) computable domain (for example, self-driving cars, or AlphaGo/AlphaZero). Applications that just learn from input data (in a supervised or unsupervised manner) are considered machine learning. They are still valuable, and often hard to build.


For more information on CES 2018 (#CES2018), visit the event’s official website or its Facebook page. Our audience should be particularly interested in C Space at CES, which examines “disruptive trends and how they are going to change the future of brand marketing and entertainment.” For those interested in attending CES January 9-12, CTA is offering the ETCentric community free Exhibits Plus passes. Use the discount code ETC2018 when registering (offer expires 12/22).

And be sure to check back with us in the new year; the ETCentric team will have the latest in new products and trends with live reporting from Las Vegas.
