The Best New Products Displayed at Augmented World Expo

Several demos stood out at the 9th annual Augmented World Expo in Santa Clara, California, last week. The most compelling involved a holographic display from Brooklyn-based Looking Glass Factory, where co-founder and CEO Shawn Frayne and his team have spent the past few years developing a technique that “blends the best of volumetric rendering and light field projection.” Also compelling was a markerless, multi-person tracking system that runs off a single video feed, developed by wrnch, a Canadian computer vision and deep learning company. And marking its first U.S. exhibit since opening its latest satellite office in San Francisco this April, Japanese company Miraisens demonstrated how a suite of haptic effects can enhance extended reality experiences.

The latest deliverable from Looking Glass is a clear Lucite block, roughly three inches thick, that appears to have an interactive 3D object floating inside it in front of a black rear panel. The company is readying an 8.6-inch and a 15.9-inch diagonal screen for sale at proposed price points of $600 and $3,000, respectively.
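Looking Glass describes its technique only as a blend of volumetric rendering and light field projection, so the specifics are proprietary. In general, though, light field displays work by presenting many renders of the same scene from slightly different viewpoints spread across a horizontal viewing cone, so each eye position in front of the panel sees a slightly different image. The sketch below illustrates that multi-view idea with a generic camera sweep; the view count, cone angle, and camera distance are illustrative assumptions, not product specifications.

```python
# A minimal sketch of the multi-view idea behind light field displays in
# general (not Looking Glass Factory's actual pipeline, which is proprietary):
# render the scene from many closely spaced viewpoints swept across a
# horizontal viewing cone. All constants below are illustrative assumptions.
import numpy as np

NUM_VIEWS = 45          # assumed number of views across the viewing cone
CONE_DEG = 40.0         # assumed total horizontal viewing cone, in degrees
CAMERA_DISTANCE = 0.5   # assumed distance from the scene center, in meters


def look_at(eye, target, up=np.array([0.0, 1.0, 0.0])):
    """Build a right-handed view matrix for a camera at `eye` looking at `target`."""
    forward = target - eye
    forward /= np.linalg.norm(forward)
    right = np.cross(forward, up)
    right /= np.linalg.norm(right)
    true_up = np.cross(right, forward)
    view = np.eye(4)
    view[0, :3], view[1, :3], view[2, :3] = right, true_up, -forward
    view[:3, 3] = -view[:3, :3] @ eye
    return view


def view_matrices():
    """Return one view matrix per rendered view, swept across the cone."""
    target = np.zeros(3)
    matrices = []
    for i in range(NUM_VIEWS):
        # Angle for this view, from -CONE_DEG/2 to +CONE_DEG/2.
        angle = np.radians((i / (NUM_VIEWS - 1) - 0.5) * CONE_DEG)
        eye = np.array([np.sin(angle), 0.0, np.cos(angle)]) * CAMERA_DISTANCE
        matrices.append(look_at(eye, target))
    return matrices


if __name__ == "__main__":
    views = view_matrices()
    print("Prepared %d view matrices for a %.0f-degree cone" % (len(views), CONE_DEG))
```

How those views are then interleaved optically behind the panel is where each vendor's secret sauce lives, and Looking Glass has not detailed its own approach publicly.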

[Image: AWE 2018 banner]

The demo I saw used a Leap Motion hand tracker that let me play with a very realistic-looking virtual frog. The control was intuitive, and the frog’s response was lifelike and entertaining. The displays can be tiled with spacers, like a window with multiple panes.
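For readers curious how a hand tracker drives an interaction like this, here is a rough sketch of the polling loop a developer might write against the legacy Leap Motion Python SDK: read hand frames, then map palm position and pinch strength to a content callback. The SDK attribute names are recalled from memory and may differ by version, and the frog callback is a hypothetical stand-in for whatever the display’s content runtime actually exposes.

```python
# Rough sketch: poll a Leap Motion controller and treat a strong pinch near
# the display as a "poke" at the virtual frog. Written against the legacy
# Leap Motion Python SDK (v2-era); names recalled from memory, may differ
# by SDK version. poke_frog() is a hypothetical content callback.
import time

import Leap  # legacy Leap Motion SDK module (assumed installed)


def poke_frog(x, y, z):
    """Hypothetical content callback: nudge the virtual frog toward a point."""
    print("poke frog at (%.1f, %.1f, %.1f) mm" % (x, y, z))


def main():
    controller = Leap.Controller()
    while True:
        frame = controller.frame()       # latest tracking frame
        for hand in frame.hands:
            palm = hand.palm_position    # millimeters, relative to the device
            # Treat a strong pinch as a "touch" gesture aimed at the frog.
            if hand.pinch_strength > 0.8:
                poke_frog(palm.x, palm.y, palm.z)
        time.sleep(1.0 / 60)             # poll at roughly the display rate


if __name__ == "__main__":
    main()
```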

[Image: Looking Glass volumetric display]

Frayne has also developed a black-box interactive holographic display. You can reach into it and interact with floating objects that are visible, with parallax, across a wide arc around the unit. If there is interest, we will ask for demos of both technologies at an ETC event.

During his stage presentation, wrnch founder and CEO Paul Kruszewski conducted a live demo: a single webcam captured a wide swath of the audience, and the feed on the main screen overlaid stick figures on many of the people in view, tracking their movements in real time. Attendees close to the camera also had some of their facial features tracked.

[Image: Human tracking with wrnchAI]

Kruszewski highlighted three key features of wrnchAI:

  • Accurate — Proprietary deep learning networks and a synthetic 3D data pipeline.
  • Robust — A biomechanical 3D human model with body, hand, and face tracking.
  • Fast — Optimized for real-time performance on Nvidia and mobile AI processors, with Unity integration for rapid development.

The wrnchAI technology is available now.
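wrnch did not share SDK code at the session, so as a rough stand-in for the idea of overlaying a tracked skeleton on a live webcam feed, the sketch below uses the open-source MediaPipe Pose library with OpenCV. Note the gap between the stand-in and the demo: MediaPipe Pose tracks one person per frame, while wrnchAI tracked many audience members at once, along with hands and some facial features.

```python
# Stand-in illustration of a live skeleton overlay on a single webcam feed,
# using open-source MediaPipe Pose and OpenCV rather than wrnchAI's SDK.
# MediaPipe Pose handles one person per frame; the wrnch demo tracked many.
import cv2
import mediapipe as mp

mp_pose = mp.solutions.pose
mp_drawing = mp.solutions.drawing_utils


def main():
    cap = cv2.VideoCapture(0)  # a single webcam, as in the stage demo
    with mp_pose.Pose() as pose:
        while cap.isOpened():
            ok, frame = cap.read()
            if not ok:
                break
            # MediaPipe expects RGB; OpenCV captures BGR.
            results = pose.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
            if results.pose_landmarks:
                # Draw the stick-figure overlay on top of the camera image.
                mp_drawing.draw_landmarks(
                    frame, results.pose_landmarks, mp_pose.POSE_CONNECTIONS)
            cv2.imshow("skeleton overlay", frame)
            if cv2.waitKey(1) & 0xFF == 27:  # press Esc to quit
                break
    cap.release()
    cv2.destroyAllWindows()


if __name__ == "__main__":
    main()
```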

The haptic feedback demo by Miraisens beautifully demonstrated how a handheld pod the size of a lipstick case could steer your hand left, right, forward, or backward; give you the sense of pulling or pushing on a spring; and convey textures as a cursor slid over images of different surface types.

[Image: Miraisens 3D haptics]

Miraisens is looking to license the software drivers for the haptics or contract out its software development services. I have experienced demos like this before from other companies, but Miraisens did an exceptionally good job of demonstrating how the suite of effects could be used to enhance the illusion of XR.
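Miraisens has not published its driver API, so the sketch below is purely hypothetical: it imagines how a licensee might parameterize the effects described above as commands to the pod’s driver, covering directional pulls, a virtual spring, and a texture keyed to the surface under the cursor. Every class, method, and field name here is invented for illustration.

```python
# Hypothetical sketch of how haptic effects like those in the Miraisens demo
# might be parameterized by a licensee. Nothing here reflects the actual
# Miraisens driver API; all names are invented for illustration.
from dataclasses import dataclass


@dataclass
class HapticEffect:
    """One command sent to the handheld pod's (hypothetical) driver."""
    kind: str                             # "pull", "spring", or "texture"
    direction: tuple = (0.0, 0.0, 0.0)    # unit vector for "pull"
    stiffness: float = 0.0                # spring constant for "spring"
    texture_id: str = ""                  # lookup key for "texture"
    intensity: float = 1.0                # 0.0 to 1.0 scaling


def effect_for_cursor(surface_under_cursor: str) -> HapticEffect:
    """Pick a texture effect based on the image region under the cursor."""
    return HapticEffect(kind="texture",
                        texture_id=surface_under_cursor,
                        intensity=0.7)


if __name__ == "__main__":
    # Steer the hand to the left, then simulate sliding over a rough surface.
    queue = [
        HapticEffect(kind="pull", direction=(-1.0, 0.0, 0.0), intensity=0.5),
        effect_for_cursor("gravel"),
    ]
    for effect in queue:
        print(effect)   # a real driver would stream these to the pod
```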

For more from AWE 2018, check out this year’s Auggie Awards winners.
