April 19, 2016
Korean graduate students are developing an Environmental Modeling Scanner that takes an interesting approach to detailed scanning of complex, crowded environments. Their hardware and software capture volumetric architectural and décor detail and then, in post-processing, extract people and any other moving or unwanted elements from the model. Each scanning session produces a model of the environment, and models from multiple sessions can be combined. They are demonstrating their solution at the NAB Show’s VR Zone in the North Hall of the Las Vegas Convention Center.
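The article doesn't describe the team's actual algorithm, but one common way to drop moving elements from repeated scans of the same space is occupancy voting: points whose voxel is seen in most passes are kept as static structure, while points seen only briefly (a passing person) are discarded. The sketch below illustrates that idea; the function name, voxel size, and threshold are all illustrative assumptions, not the researchers' method.

```python
import numpy as np

def filter_transients(scans, voxel_size=0.1, min_fraction=0.6):
    """Keep only points whose voxel appears in >= min_fraction of scans.

    scans: list of (N_i, 3) float arrays, all in one common frame.
    Returns an (M, 3) array of points judged to be static structure.
    """
    counts = {}  # voxel index -> number of scans that observed it
    for scan in scans:
        voxels = {tuple(v) for v in np.floor(scan / voxel_size).astype(int)}
        for v in voxels:
            counts[v] = counts.get(v, 0) + 1

    threshold = min_fraction * len(scans)
    static = {v for v, c in counts.items() if c >= threshold}

    kept = []
    for scan in scans:
        idx = np.floor(scan / voxel_size).astype(int)
        mask = np.array([tuple(v) in static for v in idx])
        if mask.any():
            kept.append(scan[mask])
    return np.vstack(kept) if kept else np.empty((0, 3))
```

With repeated passes of the sensor, a wall point recurs in nearly every scan while a walking person occupies any given voxel only once, so the vote cleanly separates the two.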
To create the model, a person carries a rapidly rotating depth sensor through a building, capturing its surroundings in 360 degrees many times per second. The walker waves the device up and down so that everything from floor to ceiling is repeatedly scanned. The captured data can be blended with data from other capture sessions to create a full model of the space that can be wandered through freely.
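Blending capture sessions into one model amounts to expressing every session's points in a shared world frame and concatenating them. The sketch below assumes each session's pose (rotation and translation) is already known; in practice that alignment would come from a registration step such as ICP, which is omitted here. All names and poses are illustrative.

```python
import numpy as np

def to_world(points, R, t):
    """Map (N, 3) session-local points into the world frame: p' = R p + t."""
    return points @ R.T + t

def merge_sessions(sessions):
    """sessions: list of (points, R, t) triples. Returns one merged (M, 3) cloud."""
    return np.vstack([to_world(points, R, t) for points, R, t in sessions])
```

Once merged, the combined cloud covers areas that no single walk-through saw, which is what allows the final model to be explored freely.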
In the future, they plan to incorporate photographic image capture. Much further down the road, they aim to develop multi-location processing and real-time rendering. When perfected, the technology could be an inexpensive way to capture and model environments while they are occupied.
This is a research project led by Dr. Bum-Jae You at the Center of Human-Centered Interaction for Coexistence (CHIC) at the Korea Institute of Science and Technology in Seoul, Korea.