SIGGRAPH: Disney Uses 3D Printing to Create Expressive Eyes

Disney Research is using 3D printing to create components to build expressive eyes for robots. However, rather than mimic human eyes, which can appear strange to some, the robot eyes have a cartoonish look. The technology may have future applications for interactive toys, video game characters and possibly even human prosthetic eyes. The research team demonstrated the technology at the ACM SIGGRAPH Conference in Anaheim last week.

SIGGRAPH: Nvidia Demonstrates Next-Gen Mobile Processor

Nvidia offered a sneak peek at its next-generation mobile processor, Project Logan, during the SIGGRAPH Conference in Anaheim this week. Logan is based on Nvidia’s advanced Kepler graphics architecture used for desktop and laptop chips. Nvidia plans to launch Logan next year, combining its mobile phone and desktop 3D graphics architectures. This will enable new mobile applications such as augmented reality, computer vision and speech recognition.

SIGGRAPH: Canon Previews Handheld Mixed Reality Technology

Canon has been demonstrating a handheld version of its MREAL Mixed Reality technology at SIGGRAPH this week. According to Canon, the technology merges virtual objects with the real world, at full scale in three dimensions. The company launched its MREAL Mixed Reality headset earlier this year. The handheld version functions similarly to the headset, by enabling the use of markers or sensors to render images in real space.

SIGGRAPH: Disney Creates Air-Based Touch Feedback System

Disney Research has created a haptic, or touch feedback, system that uses bursts of air. The AIREAL system simulates touch, or tactile sensation, in three-dimensional empty space. The haptic feedback can be applied to countless applications and situations, and may offer new ways for users to interact with their devices. The concept is to make touchless experiences, such as motion and gesture control, physical interactions, ultimately giving the user a more natural, touch-like perception.

Lucasfilm and Sony Pictures Imageworks Release CG Interchange Format

  • Lucasfilm and Sony Pictures Imageworks unveiled Alembic, a computer graphics interchange format, this week at the SIGGRAPH Conference in Vancouver. Version 1.0 is now available for download.
  • According to Carolyn Giardina of The Hollywood Reporter, Alembic is “an open source system aimed at helping VFX companies easily store and share complex animated scenes across facilities, regardless of what software is being used.”
  • Sony Pictures Imageworks reports Alembic enables its artists to work 48 percent faster while using significantly less disk space.
  • At SIGGRAPH, leading software suppliers including Autodesk, Luxology, The Foundry and Side Effects are showing Alembic support for their top products.

Disney Demonstrates Surround Haptics System for Gaming and More

  • Disney Research has developed a new technology that leverages phantom sensations and other tactile illusions to provide a wide range of physical sensations for gamers and film-goers via chairs outfitted with vibrating actuators. The technology is being demonstrated this week at SIGGRAPH in Vancouver.
  • Disney says its Surround Haptics system makes it possible for video game players and film viewers to “feel the smoothness of a finger being drawn against skin, for example, or the jolt of a collision.”
  • The system could potentially have a wide range of applications in movies, music and games, even communication systems for the blind.
  • “Although we have only implemented Surround Haptics with a gaming chair to date, the technology can be easily embedded into clothing, gloves, sports equipment and mobile computing devices,” says senior research scientist Ivan Poupyrev. “This technology has the capability of enhancing the perception of flying or falling, of shrinking or growing, of feeling bugs creeping on your skin. The possibilities are endless.”

Microsoft Develops Accurate Technique for 3D Facial Modeling

  • Researchers from Microsoft’s Beijing lab have developed a technique to automatically model human faces in 3D with a new level of accuracy.
  • According to the research, the new approach can acquire “high-fidelity 3D facial performances with realistic dynamic wrinkles and fine-scale facial details.”
  • The technique combines 3D scanning technology with a motion-capture system, in addition to what Geekwire describes as “a technique they developed to determine the minimal number of face scans needed to create an accurate model, which makes the system faster and more efficient.”
  • The research paper will be presented this week at the SIGGRAPH Conference in Vancouver.
  • ETCentric staffer Phil Lelyveld comments: “This more-accurate facial modeling, tied to game engine character behavior generation, will make for some very interesting experiences. There was a recent story on an emotional (versus Q&A response) Turing test. Would you hurt a visually realistic avatar?”