Lucasfilm and Sony Pictures Imageworks unveiled Alembic, a computer graphics interchange format, this week at the SIGGRAPH Conference in Vancouver. Version 1.0 is now available for download.
According to Carolyn Giardina of The Hollywood Reporter, Alembic is “an open source system aimed at helping VFX companies easily store and share complex animated scenes across facilities, regardless of what software is being used.”
Sony Pictures Imageworks reports that Alembic enables its artists to work 48 percent faster while using significantly less disk space.
At SIGGRAPH, leading software suppliers including Autodesk, Luxology, The Foundry and Side Effects are demonstrating Alembic support in their flagship products.
Disney Research has developed a new technology that leverages phantom sensations and other tactile illusions to provide a wide range of physical sensations for gamers and film-goers via chairs outfitted with vibrating actuators. The technology is being demonstrated this week at SIGGRAPH in Vancouver.
Disney says its Surround Haptics system makes it possible for video game players and film viewers to “feel the smoothness of a finger being drawn against skin, for example, or the jolt of a collision.”
The system could potentially have a wide range of applications in movies, music and games, even communication systems for the blind.
“Although we have only implemented Surround Haptics with a gaming chair to date, the technology can be easily embedded into clothing, gloves, sports equipment and mobile computing devices,” says senior research scientist Ivan Poupyrev. “This technology has the capability of enhancing the perception of flying or falling, of shrinking or growing, of feeling bugs creeping on your skin. The possibilities are endless.”
Researchers from Microsoft’s Beijing lab have developed a technique to automatically model human faces in 3D with a new level of accuracy.
According to the research, the new approach can acquire “high-fidelity 3D facial performances with realistic dynamic wrinkles and fine-scale facial details.”
The technique combines 3D scanning technology with a motion-capture system, in addition to what GeekWire describes as “a technique they developed to determine the minimal number of face scans needed to create an accurate model, which makes the system faster and more efficient.”
The research paper will be presented this week at the SIGGRAPH Conference in Vancouver.
ETCentric staffer Phil Lelyveld comments: “This more-accurate facial modeling, tied to game engine character behavior generation, will make for some very interesting experiences. There was a recent story on an emotional (versus Q&A response) Turing test. Would you hurt a visually realistic avatar?”