November 10, 2016
Facebook is pushing to advance networking technology as it grapples with ever-growing volumes of video and works to drive down hardware prices. To that end, the company has created Voyager, a high-speed, long-distance networking system, and will share its plans with other companies, in keeping with its commitment to open source. Voyager will enable data centers in different locations to link via fiber-optic cables. The company also unveiled Backpack, its 100G switch platform for connecting racks inside the data center.
The Wall Street Journal notes that, “Voyager was developed as part of the Telecom Infra Project, a technology-sharing effort announced by Facebook in February.”
Voyager, which includes chips from Acacia Communications and software from startup SnapRoute, is “the first use of optical networking in a white-box product.” That refers to “generic equipment made by lesser-known hardware makers … that typically offer lower prices than big-name vendors.”
“At Facebook, we believe that a key to efficiency is enabling open and unbundled solutions,” said Facebook head of engineering and infrastructure Jay Parikh.
Dell’Oro Group analyst Jimmy Hu says Voyager’s four ports, each of which sends data over fiber networks at 200 gigabits per second, are “in line” with products from other long-distance networking providers, including Huawei Technologies, Ciena Corp., and ZTE Corp. He also predicts the market will grow “nearly 10 percent this year to $11.3 billion.” German company Adva Optical Networking is the first partner to agree to build and sell Voyager.
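Taken at face value, the port figures quoted above imply a simple aggregate number for each Voyager unit. The short sketch below is only back-of-the-envelope arithmetic from those reported specs, not a figure stated by Facebook:

```python
# Rough aggregate line-side capacity per Voyager box,
# using the figures reported above: four ports at 200 Gbps each.
PORTS = 4
GBPS_PER_PORT = 200

total_gbps = PORTS * GBPS_PER_PORT
print(f"{total_gbps} Gbps total long-haul capacity per unit")  # 800 Gbps
```

That 800 Gbps total is per box; real deployed throughput would depend on fiber conditions and the modulation each link can sustain.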
TechCrunch reports that Facebook’s Backpack is the company’s latest move to upgrade its fiber-optic networking from 40G to 100G. Facebook director of software engineering for networking Omar Baldonado says faster networking technology is “mostly driven by the need to be able to support more live and recorded video, as well as 360 photos and video.” LinkedIn is also working on 100G for its data center.
One of the challenges of moving to 100G, says Baldonado, is that “these new devices are significantly more power hungry and harder to cool.”
“We want to play at those high speeds but we need to do it in a way that works across all of our data centers,” he said. “We’ve been working with the whole industry ecosystem — server vendors, NIC manufacturers, fiber manufacturers — to get this to work at our scale.”
Facebook, which is slowly rolling out Backpack to its data centers, plans to contribute the new switch’s design to the Open Compute Project.