Facebook revealed today that it tried using potatoes in its servers to make them more environmentally friendly.
Under the Open Compute Project (OCP), Facebook is on a mission to improve the efficiency of the servers, storage devices and data centres that power its social networking platform. Any breakthroughs the company makes are shared with the rest of the OCP community, so that other members can improve their own facilities and reduce the overall environmental impact of IT.
“We disposed of anything that wasn’t useful,” said Facebook's VP of hardware design and supply chain operations Frank Frankovsky at a briefing in London.
This included removing the top lids from its servers, but that disrupted the airflow over the central processing unit (CPU), causing the chips to overheat.
The company tried to use a plastic cover to redirect the airflow over the CPU but this didn't sit well with Frankovsky. “That was really frustrating for me because we eliminated all this material and then we put a plastic lid on the thing,” he said. “That’s just more material in the waste stream.”
Facebook then considered whether the lid could be made from a more eco-friendly material. Frankovsky confessed that he even tried the material used to make Spudware kitchen utensils, which is composed of 80 percent starch and 20 percent soy oil.
“We created a thermal lid out of that starchy material but we found out pretty quickly that when you heat that up it smells a lot like French Fries,” said Frankovsky.
Besides making data centre workers rather peckish, the Spudware material also went floppy and gloopy, he continued.
Frankovsky went on to argue that other data centre operators should try to “push the envelope a little bit harder” in order to drive innovation and improve efficiency.
Indeed, Facebook has recorded a power usage effectiveness (PUE) rating of 1.07, far better than the industry’s “gold standard” of 1.5.
PUE measures how much electricity a facility draws from the grid relative to how much actually reaches its servers; the closer the figure is to 1.0, the less energy is lost to overheads such as cooling.
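The ratio above can be sketched in a few lines of Python. The figures below are hypothetical round numbers chosen to illustrate a 1.07 rating, not Facebook's actual measurements.

```python
def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Power usage effectiveness: total power drawn from the grid
    divided by power delivered to the IT equipment.
    A value of 1.0 is the theoretical ideal (zero overhead)."""
    return total_facility_kw / it_equipment_kw

# Hypothetical example: a facility drawing 1,070 kW from the grid
# to deliver 1,000 kW to its servers has a PUE of 1.07.
print(round(pue(1070, 1000), 2))
```

On these assumed figures, only 70 kW of the grid draw is spent on cooling and other overheads, versus 500 kW at the 1.5 “gold standard”.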
Facebook is able to achieve such efficiencies partly because it runs its data centres without air conditioning, instead relying on the outside air to cool its servers and prevent them from overheating. This means that it doesn’t have to use up electricity for cooling purposes.
The result is a facility that is 38 percent more efficient and 24 percent less expensive than predecessor data centres, said Frankovsky.
Sceptics argue that Facebook is only able to create super-efficient data centres because it is building them from scratch, whereas ordinary businesses are committed to their existing data centres for at least 20 years.
However, Frankovsky pointed out that Facebook was still able to achieve efficiency savings of 30 percent when it was leasing space in someone else’s data centre.
“We started turning off computer room air conditioning. We started separating hot aisle from cold aisle,” he said. “These were things our co-location landlords hadn’t done.”
There are very few parts of the world so hot and humid that you can’t get the inlet temperatures to a point where the electronics would survive, according to Frankovsky.
“Even if [ditching] air conditioning is not a risk they’re willing to take, I think there’s a lot that can be done in electrical efficiency. Eliminating the uninterruptible power supply (UPS) would be one easy way. These cost $2 a watt, while the Open Compute battery racks we’ve open sourced cost $0.25. It’s also far more efficient because they’re not converting the AC to DC.”
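The cost gap Frankovsky describes can be made concrete with some simple arithmetic. The deployment size below is a hypothetical figure for illustration; only the two per-watt prices come from his quote.

```python
# Upfront cost per watt of backup-power capacity, per Frankovsky's figures.
UPS_COST_PER_WATT = 2.00       # conventional uninterruptible power supply
OCP_RACK_COST_PER_WATT = 0.25  # Open Compute battery rack

def backup_power_cost(capacity_watts: float, cost_per_watt: float) -> float:
    """Total upfront cost of provisioning backup power at this capacity."""
    return capacity_watts * cost_per_watt

capacity = 1_000_000  # hypothetical 1 MW deployment
ups_cost = backup_power_cost(capacity, UPS_COST_PER_WATT)
ocp_cost = backup_power_cost(capacity, OCP_RACK_COST_PER_WATT)
print(f"Savings: ${ups_cost - ocp_cost:,.0f}")
```

At those prices, the battery-rack approach would cost an eighth as much per watt as a conventional UPS.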
Facebook has even taken steps to reduce the carbon footprint of the lorries that transport equipment from Germany to its only European data centre, in Luleå, Sweden.
“We design the rack enclosure, as well as the pallets it transports on, to pack a truck 100 percent, so we don’t have any wasted space on the truck and we don’t have any wasted transportation costs,” he said.