Earlier this week I wrote about the Linley Data Center Conference and how Thermal is the New Power. With perfect timing, also earlier this week, Microsoft announced the existence of Project Natick, an effort to submerge a datacenter — a sort of datacenter in a can. The motivation is that the surrounding water makes cooling the datacenter straightforward. Also, 44% of the world's population lives within 100km of a coast, so there is a sense that the sea is where the people are. Not in it, of course, but very close.

The challenge with this approach is that there is no access to the datacenter, so it has to cope with any failures by degrading gracefully. The plan is for a facility to last 20 years, with access only every 5 years to upgrade all the servers and routers. The initial experimental prototype vessel, christened the Leona Philpot after a Halo game character, was operated on the seafloor approximately one kilometer off the Pacific coast of the US (I'm guessing near Seattle) from August to November of 2015. Although that trial deployment depended on onshore power, future ideas are to use wave energy to power the datacenter, making it self-sustaining. However, it seems to me that the amount of power needed by a datacenter is a lot larger than is easily harvested from the immediate environment. Microsoft Research has produced a video about the program.

This reminded me of a couple of other datacenters that I have read about. Google has one at The Dalles in Oregon, beside the Columbia River. The original datacenter there was built a decade ago in 2006. On their web page, Google coyly says that the location was picked for "the right combination of energy infrastructure, developable land, and available workforce". In fact, it is right next to a hydroelectric power station at the Bonneville dam that powered The Dalles aluminum smelter before it closed.
Aluminum smelting is an electric process that requires a lot of power, so smelters have often been located near hydroelectric facilities. And trivia fact of the day: the Bonneville Power Administration's grid was managed by a PDP-10 computer, and the software was written by a couple of guys you have probably heard of, Bill Gates and Paul Allen. So the combination of cheap power and one of the largest rivers in the world for cooling water made the site attractive.

Another way to cool is to go somewhere cold. Google has another datacenter in Hamina in Finland. This one has nothing to do with aluminum; it was built in an old paper plant. It initially opened in 2011 and is being expanded. Hamina is on the coast, and the datacenter is cooled with (cold) sea water. Plus, the paper plant consumed enough power that it had its own electricity substation, which Google could take over.

There is clearly something symbolic about a datacenter taking over from an aluminum smelter, a sort of post-industrial transition. Even more symbolic is Google taking over a paper plant at the same time as it makes newsprint increasingly irrelevant.

But all three datacenters show the importance of cooling. All that power ends up as heat dissipated by the semiconductor chips in the servers and routers. In battery-powered devices like smartphones, the requirement for low power is obvious. But it matters at datacenter scale too, where servers and networking equipment demand power and cooling on the order of dedicated electricity substations and seemingly unlimited amounts of cold water.