Of Bricks and Bytes: How data strategy is changing real estate | Propmodo

Data Centers Are Energy-Guzzlers, but That May Not Be the Case for Long

As the world’s digital infrastructure continues to expand, data centers can’t be built fast enough. The pandemic-induced shift to remote work, online education, video streaming, and e-commerce has driven both demand for data centers and their yields relative to more traditional real estate classes. Even with persistent supply chain backlogs and a looming recession, the data center market is expected to grow at a compound annual growth rate of 9.6 percent between now and 2030, ultimately reaching an overall value of $418 billion. While that’s great news for anyone who depends on a fast internet connection, it’s not so stellar for the environment. 

Data centers are notorious energy gobblers, and their meteoric rise as an asset class stands in diametric opposition to the many decarbonization efforts being laid out by both corporations and policymakers. Pressure is mounting for the real estate industry to move the needle on climate progress, and the need to tamp down energy consumption in data centers is no exception. In recent years, some strategies have emerged to mitigate the whale-sized bite data centers take out of the global energy supply, but the complex energy ecosystem that data centers require to function properly poses a huge sustainability challenge.


Data centers consume a lot of electricity because of the high density of power-hungry hardware inside them, from servers to storage devices to networking equipment. Thanks to the laws of thermodynamics, every joule of energy passed through computer equipment eventually turns into heat, as anyone who has ever put a laptop on their lap can attest. A laptop may, at most, get uncomfortably warm when in use, but a hallway full of servers can emit enough heat to cause significant engineering challenges. Most data centers pipe the excess heat into the air, contributing to the heat island effect that makes the surrounding neighborhood even hotter.

Keeping data centers cool enough not to melt the equipment (or give the people working inside heatstroke) requires serious cooling power. Cooling systems can take many forms, including air conditioning units, fans, and liquid cooling loops, and they demand even more energy on top of what the IT equipment itself consumes. In smaller data centers, cooling can account for up to 50 percent of total electricity use. But if so much energy is spent keeping data centers cool, what if the excess heat could be put to work warming other parts of the building?
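The cooling overhead described above is commonly expressed as Power Usage Effectiveness (PUE): the ratio of total facility energy to the energy consumed by the IT equipment alone. As a rough sketch (the function and figures below are illustrative, not from the article), the "up to 50 percent" case works out to a PUE of about 2:

```python
def pue(it_kwh: float, cooling_kwh: float, other_kwh: float = 0.0) -> float:
    """Power Usage Effectiveness: total facility energy / IT energy."""
    return (it_kwh + cooling_kwh + other_kwh) / it_kwh

# If cooling consumes as much electricity as the IT load itself
# (i.e., half of the facility's total draw goes to cooling):
print(pue(it_kwh=1000.0, cooling_kwh=1000.0))  # 2.0
```

A PUE of 1.0 would mean every watt goes to computing; real facilities always sit somewhere above that.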

Mitigating heat generation is a major obstacle to meeting urgent net-zero targets, so some organizations are getting creative with the waste heat data centers emit, using it to offset energy use elsewhere. This technique not only lowers expenses and carbon emissions but also creates a symbiotic energy ecosystem in which the data center is kept cool while other parts of the building are kept warm. One of the oldest examples can be found at Syracuse University, where excess heat from the university’s data center has provided free heat to a nearby office building since 2010. At the beginning of this year, another university began to be warmed by nearby servers: Technological University Dublin, along with the adjoining South Dublin County Council building, was among the first users of a new district heating system fueled by a neighboring Amazon data center, which went into service in mid-December. 
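Because nearly all of a data center's electrical draw ends up as heat, the potential heating offset can be estimated directly from the IT load. The figures below (load, operating hours, and recovery efficiency) are illustrative assumptions, not numbers from the article:

```python
IT_LOAD_KW = 500.0         # assumed average IT power draw of a small data center
HOURS_PER_YEAR = 8760      # 24 hours * 365 days
RECOVERY_EFFICIENCY = 0.7  # assumed fraction of waste heat a recovery loop captures

# Essentially every kWh drawn by the servers is dissipated as heat.
waste_heat_kwh = IT_LOAD_KW * HOURS_PER_YEAR
recovered_kwh = waste_heat_kwh * RECOVERY_EFFICIENCY

print(f"Recoverable heat: {recovered_kwh:,.0f} kWh/year")
# Recoverable heat: 3,066,000 kWh/year
```

Even under these modest assumptions, a single small facility yields millions of kilowatt-hours of recoverable heat per year, which is why district heating operators find data centers attractive.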

Using data centers to warm buildings isn’t a far-fetched idea: entire cities in Norway and Sweden have turned to data centers to keep their residents warm, showing that recycled heat from data centers can be used sustainably at a large scale. It’s an inspiring demonstration for the rest of the world of how real estate can deliver an environmental win-win, but it also showcases the difficulty of implementing such a system in a place that lacks the necessary infrastructure. 

Lyseparken, the Norwegian town designed to receive all of its heat from a central data center, is being built entirely from scratch, and though it has been in development for several years, construction may take a long time to finish. In Stockholm, by contrast, the infrastructure for heat recycling was already in place: the city’s district heating system was first built nearly seven decades ago. That doesn’t mean cities can’t restructure themselves to let piping-hot data centers warm their inhabitants, but doing so is likely to be an expensive and cumbersome ordeal. Still, like other green initiatives that aim to transform buildings from energy hogs into high performers, these ambitious efforts take time, and they ultimately pay off.

Gone fishing

There’s another way data centers can be used sustainably without overhauling a city’s infrastructure, and it’s making a splash. To keep data centers cool without extracting extra energy, Microsoft decided to throw one of theirs into the ocean. 

Microsoft has been experimenting with underwater data centers since a staffer suggested the idea in 2014. By 2020, the tech juggernaut had invested $25 million in the experimental initiative, called Project Natick. The goal was to offer coastal towns lightning-fast cloud services while keeping energy consumption low. To achieve this, Microsoft submerged data centers near coastal towns: data wouldn’t have to travel as far as it would to an inland facility, resulting in a faster, lower-latency connection. Given that more than half of the world’s population lives within 120 miles of a coast, the strategy had a lot going for it.

Microsoft posited that putting a data center in a sealed container on the ocean floor would boost its performance and reliability while lowering its energy consumption. The surrounding seawater absorbs the heat from the data center, allowing the processors to maintain a constant temperature without drawing much power for cooling equipment. Hardware failures also decrease, since keeping the entire server environment at a constant temperature prevents the uneven expansion and contraction caused by temperature swings.

Microsoft found that its underwater data center was significantly more dependable than an identical data center maintained on land as the experiment’s control. It used far less electricity and had a much smaller environmental footprint. It was also much quicker to deploy, because Microsoft didn’t have to buy land or building space or construct everything a conventional data center requires. 

Project Natick’s underwater data center was pulled from the seafloor at the conclusion of the experiment in the summer of 2020, but since then Microsoft has been aiming to build a second undersea data center in the North Sea. Microsoft’s continued interest in underwater data centers not only affirms that Project Natick was a success but also offers a glimpse into how data centers could evolve into a more provocative asset class. 

Data centers are certainly having a moment in the investor spotlight, but traditional data centers are hardly energy-efficient, which is costly for both the environment and operations. Ambitious new strategies to curb data centers’ energy appetite are already underway. Soon, data centers may generate positive returns on both energy and investment.
