There’s something about the water that keeps attracting data center builders. We were brought in by a VC firm to vet a project all the way back in 2008 (the idea was sunk not because of the project’s merits but because the founders’ financial assumptions were way off).
Google got its patent on a floating one back in 2009 and keeps getting linked to projects that may or may not surface. Someone was reportedly building one last year. Now Microsoft is getting into the act.
The twist on this one is that they’re going to sink it, which presumably means you can’t swap out a faulty hard drive or cable. In exchange you get colder water (though these floating projects are usually sited in the SF Bay, which is plenty cold already), and maybe less catastrophic risk from big storms and the like.
It makes sense that Microsoft would do it – after all, they went in heavily on containers very early on, and this is essentially an undersea container. It’s nice that they’re thinking about CSR (or at least PR) by doing an environmental study, after being posterized in the infamous James Glanz data center pollution article series in 2012. That said, I’m not sure the tradeoffs are worth it.
Good data centers (Tier III or higher) are built to be concurrently maintainable for a reason. It’s not so they can be deployed in a shabby state and then patched later. Rather, it’s because they’re big, complex ecosystems where a lot of things can go wrong, no matter how prefabbed and modular they are. Without the ability to tweak and fix minor issues, those issues would grow into major ones. This one looks to not be maintainable at all, which is a tall order for any quality control program.