Data centers powering the generative AI boom are gulping water and electricity at what some researchers view as an unsustainable pace. Two entrepreneurs who met in high school a few years ago want to overcome that crunch with a fresh experiment: sinking the cloud into the sea.
Sam Mendel and Eric Kim launched their company, NetworkOcean, out of startup accelerator Y Combinator on August 15 by announcing plans to dunk a small capsule filled with GPU servers into San Francisco Bay within a month. “There's this vital opportunity to build more efficient computer infrastructure that we're gonna rely on for decades to come,” Mendel says.
The founders contend that moving data centers off land would ultimately slow the rise of ocean temperatures: letting seawater cool the capsule’s shell, supplementing its internal cooling system, means the facility draws less power. They have also said a location in the bay would deliver fast processing speeds for the region’s buzzing AI economy.
But scientists who study the hundreds of square miles of brackish water say even the slightest heat or disturbance from NetworkOcean’s submersible could trigger toxic algae blooms and harm wildlife. And WIRED inquiries to several California and US agencies that oversee the bay found that NetworkOcean has been pursuing its initial test of an underwater data center without having sought, much less received, any permits from key regulators.
The outreach by WIRED prompted at least two agencies—the Bay Conservation and Development Commission and the San Francisco Regional Water Quality Control Board—to email NetworkOcean that testing without permits could run afoul of laws, according to public records and spokespeople for the agencies. Fines from the BCDC can run up to hundreds of thousands of dollars.
The nascent technology has already been in hot water in California. In 2016, the state’s coastal commission issued a previously unreported notice to Microsoft saying that the tech giant had violated the law the year before by plunging an unpermitted server vessel into San Luis Obispo Bay, about 250 miles south of San Francisco. The months-long test, part of what was known as Project Natick, had ended without apparent environmental harm by the time the agency learned of it, so officials decided not to fine Microsoft, according to the notice seen by WIRED.
The renewed scrutiny of underwater data centers has surfaced an increasingly common tension between innovative efforts to combat global climate change and long-standing environmental laws. Permitting takes months, if not years, and can cost millions of dollars, potentially impeding progress. Advocates of the laws argue that the process allows for time and input to better weigh trade-offs.
“Things are overregulated because people often don’t do the right thing,” says Thomas Mumley, recently retired assistant executive officer of the bay water board. “You give an inch, they take a mile. We have to be cautious.”
Over the last two weeks, including during an interview at the WIRED office, NetworkOcean’s founders have provided only driblets of detail about their evolving plans. Their current intention is to test their underwater vessel for about an hour, just below the surface of what Mendel would only describe as a privately owned and operated portion of the bay that he says is not subject to regulatory oversight. He insists that a permit is not required based on the location, design, and minimal impact. “We have been told by our potential testing site that our setup is environmentally benign,” Mendel says.
Mumley, the retired regulator, calls the assertion about not needing a permit “absurd.” Both Bella Castrodale, the BCDC’s lead enforcement attorney, and Keith Lichten, a water board division manager, say private sites and a quick dip in the bay aren’t exempt from permitting. Several other experts in bay rules tell WIRED that even if some quirk does preclude oversight, they believe NetworkOcean is sending a poor message to the public by not coordinating with regulators.
“Just because these centers would be out of sight does not mean they are not a major disturbance,” says Jon Rosenfield, science director at San Francisco Baykeeper, a nonprofit that investigates industrial polluters.
School Project
Mendel and Kim say they tried to develop an underwater renewable energy device together during high school in Southern California before moving on to non-nautical pursuits. Mendel, 23, dropped out of college in 2022 and founded a platform for social media influencers.
About a year ago, he built a small web server using the DIY system Raspberry Pi to host another personal project, and temporarily floated the equipment in San Francisco Bay by attaching it to a buoy from a private boat in the Sausalito area. (Mendel declined to answer questions about permits.) After talking with Kim, also 23, about this experiment, the two decided to move in together and start NetworkOcean.
Their pitch is that underwater data centers are more affordable to develop and maintain, especially as electricity shortages limit sites on land. Surrounding a tank of hot servers with water naturally helps cool them, avoiding the massive resource drain of air-conditioning and also improving on the similar benefits of floating data centers. Developers of offshore wind farms are eager to electrify NetworkOcean vessels, Mendel says.
NetworkOcean’s launch announcement pictured what it described as a 0.5-megawatt capsule “to be tested underwater in the SF Bay in 1 month.” The company promoted the availability of 2,048 H100s, the highly sought-after Nvidia GPUs that allow AI tools to churn out text, images, and videos.
Though generative AI has increased the potential need for underwater data centers, companies including Microsoft have pursued small projects for years. Microsoft’s tests off the coasts of California and Scotland between 2015 and 2020 led it to conclude that the idea was “logistically, environmentally, and economically practical.” But the company hasn’t moved beyond experimentation and doesn’t have data centers in the water today. It declined to comment on the California permitting violation. (The Scottish portion was permitted.)
The long-term reliability of underwater setups is questionable, says Tony Harvey, a senior director who analyzes data centers for the consultancy Gartner. While they may suffer less from the heat stress that contributes to faulty GPUs and optical components on land, underwater centers may not prevent breakdowns altogether, and he suggests that repairs will be more difficult in the sea.
Mendel disputes the concerns. He says NetworkOcean’s vessels will be compatible with common maritime equipment and be serviceable in under an hour, without turning the hardware off.
NetworkOcean isn’t alone in its ambitions. Founded in 2021, US-based Subsea Cloud operates about 13,500 computer servers in unspecified underwater locations in Southeast Asia to serve clients in AI and gaming, says the startup’s founder and CEO, Maxie Reynolds. “It’s a nascent market,” she says. “But it’s currently the only one that can handle the current and projected loads in a sustainable way.”
Subsea secured a permit for each site and uses remotely operated robots for maintenance, according to Reynolds. It plans to fire up its first underwater GPUs next year and is also considering private sites, which Reynolds says would ease permitting complexity. Subsea claims it isn’t significantly increasing water temperature, though it hasn’t published independent reviews.
NetworkOcean also believes it will cause negligible heating. “Our modeling shows a 2-degree Fahrenheit change over an 8-square-foot area, or a 0.004-degree Fahrenheit change over the surface of the body” of water, Mendel says. He draws confidence from Microsoft’s finding that water a few meters downstream from its testing warmed only slightly.
Protected Bay
Bay Area projects can increase water temperatures by no more than 4 degrees Fahrenheit at any time or place, according to Mumley, the ex-water board official. But two biologists who spoke to WIRED say any increase is concerning to them because it can incubate harmful algae and attract invasive species.
Shaolei Ren, a University of California, Riverside, associate professor of electrical and computer engineering who’s studying the environmental impact of AI, compares an underwater data center of NetworkOcean’s announced capacity, running fully utilized, to operating about 300 bedroom space heaters; a 0.5-megawatt load gives off roughly as much heat as 330 heaters drawing 1,500 watts apiece. (Mendel disputes the concern, citing Project Natick’s apparently minimal impact.) A few years ago, a project that proposed using San Francisco Bay water to cool a data center on land failed to win approval after members of the public raised concerns, including about water temperatures.
The San Francisco Bay is on average around a dozen feet deep, with salty Pacific Ocean water flowing in under the Golden Gate Bridge and mixing with fresh runoff from a huge swath of Northern California. Experts say it isn’t clear whether any location in the expanse, with its muddy, shallow, salty, and turbulent stretches, would be suitable for more than a tiny demonstration.
Further, securing permits could require proving to at least nine regulatory bodies and several critical nonprofits that a data center would be worthwhile, according to spokespeople for the agencies and five experts in the bay’s politics. For instance, under the law administered by the BCDC, a project’s public benefit must “clearly exceed” the detriment, and developers must show there’s no suitable location on land.
Other agencies consider waste emissions and harm to the region’s handful of endangered fish and birds (including the infamous delta smelt). Even a temporary project requires signoff from the US Army Corps of Engineers, which reviews obstruction to ship and boat traffic, and the water board. “For example, temporarily placing a large structure in an eelgrass bed could have lingering effects on the eelgrass, which is a critical habitat for certain fish,” the water board’s Lichten says.
NetworkOcean’s Kim tells WIRED that the company is cognizant of the concerns and is avoiding sensitive habitats. His cofounder Mendel says that they did contact one of the region’s regulators. In March, NetworkOcean spoke to an unspecified US Coast Guard representative about testing at the bottom of the bay and pumping in seawater as a coolant. The company later shifted to the current near-surface plans that don’t involve pumping. (A Coast Guard spokesperson declined to comment without more clarity on whom NetworkOcean allegedly contacted.)
For permanent installations, Kim and Mendel say they are eyeing other US and overseas locations, which they declined to name, and that they are engaging with the relevant regulators.
Mendel insists the “SF Bay” test announced last month will move forward—and soon. “We're still building the vessel,” he says. A community of marine scientists will be keeping their thermometers close.