The moment I entered the room, the noise hit me. The space was filled with fans, cooling systems, and electrical equipment, all dedicated to serving a giant machine with 11 million compute cores.
This week, we got a rare look at El Capitan, the world’s fastest supercomputer. It sits inside Lawrence Livermore National Laboratory in California, where the $600 million machine constantly draws coolant and electricity.
In fact, the Linux-based computer is so large that it takes 5 to 9 million gallons of water each day to keep it cool. El Capitan also needs about 30 megawatts of electricity, roughly three times the power used by the city of Livermore, California.
All this computing power promises to pay off by unleashing cutting-edge research and even strengthening US national security. “We live for days like today,” said Lisa Su, CEO of AMD, a major hardware supplier for El Capitan, at Thursday’s unveiling event for the supercomputer.
The building that houses El Capitan, Sierra, Tuolumne and other supercomputers. (Credit: Michael Kahn/PCMag)
El Capitan is about to take on a crucial but classified mission: In March, it will begin conducting nuclear weapons research as part of the US effort to maintain its nuclear stockpile.
The US no longer conducts real-world nuclear bomb tests. Instead, it relies on supercomputers to perform sophisticated calculations that simulate nuclear explosions involving today’s aging stockpile. Lawrence Livermore National Laboratory is also home to Sierra, another massive supercomputer, which peaked as the world’s second fastest in 2018 and likewise runs classified nuclear weapons simulations. But in 2019, the US Department of Energy announced plans for a more powerful “exascale” supercomputer to take those simulations to the next level.
We got a rare look at the supercomputer before it starts performing classified research. (Credit: Michael Kahn/PCMag)
The result is El Capitan, which is over 20 times more powerful than Sierra. Specifically, the new machine can hit 2.79 exaFLOPS, or 2.79 quintillion calculations per second, roughly the combined computing power of about 1 million flagship smartphones.
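As a quick back-of-the-envelope check on those comparisons (a minimal sketch; the per-phone figure is our illustrative assumption, not a number from the lab):

```python
# Rough sanity check of the performance comparisons above.
el_capitan = 2.79e18  # 2.79 exaFLOPS peak, per the lab
sierra = 1.25e17      # Sierra's ~125 petaFLOPS peak

# Assumption: a flagship phone's GPU delivers very roughly 2.8 teraFLOPS;
# real figures vary widely by device and numeric precision.
phone = 2.8e12

print(f"El Capitan vs. Sierra: ~{el_capitan / sierra:.0f}x")        # ~22x, i.e. "over 20 times"
print(f"Phone equivalents:     ~{el_capitan / phone:,.0f}")         # ~996,429, about 1 million
```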
Despite the leap in computing power, El Capitan did not require an exponential expansion in size or power consumption. Like Sierra, the machine occupies only a single room in the lab building, filling a space about the size of a basketball court.
The machine’s compactness reflects the lengths to which Lawrence Livermore and its partners, HPE and AMD, have gone to prioritize efficiency while pushing today’s server technology to its limits. “It’s just extraordinary to pack that amount of computing power into that space,” said Marvin Adams, Deputy Administrator for Defense Programs at the US National Nuclear Security Administration.
Computer racks on the left and network parts on the right. (Credit: Michael Kahn/PCMag)
For perspective, Adams said that replicating El Capitan’s power in the 1990s would have required “half a million” of that era’s advanced supercomputers and more electricity than the US could generate. “And it would take several hundred square kilometers,” he said.
El Capitan looks nothing like a consumer PC. It spans multiple racks of servers, making the machine resemble a row of black monoliths. Still, it’s built from components by AMD, a leading supplier of chips for consumer PCs and enterprise hardware.
A set of the AMD Instinct MI300A APUs that power the supercomputer. (Credit: Michael Kahn/PCMag)
As in a data center, El Capitan’s computing happens within rows and rows of server blades, which house AMD’s Instinct MI300A APUs. The chip stands out for combining the CPU and GPU in the same package, increasing transistor density.
According to AMD, that integration was essential to improving El Capitan’s efficiency. Keeping the CPU and GPU separate in such a massive machine risked driving up its power requirements, size, and cost.
Su also revealed that El Capitan was not originally intended to use the company’s APUs, which stack CPU, GPU, and memory components on top of one another, similar to AMD’s 3D chip-stacking technology for Ryzen processors.
AMD CEO Lisa Su at the supercomputer’s unveiling. (Credit: Michael Kahn/PCMag)
The MI300A “was a big bet. It was an important bet,” Su told reporters. “I remember looking at the technology and going, ‘Oh my god, this thing is so complicated.’”
“We made the bet, and frankly, it’s now the foundation of how we believe we should build chips going forward for high-performance computing,” she added.
El Capitan also connects to numerous racks of network switches to receive and transfer data. To stay cool, the machine circulates a liquid glycol solution, which flows in through blue hoses and exits through red hoses after absorbing heat. A nearby heat exchanger then uses gallons of water to cool the glycol back down. That’s why you feel alternating rushes of cool and hot air as you walk past the various racks of servers.
A closer look at the server blades in the supercomputer. (Credit: Michael Kahn/PCMag)
Network cabling for the supercomputer. (Credit: Michael Kahn/PCMag)
All of El Capitan’s power comes from the local grid, said Kim Budil, director of Lawrence Livermore National Laboratory. The supercomputer took eight years to develop and is currently in “early access mode,” in which it has already shown it can run 3D physics simulations more accurately and in finer detail than other systems.
“The machine is doing calculations with numbers and degrees of freedom that we haven’t seen before,” said Teresa Bailey, associate program director for computational physics and weapons simulation at the lab.
But the real work begins in the coming weeks and months, as El Capitan takes up nuclear weapons research alongside computational work on fusion energy, climate change, and drug discovery.
An example of the 3D simulations El Capitan is running in early access mode. (Credit: Michael Kahn/PCMag)
“You get more physics or better (simulated) physics,” Budil said of the benefits of using a more powerful supercomputer. “You get more resolution, better resolution. You can have more simulations. And with El Cap, it will give us all three.”