The heart is the powerhouse that propels the body's essential systems. It delivers the oxygen and nutrients that keep our organs and tissues running, but it can't do that job when blood flow is obstructed or the heart itself is damaged. That is what happens in heart disease, which takes many forms and can cause signs and symptoms such as pain or pressure in the chest or upper body, breathlessness, dizziness, or unexplained fainting.
According to the Centers for Disease Control and Prevention, scientists have been making progress in treating and preventing heart diseases for decades, with deaths from heart attacks and other cardiovascular ailments falling by a stunning 69 percent between 1950 and 2009. But that rosy picture is overshadowed by a troubling trend: Deaths related to heart failure, arrhythmias, and other heart conditions have risen since 2022.
One obstacle is that many researchers lack the resources to run the large-scale studies needed to understand heart disease and develop effective treatments. This is especially true for scientists whose work demands high-powered computing capabilities normally available only on supercomputers. Now, a researcher at Harvard has reportedly used Alphabet Inc's cloud platform to clone a supercomputer for a study on heart disease. The novel move could help other researchers overcome a shortage of powerful computing resources and speed up their work.
Petros Koumoutsakos, a researcher at Harvard's School of Engineering and Applied Sciences, led a study that simulated a therapy aimed at dissolving blood clots and tumor cells in the human circulatory system. The simulation required enormous computing power that typically can only be harnessed with a supercomputer, so Koumoutsakos and his team used the Google Cloud platform to recreate that capability instead.
To clone the supercomputer, the research team used advanced algorithms to reproduce the system's computational structure at a smaller scale, a process that required adapting the software, the networking, and the layout of the physical hardware. The resulting platform achieved about 80 percent of the efficiency of state-of-the-art dedicated supercomputing facilities running extensively tuned code.
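The article does not describe the team's implementation, but simulations of flow through the circulatory system are typically spread across many networked machines using distributed-memory message passing, and that communication pattern is what a cloud cluster has to reproduce. The sketch below is a minimal, hypothetical illustration of that pattern in Python with mpi4py; the problem size, the smoothing kernel, and the node counts are invented for illustration and are not the Harvard team's code.

```python
# Minimal, hypothetical sketch of the distributed-memory pattern behind
# large flow simulations: each MPI rank owns a slab of the domain,
# exchanges ghost ("halo") cells with its neighbors, then updates its slab.
# The problem size and smoothing kernel are invented for illustration.
import numpy as np
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank = comm.Get_rank()
size = comm.Get_size()

local_n = 1_000_000 // size          # cells owned by this rank (hypothetical size)
u = np.random.rand(local_n + 2)      # +2 ghost cells, one on each side

def halo_exchange(u):
    """Swap ghost cells with neighboring ranks (periodic boundaries)."""
    left = (rank - 1) % size
    right = (rank + 1) % size
    # Send the first interior cell left, receive the right ghost cell, and vice versa.
    comm.Sendrecv(sendbuf=u[1:2], dest=left, recvbuf=u[-1:], source=right)
    comm.Sendrecv(sendbuf=u[-2:-1], dest=right, recvbuf=u[0:1], source=left)

t0 = MPI.Wtime()
for step in range(100):              # toy time-stepping loop
    halo_exchange(u)
    # A simple 3-point smoothing update stands in for the real physics kernel.
    u[1:-1] = 0.25 * (u[:-2] + 2.0 * u[1:-1] + u[2:])
elapsed = comm.allreduce(MPI.Wtime() - t0, op=MPI.MAX)

if rank == 0:
    print(f"{size} ranks finished 100 steps in {elapsed:.2f} s")
```

Communication steps like the halo exchange are where a cloud network differs most from a dedicated supercomputer interconnect, which is roughly what efficiency comparisons such as the 80 percent figure above capture.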
Koumoutsakos's project is one of several that have recently tapped into the power of cloud infrastructure to address scientific computing challenges. “Folks are realizing the potential of cloud to solve problems in technical, scientific, and engineering computing to unlock productivity — getting better answers, faster,” said Bill Magro, chief high-performance computing technologist at Google Cloud. But converting cloud infrastructure to mimic a supercomputer requires significant time and effort. The platform isn't designed for scientific and engineering computing out of the box, and it can be a challenging environment for users unfamiliar with the underlying technology. The difficulty is compounded because Google Cloud frequently deprecates older API versions, forcing scientists to update their software and applications continually, which could limit adoption of the approach.