University of Pittsburgh

Super Models

Written by Sharon Guynup

In a vast, water-pocked region of Western Siberia, a Russian crew explores the Messoyahka natural-gas field, drilling deep into thick permafrost. Engineers—ever alert for changes in downward progress that could signal an oil or gas deposit—note that suddenly the drill rig blasts through something that isn’t bedrock. They carefully examine the rock cuttings, dirt, and mud that flow up from the drill hole. Oddly, they find something they’ve never seen in the mix before: icy chunks embedded in the core sample. The ice melts and the chunks disintegrate when pulled to the surface.

The drilling crew soon learned that they had found a substance thought to exist naturally only in the farthest reaches of the solar system. They had discovered methane hydrate, or “methane ice,” in nature on Earth. What they couldn’t know then—in 1964, in the remote fields of Messoyahka—was that the frozen gas may hold a key to solving 21st-century energy problems.

Today, it’s known that abundant methane hydrate riddles permafrost across the Arctic and also lies entombed under deep-ocean sediment, forming the planet’s single-largest carbon reservoir. The ice chunks form when bubbles of methane gas rise from, say, a fault into a frigid, high-pressure environment that squeezes water and methane gas together into a solid. The methane molecules are then trapped within water cages in ice-like crystals.

With about 23 percent of Earth’s surface frozen as permafrost and much of the ocean deep enough to form hydrates, the potential size of these deposits appears to be staggering. Methane hydrate is likely to be far more abundant than all of the remaining oil, coal, and gas fields combined, according to Timothy Collett, a research geologist with the U.S. Geological Survey.

Could these mysterious subterranean deposits be harnessed as a future source of natural gas? With most methane hydrate deposits buried at least a quarter-mile beneath the sea surface or locked in Arctic permafrost, these formations are extremely difficult to study. Yet, Pitt chemist Kenneth D. Jordan is collaborating with the U.S. Department of Energy’s National Energy Technology Laboratory to explore methane hydrate’s mysterious properties from afar: He’s conducting up-close research from his office in Eberly Hall on the Pittsburgh campus.

Jordan is using computer simulation to explore methane hydrate as it exists in Arctic permafrost and in deep-ocean sediments. In essence, he’s applying mathematical formulas to simulate the substance’s structure and dynamics.

Using “virtual” modeling, Jordan and his team run a series of calculations to examine how the properties of methane hydrate depend on temperature and on the occupation of the water cages. Computer modeling allows them to conduct experiments that couldn’t be carried out in nature or in a laboratory, such as examining how the crystals form and whether the presence of the methane molecules affects how efficiently the crystals conduct heat. Jordan explains that in nature, under high enough pressure and at low enough temperature, water and methane form methane hydrate crystals, trapping the methane molecules in water cages. “On the computer, you can build these structures with or without methane molecules to analyze how the presence of methane affects heat movement through the crystals,” he says.
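
To give a flavor of this kind of numerical experiment, the sketch below (a toy model, not Jordan’s actual code) compares how energy flows through a simplified one-dimensional lattice with and without extra “guest” masses attached to its sites; the lattice, the guests, and every parameter are illustrative assumptions rather than real methane hydrate physics.

```python
# Toy sketch: heat transport through a 1-D harmonic chain, with and without
# "guest" masses standing in for molecules trapped in the lattice cages.
# All values are illustrative; this is not a model of real methane hydrate.
import numpy as np

def simulate_chain(n_sites=64, guest_mass=0.0, steps=20000, dt=0.005, seed=0):
    """Velocity-Verlet integration of a harmonic chain held hot at one end and
    cold at the other. Returns the average energy flux through the middle bond,
    a crude proxy for how efficiently the lattice conducts heat."""
    rng = np.random.default_rng(seed)
    m = np.ones(n_sites)
    if guest_mass > 0:
        m[::2] += guest_mass                 # make every other site heavier ("occupied cage")
    x = np.zeros(n_sites)                    # displacements from lattice positions
    v = rng.normal(0.0, 0.1, n_sites)        # small random starting velocities

    def forces(x):
        f = np.zeros_like(x)
        f[1:] += -(x[1:] - x[:-1])           # spring to the left neighbor (k = 1)
        f[:-1] += -(x[:-1] - x[1:])          # spring to the right neighbor
        return f

    mid = n_sites // 2
    flux = 0.0
    f = forces(x)
    for _ in range(steps):
        v += 0.5 * dt * f / m
        x += dt * v
        f = forces(x)
        v += 0.5 * dt * f / m
        v[0] = rng.normal(0.0, 0.3)          # crude thermostat: hot end
        v[-1] = rng.normal(0.0, 0.05)        # crude thermostat: cold end
        # energy flux through the middle bond: spring force times average velocity
        flux += -(x[mid] - x[mid - 1]) * 0.5 * (v[mid] + v[mid - 1])
    return flux / steps

print("lattice without guests:", simulate_chain(guest_mass=0.0))
print("lattice with guests:   ", simulate_chain(guest_mass=4.0))
```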

Jordan, Distinguished Professor of Computational Chemistry at Pitt, runs these simulations on anywhere from 8 to 32 computers simultaneously, a method called parallel processing. Even when combining the power of multiple computers, a single set of calculations can take anywhere from a few days to a week to complete, depending on the nature and complexity of the simulation. The computers are linked together on a lightning-fast “InfiniBand” network, some 30 times faster than gigabit Ethernet.
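
In spirit, that kind of parallel workload can be sketched with Python’s standard multiprocessing module (a simplified illustration, not the group’s actual setup): independent runs in a parameter sweep are handed to several worker processes, and the results are collected at the end.

```python
# Simplified illustration of parallel processing: a sweep over hypothetical
# simulation settings is split across worker processes, each piece running
# independently. This is not the research group's actual code.
import math
from multiprocessing import Pool

def run_simulation(temperature):
    """Stand-in for one expensive simulation at a given temperature (arbitrary units)."""
    total = 0.0
    for step in range(200_000):              # placeholder numerical workload
        total += math.exp(-step / (1.0 + temperature))
    return temperature, total

if __name__ == "__main__":
    temperatures = [100 + 10 * i for i in range(16)]      # 16 independent runs
    with Pool(processes=8) as pool:                       # e.g., 8 cores or nodes
        results = pool.map(run_simulation, temperatures)  # runs execute in parallel
    for temperature, value in results:
        print(f"T = {temperature}: result = {value:.3f}")
```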

“We couldn’t have done this type of study 15 years ago,” he says. Back then, computers could only model a few hundred atoms; now, with much faster computers working together in parallel, researchers can model the behavior of systems containing thousands, even millions of atoms. “It means that we can model much more accurately.”

Even with these advances, researchers still face problems that span a wide range of time and/or length scales, requiring the development of new multiscale algorithms. Jordan is just one of scores of Pitt scientists working across many disciplines who are pairing multiscale modeling and parallel processing with traditional laboratory experiments. This means that physicists, biologists, chemists, engineers, doctors, and others are now collaborating closely with top computer programmers—although there are rare birds who can do both.

This fall, the University of Pittsburgh launched a new, multidisciplinary Center for Simulation and Modeling (SAM), which will speed innovation and discovery in computation-based research. Research can be conducted using any of the networked supercomputers clustered around campus. The center will help researchers wean themselves from limited serial processing and gain expertise in parallel processing and multiscale modeling. It will bring together about 50 faculty members and more than 100 graduate students from widely varied fields to discuss problems and to work with the center’s newly hired consultants, experts in translating research problems into computer programs that can help answer researchers’ questions.

“What we’re doing is providing researchers with resources to tackle their problems,” says George Klinzing, Pitt’s vice provost for research. “We want them to be at the forefront with the tools they need to make important breakthroughs.” The new center is located in Bellefield Hall. Jordan is codirecting the center with J. Karl Johnson, interim chair and W.K. Whiteford Professor in the Department of Chemical and Petroleum Engineering.

For some problems, researchers use existing computer codes; all that’s needed is to plug in and run the data. But for others, it’s necessary to write code from scratch. Helping scientists write such codes is the main work of the center’s consultants. The process starts with identifying the research questions and then selecting one of two simulation methods according to the time and length scales of the project. The “Monte Carlo” technique is used to sample different configurations, perhaps by rotating, grouping, or moving molecules, while “molecular dynamics” is used to track motion over time, such as the changing configurations of a protein folding during a nanosecond transformation.
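
The Metropolis Monte Carlo idea can be shown with a minimal sketch (illustrative only; the toy energy function and parameters are assumptions, not drawn from any Pitt project): propose a random change to a configuration, then accept or reject it according to the change in energy, so that low-energy configurations are visited most often. Molecular dynamics, by contrast, would integrate equations of motion to follow the same system through time.

```python
# Minimal Metropolis Monte Carlo sketch on a toy one-dimensional "energy surface".
# Illustrative only: the double-well potential stands in for a molecular energy.
import math
import random

def energy(x):
    """Toy double-well potential with minima at x = -1 and x = +1."""
    return (x**2 - 1.0)**2

def metropolis(steps=100_000, temperature=0.2, step_size=0.5, seed=1):
    random.seed(seed)
    x = 0.0                                   # starting configuration
    samples = []
    for _ in range(steps):
        x_new = x + random.uniform(-step_size, step_size)   # propose a random move
        delta = energy(x_new) - energy(x)
        # accept downhill moves always, uphill moves with Boltzmann probability
        if delta <= 0 or random.random() < math.exp(-delta / temperature):
            x = x_new
        samples.append(x)
    return samples

samples = metropolis()
right_well = sum(1 for x in samples if x > 0) / len(samples)
print(f"fraction of samples in the right-hand well: {right_well:.2f}")
```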

Also key to the center’s mission is helping researchers at Pitt take advantage of parallel computing, which, says Johnson, is a tool that’s revolutionizing researchers’ ability to model complex systems. “Anything that can be analyzed quantitatively, turned into an equation, translated into an algorithm, and put on a computer can be simulated and modeled,” he says. “Parallel computing means you can tackle more realistic, more important problems.”

Indeed, computers can be used to carry out experiments that would be too costly, too complicated, or simply impossible to do in a traditional laboratory setting. Jordan remembers a time, not that many years ago, when lab scientists were suspicious of computer modeling, but notes that today many breakthroughs are only possible as a result of collaborations between computational and experimental researchers. “This close coupling between simulation and more traditional experiments is changing the way that a lot of modern science is done,” he says. Computer modeling allows researchers to tackle complex questions on everything from energy and climate change, to the ups and downs of the world economy, to the spread of infectious diseases.

Predicting the spread—and prevention—of a global viral epidemic is just such an example. To simulate the path and infection rate of a rampaging infectious disease, Donald S. Burke, dean of Pitt’s Graduate School of Public Health, led a research team at the University of Pittsburgh, Johns Hopkins University, and Imperial College London to craft a complex model. He and his team needed to consider how many people lived in a particular location, where they lived, and in what concentrations. He had to put virtual people into households, use transportation data to figure out where and how they went to work or school, places where they would mix germs freely—and then carry those bugs back home. Then he introduced a disease: avian flu. Drawing on data from the 1918 flu pandemic, he entered information about how long a carrier is contagious, how quickly the disease spreads, and at what proximity.

After putting all this data into a simulation format, he watched individuals on the screen turn red as they contracted the disease and green as they recovered. Then he changed the parameters to watch what would happen if the flu strain were more or less deadly or infectious than the 1918 strain. Using his models detailing the speed and patterns of infection, the Centers for Disease Control and Prevention, the U.S. Department of Homeland Security, and the U.S. Department of Health and Human Services have crafted new policies on how to respond to an outbreak, including school closings and restrictions on travel.
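
A far simpler cousin of Burke’s agent-based model, a basic stochastic SIR simulation with made-up parameters, conveys how changing a strain’s infectiousness changes the size of an outbreak:

```python
# Basic stochastic SIR (susceptible-infected-recovered) outbreak model.
# All parameters are made up for illustration; this is not Burke's model.
import random

def run_outbreak(population=10_000, transmission_rate=0.30, recovery_rate=0.20,
                 initial_infected=5, days=365, seed=42):
    """Daily chain-binomial SIR model; returns how many people were ever infected."""
    random.seed(seed)
    s, i = population - initial_infected, initial_infected
    for _ in range(days):
        # each infected person may pass the disease on and/or recover today
        new_infections = min(s, sum(1 for _ in range(i)
                                    if random.random() < transmission_rate * s / population))
        new_recoveries = sum(1 for _ in range(i) if random.random() < recovery_rate)
        s -= new_infections
        i += new_infections - new_recoveries
        if i == 0:                            # the outbreak has burned out
            break
    return population - s                     # everyone who was ever infected

for rate in (0.25, 0.30, 0.40):               # milder, baseline, and more infectious strains
    total = run_outbreak(transmission_rate=rate)
    print(f"transmission rate {rate:.2f}: {total:,} of 10,000 people infected")
```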

Burke, who also is director of Pitt’s Center for Vaccine Research and is UPMC-Jonas Salk Professor of Global Health, collaborated with the Ministry of Health in Thailand to see whether such an epidemic, if it were to strike in Southeast Asia, could be stopped before it raged out of control—and how big a stockpile of antiviral drugs it would take. His computer model assumed that the drug would be administered to everyone within one kilometer of a known outbreak. He quizzed public health experts on how quickly the drug could be distributed (48 hours, even in an emergency) and what fraction of the population would actually take it (about 60 percent). His model showed that a maximum of three million courses of antiviral drugs would be needed to quench the spread of the disease. In response to his study, the pharmaceutical firm Hoffmann-La Roche donated three million doses to the World Health Organization.

“With this kind of computing power,” says Burke, “it’s possible to track what would happen to 85 million people in Thailand.”

Over the next four years, with funding from the Bill and Melinda Gates Foundation, Burke will use this kind of modeling to identify what is needed in new or improved vaccines for maladies such as measles, malaria, dengue fever, and influenza.

When asked about the importance of modeling in his work, Burke notes that Pitt is on the cutting edge, using methods that he expects to change the face of public health research. Then he laughs and quotes J.W. Forrester, the father of systems science: “All decisions are based on the models. Most models are in our heads; mental models are not true and accurate images of our surroundings but are a set of assumptions and observations gained from experiences. Computer simulation models can compensate for weakness in mental models.”

Through Pitt’s new simulation center, codirector J. Karl Johnson and his team are studying new ways of storing carbon dioxide, a major contributor to the planet’s recent and worrisome warm-up. Johnson’s simulation work at Pitt is helping researchers to design materials that could capture CO2 for storage underground or under the sea.

Johnson says that until the nation and the world develop efficient renewable sources of energy, we’ll still be using fossil fuels, probably for the next few decades. The question he’s tackling, using computer simulations, is whether we can use oil, coal, and natural gas without releasing CO2 into the atmosphere. That solution would help to limit CO2’s mounting environmental damage until new alternative energy sources become viable.

In other CO2 research, Johnson’s team is trying to enhance oil recovery from “dry” oil wells. Team members are modeling ways to modify CO2 with various polymers that make the molecule more closely resemble oil—so that, when injected into a well, it could push remaining oil to the surface. “The United States has a lot of oil wells that are not producing anymore,” says Johnson. “If we could squeeze more oil out of them, it would aid in our quest for energy independence.”

Simulation is particularly valuable in the creation of new materials and is capable of “computational discovery”—the virtual discovery of something that has not yet been seen in the laboratory. Geoffrey Hutchison, assistant professor of chemistry at Pitt, is investigating what could prove to be an inexpensive, innovative energy source—a new kind of solar cell made from conductive plastic. The cells could be dissolved, like ink, and painted on roofs or cars to supply electricity. They could be produced in flexible, lightweight rolls that people could buy at the hardware store and trim to size.

So far, though, the polymers being used do not conduct electricity well enough. The more traditional silicon solar panels currently in use are roughly 20 percent energy efficient; the new solar-cell materials in development are just five to six percent efficient. Hutchison is using simulations to guide the synthesis of new materials.

Hutchison is trying to understand how electrical current moves on the nanoscale (about one-thousandth the diameter of a human hair). Do the currents weave, move in straight lines, or bounce off the walls like billiard balls? Do impurities always act as roadblocks? How long do the materials hold a charge? Computer simulations are giving him the answers.
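
One simple way to picture such questions (purely illustrative, not Hutchison’s method) is a random-hopping model of a charge carrier on a small two-dimensional lattice dotted with impurity sites that block hops; how far the carrier wanders is a crude stand-in for how well the material conducts.

```python
# Illustrative hopping model of charge transport: a carrier takes random hops
# between neighboring lattice sites, and impurity sites act as roadblocks.
import random

def mean_squared_displacement(size=201, impurity_fraction=0.0, hops=2000,
                              trials=200, seed=0):
    """Average squared distance the carrier travels from its starting site."""
    rng = random.Random(seed)
    blocked = {(x, y) for x in range(size) for y in range(size)
               if rng.random() < impurity_fraction}   # scatter impurities at random
    start = size // 2
    total = 0.0
    for _ in range(trials):
        x = y = start                                  # begin in the middle of the lattice
        for _ in range(hops):
            dx, dy = rng.choice([(1, 0), (-1, 0), (0, 1), (0, -1)])
            nx, ny = x + dx, y + dy
            inside = 0 <= nx < size and 0 <= ny < size
            if inside and (nx, ny) not in blocked:     # impurities (and edges) block the hop
                x, y = nx, ny
        total += (x - start) ** 2 + (y - start) ** 2
    return total / trials

for fraction in (0.0, 0.1, 0.3):
    msd = mean_squared_displacement(impurity_fraction=fraction)
    print(f"impurity fraction {fraction}: mean squared displacement = {msd:.0f}")
```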

Hutchison uses computational chemistry to build a hundred or more possible new materials, adding a carbon atom here, exchanging it for an oxygen atom there—sometimes with surprising results. “Making what seems like a subtle change may change the entire shape of a molecule,” he says. “We like molecules to be flat, which spreads the electrical charge over a larger area than twisted or spiral-shaped molecules, which are less conductive.”
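
The link between flatness and conductivity can be illustrated with a small geometric calculation on hypothetical coordinates (not Hutchison’s actual candidates): fit a best plane to a molecule’s atomic positions and use the root-mean-square deviation from that plane as a simple planarity score, with flatter candidates scoring closer to zero.

```python
# Illustrative planarity score for candidate molecules: fit a best plane to the
# atomic coordinates via singular value decomposition and report the RMS
# deviation from that plane. Geometries below are hypothetical placeholders.
import numpy as np

def planarity_rmsd(coords):
    """RMS distance of atoms from their best-fit plane (same units as coords)."""
    pts = np.asarray(coords, dtype=float)
    centered = pts - pts.mean(axis=0)
    # The right singular vector with the smallest singular value is the plane normal.
    _, _, vt = np.linalg.svd(centered)
    normal = vt[-1]
    distances = centered @ normal
    return float(np.sqrt(np.mean(distances**2)))

# Hypothetical 3-D coordinates (in angstroms) for two made-up candidate backbones.
flat_candidate = [(0, 0, 0), (1.4, 0, 0), (2.1, 1.2, 0), (3.5, 1.2, 0), (4.2, 0, 0)]
twisted_candidate = [(0, 0, 0), (1.4, 0, 0.3), (2.1, 1.2, 0.9), (3.5, 1.2, 1.6), (4.2, 0, 2.4)]

for name, coords in [("flat", flat_candidate), ("twisted", twisted_candidate)]:
    print(f"{name} candidate: planarity RMSD = {planarity_rmsd(coords):.2f}")
```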

The result: He has already come up with five new molecules that look like promising solar materials. In the lab, it would take many months to synthesize an array of polymers, then months more to test them. Instead, using computer simulations, Hutchison was able to find a number of possibilities in a fraction of the time and at much lower cost. Now, he will collaborate with colleagues at Pitt and elsewhere to make and test these new polymers.

The new Center for Simulation and Modeling, says Klinzing, puts Pitt on a vast frontier for the next scientific revolutions in many fields. Among the primary areas of research are energy and sustainability, nanoscience and materials engineering, medicine and biology, public health, economics and the social sciences, and visualization.

“This center provides our faculty with top-notch resources and the ability to be truly cutting edge on multiple fronts,” states Klinzing. He adds that, going forward, multiscale modeling will have a major impact on understanding a large range of physical and biological processes, enabling the design of completely new molecules and materials for specific, targeted functions.

Back in Eberly Hall, Jordan’s work with methane ice extends beyond its potential as a major source of future energy. The hydrate fields have been in the news recently for other reasons. There is concern that warming oceans and a melting Arctic could trigger a massive, potentially catastrophic methane release. Since methane is a potent greenhouse gas—about 20 times more potent than CO2—a large influx into our atmosphere would quickly heat oceans and thaw permafrost, creating a cycle that could seriously accelerate global warming.

Computer simulation enables researchers like Jordan to gain an intimate understanding of methane hydrate, even from thousands of miles away. Solutions are as close as a computer screen and the brainpower of colleagues next door. A few decades ago, that was unthinkable. Instead, an isolated Russian crew on a vast Siberian plain huddled around the ice-like material retrieved while drilling for oil and gas. They wondered about their find while, before their eyes, the unusual chunks began to melt away.

For more information about Pitt’s new Center for Simulation and Modeling, please visit www.sam.pitt.edu.
