Ron Cohen, Professor of Chemistry and Earth and Planetary Sciences at UC Berkeley, and graduate students Alexis Shusterman, Josh Laughner, and visiting Harvard graduate student Alex Turner, are “trying to create a model of the world that matches the observations of the world.” That is, Cohen’s team is using sensor technology to measure the concentrations of atmospheric gases at both local and global scales, and from that data developing computational models that describe the emission processes and rates behind those observations.
The team is working on numerous ongoing projects that follow this measure-and-model research structure. Several of these, Laughner explains, explore the chemistry of nitrogen oxides (NOx): their reactions with isoprene emitted by plants in rural areas, and their production by lightning strikes in the upper troposphere, measured by instruments mounted on aircraft or satellites. Another project, however, resides particularly close to home: the BErkeley Atmospheric CO2 Observation Network (BEACO2N). To understand how human behaviors and energy consumption at a community scale influence the climate and air pollution, Cohen’s research team is using inexpensive sensor technology to measure atmospheric gas concentrations across Oakland, Berkeley, and San Francisco. Installed on rooftops in urban locations, BEACO2N nodes help inform community members of their contributions to emissions and provide quantitative data for measuring the efficacy of environmental policy in the Bay Area.
Having contributed both big-memory and standard nodes to Berkeley Research Computing’s (BRC) High Performance Computing (HPC) cluster, Savio, Cohen’s team uses computation to solve the millions of differential equations associated with the chemical and physical qualities of the air observed by BEACO2N nodes, from which the team builds its models.
Measure and Model
The atmosphere absorbs and carries the emitted byproducts of combustion, the process by which carbon-based fuels are burned to produce energy. Cohen’s team measures the concentrations of these byproducts and other species -- including carbon dioxide (CO2), nitric oxide (NO), nitrogen dioxide (NO2), carbon monoxide (CO), aerosols, and ozone (O3) -- with sensors aboard airplanes and satellites, and on the ground at locations around the world. “Spaced approximately 2 km apart throughout Oakland, and soon expanding into San Francisco and Richmond,” Shusterman says, BEACO2N nodes are doing this work on a local scale. Constructed from “low to moderate cost off-the-shelf sensors,” she continues, the nodes are relatively inexpensive, meaning “the lab can buy a greater number of them,” deploy them “in a higher spatial density in urban areas,” and ultimately collect data that better “reflects the heterogeneity of atmospheric patterns” than would a single, expensive, fine-tuned sensor at an isolated location.
“How do you identify the emission processes controlling the concentrations of gases in the atmosphere from the observation of those concentrations?” Cohen asks. The byproducts of combustion are “emitted at the land surface,” he explains, “at unknown rates, and at unknown locations.” These gases “are then moved around in the atmosphere by winds,” at some point crossing paths with a BEACO2N node. Although a node’s concentration measurement conveys important information about air quality at that place and time, it does not necessarily reveal how, when, or where the gases were emitted. And it is particularly unclear how to measure community emissions -- from home heating, for example, or from a commuter’s drive to work. One option would be to observe purchasing habits and thereby measure fuel consumption: “look at the tax receipts,” as Cohen says. “But even then we have no way of considering spatial allocation of this resource consumption. If a neighborhood banded together and decided to all buy electric cars, and their emission levels went to zero, no one would know” -- precisely the kind of information that could be useful to policy makers.
Pairing computational modeling with the observations gathered by BEACO2N nodes is the key to identifying emission sources, rates, and ongoing patterns. Cohen’s team is working with two distinct classes of modeling for the project: forward and inverse. The forward model, for which Shusterman is a lead developer with the support of fellow graduate student Laughner and postdoc Azimeh Zare, makes “predictions about what we’ll see [in regard to atmospheric gas concentrations] over long time periods, and then compares those to emerging observations.” This effort draws “connections between the things we think we know about emissions,” Cohen says -- educated guesses, for example, about how many miles per gallon a commuter gets across an average distance in a certain locale -- “to create a model that represents that perception.” The accuracy of that perception can be assessed when the team compares the results of the model to the observed measurements from the nodes. Conversely, the inverse model, for which Alex Turner of Harvard was the lead contributor during his visits to Cohen’s lab in 2014 and 2016, “takes the predictions of a model at a point in time and runs them backwards” to determine the initial conditions or sources that gave rise to the predicted effect. The emission processes hypothesized by the inverse model are tweaked until they yield concentrations that match observations.
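To make the distinction concrete, here is a minimal sketch -- not the team’s actual code -- of the two approaches on a toy problem. It assumes a made-up linear “transport” matrix H linking a handful of emission sources to a handful of sensor sites; the real models couple transport to chemistry across millions of grid cells.

```python
# Toy illustration of forward vs. inverse modeling (illustrative only).
import numpy as np

rng = np.random.default_rng(0)

n_sources, n_sensors = 4, 10
H = rng.random((n_sensors, n_sources))   # hypothetical transport operator
true_emissions = np.array([2.0, 0.5, 1.5, 3.0])

# Forward model: predict sensor concentrations from an assumed set of
# emissions, then compare against (simulated) observations.
observations = H @ true_emissions + rng.normal(0, 0.05, n_sensors)
guess = np.ones(n_sources)               # an initial "perception" of emissions
print("forward-model mismatch:", np.linalg.norm(H @ guess - observations))

# Inverse model: adjust the emissions until the modeled concentrations
# match the observations -- here via a simple least-squares solve.
estimated, *_ = np.linalg.lstsq(H, observations, rcond=None)
print("recovered emissions:", np.round(estimated, 2))
```

In this toy setup the least-squares solve recovers the true emission rates almost exactly; in practice the inverse problem is far larger, noisier, and underdetermined, which is where the heavy computation comes in.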
Computational fluid dynamics is hard
“Computational fluid dynamics is hard,” says Cohen, in reference to his team’s use of Savio, the HPC cluster supported by BRC. Recall that winds move combustion byproducts after they are emitted; to “identify the magnitude and location of the sources” of emissions via modeling, one therefore has to understand and model those wind currents. Computational fluid dynamics -- “a weather model in which you solve equations for the motion of air, where air acts as a fluid,” as Cohen explains -- is a fundamental component of the team’s modeling effort. To build their models, Cohen’s team has to solve, simultaneously, “millions of differential equations” associated with the chemical and physical qualities of the air -- including pressure gradients, temperature, and the condensation of water, which drives both cloud formation and energy release -- within numerous coupled grid cells of approximately “1 kilometer (horizontal) by 1 kilometer (vertical),” and “with a time resolution (minutes) that captures changes in concentrations.” These calculations demand the kind of computational power and storage Savio provides, and are accelerated through parallelization. The Cohen condo’s “big memory nodes are especially important for the inverse modeling Alex Turner was doing, and which Alexis is continuing to work on,” Laughner explains: part of this modeling involves trillion-element matrices -- too large even for those enhanced nodes until they are reduced to a manageable size by “some clever linear algebra.”
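As a rough illustration of why these calculations grow so large (a sketch under simplified assumptions, not the team’s model), consider advecting a CO2 plume along a single line of 1 km grid cells: even this one-dimensional toy must update every cell at every minute-scale timestep, and the real three-dimensional models multiply that work across millions of coupled cells plus chemistry, pressure, temperature, and moisture.

```python
# Minimal 1-D advection sketch (illustrative only): wind carries a CO2
# plume along a row of ~1 km grid cells via a simple upwind scheme.
import numpy as np

n_cells = 100                  # real models: millions of 3-D cells
dx = 1000.0                    # ~1 km cell width (m)
wind = 5.0                     # uniform wind speed (m/s)
dt = 60.0                      # minute-scale timestep (s)

co2 = np.full(n_cells, 400.0)  # background CO2 (ppm)
co2[10:15] = 450.0             # an emission plume near one end

for step in range(240):        # four simulated hours
    # Upwind finite difference: each cell is coupled to its neighbor,
    # so every cell's equation must be updated at every timestep.
    co2[1:] -= wind * dt / dx * (co2[1:] - co2[:-1])

print("plume peak now near cell", int(np.argmax(co2)))
```

Scaling this loop up to three dimensions, adding chemistry in every cell, and solving it all simultaneously is what pushes the work onto parallel HPC hardware.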
Beyond its utility as a modeling workhorse, the Savio cluster serves as a teaching tool within Cohen’s research team: new graduate students use it to learn the computation associated with the lab’s research goals and to pilot their calculations before running even larger tasks on Yellowstone, a massive, nationally supported HPC cluster operated by the National Center for Atmospheric Research.
Education and Outreach
Most BEACO2N nodes are installed on the rooftops of local schools and museums, and so are well positioned to serve an educational function. “The informal science education aspect of our work is a high priority,” says Cohen, who recalls that the project was “initially funded as a direct collaboration with the Chabot Space and Science Center.” In this partnership, Cohen explains, his research team “would do the science, and Chabot would take the lead on developing a teacher training using the data” collected by the nodes, facilitating instruction in “atmospheric science and climate change for the high schools and middle schools” in the area. These origins in public outreach and education have strongly influenced the BEACO2N project’s open-science orientation. The CO2 measurements collected by the nodes are publicly accessible on the BEACO2N website, making it possible for teachers to use hard data in their instruction on climate-related topics, and for students to “use some of the data produced by BEACO2N for science fair projects or reports, especially if there is a sensor on top of their school,” Shusterman says.
Paired with this in-class outreach, Cohen’s team is making a movie, to which computation run on Savio has made a major contribution. “It’s analogous to the fog movie on the nightly news,” Cohen explains, “but what you’ll see is CO2, instead of clouds.” While this swirling computational visualization of carbon emissions won’t air on the nightly news, it is expected to be on display at the Exploratorium in San Francisco soon. The animation shown in this article is a preliminary result created by Shusterman using both inverse and forward modeling; it renders CO2 concentrations as colors on a map of the Bay Area.
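For readers curious about the mechanics, a frame-by-frame rendering of this kind can be sketched in a few lines. The snippet below uses randomly generated placeholder values rather than BEACO2N data, and the Bay Area bounding box is approximate.

```python
# Hedged sketch of frame-by-frame CO2 map rendering (placeholder data,
# not BEACO2N output; bounding box is approximate).
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(1)
extent = (-122.7, -121.8, 37.2, 38.1)   # rough lon/lat box around the Bay Area

for frame in range(3):                   # a real animation renders many frames
    co2_field = 400 + 20 * rng.random((50, 50))   # placeholder CO2 grid (ppm)
    fig, ax = plt.subplots()
    im = ax.imshow(co2_field, origin="lower", extent=extent, cmap="viridis")
    fig.colorbar(im, ax=ax, label="CO2 (ppm)")
    ax.set_xlabel("longitude")
    ax.set_ylabel("latitude")
    fig.savefig(f"co2_frame_{frame:03d}.png")     # stitch frames into a movie
    plt.close(fig)
```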
“We have an obligation,” Cohen says, “to educate people about what we know as scientists, so that individuals can make choices that reflect their own values but are grounded in the facts as we see them.” With this directive in mind, Cohen and his team pursue science at the interface of personal habits and public policy: they strive to understand “what kinds of things we’re doing [at each of these levels] to reduce emissions,” and then to evaluate the efficacy of those practices with objective data and physical models.