It was her inner environmentalist that lured Asst. Professor Rachel Slaybaugh to Nuclear Engineering. “I have always been an environmentalist,” she explained. “When I was a freshman at Penn State I heard about this existing, large-scale, base load electricity source that didn’t emit air pollution, and I thought -- well, that sounds like a great way to get rid of coal plants -- I’m going to do that.” What Slaybaugh had heard about was nuclear power, which led to her B.S. in Nuclear Engineering from Penn State, then a Master’s and Ph.D. from the University of Wisconsin at Madison. She joined UC Berkeley’s Nuclear Engineering faculty in 2014, and is currently a Senior Fellow at the Berkeley Institute for Data Science.
Slaybaugh’s research focuses on numerical methods for neutral particle transport, which are critical to nuclear reactor design and shielding, and to accounting for the radioactive materials formed in fission reactions (of fundamental significance to nuclear weapons nonproliferation work). Applying these methods at a scale that models real-world fission events in a nuclear reactor requires high performance computing to solve the Boltzmann Transport Equation, which describes the probable position and momentum of millions of individual neutrons and, in aggregate, the behavior of the reactor’s nuclear fuel.
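For readers who want the mathematical context, a standard steady-state, k-eigenvalue form of that equation (a textbook statement, not the specific formulation used in Slaybaugh’s codes) is:

\[
\hat{\Omega}\cdot\nabla\psi(\vec{r},E,\hat{\Omega}) + \Sigma_t(\vec{r},E)\,\psi(\vec{r},E,\hat{\Omega})
= \int_0^\infty\!\int_{4\pi}\Sigma_s(\vec{r},E'\!\rightarrow\!E,\hat{\Omega}'\cdot\hat{\Omega})\,\psi(\vec{r},E',\hat{\Omega}')\,d\hat{\Omega}'\,dE'
+ \frac{\chi(E)}{4\pi k}\int_0^\infty\nu\Sigma_f(\vec{r},E')\,\phi(\vec{r},E')\,dE'
\]

where \(\psi\) is the angular neutron flux, \(\phi\) the scalar flux, \(\Sigma_t\), \(\Sigma_s\), and \(\Sigma_f\) the total, scattering, and fission cross sections, \(\chi\) the fission energy spectrum, \(\nu\) the mean number of neutrons per fission, and \(k\) the multiplication factor. Deterministic codes discretize this equation in space, angle, and energy; Monte Carlo codes sample it statistically. Either way, realistic problems quickly outgrow a single workstation.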
“I build tools that other people use,” Slaybaugh explains. “Really, what I do is applied math and computer science with some nuclear understanding sprinkled on top. We have an emphasis on high performance computing.” The software Slaybaugh and her students develop to apply numerical methods to problems in nuclear engineering is adopted by U.S. Department of Energy national laboratories, as well as by engineers engaged in design of next-generation reactors. Successfully running the software on a given cluster requires resolution of a host of compiler issues and library dependencies.
Before the Berkeley Research Computing (BRC) Program brought the Savio cluster online in 2013, Slaybaugh and several of her fellow NE professors ran their computations on an internally managed cluster. As she sees it, “there’s some benefit to having [a private cluster], because then we get complete control over the system. Except then you have to deal with having complete control over the system, and doing all the maintenance and upgrades yourself. If there’s a student passionate about doing that, that’s fine, but if there’s not it becomes a big burden. ... And it’s nice to have people [whose] professional job is to manage clusters. Then we’re not in this knowledge-transfer, grad student gap world.”
Because so much of Slaybaugh’s work is focused on developing code to be run on high performance computing clusters, she sees great value in teaching her students transparent and sustainable coding practices. These are often referred to as “software carpentry” skills, and are supported on campus through a local chapter of The Hacker Within, which meets weekly during the academic year at BIDS. As Slaybaugh puts it, “I think software carpentry and The Hacker Within are enormously important, as literally every student has to do computing now. Knowing how to do that work efficiently and reproducibly and transparently is increasingly important -- from their own efficiency standpoint, and from the question, ‘is something science if it’s not reproducible?’ -- and a lot of computational science is a little iffy if it’s not reproducible.”
Like any scientist whose research is complicated by intricate compilations and dependencies, and for whom reproducible computational environments are key to credibility, Slaybaugh is highly motivated to optimize and automate software installation. One emerging technology that addresses these concerns is Singularity, which enables one-time, scripted installation of software into an application container. Singularity containers can be built as an image runnable not only on Savio (as described in a recent BRC service announcement), but on any cluster, and also on Linux virtual machines -- including virtual development environments instantiated by Vagrant on a researcher's laptop. Graduate students working with Slaybaugh over the summer will be assessing whether and how Singularity can further streamline the setup of the research group’s computational research environments. One advantage Singularity offers Slaybaugh and her colleagues is that images that include export-controlled software (common in Nuclear Engineering work) can be archived and shared privately -- unlike Shifter, another emerging solution that she has considered, which currently requires images to be stored in publicly accessible archives.
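As a hypothetical illustration (not a recipe from Slaybaugh’s group), a minimal Singularity definition file that performs the kind of one-time, scripted installation described above might look like the following, with the package list and build steps standing in for whatever a given transport code actually needs:

    Bootstrap: docker
    From: ubuntu

    %post
        # one-time, scripted installation of a (hypothetical) build toolchain
        apt-get update && apt-get install -y build-essential gfortran cmake libopenmpi-dev python3
        # clone and build the research code here

    %environment
        export OMP_NUM_THREADS=1

    %runscript
        exec "$@"

Built once with “singularity build transport.sif transport.def”, the resulting image can be copied to Savio, to another cluster, or into a Vagrant-provisioned virtual machine and executed with “singularity exec” -- the portability that Slaybaugh’s group will be evaluating this summer.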
Singularity is not the only new IT support offering on campus in which Slaybaugh sees value. As funding agencies increasingly require development of a Data Management Plan (DMP) for submitted proposals, the Research Data Management (RDM) Program, a partnership between Research IT and The Library, is proving itself a key support to her work.
Speaking generally of Research IT and the BRC Program, Slaybaugh says, “It’s been interesting to see this whole process develop. We accidentally learned about Savio at the very beginning. To see it grow, and adding the Faculty Computing Allowance, and how it can be used in other parts of the campus -- reaching out to non-traditional fields -- it’s just been interesting to see how much the team wants to really be a resource on campus and figure out what works for the most people, and the most diverse set of people.”
Research IT welcomes inquiries from all researchers at UC Berkeley who have computational and/or data management needs. Please let us know if you think we might be able to help: research-it@berkeley.edu.