NERSC Resources Help Find Mismatches in Tropical Disturbance Model

An Alaska-sized pulse of clouds and precipitation congregates over the Indian Ocean and lumbers east. Soon it will be mustering monsoons over Mumbai and developing downpours for Portland. So big, so enigmatic, it has a name: the Madden-Julian Oscillation (MJO).

The MJO occurs on its own timetable—every 30 to 60 days—but its worldwide impact spurs scientists to unlock its secrets. The ultimate answer? Timely preparation for the precipitation havoc it brings—and insight into how it will behave when pressured by a warming climate.

Although it was identified and named in the 1970s, the MJO remains a challenge to simulate and predict. Working to reveal the secrets of the MJO's cycle, a research team from Pacific Northwest National Laboratory (PNNL) used NERSC's Edison supercomputer, along with data gathered during a field campaign over the Pacific Ocean, to identify the processes responsible for the models producing too much precipitation, especially during the low-rainfall period of the MJO signal. The study was published in the Journal of Climate.

An Aerial View of Wang Hall

For a bird's-eye view of Wang Hall, check out the recent series of aerial photographs taken by Lab photographer Roy Kaltschmidt. Perched in the Berkeley hills, the Wang Hall Computational Research and Theory facility takes advantage of the Bay Area's temperate climate to cool both our scientific supercomputers and our energy-efficient office space.

This Week’s CS Seminars

Monday, April 18

Non-Uniform 3-Dimensional Fast Fourier Transform on Multiple GPUs
11 a.m. to 12 p.m., Wang Hall, Bldg. 59, Room 4102
Kumar Aatish, ArrayFire

While the FFT computes the discrete Fourier transform in O(N log N) time, it relies on the fact that the input is sampled at equidistant intervals. The non-uniform fast Fourier transform (NFFT) seeks to overcome this restriction, since many scientific disciplines need to analyze irregularly sampled data. This talk deals with the implementation specifics of inverting the NFFT, that is, computing Fourier coefficients from samples given at irregular points in a volume, using the conjugate gradient residual minimization method on multiple CUDA-capable GPUs.
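
To make the problem concrete, the sketch below (a minimal plain-C++ illustration, not the speaker's multi-GPU implementation; the coefficient values and sample points are made up) evaluates a 1-D non-uniform discrete Fourier transform by brute force, which costs O(N·M) work for N frequencies and M sample points. Fast NFFT algorithms approximate these sums far more cheaply, and the inversion discussed in the talk typically amounts to solving the corresponding linear system for the coefficients with an iterative method such as conjugate gradient residual minimization.

```cpp
// Minimal sketch: direct (brute-force) evaluation of a 1-D non-uniform
// discrete Fourier transform, the operation that fast NFFT algorithms
// accelerate. Illustrative only; coefficients and sample points are made up.
#include <cmath>
#include <complex>
#include <cstdio>
#include <vector>

int main() {
    const double PI = 3.14159265358979323846;

    // Fourier coefficients fhat_k for frequencies k = -N/2 .. N/2 - 1.
    const int N = 8;
    std::vector<std::complex<double>> fhat(N, {1.0, 0.0});

    // Irregular (non-equidistant) sample locations x_j in [-0.5, 0.5).
    const std::vector<double> x = {-0.43, -0.21, -0.05, 0.02, 0.11, 0.27, 0.38};

    // f_j = sum_k fhat_k * exp(-2*pi*i*k*x_j): O(N*M) work when done directly.
    for (double xj : x) {
        std::complex<double> fj(0.0, 0.0);
        for (int k = -N / 2; k < N / 2; ++k) {
            const double phase = -2.0 * PI * k * xj;
            fj += fhat[k + N / 2] * std::complex<double>(std::cos(phase), std::sin(phase));
        }
        std::printf("f(%+.2f) = %9.4f %+9.4fi\n", xj, fj.real(), fj.imag());
    }
    return 0;
}
```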

Wednesday, April 20

Dynamic Data Prefetching and Layout Optimizations for High Performance Heterogeneous Data Access
11:30 a.m. to 12:30 p.m., Bldg. 50F, Room 1647
Houjun Tang, North Carolina State University

The advance toward exascale computing is producing massive amounts of data, and there is growing demand for efficient data access in data-intensive applications, where I/O performance often dominates overall execution time. Existing HPC I/O subsystems, however, provide only general-purpose optimizations and cannot satisfy the dynamic and diverse read accesses of different applications. In this talk, I will first present an online analyzer that detects various heterogeneous data access patterns at application runtime with low computational and memory overhead. Combining this pattern detection with prefetching achieves high prefetch accuracy and improves data access performance. To further optimize data access, I will then describe a dynamic I/O framework that recognizes data by its access patterns, replicates the data of interest in multiple reorganized layouts that benefit different read patterns, and makes runtime decisions about which layout to use for each read. We also enabled space-filling curves and proposed a new cluster-based layout optimization approach to support block-structured adaptive mesh refinement (AMR) data. Our results show multi-fold read performance improvements over the dataset's original layout across various applications.
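
As a toy illustration of the pattern-detection-plus-prefetching idea (a simplified sketch assuming plain strided reads, not the online analyzer or I/O framework described in the talk; all class and variable names are invented for the example), the detector below watches recent read offsets and, once it sees a constant stride, predicts the next offset that a prefetcher could stage ahead of the application's request.

```cpp
// Toy sketch of runtime access-pattern detection plus prefetching:
// track recent read offsets and, if they form a constant stride,
// predict the next offset so it could be pre-staged. Illustrative only.
#include <cstdint>
#include <cstdio>
#include <deque>
#include <optional>

class StrideDetector {
public:
    // Record a read at byte offset `off`; return the predicted next offset
    // if the recent history shows a constant stride, otherwise nothing.
    std::optional<std::uint64_t> observe(std::uint64_t off) {
        history_.push_back(off);
        if (history_.size() > kWindow) history_.pop_front();
        if (history_.size() < 3) return std::nullopt;

        const std::int64_t stride = static_cast<std::int64_t>(history_[1]) -
                                    static_cast<std::int64_t>(history_[0]);
        for (std::size_t i = 2; i < history_.size(); ++i) {
            const std::int64_t s = static_cast<std::int64_t>(history_[i]) -
                                   static_cast<std::int64_t>(history_[i - 1]);
            if (s != stride) return std::nullopt;  // pattern broken
        }
        return static_cast<std::uint64_t>(static_cast<std::int64_t>(off) + stride);
    }

private:
    static constexpr std::size_t kWindow = 4;
    std::deque<std::uint64_t> history_;
};

int main() {
    StrideDetector detector;
    // A strided read sequence, e.g. one variable out of an interleaved record.
    const std::uint64_t offsets[] = {0, 4096, 8192, 12288, 16384};
    for (std::uint64_t off : offsets) {
        if (auto next = detector.observe(off)) {
            // A real framework would issue an asynchronous read-ahead here.
            std::printf("read %8llu -> prefetch %8llu\n",
                        static_cast<unsigned long long>(off),
                        static_cast<unsigned long long>(*next));
        } else {
            std::printf("read %8llu -> no confident pattern yet\n",
                        static_cast<unsigned long long>(off));
        }
    }
    return 0;
}
```

A real runtime analyzer would, of course, have to recognize many more pattern classes while keeping its own computational and memory overhead low, which is the part of the problem the talk addresses.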

CITRIS Research Exchange Health Initiative
Windows to the Brain
12 to 1 p.m., 310 Sutardja Dai Hall, Banatao Auditorium
Guillermo Aguilar, UC Riverside

One of the recent research thrusts in my group aims at developing a novel transparent cranial implant (“window”) that enables life-long, non-invasive delivery and/or collection of laser light into and from shallow and deep brain tissue on demand. Such an implant would allow real-time, highly precise visualization and treatment of diverse brain pathologies, such as those resulting from traumatic brain injury or brain tumors, without the need for highly invasive craniotomies or trepanation procedures. The window could be permanently covered with native scalp that can be rendered temporarily transparent on demand in a minimally invasive manner. In collaboration with other research groups at UCR, an yttria-stabilized zirconia (YSZ) implant has been successfully fabricated using the current-activated, pressure-assisted densification (CAPAD) processing method. A summary of these results, as well as ongoing and future studies pertaining to this research thrust, will be presented. This talk is free and open to the public.

Understanding Compiler Optimizations
3:30 to 5 p.m., Bldg. 50F, Room 1647
Chandler Carruth, Google

C++ is used in applications where resources are constrained and performance is critical. However, its power in this domain comes from the ability to build large, complex systems. These systems leverage numerous C++ features to build and use abstractions that make reasoning about these complex systems possible. Abstractions are the very essence of how we scale software to solve ever larger and more complex problems.

But the common C++ idea of “zero-cost” abstractions is, in some senses, a myth. The real achievement of C++ is allowing you, the programmer, to control where and how the cost of your abstractions will be paid. It does this by leveraging remarkably advanced optimizing compilers and carefully written libraries and techniques, all working together to control that cost. To write software that exercises this control effectively, the programmer must understand the fundamentals of how compiler optimizations behave. Without that understanding, it is too easy to unknowingly limit the optimizer or create challenges it cannot overcome.
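
As a hedged illustration of this point (not material from the talk itself), the snippet below expresses the same computation twice: once as a raw loop and once through a standard-library abstraction. With optimizations and inlining enabled, a modern compiler typically reduces both to equivalent code, but the abstraction's cost disappears only because the optimizer can see through it.

```cpp
// Illustrative only: an abstraction whose cost depends entirely on the
// optimizer. With inlining enabled, sum_squares() typically compiles down
// to the same loop as the hand-written version; built without optimization,
// the lambda and algorithm layers remain as real call overhead.
#include <cstdio>
#include <numeric>
#include <vector>

// Hand-written "no abstraction" version.
long sum_squares_raw(const std::vector<int>& v) {
    long total = 0;
    for (int x : v) total += static_cast<long>(x) * x;
    return total;
}

// Abstracted version: a standard algorithm plus a lambda instead of an explicit loop.
long sum_squares(const std::vector<int>& v) {
    return std::accumulate(v.begin(), v.end(), 0L,
                           [](long acc, int x) { return acc + static_cast<long>(x) * x; });
}

int main() {
    const std::vector<int> data = {1, 2, 3, 4, 5};
    std::printf("raw: %ld  abstracted: %ld\n", sum_squares_raw(data), sum_squares(data));
    return 0;
}
```

Compile the same code without optimization, or hide the lambda behind a boundary the optimizer cannot see through, and that layering remains as real overhead, which is exactly why understanding how the optimizer behaves matters.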

Friday, April 22

BIDS, Data Science Lecture Series
Transparency and Reproducibility in Economics Research
1:10 to 2:30 p.m., 190 Doe Library, UC Berkeley
Edward Miguel, Department of Economics, UC Berkeley

There is growing interest in research transparency and reproducibility in economics and other scientific fields. We survey existing work on these topics within economics and discuss the evidence suggesting that publication bias, inability to replicate, and specification searching remain widespread problems in the discipline. We next discuss recent progress in this area, including improved research design, study registration and pre-analysis plans, disclosure standards, and open sharing of data and materials, drawing on experiences in both economics and other social sciences. We discuss areas where consensus is emerging on new practices, as well as approaches that remain controversial, and speculate about the most effective ways to make economics research more accurate, credible, and reproducible in the future.

Link of the Week: Quantum Problems Solved Through Games

Danish researchers have figured out how to harness the power of gamers to help solve quantum physics problems. Quantum Moves, a game created by the scientists, helped them improve algorithms used in quantum mechanical simulations, they reported in a Nature paper. A Nature News article points out that, beyond its practical implications for quantum physics, the study suggests the human brain may be far more capable of intuiting the weird world of quantum mechanics than previously thought.