Keynote Lectures

ICCS is well known for its excellent line-up of keynote speakers. This page will be updated frequently with new names, lecture titles and abstracts.

Rommie Amaro, University of California, San Diego, USA
Discovery through the Lens of a Computational Microscope

Jackie Chen, Sandia National Laboratories, USA
Towards Compute and Data Intensive Turbulent Combustion Simulation at the Exascale

Slawomir Koziel, Reykjavik University, Iceland
Strategies for Solving Computationally Expensive Engineering Design Optimization Problems

Larry Smarr, University of California, San Diego, USA
Using Supercomputers and Gene Sequencers to Discover Your Inner Microbiome

Sauro Succi, Istituto per le Applicazioni del Calcolo “Mauro Picone” (C.N.R.) and University of Roma, Italy and Harvard University, USA
Lattice Boltzmann simulations all the way: from aerodynamic design to quark-gluon plasma hydrodynamics

 

Discovery through the Lens of a Computational Microscope
Rommie Amaro
University of California, San Diego, USA
Dr. Rommie Amaro is a native of the south side of Chicago and earned her bachelor’s degree in chemical engineering with high distinction from the University of Illinois at Urbana-Champaign in 1999. After graduating, she joined Kraft Foods, Inc. as an Associate Research Engineer in Glenview, Illinois, working mainly on Philadelphia Cream Cheese productivity and commercialization projects. After two years of working with condensed matter, she returned to Urbana to attend graduate school in Chemistry.
Dr. Amaro earned her Ph.D. in chemistry in the lab of Professor Zan Luthey-Schulten, where she worked mainly on computational methods to reconstruct free energy profiles from non-equilibrium pulling experiments of ammonia conduction through a beta-barrel protein involved in histidine biosynthesis, and on mechanisms of allosteric regulation in proteins. While a graduate student, she also worked closely with the National Institutes of Health (NIH) Resource for Macromolecular Modeling and Bioinformatics, led by Professor Klaus Schulten, where she helped develop a series of workshops that have now been taught worldwide.
After earning her Ph.D., Dr. Amaro went on to receive an NIH Kirschstein National Research Service Award (NRSA) postdoctoral fellowship and worked under the tutelage of Howard Hughes Medical Institute Investigator and National Academy of Sciences member Professor J. Andrew McCammon at the University of California, San Diego (UCSD).
In 2009, Dr. Amaro started her independent research career in the Departments of Pharmaceutical Sciences, Computer Science, and Chemistry at the University of California, Irvine. In 2010 she was selected as an NIH New Innovator for her work developing cutting-edge computational methods to help discover new drugs. The following year, she received the Presidential Early Career Award for Scientists and Engineers. In 2012, Dr. Amaro opened her lab at UCSD in the Department of Chemistry and Biochemistry.
Research in the Amaro Lab is broadly concerned with the development and application of state-of-the-art computational methods to address outstanding questions in drug discovery and molecular-level biophysics. Her lab focuses mainly on targeting neglected diseases, Chlamydia, influenza, and cancer, and works closely with experimental collaborators to catalyze the discovery of new potential therapeutic agents. The Amaro Lab is also keenly interested in developing new multiscale simulation methods and novel modeling paradigms that scale from the level of atoms to whole cells, and beyond.

ABSTRACT

With exascale computing power on the horizon, computational studies have the opportunity to make unprecedented contributions to drug discovery efforts. Steady increases in computational power, coupled with improvements in the underlying algorithms and available structural experimental data, are enabling new paradigms for discovery, wherein computationally predicted ensembles from large-scale biophysical simulations are being used in rational drug design efforts. Such investigations are driving discovery efforts in collaboration with leading experimentalists. I will describe our work in this area that has provided key insights into the systematic incorporation of structural information resulting from state-of-the-art biophysical simulations into protocols for inhibitor and drug discovery, with emphasis on the discovery of novel druggable pockets that may not be apparent in crystal structures. I will also discuss how we are developing capabilities for multi-scale dynamic simulations that cross temporal scales from the picoseconds of macromolecular dynamics to the physiologically important time scales of cells (milliseconds to seconds). Our efforts are driven by gaps in current abilities to connect across scales where it is already clear that new approaches and insights will translate into novel biomedical research discoveries and therapeutic strategies.
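As a purely illustrative aside (not part of the abstract, and not the Amaro Lab's actual pipeline), the ensemble idea can be sketched as follows: representative receptor conformations are selected from a simulated ensemble, for example by clustering frames on RMSD, and each representative is then used for docking or pocket detection instead of a single crystal structure. In the sketch below, the snapshot data, cluster count, and clustering scheme are all hypothetical placeholders.

```python
# Illustrative sketch only: picking representative receptor conformations from a
# molecular-dynamics ensemble via crude RMSD-based k-medoids clustering.
# All data and parameters here are placeholders, not an actual drug-discovery workflow.
import numpy as np

def rmsd(a, b):
    """Root-mean-square deviation between two (N, 3) coordinate arrays."""
    return np.sqrt(np.mean(np.sum((a - b) ** 2, axis=1)))

def pick_representatives(snapshots, n_clusters=5, n_iter=20, seed=0):
    """Crude k-medoids on pairwise RMSD; returns indices of representative frames."""
    rng = np.random.default_rng(seed)
    n = len(snapshots)
    dist = np.array([[rmsd(snapshots[i], snapshots[j]) for j in range(n)] for i in range(n)])
    medoids = rng.choice(n, size=n_clusters, replace=False)
    for _ in range(n_iter):
        labels = np.argmin(dist[:, medoids], axis=1)         # assign each frame to nearest medoid
        new_medoids = []
        for k in range(n_clusters):
            members = np.where(labels == k)[0]
            if members.size == 0:                             # keep old medoid if cluster emptied
                new_medoids.append(medoids[k])
                continue
            within = dist[np.ix_(members, members)].sum(axis=1)
            new_medoids.append(members[int(np.argmin(within))])
        new_medoids = np.array(new_medoids)
        if np.array_equal(new_medoids, medoids):
            break
        medoids = new_medoids
    return medoids

# Toy ensemble: 100 frames of a 50-atom binding site (random coordinates, illustration only).
frames = np.random.default_rng(1).normal(size=(100, 50, 3))
reps = pick_representatives(frames, n_clusters=5)
# Each representative frame would then be handed to a docking engine, and a ligand's
# score aggregated over the ensemble rather than taken from a single crystal structure.
print("representative frame indices:", reps)
```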

 

Towards Compute and Data Intensive Turbulent Combustion Simulation at the Exascale
Jackie Chen
Sandia National Laboratories, USA
Jacqueline H. Chen is a Distinguished Member of Technical Staff at the Combustion Research Facility at Sandia National Laboratories. She has contributed broadly to research in petascale direct numerical simulations (DNS) of turbulent combustion, focusing on fundamental turbulence-chemistry interactions. These benchmark simulations provide fundamental insight into combustion processes and are used by the combustion modeling community to develop and validate turbulent combustion models for engineering CFD simulations. In collaboration with computer scientists and applied mathematicians, she is the founding Director of the Center for Exascale Simulation of Combustion in Turbulence (ExaCT). She leads an interdisciplinary team to co-design DNS algorithms, domain-specific programming environments, scientific data management and in situ uncertainty quantification and analytics, and architectural simulation and modeling with combustion proxy applications. She received DOE INCITE Awards in 2005, 2007, and 2008-2016, the DOE ALCC Award in 2012, the 34th International Combustion Symposium Distinguished Paper Award in 2012, and the Asian American Engineer of the Year Award in 2009. She is a member of the DOE Advanced Scientific Computing Research Advisory Committee (ASCAC) and its Subcommittees on Exascale Computing and on Big Data and Exascale. She is an editor of Flow, Turbulence and Combustion, co-editor of Volumes 29 and 30 of the Proceedings of the Combustion Institute, Co-Chair of the Local Organizing Committee for the 35th International Combustion Symposium, and a member of the Board of Directors of the Combustion Institute.
ABSTRACT
Exascale computing will enable combustion simulations in parameter regimes relevant to next-generation combustion devices burning alternative fuels. High-fidelity combustion simulations are needed to provide the underlying science base required to develop vastly more accurate predictive combustion models, used ultimately to design fuel-efficient, clean-burning vehicles, planes, and power plants for electricity generation. However, making the transition to exascale poses a number of algorithmic, software, and technological challenges due to power constraints and the massive concurrency expected at the exascale. Addressing issues of data movement, power consumption, memory capacity, interconnection bandwidth, programmability, and scaling through combustion co-design is critical to ensure that future combustion simulations can take advantage of emerging computer architectures in the 2023 timeframe. Co-design refers to a computer system design process in which combustion science requirements influence architecture design and constraints inform the formulation and design of algorithms and software. The current state of petascale turbulent combustion simulation will be reviewed, followed by a discussion of co-design topics investigated by the exascale combustion co-design center, ExaCT (http://www.exactcodesign.org): 1) asynchronous numerical methods for partial differential equations; 2) asynchronous programming and execution models for multi-core hybrid architectures; and 3) in situ data analytics and adjoint sensitivity analysis. Results will be presented from a recent refactoring of a combustion direct numerical simulation (DNS) code, S3D, using the asynchronous programming model Legion, with dynamic runtime analysis performed at scale on a petascale leadership-class hybrid architecture. Further, the extensibility of incorporating in situ analytics using Legion is demonstrated.
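To make the notion of asynchronous, tile-based execution with in situ analytics concrete, the following toy sketch uses Python futures as a stand-in for a task-based runtime. It is not S3D or Legion code; the 1D reaction-diffusion "physics" and the peak-value reduction are placeholders chosen only to show how analytics can overlap with computation as tile tasks complete.

```python
# Illustrative sketch only: a tiny task-parallel reaction-diffusion step with an
# in-situ reduction, in the spirit of asynchronous, tile-based execution.
# The physics (a toy 1D heat-release term) and all parameters are placeholders.
import numpy as np
from concurrent.futures import ThreadPoolExecutor, as_completed

def tile_update(u, lo, hi, dt=1e-3, dx=1e-2, kappa=1.0):
    """Advance one tile [lo, hi) of a periodic 1D field with diffusion plus a toy reaction."""
    n = u.size
    idx = np.arange(lo, hi)
    lap = (u[(idx + 1) % n] - 2.0 * u[idx] + u[(idx - 1) % n]) / dx**2
    reaction = u[idx] * (1.0 - u[idx])                 # toy 'heat release' term
    return lo, u[idx] + dt * (kappa * lap + reaction)

def step(u, n_tiles=4):
    """One time step: launch tile tasks, then reduce (in situ) as results arrive."""
    bounds = np.linspace(0, u.size, n_tiles + 1, dtype=int)
    u_new = np.empty_like(u)
    running_max = -np.inf
    with ThreadPoolExecutor(max_workers=n_tiles) as pool:
        futures = [pool.submit(tile_update, u, bounds[i], bounds[i + 1]) for i in range(n_tiles)]
        for fut in as_completed(futures):              # analytics overlap with remaining tiles
            lo, vals = fut.result()
            u_new[lo:lo + vals.size] = vals
            running_max = max(running_max, float(vals.max()))
    return u_new, running_max

u = np.exp(-np.linspace(-3, 3, 256) ** 2)              # initial 'flame kernel' profile
for _ in range(10):
    u, peak = step(u)
print("in-situ peak value after 10 steps:", peak)
```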

 

Strategies for Solving Computationally Expensive Engineering Design Optimization Problems
Slawomir Koziel
Reykjavik University, Iceland
ABSTRACT
Computer simulation tools have become ubiquitous in contemporary engineering and science over the past decade. High-fidelity simulations provide accuracy beyond the capability of any theoretical model and allow reliable representation of complex components and systems, including the coupled and often multi-physics phenomena within them. Modeling accuracy is critical to the quality of the design process; however, it comes at a high computational cost. In many engineering areas (electrical, structural, aerospace, etc.), typical simulation times for realistic 3D models run to many hours, days, or even weeks. Obviously, this hinders the application of simulation tools to design tasks that require multiple evaluations of the system at hand, such as parametric optimization. In this talk, strategies for reducing the computational cost of simulation-driven design are discussed. The focus is on surrogate-assisted design techniques, which are among the most promising approaches to expedited design involving expensive computer models. Several specific algorithmic frameworks are presented, including space mapping, response correction techniques, feature-based optimization, shape-preserving response prediction, and multi-fidelity design. Applications to real-world design problems in various engineering fields, including electrical and aerospace engineering, are also provided.
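As an illustration of the generic loop that surrogate-assisted methods share (sample the expensive model sparsely, fit a cheap surrogate, optimize the surrogate, then spend one expensive evaluation at the predicted optimum), a minimal sketch follows. The expensive_model function stands in for a costly high-fidelity simulation, and the radial-basis-function surrogate is just one possible choice; this is not the speaker's actual framework.

```python
# Illustrative sketch only: a generic surrogate-assisted optimization loop.
# 'expensive_model' is a stand-in for a time-consuming high-fidelity simulation.
import numpy as np

def expensive_model(x):
    """Placeholder for an expensive high-fidelity simulation (one design variable)."""
    return np.sin(3.0 * x) + 0.5 * (x - 0.7) ** 2

def fit_rbf(xs, ys, eps=1.5):
    """Fit Gaussian radial-basis-function weights through the sampled (xs, ys) points."""
    phi = np.exp(-(eps * (xs[:, None] - xs[None, :])) ** 2)
    return np.linalg.solve(phi + 1e-10 * np.eye(len(xs)), ys)

def rbf_predict(x, xs, w, eps=1.5):
    """Evaluate the cheap surrogate at query points x."""
    phi = np.exp(-(eps * (x[:, None] - xs[None, :])) ** 2)
    return phi @ w

# Sequential surrogate-based optimization: sample, fit, optimize the cheap
# surrogate, then spend one expensive evaluation at the predicted optimum.
grid = np.linspace(-1.0, 2.0, 400)
xs = np.linspace(-1.0, 2.0, 5)                           # small initial design of experiments
ys = expensive_model(xs)
for it in range(6):
    w = fit_rbf(xs, ys)
    x_new = grid[np.argmin(rbf_predict(grid, xs, w))]    # surrogate optimum (cheap to find)
    xs = np.append(xs, x_new)
    ys = np.append(ys, expensive_model(x_new))           # one expensive evaluation per iteration
print("best design found:", xs[np.argmin(ys)], "objective:", ys.min())
```

The point of the loop is that most evaluations are of the cheap surrogate; the expensive model is queried only a handful of times, which is what makes simulation-driven design tractable.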

 

Using Supercomputers and Gene Sequencers to Discover Your Inner Microbiome
Larry Smarr
University of California, San Diego, USA
Larry Smarr is the founding Director of the California Institute for Telecommunications and Information Technology (Calit2), a UC San Diego/UC Irvine partnership, and holds the Harry E. Gruber professorship in UCSD’s Department of Computer Science and Engineering. Before that he was the founding director of the National Center for Supercomputing Applications (NCSA) at UIUC. He is a member of the National Academy of Engineering, as well as a Fellow of the American Physical Society and the American Academy of Arts and Sciences. In 2006 he received the IEEE Computer Society Tsutomu Kanai Award for his lifetime achievements in distributed computing systems and in 2014 the Golden Goose Award. He served on the NASA Advisory Council to 4 NASA Administrators, was chair of the NASA Information Technology Infrastructure Committee and the NSF Advisory Committee on Cyberinfrastructure, a member of the DOE Advanced Scientific Computing Advisory Committee and ESnet Policy Board, and for 8 years he was a member of the NIH Advisory Committee to the NIH Director, serving 3 directors. Larry can be followed on Twitter (@lsmarr) or on his portal http://lsmarr.calit2.net/.
ABSTRACT
The human body is host to 100 trillion microorganisms, ten times the number of its own DNA-bearing cells, and these microbes collectively carry 300 times as many distinct genes as our human DNA does. The microbial component of our “superorganism” comprises hundreds of species with immense biodiversity. To put a more personal face on the “patient of the future,” I have been collecting massive amounts of data from my own body over the last seven years, which reveal detailed examples of the episodic evolution of this coupled immune-microbial system. Collaborating with the UC San Diego Knight Lab, we have genetically sequenced a time series of my gut microbiome, as well as single time points from 50 patients with autoimmune disease. An elaborate software pipeline, running on high-performance computers, reveals the details of the microbial ecology and its genetic components, in health as well as in disease. Not only can we compare a person with a disease to a healthy population, but we can also follow the dynamics of the diseased patient. We can look forward to revolutionary changes in medical practice over the next decade.

 

Lattice Boltzmann simulations all the way: from aerodynamic design to quark-gluon plasma hydrodynamics
Sauro Succi
Istituto per le Applicazioni del Calcolo “Mauro Picone” (C.N.R.) and University of Roma, Italy and Harvard University, USA
Dr Sauro Succi holds a degree in Nuclear Engineering from the University of Bologna and a PhD in Plasma Physics from the EPFL, Lausanne (Switzerland). Since 1995 he has served as a Director of Research at the Istituto Applicazioni Calcolo of the Italian National Research Council in Rome, and he is also a Research Associate of the Physics Department of Harvard University and a Visiting Professor at the Institute of Applied Computational Science at the School of Engineering and Applied Sciences of Harvard University.
His research interests cover a broad range of topics in kinetic theory and non-equilibrium statistical physics, including thermonuclear plasmas, fluid turbulence, micro- and nanofluidics, as well as quantum-relativistic fluids. He has received the Humboldt Prize in physics (2002), the Killam Award of the University of Calgary (2005) and the Raman Chair of the Indian Academy of Sciences (2011). Dr Succi is an elected Fellow of the American Physical Society (1998) and an elected member of the Academia Europaea (2015).
ABSTRACT
Over nearly three decades, the Lattice Boltzmann (LB) method has gained a prominent role as an efficient computational scheme for the numerical simulation of complex flows across a broad range of scales, from fully developed turbulence in real-life geometries, to multiphase microflows, all the way down to biopolymer translocation in nanopores. Lately, the method has also shown promising potential for the simulation of quantum-relativistic fluids, such as quark-gluon plasmas, electron transport in graphene and relativistic magnetohydrodynamics. After a brief introduction to the main ideas behind the LB method, we shall illustrate a few selected applications, along with prospects for future multiscale applications.
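For readers unfamiliar with the method, the basic collide-and-stream structure of a lattice Boltzmann solver can be sketched in a few lines. The following minimal D2Q9 BGK example on a periodic box is purely illustrative (grid size, relaxation time, and the initial shear perturbation are arbitrary choices) and is nowhere near a production LB code.

```python
# Illustrative sketch only: a minimal D2Q9 lattice Boltzmann (BGK) collide-and-stream
# loop on a periodic box. All parameters are arbitrary and chosen for readability.
import numpy as np

nx, ny, tau = 64, 64, 0.8                          # lattice size and BGK relaxation time
w = np.array([4/9] + [1/9] * 4 + [1/36] * 4)       # D2Q9 lattice weights
c = np.array([[0, 0], [1, 0], [0, 1], [-1, 0], [0, -1],
              [1, 1], [-1, 1], [-1, -1], [1, -1]]) # D2Q9 discrete velocities

def equilibrium(rho, ux, uy):
    """Second-order truncated Maxwell-Boltzmann equilibrium for the 9 velocities."""
    cu = 3.0 * (c[:, 0, None, None] * ux + c[:, 1, None, None] * uy)
    usq = 1.5 * (ux**2 + uy**2)
    return w[:, None, None] * rho * (1.0 + cu + 0.5 * cu**2 - usq)

# Initial condition: uniform density with a small sinusoidal shear in ux.
rho = np.ones((nx, ny))
ux = 0.05 * np.sin(2 * np.pi * np.arange(ny) / ny)[None, :] * np.ones((nx, ny))
uy = np.zeros((nx, ny))
f = equilibrium(rho, ux, uy)

for step in range(500):
    rho = f.sum(axis=0)                            # macroscopic density
    ux = (f * c[:, 0, None, None]).sum(axis=0) / rho
    uy = (f * c[:, 1, None, None]).sum(axis=0) / rho
    f += (equilibrium(rho, ux, uy) - f) / tau      # BGK collision: relax toward equilibrium
    for i in range(9):                             # streaming to neighboring lattice sites
        f[i] = np.roll(np.roll(f[i], c[i, 0], axis=0), c[i, 1], axis=1)

print("kinetic energy after 500 steps:", float(0.5 * (rho * (ux**2 + uy**2)).sum()))
```

The viscous decay of the initial shear wave in this toy run mirrors, in miniature, how the relaxation time tau controls the fluid viscosity in the full method.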