ICCS 2014, Cairns, Australia





ICCS 2014 Keynote Lectures

  • Professor Vassil Alexandrov, ICREA Research Professor in Computational Science, Barcelona Supercomputing Centre, Spain
    Towards Scalable Mathematical Methods and Algorithms for Extreme Scale Computing
     
  • Professor Dr Luis Bettencourt, Santa Fe Institute, New Mexico, USA
    A Science of Cities
     
  • Professor Peter T. Cummings, Department of Chemical and Biomolecular Engineering, Vanderbilt University, USA
    Automating Computational Materials Discovery through Model-Integrated Computing
     
  • Dan Fay, Director - Earth, Energy, and Environment, Microsoft External Research, Microsoft
    The Fourth Paradigm: Data-Intensive Scientific Discovery with the Cloud
     
  • Dr Warren Kaplan, Garvan Institute of Medical Research, Sydney, Australia
    Factory-Scale Genome Sequencing
     
  • Professor Bob Pressey, Australian Research Council Centre of Excellence for Coral Reef Studies, James Cook University, Australia
    Computational science in conservation planning: potential and limitations
     
  • Professor Mark Ragan, Institute for Molecular Bioscience, The University of Queensland, Australia
    Modelling cancer as a transcriptional landscape


Presenters and Abstracts

 

Towards Scalable Mathematical Methods and Algorithms for Extreme Scale Computing
Professor Vassil Alexandrov
ICREA Research Professor in Computational Science, Barcelona Supercomputing Centre, Spain
http://www.bsc.es/about-bsc/staff-directory/alexandrov-vassil

Abstract: Novel mathematics and mathematical modelling approaches, together with scalable algorithms, are needed to enable key applications at extreme scale. This is especially true as HPC systems continue to scale up in compute-node and processor-core count. Computational scientists are at a critical threshold: novel mathematics, together with the development, redesign, and implementation of large-scale algorithms, will affect most application areas. The talk will therefore focus on the mathematical and algorithmic challenges of, and approaches towards, exascale and beyond. It will present ideas and some novel mathematical methods for extreme-scale systems, such as stochastic and hybrid methods, which in turn lead to scalable scientific algorithms that require minimal or no global communication, hide network and memory latency, achieve very high computation/communication overlap, and have no synchronization points. Examples will be drawn from diverse application areas.
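As an illustrative sketch of the stochastic methods mentioned in the abstract (not the specific algorithms of the talk), a Monte Carlo solver of von Neumann-Ulam type estimates one component of the solution of x = Hx + b by averaging over independent random walks; the walks never communicate, which is the attraction of such methods at extreme scale. The matrix, survival probability, and walk count below are hypothetical.

```python
import random

def mc_solve_component(H, b, i, walks=20000, seed=0):
    """Estimate component i of the solution of x = H x + b by independent
    random walks (von Neumann-Ulam scheme)."""
    rng = random.Random(seed)
    n = len(b)
    total = 0.0
    for _ in range(walks):
        state, weight, contrib = i, 1.0, b[i]
        while rng.random() < 0.9:              # survive with probability 0.9
            nxt = rng.randrange(n)             # uniform choice of next state
            weight *= H[state][nxt] * n / 0.9  # importance-sampling correction
            state = nxt
            contrib += weight * b[state]
        total += contrib
    return total / walks

# Hypothetical 2x2 system x = Hx + b; exact solution has x[0] = 1.3/0.77
H = [[0.1, 0.2], [0.2, 0.1]]
b = [1.0, 2.0]
print(mc_solve_component(H, b, 0))  # close to 1.688...
```

Each walk is an unbiased sample of the Neumann series sum of H^k b, so accuracy improves with the number of walks while the walks themselves remain embarrassingly parallel.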


 

A Science of Cities
Professor Dr Luis Bettencourt
Santa Fe Institute, New Mexico, USA
http://www.santafe.edu/about/people/profile/Luis%20Bettencourt

Abstract: The study of cities immediately brings up many of the complex questions inherent in understanding human social systems, with their extreme complexity and evolvability. Yet over the last few years, increasing accessibility of data and a more interdisciplinary focus have started to pay off, revealing the nature of cities as complex adaptive systems describable by quantitative laws about their general organization and dynamics. In this talk I will review the current empirical understanding of the structure and dynamics of cities in terms of their land use, social organization, and mobility costs. I will show how this evidence can be synthesized into a small set of quantitative principles of self-organization and how, in this light, we can understand the type of information processing cities perform in human societies. I will conclude with a set of open challenges for a deeper understanding of cities as computational networks, which, when addressed, will form the basis of a mature science of cities and of other complex adaptive systems.
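The quantitative laws referred to in the abstract include urban scaling relations of the form Y = Y0 * N^beta, where N is city population and beta is estimated from data (empirically around 1.15 for many socioeconomic outputs). A minimal sketch of recovering the exponent by log-log regression, using synthetic cities rather than the real data discussed in the talk:

```python
import math

def scaling_exponent(populations, outputs):
    """Least-squares slope of log Y versus log N, i.e. the exponent beta
    in the power-law form Y = Y0 * N**beta used in urban scaling studies."""
    xs = [math.log(n) for n in populations]
    ys = [math.log(y) for y in outputs]
    mx = sum(xs) / len(xs)
    my = sum(ys) / len(ys)
    return sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
           sum((x - mx) ** 2 for x in xs)

# Synthetic cities generated with beta = 1.15 (illustrative, not real data)
pops = [10_000, 50_000, 250_000, 1_000_000, 5_000_000]
gdp = [2.0 * n ** 1.15 for n in pops]
print(round(scaling_exponent(pops, gdp), 2))  # → 1.15
```

With noiseless synthetic data the fit recovers the exponent exactly; with real city data the residual scatter around the fitted line is itself of scientific interest.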


 

Automating Computational Materials Discovery through Model-Integrated Computing
Professor Peter T. Cummings
Department of Chemical and Biomolecular Engineering, Vanderbilt University, USA
http://engineering.vanderbilt.edu/chbe/FacultyResearch/peter-cummings.php


Abstract: One of the key elements of President Obama’s manufacturing competitiveness initiative is the Materials Genome Initiative (MGI). Using high-accuracy, validated computational materials methods to predict and screen materials for specific properties, the MGI’s goal is to halve both the time and the cost of bringing new materials to market. Developing computational materials screening tools for hard materials (metals, alloys, ceramics, etc.) is relatively easy compared to the same task for soft materials (liquids, colloids, polymers, foams, gels, granular materials, and soft biomaterials): in the former case the energy scales involved are large compared to thermal energy (~kBT, where kB is Boltzmann’s constant and T is temperature) and the atoms/molecules typically sit in a crystalline lattice with no mesoscopic structuring. By contrast, for soft materials the energy scales are comparable to thermal energy, weak dispersion interactions frequently dominate, and soft matter often self-organizes into mesoscopic physical structures, much larger than the microscopic scale yet much smaller than the macroscopic scale, from which the macroscopic properties result. Consequently, computational screening and design for soft materials involves complex molecular simulations, frequently on large petascale computing platforms and in some cases performed at multiple levels of detail. These simulations have complex workflows that experts understand and perform routinely; to translate these workflows into a set of tasks that can be automated, combined with other tasks, and embedded within hierarchies of application-hardened engineering methodologies (such as stochastic optimization), we employ the discipline of model-integrated computing (MIC).
Using MIC, we develop a “meta-programming” tool that enables the establishment of a domain-specific modeling language for the creation, synthesis, and execution of simulation workflows. We illustrate this with two specific materials design problems: tethered nanoparticle systems and lubrication systems.
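The general idea of turning expert workflows into automatable task graphs can be sketched generically. The toy dependency-driven executor below illustrates the concept only; it is not the MIC toolchain or its domain-specific language, and the stage names are invented.

```python
def run_workflow(tasks):
    """tasks: name -> (list of dependency names, callable). Runs each task
    once all its dependencies have finished, returning results by name."""
    results, pending = {}, dict(tasks)
    while pending:
        ready = [n for n, (deps, _) in pending.items()
                 if all(d in results for d in deps)]
        if not ready:
            raise ValueError("cycle in workflow")
        for name in ready:
            deps, fn = pending.pop(name)
            results[name] = fn(*(results[d] for d in deps))
    return results

# Hypothetical soft-matter screening pipeline stages
wf = {
    "build":    ([], lambda: "initial configuration"),
    "simulate": (["build"], lambda cfg: f"trajectory from {cfg}"),
    "analyze":  (["simulate"], lambda traj: f"properties of {traj}"),
}
print(run_workflow(wf)["analyze"])
```

A declarative description like `wf` is what a domain-specific modeling language generates; once the workflow is data rather than expert habit, it can be embedded inside outer loops such as stochastic optimization.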


 

The Fourth Paradigm: Data-Intensive Scientific Discovery with the Cloud
Dan Fay
Director - Earth, Energy, and Environment, Microsoft External Research, Microsoft
http://research.microsoft.com/en-us/people/danf/

Abstract: There is broad recognition within the scientific community that the emerging data deluge will fundamentally alter disciplines throughout academic research. A wide variety of scientists (biologists, chemists, physicists, astronomers, engineers) will require tools, technologies, and platforms that seamlessly integrate into standard scientific methodologies and processes. “The Fourth Paradigm” refers to the data management techniques and the computational systems needed to manipulate, visualize, and manage large amounts of scientific data. This talk will illustrate the challenges researchers will face, the opportunities these changes will afford, and the resulting implications for data-intensive researchers. It will also cover how cloud computing can be used to gain new insights in scientific discovery.


 

Factory-Scale Genome Sequencing
Dr Warren Kaplan
Garvan Institute of Medical Research, Sydney, Australia
http://www.garvan.org.au/research/clinical-genomics/warkap

Abstract: At 150 whole human genomes every 3 days, scaling up to handle Illumina's HiSeq X Ten population-scale genome sequencing factory has many implications. Our group in the Kinghorn Centre for Clinical Genomics at the Garvan Institute is addressing the challenges of analysing and storing 18,000 human genomes each year, alongside our plans for building a Genome Variant Store to correlate genome variants with phenotypic information. My talk will cover our core software systems, including the LIMS used in our sequencing laboratory; SeqWare, for running and managing analytical pipelines; and Alfred, our own software application, which oversees everything from sample receipt to the management of quality-control events and the return of results to users in both research and clinical settings. I will also discuss our continuous-integration software development process, which allows us to update our software systems multiple times daily while ensuring that upgrades do not introduce new and unforeseen problems. Finally, I will discuss the infrastructure, both in-house and external, that we will use to ensure the project's success.
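The two throughput figures quoted in the abstract are consistent with each other, as a quick check shows:

```python
genomes_per_day = 150 / 3        # 150 whole genomes every 3 days
per_year = genomes_per_day * 365
print(per_year)  # → 18250.0, matching the ~18,000 genomes/year figure
```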


 

Computational science in conservation planning: potential and limitations
Professor Bob Pressey
Australian Research Council Centre of Excellence for Coral Reef Studies, James Cook University, Australia
http://www.coralcoe.org.au/programs-initiatives/program-6-conservation-planning


Abstract: The field of systematic conservation planning began, without computers, in 1983, but quickly developed a dependence on machines for rapid processing of large data sets. Since then the conceptual and technical expectations of computer analyses have grown exponentially, and algorithms have been developed for many, though not all, problems. Dimensionality presents challenges, if not for analysis then for data elicitation. The leading edges of computation in conservation concern visualization and dynamic interaction. Perhaps the greatest limitations of computing in the field relate to interactions with people: people are affected by conservation decisions and increasingly expect a role in making them. This requires optimality to be redefined to refer to socially as well as environmentally optimal outcomes.
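A classic example of the early algorithms the abstract alludes to is the greedy complementarity heuristic: repeatedly pick the site that covers the most still-unrepresented species until every species is represented at least once. The sites and species below are hypothetical, and this sketch ignores the social dimensions the talk emphasises.

```python
def greedy_reserve_selection(sites, targets):
    """Greedy complementarity heuristic from systematic conservation
    planning: choose sites by how many unmet targets each would add."""
    unmet = set(targets)
    chosen = []
    while unmet:
        best = max(sites, key=lambda s: len(sites[s] & unmet))
        gain = sites[best] & unmet
        if not gain:
            raise ValueError("targets cannot all be met by these sites")
        chosen.append(best)
        unmet -= gain
    return chosen

# Hypothetical candidate sites and the species each contains
sites = {
    "A": {"sp1", "sp2"},
    "B": {"sp2", "sp3", "sp4"},
    "C": {"sp4", "sp5"},
}
print(greedy_reserve_selection(sites, ["sp1", "sp2", "sp3", "sp4", "sp5"]))
```

The heuristic is fast but not guaranteed optimal, which is one reason the field moved on to exact and stochastic optimization methods, and, as the abstract notes, why "optimal" increasingly has to include social criteria as well.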


 

Modelling cancer as a transcriptional landscape
Professor Mark Ragan
Institute for Molecular Bioscience, The University of Queensland, Australia
http://www.imb.uq.edu.au/mark-ragan


Abstract: More than half a century ago, the British developmental biologist and philosopher Conrad Hal Waddington introduced the landscape as a metaphor for how cells differentiate into different types of tissues. It is now recognised as “probably the most famous and most powerful metaphor in developmental biology”. Nonetheless it has remained unclear whether such a surface can be computed from actual cell-state data and, if so, whether it would be informative or predictive about real-life biology. Here I explore Hopfield neural networks, trained on genome-wide gene expression data, as a computational model of Waddington’s landscape. I discuss concepts of state, trajectory, and attractors, and present visualisations of the Hopfield surface for subtypes of breast cancer.
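A minimal sketch of the Hopfield model referred to in the abstract, with expression profiles binarised to +1/-1. The toy patterns stand in for genome-wide expression data; this is not the actual pipeline of the talk. Stored patterns act as attractors, so a corrupted state relaxes back to the nearest one, which is the sense in which the network defines a landscape with basins.

```python
import numpy as np

def train_hopfield(patterns):
    """Hebbian learning: store +/-1 patterns as attractors in a weight matrix."""
    n = patterns.shape[1]
    W = np.zeros((n, n))
    for p in patterns:
        W += np.outer(p, p)
    np.fill_diagonal(W, 0)  # no self-connections
    return W / patterns.shape[0]

def recall(W, state, steps=10):
    """Synchronous updates until the state settles into an attractor."""
    for _ in range(steps):
        new = np.sign(W @ state)
        new[new == 0] = 1
        if np.array_equal(new, state):
            break
        state = new
    return state

# Two toy "expression profiles" binarised to +/-1 (hypothetical data)
patterns = np.array([[1, 1, -1, -1, 1, -1],
                     [-1, 1, 1, -1, -1, 1]])
W = train_hopfield(patterns)
noisy = np.array([1, 1, -1, -1, -1, -1])  # corrupted copy of pattern 0
print(recall(W, noisy))  # relaxes back to pattern 0
```

In the cancer setting the stored patterns would be subtype-specific expression signatures, and the energy of each state gives the height of the landscape surface.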





ICCS 2014 is organised by

University of Queensland - Research Computing Centre
UvA
iVEC
UTK