Keynote Lectures

Tiziana Di Matteo, King’s College London, UK
        Network Filtering for Big Data    

Teresa Galvão, University of Porto / INESC TEC, Portugal
        Modelling Intelligent Urban Mobility: The Data and Knowledge at the Service of Citizens    

Douglas Kothe, Exascale Computing Project, USA
        Delivering Mission Critical Computational and Data Science Applications for U.S. Department of Energy via the Exascale Computing Project    
        This keynote lecture is proudly sponsored by the Journal of Computational Science (Elsevier)

James Moore, Imperial College London, UK
        Modelling Transport in the Lymphatics: The Inner Workings and Failings of the Body’s Sewer System    

Robert Panoff, The Shodor Education Foundation, USA
        Fostering Parallel Thinking in Computational Science: Preparing for Petascale and Beyond

Xiaoxiang Zhu, Technical University of Munich, Germany
        Data Science in Earth Observation    

Network Filtering for Big Data    
Tiziana Di Matteo
King’s College London, UK

Tiziana Di Matteo is Professor of Econophysics. A trained physicist, she took her degree and PhD from the University of Salerno in Italy before assuming research roles at universities in Australia and Britain. She works in the Department of Mathematics at King’s College London on econophysics, complex systems, complex networks, and data science. She has authored over 100 papers and has given invited and keynote talks at major international conferences in the US, across Europe, and in Asia, making her one of the world’s leaders in this field. She is a member of the External Faculty of the Complexity Science Hub in Vienna, Honorary Professor in the Department of Computer Science at UCL, a member of the UCL Centre for Blockchain Technologies, a member of the Board of the Complex System Laboratory, and a member of the Council and of the Executive Committee of the Complex Systems Society. She is Editor-in-Chief of the Journal of Network Theory in Finance, an Editor of the European Physical Journal B, an Editor of the journal Artificial Intelligence in Finance, an Academic Editorial Board Member of Advances in Mathematical Physics, and Guest Editor of several other volumes. She is a co-founder of the Econophysics Network and has been a consultant for the Financial Services Authority and several hedge funds.

ABSTRACT
In this lecture I will present network-theoretic tools [1-5] for filtering information in large-scale datasets, and I will show that they are powerful instruments for studying complex datasets. In particular, I will introduce correlation-based information filtering networks and the Planar Maximally Filtered Graph (PMFG), and I will show that applications to financial datasets can meaningfully identify industrial activities and structural market changes [2-5]. It has been shown that, by making use of the 3-clique structure of the PMFG, a clustering can be extracted that allows dimensionality reduction while keeping both local information and global hierarchy, in a deterministic manner and without the use of any prior information [6,4]. However, the algorithm proposed so far to construct the PMFG is numerically costly, with O(N³) computational complexity, and cannot be applied to large-scale data. There is therefore scope to search for novel algorithms that can provide such a reduction to planar filtered graphs in a numerically efficient way. I will introduce a new algorithm, the TMFG (Triangulated Maximally Filtered Graph), that efficiently extracts a planar subgraph optimizing an objective function. The method is scalable to very large datasets and can take advantage of parallel and GPU computing. It is also adaptable, allowing online updating and learning with continuous insertion and deletion of new data, as well as changes in the strength of the similarity measure [7].
Finally, I will show that filtered graphs are also valuable tools for risk management and portfolio optimization [8,9], and that they allow the construction of probabilistic sparse models of financial systems that can be used for forecasting, stress testing, and risk allocation [10].
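
To make the TMFG idea concrete, below is a minimal sketch in Python of the greedy face-insertion scheme, not the optimized implementation of [7]; the four-vertex seeding rule and the use of absolute correlations as the similarity measure are illustrative assumptions.

```python
import numpy as np

def tmfg(W):
    """Greedy Triangulated Maximally Filtered Graph (illustrative sketch).

    W : (N, N) symmetric similarity matrix (e.g. absolute correlations).
    Returns the edge list of a maximal planar graph with 3N - 6 edges.
    """
    N = W.shape[0]
    W = W.copy()
    np.fill_diagonal(W, 0.0)

    # Seed tetrahedron: the four vertices with the largest total similarity.
    seed = list(np.argsort(W.sum(axis=1))[-4:])
    edges = [(a, b) for i, a in enumerate(seed) for b in seed[i + 1:]]
    faces = [(seed[0], seed[1], seed[2]), (seed[0], seed[1], seed[3]),
             (seed[0], seed[2], seed[3]), (seed[1], seed[2], seed[3])]
    remaining = set(range(N)) - set(seed)

    # Greedily insert the vertex/face pair with the largest similarity gain,
    # splitting the chosen triangular face into three new faces.
    while remaining:
        best = None
        for v in remaining:
            for k, (a, b, c) in enumerate(faces):
                gain = W[v, a] + W[v, b] + W[v, c]
                if best is None or gain > best[0]:
                    best = (gain, v, k)
        _, v, k = best
        a, b, c = faces.pop(k)
        edges += [(v, a), (v, b), (v, c)]
        faces += [(v, a, b), (v, a, c), (v, b, c)]
        remaining.remove(v)
    return edges

# Toy usage: filter a random 30-variable correlation matrix to 3N - 6 edges.
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 30))         # 200 observations, 30 variables
C = np.abs(np.corrcoef(X, rowvar=False))   # similarity = |correlation|
print(len(tmfg(C)))                        # 84 = 3 * 30 - 6
```

Starting from a four-vertex clique and splitting one triangular face per inserted vertex keeps the graph planar by construction and yields exactly 3N - 6 edges; the naive double loop above is where the bookkeeping of the published algorithm makes the difference at scale.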

[1] R. N. Mantegna, Eur. Phys. J. B 11 (1999) 193-197.
[2] T. Aste, T. Di Matteo, S. T. Hyde, Physica A 346 (2005) 20.
[3] M. Tumminello, T. Aste, T. Di Matteo, R. N. Mantegna, PNAS 102(30) (2005) 10421.
[4] N. Musmeci, T. Aste, T. Di Matteo, PLoS ONE 10(3) (2015) e0116201.
[5] N. Musmeci, T. Aste, T. Di Matteo, Journal of Network Theory in Finance 1(1) (2015) 1-22.
[6] W.-M. Song, T. Di Matteo, T. Aste, PLoS ONE 7 (2012) e31929.
[7] G. Previde Massara, T. Di Matteo, T. Aste, Journal of Complex Networks 5(2) (2016) 161.
[8] F. Pozzi, T. Di Matteo, T. Aste, Scientific Reports 3 (2013) 1665.
[9] N. Musmeci, T. Aste, T. Di Matteo, Scientific Reports 6 (2016) 36320; doi:10.1038/srep36320.
[10] W. Barfuss, G. Previde Massara, T. Di Matteo, T. Aste, Phys. Rev. E 94 (2016) 062306.

Modelling Intelligent Urban Mobility: The Data and Knowledge at the Service of Citizens    
Teresa Galvão
University of Porto / INESC TEC, Portugal

Teresa Galvão has a background in Mathematics from the University of Coimbra, a Master’s in Electrical and Computer Engineering, and a PhD in Engineering Sciences from the University of Porto. She is an Assistant Professor at the Faculty of Engineering of the University of Porto and a senior researcher at INESC TEC in Porto. She has participated in several national and European R&D projects in areas related to transportation systems and mobility. She collaborates regularly with the largest public transport companies in Portugal as a researcher and consultant, and was responsible for the development and implementation of several innovative systems for operational planning, mobile ticketing, and passenger information at several of those companies. This academic and professional background has given her a broad, multidisciplinary perspective on current transportation and mobility challenges. Her main research interests are operational research, information systems, human-computer interaction, and transportation systems. She has more than 70 scientific publications and has supervised 9 PhD students and more than 100 MSc students.
She is co-founder and CEO of OPT-Otimização e Planeamento de Transportes, SA, a company that develops innovative solutions for the optimization of public transport operations, the provision of passenger information, and mobility management.

ABSTRACT
Cities are growing fast, and mobility plays a crucial role in people’s lives. Urban mobility is a multidisciplinary field involving architects, urban planners, engineers, computer scientists and, more recently, data scientists. New mobility services are emerging, most of them supported by advanced information and communication technologies: the internet, sensors, the IoT, and mobile devices. All of these devices and technologies generate vast volumes of data. In recent years, in many cities around the world, transforming these data into meaningful knowledge that improves the relationship between citizens and their city has become an exciting and engaging challenge. Operational Planning Systems, Transit Management Systems, Automatic Fare Collection Systems, and Mobile Ticketing Systems are a few examples of data providers. Computer science explores the data collected from these systems in several areas: (i) optimizing the resources allocated to urban mobility services; (ii) improving the reliability and robustness of these services; (iii) adapting transport services to citizens’ mobility profiles; (iv) providing innovative ticketing services; and (v) calculating and visualizing performance indicators. In this talk, the main challenges and difficulties will be presented and discussed with the aid of real-world examples.
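
As a tiny illustration of item (v), the sketch below computes one common indicator, on-time performance, from vehicle departure records; the data, the record layout, and the tolerance window are invented for the example rather than taken from any particular operator.

```python
from datetime import datetime, timedelta

# Hypothetical AVL-style records: (stop_id, scheduled, actual) departures.
records = [
    ("S1", datetime(2020, 6, 1, 8, 0),  datetime(2020, 6, 1, 8, 2)),
    ("S2", datetime(2020, 6, 1, 8, 10), datetime(2020, 6, 1, 8, 17)),
    ("S3", datetime(2020, 6, 1, 8, 20), datetime(2020, 6, 1, 8, 21)),
    ("S4", datetime(2020, 6, 1, 8, 30), datetime(2020, 6, 1, 8, 29)),
]

def on_time_performance(records, early=timedelta(minutes=1),
                        late=timedelta(minutes=5)):
    """Share of departures within [-early, +late] of the timetable."""
    on_time = sum(1 for _, sched, actual in records
                  if -early <= actual - sched <= late)
    return on_time / len(records)

print(f"On-time performance: {on_time_performance(records):.0%}")  # 75%
```

A real deployment would of course draw on GTFS timetables and live AVL feeds rather than hard-coded records.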

Delivering Mission Critical Computational and Data Science Applications for U.S. Department of Energy via the Exascale Computing Project
Douglas Kothe
Exascale Computing Project, USA
WEB

Douglas B. Kothe (Doug) has over three decades of experience in conducting and leading applied R&D in computational applications designed to simulate complex physical phenomena in the energy, defense, and manufacturing sectors. Doug is currently the Director of the Exascale Computing Project (ECP). Prior to that, he was Deputy Associate Laboratory Director of the Computing and Computational Sciences Directorate (CCSD) at Oak Ridge National Laboratory (ORNL). Other prior positions for Doug at ORNL, where he has been since 2006, include Director of the Consortium for Advanced Simulation of Light Water Reactors, DOE’s first Energy Innovation Hub (2010-2015), and Director of Science at the National Center for Computational Sciences (2006-2010).
Before coming to ORNL, Doug spent 20 years at Los Alamos National Laboratory, where he held a number of technical, line, and program management positions, with a common theme being the development and application of modeling and simulation technologies targeting multi-physics phenomena characterized in part by the presence of compressible or incompressible interfacial fluid flow. Doug also spent one year at Lawrence Livermore National Laboratory in the late 1980s as a physicist in defense sciences.
Doug holds a Bachelor of Science in Chemical Engineering from the University of Missouri–Columbia (1983) and a Master of Science (1986) and Doctor of Philosophy (1987) in Nuclear Engineering from Purdue University.

ABSTRACT
The Exascale Computing Project (ECP), launched in 2016 by the US Department of Energy (DOE), is an aggressive research, development, and deployment project focused on the delivery of mission critical applications, an integrated software stack, and exascale hardware technology advances. These products are being deployed to DOE high performance computing (HPC) facilities on pre-exascale and ultimately exascale computers, where they will address critical challenges in national security, energy assurance, economic competitiveness, healthcare, and scientific discovery. Illustrative examples will be given of how the ECP teams are delivering in the project’s three areas of technical focus:

  • Applications: Exascale-capable applications are a foundational element of the ECP and are the delivery vehicle for solutions and insights to key national challenges and emerging technical areas such as machine learning and artificial intelligence. Problems heretofore intractable are accessible with ECP applications.
  • Software Technologies: Software technologies play an essential enabling role as the underlying layer that supports application integration and efficacy on computing systems. An expanded and vertically integrated software stack is being developed, including advanced mathematical libraries and frameworks, extreme-scale programming environments, tools, and visualization libraries.
  • Hardware and Integration: ECP is focused on integration of applications, software, and hardware innovations to ensure a capable exascale computing ecosystem. Working closely with the DOE HPC facilities, the project supports US HPC vendor research and development of innovative architectures for competitive exascale system designs.

Modelling Transport in the Lymphatics: The Inner Workings and Failings of the Body’s Sewer System
James Moore
Imperial College London, UK

Dr. Moore was born in Toccoa, Georgia, and received his Bachelor of Mechanical Engineering in 1987, his Master of Science in Mechanical Engineering in 1988, and his Ph.D. in 1991, all from the Georgia Institute of Technology. He was the first PhD student of Dr. David N. Ku, MD PhD, and his thesis work was a collaborative project with vascular surgeon Dr. Christopher Zarins and vascular pathologist Dr. Seymour Glagov. He received postdoctoral training at the Swiss Federal Institute of Technology in Lausanne from 1991 to 1994, where he also helped set up a new biomedical engineering lab. From 1994 to 2003, Dr. Moore served as a professor of Mechanical and Biomedical Engineering at Florida International University. He moved to Texas A&M University in 2003, where he served as the Carolyn S. and Tommie E. Lohman ’59 Professor of Biomedical Engineering and Director of Graduate Studies. In January 2013, he joined Imperial College as the Bagrit and Royal Academy of Engineering Chair in Medical Device Design, and Director of Research for the Department of Bioengineering.
Dr. Moore’s research interests include Cardiovascular Biomechanics, Stents, Implantable Devices, Atherosclerosis, and the Lymphatic System. His research focuses on the role of biomechanics in the formation and treatment of diseases such as atherosclerosis and cancer. His cardiovascular biomechanics research includes the first finite element models of artery walls to include residual stress, the first studies of the effects of combined flow and stretch on vascular endothelium, early work on the effects of myocardial contraction on coronary artery flow patterns, and the first studies of the effects of stents on both blood flow patterns and artery wall stress. This work resulted in the development of two novel stent designs aimed at optimizing post-implant biomechanics for the prevention of restenosis, as well as new testing devices for implants that employ more physiologic mechanical forces (currently marketed by Bose). His work on the effects of stretch gradients on cells was awarded best paper of the year in the Journal of Biomechanical Engineering for 2011. In collaboration with Dr. Kumbakonam Rajagopal, he developed constitutive models of strain-accelerated degradation of polymers used in medical implants. His research on lymphatic system biomechanics, initiated in 2004 with Dr. David Zawieja, has provided unprecedented insight into the pumping characteristics of the system and the transport of nitric oxide, antigens, and chemokines in lymphatic tissues. He is currently developing two technologies for preventing and resolving secondary lymphedema, which typically forms subsequent to cancer surgery. Along with his funding from government, charity, and industry sources, Dr. Moore has received multiple patents for medical devices and testing equipment. Dr. Moore has also co-founded two startup companies.

ABSTRACT
Lymphatic vessels play an important role in maintaining fluid balance by returning interstitial fluid and proteins to the blood. The system also plays a crucial role in transporting immune cells to the lymph nodes, where adaptive immunity is formed. Cancer cells exploit the lymphatic system to spread to other parts of the body. All of the deadliest forms of cancer spread via the lymphatics, often setting up secondary tumours in other parts of the body such as the brain, lungs, and bone marrow. Approximately 90% of all cancer deaths are due to these secondary tumours. However, knowledge of the human lymphatic system is severely limited by the lack of technologies to measure pressure, flow rate, or diameter in any of its vessels. Interstitial fluid that has been transported out through the walls of small blood vessels is collected by highly porous and largely passive initial lymphatic vessels, which converge into collecting lymphatic vessels featuring one-way valves and mural musculature. Lymphatic muscle cells actively contract to push fluid centrally. Computational modelling has played an important role in informing our understanding of lymphatic pumping, such as how the system generates the suction required to draw in fluid from interstitial tissue spaces, many of which exhibit subatmospheric pressures. Disruptions to lymphatic pumping often result in an incurable swelling referred to as lymphoedema. Improving our understanding of lymphatic function across multiple length and time scales will play a role in developing clinical interventions to prevent and cure these devastating disease processes.
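
For a flavour of the kind of computational model discussed, the sketch below is a minimal lumped-parameter lymphangion: one contractile chamber between two one-way valves, driven by a periodic active pressure. It is a generic textbook-style reduction with made-up round-number parameters, not the speaker’s research code.

```python
import math

# Illustrative lumped-parameter lymphangion. All values are arbitrary units.
E_pass = 2.0            # passive elastance: pressure rises as the chamber fills
A_max = 4.0             # peak active pressure from lymphatic muscle contraction
V0 = 1.0                # unstressed chamber volume
R = 10.0                # flow resistance of each valve segment
P_in, P_out = 0.0, 1.0  # upstream (interstitial) and downstream pressures
T = 2.0                 # contraction period (s)

def pressure(V, t):
    """Chamber pressure = passive filling curve + periodic active squeeze."""
    active = A_max * 0.5 * (1.0 - math.cos(2.0 * math.pi * t / T))
    return E_pass * (V - V0) + active

V, dt, pumped, cycles = 1.2, 1e-3, 0.0, 20
for step in range(int(cycles * T / dt)):
    t = step * dt
    P = pressure(V, t)
    Q_in = max(0.0, (P_in - P) / R)    # inlet valve opens only under suction
    Q_out = max(0.0, (P - P_out) / R)  # outlet valve opens only for ejection
    V += dt * (Q_in - Q_out)           # conservation of volume
    pumped += dt * Q_out

print(f"mean pumped flow ~ {pumped / (cycles * T):.3f} (arbitrary units)")
```

When the relaxed chamber is squeezed below its unstressed volume, its pressure falls below the inlet pressure and the inlet valve opens, reproducing in miniature the suction-based filling described above.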

Fostering Parallel Thinking in Computational Science: Preparing for Petascale and Beyond
Robert Panoff
The Shodor Education Foundation, USA

Dr. Robert M. Panoff is founder and Executive Director of The Shodor Education Foundation, Inc., and has been a consultant at several national laboratories. He is also a frequent presenter at NSF-sponsored workshops on visualization, supercomputing, and networking, and continues to serve as a consultant for the education program at the National Center for Supercomputing Applications at the University of Illinois at Urbana-Champaign. He has served on the advisory panel for the Applications of Advanced Technology program at NSF. Dr. Panoff received his M.A. and Ph.D. in theoretical physics from Washington University in St. Louis, undertaking both pre- and postdoctoral work at the Courant Institute of Mathematical Sciences at New York University.
As principal investigator on several NSF grants that seek to explore the interaction of high performance computing technologies and education, he worked to develop a series of interactive simulations which combine supercomputing resources and desktop computers. Besides developing and teaching a new course in Information Technologies, Dr. Panoff continues an active research program in computational condensed matter physics while defining and implementing educational initiatives at the Shodor Foundation.
At Kansas State University and Clemson University from 1986 to 1990, he developed a fully interdisciplinary computational science and engineering course. He served as director of the Carolinas Institute in Computational Science, an NSF-funded initiative in Undergraduate Faculty Enhancement, from 1991 to 1993. His work has won several major science and education awards, including the 1990 Cray Gigaflop Performance Award in Supercomputing, the 1994 and 1995 Undergraduate Computational Science Education Awards from the U.S. Department of Energy, and a 1995 Achievement Award from the Chicago Chapter of the Society for Technical Communication. In 1993-1994, his interactive simulations were used as the basis of an international science collaboration demonstrating network technologies that involved four schools from the Department of Defense Dependent Schools, for which he received a letter of commendation from the Department of Defense. In recognition of Dr. Panoff’s efforts in undergraduate faculty enhancement and curriculum development, the Shodor Foundation was named in 1996 as a Foundation Partner of the National Science Foundation for the revitalization of undergraduate education.

ABSTRACT
High Performance Computing continues to reduce the “time to science” substantially while improving the quality of the science itself. Nearly every exploration in the social, life, and physical sciences requires the efficient implementation of complex models of increasing size and scale, along with the application of massively parallel computing and the analysis of big data. With compilers still far from parallelizing code automatically, the question arises as to how much human intervention is required, across the various parallel paradigms, to achieve substantial speed-ups on the various parallel architectures. This talk will discuss the challenges in preparing a new generation of computational scientists who are faced with a heterogeneous set of multi-core/many-core environments. The goal is to spark a discussion of how we can focus on parallel thinking in scientific problem formulation and solution from the outset, thereby exploiting the parallelism in nature to illuminate the nature of parallelism.
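
As a classroom-style example of parallel thinking, the Monte Carlo estimate of pi below is formulated from the outset as independent sub-problems plus a single reduction, which is exactly the shape that maps onto multi-core hardware; the worker count and sample sizes are arbitrary choices for illustration.

```python
import multiprocessing as mp
import random

def count_hits(args):
    """Count random points falling inside the unit quarter-circle."""
    n, seed = args
    rng = random.Random(seed)
    return sum(1 for _ in range(n)
               if rng.random() ** 2 + rng.random() ** 2 <= 1.0)

if __name__ == "__main__":
    total, workers = 1_000_000, 4
    tasks = [(total // workers, seed) for seed in range(workers)]
    with mp.Pool(workers) as pool:               # independent sub-problems...
        hits = sum(pool.map(count_hits, tasks))  # ...then a single reduction
    print(f"pi is approximately {4.0 * hits / total:.4f}")
```

The decomposition, not the library call, is the lesson: once a problem is expressed as independent work plus a combine step, the same formulation carries over to MPI ranks or GPU threads.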

Data Science in Earth Observation
Xiaoxiang Zhu
Technical University of Munich, Germany

Xiaoxiang Zhu is Professor for Signal Processing in Earth Observation (SiPEO, www.sipeo.bgu.tum.de) at the Technical University of Munich (TUM) and the German Aerospace Center (DLR), Germany. She is also the founding head of the department of EO Data Science at DLR’s Earth Observation Center. Zhu received her Master’s (M.Sc.) degree, her Doctor of Engineering (Dr.-Ing.) degree, and her Habilitation, all in the field of signal processing, from TUM in 2008, 2011, and 2013, respectively. She was a guest scientist or visiting professor at the Italian National Research Council (CNR-IREA) in Naples, Italy; Fudan University in Shanghai, China; the University of Tokyo, Japan; and the University of California, Los Angeles, United States, in 2009, 2014, 2015, and 2016, respectively.
Her research focuses on signal processing and data science in Earth observation. Geoinformation derived from Earth observation satellite data is indispensable for many scientific, governmental, and planning tasks, and Earth observation has arrived in the Big Data era with ESA’s Sentinel satellites and NewSpace companies. Xiaoxiang Zhu develops explorative signal processing and machine learning algorithms, such as compressive sensing and deep learning, to improve information retrieval from remote sensing data and to enable breakthroughs in geoscientific and environmental research. In particular, by fusing petabytes of EO data from satellites to social media, she aims at tackling challenges such as the mapping of global urbanization.
Xiaoxiang Zhu is an associate editor of IEEE TGRS and SPIE JARS and the author of 250 scientific publications, about 160 of them peer-reviewed full papers, and the recipient of 10 paper awards. She has received several important scientific awards and grants, for example the Heinz Maier-Leibnitz-Preis of the German Research Foundation (DFG) in 2015, Innovators under 35 of Technology Review Germany in 2015, the IEEE GRSS Early Career Award in 2016, an ERC Starting Grant in 2016, the PRACE Ada Lovelace Award for HPC in 2018, and a Helmholtz Excellence Professorship in 2018.

ABSTRACT
Geoinformation derived from Earth observation satellite data is indispensable for many scientific, governmental, and planning tasks. Geoscience, atmospheric sciences, cartography, resource management, civil security, disaster relief, as well as planning and decision support are just a few examples. Furthermore, Earth observation has irreversibly arrived in the Big Data era, for example with ESA’s Sentinel satellites and the boom of NewSpace companies. This requires not only new technological approaches to manage and process large amounts of data, but also new analysis methods. Here, methods of data science and artificial intelligence (AI), such as machine learning, become indispensable.
In this keynote, explorative signal processing and machine learning algorithms, such as compressive sensing and deep learning, will be shown to significantly improve information retrieval from remote sensing data and, consequently, to enable breakthroughs in geoscientific and environmental research. In particular, by fusing petabytes of EO data from satellites to social media with tailored and sophisticated data science algorithms, it is now possible to tackle unprecedented, large-scale, influential challenges, such as the mapping of global urbanization, one of the most important megatrends of global change.
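
To give a concrete feel for the compressive sensing ingredient, the sketch below recovers a sparse signal from far fewer random measurements than unknowns using iterative soft-thresholding (ISTA); it is a generic toy instance with invented sizes and parameters, not the actual remote sensing pipelines used in this research.

```python
import numpy as np

# Toy compressive sensing: recover a k-sparse signal x from m < n
# random projections y = A @ x by solving the LASSO problem with ISTA.
rng = np.random.default_rng(1)
n, m, k = 200, 60, 5
x_true = np.zeros(n)
x_true[rng.choice(n, k, replace=False)] = rng.standard_normal(k)
A = rng.standard_normal((m, n)) / np.sqrt(m)   # random sensing matrix
y = A @ x_true                                  # compressed measurements

lam = 0.01                                      # regularization weight
L = np.linalg.norm(A, 2) ** 2                   # Lipschitz constant of grad
x = np.zeros(n)
for _ in range(2000):
    g = A.T @ (A @ x - y)                       # gradient of 0.5*||Ax - y||^2
    z = x - g / L                               # gradient step
    x = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # soft threshold

print("relative recovery error:",
      np.linalg.norm(x - x_true) / np.linalg.norm(x_true))
```

The same sparsity-promoting recovery principle underlies much more elaborate applications in Earth observation, where the measurement operator comes from the imaging geometry rather than a random matrix.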