Submission and Deadlines:
All papers (both workshop and main track) for ICCS 2013 should be submitted through our submission system.
Please select the appropriate workshop there.
Unless stated otherwise, submission deadlines for workshops are the same as those listed in our important dates. All deadlines after February 10 are synchronised.
- 3: 7th Workshop on Computational Chemistry and Its Applications
Contact: P. Ramasami
Computational chemistry uses computers to solve chemical problems, applying theoretical methods implemented in software. At the outset of the 21st century, computational chemistry is opening up a wide range of possibilities, often interdisciplinary, thanks to the explosive increase in computer power and software capabilities. Computational chemistry is also becoming an integral part of the chemistry curriculum.
The objective of this workshop is to highlight the latest scientific advances within the broad field of computational chemistry in academia, industry and society.
This workshop will provide an opportunity for researchers from all corners of the world to gather on a single platform to hold discussions, exchange ideas and develop collaborations.
It will also be a suitable platform for researchers from different fields to meet so that ideas for new interdisciplinary research can emerge.
This will be the seventh workshop in the series, following successful events at ICCS since 2003.
This workshop will consider only original work, and submissions will be selected after peer review.
The accepted full manuscripts will be published in Procedia Computer Science.
- 4: The 4th Workshop on Computational Optimization, Modelling and Simulation (COMS2013)
Contact: X.S. Yang, S. Koziel, L. Leifsson
The 4th Workshop on Computational Optimization, Modelling and Simulation (COMS 2013) is the fourth event in the COMS workshop series. COMS 2013 continues to provide a timely forum for, and to foster discussion of, cross-disciplinary research and development in computational optimization, computer modelling and simulation. It will focus on new algorithms and methods, new trends, and the latest developments in computational optimization, modelling and simulation, as well as applications in science, engineering and industry.
Topics include (but are not limited to):
· Computational optimization, engineering optimization and design
· Bio-inspired computing and algorithms
· Metaheuristics (bat algorithm, cuckoo search, firefly algorithm, ABC, GA, PSO, etc.)
· Simulation-driven design and optimization of computationally expensive objectives
· Surrogate- and knowledge-based optimization algorithms
· Scheduling and network optimization as well as design of experiments
· Integrated approach to optimization and simulation
· New optimization algorithms, modelling techniques related to optimization
· Application case studies in engineering and industry.
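As a concrete illustration of the metaheuristics listed above, the following sketch shows a bare-bones particle swarm optimization (PSO) minimizing a simple test function. The function name, coefficient values (inertia 0.7, attraction 1.5) and the sphere objective are illustrative assumptions, not prescriptions of the workshop.

```python
import random

def pso(objective, dim, bounds, n_particles=30, iters=200, seed=0):
    """Minimal PSO: each particle blends its own best and the swarm's
    best position into its velocity update."""
    rng = random.Random(seed)
    lo, hi = bounds
    pos = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]                      # personal best positions
    pbest_val = [objective(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]     # global best so far
    w, c1, c2 = 0.7, 1.5, 1.5                        # inertia and attraction weights
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                # Clamp positions to the search box.
                pos[i][d] = min(max(pos[i][d] + vel[i][d], lo), hi)
            val = objective(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

# Minimize the sphere function; its optimum is 0 at the origin.
best, best_val = pso(lambda x: sum(v * v for v in x), dim=3, bounds=(-5.0, 5.0))
print(best_val)  # a small value near the optimum of 0
```

The same skeleton carries over to the other population-based methods in the list; only the position/velocity update rule changes.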
Xin-She Yang (Middlesex University, UK)
Slawomir Koziel (Reykjavik University, Iceland)
Leifur Leifsson (Reykjavik University, Iceland)
- 9: 10th International Workshop on Modeling and Computing Multiscale Systems
Contact: S.M.M.S. Workshop Organizers
The modeling and computation of multiphysics and multiscale systems constitutes a grand challenge in computational science, and is widely applied in fields such as astrophysics, chemical engineering, plasma physics, materials science, biomedical science, and aerospace and automotive engineering. Most real-life systems encompass interactions within and between a wide range of physical phenomena, each of which may operate on different time and length scales. They require the development of sophisticated numerical models and computational techniques to accurately simulate the diversity and complexity of multiscale and multiphysics problems, and to effectively capture the wide range of relevant physical phenomena within these simulations. Additionally, these multiscale models frequently need large-scale computing capabilities as well as dedicated software and services that enable the exploitation of existing and evolving e-infrastructures.
This workshop aims to provide a forum for multiscale application modelers, framework developers and experts from the distributed infrastructure communities to identify and discuss challenges in, and possible solutions for, modeling multiscale systems, as well as their execution on distributed e-infrastructures.
This MCMS workshop is the successor of the successful SMMS series of workshops on Simulations of Multiphysics Multiscale Systems, organized by the same chairs over the last 10 years. It has been merged with the workshop on Distributed Multiscale Computing (DMC2011), held by the co-chairs, in response to the need for a platform for both scientific modelers and experts on execution tools and e-infrastructure.
Specific topics include (but are not limited to):
-- Modeling of multiphysics and/or multiscale systems. Of particular interest are: Monte Carlo methods, particle-based methods, mesoscopic models such as cellular-automata, lattice gas and lattice-Boltzmann methods, computational fluid dynamics and computational solid mechanics;
-- Multiphysics and/or multiscale modeling of biological or biomedical systems. This includes computational models of tissue- and organo-genesis, tumor growth, blood vessels formation and interaction with the hosting tissue, biochemical transport and signaling, biomedical simulations for surgical planning, etc.
-- Novel approaches to combine different models and scales in one problem solution;
-- Challenging applications in industry and academia, e.g. time-dependent 3D systems, multiphase flows, fluid-structure interaction, chemical engineering, plasma physics, material science, biophysics, automotive industry, etc.;
-- Advanced numerical methods for solving multiphysics multiscale problems;
-- Challenging multiscale applications in industry and academia from different communities;
-- Environments and frameworks for simulation of multiscale models;
-- Cloud-based support for multiscale computing;
-- e-infrastructure for distributed multiscale computing (computing, storage, networking);
-- Dedicated services required for distributed multiscale computing.
- 10: Workshop on Computational and Algorithmic Finance
Contact: A. Itkin
This workshop is intended to present advances in numerical and computational techniques for the pricing, hedging and risk management of financial instruments. Topics include (but are not limited to) those usually covered by the Journal of Computational Finance, namely:
- Numerical solutions of pricing equations: finite differences, finite elements, and special techniques in one and multiple dimensions.
- Simulation approaches in pricing and risk management: advances in Monte Carlo and quasi-Monte Carlo methodologies; new strategies for market factors simulation.
- Optimization techniques in hedging and risk management.
- Fundamental numerical analysis relevant to finance: effect of boundary treatments on accuracy; new discretization of time-series analysis.
- Developments in free-boundary problems in finance: alternative ways and numerical implications in American option pricing.
- CVA, DVA, FVA, valuing portfolio of instruments.
- Pricing and hedging in incomplete markets
- New techniques in Machine Learning as applied to finance (Support Vector Machines, Neural Networks etc.)
- Numerical techniques and tools for Algorithmic and High-Frequency trading, Market making etc.
- Parallel computing as applied to finance
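As a minimal illustration of the Monte Carlo simulation approaches in scope, the following sketch prices a European call under the standard Black-Scholes assumptions (geometric Brownian motion for the asset price). The function name and parameter values are illustrative choices, not part of the call for papers.

```python
import math
import random

def mc_european_call(s0, k, r, sigma, t, n_paths, seed=0):
    """Monte Carlo price of a European call: simulate terminal prices
    under geometric Brownian motion, then average the discounted payoff."""
    rng = random.Random(seed)
    drift = (r - 0.5 * sigma ** 2) * t
    vol = sigma * math.sqrt(t)
    payoff_sum = 0.0
    for _ in range(n_paths):
        z = rng.gauss(0.0, 1.0)                # standard normal draw
        s_t = s0 * math.exp(drift + vol * z)   # terminal asset price
        payoff_sum += max(s_t - k, 0.0)        # call payoff at maturity
    return math.exp(-r * t) * payoff_sum / n_paths

# At-the-money call, one year to maturity.
price = mc_european_call(s0=100.0, k=100.0, r=0.05, sigma=0.2, t=1.0, n_paths=100_000)
print(round(price, 2))  # close to the Black-Scholes value of about 10.45
```

Variance-reduction techniques (antithetic variates, control variates, quasi-Monte Carlo sequences) of the kind solicited above slot directly into the sampling loop.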
For more detailed information, please visit our webpage.
- 18: Knowledge representation and applied models and metadata in computational science (KREAM)
Contact: M.A. Sicilia
Computational science techniques in many cases require models and representations of knowledge about the complex processes supporting research in different fields, and complex models are also required to capture the research context itself.
This has resulted in the continuous development, sharing, reuse and enrichment of ontologies, metadata schemas and other kinds of models for computational science tasks. Relevant examples are scientific ontologies such as the Gene Ontology or the Plant Ontology and metadata schemas such as the Ecological Metadata Language (EML), but many other, less widely adopted models are also regularly used in computational science research. These artifacts call for specific methods, techniques and scientific infrastructure support that deserve separate attention.
The Knowledge representation and applied models and metadata in computational science (KREAM) workshop aims at gathering high quality research results about:
- the use of knowledge representations, schemas and models in computational science for concrete applications
- the design of e-science infrastructure support, and the analysis, development or evaluation of the representations themselves
- Semantic Sensor Networks for e-science
- Data analysis and models of machine learning from sensor networks
- 19: Discrete, stochastic simulations with an emphasis on epidemiological application
Contact: M.K. Roh
For a myriad of investigations into natural processes, computational resources have been a bottleneck to the validation of scientific hypotheses about the fundamental nature of those processes. In fields such as epidemiology, this bottleneck has profound consequences for the development of strategies to combat the spread of diseases. The rapid advancement and availability of computational resources have provided the ability to test hypotheses about natural processes through the construction of high-dimensional discrete, stochastic models.
The objective of this workshop is to focus on that frontier of computational science: the intersection of numerical algorithms, high-dimensional complex systems, and an ever-increasing computational resource pool. We aim to bring together scientists from the broad field of computational science in academia, industry, and society.
This workshop will also have special emphasis on, but not be limited to, discrete models and numerical methods that focus on epidemiological processes. Bringing together specialists in computation, modeling, and analysis, the workshop will help foster the development of new methods and models through collaboration in this cross-disciplinary research field.
Specific topics include (but are not limited to):
- Discrete models for disease propagation and vaccination campaigns
- High-dimensional epidemiological models
- Epidemiological applications for near-eradication regimes
- Modeling of malaria, polio, HIV, tuberculosis, and neglected tropical diseases
- Models for pathogen life cycle, within-host dynamics, and vector populations and transmission
- Spatially inhomogeneous processes, e.g. reaction-diffusion systems
- Rare-event probability estimation, importance sampling, variance reduction
- Epidemiological and biochemical modeling using Monte Carlo methods or stochastic differential equations
- Model formulation and parameter fitting methods from empirical data
- Sensitivity analysis for stochastic numerical methods
- Dimensional reduction and numerical methods for the master equation
- Compressive sensing and sparse data analysis associated with field observations.
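A minimal example of the discrete, stochastic models in scope is Gillespie's stochastic simulation algorithm applied to a closed SIR epidemic. The function name, rate constants and population sizes below are illustrative assumptions, not a reference implementation.

```python
import random

def ssa_sir(s, i, r, beta, gamma, t_max, seed=0):
    """Gillespie stochastic simulation of a closed SIR epidemic.
    Events: infection (S,I) -> (S-1,I+1) and recovery (I,R) -> (I-1,R+1)."""
    rng = random.Random(seed)
    n = s + i + r                     # population size is fixed (closed system)
    t = 0.0
    while t < t_max and i > 0:
        rate_inf = beta * s * i / n   # mass-action infection propensity
        rate_rec = gamma * i          # recovery propensity
        total = rate_inf + rate_rec
        t += rng.expovariate(total)   # exponential waiting time to next event
        if rng.random() * total < rate_inf:
            s -= 1; i += 1            # infection event
        else:
            i -= 1; r += 1            # recovery event
    return s, i, r

# One stochastic outbreak: 10 initial infectives in a population of 1000.
final_s, final_i, final_r = ssa_sir(s=990, i=10, r=0, beta=0.3, gamma=0.1, t_max=500.0)
print(final_s + final_i + final_r)  # 1000: the population is conserved
```

High-dimensional variants of exactly this event loop, together with importance sampling and variance reduction for rare-event regimes, are among the workshop's core topics.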
- 20: Third International Workshop on Advances in High-Performance Computational Earth Sciences: Applications and Frameworks
Contact: Y. Cui
The IHPCES workshop will provide a forum for presentation and discussion of state-of-the-art research in high-performance computational earth sciences. Emphasis will be on novel advanced high-performance computational algorithms, formulations and simulations, as well as the related issues of computational environments and infrastructure for the development of high-performance computational earth sciences. The workshop facilitates communication between earth scientists, applied mathematicians, and computational and computer scientists, and presents a unique opportunity for them to exchange advanced knowledge, insights and science discoveries. With the imminent arrival of the exascale era, strong multidisciplinary collaborations between these diverse scientific groups are critical for the successful development of high-performance computational earth sciences applications. Presentations and audience representation from the broad earth sciences community are strongly encouraged. Contributions are solicited in (but not restricted to) the following areas:
- Large-scale simulations using modern high-end supercomputers in earth sciences, such as atmospheric science, ocean science, solid earth science, and space & planetary science, as well as multi-physics simulations.
- Advanced numerical methods for computational earth sciences, such as FEM, FDM, FVM, BEM/BIEM, mesh-free methods, particle methods, etc.
- Numerical algorithms and parallel programming models for computational earth sciences.
- Optimization and reengineering of applications for multi/many-core processors and accelerators.
- Strategy, implementation and applications of pre/post-processing and handling of large-scale data sets for computational earth sciences, such as parallel visualization, parallel mesh generation, I/O, data mining, etc.
- Frameworks and tools for development of codes for computational earth sciences on peta/exascale systems.
Yifeng Cui (San Diego Supercomputer Center, USA)
Xing Cai (Simula Research Laboratory, Norway)
- 21: Eighth International Workshop on Automatic Performance Tuning (iWAPT2013)
Contact: D. Gimenez
iWAPT2013 is the eighth in a series of workshops dedicated to automatic performance tuning. It provides opportunities for researchers and practitioners in all fields related to automatic performance tuning to exchange ideas and experiences on algorithms, libraries, and applications tuned for recent computing platforms. The workshop will consist of a few guest speaker presentations and several presentations of peer-reviewed papers. The main topics of interest are: performance modelling; adaptive algorithms; numerical algorithms and libraries; scientific applications; parallel and distributed computing; computing with GPGPUs and accelerators; database management systems; numerical precision and stability; resource restrictions; low-power computing; empirical compilation; automatically-tuned code generation; frameworks and theories of automatic tuning and software optimization; and autonomic and context-aware computing.
- 23: 6th Workshop on "Biomedical and Bioinformatics Challenges for Computer Science" (BBC 2013)
Contact: M. Cannataro
Emerging technologies in genomics, proteomics, interactomics and other life science areas are generating an increasing amount of complex data and information. The evolving data and information ecosystem includes large experimental “omics” data sets, natural language text from the scientific literature and the Web, and highly connected heterogeneous information networks, such as open linked data distributed on the Internet. Integrating and analysing this information in the context of modern life science research problems and biomedical applications poses a considerable challenge for bioinformatics and computational biology. Traditionally, bioinformatics has concentrated on methods and technologies facilitating the acquisition, storage, organization, archiving, analysis and visualization of biological and medical data. Computational biology, on the other hand, has emphasized mathematical and computational techniques facilitating the modelling and simulation of biomedical processes and systems. In recent years the distinction between these two fields has become increasingly blurred. In order to tackle the growing complexity associated with emerging and future life science challenges, bioinformatics and computational biology researchers and developers need to explore, develop and apply novel computational concepts, methods, tools and systems. Many of these new approaches are likely to involve advanced and large-scale computing techniques, technologies and infrastructures such as:
- High-performance architectures and systems (e.g., multicore, GPU);
- Distributed computing (e.g. grid, cloud, peer-to-peer, Web services, e-infrastructures);
- Data and information management and integration (e.g., databases, data warehousing, data fusion);
- Knowledge discovery/management (e.g., knowledge bases, data mining, ontologies, workflow);
- Computational simulation (mechanistic, stochastic, multi-model);
- Artificial and computational intelligence (machine learning, agents, evolutionary techniques).
Together, these topics cover the key bioinformatics and computational biology techniques and technologies encountered in modern life science environments: (1) Advanced computing architectures/infrastructures; (2) Data/information management and integration; (3) Data/information analysis and knowledge discovery; (4) Integration of quantitative/symbolic knowledge into executable biomedical “theories” or models. The aim of this workshop is to bring together computer and life scientists to discuss emerging and future directions in these areas.
- 31: Workshop on Teaching Computational Science 2013 (WTCS 2013)
Contact: A.B. Shiflet
The seventh Workshop on Teaching Computational Science (WTCS 2013) solicits submissions that describe innovations in teaching computational science in its various aspects (e.g. modeling and simulation, high-performance and large-data environments) at all levels and in all contexts. Typical topics include, but are not restricted to, innovations in the following areas:
- course content,
- curriculum structure,
- methods of instruction,
- methods of assessment,
- tools to aid in teaching or learning,
- evaluations of alternative approaches, and
- non-academic training in computational sciences.
These innovations may be in the context of formal courses or self-directed learning; they may involve, for example, introductory programming, service courses, specialist undergraduate or postgraduate topics, industry-related short courses. We welcome submissions directed at issues of current and local importance, as well as topics of international interest. Such topics may include transition from school to university, articulation between vocational and university education, quality management in teaching, teaching people from other cultures, attracting and retaining female students, and flexible learning.
- 32: Agent-Based Simulations, Adaptive Algorithms and Solvers
Contact: M. Paszynski
The aim of this workshop is to integrate results of different domains of computer science, computational science and mathematics.
We invite papers oriented toward simulations, either hard simulations by means of finite element or finite difference methods, or soft simulations by means of evolutionary computations, particle swarm optimization and others.
The workshop is most interested in simulations performed using agent-oriented systems or adaptive algorithms, but simulations performed by other kinds of systems are also welcome.
Agent-oriented systems appear to be an attractive tool for numerous application domains.
Adaptive algorithms allow a significant decrease in computational cost by concentrating computational resources on the most important aspects of the problem.
To give rather flexible guidance on the subject, the following topics are suggested.
Those of a theoretical nature, such as:
-multi-agent systems in high-performance computing,
-agent-oriented approach to adaptive algorithms,
-mathematical modeling and asymptotic analysis,
-finite element or finite difference methods.
And those with an emphasis on applications:
-application of adaptive algorithms in simulation,
-simulation and multi-agent systems,
-application of adaptive algorithms in finite element and finite difference simulations,
-application of multi-agent systems in computational modeling,
-multi-agent systems in integration of different approaches.
- 33: Architecture, Languages, Compilation and Hardware support for Emerging ManYcore systems (ALCHEMY 2013)
Contact: L. Cudennec
Massively parallel processors are becoming one of the key actors in next-generation embedded and high-performance computing architectures. They offer thousands of cores, integrated memory and a network on a single chip, while keeping power consumption low compared to regular chip multiprocessors (CMPs). To date, taking advantage of parallel and distributed architectures such as GPGPU farms, clusters, grids and clouds has exposed the difficulty of offering convenient programmability to the developer while preserving a high level of performance. Manycores face the same challenges, but they rely on paradigms coming from CMP architectures. In this session, we explore the newest academic and industrial work that contributes to the efficient programmability of manycores. Contributions may come from any of the following research fields: programming languages and compilers, runtime generation, architecture support for massive parallelism management and enhanced communications, and new or dedicated operating systems.
* Advanced compilers for programming languages targeting massively parallel architectures
* Advanced architecture support for massive parallelism management
* Advanced architecture support for enhanced communication for CMP/manycores
* New or dedicated OS for massively parallel applications
* Runtime generation for parallel programming on manycores.
As this topic deals with cutting-edge architectures, we plan to advertise this session using communication networks and mailing lists from a wide selection of research fields: from CMP architectures to the HPC, grid computing and peer-to-peer communities, and from architecture design to language theory. A dedicated website will be set up to gather useful information about the workshop and link to the ICCS website. Several highly ranked researchers contacted as potential PC members have already assured us of their support for future advertising through their research networks (e.g. HiPEAC).
- 35: The Tenth Workshop on Computational Finance and Business Intelligence
Contact: Y. Shi
The workshop focuses on computational science aspects of asset/derivatives pricing and financial risk management that relate to business intelligence. It will include, but is not limited to, modeling, numerical computation, soft computing, and algorithmic and complexity issues in arbitrage, asset pricing, futures and option pricing, risk management, credit assessment, interest rate determination, insurance, foreign exchange rate forecasting, online auctions, cooperative game theory, general equilibrium, information pricing, network bandwidth pricing, rational expectations, repeated games, etc.
- 36: Tools for Program Development and Analysis in Computational Science
Contact: K. Fuerlinger
The use of supercomputing technology, parallel and distributed processing, and sophisticated algorithms is of major importance for computational scientists. Yet the scientists' goal is to solve their challenging scientific problems, not the software engineering tasks associated with them. For that reason, computational science and engineering must be able to rely on dedicated support from program development and analysis tools.
The primary intention of this workshop is to bring together developers of tools for scientific computing and their potential users. Paper submissions by both tool developers and users from the scientific and engineering community are encouraged in order to inspire communication between both groups. Tool developers can present to users how their tools support scientists and engineers during program development and analysis. Tool users are invited to report their experiences employing such tools, especially highlighting the benefits and the improvements possible by doing so.
The following areas and related topics are of interest:
Problem solving environments for specific application domains
Application building and software construction tools
Domain-specific analysis tools
Program visualization and visual programming tools
On-line monitoring and computational steering tools
Requirements for (new) tools emerging from the application domain
In addition, we encourage software tool developers to describe use cases and practical experiences of software tools for real-world applications in the following areas:
-Tools for parallel, distributed and network-based computing
-Testing and debugging tools
-Performance analysis and tuning tools
-(Dynamic) Instrumentation and monitoring tools
-Data (re-)partitioning and load-balancing tools
-Checkpointing and restart tools
-Tools for resource management, job queuing and accounting
- 38: Dynamic Data Driven Application Systems - DDDAS 2013
Contact: C.C. Douglas
This workshop covers several aspects of the Dynamic Data Driven Applications Systems (DDDAS) concept, an established approach defining a symbiotic relation between an application and sensor-based measurement systems. Applications can accept and respond dynamically to new data injected into the executing application. In addition, applications can dynamically control the measurement processes. The synergistic feedback control loop between an application simulation and its measurements opens new capabilities in simulations, e.g., the creation of applications with new and enhanced analysis and prediction capabilities, greater accuracy, longer simulations between restarts, and a new methodology for more efficient and effective measurements. DDDAS transforms the way science and engineering are done, with a major impact on the way many functions in our society are conducted, e.g., manufacturing, commerce, transportation, hazard prediction and management, and medicine. The workshop will present such new opportunities, as well as the challenges and approaches in technology needed to enable DDDAS capabilities in applications, relevant algorithms, and software systems. The workshop will showcase ongoing research in these aspects with examples from several important application areas. All related areas in Data-Driven Sciences are included in this workshop.
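The feedback loop described above can be sketched in miniature: a running simulation with a deliberately biased model is "nudged" toward streaming measurements of a hidden true process, so that injected data keeps the simulation on track. All names, rates and the gain value are illustrative assumptions for this toy, not part of the DDDAS methodology itself.

```python
import random

def dddas_nudging_demo(steps=50, seed=0):
    """Toy DDDAS-style loop: a simulation of exponential decay with a
    biased decay rate assimilates noisy streamed measurements of the
    hidden true process, keeping its error small despite the bias."""
    rng = random.Random(seed)
    true_x, sim_x = 10.0, 10.0
    true_rate, model_rate = 0.10, 0.15   # the model's rate is biased high
    gain = 0.5                           # assimilation (nudging) gain
    for _ in range(steps):
        true_x *= 1.0 - true_rate              # hidden reality advances
        sim_x *= 1.0 - model_rate              # forecast step of the model
        obs = true_x + rng.gauss(0.0, 0.05)    # noisy sensor measurement arrives
        sim_x += gain * (obs - sim_x)          # dynamic data injection
    return abs(true_x - sim_x)

err = dddas_nudging_demo()
print(err)  # small residual error despite the biased model
```

In a full DDDAS application the same loop would also steer the sensors (e.g. adjust the measurement rate when the residual grows), closing the control loop in both directions.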
- 42: 2nd Workshop on Computational Approaches to Social Modeling (ChASM)
Contact: B. Gonçalves
Modern life is infused with a myriad of gadgets and new technologies that are quickly becoming online extensions of our offline lives. How we interact with others, where we are and where we go are all facets that are increasingly captured with ever greater detail by our online tools and gadgets.
The digital traces constantly produced by these tools create hitherto unseen possibilities for the study of human behavior, but also pose their own challenges. The avalanche of data we are witnessing demands new tools and concepts for its analysis, and the new problems now within our reach demand the development of new algorithms and models. Recent years have seen a major revival of interest in applying computational and data-driven methods to the study of individual and collective human behavior.
This workshop aims to bring together practitioners of both computer science and social science so that both may better understand the challenges faced by each other and how best they may collaborate to overcome them.
- 43: International Workshop on Computational Flow and Transport: Modeling, Simulations and Algorithms
Contact: S. Sun
Modeling of flow and transport is an essential component of many scientific and engineering applications, with increased interest in recent years. Application areas vary widely, and include groundwater contamination, carbon sequestration, air pollution, petroleum exploration and recovery, weather prediction, drug delivery, material design, chemical separation processes, biological processes, and many others. However, accurate mathematical and numerical simulation of flow and transport remains a challenging topic from many aspects of physical modeling, numerical analysis and scientific computation. Mathematical models are usually expressed via nonlinear systems of partial differential equations, with possibly rough and discontinuous coefficients, whose solutions are often singular and discontinuous. An important step of a numerical solution procedure is to apply advanced discretization methods (e.g. finite elements, finite volumes, and finite differences) to the governing equations. Local mass conservation and compatibility of numerical schemes are often necessary to obtain physically meaningful solutions. Another important solution step is the design of fast and accurate solvers for the large-scale linear and nonlinear algebraic equation systems that result from discretization. Solution techniques of interest include multiscale algorithms, mesh adaptation, parallel algorithms and implementation, efficient splitting or decomposition schemes, and others.
The aim of this workshop is to bring together researchers in the aforementioned fields to highlight current developments in both theory and methods, to exchange the latest research ideas, and to promote further collaboration in the community. We invite original research articles as well as review articles describing recent advances in mathematical modeling, computer simulation, numerical analysis, and other computational aspects of flow and transport phenomena. Potential topics include, but are not limited to:
(1)advanced numerical methods for the simulation of subsurface and surface flow and transport, and associated aspects such as discretization, gridding, upscaling, multiscale algorithms, optimization, data assimilation, uncertainty assessment, and high performance parallel and grid computing;
(2)spatial discretization schemes based on advanced finite element, finite volume, and finite difference methods; schemes that preserve local mass conservation (such as mixed finite element methods and discontinuous Galerkin methods) are of particular interest;
(3)decomposition methods for improved efficiency and accuracy in treating flow and transport problems; decomposition methods for nonlinear differential equations and dynamical systems arising in flow and transport; temporal discretization schemes for flow and transport;
(4)a-priori and a-posteriori error estimates in discretizations and decompositions; numerical convergence study; adaptive algorithms and implementation;
(5)modeling and simulation of single-phase and multi-phase flow in porous media or in free space, and its applications to earth sciences and engineering;
(6)modeling and simulation of subsurface and surface transport and geochemistry, and its application to environmental sciences and engineering;
(7)computational thermodynamics of fluids, especially hydrocarbon and other oil reservoir fluids, and its interaction with flow and transport;
(8)computational modeling of flow and transport in other fields, such as geological flow/transport in crust and mantle, material flow in supply chain networks, separation processes in chemical engineering, information flow, biotransport, and intracellular protein trafficking, will also be considered.
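As a small illustration of the locally mass-conservative discretizations mentioned above, the following sketch implements a first-order upwind finite-volume update for 1D linear advection with periodic boundaries. Because each face flux is added to one cell and subtracted from its neighbor, the cell updates telescope and total mass is conserved. Names and parameter values are illustrative.

```python
def upwind_advection(u, velocity, dx, dt, steps):
    """First-order upwind finite-volume update for 1D linear advection
    with periodic boundaries (assumes velocity >= 0)."""
    c = velocity * dt / dx
    assert 0.0 <= c <= 1.0  # CFL condition for stability
    u = list(u)
    n = len(u)
    for _ in range(steps):
        # Upwind face flux F_{j+1/2} = velocity * u_j for velocity >= 0.
        flux = [velocity * u[j] for j in range(n)]
        # Each cell gains through its left face and loses through its
        # right face; shared fluxes make the scheme locally conservative.
        u = [u[j] + (dt / dx) * (flux[j - 1] - flux[j]) for j in range(n)]
    return u

# Advect a square pulse; the total integral (mass) is unchanged.
u0 = [1.0 if 4 <= j < 8 else 0.0 for j in range(32)]
u1 = upwind_advection(u0, velocity=1.0, dx=1.0, dt=0.5, steps=16)
print(abs(sum(u1) - sum(u0)) < 1e-9)  # True: mass conserved
```

The mixed finite element and discontinuous Galerkin schemes solicited in item (2) enforce the same flux-sharing property on unstructured meshes and at higher order.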
- 47: Urgent Computing: Computations for Decision Support in Critical Situations
Contact: A.V. Boukhanovsky
Decision support in critical situations involving complex technical and environmental systems is a difficult interdisciplinary research area based on data-driven technologies and high-performance simulation and visualization facilities. The concept of urgent computing considers computational services (or resources) and data services working jointly in a distributed environment to help the decision maker choose an optimal behavior scenario under time constraints. The main topics of the workshop are:
- Methods and the principles of urgent computing.
- Urgent computing platforms and infrastructures.
- Simulation-based decision support for complex systems.
- Interactive visualization for decision support in emergency situations.
- Domain-area applications to emergency situations.
- 48: Large Scale Computational Physics
Contact: E.H.J. de Doncker
Call for Papers:
Workshop on Large Scale Computational Physics/Physical Sciences - LSCP 2013
Authors are invited to submit original contributions to LSCP 2013,
organized in conjunction with the International Conference on Computational
Science (ICCS) (see http://www.iccs-meeting.org/iccs2013) in Barcelona,
Spain, June 5-7, 2013.
Scope. LSCP focuses on symbolic and numerical methods and simulations,
algorithms and tools for developing and running large-scale computations
in the physical sciences. Special interest will be given to high numerical
precision, parallelism and scalability (massively parallel systems, GPUs,
many-integrated-core architectures, cluster and grid/cloud computing).
Topics will be chosen from areas including theoretical physics (high energy
physics, nuclear physics, astrophysics, cosmology, quantum physics,
accelerator physics), plasma physics, condensed matter physics, molecular
dynamics, biophysical system modeling, materials science/engineering,
nanotechnology, fluid dynamics, complex and turbulent systems, climate
modeling and so on.
Deadline for paper submission. January 15, 2013.
Proceedings. Accepted papers will appear in the ICCS proceedings,
published by Elsevier in the Procedia Computer Science series.
Workshop Chairs. Elise de Doncker (email@example.com);
Fukuko Yuasa (firstname.lastname@example.org).
Program Committee. M. Al-Turany, Gesellschaft für Schwerionenforschung (GSI);
D. Bailey, Lawrence Berkeley National Laboratory; T. Ishikawa, High Energy
Accelerator Research Organization (KEK); L. Maschio, Univ. degli Studi
di Torino; N. Nakasato, Univ. of Aizu; D. Perret-Gallix, Centre National de
la Recherche Scientifique (CNRS); J. Vermaseren, Theoretical Physics NIKHEF.
- 52: Solving Problems with Uncertainties
Contact: V.N. Alexandrov
Problems with uncertainty need to be tackled in an increasing variety of areas, ranging from physics, chemistry and computational biology to decision making in economics and the social sciences. Uncertainty is unavoidable in almost all systems analysis, in risk analysis for decision making, in economic and financial modeling, and in weather, pollution and disaster modeling and simulation (earthquake modeling, forest fire simulation, etc.). How uncertainty is handled and quantified shapes the integrity of the analysis and the correctness and credibility of the results. With the advent of exascale computing, ever larger problems must be tackled in a systematic way, and solving them while quantifying their uncertainties becomes even more important due to the variety and scale of the uncertainties involved. The workshop will focus on methods and algorithms for solving problems with uncertainties; stochastic methods and algorithms for such problems; methods and algorithms for quantifying uncertainties, such as dealing with uncertain or missing input data, local and global sensitivity analysis, model inadequacy, and model validation and averaging; and software fault tolerance and resilience.
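To illustrate the kind of stochastic technique in scope, the sketch below propagates input uncertainty through a model by Monte Carlo sampling. The model, the input distributions, and the function names are all hypothetical, chosen only to make the idea concrete; real submissions would target far larger simulations.

```python
# Minimal sketch of Monte Carlo uncertainty propagation (illustrative only).
import random
import statistics

def model(x, y):
    # Toy stand-in for an expensive simulation code.
    return x * x + 0.5 * y

def propagate(n_samples=10000, seed=42):
    """Sample uncertain inputs, run the model, and summarize the output."""
    rng = random.Random(seed)
    outputs = []
    for _ in range(n_samples):
        # Assumed input uncertainties: x ~ N(1, 0.1), y ~ N(2, 0.2).
        x = rng.gauss(1.0, 0.1)
        y = rng.gauss(2.0, 0.2)
        outputs.append(model(x, y))
    # Output mean is analytically about 2.01 for these distributions.
    return statistics.mean(outputs), statistics.stdev(outputs)

mean, std = propagate()
```

The output standard deviation is one simple way to quantify how input uncertainty maps to result uncertainty; sensitivity analysis and more sophisticated quantification methods listed above refine this basic picture.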
- 53: Fourth Workshop on Data Mining in Earth System Science (DMESS 2013)
Contact: F.M. Hoffman
Spanning many orders of magnitude in time and space scales, Earth science data are increasingly large and complex, and often represent very long time series, making such data difficult to analyze, visualize, interpret, and understand. Moreover, advanced electronic data storage technologies have enabled the creation of large repositories of observational data, while modern high performance computing capacity has enabled the creation of detailed empirical and process-based models that produce copious output across all these time and space scales. The resulting “explosion” of heterogeneous, multi-disciplinary Earth science data have rendered traditional means of integration and analysis ineffective, necessitating the application of new analysis methods and the development of highly scalable software tools for synthesis, assimilation, comparison, and visualization. This workshop explores various data mining approaches to understanding Earth science processes, emphasizing the unique technological challenges associated with utilizing very large and long time series geospatial data sets. Especially encouraged are original research papers describing applications of statistical and data mining methods—including cluster analysis, empirical orthogonal functions (EOFs), genetic algorithms, neural networks, automated data assimilation, and other machine learning techniques—that support analysis and discovery in climate, water resources, geology, ecology, and environmental sciences research.
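As a toy illustration of one of the data mining methods named above, the sketch below implements k-means cluster analysis on a handful of 2-D "observations". The data and parameters are invented for demonstration; real Earth science applications operate on very large geospatial time series.

```python
# Minimal k-means clustering sketch (illustrative only).
import random

def kmeans(points, k, iters=20, seed=0):
    """Cluster 2-D points into k groups; returns the final cluster centers."""
    rng = random.Random(seed)
    centers = rng.sample(points, k)  # initialize centers from the data
    for _ in range(iters):
        # Assign each point to its nearest center.
        clusters = [[] for _ in range(k)]
        for p in points:
            i = min(range(k),
                    key=lambda j: (p[0] - centers[j][0]) ** 2
                                + (p[1] - centers[j][1]) ** 2)
            clusters[i].append(p)
        # Recompute each center as the mean of its assigned points.
        for j, c in enumerate(clusters):
            if c:
                centers[j] = (sum(p[0] for p in c) / len(c),
                              sum(p[1] for p in c) / len(c))
    return centers

# Two well-separated groups of toy observations.
pts = [(0.1, 0.0), (0.0, 0.2), (0.2, 0.1),
       (5.0, 5.1), (5.2, 4.9), (4.9, 5.0)]
centers = sorted(kmeans(pts, 2))
```

The same assign-and-update pattern underlies many of the scalable clustering tools used on large geospatial data sets, where the challenge is distributing the assignment step across many processors.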
- 62: Poster Session
Contact: M.H. Lees
- 63: Second International Young Scientists Conference 2013 “HPC technologies and computer modeling” (YSC 2013)
Contact: A.V. Boukhanovsky