Session 5, 14:20 - 16:00, 13th June 2018

Chair: Maciej Paszynski

245 Isogeometric Residual Minimization Method (iGRM) with Direction Splitting for Time-Dependent Advection-Diffusion Problems [abstract]Abstract: We propose a novel computational implicit method called Isogeometric Residual Minimization (iGRM) with direction splitting. The method combines the benefits of isogeometric analysis, implicit dynamics, residual minimization, and an alternating direction solver. We utilize tensor-product B-spline basis functions in space, implicit second-order time integration schemes, and residual minimization at every time step. We then apply, for each spatial direction, a stabilized mixed method based on residual minimization. Finally, we show that the resulting system of linear equations has a Kronecker product structure, which enables a linear-computational-cost alternating direction solver, even with implicit time integration schemes and the stabilized mixed formulation. We test the proposed method on three advection-diffusion examples: a model "membrane" problem, the circular wind problem, and simulations modelling pollution propagating from a chimney. Judit Muñoz-Matute, Marcin Los, Ignacio Muga and Maciej Paszynski

328 Augmenting Multi-Agent Negotiation in Interconnected Freight Transport Using Complex Networks Analysis [abstract]Abstract: This paper proposes the use of computational methods of Complex Networks Analysis to augment the capabilities of a broker involved in multi-agent freight transport negotiation. We have developed an experimentation environment that provides compelling arguments that, using our proposed approach, the broker is able to apply more effective negotiation strategies for gaining longer-term benefits than those offered by the standard Iterated Contract Net negotiation approach.
The proposed negotiation strategies affect the entire population of bidding agents and are driven by market-inspired purposes such as breaking monopolies and supporting agents with diverse transportation capabilities. Alex Becheru and Costin Badica

358 Security-Aware Distributed Job Scheduling in Cloud Computing Systems: A Game-Theoretic Cellular Automata-based Approach [abstract]Abstract: We consider the problem of security-aware scheduling and load balancing in Cloud Computing systems. We recast this optimization problem in a game-theoretic setting, where players seek a solution by reaching a Nash equilibrium. We propose a fully distributed algorithm based on an iterated spatial Prisoner's Dilemma game and the phenomenon of collective behavior of the participating players. Brokers representing users participate in the game to fulfill their own two criteria: the execution time of the submitted tasks and the level of provided security assurance. We show experimentally that the game converges to a solution which provides optimal resource utilization while users meet their applications' performance and security requirements with minimal expenditure and overhead. Jakub Gasior and Franciszek Seredynski

402 Residual minimization for isogeometric analysis in reduced and mixed forms [abstract]Abstract: Most variational forms of isogeometric analysis use highly-continuous basis functions for both trial and test spaces. For a partial differential equation with a smooth solution, isogeometric analysis with highly-continuous basis functions for the trial space results in excellent discrete approximations of the solution. However, we observe that high continuity for test spaces is not necessary. In this work, we present a framework which uses highly-continuous B-splines for the trial spaces and basis functions with minimal regularity, and possibly lower-order polynomials, for the test spaces.
To realize this goal, we adopt the residual minimization methodology. We pose the problem in a mixed formulation, which results in a system governing both the solution and a Riesz representation of the residual. We present various variationally-stable formulations and verify their equivalence numerically. Victor Calo, Quanling Deng, Sergio Rojas and Albert Romkes
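The Kronecker-product structure exploited by the iGRM direction-splitting paper in this session can be illustrated with a minimal sketch (hypothetical matrices `Ax`, `Ay`, not the authors' implementation): a 2D system `(Ax kron Ay) vec(U) = vec(F)` splits into independent 1D solves along each direction, so the full multi-dimensional matrix never needs to be formed.

```python
import numpy as np

def kron_solve(Ax, Ay, F):
    """Solve (Ax kron Ay) vec(U) = vec(F), with column-major vec.

    Uses the identity (Ax kron Ay) vec(U) = vec(Ay @ U @ Ax.T),
    so the 2D problem reduces to two sweeps of 1D solves:
    first along the y-direction, then along the x-direction.
    """
    Z = np.linalg.solve(Ay, F)       # y-direction sweep: Ay Z = F
    U = np.linalg.solve(Ax, Z.T).T   # x-direction sweep: U Ax^T = Z
    return U
```

In the advection-diffusion setting, `Ax` and `Ay` would be banded 1D B-spline matrices, so each sweep costs time linear in the number of unknowns; the dense `np.linalg.solve` here only stands in for such a banded solver.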

Chair: Derek Groen

465 Introducing VECMAtk - verification, validation and uncertainty quantification for multiscale and HPC simulations [abstract]Abstract: Multiscale simulations are an essential computational tool in a range of research disciplines, and provide unprecedented levels of scientific insight at a tractable cost in terms of effort and compute resources. To provide this, we need such simulations to produce results that are both robust and actionable. The VECMA toolkit (VECMAtk), which is officially released in conjunction with the present paper, establishes a platform to achieve this by exposing patterns for verification, validation and uncertainty quantification (VVUQ). These patterns can be combined to capture complex scenarios, applied to applications in disparate domains, and used to run multiscale simulations on any desktop, cluster or supercomputing platform. Derek Groen, Robin Richardson, David Wright, Vytautas Jancauskas, Robert Sinclair, Paul Karlshoefer, Maxime Vassaux, Hamid Arabnejad, Tomasz Piontek, Piotr Kopta, Bartosz Bosak, Jalal Lakhlili, Olivier Hoenen, Diana Suleimenova, Wouter Edeling, Daan Crommelin, Anna Nikishova and Peter Coveney

350 EasyVVUQ: Building trust in simulation [abstract]Abstract: Modelling and simulation are increasingly well-established techniques in a wide range of academic and industrial domains. As their use becomes increasingly important, it is vital that we understand both their sensitivity to inputs and how much confidence we should have in their results. Nonetheless, few simulations are reported with rigorous validation (V) and verification (V), or even meaningful error bars (uncertainty quantification - UQ). EasyVVUQ is a Python library designed to allow the integration of non-intrusive VVUQ techniques into existing simulation workflows.
Our aim is to provide the basis for tools which wrap around applications, allowing the user to specify the scientifically interesting parameters of the model and the type of VVUQ algorithm they wish to apply, while the details of the setup and analysis are abstracted away. To this end, we have designed JSON-based input formats that provide a human-readable and comprehensible interface to the code. The EasyVVUQ framework is based on the concept of a Campaign of simulations, the inputs of which are generated by a range of sampling algorithms. This Campaign is executed externally to the library, but the results are processed, aggregated and analyzed within it. EasyVVUQ provides simple templating features that facilitate mapping between scientific parameters and input options and files for a wide range of applications out of the box. Furthermore, our design allows simple customization of both the input generation and the extraction of relevant data from simulation outputs by expert users and developers. We present its use in three example multiscale applications from the VECMA project: protein-ligand binding affinity calculations, coupled molecular dynamics and finite element materials modelling, and fusion. David Wright, Robin Richardson and Peter Coveney

391 Uncertainty quantification in multiscale simulations applied to fusion plasmas [abstract]Abstract: In order to predict the overall performance of a thermonuclear fusion device, an understanding of how microscale turbulence affects the global transport of the plasma is essential. A multiscale component-based fusion simulation was designed by coupling together several single-scale physics models into a workflow comprising a transport code, an equilibrium code and a turbulence code.
While previous simulations using such a workflow showed promising results on propagating turbulent effects to the overall plasma transport [1], the profiles of densities and temperatures simulated by the transport model carry uncertainties that have yet to be quantified. The turbulence code provides transport coefficients that are inherently noisy. These coefficients are propagated through the transport code and produce an uncertainty interval in the calculated profiles, which is in turn used in the equilibrium and turbulence codes to calculate new uncertainty intervals. Our goal is therefore to study how these uncertainties propagate through the workflow, so that we can draw quantitative comparisons between numerical and experimental results. In this context, we are developing tools based on a non-intrusive polynomial chaos expansion (PCE) [2]. Each sub-model is treated as a black box to which the PCE method is applied. Several statistical metrics are then derived directly from the polynomial expansion, yielding the uncertainty quantification (UQ) and the parameter sensitivity of the multiscale model. References: [1] O.O. Luk, O. Hoenen, A. Bottino, B.D. Scott, D.P. Coster, ComPat framework for multiscale simulations applied to fusion plasmas, Computer Physics Communications (2019), https://doi.org/10.1016/j.cpc.2018.12.021. [2] R. Preuss, U. von Toussaint, Uncertainty quantification in ion–solid interaction simulations, Nuclear Instruments and Methods in Physics Research Section B (2017), https://doi.org/10.1016/j.nimb.2016.10.033. Jalal Lakhlili, David Coster, Olivier Hoenen, Onnie Luk, Roland Preuss and Udo von Toussaint

293 Analysis of Uncertainty of an In-Stent Restenosis Model [abstract]Abstract: Uncertainty and sensitivity analysis provides insight into how uncertainty in the model inputs affects the model response [1, 2].
Usually, methods for such analysis are computationally expensive and may require high-performance computing resources. In [3], we perform uncertainty quantification by applying the quasi-Monte Carlo method to a two-dimensional version of an in-stent restenosis model (ISR2D) [4]. Additionally, in [5], we improve the efficiency of the uncertainty estimation by applying the semi-intrusive multiscale method [6]. We observe approximately 30% uncertainty in the mean neointimal area as simulated by the ISR2D model. Depending on whether a fast initial endothelium recovery occurs, the proportion of the model variance due to natural variability ranges from 15% to 35%. The endothelium regeneration time is identified as the most influential model parameter. The model output contains a moderate quantity of uncertainty, and the model precision can be increased by obtaining a more precise estimate of the endothelium regeneration time. The results obtained by the semi-intrusive method show a good match to those obtained by the black-box quasi-Monte Carlo method (see Fig. 1). Moreover, we significantly reduce the computational cost of the uncertainty estimation. We conclude that the semi-intrusive metamodeling method is reliable and efficient, and can be applied to complex models such as the ISR2D model. Anna Nikishova, Lourens Veen, Pavel Zun and Alfons Hoekstra

100 Creating a reusable cross-disciplinary multi-scale and multi-physics framework: from AMUSE to OMUSE and beyond [abstract]Abstract: We describe our efforts to create a multi-scale and multi-physics framework that can be retargeted across different disciplines. Currently, we have implemented our approach in the astrophysical domain, for which we developed AMUSE, and generalized it to the oceanographic and climate sciences, which led to the development of OMUSE.
The objective of this paper is to document the design choices that led to the successful implementation of these frameworks as well as the future challenges in applying this approach to other domains. Federico Inti Pelupessy, Simon Portegies Zwart, Arjen van Elteren, Henk Dijkstra, Fredrik Jansson, Daan Crommelin, Pier Siebesma, Ben van Werkhoven and Gijs van den Oord
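The non-intrusive polynomial chaos expansion used in the fusion UQ paper in this session can be sketched in a few lines (a minimal illustration, not the authors' tooling; the function name `pce_moments` and its arguments are hypothetical): a black-box model with a Gaussian input is projected onto probabilists' Hermite polynomials via Gauss-Hermite quadrature, and the output mean and variance are read off the expansion coefficients.

```python
import math
import numpy as np
from numpy.polynomial.hermite_e import hermegauss, hermeval

def pce_moments(model, mu, sigma, degree=4):
    """Non-intrusive PCE of model(X) for X ~ N(mu, sigma^2).

    The output is expanded in probabilists' Hermite polynomials
    He_k of the standard normal Z, with X = mu + sigma * Z.
    Coefficients are computed by quadrature, so the model is
    treated purely as a black box (no access to its internals).
    """
    nodes, weights = hermegauss(degree + 1)   # weight exp(-z^2 / 2)
    weights = weights / np.sqrt(2.0 * np.pi)  # normalize to the N(0,1) measure
    y = np.array([model(mu + sigma * z) for z in nodes])
    coeffs = []
    for k in range(degree + 1):
        ck = np.zeros(k + 1)
        ck[k] = 1.0
        Hk = hermeval(nodes, ck)              # He_k evaluated at the nodes
        # Projection: c_k = E[y * He_k] / E[He_k^2], with E[He_k^2] = k!
        coeffs.append(np.sum(weights * y * Hk) / math.factorial(k))
    mean = coeffs[0]
    variance = sum(math.factorial(k) * coeffs[k] ** 2
                   for k in range(1, degree + 1))
    return mean, variance
```

Sensitivity indices for multi-parameter models follow the same pattern, grouping squared coefficients by input variable; that multivariate bookkeeping is omitted here for brevity.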

Chair: Xin-She Yang

437 Comparison of Constraint-Handling Techniques for Metaheuristic Optimization [abstract]Abstract: Most engineering design problems have highly nonlinear constraints, and the proper handling of such constraints can be important to ensure solution quality. There are many different ways of handling constraints and many different optimization algorithms, which makes the choice difficult for users. This paper compares six constraint-handling techniques, including penalty methods, barrier functions, the $\epsilon$-constrained method, feasibility criteria and stochastic ranking. The pressure vessel design problem is solved by the flower pollination algorithm, and the results show that stochastic ranking and the $\epsilon$-constrained method are the most effective for this type of design optimization. Xing-Shi He, Qin-Wei Fan and Xin-She Yang

231 Dynamic Partitioning of Evolving Graph Streams using Nature-inspired Heuristics [abstract]Abstract: Detecting communities of interconnected nodes is a frequently addressed problem in situations that can be modeled as a graph, a common practical example being social networks. However, detecting an optimal partition of a network is an extremely complex and highly time-consuming task, so the development and application of meta-heuristic solvers emerges as a promising alternative. The research presented in this paper deals with the optimal partitioning of graph instances in the special case in which connections among nodes change dynamically along the time horizon. This class of networks is less addressed in the literature than its counterparts. To efficiently solve this problem, we have modeled and implemented a set of meta-heuristic solvers, all of them inspired by different processes and phenomena observed in Nature. Concretely, the considered approaches are the Water Cycle Algorithm, the Bat Algorithm, the Firefly Algorithm and Particle Swarm Optimization.
All these methods have been adapted to properly deal with this discrete and dynamic problem, using a reformulated expression of the well-known modularity formula as the fitness function. A thorough experimentation has been carried out over a set of 12 synthetically generated dynamic graph instances, with the main goal of concluding which of the aforementioned solvers is the most appropriate for this challenging problem. Statistical tests conducted on the obtained results rigorously conclude that the Bat Algorithm and the Firefly Algorithm outperform the remaining methods in terms of Normalized Mutual Information with respect to the true partition of the graph. Eneko Osaba, Miren Nekane Bilbao, Andres Iglesias, Javier Del Ser, Akemi Galvez-Tomida, Iztok Jr. Fister and Iztok Fister

317 Bat Algorithm for Kernel Computation in Fractal Image Reconstruction [abstract]Abstract: Computer reconstruction of digital images is an important problem in many areas such as image processing, computer vision, medical imaging, sensor systems and robotics. A very popular approach in that regard is the use of different kernels for various morphological image processing operations such as dilation, erosion, blurring and sharpening. In this paper, we extend this idea to the reconstruction of digital fractal images. Our proposal is based on a new affine kernel particularly tailored for fractal images. The kernel computes the difference between the source and the reconstructed fractal images, leading to a difficult nonlinear constrained continuous optimization problem, which we solve using a powerful nature-inspired metaheuristic for global optimization called the bat algorithm. An illustrative example is used to analyze the performance of this approach. Our experiments show that the method performs quite well, but there is also room for further improvement.
We conclude that this approach is promising and could be a very useful technique for efficient fractal image reconstruction. Akemi Galvez-Tomida, Eneko Osaba, Javier Del Ser and Andres Iglesias Prieto

107 Heuristic Rules for Coordinated Resources Allocation and Optimization in Distributed Computing [abstract]Abstract: In this paper, we consider heuristic rules for resource utilization optimization in distributed computing environments. Existing job-flow execution mechanics impose many restrictions on resource allocation procedures. Grid, cloud and hybrid computing services operate in heterogeneous and usually geographically distributed computing environments. Emerging virtual organizations and incorporated economic models allow users and resource owners to compete for suitable allocations based on market principles and fair scheduling policies. Subject to these features, we propose a set of heuristic rules for coordinated compact scheduling that select resources depending on how well they fit a particular job's execution requirements. A dedicated simulation experiment studies the optimization of integral job-flow characteristics when these rules are applied to a conservative backfilling scheduling procedure. Victor Toporkov and Dmitry Yemelyanov

37 Nonsmooth Newton’s Method: Some Structure Exploitation [abstract]Abstract: We investigate real asymmetric linear systems arising in the generation of search directions in a nonsmooth Newton’s method. This applies to constrained optimisation problems via a reformulation of the necessary conditions into an equivalent nonlinear and nonsmooth system of equations. We propose a strategy to exploit the problem structure. First, based on the sub-blocks of the original matrix, some variables are selected and ruled out for a posteriori recovery; then, a smaller and symmetric linear system is generated; eventually, from the solution of the latter, the remaining variables are obtained.
We prove the method is applicable if the original linear system is well-posed. We propose and discuss different selection strategies. Finally, numerical examples are presented to compare this method with the direct approach without structure exploitation, for full and sparse matrices, over a wide range of problem sizes. Alberto De Marchi and Matthias Gerdts
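The elimination-and-recovery strategy in the nonsmooth Newton paper in this session follows the general pattern of block elimination via a Schur complement. A generic sketch under that assumption (hypothetical block names; the paper's variable-selection strategies are more refined than simply taking the trailing block):

```python
import numpy as np

def block_eliminate(A, B, C, D, b1, b2):
    """Solve [[A, B], [C, D]] [x1; x2] = [b1; b2] by elimination.

    The x2 variables are ruled out first: a smaller system in the
    Schur complement S = A - B D^{-1} C is solved for x1, and x2 is
    recovered a posteriori. Assumes the D block is invertible.
    """
    Dinv_C = np.linalg.solve(D, C)
    Dinv_b2 = np.linalg.solve(D, b2)
    S = A - B @ Dinv_C                         # smaller reduced system
    x1 = np.linalg.solve(S, b1 - B @ Dinv_b2)  # solve for kept variables
    x2 = Dinv_b2 - Dinv_C @ x1                 # recover eliminated ones
    return x1, x2
```

When the selection makes the reduced matrix S symmetric, as in the paper's setting, the generic solve for x1 can be replaced by a symmetric factorization, which is the source of the computational savings.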