Solving Problems with Uncertainties (SPU) Session 1

Time and Date: 16:30 - 18:10 on 13th June 2019

Room: 0.6

Chair: Vassil Alexandrov

14 Path-Finding with a Full-Vectorized GPU Implementation of Evolutionary Algorithms in an Online Crowd Model Simulation Framework [abstract]
Abstract: This article introduces a path-finding method based on evolutionary algorithms. It extends current work on this problem by providing a path-finding algorithm and a GPU-based parallel implementation of it. The article describes both the GPU implementation of full-vectorized genetic algorithms and a path-finding method for large maps based on dynamic tiling. Owing to its performance, the approach is able to serve a large number of agents and can handle dynamic obstacles in maps of arbitrary size. The experiments show that the proposed approach outperforms traditional path-finding algorithms such as breadth-first search, Dijkstra’s algorithm, and A*. The conclusions discuss further improvements to the proposed approach, such as the application of multi-objective algorithms to represent full crowd models.
Anton Aguilar-Rivera
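
Illustrative note for paper 14 above: the sketch below is a minimal NumPy example of one fully vectorized genetic-algorithm generation (tournament selection, one-point crossover, bit-flip mutation) applied to a whole population at once. It is an assumption for illustration only, not the authors' GPU implementation; the fitness function, population size, and genome encoding are placeholders.

    import numpy as np

    rng = np.random.default_rng(0)
    POP, GENES = 256, 32          # placeholder population size and genome length

    def fitness(pop):
        # placeholder fitness; a real path-finding fitness would score encoded paths
        return -np.abs(pop.sum(axis=1) - GENES / 2)

    pop = rng.integers(0, 2, size=(POP, GENES))          # binary genomes

    def step(pop):
        fit = fitness(pop)
        # vectorized binary tournament selection
        a, b = rng.integers(0, POP, size=(2, POP))
        parents = np.where((fit[a] > fit[b])[:, None], pop[a], pop[b])
        # vectorized one-point crossover between consecutive parents
        cut = rng.integers(1, GENES, size=POP)[:, None]
        mask = np.arange(GENES)[None, :] < cut
        children = np.where(mask, parents, np.roll(parents, 1, axis=0))
        # vectorized bit-flip mutation
        flip = rng.random((POP, GENES)) < 0.01
        return np.where(flip, 1 - children, children)

    for _ in range(50):
        pop = step(pop)

The same array-wide operations map naturally onto a GPU array library, which is the general idea behind a "full-vectorized" GA.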
44 Analysing the trade-off between computational performance and representation richness in ontology-based systems [abstract]
Abstract: As the result of intense research activity over the past decade, Semantic Web technology has achieved notable popularity and maturity. This technology is leading the evolution of the Web towards interoperability by providing structured metadata. Because rich data models are adopted on a large scale to support the representation of complex relationships among concepts and automatic reasoning, the computational performance of ontology-based systems can vary significantly. A number of critical factors should be considered when evaluating such performance. Within this paper, we provide an empirical framework that yields an extensive analysis of the computational performance of ontology-based systems. The analysis can be seen as a decision tool for managing the constraints of representational requirements versus reasoning performance. Our approach adopts synthetic ontologies characterised by an increasing level of complexity up to OWL 2 DL. The benefits and the limitations of this approach are discussed in the paper.
Salvatore Flavio Pileggi, Fabian Peña, Maria Del Pilar Villamil and Ghassan Beydoun
92 Assessing uncertainties of unrepresented heterogeneity in soil hydrology using data assimilation [abstract]
Abstract: Soil hydrology is a discipline of environmental physics exhibiting considerable model errors in all its processes. Soil water movement is a key ecosystem process, with a crucial role in services like water buffering, fresh water retention, and climate regulation. The soil hydraulic properties as well as the multi-scale soil architecture are hardly ever known with sufficient accuracy. In interplay with a highly non-linear process described by the Richards equation, this yields significant prediction uncertainties. Data assimilation is a recent approach for coping with the challenges of quantitative soil hydrology. The ensemble Kalman filter (EnKF) is a method that can handle model errors for non-linear processes. This enables estimation of the system state and trajectory, soil hydraulic parameters, and small-scale soil heterogeneities at measurement locations. Uncertainties in all estimated compartments can be incorporated and quantified. However, as measurements are typically scarce, estimation of high-resolution heterogeneity fields remains challenging. Relevant spatial scales for soil water movement range from less than a meter to kilometers. Accurately representing soil heterogeneities in models at all scales is exceptionally difficult. We investigate this issue on the small scale, where we model a two-dimensional domain with prescribed heterogeneity and conduct synthetic observations in typical measurement configurations. The EnKF is applied to estimate a one-dimensional soil profile including heterogeneities. We assess the capability of the method to cope with the effects of unrepresented heterogeneity by analyzing the discrepancy between the synthetic 2D and the estimated 1D representation.
Lukas Riedel, Hannes Helmut Bauser and Kurt Roth
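
Illustrative note for paper 92 above: for readers unfamiliar with the ensemble Kalman filter, here is a minimal NumPy sketch of a single stochastic EnKF analysis step for a generic state vector. The observation operator, ensemble size, and error covariance are illustrative assumptions and do not reproduce the authors' soil-hydrology setup.

    import numpy as np

    rng = np.random.default_rng(1)
    N_ENS, N_STATE, N_OBS = 50, 20, 5                      # assumed ensemble/state/obs sizes
    R = 0.01 * np.eye(N_OBS)                               # assumed observation error covariance

    H = np.zeros((N_OBS, N_STATE))                         # observe every 4th state component
    H[np.arange(N_OBS), np.arange(0, N_STATE, 4)] = 1.0

    ens = rng.normal(0.3, 0.05, size=(N_STATE, N_ENS))     # forecast ensemble (e.g. water content)
    y = rng.normal(0.25, 0.1, size=N_OBS)                  # synthetic observation vector

    def enkf_analysis(ens, y):
        X = ens - ens.mean(axis=1, keepdims=True)          # state anomalies
        Y = H @ X                                          # observation-space anomalies
        P_yy = Y @ Y.T / (N_ENS - 1) + R                   # innovation covariance
        P_xy = X @ Y.T / (N_ENS - 1)                       # state-observation cross covariance
        K = P_xy @ np.linalg.inv(P_yy)                     # Kalman gain
        # perturbed observations (stochastic EnKF variant)
        y_pert = y[:, None] + rng.multivariate_normal(np.zeros(N_OBS), R, size=N_ENS).T
        return ens + K @ (y_pert - H @ ens)

    analysis = enkf_analysis(ens, y)

In the paper's setting the state vector would additionally carry soil hydraulic parameters and small-scale heterogeneities, which the same update formula then estimates jointly with the state.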
119 A Framework for Distributed Approximation of Moments with Higher-Order Derivatives through Automatic Differentiation [abstract]
Abstract: We present a framework for the distributed approximation of moments, enabling an online evaluation of the uncertainty in a dynamical system. The first and second moments, mean and variance, are computed with up to third-order Taylor series expansion. The required derivatives for the expansion are generated automatically by automatic differentiation and propagated through an implicit time stepper. The computational kernels are the accumulation of the derivatives (Jacobian, Hessian, tensor) and the covariance matrix. We apply distributed parallelism to the Hessian or third-order tensor, and the user merely has to provide a function for the differential equation, thus achieving similar ease of use as Monte Carlo-based methods. We demonstrate our approach with benchmarks on Theta, a KNL-based system at the Argonne Leadership Computing Facility.
Michel Schanen, Daniel Adrian Maldonado and Mihai Anitescu
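
Illustrative note for paper 119 above: the sketch below, using JAX for automatic differentiation, shows the standard second-order Taylor approximation of the mean and first-order approximation of the variance of a scalar function of a Gaussian input. The test function and covariance are arbitrary placeholders, and the sketch omits the third-order terms, the implicit time stepper, and the distributed accumulation described in the paper.

    import jax.numpy as jnp
    from jax import grad, hessian

    def f(x):                                   # placeholder nonlinear model output
        return jnp.sin(x[0]) * x[1] + 0.5 * x[1] ** 2

    mu = jnp.array([0.3, 1.0])                  # input mean
    Sigma = jnp.array([[0.04, 0.01],            # input covariance (assumed)
                       [0.01, 0.09]])

    g = grad(f)(mu)                             # first derivatives via AD
    H = hessian(f)(mu)                          # second derivatives via AD

    mean_2nd = f(mu) + 0.5 * jnp.trace(H @ Sigma)   # E[f] ~ f(mu) + 1/2 tr(H Sigma)
    var_1st = g @ Sigma @ g                         # Var[f] ~ g^T Sigma g

    print(float(mean_2nd), float(var_1st))

Higher-order expansions of the same kind require third-order derivative tensors, which is where the distributed accumulation in the paper comes in.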
191 IPIES for Uncertainly Defined Shape of Boundary, Boundary Conditions and Other Parameters in Elasticity Problems [abstract]
Abstract: The main purpose of this paper is modelling and solving boundary value problems while simultaneously considering the uncertainty of all input data. These data include the shape of the boundary, the boundary conditions, and other parameters. The strategy is presented on problems described by the Navier-Lamé equations; here, the uncertainty of parameters means the uncertainty of the Poisson's ratio and Young's modulus. For solving uncertainly defined problems we use the interval parametric integral equations system (IPIES) method, in which we propose a modification of directed interval arithmetic for modelling and solving uncertainly defined problems. We consider examples of uncertainly defined 2D elasticity problems. We present boundary value problems with both linear and curvilinear boundary shapes (the latter modelled using NURBS curves). We verify the obtained interval solutions by comparing them with precisely defined (uncertainty-free) analytical solutions. Additionally, the total differential method is used to estimate the errors of such solutions. We also analyze the influence of input data uncertainty on the interval solutions.
Marta Kapturczak and Eugeniusz Zieniuk
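
Illustrative note for paper 191 above: as background, the sketch below applies classical interval arithmetic to an uncertain Young's modulus and Poisson's ratio and propagates them through a toy formula (the shear modulus). It uses plain, not directed, intervals, so it only illustrates how parameter uncertainty propagates, not the IPIES method itself.

    from dataclasses import dataclass

    @dataclass
    class Interval:
        lo: float
        hi: float
        def __add__(self, o):
            return Interval(self.lo + o.lo, self.hi + o.hi)
        def __mul__(self, o):
            p = [self.lo * o.lo, self.lo * o.hi, self.hi * o.lo, self.hi * o.hi]
            return Interval(min(p), max(p))
        def __truediv__(self, o):                  # assumes 0 is not contained in o
            p = [self.lo / o.lo, self.lo / o.hi, self.hi / o.lo, self.hi / o.hi]
            return Interval(min(p), max(p))

    E  = Interval(200e9, 210e9)                    # uncertain Young's modulus [Pa]
    nu = Interval(0.28, 0.32)                      # uncertain Poisson's ratio

    one, two = Interval(1.0, 1.0), Interval(2.0, 2.0)
    G = E / (two * (one + nu))                     # interval shear modulus G = E / (2(1+nu))
    print(G)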

Solving Problems with Uncertainties (SPU) Session 2

Time and Date: 10:15 - 11:55 on 14th June 2019

Room: 0.6

Chair: Vassil Alexandrov

326 Enabling UQ for complex modelling workflows [abstract]
Abstract: The increase in computing capabilities promises to address many scientific and engineering problems by enabling simulations to reach new levels of accuracy and scale. The field of uncertainty quantification (UQ) has recently been receiving an increasing amount of attention, as it enables reliability studies of modelled systems. However, performing UQ analysis for high-fidelity simulations remains challenging due to the exceedingly high complexity of the computational workflows. In this paper, we present a UQ study on a complex workflow targeting a thermally stratified flow. We discuss different models that can be used to enable it. We then propose an abstraction at the level of the workflow specification that enables the modeller to quickly switch between UQ models and manage the underlying compute infrastructure in a completely transparent way. We show that we can keep the workflow description almost unchanged while benefitting from all the insight the UQ study provides.
Malgorzata Zimon, Samuel Antao, Robert Sawko, Alex Skillen and Vadim Elisseev
340 Ternary-Decimal Exclusion Algorithm for Multiattribute Utility Functions [abstract]
Abstract: We propose methods to eliminate redundant utility assessments in decision analysis applications. We abstract a set of utility assessments so that the set is represented as a matrix of ternary arrays. To achieve efficiency, the arrays are converted to decimal numbers for further processing. The resulting approach demonstrates excellent performance on random sets of utility assessments. The method eliminates redundant questions for the decision maker and can serve as a consistency check.
Yerkin Abdildin
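
Illustrative note for paper 340 above: the abstract states only that utility assessments are abstracted as ternary arrays and converted to decimal numbers for efficient processing. The sketch below is an assumed illustration of that conversion and of removing exact duplicates via the resulting integer keys; it is not the full exclusion algorithm from the paper.

    def ternary_to_decimal(arr):
        """Interpret a list of ternary digits (0, 1, 2) as a base-3 integer."""
        value = 0
        for digit in arr:
            value = value * 3 + digit
        return value

    # assumed encoding: each assessment over 4 attributes is a ternary array
    assessments = [[1, 0, 2, 1], [0, 2, 1, 1], [1, 0, 2, 1], [2, 2, 0, 1]]

    seen, unique = set(), []
    for a in assessments:
        code = ternary_to_decimal(a)          # one integer key per assessment
        if code not in seen:                  # drop exact duplicates cheaply
            seen.add(code)
            unique.append(a)
    print(unique)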
341 Sums of Key Functions Generating a Cryptosystem [abstract]
Abstract: In this paper, we propose an algorithm for designing a cryptosystem in which derivative disproportion functions are used. The symbols to be transmitted are encoded with the sum of at least two of these functions combined with random coefficients. A new algorithm is proposed for decoding the received messages by making use of important properties of the derivative disproportion functions. Numerical experiments demonstrate the algorithm's reliability and robustness.
Viacheslav Kalashnikov, Viktor V. Avramenko and Nataliya Kalashnykova
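
Illustrative note for paper 341 above: the abstract specifies only that each transmitted symbol is encoded as a sum of at least two key functions weighted by random coefficients. The sketch below is a heavily hedged illustration of that encoding step with arbitrary placeholder key functions (sine and a decaying exponential); how symbols are mapped to coefficients and how decoding exploits the derivative disproportion properties is specific to the paper and not reproduced here.

    import numpy as np

    rng = np.random.default_rng(2)
    t = np.linspace(0.0, 1.0, 200)

    # placeholder key functions; the paper uses derivative disproportion functions
    key_funcs = [np.sin, lambda x: np.exp(-x)]

    def encode(a, b):
        """Encode one symbol as the sum of two key functions with coefficients a, b."""
        return a * key_funcs[0](t) + b * key_funcs[1](t)

    a, b = rng.uniform(0.5, 2.0, size=2)      # fresh random coefficients per transmission
    signal = encode(a, b)                     # transmitted waveform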
372 Consistent Conjectures in Globalization Problems [abstract]
Abstract: We study the effects of merging two separate markets, each originally monopolized by a producer, into a globalized duopoly market. We consider a linear inverse demand with a cap price and quadratic cost functions. After globalization, we find the consistent conjectural variations equilibrium (CCVE) of the duopoly game. Unlike in the Cournot equilibrium, complete symmetry (identical cost function parameters for both firms) does not imply the strongest coincident profit degradation. For the situation where both agents are low-marginal-cost firms, we find that the company with a technical advantage over her rival has a better ratio of current to previous profits. Moreover, as the rival becomes ever weaker, that is, as the slope of the rival's marginal cost function increases, the profit ratio improves.
Viacheslav Kalashnikov, Mariel A. Leal-Coronado, Arturo García-Martínez and Nataliya Kalashnykova
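
Illustrative note for paper 372 above: for readers outside industrial organization, the LaTeX fragment below sketches the standard conjectural-variations setup the abstract refers to, with linear inverse demand and quadratic costs. The symbols a, b, c_i, d_i and the conjectured slopes w_i are generic placeholders; the exact consistency (CCVE) conditions and the cap-price treatment are developed in the paper and not reproduced here.

    % Linear inverse demand (below the cap) and quadratic costs:
    \[
      p(Q) = a - bQ, \qquad Q = q_1 + q_2, \qquad
      C_i(q_i) = c_i q_i + \tfrac{d_i}{2} q_i^2 .
    \]
    % Firm i conjectures that its rival reacts with slope w_i = dq_j / dq_i,
    % so its first-order condition for maximizing \pi_i = p(Q) q_i - C_i(q_i) is
    \[
      \frac{\partial \pi_i}{\partial q_i}
        = a - bQ - b(1 + w_i)\, q_i - c_i - d_i q_i = 0 .
    \]
    % In a consistent CVE, each conjectured slope w_i coincides with the slope of
    % the rival's actual best response implied by these conditions.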
373 Verification on the Ensemble of Independent Numerical Solutions [abstract]
Abstract: The element of epistemic uncertainty quantification concerning the estimation of the approximation error is analyzed from the viewpoint of an ensemble of numerical solutions obtained via independent numerical algorithms. The analysis is based on geometric considerations: the triangle inequality and measure concentration in spaces of high dimension. As a result, nonintrusive postprocessing becomes feasible, providing an estimate of the approximation error from the ensemble of solutions. The ensemble of numerical results obtained by five OpenFOAM solvers is analyzed. The numerical tests were made for inviscid compressible flow around a cone at zero angle of attack and demonstrate successful estimation of the approximation error.
Artem Kuvshinnikov, Alexander Bondarev and Aleksey Alekseev
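
Illustrative note for paper 373 above: as a simplified illustration of the ensemble-of-solutions idea, the sketch below computes pairwise distances between solutions from several solvers on a common grid; by the triangle inequality the distance between two solutions is bounded by the sum of their individual errors, which is the starting point of the estimate (the measure-concentration refinement used by the authors is not reproduced). The solution arrays are random placeholders standing in for the OpenFOAM fields.

    import numpy as np

    rng = np.random.default_rng(3)
    n_cells = 1000
    truth = np.sin(np.linspace(0, np.pi, n_cells))            # placeholder "exact" field

    # placeholder ensemble: five solver outputs = truth + solver-specific error
    solutions = [truth + rng.normal(0, s, n_cells) for s in (0.01, 0.02, 0.03, 0.02, 0.05)]

    def l2(u, v):
        return np.sqrt(np.mean((u - v) ** 2))

    # pairwise distances; ||u_i - u_j|| <= ||e_i|| + ||e_j|| (triangle inequality)
    dist = np.array([[l2(u, v) for v in solutions] for u in solutions])
    print(np.round(dist, 4))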

Solving Problems with Uncertainties (SPU) Session 3

Time and Date: 14:20 - 16:00 on 14th June 2019

Room: 0.6

Chair: Vassil Alexandrov

467 On the estimation of the accuracy of numerical solutions in CFD problems [abstract]
Abstract: The task of assessing accuracy in the mathematical modeling of gas-dynamic processes is of utmost importance and relevance. Modern software packages include a large number of models, numerical methods and algorithms that allow most current CFD problems to be solved. However, the issue of obtaining a reliable solution in the absence of experimental data or any reference solution remains relevant. The paper provides a brief overview of some useful approaches to this problem, including a multi-model approach, the study of an ensemble of solutions, and the construction of a generalized numerical experiment.
Alexander Bondarev
499 "Why did you do that?" Explaining black box models with Inductive Synthesis [abstract]
Abstract: By their nature, the composition of black box models is opaque. This makes it challenging to generate explanations for their responses to stimuli. Explaining black box models has become increasingly important given the prevalence of AI and ML systems and the need to build legal and regulatory frameworks around them. Such explanations can also increase trust in these uncertain systems. In our paper we present RICE, a method for generating explanations of the behaviour of black box models by (1) probing a model to extract model output examples using sensitivity analysis; (2) applying CNPInduce, a method for inductive logic program synthesis, to generate logic programs based on critical input-output pairs; and (3) interpreting the target program as a human-readable explanation. We demonstrate the application of our method by generating explanations of an artificial neural network trained to follow simple traffic rules in a hypothetical self-driving car simulation. We conclude with a discussion on the scalability and usability of our approach and its potential applications to explanation-critical scenarios.
Gorkem Pacaci, David Johnson, Steve McKeever and Andreas Hamfelt
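
Illustrative note for paper 499 above: the sketch below illustrates only step (1) as described in the abstract, probing a black-box model by perturbing each input feature around a base point and recording which perturbations change the output. The toy model, perturbation scheme, and thresholds are assumptions, and the synthesis step with CNPInduce is not reproduced.

    import numpy as np

    def black_box(x):
        """Placeholder black-box policy, e.g. 1 = brake, 0 = continue."""
        return int(x[0] < 5.0 or x[1] > 0.5)       # distance-to-obstacle, obstacle probability

    base = np.array([10.0, 0.2])
    critical_pairs = []

    # one-at-a-time sensitivity probing around the base point
    for i in range(base.size):
        for delta in np.linspace(-10.0, 10.0, 41):
            probe = base.copy()
            probe[i] += delta
            out = black_box(probe)
            if out != black_box(base):             # output flipped: a critical input-output pair
                critical_pairs.append((probe.tolist(), out))

    print(len(critical_pairs), critical_pairs[:3])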
510 Predictive Analytics with Factor Variance Association [abstract]
Abstract: Predictive Factor Variance Association (PFVA) is a machine learning algorithm that solves the multiclass problem. A set of feature samples is provided along with a set of target classes. If a sample belongs to a class, the corresponding column is marked as one, and zero otherwise. PFVA carries out singular value decomposition on the standardized samples, creating orthogonal linear combinations of the variables called factors. For each linear combination, probabilities are estimated for a target class. A least squares curve fitting model is then used to compute the probability that a particular sample belongs to a class or not. The method can also give predictions based on regression for quantitative dependent variables and carry out clustering of samples. The main advantage of our technique is a clear mathematical foundation using well-known concepts of linear algebra and probability.
Raul Ramirez-Velarde, Laura Hervert-Escobar and Neil Hernandez-Gress
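
Illustrative note for paper 510 above: the sketch below is a simplified version of the pipeline the abstract outlines: standardize the samples, extract orthogonal factors via SVD, and fit a least-squares model from factor scores to a class indicator, treating the clipped outputs as class probabilities. The number of retained factors, the curve-fitting model, and the clustering step are assumptions rather than the authors' exact formulation.

    import numpy as np

    rng = np.random.default_rng(4)
    X = rng.normal(size=(100, 6))                      # placeholder feature samples
    y = (X[:, 0] + X[:, 1] > 0).astype(float)          # placeholder binary class indicator

    # standardize the samples
    Z = (X - X.mean(axis=0)) / X.std(axis=0)

    # orthogonal factors via singular value decomposition
    _, _, Vt = np.linalg.svd(Z, full_matrices=False)
    k = 3                                              # assumed number of retained factors
    F = Z @ Vt[:k].T                                   # factor scores

    # least-squares fit from factor scores to the class indicator
    A = np.column_stack([F, np.ones(len(F))])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    probs = np.clip(A @ coef, 0.0, 1.0)                # clipped scores used as class probabilities
    print(np.mean((probs > 0.5) == (y == 1)))          # rough in-sample accuracy of the sketch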
536 Integration of ontological engineering and machine learning methods to reduce uncertainties in health risk assessment and recommendation systems [abstract]
Abstract: This research provides an approach that integrates the best of ontology engineering and machine learning methods in order to reduce some types of uncertainties in health risk assessment and improve the explainability of decision-making systems. The proposed approach is based on an ontological knowledge base for health risk assessment that takes into account medical, genetic, environmental, and lifestyle factors. To automate the development of the knowledge base, we propose integrating traditional knowledge engineering methods with a machine learning approach that uses the collaborative knowledge base Freebase. We also propose a text mining method based on lexico-syntactic patterns, both inherited from existing pattern sets and created by ourselves. Moreover, we use ontology engineering methods to explain machine learning results, unsupervised methods in particular. In the paper we present case studies showing original methods and approaches for solving problems with uncertainties in biomedical decision-making systems within the development of the BioGenom2.0 platform. Because the platform uses an ontology-driven reasoner, there is no need to change the source code in order to tackle health risk assessment challenges with various knowledge bases focused on medical, genetic, and other aspects.
Svetlana Chuprina and Taisiya Kostareva