Solving Problems with Uncertainties (SPU) Session 1

Time and Date: 14:10 - 15:50 on 13th June 2017

Room: HG F 33.1

Chair: Vassil Alexandrov

129 High-Level Toolset For Comprehensive Visual Data Analysis and Model Validation [abstract]
Abstract: This paper presents a new method for developing high-level scientific visualization, comprehensive visual analysis and model validation tools, using the new version of the client-server scientific visualization system SciVi as an example. The distinctive features of the implemented methods are ontology-based automated adaptation to third-party data sources from various application domains and to the specifics of the visualization problems, as well as multiplatform portability of the software solution. High-level tools for semantic filtering of the rendered data are presented. These tools improve the visual analytics capabilities of SciVi, making it possible to validate solvers' and/or data sources' models more comprehensively and to reduce uncertainties thanks to the explicit representation of hidden features of the data.
Konstantin Ryabinin and Svetlana Chuprina
402 Statistical Estimation of Brown Bears Population in Rhodope Mountains [abstract]
Abstract: The brown bear (Ursus arctos) is the most widespread bear in the world. It can be found across Europe, Asia and North America in habitats ranging from forests to dry deserts and tundra. Some of the best bear habitats in Europe are located in Bulgaria, in the Rhodope, Stara Planina, Rila, Pirin and Vitosha mountains. Until 1992 the bear was a game target. By Order 1023 dated 31.12.1992 of the Ministry of Environment and Water (MoEW), the species was declared protected in compliance with the Nature Protection Act, a status it retained after the Biodiversity Act was passed in 2002. The Habitats Directive requires strict protection of the species and the designation of special protected areas for the conservation of its habitats \cite{Red11}. The main habitats of the bear in Bulgaria are included in the NATURA 2000 ecological network \cite{Nat}. For the purposes of habitat protection and the management of the NATURA 2000 network, mapping and determination of the habitats' environmental status were carried out within a project under the EU operational programme "Environment". The acquired information is used to elaborate management plans for the protected areas and the populations of the species, as well as to regulate investment projects therein. It is therefore important to estimate habitat use and population dynamics of brown bears in the country. In this work we study the population of brown bears in the Rhodope Mountains, using data from the monitoring carried out in autumn 2011. Recommendations regarding the obtained estimators and the necessary sample sizes are presented, as well as some ways to improve data collection during future monitoring.
Todor Gurov, Emanouil Atanassov, Aneta Karaivanova, Ruslan Serbezov and Nikolai Spassov
307 Methodology of estimation of achieving regional goals of sustainable development on the basis of program and goal oriented approach [abstract]
Abstract: This paper describes a methodology for estimating the sustainable development of a region on the basis of a system of weighted target indicators, adjusted in accordance with the goals and tasks of regional management. The authors analyze an example of the practical application of the methodology in St. Petersburg and draw conclusions about the influence of objective indicators and subjective value orientations of residents on the sustainable development of the region.
Sergey Mityagin, Olga Tikhonova and Aleksandr Repkin
223 A Posterior Ensemble Kalman Filter Based On A Modified Cholesky Decomposition [abstract]
Abstract: In this paper, we propose a posterior ensemble Kalman filter (EnKF) based on a modified Cholesky decomposition. The main idea behind our approach is to estimate the moments of the analysis distribution based on an ensemble of model realizations. The method proceeds as follows: initially, an estimate of the precision background error covariance matrix is computed via a modified Cholesky decomposition; then, based on rank-one updates, the Cholesky factors of the inverse background error covariance matrix are updated in order to obtain an estimate of the inverse analysis covariance matrix. The special structure of the Cholesky factors can be exploited to obtain a matrix-free implementation of the EnKF. Once the analysis covariance matrix is estimated, the posterior mode of the distribution can be approximated, and samples are drawn around it in order to build the posterior ensemble. Experimental tests are performed using the Lorenz 96 model in order to assess the accuracy of the proposed implementation. The results reveal that the accuracy of the proposed implementation is similar to that of the well-known local ensemble transform Kalman filter; moreover, the use of our estimator reduces the impact of sampling errors during the assimilation of observations.
Elias D. Nino Ruiz, Alfonso Mancilla and Juan Calabria
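The rank-one updates of Cholesky factors mentioned in the abstract above can be illustrated with a minimal, generic sketch (a textbook `cholupdate`, not the authors' matrix-free EnKF implementation; the function name and setup are illustrative only):

```python
import numpy as np

def chol_update(L, x):
    """Rank-one Cholesky update: given lower-triangular L with L @ L.T == A,
    return L' (lower triangular) with L' @ L'.T == A + x @ x.T."""
    L, x = L.copy(), x.copy()
    n = len(x)
    for k in range(n):
        r = np.hypot(L[k, k], x[k])          # updated diagonal entry
        c, s = r / L[k, k], x[k] / L[k, k]   # Givens-like rotation parameters
        L[k, k] = r
        L[k+1:, k] = (L[k+1:, k] + s * x[k+1:]) / c
        x[k+1:] = c * x[k+1:] - s * L[k+1:, k]
    return L
```

A sequence of such updates lets one refresh an inverse-covariance factorization observation by observation without re-factorizing the full matrix.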
621 Addressing global sensitivity in chemical kinetic models using adaptive sparse grids [abstract]
Abstract: Chemical kinetic models often carry very large parameter uncertainties and show a strongly non-linear response with rapid changes over relatively small parameter ranges. Additionally, the dimensionality of the parameter space can grow large, and there is no a priori known hierarchy between the parameters. Improving the accuracy of the model requires either high-level, computationally demanding electronic structure simulation or a typically large number of dedicated experiments. Naturally, a practitioner would like to know which parameters should be determined with higher accuracy and which conclusions can already be drawn from the uncertain model. Using a real-life model for water splitting on a cobalt oxide catalyst as a prototypical example, we address these problems using global sensitivity analysis (GSA) based on the Analysis Of Variances (ANOVA). For this, we discretize the parameter space by an adaptive sparse grid approach. Dimension adaptivity automatically sorts out unimportant terms in an (anchored) ANOVA expansion and, furthermore, adjusts the resolution for each direction. Using locally supported basis functions, the dimension adaptivity is combined with a local refinement strategy in order to address local, rapid changes. This allows us to discretize the 19-dimensional parameter space with only a modest number of grid points. Our findings indicate that, for the given model, it is not possible to make any quantitative statement, e.g. whether the catalyst is highly active or not. However, from the GSA we are still able to draw chemically relevant conclusions, i.e. which reaction pathways dominate and how they interact.
Sandra Döpking, Sebastian Matera, Daniel Strobusch, Christoph Scheurer, Craig Plaisance and Karsten Reuter
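As a toy illustration of the variance-based (ANOVA) sensitivity indices underlying GSA, a plain Monte Carlo pick-freeze estimator of first-order Sobol indices might look as follows (this sketches the general idea only, not the paper's adaptive sparse-grid discretization; the model function is invented):

```python
import numpy as np

def first_order_sobol(f, dim, n, rng):
    """Pick-freeze Monte Carlo estimate of first-order Sobol indices
    for f evaluated on uniform [0,1]^dim inputs."""
    A = rng.random((n, dim))
    B = rng.random((n, dim))
    fA, fB = f(A), f(B)
    var = np.var(fA)
    S = np.empty(dim)
    for i in range(dim):
        C = B.copy()
        C[:, i] = A[:, i]   # A and C share only coordinate i
        S[i] = (np.mean(fA * f(C)) - np.mean(fA) * np.mean(fB)) / var
    return S

# Invented additive toy model: analytically S1 = 9/10 and S2 = 1/10.
model = lambda X: 3.0 * X[:, 0] + X[:, 1]
S = first_order_sobol(model, dim=2, n=200_000, rng=np.random.default_rng(1))
```

A sparse-grid approach replaces this sampling with structured grid points, but the indices it targets are the same.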
68 An ontological approach to dynamic fine-grained Urban Indicators [abstract]
Abstract: Urban indicators provide a unique multi-disciplinary data framework which social scientists, planners and policy makers employ to understand and analyze the complex dynamics of metropolitan regions. Indicators provide an independent, quantitative measure or benchmark of an aspect of an urban environment by combining different metrics for a given region. While the current approach to urban indicators involves the systematic, accurate collection of the raw data required to produce reliable indicators and the standardization of well-known, commonly accepted or widely adopted indicators, the next generation of indicators is expected to support a more dynamic, customizable, fine-grained approach, in a context of interoperability and linked open data. Within this paper, we address these emerging requirements through an ontological approach aimed at (i) establishing interoperability among heterogeneous data sets, (ii) expressing the high-level semantics of the indicators, (iii) supporting indicator adaptability and dynamic composition for specific applications and (iv) properly representing the uncertainties of the resulting ecosystem.
Salvatore Flavio Pileggi and Jane Hunter

Solving Problems with Uncertainties (SPU) Session 2

Time and Date: 16:20 - 18:00 on 13th June 2017

Room: HG F 33.1

Chair: Vassil Alexandrov

406 Recommendation of Short-Term Activity Sequences During Distributed Events [abstract]
Abstract: The number of social events has increased significantly and location-based services have become an integral part of our life. This makes the recommendation of activity sequences an important emerging application. Recently, the notion of a distributed event (e.g., a music festival or cruise) that gathers multiple competing activities has appeared in the literature. An attendee of such events is overwhelmed with numerous possible activities and faces the problem of activity selection with the goal of maximising the satisfaction of the experience. This selection is subject to various uncertainties. In this paper, we formulate the problem of recommending activity sequences as a combination of a personalised event recommendation and a scheduling problem. We present a novel integrated framework to solve it and two computation strategies to analyse users' categorical, temporal and textual interests. We mine the users' historical traces to extract their behavioural patterns and use them in the construction of the itinerary. The evaluation of our approach on a dataset built over a cruise program shows an average improvement of 10.4% over the state of the art.
Diana Nurbakova, Léa Laporte, Sylvie Calabretto and Jérôme Gensel
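The scheduling side of the problem above can be illustrated with classic weighted interval scheduling: given candidate activities with predicted satisfaction scores, pick a non-overlapping subset maximizing the total score (a textbook dynamic program, not the authors' integrated framework; the scores below are invented):

```python
from bisect import bisect_right

def best_itinerary(activities):
    """activities: list of (start, end, score) tuples.
    Returns (best_total_score, chosen_activities)."""
    acts = sorted(activities, key=lambda a: a[1])   # sort by end time
    ends = [a[1] for a in acts]
    n = len(acts)
    dp = [(0.0, [])] * (n + 1)                      # dp[i]: best over first i acts
    for i, (s, e, w) in enumerate(acts, start=1):
        j = bisect_right(ends, s, 0, i - 1)         # last activity ending <= s
        take = (dp[j][0] + w, dp[j][1] + [(s, e, w)])
        dp[i] = max(dp[i - 1], take, key=lambda t: t[0])
    return dp[n]
```

A recommender would supply the per-activity scores (e.g. from mined user preferences); the DP then builds the itinerary.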
396 Optimal pricing model based on reduction dimension: A case of study for convenience stores [abstract]
Abstract: Pricing is one of the most vital and highly demanded components of the marketing mix, along with Product, Place and Promotion. An organization can adopt a number of pricing strategies, typically based on corporate objectives. This paper proposes a methodology to define an optimal pricing strategy for convenience stores based on dimension reduction methods and data uncertainty. The solution approach involves a multiple linear regression as well as a linear programming optimization model over several candidate variables. A strategy to select a set of important variables among a large number of predictors, using a mix of PCA and best subset methods, is presented. A linear optimization model is then solved using uncertain data and diverse business rules. To show the value of the proposed methodology, the computed optimal prices are compared with previous results obtained in a pilot performed for selected stores. This strategy provides an alternative solution that allows the decision maker to include the business rules of their particular environment in order to define a price strategy that meets the business goals.
Laura Hervert-Escobar, Oscar Alejandro Esquivel-Flores and Raul Valente Ramirez-Velarde
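The dimension-reduction step described above can be illustrated with a minimal PCA sketch that keeps the fewest principal components explaining a target share of variance (a generic PCA via SVD, not the paper's PCA-plus-best-subset pipeline; the data and the 95% threshold are invented):

```python
import numpy as np

def pca_reduce(X, var_target=0.95):
    """Project X onto the fewest principal components whose cumulative
    explained variance reaches var_target. Returns (scores, n_components)."""
    Xc = X - X.mean(axis=0)                           # center each variable
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    explained = (s ** 2) / np.sum(s ** 2)             # variance share per PC
    k = int(np.searchsorted(np.cumsum(explained), var_target)) + 1
    return Xc @ Vt[:k].T, k
```

The reduced scores would then feed the regression and the subsequent linear optimization in place of the full predictor set.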
388 Identification of Quasi-Stationary Dynamic Objects with the Use of Derivative Disproportion Functions [abstract]
Abstract: This paper presents an algorithm for designing a cryptographic system in which derivative disproportion functions (key functions) are used. This cryptographic system is used for the operative identification of a differential equation describing the movement of quasi-stationary objects. The symbols to be transmitted are encrypted by the sum of at least two of these functions combined with random coefficients. A new algorithm is proposed for decoding the received messages, making use of important properties of the derivative disproportion functions. Numerical experiments are reported to demonstrate the algorithm's reliability and robustness.
Vyacheslav V. Kalashnikov, Viktor V. Avramenko, Nataliya I. Kalashnykova and Nikolay Yu. Slipushko
369 Symbol and Bit Error Probability for Coded TQAM in AWGN Channel [abstract]
Abstract: The performance of a coded modulation scheme based on the application of integer codes to a TQAM constellation with $2^{2m}$ points is investigated. A method for calculating the exact value of the SER in the case of TQAM over an AWGN channel combined with encoding by integer codes is described. Simulation results (SER and BER) for coded 16-, 64-, and 256-TQAM are given.
Hristo Kostadinov and Nikolai Manev
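As background for SER evaluation over AWGN, a Monte Carlo simulation can be checked against a closed-form expression. The sketch below does this for uncoded QPSK (4-QAM), a standard baseline, and is not the TQAM constellation or the integer-coded scheme studied in the paper:

```python
import numpy as np
from math import erfc, sqrt

def qpsk_ser_sim(es_n0_db, n=200_000, seed=2):
    """Monte Carlo symbol-error rate for uncoded QPSK over AWGN (Es = 1)."""
    rng = np.random.default_rng(seed)
    es_n0 = 10 ** (es_n0_db / 10)
    sym = rng.choice([-1.0, 1.0], size=(n, 2)) / np.sqrt(2)         # unit-energy symbols
    noise = rng.standard_normal((n, 2)) * np.sqrt(1 / (2 * es_n0))  # N0/2 per dimension
    errs = np.any(np.sign(sym + noise) != np.sign(sym), axis=1)     # per-quadrant decision
    return errs.mean()

def qpsk_ser_theory(es_n0_db):
    """Exact QPSK SER: 1 - (1 - Q(sqrt(Es/N0)))^2, with Q(x) = erfc(x/sqrt(2))/2."""
    p = 0.5 * erfc(sqrt(10 ** (es_n0_db / 10)) / sqrt(2))
    return 1 - (1 - p) ** 2
```

For TQAM the decision regions and the exact SER expression differ; the point here is only the simulation-versus-theory methodology.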
462 A comparative study of evolutionary statistical methods for uncertainty reduction in forest fire propagation prediction [abstract]
Abstract: Predicting the propagation of forest fires is crucial to mitigating their effects. Therefore, several computational tools or simulators have been developed to predict fire propagation. Such tools consider the scenario (topography, vegetation types, fire front situation) and the particular conditions under which the fire is evolving (vegetation conditions, meteorological conditions) to predict the fire propagation. However, these parameters are usually difficult to measure or estimate precisely, and there is a high degree of uncertainty in many of them. This uncertainty leads to a certain lack of accuracy in the predictions, with the consequent risks. It is therefore necessary to apply methods that reduce the uncertainty in the input parameters. This work presents a comparison of ESSIM-EA and ESSIM-DE, two methods to reduce the uncertainty in the input parameters; both combine Evolutionary Algorithms, Parallelism and Statistical Analysis to improve the propagation prediction.
María Laura Tardivo, Paola Caymes-Scutari, Germán Bianchini, Miguel Méndez-Garabetti, Andrés Cencerrado and Ana Cortés
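The calibration idea behind such evolutionary methods, searching for input-parameter values that minimize the mismatch between simulated and observed fire spread, can be sketched with a generic (mu+lambda) evolutionary loop (a textbook EA, not ESSIM-EA or ESSIM-DE; the error function below is an invented stand-in for comparing simulated and real fire perimeters):

```python
import numpy as np

def evolve(error, dim, bounds, mu=10, lam=40, gens=60, sigma=0.2, seed=0):
    """(mu+lam) evolution strategy minimizing `error` over the box [lo, hi]^dim."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    pop = rng.uniform(lo, hi, size=(mu, dim))            # initial population
    for _ in range(gens):
        parents = pop[rng.integers(mu, size=lam)]
        kids = np.clip(parents + sigma * rng.standard_normal((lam, dim)), lo, hi)
        both = np.vstack([pop, kids])
        both = both[np.argsort([error(p) for p in both])]
        pop = both[:mu]                                  # elitist: keep the mu best
    return pop[0]

# Invented stand-in error with "true" (unknown) parameters 0.3 and 0.7.
best = evolve(lambda p: (p[0] - 0.3) ** 2 + (p[1] - 0.7) ** 2,
              dim=2, bounds=(0.0, 1.0))
```

In the real setting, `error` would run the fire simulator with candidate parameters and score the predicted front against the observed one; the statistical stage then aggregates the population's predictions.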