Solving Problems with Uncertainties (SPU) Session 1

Time and Date: 10:15 - 11:55 on 2nd June 2015

Room: M201

Chair: Vassil Alexandrov

455 An individual-centric probabilistic extension for OWL: Modelling the Uncertainness [abstract]
Abstract: The theoretical benefits of semantics, as well as their potential impact on IT, are well-known concepts, extensively discussed in the literature. As more and more systems use or refer to semantic technologies, the challenging third version of the web (Semantic Web or Web 3.0) is progressively taking shape. On the other hand, apart from the relatively limited expressiveness of current concrete semantic technologies, theoretical models and research prototypes overlook a significant number of practical issues, including, among others, consolidated mechanisms to manage and maintain vocabularies, shared notation systems, and support for large-scale systems (Big Data). Focusing on the OWL model as the current reference technology for specifying web semantics, in this paper we discuss the problem of approaching knowledge engineering exclusively according to a deterministic model, excluding a priori any kind of probabilistic semantics. These limitations mean that most knowledge ecosystems that include, at some level, probabilistic information are not well suited to OWL environments. Therefore, despite the great potential of OWL, a considerable number of applications still use more classic data models or unnatural hybrid environments. But OWL, even with its intrinsic limitations, is a model flexible enough to support extensions and integrations. In this work we propose a simple statistical extension that can significantly extend the expressiveness and scope of OWL.
Salvatore Flavio Pileggi
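
To make the flavor of such an extension concrete, the following is a minimal sketch (an illustrative assumption, not the paper's concrete proposal) of how a probability could be attached to an individual's class membership using the standard OWL 2 axiom-annotation pattern via rdflib; the annotation property ex:hasProbability is hypothetical.

    # A minimal sketch (illustrative assumption, not the paper's concrete
    # proposal): attach a probability to an individual's class membership by
    # annotating the rdf:type axiom with the standard OWL 2 annotation pattern.
    # The annotation property ex:hasProbability is hypothetical.
    from rdflib import Graph, Namespace, Literal, BNode
    from rdflib.namespace import RDF, OWL, XSD

    EX = Namespace("http://example.org/ont#")
    g = Graph()
    g.bind("ex", EX)

    # Plain (deterministic) assertion: john is a Smoker
    g.add((EX.john, RDF.type, EX.Smoker))

    # OWL axiom annotation carrying the probability of that assertion
    axiom = BNode()
    g.add((axiom, RDF.type, OWL.Axiom))
    g.add((axiom, OWL.annotatedSource, EX.john))
    g.add((axiom, OWL.annotatedProperty, RDF.type))
    g.add((axiom, OWL.annotatedTarget, EX.Smoker))
    g.add((axiom, EX.hasProbability, Literal(0.7, datatype=XSD.double)))

    print(g.serialize(format="turtle"))
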
457 Relieving Uncertainty in Forest Fire Spread Prediction by Exploiting Multicore Architectures [abstract]
Abstract: The most important aspect affecting the reliability of environmental simulations is the uncertainty in the parameter settings describing the environmental conditions, which may introduce significant biases between simulation and reality. To mitigate such arbitrariness, a two-stage prediction method was developed, based on adjusting the input parameters according to the real observed evolution. This method enhances the quality of the predictions, but it is very demanding in terms of time and computational resources. In this work, we describe a methodology for response-time assessment in the case of fire spread prediction, based on evolutionary computation. In addition, one of the most important fire spread simulators, FARSITE, was parallelized to take advantage of multicore architectures. This allows us to design proper allocation policies that significantly reduce simulation time and reach successful predictions much faster. A multi-platform performance study is reported to analyze the benefits of the methodology.
Andrés Cencerrado, Tomàs Vivancos, Ana Cortés, Tomàs Margalef
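
As an illustration of the two-stage idea, the sketch below uses a toy spread model in place of a real FARSITE run (the model, functions and numbers are assumptions for illustration): a calibration stage evolves wind parameters against the observed evolution, and a prediction stage reuses the calibrated parameters.

    # A toy sketch of the two-stage method (a stand-in model, not FARSITE; all
    # functions and numbers here are assumptions for illustration): stage 1
    # evolves wind parameters against the observed evolution, stage 2 predicts
    # with the calibrated parameters.
    import numpy as np

    rng = np.random.default_rng(3)

    def simulate_spread(wind_speed, wind_dir, hours):
        """Toy stand-in for a fire spread simulation (burned area, ha)."""
        return hours * (1.0 + 0.5 * wind_speed * max(np.cos(wind_dir), 0.1))

    def calibrate(observed_area, hours, pop=40, gens=30):
        """Stage 1: evolve (wind_speed, wind_dir) to match the observed area."""
        params = rng.uniform([0.0, 0.0], [20.0, np.pi], size=(pop, 2))
        for _ in range(gens):
            err = np.array([abs(simulate_spread(s, d, hours) - observed_area)
                            for s, d in params])
            elite = params[np.argsort(err)[:pop // 4]]             # selection
            kids = elite[rng.integers(len(elite), size=pop - len(elite))]
            kids = kids + rng.normal(0.0, [0.5, 0.1], kids.shape)  # mutation
            params = np.vstack([elite, kids])
        err = np.array([abs(simulate_spread(s, d, hours) - observed_area)
                        for s, d in params])
        return params[np.argmin(err)]

    wind = calibrate(observed_area=35.0, hours=4)
    print("stage 2 prediction for the next 4 hours:",
          simulate_spread(*wind, hours=4))
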
723 Populations of models, Experimental Designs and coverage of parameter space by Latin Hypercube and Orthogonal Sampling [abstract]
Abstract: In this paper we use simulations to make a conjecture about the coverage of a $t$-dimensional subspace of a $d$-dimensional parameter space of size $n$ when performing $k$ trials of Latin Hypercube sampling. The conjecture takes the form $P(k,n,d,t)=1-e^{-k/n^{t-1}}$. We suggest that this coverage formula is independent of $d$, which allows us to make connections between building Populations of Models and Experimental Designs. We also show that Orthogonal sampling is superior to Latin Hypercube sampling in terms of allowing a more uniform coverage of the $t$-dimensional subspace at the sub-block size level.
Bevan Thompson, Kevin Burrage, Pamela Burrage, Diane Donovan
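
A conjecture of this form is easy to probe numerically. The sketch below (not the authors' code; $n$, $d$, $t$ and the trial counts are arbitrary choices) estimates by simulation the fraction of cells of a $t$-dimensional projection covered after $k$ Latin Hypercube trials and compares it with $1-e^{-k/n^{t-1}}$.

    # A simulation probe of the conjecture (not the authors' code; n, d, t and
    # the trial counts are arbitrary choices): estimate the covered fraction of
    # the n**t cells of a t-dimensional projection after k Latin Hypercube
    # trials and compare it with P(k, n, d, t) = 1 - exp(-k / n**(t - 1)).
    import numpy as np

    rng = np.random.default_rng(0)

    def latin_hypercube_levels(n, d):
        """One LH sample: n grid points, each level used exactly once per axis."""
        return np.column_stack([rng.permutation(n) for _ in range(d)])

    def covered_fraction(k, n, d, t):
        """Fraction of cells of the first-t-axes projection hit by k LH trials."""
        hit = np.zeros((n,) * t, dtype=bool)
        for _ in range(k):
            levels = latin_hypercube_levels(n, d)[:, :t]   # project onto t axes
            hit[tuple(levels.T)] = True
        return hit.mean()

    n, d, t = 8, 5, 2
    for k in (4, 16, 64):
        sim = np.mean([covered_fraction(k, n, d, t) for _ in range(200)])
        conj = 1.0 - np.exp(-k / n ** (t - 1))
        print(f"k={k:3d}  simulated={sim:.3f}  conjectured={conj:.3f}")
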
340 Analysis of Space-Time Structures Appearance for Non-Stationary CFD Problems [abstract]
Abstract: The paper presents a combined approach to finding the conditions under which space-time structures appear in non-stationary flows for CFD (computational fluid dynamics) problems. We consider different types of space-time structures, such as boundary layer separation, the appearance of vortex zones, the appearance of oscillating regimes, and the transition from Mach reflection to regular reflection for shock waves. The approach combines numerical solutions of inverse problems with parametric studies; the numerical solutions are parallelized. It is intended for fast approximate estimation of the dependence of unsteady flow structures on characteristic (or determining) parameters in a certain class of problems. The numerical results are presented in the form of multidimensional data volumes; to uncover hidden dependencies in these volumes, multidimensional data processing and visualization methods are applied. The approach is organized as a pipeline. For certain classes of problems it yields the sought-for dependence in quasi-analytical form. The proposed approach can be viewed as providing a kind of generalized numerical experiment environment. Examples of its application to a series of practical problems are given. The approach can also be applied to CFD problems with ambiguities.
Alexander Bondarev, Vladimir Galaktionov
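
The parametric-study stage of such a pipeline can be sketched as follows (a toy criterion standing in for a CFD solver; the threshold formula is an assumed illustration, not a physical model): sweep the determining parameters, flag structure appearance, and store the results as a data volume whose boundary approximates the sought-for dependence.

    # A hedged sketch of the parametric-study stage (toy criterion, not a real
    # CFD solver; the threshold formula is an assumed illustration): sweep the
    # determining parameters, flag structure appearance, and store the results
    # as a data volume whose boundary approximates the sought-for dependence.
    import numpy as np

    def structure_appears(mach, angle_deg):
        """Toy stand-in for a CFD run flagging, e.g., Mach reflection onset."""
        return angle_deg > 40.0 - 5.0 * (mach - 2.0)   # assumed, not physical

    machs = np.linspace(2.0, 5.0, 31)
    angles = np.linspace(20.0, 60.0, 41)
    volume = np.array([[structure_appears(m, a) for a in angles] for m in machs])

    # Quasi-analytical readout: first angle at which the structure appears
    for i in range(0, len(machs), 10):
        j = int(np.argmax(volume[i]))
        print(f"Mach {machs[i]:.1f}: structure appears above ~{angles[j]:.1f} deg")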

Solving Problems with Uncertainties (SPU) Session 2

Time and Date: 14:10 - 15:50 on 2nd June 2015

Room: M201

Chair: Vassil Alexandrov

509 Discovering most significant news using Network Science approach [abstract]
Abstract: The role of social-network mass media has grown greatly in recent years. We investigate news publications in Twitter from the point of view of Network Science. We analyze news data posted by the most popular media sources to reveal the most significant news over a period of time. Significance is a qualitative property that reflects the degree of a news item's impact on society and public opinion. We define a threshold of significance and identify a number of news items that were significant for society in the period from July 2014 to January 2015.
Ilya Blokh, Vassil Alexandrov
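
As a heavily hedged illustration of this kind of scoring (the metric, data and threshold below are assumptions, not the authors' definition), one could score each news item by the size of its repost cascade in a directed graph and flag items above a significance threshold:

    # A heavily hedged illustration (the scoring metric, data and threshold are
    # assumptions, not the authors' definition): score each news item by the
    # size of its repost cascade in a directed graph and flag items whose score
    # exceeds a significance threshold.
    import networkx as nx

    # Hypothetical repost edges: (reposting_user, news_item)
    reposts = [("u1", "news_a"), ("u2", "news_a"), ("u3", "news_a"),
               ("u4", "news_b"), ("u5", "news_a"), ("u6", "news_b")]

    G = nx.DiGraph()
    G.add_edges_from(reposts)

    scores = {n: G.in_degree(n) for n in G if str(n).startswith("news")}
    threshold = 2                                # assumed significance threshold
    significant = [n for n, s in scores.items() if s > threshold]
    print(significant)                           # -> ['news_a']
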
713 Towards Understanding Uncertainty in Cloud Computing Resource Provisioning [abstract]
Abstract: In spite of extensive research on uncertainty in fields ranging from computational biology to decision making in economics, the study of uncertainty in cloud computing systems remains limited. Most works examine uncertainty phenomena in users' perceptions of the qualities, intentions and actions of cloud providers, as well as privacy, security and availability. But the role of uncertainty in resource and service provisioning, programming models, etc. has not yet been adequately addressed in the scientific literature. There are numerous types of uncertainty associated with cloud computing, and one should account for aspects of uncertainty when assessing efficient service provisioning. In this paper, we tackle the research question: what is the role of uncertainty in cloud computing service and resource provisioning? We review the main sources of uncertainty and fundamental approaches to scheduling under uncertainty, such as reactive, stochastic, fuzzy and robust approaches. We also discuss the potential of these approaches for scheduling cloud computing activities under uncertainty, and address methods for mitigating job execution time uncertainty in resource provisioning.
Andrei Tchernykh, Uwe Schwiegelsohn, Vassil Alexandrov, El-Ghazali Talbi
507 Monte Carlo method for density reconstruction based on insufficient data [abstract]
Abstract: In this work we consider the problem of reconstructing an unknown density from a given sample. We present a method for density reconstruction that combines B-spline approximation, the least squares method and Monte Carlo integration. An error analysis is provided. The method is compared numerically with other statistical methods for density estimation and shows very promising results.
Aneta Karaivanova, Sofiya Ivanovska, Todor Gurov
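
One way such a method can be assembled (a sketch under assumptions; the paper's exact estimator is not reproduced here) is to expand the density in a B-spline basis, fit the coefficients by least squares to a histogram of the sample, and use plain Monte Carlo to evaluate the normalization integral:

    # A minimal sketch (an assumption, not the paper's estimator): expand the
    # density in a cubic B-spline basis, fit coefficients by least squares to a
    # histogram of the sample, and renormalize with plain Monte Carlo integration.
    import numpy as np
    from scipy.interpolate import BSpline

    rng = np.random.default_rng(1)
    sample = rng.normal(0.0, 1.0, size=2000)      # stand-in for the given sample

    a, b, degree, n_basis = sample.min(), sample.max(), 3, 12
    # Clamped uniform knot vector: len(knots) = n_basis + degree + 1
    knots = np.concatenate([[a] * degree,
                            np.linspace(a, b, n_basis - degree + 1),
                            [b] * degree])

    def basis_matrix(x):
        """Evaluate all n_basis B-splines at points x (one column per basis)."""
        eye = np.eye(n_basis)
        return np.column_stack([BSpline(knots, eye[j], degree)(x)
                                for j in range(n_basis)])

    # Least squares: match the basis expansion to a histogram density estimate
    hist, edges = np.histogram(sample, bins=40, density=True)
    centers = 0.5 * (edges[:-1] + edges[1:])
    coef, *_ = np.linalg.lstsq(basis_matrix(centers), hist, rcond=None)

    # Monte Carlo integration of the (clipped) fit to restore unit mass
    u = rng.uniform(a, b, size=100_000)
    mass = (b - a) * np.mean(np.clip(basis_matrix(u) @ coef, 0.0, None))

    def density(x):
        return np.clip(basis_matrix(x) @ coef, 0.0, None) / mass
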
20 Total Least Squares and Chebyshev Norm [abstract]
Abstract: We investigate the total least squares problem with the Chebyshev norm in place of the traditionally used Frobenius norm. The use of the Chebyshev norm is motivated by the search for robust solutions. In order to solve the problem, we make a link with interval computation and use many results developed there. We show that the problem is NP-hard in general, but becomes polynomial when the number of regressors is fixed. This is the most important result for practice, since we usually work with regression models that have a small number of regression parameters (compared to the number of observations). We present not only an exact algorithm for the problem, but also a computationally cheap heuristic. We illustrate the behavior of our method in a particular probabilistic setup by a simulation study.
Milan Hladik, Michal Cerny
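
The abstract does not spell out the heuristic, but interval computation suggests a natural one (my illustration, not necessarily the authors' algorithm): for fixed coefficients $x$, an Oettli-Prager-type argument shows the smallest componentwise Chebyshev perturbation of $(A,b)$ making $x$ an exact solution is $\varepsilon(x)=\|Ax-b\|_\infty/(\|x\|_1+1)$, so minimizing $\varepsilon(x)$ over $x$ gives a cheap approximate method:

    # An illustration from interval computation (not necessarily the authors'
    # algorithm): for fixed x, the smallest eps with |dA| <= eps, |db| <= eps
    # and (A + dA) x = b + db is eps(x) = ||Ax - b||_inf / (||x||_1 + 1);
    # minimizing eps(x) over x gives a computationally cheap heuristic.
    import numpy as np
    from scipy.optimize import minimize

    def cheb_tls_eps(x, A, b):
        """Chebyshev norm of the smallest data perturbation making x exact."""
        return np.max(np.abs(A @ x - b)) / (np.sum(np.abs(x)) + 1.0)

    rng = np.random.default_rng(2)
    A = rng.normal(size=(50, 3))                       # synthetic regressors
    b = A @ np.array([1.0, -2.0, 0.5]) + 0.05 * rng.normal(size=50)

    x0, *_ = np.linalg.lstsq(A, b, rcond=None)         # ordinary LS start
    res = minimize(cheb_tls_eps, x0, args=(A, b), method="Nelder-Mead")
    print("heuristic solution:", res.x, " eps:", res.fun)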