Main Track (MT) Session 6

Time and Date: 11:00 - 12:40 on 12th June 2014

Room: Kuranda

Chair: Andrew Lewis

199 Mechanism of Traffic Jams at Speed Bottlenecks [abstract]
Abstract: In the past 20 years of complexity science, traffic has been studied as a complex system with a large number of interacting agents. Since traffic has become an important aspect of our lives, understanding the traffic system and how it interacts with various factors is essential. In this paper, the interactions between traffic flow and road topology are studied, particularly the relationship between a sharp bend in a road segment and traffic jams. As suggested by Sugiyama [1], when car density exceeds a critical density, the fluctuation in the speed of each car triggers a greater fluctuation in the speed of the car behind. This enhancement of fluctuations leads to the congestion of vehicles. Using a cellular automata model modified from the Nagel-Schreckenberg CA model [2], the simulation results suggest that the mechanism of traffic jams at bottlenecks is similar: instead of directly causing congestion, a bottleneck on the road only causes the local density of traffic to increase, and the resultant congestion is still due to the enhancement of fluctuations. The results of this study open up a large number of possible analytical studies which could serve as grounds for future work.
Wei Liang Quek, Lock Yue Chew
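
The authors' modified model is not reproduced in the abstract; as a point of reference, here is a minimal Python sketch of the standard Nagel-Schreckenberg update rules [2] on a circular road. The parameter values (v_max, p_slow) are illustrative assumptions, not the paper's.

```python
import random

def nasch_step(positions, speeds, road_len, v_max=5, p_slow=0.3):
    """One parallel update of the standard Nagel-Schreckenberg CA
    (illustrative sketch; the paper's bottleneck modification is not
    reproduced here)."""
    n = len(positions)
    order = sorted(range(n), key=lambda i: positions[i])
    new_pos, new_spd = positions[:], speeds[:]
    for idx, i in enumerate(order):
        ahead = order[(idx + 1) % n]                    # next car downstream
        gap = (positions[ahead] - positions[i] - 1) % road_len
        v = min(speeds[i] + 1, v_max)                   # acceleration
        v = min(v, gap)                                 # brake to avoid collision
        if v > 0 and random.random() < p_slow:          # random slowdown (fluctuation)
            v -= 1
        new_spd[i] = v
        new_pos[i] = (positions[i] + v) % road_len
    return new_pos, new_spd
```

A speed bottleneck could then be imposed by lowering v_max over a stretch of cells, which raises the local density there in the way the abstract describes.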
234 Computing, a powerful tool in flood prediction [abstract]
Abstract: Floods have caused widespread damage throughout the world. Modelling and simulation provide solutions and tools enabling us to face this reality, to forecast floods, and to take the necessary preventive measures. One problem that must be handled by physical-system simulators is the uncertainty of input parameters and its impact on output results, which causes prediction errors. In this paper, we address input-parameter uncertainty and provide a methodology to tune a flood simulator and achieve a lower error between simulated and observed results. The tuning methodology, based on a parametric simulation technique, implements a first stage that finds an adjusted set of critical parameters, which is then used in a second stage to validate the predictive capability of the simulator and reduce the disagreement between observed data and simulated results. We concentrate our experiments on three significant monitoring stations, and the improvement over the original simulator values ranges from 33% to 60%.
Adriana Gaudiani, Emilio Luque, Pablo Garcia, Mariano Re, Marcelo Naiouf, Armando De Giusti
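
The abstract does not detail the parametric technique; a minimal Python sketch of the general idea, assuming a black-box simulate function and a grid of candidate parameter values (both hypothetical stand-ins for the flood simulator's interface):

```python
import itertools

def tune(simulate, observed, param_grid):
    """Exhaustive parametric sweep (illustrative sketch): run the simulator
    for every parameter combination and keep the set that minimizes the
    disagreement with the observed series. `simulate` and the parameter
    names are hypothetical, not the paper's actual interface."""
    best_params, best_err = None, float("inf")
    keys = list(param_grid)
    for combo in itertools.product(*(param_grid[k] for k in keys)):
        params = dict(zip(keys, combo))
        simulated = simulate(**params)
        # root-mean-square error between simulated and observed levels
        err = (sum((s - o) ** 2 for s, o in zip(simulated, observed))
               / len(observed)) ** 0.5
        if err < best_err:
            best_params, best_err = params, err
    return best_params, best_err
```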
117 Benchmarking and Data Envelopment Analysis. An Approach Based on Metaheuristics [abstract]
Abstract: Data Envelopment Analysis (DEA) is a non-parametric technique for estimating the current level of efficiency of a set of entities. DEA also provides information on how to remove inefficiency through the determination of benchmarking information. This paper studies DEA models based on closest efficient targets, which correspond to the shortest projection onto the production frontier and allow inefficient firms to find the easiest way to improve their performance. These models have usually been solved by unsatisfactory methods, since all of them are related in some sense to a combinatorial NP-hard problem. In this paper, the problem is approached with metaheuristic techniques. Due to the large number of restrictions, finding feasible solutions to seed the metaheuristic algorithm is itself a difficult problem, so this paper analyzes and compares several heuristic algorithms for obtaining such solutions. Each restriction determines the design of these heuristics, so the problem is considered by adding constraints one by one. Here the problem is presented and studied taking into account 9 of the 14 constraints, and the solution to this relaxed problem is an upper bound of the optimal value of the original problem.
Jose J. Lopez-Espin, Juan Aparicio, Domingo Gimenez, Jesús T. Pastor
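
The DEA model itself is not reproduced in the abstract; the Python sketch below only illustrates the constraint-by-constraint construction idea, with hypothetical constraint predicates standing in for the model's restrictions.

```python
import random

def build_solution(dimension, constraints, max_tries=1000):
    """Illustrative sketch: repeatedly sample a candidate and accept it only
    if it satisfies each constraint in order, so infeasibility is detected
    as early as possible. The DEA model's actual variables and constraints
    are not reproduced here."""
    for _ in range(max_tries):
        candidate = [random.random() for _ in range(dimension)]
        if all(c(candidate) for c in constraints):
            return candidate
    return None  # no feasible candidate found within the budget

# Hypothetical example: two simple constraints, checked one by one.
constraints = [
    lambda x: sum(x) <= 1.0,   # stand-in for an envelopment-type constraint
    lambda x: min(x) >= 0.0,   # non-negativity
]
print(build_solution(dimension=3, constraints=constraints))
```

Feasible candidates produced this way could then seed a metaheuristic (e.g. a genetic algorithm) that searches for the closest efficient target.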
249 Consensus reaching in swarms ruled by a hybrid metric-topological distance [abstract]
Abstract: Recent empirical observations of three-dimensional bird flocks and human crowds have challenged the long-prevailing assumption that a metric interaction distance rules swarming behaviors. In some cases, individual agents are found to be engaged in local information exchanges with a fixed number of neighbors, i.e. a topological interaction. However, complex system dynamics based on pure metric or pure topological distances both face physical inconsistencies in low- and high-density situations. Here, we propose a hybrid metric-topological interaction distance that overcomes these issues and enables a real-life implementation in artificial robotic swarms. We use network- and graph-theoretic approaches combined with a dynamical model of locally interacting self-propelled particles to study the consensus reaching process for a swarm ruled by this hybrid interaction distance. Specifically, we establish exactly the probability of reaching consensus in the absence of noise. In addition, simulations of swarms of self-propelled particles are carried out to assess the influence of the hybrid distance and of noise.
Yilun Shang, Roland Bouffanais
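
One plausible reading of such a hybrid rule, sketched in Python: an agent interacts with at most its k nearest neighbors, but only those within a metric radius. Both parameter values are assumptions for illustration, not the paper's.

```python
import math

def hybrid_neighbors(agents, i, k=7, radius=5.0):
    """Hybrid metric-topological rule (illustrative sketch): agent i interacts
    with at most its k nearest neighbors, restricted to those lying within a
    metric radius. k and radius are assumed values, not the paper's."""
    xi, yi = agents[i]
    dists = sorted(
        (math.hypot(x - xi, y - yi), j)
        for j, (x, y) in enumerate(agents) if j != i
    )
    return [j for d, j in dists[:k] if d <= radius]

def consensus_step(values, agents, k=7, radius=5.0):
    """One round of local averaging over the hybrid interaction graph."""
    new = []
    for i, v in enumerate(values):
        nbrs = hybrid_neighbors(agents, i, k, radius)
        group = [v] + [values[j] for j in nbrs]
        new.append(sum(group) / len(group))
    return new
```

In sparse swarms the metric radius dominates (few neighbors fall within range), while in dense swarms the topological cap k dominates, which is how a hybrid distance avoids the inconsistencies of either pure rule.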
258 Simulating Element Creation in Supernovae with the Computational Infrastructure for Nuclear Astrophysics at nucastrodata.org [abstract]
Abstract: The elements that make up our bodies and the world around us are produced in violent stellar explosions. Computational simulations of the element creation processes occurring in these cataclysmic phenomena are complex calculations that track the abundances of thousands of species of atomic nuclei throughout the star. These species are created and destroyed by ~60,000 thermonuclear reactions whose rates are stored in continually updated databases. Previously, delays of up to a decade were experienced before the latest experimental reaction rates were used in astrophysical simulations. The Computational Infrastructure for Nuclear Astrophysics (CINA), freely available at the website nucastrodata.org, reduces this delay from years to minutes! With over 100 unique software tools developed over the last decade, CINA provides a “lab-to-star” connection. It is the only cloud computing software system in this field, and it is accessible via an easy-to-use, web-deliverable, cross-platform Java application. The system gives users the capability to robustly simulate, share, store, analyze and visualize explosive nucleosynthesis events such as novae, X-ray bursts and (new in 2013) core-collapse supernovae. In addition, users can upload, modify, merge, store and share the complex input data required by these simulations. Presently, we are expanding the capabilities of CINA to meet the needs of our users, who currently come from 141 institutions in 32 countries. We will describe CINA’s current suite of software tools and the comprehensive list of online nuclear astrophysics datasets available at the nucastrodata.org website. This work is funded by the DOE’s Office of Nuclear Physics under the US Nuclear Data Program.
E. J. Lingerfelt, M. S. Smith, W. R. Hix and C. R. Smith
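
For scale, the abundance tracking described in the abstract amounts to integrating a very large system of coupled rate equations. A toy two-reaction Python sketch with made-up species and rates (real nucleosynthesis codes couple thousands of species through ~60,000 rates and use implicit solvers):

```python
def network_step(abundances, rates, dt):
    """Explicit-Euler update of a toy reaction network (illustrative sketch).
    `rates` maps (source, product) -> rate per unit time; the species and
    rate values below are made up, not from any reaction-rate database."""
    dY = {s: 0.0 for s in abundances}
    for (src, dst), rate in rates.items():
        flow = rate * abundances[src]   # destruction of src feeds dst
        dY[src] -= flow
        dY[dst] += flow
    return {s: abundances[s] + dt * dY[s] for s in abundances}

# Hypothetical two-reaction chain: A -> B -> C
Y = {"A": 1.0, "B": 0.0, "C": 0.0}
rates = {("A", "B"): 0.5, ("B", "C"): 0.1}
for _ in range(100):
    Y = network_step(Y, rates, dt=0.01)
print(Y)
```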