Time and Date: 16:20 - 18:00 on 11th June 2014
Room: Bluewater II
Chair: Heiko Aydt
|111|| Analysing the Effectiveness of Wearable Wireless Sensors in Controlling Crowd Disasters [abstract]
Abstract: The Love Parade disaster in Duisburg, Germany led to several deaths and injuries. Disasters like this occur when high crowd densities build up in a limited area. We propose a wearable electronic device that helps prevent such disasters by directing people and thus controlling the density of the crowd. We investigate the design and effectiveness of such a device through an agent-based simulation using the social force model. We also investigate the effects of device failure and of participants not paying attention, in order to determine the critical numbers of working devices and attentive participants required for the device to be effective.
|Teo Yu Hui Angela, Vaisagh Viswanathan, Michael Lees, Wentong Cai|
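The social-force approach used in the simulation above can be illustrated in a few lines. The following is a minimal sketch of a Helbing-style force on a single pedestrian, not the paper's calibrated device model; the parameter values (`v`, `tau`, `A`, `B`) and the simplification that the pedestrian starts at rest are assumptions for illustration only:

```python
import numpy as np

def social_force(pos, goal, others, v=1.3, tau=0.5, A=2.0, B=0.3):
    """Helbing-style social force on one pedestrian (minimal sketch).

    pos, goal : 2D positions; others : positions of nearby pedestrians.
    The driving term accelerates toward the goal at desired speed v
    (pedestrian assumed at rest, so the current-velocity term drops out);
    the repulsive term decays exponentially with inter-pedestrian distance.
    """
    direction = (goal - pos) / np.linalg.norm(goal - pos)
    drive = v * direction / tau                      # acceleration toward goal
    repulse = np.zeros(2)
    for q in others:
        d = pos - q
        dist = np.linalg.norm(d)
        repulse += A * np.exp(-dist / B) * d / dist  # pairwise repulsion
    return drive + repulse
```

Summing this force over all neighbours and integrating it per agent per time step is the core loop of a social-force crowd simulation.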
|204|| Individual-Oriented Model Crowd Evacuations Distributed Simulation [abstract]
Abstract: Emergency plan design is an important problem in building design: people must be evacuated as fast as possible. Evacuation exercises such as fire drills do not create a realistic situation in which to understand the behaviour of people. In crowd evacuations, the complexity and uncertainty of the system increase. Computer simulation allows us to run crowd dynamics models and extract information about emergency situations. Several models address the emergency evacuation problem. Individual-oriented modelling allows us to describe rules for individuals and to simulate the interactions between them. Because of the variation across emergency situations, results have to be statistically reliable, and this reliability increases the computing demand. Distributed and parallel paradigms address this performance problem. In the present work we developed a model to simulate crowd evacuations and implemented two versions of it: one using NetLogo and another using C with MPI. We chose a real environment to test the simulator: building 2 of the Fira de Barcelona, able to hold thousands of people. The distributed simulator was tested with 62,820 runs of 15,000 individuals each in a distributed environment. We show that the simulator achieves linear speedup and scales efficiently.
|Albert Gutierrez-Milla, Francisco Borges, Remo Suppi, Emilio Luque|
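The speedup and efficiency figures such a scaling study reports follow from two standard definitions, sketched below. The timings in the example are hypothetical, purely to illustrate the computation, and are not taken from the paper:

```python
def speedup(t1, tp):
    """Speedup of a parallel run: serial time over parallel time."""
    return t1 / tp

def efficiency(t1, tp, p):
    """Parallel efficiency: speedup normalized by worker count.
    A value near 1.0 over a range of p indicates linear scaling."""
    return speedup(t1, tp) / p

# Hypothetical batch timings (seconds) on 1, 8 and 16 workers:
timings = {1: 6400.0, 8: 820.0, 16: 415.0}
for p, tp in timings.items():
    print(p, round(speedup(timings[1], tp), 2),
          round(efficiency(timings[1], tp, p), 2))
```

Because each evacuation run is independent, a batch of runs is embarrassingly parallel, which is why near-linear speedup is achievable in this kind of study.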
|133|| Simulating Congestion Dynamics of Train Rapid Transit using Smart Card Data [abstract]
Abstract: Investigating congestion in train rapid transit systems (RTS) in today's cities is a challenge compounded by limited data availability and difficulties in model validation. Here, we integrate information from travel smart card data, a mathematical model of route choice, and a full-scale agent-based model of the Singapore RTS to provide a more comprehensive understanding of the congestion dynamics than can be obtained through analytical modelling alone. Our model is empirically validated, and allows for close inspection of the dynamics including station crowdedness, average travel duration, and frequency of missed trains---all highly pertinent factors in service quality. Using current data, the crowdedness in all 121 stations appears to be distributed log-normally. In our preliminary scenarios, we investigate the effect of population growth on service quality. We find that the current population (2 million) lies below a critical point, and that increasing it by more than approximately 10% leads to an exponential deterioration in service quality. We also predict that incentivizing commuters to avoid the most congested hours can bring modest improvements to the service quality provided the population remains under the critical point. Finally, our model can be used to generate simulated data for statistical analysis when such data are not empirically available, as is often the case.
|Nasri Othman, Erika Fille Legara, Vicknesh Selvam, Christopher Monterola|
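The log-normal observation above can be checked on any crowdedness data by fitting in log space: if crowdedness is log-normal, its logarithm is normal, and the sample mean and standard deviation of the logs estimate the distribution's parameters. The sketch below uses synthetic values for 121 stations (not the actual Singapore data; the chosen mu and sigma are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(42)
# Synthetic stand-in for station crowdedness across 121 stations,
# drawn from a log-normal as the abstract reports for the observed data.
crowdedness = rng.lognormal(mean=3.0, sigma=0.8, size=121)

# Under the log-normal hypothesis, log(crowdedness) is normal; its
# sample mean and std estimate the underlying mu and sigma.
log_c = np.log(crowdedness)
mu_hat, sigma_hat = log_c.mean(), log_c.std(ddof=1)
```

On real data, a normality test (or a Q-Q plot) on `log_c` would make the visual "appears log-normal" claim quantitative.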
|177|| A method to ascertain rapid transit systems' throughput distribution using network analysis [abstract]
Abstract: We present a method of predicting the distribution of passenger throughput across stations and lines of a city rapid transit system by calculating the normalized betweenness centrality of the nodes (stations) and edges of the rail network. The method is evaluated by correlating the distribution of betweenness centrality against the throughput distribution, which is calculated using actual passenger ridership data. Our ticketing data is from the rail transport system of Singapore and comprises more than 14 million journeys over a span of one week. We demonstrate that removal of outliers representing about 10% of the stations produces a statistically significant correlation above 0.7. Interestingly, these outliers coincide with stations that opened six months before the time the ridership data was collected, hinting that travel routines along these stations have not yet settled to their equilibrium. The correlation is improved significantly when the data points are split according to their separate lines, illustrating differences in the intrinsic characteristics of each line. The simple procedure established here shows that static network analysis of the structure of a transport network can allow transport planners to predict passenger ridership with sufficient accuracy, without requiring dynamic and complex simulation methods.
|Muhamad Azfar Ramli, Christopher Monterola, Gary Kee Khoon Lee, Terence Gih Guang Hung|
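Normalized node betweenness centrality, the quantity correlated with throughput above, counts the fraction of shortest paths between every station pair that pass through a given station. The brute-force sketch below enumerates shortest paths on a toy network; it is not the paper's pipeline, and a real rail network would use Brandes' algorithm (e.g. via the networkx library) instead:

```python
from itertools import combinations
from collections import deque

def shortest_paths(adj, s, t):
    """All shortest s-t paths in an unweighted graph via BFS layering."""
    dist = {s: 0}
    q = deque([s])
    while q:
        u = q.popleft()
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                q.append(v)
    # Enumerate paths by stepping back one BFS layer at a time.
    def back(v):
        if v == s:
            return [[s]]
        return [p + [v] for u in adj[v] if dist.get(u) == dist[v] - 1
                for p in back(u)]
    return back(t) if t in dist else []

def betweenness(adj):
    """Normalized node betweenness by brute-force enumeration
    (fine for toy networks only)."""
    nodes = list(adj)
    n = len(nodes)
    bc = {v: 0.0 for v in nodes}
    for s, t in combinations(nodes, 2):
        paths = shortest_paths(adj, s, t)
        for p in paths:
            for v in p[1:-1]:          # interior stations on this path
                bc[v] += 1 / len(paths)
    norm = (n - 1) * (n - 2) / 2       # undirected normalization
    return {v: bc[v] / norm for v in nodes}
```

On a hypothetical four-station line A-B-C-D, the interior stations B and C each lie on two of the three relevant shortest paths, giving them normalized betweenness 2/3 while the termini score 0, mirroring how interior interchange stations attract throughput.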
|236|| Fast and Accurate Optimization of a GPU-accelerated CA Urban Model through Cooperative Coevolutionary Particle Swarm Optimization [abstract]
Abstract: The calibration of Cellular Automata (CA) models for simulating land-use dynamics requires formal, well-structured and automated optimization procedures. A typical approach in the literature to the calibration problem consists of using general optimization metaheuristics. However, these often require thousands of runs of the model to provide reliable results, and thus involve remarkable computational costs. Moreover, all optimization metaheuristics are plagued by the so-called curse of dimensionality, that is, a rapid deterioration of efficiency as the dimensionality of the search space increases. Therefore, for models depending on a large number of parameters, the calibration problem requires advanced computational techniques. In this paper, we investigate the effectiveness of combining two computational strategies. On the one hand, we greatly speed up CA simulations by using general-purpose computing on graphics processing units. On the other hand, we use a specifically designed cooperative coevolutionary Particle Swarm Optimization algorithm, which is known for its ability to operate effectively in search spaces with a high number of dimensions.
|Ivan Blecic, Arnaldo Cecchini, Giuseppe A. Trunfio|
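The cooperative coevolutionary PSO idea named above can be sketched as follows: the parameter vector is split into groups, each group gets its own swarm, and particles are evaluated by plugging them into a shared context vector assembled from the other groups' best parts. This is a generic illustration, not the authors' specifically designed algorithm; the sphere objective stands in for the (expensive, GPU-accelerated) CA calibration error, and all settings are illustrative assumptions:

```python
import random

def sphere(x):
    """Hypothetical stand-in for the CA calibration error."""
    return sum(xi * xi for xi in x)

def cc_pso(f, dim, groups=2, swarm=10, iters=80, lo=-5.0, hi=5.0, seed=1):
    """Cooperative coevolutionary PSO sketch: each group of coordinates
    is optimized by its own swarm against a shared context vector."""
    rng = random.Random(seed)
    parts = [list(range(g, dim, groups)) for g in range(groups)]
    context = [rng.uniform(lo, hi) for _ in range(dim)]
    swarms = []
    for idx in parts:
        pos = [[rng.uniform(lo, hi) for _ in idx] for _ in range(swarm)]
        swarms.append({"idx": idx, "pos": pos,
                       "vel": [[0.0] * len(idx) for _ in range(swarm)],
                       "pbest": [p[:] for p in pos],
                       "pbest_f": [float("inf")] * swarm,
                       "gbest": pos[0][:], "gbest_f": float("inf")})

    def evaluate(idx, part):
        x = context[:]
        for j, i in enumerate(idx):
            x[i] = part[j]
        return f(x)

    w, c1, c2 = 0.7, 1.5, 1.5                # standard PSO coefficients
    for _ in range(iters):
        for s in swarms:
            # Refresh the swarm best: the context changed since last pass.
            s["gbest_f"] = evaluate(s["idx"], s["gbest"])
            for k in range(swarm):
                fit = evaluate(s["idx"], s["pos"][k])
                if fit < s["pbest_f"][k]:
                    s["pbest_f"][k], s["pbest"][k] = fit, s["pos"][k][:]
                if fit < s["gbest_f"]:
                    s["gbest_f"], s["gbest"] = fit, s["pos"][k][:]
            # Standard PSO velocity/position update within the subspace.
            for k in range(swarm):
                for j in range(len(s["idx"])):
                    r1, r2 = rng.random(), rng.random()
                    s["vel"][k][j] = (w * s["vel"][k][j]
                                      + c1 * r1 * (s["pbest"][k][j] - s["pos"][k][j])
                                      + c2 * r2 * (s["gbest"][j] - s["pos"][k][j]))
                    s["pos"][k][j] += s["vel"][k][j]
            # Write this swarm's best part back into the shared context.
            for j, i in enumerate(s["idx"]):
                context[i] = s["gbest"][j]
    return context, f(context)
```

The decomposition is what counters the curse of dimensionality: each swarm searches a low-dimensional subspace, while the shared context lets the subproblems cooperate toward a full solution.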