Time and Date: 11:20 - 13:00 on 10th June 2014
Room: Tully II
Chair: Craig Douglas
|427|| DDDAS – Bridging from the Exa-Scale to the Sensor-Scale [abstract]
Abstract: This talk will provide an overview of new opportunities created by DDDAS (Dynamic Data Driven Applications Systems), engendering a new vision for Exa-Scale computing and Big Data. Exa-Scale is considered the next frontier of high-end computational power, and Big Data is seen as the next generation of data-intensive computing. The presentation will discuss new opportunities that exist through DDDAS in synergism with a vision of additional dimensions to the Exa-Scale and Big Data, namely that the next wave of Big Data and Big Computing will result not only from the Exa-Scale frontiers but also from the emerging trend of “ubiquitous sensing” - the ubiquitous instrumentation of systems by multitudes of distributed and heterogeneous collections of sensors and controllers. Undeniably, achieving and exploiting the Exa-Scale will enable larger-scale simulations and complex “systems of systems” modeling, which will produce large sets of computed data contributing to the Big Data deluge and adding to the data avalanche created by large scientific and engineering instruments. The emerging trend of large-scale, ubiquitous instrumentation through multitudes of sensors and controllers creates another dimension to computing and to data, whereby data, and the computations for processing and analyzing such data, will be performed on combinations of sensor collections and higher-performance platforms - including the Exa-Scale. DDDAS provides a driver for such environments and an opportunity for new and advanced capabilities. The DDDAS paradigm, by its definition of dynamically integrating the computational model with the instrumentation aspects of an application system in a feedback control loop, presupposes a unified computational-instrumentation platform supporting DDDAS environments.
In general, this unified computational-instrumentation platform will consist of a range of systems, from high-end (petascale, exascale), mid-range, and personal computers and mobile devices, to instrumentation platforms such as large instruments or collections of sensors and controllers, including networks of large numbers of heterogeneous sensors and controllers. Otherwise stated, in DDDAS the computational and data environments of a given application span a range of platforms from high-end computing to the data-collection instruments - from the exa-scale to the sensor-scale. Consequently, DDDAS environments present unprecedented levels of computational resource heterogeneity and dynamicity, which require new systems software to support the dynamic and adaptive runtime requirements of such environments. In addition to the role of DDDAS in unifying these two extremes of computing and data, there are also technological drivers that lead us to consider the extremes and the range of scales together. Thus far, conquering the exascale has been considered as posing “unique” challenges, ranging from power-efficiency requirements at the multicore unit level, dynamic management of multitudes of such resources for optimized performance, and fault tolerance and resilience, to new application algorithms. However, ubiquitous instrumentation environments comprising sensors (and controllers) have corresponding requirements in terms of power efficiency, fault tolerance, application algorithms dealing with sparse and incomplete data, etc. Moreover, it is quite possible that the same kinds of multicores that will populate exascale platforms will also be the building blocks of sensors and controllers. In fact, it is likely that these sensors and controllers - the new “killer micros” - will drive the technologies at the device and chip levels.
Leveraging common technologies across the range of platforms from the Exa-Scale to the Sensor-Scale is driven not only by the underlying technologies but also by trends in application requirements. Commonality in the building blocks (e.g., at the chip and multicore levels) across the range and the extremes of the computational and instrumentation platforms will simplify the challenges of supporting DDDAS environments. Such considerations create new opportunities for synergistically advancing and expediting progress at the two extreme scales of computing. The talk will address such challenges and opportunities in the context of projects pursuing capability advances through DDDAS, such as those presented in the 2014 ICCS/DDDAS Workshop and elsewhere.
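The feedback control loop at the heart of the DDDAS definition - a computational model corrected at runtime by instrumentation data, which in turn steers the computation - can be illustrated with a minimal sketch. All dynamics, gains, and function names below are invented for illustration and are not from the talk:

```python
import random

random.seed(0)

def simulate(state, dt=1.0):
    """Advance the computational model one step (illustrative dynamics)."""
    return state + 1.0 * dt  # the model assumes a constant drift of 1.0 per step

def read_sensor(true_state):
    """Instrumentation side: a noisy measurement of the real system."""
    return true_state + random.gauss(0.0, 0.2)

def dddas_loop(steps=10):
    model, truth = 0.0, 0.0
    for _ in range(steps):
        truth += 1.3                 # the real system drifts faster than the model assumes
        model = simulate(model)      # forward prediction from the model
        z = read_sensor(truth)       # dynamic data injected into the executing application
        model += 0.5 * (z - model)   # feedback: the measurement corrects the model state
    return model, truth

model, truth = dddas_loop()
print(abs(model - truth))  # far smaller than the 3.0 error an open-loop model would accrue
```

The point of the sketch is the loop structure itself: without the correction step the model error grows without bound, while the dynamically injected data keeps the simulation anchored to the instrumented system.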
|287|| Control of Artificial Swarms with DDDAS [abstract]
Abstract: A framework for incorporating a swarm intelligent system with the Dynamic Data Driven Application System (DDDAS) is presented. Swarm intelligent systems, or artificial swarms, self-organize into useful emergent structures that are capable of solving complex problems, but are difficult to control and predict. The DDDAS concept utilizes repeated simulations of an executing application to improve analytic and predictive capability by creating a synergistic feedback loop. Incorporating DDDAS with swarm applications can significantly improve control of the swarm. An overview of the DDDAS framework for swarm control is presented, and then demonstrated with an example swarm application.
|Robert McCune, Greg Madey|
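The simulate-then-steer loop this abstract describes - repeated simulations of the executing swarm fed back into its control parameters - might look like the following minimal sketch; the swarm surrogate model, the emergent-behavior metric, and all parameter values are invented for illustration:

```python
def swarm_metric(cohesion_gain, agents=50):
    """Toy surrogate simulation: emergent clustering grows with the cohesion
    gain and saturates as agents crowd together (invented dynamics)."""
    return agents * cohesion_gain / (1.0 + cohesion_gain)

def steer_swarm(target=40.0, gain=0.1, lr=0.5, iters=500):
    """DDDAS-style control: repeatedly simulate the swarm, compare the
    emergent behavior to the desired one, and feed the error back into
    the control parameter."""
    for _ in range(iters):
        observed = swarm_metric(gain)            # repeated simulation of the application
        gain += lr * (target - observed) / 50.0  # feedback correction of the control input
        gain = max(gain, 0.0)                    # keep the parameter physically meaningful
    return gain

g = steer_swarm()
print(round(swarm_metric(g), 2))  # the emergent measure settles near the target of 40.0
```

Direct control of each agent is infeasible in a self-organizing swarm; the loop instead adjusts a global parameter and lets the repeated simulations reveal how the emergent structure responds.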
|114|| Multifidelity DDDAS Methods with Application to a Self-Aware Aerospace Vehicle [abstract]
Abstract: A self-aware aerospace vehicle can dynamically adapt the way it performs missions by gathering information about itself and its surroundings and responding intelligently. We consider the specific challenge of an unmanned aerial vehicle that can dynamically and autonomously sense its structural state and re-plan its mission according to its estimated current structural health. The challenge is to achieve each of these tasks in real time---executing online models and exploiting dynamic data streams---while also accounting for uncertainty. Our approach combines information from physics-based models, simulated offline to build a scenario library, together with dynamic sensor data in order to estimate current flight capability. Our physics-based models analyze the system at both the local panel level and the global vehicle level.
|Doug Allaire, David Kordonowy, Marc Lecerf, Laura Mainini, Karen Willcox|
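The offline/online split in this abstract - physics-based models simulated offline into a scenario library, matched online against dynamic sensor data to estimate flight capability - can be sketched as follows. The damage model, capability numbers, and threshold are all illustrative, not taken from the paper:

```python
# Offline: a library of precomputed scenarios (stand-ins for physics-based
# model runs), mapping a structural-damage parameter to flight capability.
scenario_library = [(d / 10.0, 1.0 - 0.8 * (d / 10.0) ** 2) for d in range(11)]

def estimate_capability(sensor_damage_estimate):
    """Online: match the dynamically sensed structural state against the
    offline library and return the capability of the nearest scenario."""
    damage, capability = min(
        scenario_library,
        key=lambda s: abs(s[0] - sensor_damage_estimate),
    )
    return capability

def replan(capability, threshold=0.6):
    """Re-plan the mission when estimated capability drops below a threshold."""
    return "continue mission" if capability >= threshold else "return to base"

print(replan(estimate_capability(0.12)))  # light damage -> continue mission
print(replan(estimate_capability(0.83)))  # heavy damage -> return to base
```

The expensive high-fidelity analysis happens entirely offline; the online step is a cheap lookup, which is what makes the real-time requirement attainable.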
|198|| Model Based Design Environment for Data-Driven Embedded Signal Processing Systems [abstract]
Abstract: In this paper, we investigate new design methods for data-driven digital signal processing (DSP) systems that are targeted to resource- and energy-constrained embedded environments, such as UAVs, mobile communication platforms and wireless sensor networks. Signal processing applications, such as keyword matching, speaker identification, and face recognition, are of great importance in such environments. Due to critical application constraints on energy consumption, real-time performance, computational resources, and core application accuracy, the design spaces for such applications are highly complex. Thus, conventional static methods for configuring and executing such embedded DSP systems are severely limited in the degree to which processing tasks can adapt to current operating conditions and mission requirements. We address this limitation by developing a novel design framework for multi-mode, data-driven signal processing systems, where different application modes with complementary trade-offs are selected, configured, executed, and switched dynamically, in a data-driven manner. We demonstrate the utility of the proposed design methods on an energy-constrained, multi-mode face detection application.
|Kishan Sudusinghe, Inkeun Cho, Mihaela van der Schaar, Shuvra Bhattacharyya|
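The core idea of multi-mode, data-driven switching - application modes with complementary accuracy/energy trade-offs, selected at runtime from current operating conditions - can be sketched as below. The mode table, energy figures, and selection policy are invented for illustration and are not the framework from the paper:

```python
# Illustrative mode table: a high-accuracy mode and a low-power fallback
# with complementary trade-offs (all numbers are made up).
MODES = {
    "high_accuracy": {"accuracy": 0.95, "energy_mj": 12.0},
    "low_power":     {"accuracy": 0.80, "energy_mj": 3.0},
}

def select_mode(battery_mj, frames_remaining):
    """Data-driven mode switching: run the accurate mode only while the
    remaining energy budget can sustain it for the rest of the mission."""
    needed = MODES["high_accuracy"]["energy_mj"] * frames_remaining
    return "high_accuracy" if battery_mj >= needed else "low_power"

print(select_mode(battery_mj=5000.0, frames_remaining=100))  # high_accuracy
print(select_mode(battery_mj=500.0, frames_remaining=100))   # low_power
```

A static configuration would have to commit to one point in this trade-off space at design time; evaluating the policy per frame is what lets the system degrade gracefully as conditions change.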
|46|| A Dynamic Data Driven Application System for Vehicle Tracking [abstract]
Abstract: Tracking the movement of vehicles in urban environments using fixed position sensors, mobile sensors, and crowd-sourced data is a challenging but important problem in applications such as law enforcement and defense. A dynamic data driven application system (DDDAS) is described that tracks a vehicle’s movements by repeatedly identifying the vehicle under investigation from live image and video data, predicting probable future locations of the vehicle, and repositioning sensors or retargeting requests for information in order to reacquire the vehicle under surveillance. An overview of the system is given that includes image processing algorithms to detect and recapture the vehicle from live image data, a computational framework to predict probable vehicle locations at future points in time, and an information- and power-aware data distribution system to disseminate data and requests for information. A prototype of the envisioned system, under development in the midtown area of Atlanta, Georgia in the United States, is described.
|Richard Fujimoto, Angshuman Guin, Michael Hunter, Haesun Park, Ramakrishnan Kannan, Gaurav Kanitkar, Michael Milholen, Sabra Neal, Philip Pecher|
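The predict-and-retask step this abstract describes - projecting probable future locations from the last sighting and pointing sensors at the most likely ones - can be sketched on a toy road network. The network layout, the equal-weight path model, and the sensor budget are all illustrative assumptions, not the paper's computational framework:

```python
from collections import Counter

# Toy road network: intersection -> reachable neighboring intersections.
ROADS = {
    "A": ["B", "C"],
    "B": ["A", "D"],
    "C": ["A", "D"],
    "D": ["B", "C", "E"],
    "E": ["D"],
}

def predict_locations(last_seen, steps=2):
    """Propagate equal-probability moves `steps` intersections ahead to
    rank where the vehicle is most likely to reappear."""
    frontier = Counter({last_seen: 1.0})
    for _ in range(steps):
        nxt = Counter()
        for node, p in frontier.items():
            for neighbor in ROADS[node]:
                nxt[neighbor] += p / len(ROADS[node])
        frontier = nxt
    return frontier

def retask_sensors(last_seen, budget=2):
    """Point the available sensors at the highest-probability intersections."""
    ranked = predict_locations(last_seen).most_common(budget)
    return [node for node, _ in ranked]

print(retask_sensors("A"))  # the two most probable reacquisition points
```

Each reacquisition (or miss) would feed back into the predicted distribution, closing the DDDAS loop between the live image data and the prediction model.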