Collaborative Research: An Interactive Situational Awareness Simulation – A View from the Clouds
Fundamental Research Project Proposal
Sheldon Brown, UCSD, Site Director
Milton Halem, UMBC, Director
Yelena Yesha, UMBC, Site Director
Tom Conte, Georgia Tech, Site Director
Summary
We will create a virtual world simulation, distributed across a heterogeneous compute infrastructure, to facilitate dynamic analysis of complex situations. The virtual world will be built from a variety of data sources, ranging from high-fidelity, well-correlated datasets to data mined from internet feeds and crowd-sourced updates from commodity devices such as cell phones and digital cameras.
We will integrate heterogeneous computing resources to enable applications that provide robust representations of dynamic situations, including those with tragic human and environmental impact, with data derived from a multitude of sources and delivered in a timely manner to decision makers and to citizens at large.
Working collaboratively with our industry partners, who have joined us in planning this project and have committed to participating in its execution, will extend our center's capabilities. It positions us to begin addressing fundamental research problems of societal relevance that require longer-term development, through the effective integration of our broad informational science base with a rich variety of computing resources that are increasingly heterogeneous in their architecture, distribution, and organization.
This project will allow us to synthesize current best practices in the divergent areas of multi-core computing, data assimilation models, and knowledge representation, while exposing areas that require new approaches to better utilize emerging computing paradigms in addressing urgent, broad social needs.
We will develop the methods by which the multi-user virtual world can maintain a coherent representation of assets that differ in type and scale. Distributed compute services will integrate these divergent assets: assimilating streaming multi-media data in real time, meta-tagging it into real-world socio-physical models and distributed media databases, and creating specific virtual world experiences for distributed viewers and decision makers. These users' activities will in turn feed back into the virtual world, making it a scenario-testing platform that supports more effective decisions in event-response planning.
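The assimilation step described above can be sketched in miniature: each incoming multi-media item is meta-tagged with provenance (source, location, ingest time) before being routed into the distributed media database. This is a minimal illustrative sketch, not the project's implementation; the field names and the in-memory "database" list are assumptions.

```python
import time

def meta_tag(item, source):
    """Attach provenance metadata to a raw incoming item (hypothetical schema)."""
    return {
        "payload": item["payload"],
        "location": item.get("location"),   # e.g. GPS coordinates from a cell phone
        "source": source,
        "ingested_at": time.time(),
    }

def assimilate(stream, source, database):
    """Meta-tag every item in an incoming stream and store it."""
    for item in stream:
        database.append(meta_tag(item, source))

db = []
assimilate([{"payload": "photo-001", "location": (18.54, -72.34)}],
           source="cell-phone", database=db)
print(db[0]["source"])   # cell-phone
```

In a deployed system the `database` list would be replaced by writes to the distributed media database, and the stream by a live device feed.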
Each center site will bring specific expertise to bear: UCSD will extend its work in virtual world client-server architectures; UMBC will develop new techniques in multi-media database management to encompass the emerging types of data, along with a methodology for cloud hosting and for analyzing network traffic as an indicator of event conditions; Georgia Tech will undertake asset analysis and network data mining. In this first year, we will begin by collecting pre-event databases to form the baseline virtual world, then update it with static data at various time increments, up to the point of integrating real-time data by year end.
[Figure: LiDAR scan of Torrey Pines State Beach, by Falko Kuester, UCSD]
Users interact with the data within the virtual world, creating, correlating, validating, and annotating it to derive consensus scenarios. Visual semantics allow a wide range of data qualifications for scenario inclusion.
User Participation, Extensibility and Coherency
Distributed Multi-Media Database Management
Asset Analysis, Integration and Computation
Quarter 1: Implement a distributed heterogeneous cloud computing system based on the open-source Apache Hadoop stack, consisting of the Hadoop Distributed File System (HDFS), HBase for querying large image databases, and MapReduce for content search and query.
Quarter 1 Milestone: Import large volumes of Haiti earthquake data for storage in HBase.
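The MapReduce content-search pattern named in the Quarter 1 plan can be illustrated without a Hadoop cluster: a map phase emits key-value pairs from each record, a shuffle groups them by key, and a reduce phase aggregates each group. This is a minimal pure-Python sketch of the pattern only; the record fields ("region", "tags") are hypothetical stand-ins for the image metadata stored in HBase.

```python
from collections import defaultdict

def map_phase(record):
    """Map: emit a (tag, 1) pair for each tag attached to an image record."""
    for tag in record["tags"]:
        yield (tag, 1)

def run_mapreduce(records):
    """Run map, shuffle, and reduce in-process to count images per tag."""
    groups = defaultdict(list)          # shuffle: group values by key
    for record in records:
        for key, value in map_phase(record):
            groups[key].append(value)
    return {key: sum(values) for key, values in groups.items()}  # reduce

records = [
    {"region": "Port-au-Prince", "tags": ["collapsed-building", "road"]},
    {"region": "Leogane", "tags": ["road"]},
    {"region": "Jacmel", "tags": ["collapsed-building"]},
]
print(run_mapreduce(records))
# {'collapsed-building': 2, 'road': 2}
```

On the actual cluster the same map and reduce functions would run as Hadoop jobs over HBase-resident data, with the framework handling the shuffle across nodes.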
Quarters 2–3: Extend the virtual scalable city world with real-world data.
Quarter 2 Milestone: Visualize static Haiti earthquake data from the distributed cloud.
Quarter 3 Milestone: Distribute visualization to multiple simultaneous users.
Quarter 4: Demonstration scenario with advanced GPS-equipped multi-media mobile devices.
Quarter 4 Milestone: Test quality-control features on human-acquired event data.
Quarter 4: Test uploading of streaming data from a variety of devices.
Quarter 4: Implement user data-refinement annotations within the virtual world.
Quarter 4 Milestone: Compare user-refined and unrefined models through external analysis.
Quarter 4: Run simulation extraction scenarios for an initial proof-of-concept demonstration, with road re-development as the initial test case.