
  1. International Technology Alliance In Network & Information Sciences City University of New York Research Team September 20, 2006

  2. Agenda • City University of New York Overview • ITA Faculty Research: Zhanyang Zhang, Simon Parsons, Abbe Mowshowitz, Ping Ji, Shuqun Zhang, Subash Shankar, Jinwoo Kim, Kent Boklan, Ted Brown, Amotz Bar-Noy

  3. CUNY • Nation’s largest urban public university • 12 senior colleges • 6 community colleges • Numerous professional schools, incl. the Graduate School and University Center, the Graduate School of Journalism, the Law School, the School of Professional Studies and the Sophie Davis School of Biomedical Education • Serves 220,000 degree-credit students and more than 200,000 adult, continuing and professional education students • Traces beginnings to the founding in 1847 of the Free Academy, which later became The City College, the first CUNY college

  4. Students • Extremely diverse student body • 31% African American • 29% white • 26% Hispanic • 15% Asian • Speak 115 native languages in addition to English and represent 164 countries • 43% of first-time freshmen are born outside of the U.S. mainland • 62% of students are female and one-third are 25 or older • One of the nation’s leading producers of African-American and Hispanic engineers and physicians. • Among top sources of doctoral, baccalaureate and master’s degrees earned by minority students in all disciplines

  5. Programs & Faculty • Degrees and Programs • Approximately 1,400 academic programs • More than 200 majors overall, more than 100 graduate degree majors • Distinguished Faculty • 6,100 full-time teaching faculty • Spearhead more than 100 research centers • Have won awards and prizes that include the Guggenheim Fellowship, Pulitzer Prize, Academy Award, MacArthur Foundation “Genius Award”

  6. CUNY graduates include: 12 Nobel laureates--10 scientists and two economists--among the highest number from any public university in the country A U.S. Secretary of State, a Supreme Court Justice, mayors, members of Congress, state legislators, an astronaut, actors, singers, composers, writers and inventors More top U.S. corporate executives earned their bachelor’s degrees at CUNY than at any other university in the country At least one-third of college-educated New Yorkers are CUNY graduates Distinguished Alumni

  7. The Graduate Center • CUNY's doctorate-granting institution and the only such publicly supported institution in New York City • 32 doctoral programs • a faculty of more than 1,700 • approximately 4,000 doctoral students • Founded in 1961 • Moved in Fall 1999 into a landmark building redesigned to meet the needs of a 21st-century institution

  8. The Graduate Center . • Unique consortial model for doctoral education: • Each program draws on faculty expertise from across the CUNY campuses • 10% of faculty based at The Graduate Center • CUNY's senior colleges provide 90% of faculty needs

  9. CUNY ITA Research Team • 12 computer science faculty representing 8 different CUNY colleges: Brooklyn College, The City College of New York, Graduate Center, Hunter College, John Jay College of Criminal Justice, Lehman College, Queens College, College of Staten Island • Involved in 8 of the 12 research projects

  10. Project 1: An Integrated Solution toward Sensor Location, Coverage Verification and Energy Conservation for Wireless Sensor Network Applications. Zhanyang Zhang, College of Staten Island

  11. Sensor Location Tracking • Problem Statement: • Sensors are randomly deployed over an open area by flying objects or artillery shells to monitor military targets. • Sensors might not be equipped with GPS and may be unaware of their own locations. • Applications require correlating sensor data with sensor locations in order to interpret the data in a meaningful way. • Sensor Location Solution Requirements: • GPS-free (low cost, small size) • Low overhead (communication, processing, energy consumption) • Reasonable accuracy • Robust and reliable

  12. Signal Stimulation Model (SSM) [Figure: a flying object projects a laser beam onto a virtual grid of numbered cells (1–9); a cluster head and cluster members within a sensing region report to a base station with a wireless or wired connection to the Internet]

  13. Signal Stimulation Model (2)

  14. SSM Model Assumptions • Assume all sensors are homogeneous and non-mobile, with optical sensing ability. • Let Rs be the sensing radius that a sensor can cover. • Let Rc be the communication radius that a sensor can cover. • Let D be the width of a square cell. • If Rs² ≥ 2D², then a sensor anywhere in a cell has sensing coverage of the whole cell. • If Rc ≥ 2Rs, then a sensor anywhere in a cell can communicate with a sensor anywhere in a neighboring cell. [Figure: a four-cell grid annotated with Rs, Rc and D]
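The two geometric conditions above can be checked mechanically. A minimal sketch (function names are ours, not from the slides) that computes the smallest radii satisfying them for a given cell width:

```python
import math

def min_radii(cell_width: float) -> tuple[float, float]:
    """Smallest sensing/communication radii satisfying the SSM assumptions.

    Rs**2 >= 2 * D**2  ->  a sensor anywhere in a D x D cell covers the cell
    Rc >= 2 * Rs       ->  it can reach any sensor in a neighboring cell
    """
    rs = math.sqrt(2) * cell_width   # cell diagonal
    rc = 2 * rs
    return rs, rc

def satisfies_ssm(rs: float, rc: float, cell_width: float) -> bool:
    """True if the given radii meet both SSM assumptions."""
    return rs**2 >= 2 * cell_width**2 and rc >= 2 * rs
```

For D = 1 this yields Rs ≈ 1.414 and Rc ≈ 2.828.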

  15. SCLA Algorithm • Since the deployed region is divided into a virtual grid, every point in the region can be represented by a pair of (x, y) coordinate values. • A sensor has three possible states: U (unknown), H (cluster head) and M (cluster member). Initially all sensor states are set to U. • After the sensors are deployed, an object flies over the region and projects a laser beam at the center of a cell (Xc, Yc). Nearby sensors sense the signal; readings are stronger for sensors closer to the projected beam. • The sensor with the strongest reading is identified as the cluster head (state = H). All sensors with a reading greater than λ (the λ-cut) that are one hop away from the cluster head become members of the cluster (state = M).
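As a rough illustration of the clustering step, the λ-cut rule can be sketched as follows; the data layout (dictionaries of readings and positions) and function name are assumptions, and one-hop reachability is approximated by distance ≤ Rc:

```python
def scla_cluster(readings, positions, lam, rc):
    """One SCLA round for a single stimulated cell (illustrative sketch).

    readings:  {sensor_id: signal strength sensed from the laser spot}
    positions: {sensor_id: (x, y)}
    lam:       the lambda-cut threshold on readings
    rc:        communication radius (one hop)
    Returns (head_id, set of member ids); all other sensors stay in state U.
    """
    head = max(readings, key=readings.get)            # strongest reading -> state H
    hx, hy = positions[head]
    members = {
        s for s, r in readings.items()
        if s != head and r > lam                      # passes the lambda-cut
        and (positions[s][0] - hx) ** 2 + (positions[s][1] - hy) ** 2 <= rc ** 2
    }                                                 # within one hop -> state M
    return head, members
```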

  16. Scalability and Performance • Let N be the total number of sensors deployed. • Let M be the total number of cells. • Let n(i) be the number of sensors in cell i. • Let L be the largest number of communication hops from a cluster head to the closest base station. • The cost of the SCLA algorithm in terms of transmitted messages: Cost(L, M, N) ≤ Σ_{i=1..M} n(i) + M·L, where Σ_{i=1..M} n(i) = N. • Assuming sensors are uniformly deployed, Cost(L, M, N) ≤ M·(N/M) + M·L = N + M·L. • The cost has an upper limit of O(N) when M and L are significantly smaller than N (this was validated by simulation results).
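The message-cost bound is simple enough to state as code. A sketch (names are ours) that evaluates N + M·L from per-cell sensor counts:

```python
def scla_cost_upper_bound(sensors_per_cell, max_hops):
    """Upper bound on messages transmitted by the SCLA algorithm.

    Cost(L, M, N) <= sum_{i=1..M} n(i) + M*L = N + M*L,
    which is O(N) when M and L are much smaller than N.
    """
    n = sum(sensors_per_cell)      # N: total sensors deployed
    m = len(sensors_per_cell)      # M: number of cells
    return n + m * max_hops

# e.g. scla_cost_upper_bound([10, 12, 8], 4) -> 30 + 3*4 = 42
```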

  17. A Cluster-Based Energy-Efficient Data Query Protocol • Example query: Select temperature, humidity From north-west corner Where temperature > T and humidity < H Duration 48 hours Period every hour Starts 22:00:00 June 8th, 2006
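The query on this slide can be pictured as a small structured object handed to the protocol. A hypothetical sketch; the field names are illustrative, not the protocol's actual message format:

```python
from dataclasses import dataclass

@dataclass
class SensorQuery:
    """In-memory form of the declarative query shown on the slide."""
    select: list[str]     # attributes to sample, e.g. ["temperature", "humidity"]
    region: str           # spatial scope of the query
    where: str            # predicate evaluated against readings
    duration_hours: int   # how long the query stays active
    period_hours: int     # sampling period
    starts: str           # absolute start time

q = SensorQuery(
    select=["temperature", "humidity"],
    region="north-west corner",
    where="temperature > T and humidity < H",
    duration_hours=48,
    period_hours=1,
    starts="2006-06-08 22:00:00",
)
```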

  18. Sensor Coverage Verification & Maintenance • Problem Statement: • K-coverage problem: every point in the monitoring area has to be covered by at least K sensors at all times. • Need ways to verify coverage after sensor deployment, to detect uncovered or insufficiently covered spots. • Unexpected sensor failures can compromise the coverage requirement during operation. • Sensor Coverage Solution Requirements: • Perform post-deployment assessment to verify coverage. • Provide warnings when the coverage requirement is compromised. • Maintain required coverage over the lifetime of operation. • Scalable with low overhead. • Robust and reliable.

  19. Cluster Based K-Coverage Analysis • Definition 1: The membership degree of a cluster is the number of sensors in the cluster, including the head. • Theorem 1: For a cluster(i) formed at cell(i) according to the SCLA algorithm, if the membership degree of cluster(i) is no less than K during the lifetime of operation, then the area within cell(i) is K-covered. • Theorem 2: Given a deployed region R enclosed by a virtual grid G, if for every cell in G there is at least one cluster formed at that cell with membership degree no less than K during the lifetime of the operation, then the deployed region R is K-covered.
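Theorem 2 reduces K-coverage verification to a per-cell membership check, which can be sketched directly (the data layout and function name are assumptions; every grid cell must appear as a key):

```python
def region_k_covered(clusters_per_cell, k):
    """Check Theorem 2: the region is K-covered if every cell hosts at
    least one cluster whose membership degree (head + members) is >= K.

    clusters_per_cell: {cell_id: [membership degrees of clusters at that cell]}
    """
    return all(
        any(degree >= k for degree in degrees)
        for degrees in clusters_per_cell.values()
    )
```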

  20. SCMA Algorithm • Based on the SSM model and the clusters formed by SCLA • K-coverage verification (deterministic model) • Sensor Cluster Maintenance Algorithm (SCMA): • Assume redundant sensor deployment in each cell. • A subset of sensors, Sa, in a cluster is in active mode to support operation with K-coverage (|Sa| ≥ K). • The remaining sensors are in sleep mode to conserve energy. • Sleeping sensors wake up periodically to replace active nodes if necessary. • Sensor duty cycles are coordinated by the cluster head: • Active Mode – perform full duty of sensing and communication. • Sleep Mode – shut down the radio and sensing unit to conserve energy.
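The maintenance loop run by a cluster head can be sketched as a single round of replacement; the function and data layout are illustrative, not the protocol's actual messages:

```python
def scma_maintain(active, sleeping, failed, k):
    """One SCMA maintenance round at a cluster head (illustrative only).

    active/sleeping: sets of live sensor ids; failed: ids that died this round.
    Wakes sleeping members until at least k sensors are active again.
    Returns the new (active, sleeping) sets.
    """
    active = set(active) - set(failed)        # drop dead/depleted actives
    sleeping = set(sleeping) - set(failed)
    while len(active) < k and sleeping:
        active.add(sleeping.pop())            # wake a sleeping member
    return active, sleeping
```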

  21. Sensor State Transitions in SCMA [Figure: state-transition diagram among (1) Sleeping Members, (2) Active Members, (3) Cluster Head, and (4) Base Station]

  22. Messages Defined in SCMA

  23. Sensor Network Lifetime Expectancy • Problem Statement: • Network lifetime is the time interval during which the network can perform required operations that satisfy QoS constraints. • Power depletion and random sensor failures can impact network lifetime. • Need to estimate the network lifetime expectancy with QoS constraints before deployment. • Need to estimate the number of sensors to be deployed (budget). • Network lifetime is a variable depending on sensor density, battery capacity, sensor energy consumption rate, QoS constraints, and sensor random failure rate. • Solution Requirements: • Practical energy model • Support multiple energy-efficient operation protocols • Flexibility for different QoS requirements (deterministic, differential & probability coverage) • Scalability, low overhead cost, robust and reliable

  24. Sensor Network Lifetime Expectancy Model • Focus on prolonging network lifetime instead of individual sensor lifetime. • Assume high-density deployment (sensor redundancy). • Use the previous work on SSM and SCLA to form initial clusters. • Cluster-based sensor failure and energy model analysis. • Enhanced SCMA algorithm to support flexible QoS constraints and duty cycle schedules. • Goal: achieve linear or "near linear" scalability and performance between network lifetime and sensor population under a given QoS constraint.

  25. A Markov Chain Sensor Failure Model • A Markov birth-death chain for sensor clusters • A probability matrix • How to estimate the lifetime of a sensor cluster
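The slide names the ingredients without details. One standard way to estimate a cluster's lifetime from such a chain is the expected time to absorption in the "dead" state, computed from the fundamental matrix; the state encoding and example matrix below are our assumptions, not from the talk:

```python
import numpy as np

def expected_steps_to_absorption(P, start, absorbing):
    """Expected number of steps until a Markov chain started in `start`
    reaches an absorbing state, via the fundamental matrix N = (I - Q)^-1.

    P: full one-step transition matrix (rows sum to 1)
    absorbing: indices of absorbing states (e.g. "cluster dead")
    """
    absorbing = set(absorbing)
    trans = [i for i in range(len(P)) if i not in absorbing]
    Q = P[np.ix_(trans, trans)]                 # transitions among live states
    N = np.linalg.inv(np.eye(len(trans)) - Q)   # fundamental matrix
    return N.sum(axis=1)[trans.index(start)]    # expected absorption time
```

For a pure death chain with states (0 = dead, 1, 2) and per-step death probability 0.5, a cluster starting with membership 2 survives 4 steps in expectation.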

  26. Project 10 Task: Coordination Techniques Simon Parsons Brooklyn College

  27. The world is a collection of interacting autonomous agents. Agents interact with their environment and each other. Multi-agent systems

  28. Range of techniques • The literature contains a number of techniques for coordination: • social norms • environmental cues • signal broadcasting • market mechanisms • negotiation • argumentation • Individual techniques have been studied in depth. • Little work on how to choose between them.

  29. Current work explores how to choose between techniques. Examine the effect of different approaches. Two different scenarios: de-mining urban search and rescue Evolutionary lifecycle: explore landscape parameterize techniques evolve new mechanisms Exploring tradeoffs

  30. Urban search and rescue

  31. Project 10, Task 3: Resource Allocation Research. Abbe Mowshowitz, City College of New York (presented by Simon Parsons)

  32. SWITCHING MODEL OF A TASK • SATISFICING CRITERIA: e.g., quality, reliability, cost • SWITCHING: dynamic (re)allocation of satisfiers to requirements [Figure: REQUIREMENTS matched to SATISFIERS through the switch]

  33. Switching model is applicable to any goal oriented task Flexible and responsive compared to fixed assignment strategy Can be used in real time management or for simulating alternative strategies ADVANTAGES

  34. Requirements and satisfiers can change over time Allocation procedure and assignment criteria are subject to change Model accommodates dynamic response to changing conditions FEATURES

  35. Problem decomposition identification of task requirements Agent negotiation specification of satisfiers Switching = dynamic resource allocation FRAMEWORK FOR COLLABORATIVE MECHANISMS

  36. REPRESENTING A VIGNETTE (01 BINNI) • SATISFICING CRITERIA: max effectiveness/risk • SWITCHING: (re)allocation based on availability and record • REQUIREMENTS: components of mission plan • SATISFIERS: available resources

  37. RESEARCH IN COMING YEAR • Fine-tune the model for representing missions: • work out details of vignettes • specify parameter sets for requirements and satisfiers • Incorporate dynamic resource allocation algorithms into the model

  38. Project 7: QoI of Sensor Data, Task 1: QoI Definition and Transformation. Ping Ji, John Jay College of Criminal Justice

  39. Quality of Information • Measure of confidence placed on information provided to decision makers • Primal data flows/streams (raw data) • Aggregated data flows/streams • Derived data flows/streams [Figure: Sensor Network A (dataType_A) through Sensor Network N (dataType_N) feed aggregation points in a data collection layer; an aggregation layer combines the flows; an inference layer derives high-level knowledge for the decision maker(s)]

  40. My Focus • Data Cleansing Problem • QoI Tomography

  41. Data Cleansing Problem • Motivation: • errors occur during sensor data collection (i.e., sensing), data transmission, data fusion, and data processing and integration • limited work has been done on intelligently identifying and repairing "dirty data" • Approaching the problem: • how sensor data are correlated: spatial, temporal, application-dependent • sensor data continuity

  42. Simple problem formulation • We model the vector of measured (sensing) data as the sum of: the vector of data of interest (true values); a vector of measurement noise (inevitable measurement imperfection); and a vector of dirty data (e.g., due to hardware/software malfunction). • Measurement data have redundancy and (spatial & temporal) correlation, which capture relationships among the different measurements. • However, measured data do not always pass the sanity check implied by that correlation. • Objective: identify the true values, the noise, and the dirty data from the measurements. • Approach: • Iterative data cleansing: solve for each element of the data vector in turn, treating the rest as true values • greedy approach: apply the best "fix" first • Distinguish noise from error (dirty data): fit against the noise model; "unexplainable" erroneous data are dirty • Related study and collaboration: Project 7, Task 3 – Data Calibration
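The iterative, greedy cleansing loop described above might look like this sketch; `predict` stands in for the (unspecified) correlation model and `is_plausible` for the noise-model sanity check, and both are assumptions:

```python
def greedy_cleanse(y, predict, is_plausible, max_fixes=10):
    """Iterative greedy data-cleansing sketch.

    y:                    list of measured values
    predict(y, i):        correlation-model estimate of element i from the others
    is_plausible(v, est): True if v - est looks like measurement noise
                          rather than dirty data (the noise-model check)
    Repeatedly replaces the worst-explained element with its model estimate,
    treating the remaining elements as true values.
    """
    y = list(y)
    dirty = []
    for _ in range(max_fixes):
        # rank elements by how far they are from the model's prediction
        residuals = [(abs(v - predict(y, i)), i) for i, v in enumerate(y)]
        _, i = max(residuals)
        if is_plausible(y[i], predict(y, i)):
            break                       # residuals now fit the noise model
        dirty.append(i)                 # unexplainable -> dirty data
        y[i] = predict(y, i)            # apply the best "fix" first
    return y, dirty
```

With a mean-of-the-others predictor, an outlier such as 9.0 among values near 1.0 is flagged as dirty and replaced by the model estimate.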

  43. QoI Tomography • Motivation: • Where is the quality of information compromised? • Is a data fusion node malfunctioning? • Goals: • identify the "lossy" components (e.g., a lossy transmission channel) • incorporate QoI tomography results into the data model for data cleansing or decision making: data that traverse a "lossy" component receive a low fidelity level in future processing • Network tomography study: lossy internal components are inferred from end-to-end measurements.

  44. Projected timeline of work • Rest of 1st year: • Oct. 06 – Jan. 07: data cleansing problem specification and formulation • Feb. – May 07: data cleansing problem solutions and publishable results • 2nd year and after: • Continued work on data cleansing • Begin problem formulations and resolutions for QoI Tomography

  45. References • [Jeffery] S. Jeffery, G. Alonso, M. Franklin, W. Hong, and J. Widom, "Declarative Support for Sensor Data Cleaning," Proc. of Int'l Conf. on Pervasive Computing, 2006. • [Elnahrawy] E. Elnahrawy and B. Nath, "Cleaning and Querying Noisy Sensors," Proc. of Int'l Workshop on Wireless Sensor Networks and Applications (WSNA), 2003. • [Tan] Y. L. Tan, V. Sehgal, and H. H. Shahri, "SensoClean: Handling Noisy and Incomplete Data in Sensor Networks Using Modeling," online technical report, University of Maryland. • [Zhao] Q. Zhao, Z. Ge, J. Wang, and J. Xu, "Robust Traffic Matrix Estimation with Imperfect Information: Making Use of Multiple Data Sources," Proc. of ACM Sigmetrics, 2006. • [Paskin] M. Paskin and C. Guestrin, "A Robust Architecture for Distributed Inference in Sensor Networks," Proc. of IEEE IPSN, 2005. • [Castro] R. Castro, M. Coates, G. Liang, R. Nowak, and B. Yu, "Network Tomography: Recent Developments," Statistical Science, pp. 499-517, 2004. • [Ramanathan06] N. Ramanathan, L. Balzano, M. Burt, D. Estrin, T. Harmon, C. Harvey, J. Jay, E. Kohler, S. Rothenberg, and M. Srivastava, "Rapid Deployment with Confidence: Calibration and Fault Detection in Environmental Sensor Networks," UCLA Center for Embedded Networked Sensing Technical Report #62, April 2006. • [Byckovskiy] V. Byckovskiy, S. Megerian, D. Estrin, and M. Potkonjak, "A Collaborative Approach to In-place Sensor Calibration," Proc. of IPSN, 2003. • [Zhao01] J. Zhao, R. Govindan, and D. Estrin, "Sensor Network Tomography: Monitoring Wireless Sensor Networks," student poster, ACM Sigcomm, 2001.

  46. Project 12: Ontology Validation. Subash Shankar, Hunter College

  47. Project 12 Tasks • Ontologies for DCPDM: • Construction • Re-Use • Semantic Expressivity • Cultural Variance • Validation • Shared Situation Awareness and Understanding • Planning and Decision Making

  48. Ontology Validation • Formalisms and mechanisms to show validity of ontologies: • Consistency • Completeness • Quality • [Other] properties of ontologies • All of the above can be intra- or inter-ontology

  49. Distributed ontologies Cross-cultural nature of ontologies “Lightweight” formal methods Challenges

  50. Identify validation tasks difficult to perform using existing tools Design suitable formalisms for capturing ontologies Develop validation techniques Explore (and exploit) links with other research projects Tasks