
Fusing physical and cognitive spaces: Using wireless networked sensors to assess the who, what, where, when, and how of student learning

Gregory K. W. K. Chung, UCLA/CRESST

Mani B. Srivastava, Department of Electrical Engineering, UCLA

Annual Conference of the National Center for Research on Evaluation, Standards, and Student Testing (CRESST)

September 14-15, 2000

Los Angeles, CA

Measuring Behavior
  • Current techniques
    • Real-time observation with sampling
    • Observation of videotaped or audiotaped data
    • Characteristics
      • Are time-consuming and prone to error
      • Rarely capture temporal properties of behavior
      • Major advantage: human-in-the-loop categorization of observations
Measuring Behavior
  • Sensor-based techniques
    • Computationally measure physical properties of a person and related objects
    • Computationally derive observations from sensor data
    • Vast improvement in observation capabilities
      • Scalability (high number of observations)
      • Efficiency (more information per unit cost)
      • Timeliness (rapid turnaround time)
      • Accuracy
Measuring Behavior
  • Sensor-based techniques (continued)
    • Measuring the who, what, where, when, and how of human-human and human-object interactions
    • Key challenges:
      • Develop algorithms that aggregate sensor data into measures that accurately capture the construct of interest, are meaningful, are credible, and are in a form usable by different end users
      • Relate behavioral measurements to cognitive processes and task outcomes
      • Approximate the 24/7 human observer
Wireless Networked Sensors
  • Wireless networked sensors
    • Integrate sensing and short-range communication functions in a single unit
      • Low-power consumption (long operational life)
      • Small form factor (embed in everyday objects)
      • RF (avoids line-of-sight problems)
    • Tetherless bi-directional connection to the Internet
      • Remote measurement and control capability
      • Embed “intelligence” and interactivity in everyday objects
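
As a rough illustration of how such a node might report data upstream, the sketch below packages one measurement as JSON and sends it to a collection point over UDP. The endpoint address, message layout, and all names are assumptions for illustration; the presentation does not describe a wire protocol.

# Hypothetical sketch: a wireless networked sensor node reporting one reading to a
# base station over the Internet. Endpoint, message layout, and names are assumptions.
import json
import socket
import time

BASE_STATION = ("198.51.100.10", 9000)  # assumed collection endpoint

def report_reading(node_id, sensor, value):
    """Package one measurement as JSON and send it over UDP (fire-and-forget)."""
    message = {
        "node": node_id,
        "sensor": sensor,          # e.g., "position", "orientation", "proximity"
        "value": value,
        "timestamp": time.time(),
    }
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.sendto(json.dumps(message).encode("utf-8"), BASE_STATION)

# Example: a node attached to student S1 reports a position fix.
report_reading("node-07", "position", [2.4, 1.1, 0.0])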
Sample of Sensor Types
  • Acoustic
  • Light
  • Image/video
  • Touch/pressure
  • Temperature
  • Identification
  • Position (x,y,z)
  • Proximity (x’,y’,z’)
  • Orientation (360°)
  • Movement (acceleration)
Potential Application
  • Describing interaction
    • Student-object
    • Student-student
    • Student-teacher
    • Teacher-object
  • Triangulate multiple measures to successively refine inferences about interaction
Example: Deriving observations of small group object categorization task

Object and student position, student orientation, and object-proximity data allow the following questions to be answered:

[Figure: diagram of students S1, S2, and S3 around the objects being categorized]

1. How many objects are categorized correctly by shape? (12: squares, triangles, circles)

2. What object are students focused on? (rhombus)

3. How many objects remain to be categorized? (1: rhombus)
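
The "focused on" question above could, for instance, be derived from position and orientation data with a simple heading test. The sketch below is a minimal illustration under assumed 2-D coordinates and an assumed 20-degree tolerance; the names, data, and threshold do not come from the presentation.

# Hypothetical sketch: infer which object the group is focused on from 2-D
# student positions and heading angles. Names, data, and threshold are assumptions.
import math

def is_facing(student_pos, heading_deg, object_pos, tolerance_deg=20.0):
    """True if the object lies within tolerance_deg of the student's heading."""
    dx = object_pos[0] - student_pos[0]
    dy = object_pos[1] - student_pos[1]
    bearing = math.degrees(math.atan2(dy, dx)) % 360
    return abs((bearing - heading_deg + 180) % 360 - 180) <= tolerance_deg

def shared_focus(students, objects):
    """Return the object faced by the most students (simple vote across the group)."""
    votes = {name: 0 for name in objects}
    for pos, heading in students.values():
        for name, obj_pos in objects.items():
            if is_facing(pos, heading, obj_pos):
                votes[name] += 1
    return max(votes, key=votes.get)

# Toy data: three students around a table, all oriented toward the rhombus.
students = {"S1": ((0.0, 0.0), 45.0), "S2": ((2.0, 0.0), 135.0), "S3": ((1.0, 2.0), 270.0)}
objects = {"rhombus": (1.0, 1.0), "square": (3.0, 0.0)}
print(shared_focus(students, objects))  # -> "rhombus"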

Example: Deriving observations of small group instruction

Position, orientation, and acoustic data allow the following questions to be answered:

[Figure: classroom diagram of teacher T and students S1-S4]

1. Who is paying attention to the teacher? (S1, S2)

2. Which students are participating? (S1, S2)

3. What is the nature of the utterance? (S2: question)

4. Which students are not paying attention or participating? (S3, S4)
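
Answering the attention and participation questions above could combine two modalities at once: orientation toward the teacher and acoustic activity at each student's node. The sketch below is an illustration only; the heading tolerance, the speech-energy threshold, and all names are assumptions rather than anything specified in the presentation.

# Hypothetical sketch: fuse orientation and acoustic data into per-student
# attention and participation flags. Thresholds and names are assumptions.
import math

def facing_teacher(student_pos, heading_deg, teacher_pos, tolerance_deg=30.0):
    """True if the student's heading points at the teacher within tolerance_deg."""
    dx = teacher_pos[0] - student_pos[0]
    dy = teacher_pos[1] - student_pos[1]
    bearing = math.degrees(math.atan2(dy, dx)) % 360
    return abs((bearing - heading_deg + 180) % 360 - 180) <= tolerance_deg

def classify(students, teacher_pos, speech_energy, energy_threshold=0.5):
    """Label each student as attending (facing the teacher) and/or participating (speaking)."""
    report = {}
    for sid, (pos, heading) in students.items():
        report[sid] = {
            "attending": facing_teacher(pos, heading, teacher_pos),
            "participating": speech_energy.get(sid, 0.0) >= energy_threshold,
        }
    return report

students = {
    "S1": ((1.0, 1.0), 90.0),   # oriented toward the teacher
    "S2": ((2.0, 1.0), 100.0),  # oriented toward the teacher and speaking
    "S3": ((1.0, 3.0), 0.0),    # oriented away
    "S4": ((3.0, 3.0), 180.0),  # oriented away
}
teacher_pos = (1.5, 4.0)
speech_energy = {"S2": 0.8}  # normalized acoustic energy per student's node
print(classify(students, teacher_pos, speech_energy))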

Potential Application
  • Describing the classroom environment
    • Measures of:
      • Amount of lecture, independent, and small-group instruction
      • Student resource use
      • Student roaming profiles
      • Teacher-student interaction
      • Student-student interaction
      • Student attention
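
To make one of these measures concrete, the sketch below turns a single student's position trace into time spent near each classroom resource, a simple building block for resource-use and roaming profiles. The zone layout, radius, and sampling rate are illustrative assumptions, not values from the presentation.

# Hypothetical sketch: aggregate a position trace into time-in-zone totals.
# Zone layout, radius, and sampling rate are illustrative assumptions.
import math

ZONES = {"desk": (0.0, 0.0), "reading_corner": (4.0, 4.0), "teacher_desk": (8.0, 0.0)}
ZONE_RADIUS = 1.5     # meters; a sample within this radius counts toward the zone
SAMPLE_PERIOD = 1.0   # seconds between position samples

def time_in_zones(trace):
    """Sum the seconds spent within ZONE_RADIUS of each zone center."""
    totals = {name: 0.0 for name in ZONES}
    for x, y in trace:
        for name, (zx, zy) in ZONES.items():
            if math.hypot(x - zx, y - zy) <= ZONE_RADIUS:
                totals[name] += SAMPLE_PERIOD
    return totals

# Toy trace: the student lingers at the desk, crosses the room, then settles
# in the reading corner.
trace = [(0.1, 0.2)] * 5 + [(2.0, 2.0)] * 2 + [(3.8, 4.1)] * 4
print(time_in_zones(trace))  # {'desk': 5.0, 'reading_corner': 4.0, 'teacher_desk': 0.0}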
Next Steps
  • NSF Information Technology Research Grant (2000-2002)
    • UCLA Electrical Engineering is the lead department (PI: Srivastava); the UCLA Computer Science Department and CRESST are partners
    • Develop technology: wireless protocols, network architectures, middleware architecture, data management and mining, user profiling, speech recognition
    • Application domain: Assessing young children’s (K-1) problem-solving development
Next Steps
  • Qualitative analyses of the classroom, children’s interactions with each other, and children’s interactions with objects
    • Develop measures using sensor data
    • Validate measures with human observations
  • Develop sensor-based assessment of children’s problem-solving skills
    • Use a play-based or other manipulative-based task that requires a demonstration of performance
    • Use an extended task to gather data over time