
HCI Masters Project Course



  1. HCI Masters Project Course PSLC Project Phase 1 Report

  2. Background: The Team We Are… Sandi Lowe HCI Project Manager Jeff Wong HCI Tech Lead Sam Zaiss HCI Documents Manager Meghan Myers HCI Client Liaison Jonathan Jo Tepper School Business Documents Lead Jason Hum HCI Design Lead

  3. Background: The Outline • Background on the Project • Literature Review • Contextual Inquiries • Competitive Analysis • Revenue Model • Next Steps for Phase 2

  4. Background: PSLC • Status Quo: Seven LearnLab courses of interest. • Languages: ESL, French, Chinese • Science: Chemistry, Physics • Math: Algebra, Geometry • Courses are in various stages of development. • Research done on log files collected from the LearnLab tutors.

  5. Background: DataShop • Arises from the need to collect all of the data from the LearnLab experiments. • Enables researchers to: • contribute to the DataShop • access, manipulate, share, and learn from the data. • Ken: “Provide a web application to routinize researchers’ data analysis and reporting. • High Frequency: analyses that should be easy to generate, ‘one touch.’ • Low Frequency: analyses that should be possible, but with a bit more ‘work.’” (A hypothetical sketch of this split follows below.)
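To make the high/low-frequency split concrete, here is a minimal sketch in Python. Everything in it (the function names, the transaction format, the canned report) is our own illustration of the idea, not DataShop's actual design:

```python
# Hypothetical sketch of Ken's "one touch" vs. "a bit more work" split.
# All names and the transaction format are invented for illustration;
# this is not the real DataShop API.

# High frequency: canned, parameter-free, one touch.
def error_rate_by_lesson(transactions):
    """transactions: dicts like {"lesson": "eq-solving", "correct": True}."""
    totals, errors = {}, {}
    for t in transactions:
        totals[t["lesson"]] = totals.get(t["lesson"], 0) + 1
        if not t["correct"]:
            errors[t["lesson"]] = errors.get(t["lesson"], 0) + 1
    return {lesson: errors.get(lesson, 0) / n for lesson, n in totals.items()}

# Low frequency: still possible, but the researcher supplies the logic.
def custom_analysis(transactions, row_filter, aggregate):
    return aggregate([t for t in transactions if row_filter(t)])
```

A one-touch report takes no arguments at all; a low-frequency analysis asks the researcher to pass in a filter and an aggregate, which is exactly the extra "work" the quote allows for.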

  6. Lit Review: Data Viz • Articles read: • Card, S. “Information Visualization.” The Human-Computer Interaction Handbook. • Tufte, E. Envisioning Information. • Data Visualization helps cognitive processes: • Reducing search time for information • Enabling perceptual inference operations • Using visual representations to enhance the detection of patterns. • Conclusion: Might be worthwhile, but a premature consideration.

  7. Lit Review: PSLC Research • Articles read: • Baker, R.S., et al. Off-task Behavior in the Cognitive Tutor Classroom: When Students “Game The System”. • Baker, R.S., et al. Detecting Student Misuse of Intelligent Tutoring Systems. • Blessing, S. & Anderson, J. R. How people learn to skip steps. Journal of Experimental Psychology: Learning, Memory and Cognition. • Informs our Contextual Inquiries; grounds CIs in artifacts. • Lesson Learned: Summarizing key points in articles reduces the workload for your team.

  8. CIs: Overview • Bob recommended a dynamic focus. • Initial Focus: Understand how people analyze the data they use in a study. • U1: Looking at performance orientation. On a “fishing expedition.” Spent lots of time preparing data for analysis. • Refined Focus: Same as above, but also determine feelings of ownership over collected data. • U2: Analyzing learning curves. • U3: Comparing time spent on various Algebra lessons. • U4: Measuring how specific motivations impact learning.
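U2's learning-curve analysis is worth pausing on, since it is the canonical LearnLab analysis. Below is a minimal sketch of how such a curve is computed from log data, using a simplified transaction format of our own choosing rather than a real DataShop export:

```python
# Minimal learning-curve computation: mean error rate at each practice
# opportunity for one skill. The (student, skill, correct) tuple format
# is our own simplification, not an actual DataShop log format.
from collections import defaultdict

def learning_curve(transactions, skill):
    """transactions must be in chronological order for each student."""
    seen = defaultdict(int)     # student -> opportunities so far
    totals = defaultdict(int)   # opportunity index -> attempts
    errors = defaultdict(int)   # opportunity index -> incorrect attempts
    for student, s, correct in transactions:
        if s != skill:
            continue
        seen[student] += 1
        i = seen[student]
        totals[i] += 1
        if not correct:
            errors[i] += 1
    # If learning is happening, error rate falls as opportunity rises.
    return [(i, errors[i] / totals[i]) for i in sorted(totals)]
```

Plotting opportunity number against error rate gives the downward-sloping curve researchers look for when judging whether a tutor lesson is working.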

  9. CIs: Flow Issues • Make data labels clear and meaningful. • Data was often not understandable, for various reasons. • Clean data and minimize time spent organizing it (see the cleanup sketch below). • Participants (especially U1) spent a lot of time getting data ready for analysis. • “This is a good number. I can’t wait to see what it means!” • Make common tasks easy to access! • Participants asked for this specifically. • Provide a database! • Not a lot of work here, but it was specifically requested by U2.
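A sketch of the kind of cleanup step the system could take off researchers' hands. The column names and relabeling rules are invented for illustration; the point is that labeling and cleaning happen once, in the system, rather than in every researcher's Excel session:

```python
# Hypothetical cleanup pass over a raw tutor-log export. Column names
# and label codes are invented; only the shape of the idea matters.
import csv

LABELS = {"corr": "correct", "err": "error", "hint": "hint_request"}

def clean_rows(path):
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            # Drop rows missing fields that every analysis needs.
            if not row.get("student_id") or not row.get("outcome"):
                continue
            # Replace cryptic codes with clear, meaningful labels.
            row["outcome"] = LABELS.get(row["outcome"], row["outcome"])
            yield row
```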

  10. CIs: Sequence Issues • Minimize errors, and support basic error recovery (basic heuristic). • Many instances of errors in working with a particular program. • Support intermediate analysis (?) • Study design => erroneous data. Many interesting considerations resulted from this: • Researchers like analyzing their data along the way. • At what point will researchers put data into DataShop? • If our system is relevant earlier in the research process, we should support preliminary analysis and changes to tutor setup.

  11. CIs: Physical Issues • Grad students work in cramped spaces!

  12. CIs: Artifact Issues • System should handle basic organization of data. • It seems that most use Excel to sort the data or look at it preliminarily. • Excel seems to fail at complex analysis. • Users seemed eager to skip the Excel step. • Support use of various statistical packages. • Statistical package used seems to be related to personal background.
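Since the choice of statistical package varies by personal background, one plausible answer is to export a single cleaned dataset in a neutral format everything can read. A sketch, with tab-delimited text chosen because Excel, R, SPSS, and similar tools all import it:

```python
# Sketch: write cleaned rows once, in a format most statistical
# packages import, so nobody is forced through an Excel step.
import csv

def export_tsv(rows, path):
    rows = list(rows)
    if not rows:
        return
    with open(path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=rows[0].keys(), delimiter="\t")
        writer.writeheader()
        writer.writerows(rows)
```

Chained with the cleanup sketch above: export_tsv(clean_rows("raw_log.csv"), "study1.tsv").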

  13. CIs: Cultural Issues • Provide clear rules to regulate data sharing. • Participants aren’t opposed to the idea of sharing data, but are hesitant to walk into it blindly. • Strength of feelings depends on whether participants actually collected the data. • Support collaborations between advisors and advisees. • All advisees seemed to have some level of reliance on their advisors. • Usually collaboration on research question. • One participant relied on advisor for data clarification.
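One way to make sharing rules explicit is a small, graded permission model. The access levels below are our own reading of what the CIs imply (ownership first, advisor collaboration next, community last), not a client requirement:

```python
# Hypothetical access levels reflecting the CI findings: collectors
# feel ownership, so sharing must be explicit and graded. The level
# names are our own invention.
from enum import Enum

class Access(Enum):
    PRIVATE = "owner only"
    COLLABORATORS = "owner plus named advisors/advisees"
    COMMUNITY = "any registered user"

class Dataset:
    def __init__(self, owner, access=Access.PRIVATE):
        self.owner = owner          # defaults to the data collector
        self.access = access
        self.collaborators = set()  # e.g. an advisor added by the owner

    def can_read(self, user):
        if user == self.owner:
            return True
        if self.access is Access.COMMUNITY:
            return True
        return (self.access is Access.COLLABORATORS
                and user in self.collaborators)
```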

  14. CIs: Closing Thoughts • Focus Resetting Meeting revealed some interesting points: • Differentiation between data types and data understanding. • Broader stance: incorporating questions about generating research questions. • Looking at how researchers refer to older studies. • DataShop Process questions for clients. • CI Lessons Learned • When adapting to a new domain, dynamic foci are good. • Don’t let people keep you from your participants.

  15. Comp. Analysis: Overview • Systems Analyzed: • CHILDES Database • Other research databases (Odum Institute, UMD Webuse) • Literature repositories (National Institute of Child Health & Human Development). • Questions Asked • What is it? What is it for? Who has access generally? • Is it a physical space or online? • Is it free or is there a charge? How much does it cost? • Permissions, if any, that are incorporated. • How do people search for / extract data? • What are the data analysis tools? • In what format can people extract data? Graphs, charts, tables, etc. • What do interfaces look like... organization of menus, success of design. • How do they expect users to use them? • Is it obvious where to start?

  16. Comp. Analysis: Good Points • Able to quickly visualize some data. • In some cases, permissions are incorporated to distinguish between different types of users. • May address some sharing concerns, or may play into the revenue model. • Most of them were free; some required “registration.” • Many require adherence to a set of rules. • Most were related to participant confidentiality. • The CHILDES database has detailed rules for citing the database in a paper.

  17. Comp. Analysis: Bad Points • Many sites examined used Berkeley’s SDA, or Survey Documentation and Analysis. This seemed like an add-on; it wasn’t incorporated well. • Understanding what a variable means continues to be difficult. • One site’s hint for running analysis programs: “If your browser allows you to open more than one window at a time, click the ‘Extra Codebook Window’ button above. This allows you to switch between the two windows in order to specify the correct names of the variables you want to analyze.” • Some sites don’t encourage collaboration as much as they could.

  18. Revenue Model: Justification • The research fund is for research; what covers operating costs? • How can we sustain DataShop after the 5-year funding period? • Other non-profit research organizations (e.g. IEEE) create revenue through their intellectual capital.

  19. Revenue Model: B2B vs. B2C • B2B (Business to Business): institutions pay • long-term • high price range • B2C (Business to Consumer): individuals pay • short-term • low price range • diverse forms of charging

  20. Revenue Model: B2B Model • If an institution pays, then all the members of the institution can use the service • Volume discount is applied • Suitable for organizations with a large number of education researchers
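To make the volume discount concrete, a toy pricing function; every rate and tier boundary in it is made up for illustration:

```python
# Toy B2B site-license pricing with a volume discount. All numbers
# are invented for illustration only.
def institutional_price(num_researchers, base_rate=100.0):
    """Annual price for a license covering num_researchers seats."""
    if num_researchers <= 10:
        discount = 0.00
    elif num_researchers <= 50:
        discount = 0.15   # mid-size institutions
    else:
        discount = 0.30   # large education-research groups
    return num_researchers * base_rate * (1 - discount)

# A 60-researcher institution: 60 * 100.0 * 0.70 = 4200.0
```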

  21. Revenue Model: B2C Model • Membership Package model • For users who have diverse interests. • Users can use all data in DataShop. • Category model • For users who have specific interests. • Users can subscribe to the specific data group that they want.
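The two B2C shapes differ only in how access is scoped, which a few lines make plain; the plan and category names below are placeholders, not proposed products:

```python
# Sketch of the two B2C subscription shapes. Plan and category names
# are placeholders, not proposed products.
def accessible_datasets(subscription, all_datasets):
    """subscription: {"plan": "membership"} or
       {"plan": "category", "categories": {"algebra", "chemistry"}}"""
    if subscription["plan"] == "membership":
        return list(all_datasets)           # membership: all of DataShop
    return [d for d in all_datasets         # category: subscribed groups only
            if d["category"] in subscription["categories"]]
```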

  22. Next Steps • Continuing with CIs • Three scheduled this week. • Seven being finalized for the next two weeks. • Competitive Analysis w/ CHILDES Database. • Unique opportunity to watch the work done to incorporate corpora from a database into a research paper. • CTAT Winter Workshop • A chance to watch a standard data analysis done with CTAT. • Furthering Revenue Model • Conducting competitive analysis on similar organizations. • Requirements Document • Due March 15. • Prototyping!

  23. HCI Masters Project Course PSLC Project Phase 1 Report
