
Computer-based Experiments: Obstacles



  1. Computer-based Experiments: Obstacles • Stephanie Bryant, University of South Florida • Note: See Bryant, Hunton, and Stone (BRIA 2004) for complete references

  2. Obstacles—Overview • Technology Skills Needed • Threats to Internal Validity • Getting Participants

  3. Obstacles (Cont’d) • Technology Skills Needed • “Proficiency” in software or programming

  4. Tools for Computer-based Experiments • Develop using a scripting language or applications software • Applications software for Web experiments • Example packages: Raosoft, Inquisite, PsychExps • More expensive software, but cheaper development & maintenance costs? Easier to use; features limited to those built into the software • Scripting languages • Examples: ColdFusion, PHP, JSP (Java Server Pages), CGI (Common Gateway Interface) • Software is cheap or free, but higher development & maintenance costs? Difficult for non-programmers; more features and more customizable • Combine scripting languages & applications software (a minimal sketch of the scripting-language route follows)
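As a hedged illustration of the scripting-language route (not taken from the talk), the Python standard-library sketch below serves a one-question task and writes each response straight to disk. The question text, port, and output file name are all assumptions for illustration.

    # A minimal sketch of a scripted Web experiment using only the Python
    # standard library. Question text, port, and file name are illustrative.
    from http.server import BaseHTTPRequestHandler, HTTPServer
    from urllib.parse import parse_qs

    FORM = b"""<html><body><form method="post">
    <p>How risky is this investment? (1 = low, 7 = high)</p>
    <input name="rating"><input type="submit" value="Submit">
    </form></body></html>"""

    class ExperimentHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            # Serve the experimental task page.
            self.send_response(200)
            self.send_header("Content-Type", "text/html")
            self.end_headers()
            self.wfile.write(FORM)

        def do_POST(self):
            # Capture the response directly; no manual data entry step.
            length = int(self.headers["Content-Length"])
            fields = parse_qs(self.rfile.read(length).decode())
            rating = fields.get("rating", [""])[0]
            with open("responses.csv", "a") as f:
                f.write(f"{self.client_address[0]},{rating}\n")
            self.send_response(200)
            self.end_headers()
            self.wfile.write(b"Thank you for participating!")

    HTTPServer(("", 8000), ExperimentHandler).serve_forever()

This is roughly what the CGI/PHP-style options above involve: full control over presentation and data capture, at the cost of writing and maintaining the code yourself.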

  5. Applications Software: Raosoft • Products: EZSurvey, SurveyWin, InterForm • Difficulty index (1 = hard, 10 = easy): 8 • Does not provide all desired functionality: no randomization or response-dependent questions (i.e., straight surveys only) • Limited formatting capabilities • Expensive – no educational prices ($1,500 - $10,000) • SurveyMonkey.com: $19.95/month

  6. SurveyMonkey.com

  7. Applications Software: Inquisite • Difficulty index (1 = hard, 10 = easy): 8 • Expensive ($10,000) • Supports most desired functionality • Supporting all desired functionality in complex applications requires the Software Development Kit (SDK) ($2,000, but it may soon be available for free)

  8. Applications Software: PsychExps • PsychExperiments Web site created and maintained by University of Mississippi psychology professor Ken McGraw • A “collaboratory” • http://psychexps.olemiss.edu/ • Free! • Requires that users download & install the applications software • Many existing scripts (e.g., randomization)

  9. PsychExps Home Page

  10. PsychExps (Cont’d)

  11. Current Experiments on PsychExps

  12. Obstacles (Cont’d) • Steep learning curves involved • On-campus support sometimes available • Can hire programmers or graduate students to help with programming

  13. Obstacles (Cont’d) • Validity Considerations: • Statistical Conclusion Validity • Internal Validity • Construct Validity • External Validity

  14. Statistical Conclusion Validity (the extent to which two variables can be said to co-vary) • Increased sample size and statistical power  (e.g., Ayers, Cloyd et al.) • Use the Web to recruit participants! • Decreased or eliminated data entry errors  • Capture data directly into a database • Increased variability in experimental settings  • Difficult to control in Web experiments • People complete experiments in their own (“natural”) settings with various computer configurations (browsers, hardware) • McGraw et al. (2000) note that Web-experiment noise is compensated for by large sample sizes (see the sketch below) • System downtime  • Software coding errors  (e.g., Barrick 2001; Hodge 2001)
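To make the sample-size point concrete, here is a hedged sketch of the tradeoff: added noise in Web settings shrinks the standardized effect size, and a larger sample buys the power back. The effect sizes and alpha are illustrative assumptions, not figures from BRIA 2004.

    # Sketch of the noise-vs-sample-size tradeoff noted above.
    # Effect sizes (Cohen's d) and alpha are illustrative assumptions.
    from statsmodels.stats.power import TTestIndPower

    solver = TTestIndPower()
    for d in (0.5, 0.4, 0.3):  # effect size shrinks as setting noise grows
        n = solver.solve_power(effect_size=d, alpha=0.05, power=0.80)
        print(f"d = {d}: ~{n:.0f} participants per cell for 80% power")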

  15. Internal Validity (Correlation or Causation?) • Decreased potential diffusion of treatment  • Unlikely that participants will learn information intended for one treatment group and not another • Increased participant drop-out rates across treatments  • A higher drop-out rate in Web vs. laboratory experiments could create a participant self-selection effect that makes causal inferences problematic (a simple attrition check is sketched below) • Mitigate by placing requests for personal information and monetary rewards at the beginning of the experiment (Frick et al. 1999; McGraw et al. 2000) • Completion rates approached 86% when some type of monetary reward was offered (Musch and Reips 2000)
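One way to screen for the self-selection threat above is to test whether completion rates differ across treatments. A minimal sketch follows; the counts are invented for illustration.

    # Hedged sketch of a differential-attrition check. The counts are
    # made up; a small p-value would suggest drop-out rates differ
    # across treatments, threatening causal inference.
    from scipy.stats import chi2_contingency

    # rows: treatment A, treatment B; columns: completed, dropped out
    counts = [[86, 14],
              [70, 30]]
    chi2, p, dof, expected = chi2_contingency(counts)
    print(f"chi2 = {chi2:.2f}, p = {p:.3f}")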

  16. Internal Validity (Correlation or Causation?) • Controlling “cheating” • Multiple submissions by a single participant • Identification by email address, logon ID, password, or IP address • Randomization (a control) • Computer scripts are available for randomly assigning participants to conditions (a Python analogue follows) • Complete scripts are published in Baron and Siepmann (2000, 247) and Birnbaum (2001, 210-212)
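A minimal Python analogue of such a script (not the Baron and Siepmann or Birnbaum code cited above); the condition names and the in-memory duplicate check are assumptions for illustration.

    # Sketch combining the two controls above: reject repeat submissions
    # (keyed here on email address) and randomly assign to a condition.
    import random

    CONDITIONS = ["control", "treatment"]
    seen_emails = set()  # in practice, store these in a database

    def assign(email: str) -> str:
        key = email.strip().lower()
        if key in seen_emails:
            raise ValueError("duplicate submission")
        seen_emails.add(key)
        return random.choice(CONDITIONS)

    print(assign("participant@example.com"))

random.choice gives simple randomization; shuffling repeated blocks of the condition list instead would keep cell sizes balanced.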

  17. Construct Validity (Generalizability from observations to higher-order constructs) • Decreased demand effects & other experimenter influences  • Rosenthal (1966, 1976); Pany (1987) • Decreased participant evaluation apprehension  • Rosenberg (1969) • “Naturalism” of setting decreases?

  18. Getting Participants: Web-based Experiments • Explosion of WWW use • 172 million computers linked to the WWW • 90% of CPAs conduct Internet research • 60% of the US population has WWW access

  19. Getting Participants • Internet participant solicitation • Benefits: • Large sample sizes (power) possible • Availability of diverse, worldwide populations • Interactive, multi-participant responses • Real-time randomization of question order • Response-dependent questions (branching) • Authentication and authorization • Multimedia (e.g., graphics, sound) • On-screen clock (two of these features are sketched below)
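A hedged, console-only sketch (invented placeholder questions) of two features from the list above: real-time randomization of question order and an on-screen response clock.

    # Sketch of question-order randomization plus response timing. The
    # questions are placeholders; a real Web experiment would do this
    # server-side or in the browser rather than at the console.
    import random
    import time

    questions = ["Q1: ...", "Q2: ...", "Q3: ..."]
    random.shuffle(questions)  # fresh presentation order per participant

    for q in questions:
        start = time.monotonic()
        answer = input(q + " ")
        elapsed = time.monotonic() - start  # response latency in seconds
        print(f"  recorded {answer!r} after {elapsed:.1f}s")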

  20. Getting Participants • Web-based • Post notices in places your target population is likely to visit • Access listservs • PC-based • Student involvement requirement? • USF Process

  21. USF Process • Mandatory participation in one experiment per semester • Experimetrix site used to manage sign-ups • https://experimetrix2.com/soa/

  22. https://Experimetrix.com/soa

  23. Experimetrix Signup

  24. Experimenter Report

  25. A Final Caveat: What Can Go Wrong…Will • Cynical, but realistic • Plan carefully • Develop contingency plans • Consider cost-benefit • Greatest potential for behavioral accounting research (BAR) Web experiments is as yet unrealized • Biggest hurdle is the required knowledge, but this can be overcome
