Computer-based Experiments: Obstacles

Stephanie Bryant

University of South Florida

Note: See Bryant, Hunton, and Stone (BRIA 2004) for complete references.

Obstacles—Overview
  • Technology Skills Needed
  • Threats to Internal Validity
  • Getting Participants
Obstacles (Cont'd)
  • Technology Skills Needed
  • “Proficiency” in software or programming
Tools for Computer-based Experiments
  • Develop Using a Scripting Language or Applications Software
    • Applications Software for Web Experiments
      • Example software packages: Raosoft, Inquisite, PsychExps
        • More expensive software, but cheaper development & maintenance costs? Easier to use; features limited to those built into the software
    • Scripting languages:
      • Examples: ColdFusion, PHP, JSP (Java Server Pages), CGI (Common Gateway Interface)
      • Software is cheap or free, but higher development & maintenance costs? Difficult for non-programmers; more features, more customizable (minimal sketch after this list)
    • Combine scripting languages & applications software
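As a concrete illustration of the scripting-language route, here is a minimal sketch of a Web-experiment endpoint. It is in Python rather than the languages named above (the idea is the same in ColdFusion, PHP, JSP, or CGI), and every name in it (responses.db, the form fields, the port) is illustrative, not from any of those tools. It serves a one-question form and writes each submission straight into a database, with no manual data-entry step:

    # Minimal sketch of the scripting-language approach (illustrative only):
    # serve a one-question form and capture each response directly into a
    # database, eliminating manual data entry.
    import sqlite3
    from http.server import BaseHTTPRequestHandler, HTTPServer
    from urllib.parse import parse_qs

    DB = sqlite3.connect("responses.db", check_same_thread=False)
    DB.execute("CREATE TABLE IF NOT EXISTS responses (participant TEXT, answer TEXT)")

    FORM = (b'<form method="post" action="/">'
            b'<input name="participant"> <input name="answer">'
            b'<button>Submit</button></form>')

    class ExperimentHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            # Show the (one-item) instrument.
            self.send_response(200)
            self.send_header("Content-Type", "text/html")
            self.end_headers()
            self.wfile.write(FORM)

        def do_POST(self):
            # Write the submitted response straight to the database.
            length = int(self.headers["Content-Length"])
            fields = parse_qs(self.rfile.read(length).decode())
            DB.execute("INSERT INTO responses VALUES (?, ?)",
                       (fields["participant"][0], fields["answer"][0]))
            DB.commit()
            self.send_response(200)
            self.end_headers()
            self.wfile.write(b"Thank you!")

    if __name__ == "__main__":
        HTTPServer(("", 8000), ExperimentHandler).serve_forever()

Everything here is standard library, which is the "cheap or free" end of the trade-off; the development cost shows up in having to hand-build the features (randomization, branching, formatting) that the packaged tools include.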

Applications Software: Raosoft Products (EZSurvey, SurveyWin, InterForm)

  • Difficulty index (1 = hard, 10 = easy): 8
  • Do not provide all desired functionalities:
    • No randomization or response-dependent questions (i.e., straight surveys only)
    • Limited formatting capabilities
  • Expensive, with no educational pricing ($1,500–$10,000)
  • SurveyMonkey.com: $19.95/month

Applications Software: Inquisite
  • Difficulty index (1 = hard, 10 = easy): 8
  • Expensive ($10,000); supports most desired functionalities
    • Supporting all desired functionalities in complex applications requires the Software Development Kit (SDK) ($2,000, but it may soon be available for free)
Applications Software: PsychExps
  • PsychExperiments Web site created and maintained by University of Mississippi psychology professor Ken McGraw.
  • “Collaboratory”
  • http://psychexps.olemiss.edu/
  • Free!
  • Requires that users download & install the applications software
  • Many existing scripts (e.g., randomization)
Obstacles (Cont'd)
  • Steep learning curves involved
  • On-campus support sometimes available
  • Can hire programmers/graduate students to help with programming
Obstacles (Cont'd)
  • Validity Considerations:
    • Statistical Conclusion Validity
    • Internal Validity
    • Construct Validity
    • External Validity
Statistical Conclusion Validity (The extent to which two variables can be said to co-vary)
  • Increased sample size and statistical power (e.g., Ayers, Cloyd et al.); see the power sketch after this list
    • Web to recruit participants!
  • Decreased or eliminated data entry errors 
    • Capture data directly into database
  • Increased variability in experimental settings 
    • Difficult to control in Web experiments
    • People complete experiments in their own (“natural”) settings with various types of computer configurations (browsers, hardware)
    • McGraw et al. (2000) note that Web-experiment noise is compensated for by large sample sizes
  • System Downtime 
  • Software Coding Errors (e.g., Barrick 2001; Hodge 2001)
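To make the sample-size point concrete, here is a quick power sketch using statsmodels (assumed available); the effect size of d = 0.3 is an illustrative choice, not a figure from the talk:

    # Power of a two-sample t-test as per-cell n grows: why Web-scale
    # recruitment matters. Effect size d = 0.3 is an assumed example value.
    from statsmodels.stats.power import TTestIndPower

    analysis = TTestIndPower()
    for n in (30, 100, 500):  # participants per condition
        power = analysis.power(effect_size=0.3, nobs1=n, alpha=0.05)
        print(f"n = {n:4d} per cell -> power = {power:.2f}")

A typical lab cell of n ≈ 30 leaves a d = 0.3 effect badly underpowered (power near 0.2), while Web-scale cells push power toward 1; this is also the mechanism behind McGraw et al.'s point about noise being offset by sample size.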
Internal Validity (Correlation or Causation?)
  • Decreased potential diffusion of treatment 
    • Unlikely that participants will learn information intended for one treatment group and not another.
  • Increased participant drop-out rates across treatments 
    • A higher drop-out rate among Web vs. laboratory experiments could create a participant self-selection effect that makes causal inferences problematic.
    • Mitigate by placing requests for personal information and monetary rewards at the beginning of the experiment (Frick et al. 1999; McGraw et al. 2000).
    • Completion rates approached 86% when some type of monetary reward was offered (Musch and Reips 2000).
Internal Validity (Correlation or Causation?)
  • Controlling “cheating”
    • Multiple submissions by a single participant
    • Identification by email address, logon ID, password, or IP address
  • Randomization (A control)
    • Computer scripts are available for randomly assigning participants to conditions (see the sketch after this list)
    • Complete scripts published in Baron and Siepmann (2000, 247) and Birnbaum (2001, 210-212)
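Here is a minimal Python sketch of both controls on this slide: rejecting repeat submissions by identifier and randomly assigning newcomers to conditions. It is not the Baron and Siepmann or Birnbaum script, just the same idea; the identifier and condition names are invented:

    import random

    CONDITIONS = ("control", "treatment")
    seen_ids: set[str] = set()  # identifiers (e.g., logon ID or IP) already used

    def assign(participant_id: str) -> str | None:
        """Return a condition, or None for an apparent repeat submission."""
        if participant_id in seen_ids:
            return None                    # multiple submission: reject
        seen_ids.add(participant_id)
        return random.choice(CONDITIONS)   # random assignment to condition

    print(assign("131.247.0.1"))  # e.g., 'treatment'
    print(assign("131.247.0.1"))  # None: same ID blocked on second attempt

Note that random.choice gives each condition equal probability but not guaranteed equal cell sizes; block randomization (shuffling a list that repeats each condition equally) is the usual fix when balanced cells matter.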
Construct Validity (Generalizability from observations to higher-order constructs)
  • Decreased demand effects & other experimenter influences 
    • Rosenthal (1966, 1976); Pany (1987)
  • Decreased participant evaluation apprehension 
    • Rosenberg (1969)
    • “Naturalism” of setting decreases?
Getting Participants: Web-based Experiments
  • Explosion of WWW Use
    • 172 million computers linked to WWW
    • 90% of CPAs conduct internet research
    • 60% of US population has WWW access
Getting Participants
  • Internet Participant Solicitation
    • Benefits
      • Large sample sizes (power) possible
      • Availability of diverse, world-wide populations
      • Interactive, multi-participant responses
      • Real-time randomization of question order
      • Response-dependent questions (branching); see the sketch after this list
      • Authentication and authorization
      • Multimedia (e.g., graphics, sound)
      • On-screen clock
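Two of these benefits, real-time randomization of question order and response-dependent questions, are easy to see in a short sketch. The questions below are invented for illustration:

    import random

    # Invented items; a real instrument would differ.
    QUESTIONS = {
        "q_experience": "Years of audit experience?",
        "q_cpa":        "Do you hold a CPA license? (yes/no)",
    }
    # Follow-up shown only when the triggering answer matches.
    FOLLOW_UPS = {"q_cpa": ("yes", "In what year were you licensed?")}

    def build_session(answers: dict[str, str]) -> list[str]:
        """Question texts in a per-participant random order, with
        response-dependent follow-ups inserted where triggered."""
        order = list(QUESTIONS)
        random.shuffle(order)              # real-time order randomization
        shown = []
        for qid in order:
            shown.append(QUESTIONS[qid])
            trigger = FOLLOW_UPS.get(qid)
            if trigger and answers.get(qid) == trigger[0]:
                shown.append(trigger[1])   # branch on the participant's answer
        return shown

    print(build_session({"q_cpa": "yes"}))  # CPA follow-up appears in-line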
Getting Participants
  • Web-based
    • Post notices in places your target population is likely to visit
    • Access ListServs
  • PC-based
    • Student involvement requirement?
    • USF Process
USF Process
  • Mandatory participation in one experiment per semester
  • Experimetrix site used to manage participation
  • https://experimetrix2.com/soa/
A Final Caveat: What Can Go Wrong…Will
  • Cynical, but realistic
  • Plan carefully
  • Develop contingency plans
  • Consider cost-benefit
  • Greatest potential for behavioral accounting research (BAR) Web experiments is as yet unrealized
  • Biggest hurdle is required knowledge, but this can be overcome