
Assessment of Web-based Education Tools: Are your Students Learning Through Technology?


Presentation Transcript


  1. Assessment of Web-based Education Tools: Are your Students Learning Through Technology? Andrea Irby Undergraduate Academic Programs North Carolina State University

  2. Background • NC State is a large, public, Research I institution. • 19,000 undergraduates; 9,000 grads. • Under pressure in the late 1990s to grow • Possible expansion of Distance Education • Virtual Advising Center created in 1999

  3. Why Technology? • Enables us to transcend many barriers • Physical campus • Race • Country, nationality • “Saving Face” • Student development theory – meet them where they are • Enables us to be more effective and efficient, especially in times of constrained resources

  4. Types of Information Exchange • Synchronous vs. Asynchronous • Data - ways of expressing things • Information - the arrangement of data into meaningful patterns • Knowledge - the application and productive use of information • Information versus Knowledge

  5. Technology in Education • Can transform key relationships • Personalization: Targeting the student vs. the department • Engaging the student: making applications interactive • Student Decision Support: Information becomes knowledge • Create community: cultivate student relationships (Carl Jacobson, 2000)

  6. Engaging Students On-Line • Web-sites • On-line Course Management Systems • Email/Listservs • On-line Chat or Instant Messenger-type tools • Bulletin Boards and Blogs • Social Networking sites • “Students will gravitate toward the technology medium that best meets their needs” (Karen Thurmond, Univ. of Memphis)

  7. Focus on Web-sites • 2nd Generation Web-sites: information-based • 3rd Generation Web-sites: interactive, knowledge-based • How do we know if/how many students come to our web-site? • Are they able to find the information they need? • Is it clear and meaningful to them? • Do they receive our intended information?

  8. How to Know • Log Data Analysis (data mining) • Usability Studies • Accessibility Testing

  9. Log Data Analysis • A log entry is created each time a person visits a web-page. • Analytic tools then either pull the data from the server that hosts the web-site or read it from tracking code embedded in each web-page.
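To make the server-side approach concrete, here is a minimal sketch of log data analysis: parsing access-log lines and counting hits per page. The log format (Apache "combined"-style) and the sample lines are assumptions for illustration; real analytic tools handle many formats and metrics.

```python
import re
from collections import Counter

# Pattern for one Apache-style access-log line:
# ip - - [timestamp] "METHOD /path PROTOCOL" status size
LOG_LINE = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<path>\S+) [^"]*" (?P<status>\d{3})'
)

def page_counts(lines):
    """Count successful GET requests per page path."""
    counts = Counter()
    for line in lines:
        m = LOG_LINE.match(line)
        if m and m.group("method") == "GET" and m.group("status") == "200":
            counts[m.group("path")] += 1
    return counts

# Invented sample entries, loosely modeled on an advising site
sample = [
    '10.0.0.1 - - [12/Mar/2006:09:15:31 -0500] "GET /advising/index.html HTTP/1.1" 200 5120',
    '10.0.0.2 - - [12/Mar/2006:09:16:02 -0500] "GET /advising/faq.html HTTP/1.1" 200 2048',
    '10.0.0.1 - - [12/Mar/2006:09:17:44 -0500] "GET /advising/index.html HTTP/1.1" 200 5120',
]
print(page_counts(sample).most_common(1))  # most-visited page
```

The same parsed fields (IP, timestamp, path) feed the visitor, traffic-time, and location reports discussed on the following slides.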

  10. Sample Tools • Google Analytics (reads code) – free www.google.com/analytics • SAS Web Analytics (pulls from server) www.sas.com/solutions/webanalytics • TopShareWare (pulls from server) www.topshareware.com/LogSuite-download-38225.htm

  11. Easy to Read Reports (in real life)

  12. Information Received • Visitors/Users • Repeat Visitors/Cached Visitors • How long on site • Traffic by time of day and month • Where enter, exit, or drop out of site • Browsers, operating systems • Where they are logging in from • Campaign/Action Tracking

  13. Virtual Advising Center data mining • 2000: 72,419 annual visitors; 26,795 repeat visitors; 176 avg. visitors per day; 1,021,947 hits to site • 2006: 285,053 annual visitors; 108,320 repeat visitors; 780 avg. visitors per day; 8,269,215 hits annually

  14. Virtual Advising Center data mining • Monday-Thursday are the busiest days. • Visitors once stayed on the site for an avg. of 2 minutes; now about 5. • Path in and out; pages turned. • IP locations – who's accessing the site on campus, off campus, around the world. • Heaviest traffic times per day.
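Findings like "busiest days" and "heaviest traffic times" come from bucketing visit timestamps. A hedged sketch, with invented timestamps standing in for real log data:

```python
from collections import Counter
from datetime import datetime

# Invented visit timestamps (real ones come from the access log)
visits = [
    "2006-03-13 10:05", "2006-03-13 10:40", "2006-03-13 14:02",
    "2006-03-14 10:15", "2006-03-16 09:55", "2006-03-18 11:20",
]

by_day = Counter()   # visits per weekday
by_hour = Counter()  # visits per hour of day
for v in visits:
    t = datetime.strptime(v, "%Y-%m-%d %H:%M")
    by_day[t.strftime("%A")] += 1  # e.g. "Monday"
    by_hour[t.hour] += 1

print(by_day.most_common(1))   # busiest weekday
print(by_hour.most_common(1))  # heaviest traffic hour
```

The busiest-hour counts are exactly what the next slide describes using to schedule staff coverage for chat.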

  15. How we used this information • Sent surveys out on busiest days • Used busiest times of day to determine staff coverage for chat • Used IP address/location information to help IT staff on campus place kiosks in high demand locations • To enhance relationships with local feeder community colleges

  16. Usability Testing • Observed “Focus” Groups for web-sites • Real data versus self-reported data • Examines how people use a human-made object • Controlled experiment • Small cross-section of users • Trained users or “hallway” users

  17. Usability Testing • Pre-test questionnaires get input on subject’s familiarity with web-site • Actual test involves a series of short, simple tasks the subject is asked to complete • Observations are recorded by “tester” • Patterns of how subject completes each task are noted • Emotional responses key to watch

  18. Set Up • Location can be: in a computer lab with multiple teams, in a usability lab, or at your own computer desk

  19. Usability Testing • Post-tests – record overall impressions of the web-site and experience; reflection • Very informative web-site on usability testing: www.usability.gov • If testing uncovers difficulties, redesign • Test in small groups and often • For the Virtual Advising Center, it's how we keep students at the forefront of our design

  20. Accessibility • Inclusive web communities allow people with diverse abilities to be engaged in on-line learning • Web Content Accessibility Guidelines • http://www.w3.org/TR/1999/WAI-WEBCONTENT-19990505/#def-checkpoint • 3 Priorities for Web-site Accessibility

  21. Accessibility • Series of Checkpoints • Priority 1 – Must satisfy these checkpoints, or one or more groups will find it impossible to access and use the information • Priority 2 – Should satisfy these checkpoints • Priority 3 – May satisfy these checkpoints • Results in A, AA, AAA web-site conformance ratings
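The priority-to-rating mapping above can be sketched as a small function. This is a simplified illustration of the A/AA/AAA scheme, not a substitute for a full checkpoint audit:

```python
def conformance(p1_ok: bool, p2_ok: bool, p3_ok: bool) -> str:
    """Map satisfied checkpoint priorities to a conformance level:
    all Priority 1 -> A, plus all Priority 2 -> AA, plus all Priority 3 -> AAA."""
    if not p1_ok:
        return "not conformant"  # some users cannot access the content
    if not p2_ok:
        return "A"
    if not p3_ok:
        return "AA"
    return "AAA"

print(conformance(True, True, False))  # → AA
```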

  22. Accessibility • Test your site with on-line tools • Test your site with screen readers (e.g., JAWS) • On-line web resources to help you: • http://atrc.utoronto.ca/ • http://ncsu.edu/it/access/development/software/tools.php • http://www.accessible.org/bobby-approved.html
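One automated check the on-line tools perform is flagging images without alt text (a WCAG Priority 1 checkpoint). A minimal standard-library sketch, with an invented HTML snippet; real tools cover many more checkpoints:

```python
from html.parser import HTMLParser

class AltChecker(HTMLParser):
    """Collect img tags that lack an alt attribute."""
    def __init__(self):
        super().__init__()
        self.missing = []  # src values of images without alt text

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "img" and "alt" not in attrs:
            self.missing.append(attrs.get("src", "<no src>"))

checker = AltChecker()
checker.feed('<p><img src="logo.gif" alt="NC State logo">'
             '<img src="chart.gif"></p>')
print(checker.missing)  # images a screen reader cannot describe
```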

  23. Evaluating Learning • Web-site must be interactive: tests, essays, portfolios, chats, email • Determine what we want students to learn • Get student input • Variety of ways: focus groups, checklists, rubrics, and content analysis • Have more than one staff member working on assessment

  24. What we’ve tried and learned • Focus groups – what we may see as learning, they see as “help”. • Over time, they may recognize what they have learned: raises the question of timing of assessment • Electronic communication trails are rich with evidence of learning. • Students are readmitting, graduating, learning about themselves, picking majors, staying, expanding their academic portfolios.

  25. Evaluating Learning • Need to develop a rubric for evaluating rich content. • George Mason Business School example • Databases/Microsoft Access tool • Technology is a tool – one through which students can learn, and one that helps us with assessment.

  26. Discussion? Questions?
