
OER Recommender and OER Glue


Presentation Transcript


  1. OER Recommender and OER Glue. Joel Duffin & Justin Ball, Tatemae

  2. Outline: Background and Research Goals, System Description, Evaluation, Issues, What's next?

  3. Background: Center for Open Sustainable Learning (COSL), Utah State University, David Wiley; eduCommons - OpenCourseWare CMS; OCW Finder - http://ocwfinder.org - simple, popular search engine for OpenCourseWare; Folksemantic - MOCSL user-centered tools for open education: Ozmozr, Make a Path, Send2Wiki, Annorate, Scrumdidilyumptious

  4. How do we get people to USE OER effectively? "most of the OER you see at conferences encourages bad learning design" - http://tinyurl.com/34qsux6

  5. References
  Shelton, B. E., Duffin, J., Wang, Y., & Ball, J. (2010). Linking OpenCourseWares and open education resources: Creating an effective search and recommendation system. 1st Workshop on Recommender Systems for Technology Enhanced Learning (RecSysTEL 2010), Procedia Computer Science 1, 2865-2870. doi:10.1016/j.procs.2010.08.012
  Maull, K., Sumner, T., Recker, M., Shelton, B. E., & Duffin, J. (2009). Recommender Systems, Contextualization and Personalization in NSDL: Current and Future Research Directions. Panel presentation at the National Digital Libraries Conference, Washington, D.C.
  Caswell, T., Henson, S., Jensen, M., & Wiley, D. (2008). Open content and open educational resources: Enabling universal education. International Review of Research in Open and Distance Learning, 9(1). http://www.irrodl.org/index.php/irrodl/article/view/469/1001
  Duffin, J., & Muramatsu, B. (2008). OER Recommender: Linking NSDL Pathways and OpenCourseWare Repositories. 2008 Joint Conference on Digital Libraries, Pittsburgh, PA.
  Duffin, J., Muramatsu, B., & Henson Johnson, S. L. (2007). OER Recommender: A recommendation system for open educational resources and the National Science Digital Library. White paper funded by the Andrew W. Mellon Foundation for the Folksemantic.org project.
  Henson, S., Ball, J., Wiley, D., & Muramatsu, B. (September 2007). Tag, Annotate, Rate and Share: Activities of Daily Living on the Web. European Conference on Technology Enhanced Learning 2007, Crete, Greece.
  Henson, S., Muramatsu, B., & Caswell, T. (2007). Web 2.0 for Teaching and Learning… Folksemantic Tool Set. New Media Consortium Summer Conference, Indianapolis, IN.

  6. Research Goals: Increase effective use of OERs by cross-linking OpenCourseWares and NSDL OERs • Widgets and APIs for OER providers • OER recommendation algorithms and architectures • Recommendation display • Adoption
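To make the cross-linking goal concrete, here is a minimal sketch of recommending related resources across repositories by textual similarity of their metadata, in the spirit of the approach described in Shelton et al. (2010). The data shape, tokenizer, and raw term-count cosine scoring below are illustrative assumptions, not the production OER Recommender code.

```typescript
// A minimal sketch of cross-repository recommendation by text similarity.
// The Resource shape and scoring approach are illustrative assumptions only.

interface Resource {
  id: string;
  repository: string; // e.g. an OpenCourseWare site or an NSDL pathway
  text: string;       // title + description + keywords
}

// Tokenize a metadata string into lowercase term counts.
function termCounts(text: string): Map<string, number> {
  const counts = new Map<string, number>();
  for (const token of text.toLowerCase().match(/[a-z0-9]+/g) ?? []) {
    counts.set(token, (counts.get(token) ?? 0) + 1);
  }
  return counts;
}

// Cosine similarity between two term-count vectors.
function cosine(a: Map<string, number>, b: Map<string, number>): number {
  let dot = 0;
  for (const [term, count] of a) {
    dot += count * (b.get(term) ?? 0);
  }
  const norm = (v: Map<string, number>) =>
    Math.sqrt([...v.values()].reduce((sum, c) => sum + c * c, 0));
  const denom = norm(a) * norm(b);
  return denom === 0 ? 0 : dot / denom;
}

// Recommend the k most similar resources that live in *other* repositories,
// which is the cross-linking step the slide describes.
function crossLink(target: Resource, corpus: Resource[], k = 5): Resource[] {
  const targetVec = termCounts(target.text);
  return corpus
    .filter(r => r.repository !== target.repository)
    .map(r => ({ resource: r, score: cosine(targetVec, termCounts(r.text)) }))
    .sort((a, b) => b.score - a.score)
    .slice(0, k)
    .map(scored => scored.resource);
}
```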

  7. Related Resources: Cross-Repository Links

  8. Related Resources: Widget

  9. Evaluation. Questions: How is OER Recommender being used? How well is it accomplishing its goals? How can it be improved? Methods: Analytics; user tests and questionnaires

  10. Conclusions. Strengths: it is being used; eduCommons and OpenLearn are primary adopters; recommendations attract interest (11% click-through rate). Areas for improvement: algorithms (ranking & learning), display, ease of adoption, coverage (more resources)

  11. Issues: Learning from user implicit and explicit feedback; presentation bias; narrowing scope of search and recs; dead links; catalog pages; duplicate entries; incremental update; coverage; score cutoff level; confidence vs. reliability; OER rank (getting beyond text analysis); faceted search; metadata vs. content; personal recommendations; scaling up processing (Hadoop, Mahout, Cascading, Bixo); activity-specific recs; more & better widgets
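Several of these issues (duplicate entries, the score cutoff level, and learning from implicit feedback) amount to post-processing of candidate recommendations. The following is a hedged illustration of that step only; the field names, thresholds, and click-boost blend are assumptions, not the algorithm OER Recommender actually used.

```typescript
// Illustrative post-processing of candidate recommendations: score cutoff,
// duplicate removal, and a click-feedback boost. All constants and the
// Candidate shape are assumptions for illustration.

interface Candidate {
  url: string;
  score: number;   // text-similarity score from the indexing step
  clicks: number;  // implicit feedback collected from the widget
  views: number;   // number of times the recommendation was shown
}

// Normalize URLs so trivially different links count as duplicates.
function normalizeUrl(url: string): string {
  return url.toLowerCase().replace(/^https?:\/\//, "").replace(/\/+$/, "");
}

function rerank(candidates: Candidate[], cutoff = 0.2, k = 10): Candidate[] {
  const seen = new Set<string>();
  return candidates
    .filter(c => c.score >= cutoff)          // score cutoff level
    .filter(c => {                           // duplicate entries
      const key = normalizeUrl(c.url);
      if (seen.has(key)) return false;
      seen.add(key);
      return true;
    })
    .map(c => ({
      ...c,
      // Blend content score with observed click-through rate; the smoothing
      // constant keeps low-view items from dominating (confidence vs. reliability).
      score: 0.8 * c.score + 0.2 * (c.clicks / (c.views + 10)),
    }))
    .sort((a, b) => b.score - a.score)
    .slice(0, k);
}
```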

  12. Evaluation Criteria: Related • Novel • Diverse • Popular • Changed / updated • Task-appropriate - Learner: do well in class, explore interests; Teacher: use resource, remix (use part of a resource)
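Of these criteria, "diverse" is the most straightforward to quantify. One common option is intra-list diversity, the average pairwise dissimilarity of the items in a recommendation list; the sketch below is a standard metric illustration, not something taken from the presentation.

```typescript
// Intra-list diversity: average pairwise dissimilarity of recommended items,
// here measured over each item's set of descriptive terms.

// Jaccard dissimilarity between two term sets (1 = no overlap, 0 = identical).
function dissimilarity(a: Set<string>, b: Set<string>): number {
  const intersection = [...a].filter(t => b.has(t)).length;
  const union = new Set([...a, ...b]).size;
  return union === 0 ? 0 : 1 - intersection / union;
}

// Average pairwise dissimilarity over a recommendation list.
function intraListDiversity(termSets: Set<string>[]): number {
  let total = 0;
  let pairs = 0;
  for (let i = 0; i < termSets.length; i++) {
    for (let j = i + 1; j < termSets.length; j++) {
      total += dissimilarity(termSets[i], termSets[j]);
      pairs++;
    }
  }
  return pairs === 0 ? 0 : total / pairs;
}
```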

  13. Try It! Add Cross-Site Recommendations to Your Site. Example: http://ocw.usu.edu/English/english-1010 Instructions: http://www.folksemantic.com/widgets Add Your Resources to the Index
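The actual embed instructions live at the folksemantic.com URL above. For a sense of what such a cross-site widget does, here is a hypothetical client-side sketch: it fetches recommendations for the current page and renders them into a placeholder element. The endpoint, query parameter, and response shape are invented for illustration and are not the real Folksemantic embed code.

```typescript
// Hypothetical cross-site recommendation widget: fetch related resources for
// the current page and render them as a list. Endpoint and response shape
// are illustrative assumptions only.

interface Recommendation {
  title: string;
  url: string;
}

async function renderRecommendations(containerId: string): Promise<void> {
  const container = document.getElementById(containerId);
  if (!container) return;

  // Hypothetical endpoint: ask for resources related to the current page URL.
  const endpoint =
    "https://example.org/recommendations?url=" +
    encodeURIComponent(window.location.href);

  const response = await fetch(endpoint);
  const items: Recommendation[] = await response.json();

  const list = document.createElement("ul");
  for (const item of items) {
    const li = document.createElement("li");
    const link = document.createElement("a");
    link.href = item.url;
    link.textContent = item.title;
    li.appendChild(link);
    list.appendChild(li);
  }
  container.appendChild(list);
}

// Usage: place <div id="oer-recommendations"></div> somewhere in the host page.
renderRecommendations("oer-recommendations");
```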

  14. Related Work: Globe - http://globe-info.org/ • OER Commons - http://oercommons.org/ • Ariadne Finder - http://ariadne.cti.espol.edu.ec/ • Xpert - http://www.nottingham.ac.uk/xpert/

  15. OER Glue: supports effective teaching and learning by "gluing" together open content and services; embracing open; integrates with everyone; easily extensible

  16. Modifies pages in place, using content where it exists instead of copying it into an LMS; provides functionality through integrations; recommends OERs to authors; allows extension via JavaScript. We are looking for partners - joel.duffin@gmail.com
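To illustrate the "modifies pages in place" idea, here is a small sketch of the kind of JavaScript extension the slide alludes to: it finds content that already exists on a page and attaches a learning activity to it where it lives, rather than copying anything into an LMS. The selector, class name, and injected prompt are illustrative; OER Glue's actual extension API is not shown here.

```typescript
// In-place page augmentation: glue a learning activity onto existing open
// content instead of copying the content elsewhere. Names are illustrative.

function addActivityPrompt(selector: string, prompt: string): void {
  // Find each content block already on the page...
  for (const section of Array.from(document.querySelectorAll(selector))) {
    // ...and attach a small activity element to it in place.
    const activity = document.createElement("aside");
    activity.className = "glued-activity";
    activity.textContent = prompt;
    section.appendChild(activity);
  }
}

// Example: attach a discussion prompt to every lecture section of an open course page.
addActivityPrompt("section.lecture", "Discuss: how would you apply this concept?");
```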

  17. Help Evaluate OER Recommender! Online questionnaire (15 min) for teachers and OER providers: http://oerrecommender.org
