
  1. The Aboriginal Teacher Education Program Technology Initiative: Current Progress in a One-to-One Laptop Program to Support Blended Delivery. Kim Peacock, M.Ed., Yvonne Norton, Michael Carbonaro, Ph.D., University of Alberta

  2. The ATEP Program • Established in 2001 • Delivers pre-service education programs to Aboriginal students in partnership with First Nations colleges and/or local school authorities across Northern Alberta • Provides access to teacher education programs for individuals in remote areas • Students receive a B.Ed. from the University of Alberta

  3. The ATEP Philosophy • Seeks to deliver community-based education that is reflective of: • Community culture • Local concerns and aspirations • Feeling “at home” while learning • Students can continue to live and work in their home communities

  4. Institutional Collaboration • The University works with the First Nations colleges to: • Identify and hire instructors • Develop curricular materials • Plan individual course delivery • Secure resources • A site coordinator also assists students with their individual program planning

  5. ATEP Logistics • ATEP uses a cohort-based model • Build a climate of trust and community • Foster supportive collegiality • Two-year cohort cycle • Average intake of 18-25 students per cycle • Students do not have to be Aboriginal to participate in the program • Since 2001, 71 students have graduated from the ATEP program

  6. The ATEP Technology Initiative • In conjunction with Blue Quills First Nations College in St. Paul, Alberta • Approximately 140 miles (220 km) north-east of Edmonton.

  7. The ATEP Cohort • 20 ATEP students • 15 Aboriginal and 5 non-Aboriginal • 15 females and 5 males • Average age: 35 years old in 2007 • Median age: 33 years old in 2007 • All 20 are participating in the research component of this initiative in some way

  8. Rationale • Previous experiences with ATEP have indicated that participation and retention are directly linked to issues surrounding: • Commuting • Communication • Financial resources • Access to resources (library, classroom, etc.) • Increases in blended-learning opportunities

  9. Project Purpose • Increase access • Increase integration

  10. Project Purpose - Access • Increase student access to technology equipment and resources • Increase access to courses through distance and blended delivery • Foster a community of learners that extends beyond face-to-face learning and contributes to the cultural experiences of ATEP.

  11. Access • Through the generous support of the University of Alberta TLEF program, the TELUS Community Foundation and Hewlett-Packard, each student received: • An HP laptop computer • Microsoft Office 2007 • EVDO satellite Internet card • Unlimited Internet access for the duration of their program

  12. Project Purpose - Integration • Increase levels of technology integration in the program • Develop students’ technology skills • Develop students’ knowledge of classroom technology integration to: • Promote student problem-solving • Promote critical thinking • Support learners with special needs

  13. Integration • Students participated in a series of PD sessions to enhance their skills and raise awareness of resources. • Instructors were approached about the project and encouraged to develop activities that made use of the technology.

  14. Methodology

  15. Research Questions • The research team developed a set of research questions that included examining: • Student and instructor skills • Attitudes towards technology • Instructor best practices • Student integration levels • Overall effectiveness of our model

  16. Examining Student Progress • This paper/presentation looks at our students’ progress so far. • In light of our research questions, this translates to: • Student attitudes • Student skills • Students’ integration levels

  17. Methodology • Both qualitative and quantitative • Qualitative: • Student reflections • Student interviews • Student focus groups • Sharing circles • Artifacts • Concept maps • Quantitative: • Numerous instruments administered pre, mid and post (pre and mid only at this time)

  18. Methodology – Attitudes • Instruments administered: • Survey of Teachers’ Attitudes Towards Information Technology (TAT) (Knezek & Christensen, 1998) • Attitudes section of the Fordham University Regional Technology Center Technology Skills Self-Assessment Profile (D’Agustino, Imbimbo & King, 2004) • Student reflections • Interviews, focus groups and sharing circles

  19. Methodology - Skill • Skill self-assessment only • Instruments administered: • Technology Proficiency Self-Assessment (TPSA) (Ropp, 1999) to examine competencies • University of Alberta Faculty of Education Technology Survey to examine how independently students and faculty can complete technology tasks • Student reflections • Interviews, focus groups and sharing circles

  20. Methodology - Integration • Analysis of artifacts: final unit plans from their final methods course (Social Studies). • Concept maps • Interviews with mentor teachers after student teaching. • Student reflections • Interviews, focus groups and sharing circles

  21. Learning Styles • For our own interest, we have also administered: • The Paragon Learning Style Inventory (based on the MBTI) (Shindler, 2003) • The Visual – Auditory – Kinaesthetic (VAK) Test (Chislett, 2005)

  22. Results and Discussion

  23. Student Attitude Influences • High average age of the cohort. • Normal distribution with a mean of 35 and a median of 33 at the onset of the program in 2007. • Some millennial learners, some baby boomers and some from Generation X.

  24. Student Attitude Influences • Hardware difficulties • Late arrival • OS issues • Internet access issues • Hardware failures

  25. Student Attitude Results • FURETCTSSA instrument (the Fordham attitudes self-assessment described earlier) • Students moved towards a more positive attitude towards computers on 17 out of 18 questions • The remaining question showed no change. • Moved from a pre-point average of 4.00 to a mid-point average of 4.35 (n=13).
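
The comparison reported on this slide reduces to averaging each item across respondents at the pre and mid points and checking the direction of change. The short Python sketch below illustrates that calculation; the item wordings and responses are hypothetical placeholders, not the project's actual survey data.

```python
# A minimal sketch of the pre/mid item-level comparison summarized above.
# Item labels and responses are hypothetical placeholders, not the project's data.

def item_means(responses):
    """Average each item's Likert scores (1-5) across respondents."""
    return {item: sum(scores) / len(scores) for item, scores in responses.items()}

def compare(pre, mid):
    """Return per-item change, overall pre/mid means, and how many items improved."""
    pre_means, mid_means = item_means(pre), item_means(mid)
    changes = {item: mid_means[item] - pre_means[item] for item in pre_means}
    overall_pre = sum(pre_means.values()) / len(pre_means)
    overall_mid = sum(mid_means.values()) / len(mid_means)
    improved = sum(1 for c in changes.values() if c > 0)
    return changes, overall_pre, overall_mid, improved

# Hypothetical responses from three students on two attitude items.
pre = {"Computers help me learn": [3, 4, 4], "I enjoy working with computers": [4, 4, 5]}
mid = {"Computers help me learn": [4, 4, 5], "I enjoy working with computers": [4, 5, 5]}

changes, o_pre, o_mid, improved = compare(pre, mid)
print(f"Overall pre {o_pre:.2f} -> mid {o_mid:.2f}; {improved} of {len(changes)} items moved positive")
```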

  26. Student Attitude Results (n=13)

  27. Student Attitude Results • The Survey of Teachers’ Attitudes Towards Technology (TAT) also showed consistent movement towards more positive attitudes. • The formal TAT scoring analysis will be conducted when the post data has been collected (Knezek & Christensen, 1998).

  28. Student Attitude Results • Average changes of more than .5 points on a 5 point Likert scale between pre and mid point administration of the Teacher Attitudes Towards Instructional Technology (TAT) instrument. (n=13)

  29. Student Attitude Results • Average changes of more than .5 points on a 7 point semantic scale between pre and mid point administration of the Teacher Attitudes Towards Instructional Technology (TAT) instrument. (n=13)

  30. Student Attitude Results • Average changes of more than .5 points on a 7 point semantic scale between pre and mid point administration of the Teacher Attitudes Towards Instructional Technology (TAT) instrument. (n=13)

  31. Student Attitude Results • Initial qualitative data supports these findings. • In student reflections about the use of Elluminate for online courses, 12 responses were predominantly positive, 3 were neutral and 3 were predominantly negative. • Mid-point reflections on the project overall had 10 students with predominantly positive responses and 1 student who was neutral. • Our focus group data has similar ratios, though the data is hard to quantify.

  32. Student Skill Influences • Students were not starting on the level playing field that was anticipated. • Initial skill self-assessments showed that students did not have the skills you would anticipate after an introductory computing course. • PD sessions had to be modified • Supplemental support for low-end learners • Supplemental resources for high-end learners • Shift of focus to pedagogy and integration over more complex skills

  33. Student Skill Influences • Supplemental support systems were put into place • One additional college support person • One cohort member was hired as a temporary TA • Phone and email support was provided by the PD coordinator

  34. Student Skill Results • Technology Proficiency Self-Assessment (TPSA) • 20 skills self-assessed on a Likert scale • Students moved from an average of 3.88 to 4.12 on pre to mid-point measures (n=12) • Of the twenty items, students moved towards disagree on one skill, remained the same on three, and moved towards agree on the remaining 16. • We have substantial amounts of qualitative data that support these findings.
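
The item-level summary here (one item towards disagree, three unchanged, sixteen towards agree) is a tally of mid-minus-pre differences per item. The sketch below illustrates that tally; the skill labels and deltas are hypothetical, not the cohort's actual TPSA results.

```python
# Sketch of tallying per-item change direction on a skills self-assessment.
# The items and deltas are hypothetical, not actual TPSA results.

def tally_directions(deltas):
    """Count items that moved towards agree, stayed the same, or moved towards disagree."""
    towards_agree = sum(1 for d in deltas.values() if d > 0)
    unchanged = sum(1 for d in deltas.values() if d == 0)
    towards_disagree = sum(1 for d in deltas.values() if d < 0)
    return towards_agree, unchanged, towards_disagree

# Hypothetical mid-minus-pre averages for three self-assessed skills.
deltas = {"Send an email with an attachment": 0.4,
          "Search a library database": 0.0,
          "Create a chart in a spreadsheet": -0.1}

print(tally_directions(deltas))  # (1, 1, 1)
```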

  35. Student Skill Results (n=12)

  36. Student Skill Results • We will compare the students on our own University of Alberta Faculty of Education Technology Survey, which was administered only at pre and post points because of its length. • The instrument has students gauge their skill level on a five-point scale: • I am totally confident that I can do this on my own. • I am pretty sure that I can do this on my own. • I could do this with a bit of help. • I can’t do this but would like to learn how. • I can’t do this and don’t really care to learn how.
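
Comparing pre and post responses on this survey requires assigning numeric codes to the five labels. The mapping below is one plausible coding assumed for illustration; it is not a scoring scheme taken from the instrument itself.

```python
# One possible numeric coding of the five confidence levels listed above, assumed here
# for pre/post comparison; the survey is described only in terms of its labels.
CONFIDENCE_SCALE = {
    "I am totally confident that I can do this on my own.": 5,
    "I am pretty sure that I can do this on my own.": 4,
    "I could do this with a bit of help.": 3,
    "I can't do this but would like to learn how.": 2,
    "I can't do this and don't really care to learn how.": 1,
}

def score(response: str) -> int:
    """Map a response label to its assumed numeric code."""
    return CONFIDENCE_SCALE[response]

print(score("I could do this with a bit of help."))  # 3
```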

  37. Integration Challenges • Instructors did not always respond to the initiative in the way we had hoped. • Some instructors made extensive use of the available technology while... • Others opted not to make use of it at all. • Another reason why pedagogy and integration became an important focus of the PD sessions

  38. Integration Experiences • Despite these challenges, students were able to experience a number of technologies in their courses, including: • Taking a synchronous online course • Taking an asynchronous online course • Recording a podcast • Completing digital scavenger hunts • Creating multimedia storybooks • Creating concept maps with Inspiration • Authoring spreadsheets • Blogging • Using Ning as a CMS

  39. Integration Experiences • Students were also able to explore a wide range of tools and software in the PD sessions, even though they didn’t have a chance to apply them in the classroom: • Read and Write Gold • BoardMaker Plus! • Google Earth • Community Walk • Google Sites • Blogger • VoiceThread • And more...

  40. Technology Integration Results • Results to follow... • Students have just begun their final round of student teaching. • Students will complete the second round of concept maps at the call-back day in April. • We do currently have the students’ final unit plans from their last methods course and are beginning to analyze them for technology integration activities.

  41. Conclusion • Despite many challenges, we have seen evidence of positive growth in terms of student attitudes towards technology and self-assessed skill. • We are currently beginning to examine evidence of enhanced student technology integration in their teaching and planning.

  42. Conclusion • We will be publishing further results after we gather post data at the end of the project. • We will also publish best practices and lessons learned for organizations that may be seeking to implement similar programs. • This initiative has provided us with a great deal of guidance about future endeavours and will inform many future decisions about technology use in our off-site programs.

  43. Thank You! • Kim Peacock: kim.peacock@ualberta.ca • ATEP Web Site: http://atep.ualberta.ca • ATEP Technology Initiative Web Site: http://atep.ualberta.ca/technology/
