
Research Methods Project: Program Evaluation. Marty-Jean Bender, Shawn Flood, Clemente Julian, Lilinoe Yong. University of Hawai'i.


Presentation Transcript


    1. Research Methods Project, “Program Evaluation.” Marty-Jean Bender, Shawn Flood, Clemente Julian, Lilinoe Yong. University of Hawai'i Educational Technology, ETEC 601, February 2007. Aloha and welcome to our Research Methods Project. Shawn, Clemente, Lilinoe, and I will be introducing you to “Program Evaluation.” (click)

    2. Did you do your homework? “An Evaluation of On-line Assignment Submission, Marking, and Return,” Stuart Palmer, 2005. 1. What was the main aim of the study, and was it achieved? 2. What was the means of acquiring the data? 3. What data implied a need for change in future iterations? (hint: there are 2 answers.) 4. What changes were needed? I’m sure everyone has done their reading homework. However, let me refresh your memory a bit about our article on on-line assignment submission, marking, and return while you get out paper and pen and jot down your answers to the questions about the reading showing on the screen. The study takes place in 2004 with a 2nd-semester, 4th-year Engineering Management class at Deakin University in Australia. It includes students located on-campus, off-campus, and off-shore in both Singapore and Malaysia. Prior to 2003, assignments were submitted in hard-copy form, either in person or via surface mail. In 2003, the year before this study, the instructor successfully moved weekly journal assignments to a WebCT discussion board. During the semester of the study, the instructor introduced both computer-scored multiple-choice testing and on-line submission, marking, and return of written reports. This was accomplished by students posting Word documents; his feedback appeared in the returned Word document via the “Track Changes” feature. Has everyone answered the questions? Let me know with a thumbs down if you need more time. (click when no thumbs down)

    3. Points to Ponder. Rationale: Large increase in the number of students (183) for the one instructor. First time on-line methods were used in this program. On-line submission, grading, and feedback will facilitate speed and ease for both students and instructor. Advantages: Computer-scored testing kept the additional students from impacting the instructor’s scoring time. Streamlined administration of a large number of student assignments. Reduced assignment turn-around time (especially for off-campus and off-shore students). Both student and instructor have copies of individual assignments with the instructor’s mark-up comments. Automatic grade-management feature in WebCT. Electronic submission allowed easy application of electronic plagiarism-testing procedures. Dr. Palmer, the instructor and evaluator, is aware technology is making an impact on assessment; the university’s programs increasingly include on-line elements. He has had a very large increase in the number of students taking this class and wanted to keep a larger number of smaller assignments. This is the first time these on-line methods have been used, so the continuing students are having to adapt. Marking and submitting can be done anywhere, a freedom for all. Students can track their grades and submissions to ensure accuracy. The approach cuts down on paper use and creates a permanent electronic archive. And the belief being tested, by assessing participant experiences, is that using the on-line process will be faster and easier for both students and the instructor. (click)

    4. Still Pondering. Method: Voluntary, anonymous written survey. **27 responses received out of 182 participants (14.8%). Statistical analysis showed a good match in gender and field of study between the total group and the respondent group, and statistically similar mean ages for on-campus and off-campus students between the two groups. The respondent group showed a borderline skew toward on-campus students, yet it was felt that since the on-line methods applied to all locations, the respondents’ location was not significant to the study. **Although the response rate was low, the good correlations suggest validity of conclusions inferred from the respondents. **Valid findings or rationalization? A written questionnaire was sent to each student with a postage-paid return envelope; university requirements dictated that it be voluntary and anonymous. It sought demographic input, including the location from which the student accessed the class, ability to use the system, usefulness of the system, and general opinions about the system. (click) Less than 15% responded to the survey. (click twice) There were good matches between the respondents and the total participant group for gender, field of study, and mean age of on- and off-campus students. (click) A higher percentage of on-campus students returned the survey, but because the on-line methods were required of all students, this did not seem pertinent to the study. (click) With such a low percentage of return, I wondered about the study’s assertion that positive correlations are sufficient for valid conclusions. Without more background from a psychological statistics class, additional substantiating studies would be helpful in determining this. (click)
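The response-rate arithmetic on this slide can be sketched in a few lines of Python. The 27-of-182 figures come from the slide; the gender counts below are invented purely for illustration, since the slide reports only that the proportions matched:

```python
# Response rate reported on the slide: 27 respondents of 182 participants.
participants = 182
respondents = 27

response_rate = respondents / participants
print(f"Response rate: {response_rate:.1%}")  # 14.8%

# Representativeness check sketched with hypothetical counts -- the study
# reports only that cohort and respondent demographics were a good match.
cohort_female_share = 0.22        # assumed share, for illustration only
respondent_female_share = 6 / 27  # assumed respondent count, for illustration

print(f"Cohort female share:     {cohort_female_share:.1%}")
print(f"Respondent female share: {respondent_female_share:.1%}")
```

A comparison like this is what lets the author argue that, despite the low return, the respondents resemble the full cohort on the measured demographics.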

    5. More Ponderings. Results: Over 90% had used on-line submission previously and knew how to submit. Over 80% knew how to get their assignment grades. Only 44% knew how to get their assignment feedback. 85% said turn-around time was faster on-line; 15% said it was the same. Ease of use = 4.1 (1 = very hard, 5 = very easy). Value of feedback = 4.2 for those who knew how to get their feedback; value of feedback = 2.9 for those who did not (1 = not valuable, 5 = very valuable). Liked: speed, timeliness, convenience, and the same deadlines for all. Disliked: “no problems,” file-size limits, couldn’t find feedback. Most students had used on-line submission and knew how to get their assignment grades. (click) Less than half, however, knew how to get the feedback on their papers. (click) Everyone agreed it was equally fast or faster than hard-copy submission. (click) Most agreed it was very easy to use. The value of the feedback was closely related to whether or not students knew how to get to their feedback. (click) What they reported liking were the anticipated outcomes. (click) Many had nothing they disliked, but the two problems mentioned were file-size limits and finding instructor feedback. (click)
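The Likert-scale summaries on this slide reduce to simple means on a 1–5 scale. A minimal sketch, with individual ratings invented solely to reproduce the means the slide reports:

```python
# Invented per-student ratings chosen to match the slide's reported means.
def mean(xs):
    return sum(xs) / len(xs)

ease_of_use = [4, 4, 5, 4, 4, 3, 5, 4, 4, 4]      # slide reports mean 4.1
found_feedback = [4, 5, 4, 4, 4]                  # slide reports mean 4.2
missed_feedback = [3, 3, 2, 3, 3, 3, 3, 3, 3, 3]  # slide reports mean 2.9

print(f"Ease of use (1=very hard, 5=very easy):   {mean(ease_of_use):.1f}")
print(f"Feedback value, knew how to retrieve it:  {mean(found_feedback):.1f}")
print(f"Feedback value, did not know how:         {mean(missed_feedback):.1f}")
```

The 4.2-versus-2.9 gap is the key quantitative signal here: perceived value of feedback tracks whether students could actually find it.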

    6. Final Ponderings. Conclusion: Although most had used on-line submission before and knew how to submit and get their scores, less than half knew how to retrieve their feedback. Since feedback is important to learning, this needs to be addressed with improved instructions. All students thought on-line submission was at least as fast as paper submission, regardless of whether they were on-campus or off-campus, with over 80% thinking it was faster: achieved goal. Most students rated it as easy to use: achieved goal. Those who had trouble locating their feedback rated the value of the feedback much lower than those who could locate it: need to improve feedback instructions. Speed, timeliness, and convenience of operation: achieved goal. Many respondents had no negative aspects to report: achieved goal. Students had trouble including graphics with the size limit set so low, so they had to mail hard copy: the maximum document size was set too small. Finding the instructor’s feedback was a problem for many, and because instructor feedback is critical to success, these instructions need improvement prior to the next implementation. (click twice) Regarding speed and ease of use, the goal for this implementation was achieved. (click) Worth was clearly tied to successful access, so again, the instructions for accessing feedback need to be reworked and improved. (click twice) Participant confirmation that this implementation was of benefit was received. (click) The file-size limit was increased in the next version of WebCT, which came out during this class, so this issue resolved itself. (click)

    7. Did you do your homework? “An Evaluation of On-line Assignment Submission, Marking, & Return.” What was the main aim of the study, and was it achieved? To facilitate submission, grading, and feedback for a larger number of students. Yes, it was achieved. What was the means of acquiring the data? A voluntary, anonymous survey or questionnaire. What data implies a need for change in future iterations? (hint: there are 2 answers.) a) Students who had trouble finding the instructor’s feedback didn’t value the feedback as much as students who found it more easily. b) The file-size limit was set too small to allow necessary graphics. What is the change needed? a) Improved instructions on how to retrieve the instructor’s assignment feedback. b) An increased file-size limit. So now, check the answers you wrote down on your papers BEFORE the review, as we go through the questions again. Using the clap response, how many got all 3 right? 2 right? 1 right? 0 right? Using the confused icon, if you missed a question, who did NOT know the right answer after I reviewed the article? Now, using the smile icon, if you missed a question, who DID know the right answer after I reviewed the article? Great! (click)

    8. What is Program Evaluation? A method to improve the effectiveness of a practice. • Simple (counting completions) • Moderate (surveying satisfaction levels) • Complex (observing pre/post behavior differences). It is NOT research, which uses more rigidly defined methods to “prove” effectiveness and thus anticipates a high degree of generalizability (transference to other settings). Purposes / when to use: Clarify and validate program objectives (anytime). Predict whether an objective is attainable (prior to implementing). Monitor and recommend adjustments (during implementation). Determine effectiveness (after implementation). Calculate worth, cost vs. gain (after implementation). Assess ways to improve the program (after implementation). Increase the evaluator’s knowledge and skill (anytime). Now that you’ve seen an example, I’ll give you more information about the process itself. It is a way to improve a practice and can be a simple, moderate, or complex undertaking. (click) It is different from a research study, which would use a control group as well as other prescribed criteria. Because of this rigor, research studies have greater transferability to non-studied settings. For instance, a program evaluation may show that students score well when taking a pre-algebra math course on-line. From this program evaluation you cannot transfer the finding to say that any math class taught on-line will also be successful. A research study, however, which has a control group and a test group for 3 different math courses, lists all the criteria the courses have in common, and then finds similar, positive outcomes in each, can make a prediction regarding similar success for a 4th math delivery. And this research study may have enough criteria in common with an engineering class that it may be used to predict success in that area as well. Program evaluations are just that: evaluations of specific programs. Research studies are more complex and rigorous so that their results may be applied to similar, but unevaluated, contexts. (click) Some purposes can be pursued at any time during the implementation process, and some are intended for a particular stage. Some benefit those who are implementing the program, some benefit those involved with the program, and all benefit the evaluator. (click)

    9. Who? The “Stakeholders”: Providers: suppliers, staff, specialists. Consumers: customers, clients, communities. The Evaluator. Ethics alert: The study has been commissioned by one of the other stakeholders, so the evaluator may be subject to conscious or unconscious pressure to ensure the desired outcome. Program evaluation includes most or all stakeholders. (Research focuses on only 1 or 2.) In program evaluation, the evaluator is an observer without overt influence on the program. (In research, the evaluator directly influences the program with testing limitations.) Who is involved? Those providing the study environment and those participating in the study are the obvious stakeholders. But it can’t be ignored that the evaluator is a participant as well, and has been hired by one or more of the stakeholders. It is critical that the evaluator remember that connection, to guard against swaying the evaluation. (click) There are additional differences between program evaluation and research. When deciding whether to conduct a program evaluation or a research study, these differences need to be considered, as well as the time and effort involved with each. (click)

    10. Why? Reasons: Accountability (outside the program: funding source; inside the program: administration). Improvement (identifying what works and what doesn’t). Marketing (demonstrations showing effectiveness). Ethics alert: While research can infer transference to additional settings due to its rigid, controlled testing methods, with program evaluation this extrapolation must be left to the consumer to surmise. Why would an evaluator be sought? There are 3 reasons to conduct a program evaluation, and the instigator of the evaluation can come from any of the stakeholder groups. (click) The decision whether to conduct a program evaluation or a research study depends on how the results are to be used. If you are deciding whether or not to keep a particular course setup, then a program evaluation is appropriate. If you want to know whether the same course setup would work at another grade level or in another subject, then a research study would be more useful. However, if a program evaluation with successful outcomes uses a similar setting, it may be transferable, but that is for those involved with the similar setting to decide, not for the evaluator to promote. (click)

    11. How? Methods: • Quantitative (analyzing numbers, amounts) ~ simple statistics (frequency, central tendency, variability) ~ results can generalize about similar program criteria. • Qualitative (identifying patterns, characteristics) ~ subjective, evaluator-dependent ~ detailed observations and records are critical. Models: (a program can be assessed using some or all models) Needs Assessment (is there a need?). Feasibility Study (can it be done?). Process Evaluation (can it be improved?). Outcome Evaluation (did it do what was desired?). Cost Analysis (was it financially worthwhile?). THEN: apply research methods to determine transferability. How do you go about doing a program evaluation? The study can be either quantitative or qualitative. Quantitative is more objective and more able to generalize about similar program criteria. A qualitative study is more subjective and relies heavily on the objectivity and reliability of the evaluator. (click) There are 5 different models of program evaluation. We already know something about a Needs Assessment (thank you, Ari and Peter!), as it points out the gap between what is desired and what actually exists; all levels of the education structure can use this model. Feasibility looks at the nuts and bolts of implementing, including the resources, funds, and time involved; departments use this to make decisions about new offerings. Process looks at improving the program “on the fly”; teachers do this informally all the time, and it may also be used more formally to compare delivery methods. Outcome Evaluation was the method used in the assigned reading; departments or administration may also use it to determine whether an experimental program was successful. Cost Analysis is used by administration as one tool in deciding which programs to keep and which to cut. If your program evaluation is successful, you might then go on to a more scientifically controlled research study to see whether the program is of compartmentalized use or applicable to widespread use. (click)
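The “simple statistics” a quantitative evaluation would report (frequency, central tendency, variability) can be sketched with Python’s standard library; the survey responses below are invented for illustration:

```python
# Descriptive statistics over an invented set of 1-5 survey responses.
from collections import Counter
from statistics import mean, median, mode, pstdev

responses = [5, 4, 4, 3, 5, 4, 2, 4, 5, 3, 4, 4]

# Frequency: how many respondents chose each rating.
print("Frequency:", dict(sorted(Counter(responses).items())))

# Central tendency: three common summaries of the "typical" response.
print(f"Mean:   {mean(responses):.2f}")
print(f"Median: {median(responses)}")
print(f"Mode:   {mode(responses)}")

# Variability: spread of responses around the mean.
print(f"Population std dev: {pstdev(responses):.2f}")
```

These are exactly the kinds of figures an outcome evaluation like the Palmer study reports: frequencies of each answer, means of Likert ratings, and enough spread information to judge agreement.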

    12. In Summary. Program evaluation needs to be pre-planned. Program evaluation should be included in each stage of implementation (before, during, after). Investigation and research need to be done at each stage by the evaluator: Diagnosis (seek multiple options). Design (sequence, boundaries, and roles clearly defined). Delivery (continual adaptations during the process). Debriefing (assess experiential change). Disembarkation (transition out of the program). Ethics alert: report findings without skewing toward desired outcomes; do not over-generalize past the situation studied; be aware that research requests need more knowledge and experience; honor the rights of the participants; evaluate your own study for flaws. Program evaluation can’t be an afterthought; without pre-planning, important aspects get missed. It is ideally included at each stage. The evaluator has a responsibility to investigate as thoroughly as possible during each phase of delivery. Part of the due diligence of scholarly, peer-reviewed journals is to verify that the evaluator is reputable.

    13. Are you still with me? 1. A more basic and less time-consuming information-gathering method that seeks to improve effectiveness in a given situation. a) Research b) Program Evaluation. 2. When the study is completed, its results can be applied to other areas not studied, if basic criteria are met. a) Research b) Program Evaluation. 3. The study involves most or all associated groups, including customers, clients, communities, suppliers, staff, and specialists. a) Research b) Program Evaluation. Using the selection buttons at the top of your Elluminate screen, choose a or b for your answers to the three questions. B, A, B. Good job! To round out your understanding of program evaluation, we will present 3 more examples for you.

    14. Shawn with First ET Example. Shawn is next with his example: A Model for Developing and Managing Distance Education Programs using Interactive Video Technology.

    15. 1st Example of ET Research A Model for Developing and Managing Distance Education Programs using Interactive Video Technology. The article is written by associate professor Michael Forster and director and associate professor Earlie Washington, School of Social Work, from the University of Southern Mississippi.

    16. Article in a Nutshell The model described in the article is based on 2 years of experience with a distance education program at the graduate level using interactive video technology. Major components of the model in the article are: Accreditation standards compliance Resource requirements Curriculum adaptation Faculty Development Program Evaluation Although this model was developed with the Masters in Social Work in mind, considering these factors is essential in the development of any distance education program.

    17. Accreditation Standards Compliance. Why is it important? One of the accreditation standards states that “all program components, including part-time and off-campus, are to provide an equal quality of education.” This means that DE programs will be held to the same accreditation standards as their brick-and-mortar counterparts. To ensure that the DE program complies with its accreditation standards, the school created a program-development and accreditation checklist which includes: Student Development, Curriculum, Evaluation. By a show of a smiley face or clapping hands, how many of you have had some type of involvement with accreditation at your institution within the last 2 years?

    18. Resource Requirements. Although it is difficult to compare costs between DE and traditional classrooms, technology costs plus general administrative costs plus the cost of assuring accreditation compliance do not add up to a cost reduction! By a show of a smiley face or clapping hands, how many of you can tell me the estimated expense of a university distance education program? Is it: $50-75k? $75-100k? $100-150k? $175-277k? That’s right. The expense to the university is ______! This includes costs for personnel, travel, postage, telephone, printing, contractual services, rental, and supplies. For universities that do not provide classrooms, library resources, and technical support, the costs will be significantly higher.
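A cost tally like the one behind this slide is just a sum over line items. The categories below come from the slide; every dollar figure is invented for illustration (the slide itself withholds the actual total for the quiz), chosen only so the total lands inside the slide’s highest quoted range:

```python
# Hypothetical annual DE program expenses by category (all amounts invented).
costs = {
    "personnel": 140_000,
    "travel": 8_000,
    "postage": 2_000,
    "telephone": 3_000,
    "printing": 5_000,
    "contractual services": 15_000,
    "rental": 10_000,
    "supplies": 4_000,
}

total = sum(costs.values())
print(f"Estimated annual DE program expense: ${total:,}")
```

Even a rough tally like this makes the slide’s point concrete: the categories add up quickly, and none of them represents a saving over a traditional classroom.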

    19. Curriculum and Course Adaptation. Determine which courses are adaptable to the technology used to deliver the program, and which courses are not. Obviously, a clinical practicum setting would not be appropriate for an interactive video network (IVN) class. Incorporate the 4 basic stages to ensure that the adaptation to IVN is in line with learner needs and content requirements: Design, Development (the pivotal stage in adapting to the IVN environment), Evaluation, Revision. Differences in timing and pacing must be addressed. Guidelines should specify what will be done if the equipment fails.

    20. Faculty Development. Develop a formal orientation process for both IVN and distance education, and require all DE faculty to complete it before instructing a class. Provide mentoring support to staff so that they are proficient in the use of the delivery technology and can function effectively as facilitators of learning activities. Conduct early course observations and formative evaluation so that corrective action can be taken in time to make a difference.

    21. Program Evaluation. Evaluation should address 3 program dimensions: Implementation: using formative evaluation, has the program been implemented as it was designed? Quality Assurance: regularly evaluate compliance with both accreditation and program standards around curriculum, resources, faculty, student development, and program organization and governance. Outcomes: an evaluation of program outcomes includes a comparison of the main-campus and DE programs. Answer 2 questions: What do we count as success? How do we measure it?
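The outcome comparison this slide calls for, computing the same success metric for the main-campus and DE cohorts, can be sketched as follows. All numbers are invented; the slide reports no data, and completion rate is just one possible answer to “what do we count as success?”:

```python
# One candidate success metric: course completion rate per cohort.
def completion_rate(completed, enrolled):
    return completed / enrolled

main_campus = completion_rate(88, 100)  # hypothetical cohort counts
distance_ed = completion_rate(42, 50)   # hypothetical cohort counts

print(f"Main campus completion: {main_campus:.0%}")
print(f"DE completion:          {distance_ed:.0%}")
print(f"Difference:             {main_campus - distance_ed:+.0%}")
```

Whatever metric is chosen, the point of the model is that it must be defined before the comparison is run, so that both cohorts are measured the same way.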

    22. Data Collection A performance guidance system is needed to ensure that we are consistently driving a process of continuous quality improvement. Ideally, the school’s management information system will be adjusted to routinely collect this data and return it to the appropriate stakeholder. By a show of a smiley face or clapping hands, how many of you would say that you have firsthand knowledge that this data is being routinely collected at your institution?
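The idea of a management information system "returning data to the appropriate stakeholder" can be sketched as a simple routing step. In this hypothetical Python example (the metric names, stakeholder roles, and values are all invented for illustration), each collected metric is grouped into the inbox of the stakeholder responsible for acting on it:

```python
from collections import defaultdict

# Hypothetical routing table: which stakeholder reviews which metric.
STAKEHOLDER_FOR = {
    "course_completion": "dean",
    "student_satisfaction": "instructor",
    "accreditation_compliance": "program_director",
}

def route_metrics(records):
    """Group incoming (metric, value) records by the stakeholder who
    should review them, mimicking an MIS that returns evaluation data
    to the appropriate audience. Unmapped metrics go to a default inbox."""
    inbox = defaultdict(list)
    for metric, value in records:
        inbox[STAKEHOLDER_FOR.get(metric, "registrar")].append((metric, value))
    return dict(inbox)

reports = route_metrics([
    ("course_completion", 0.87),
    ("student_satisfaction", 4.2),
    ("course_completion", 0.91),
])
print(reports)
```

The routing table is the piece that would be "adjusted" in practice: adding a new metric to continuous quality improvement is just a new entry mapping it to its owner.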

    23. Clemente with Second ET Example

    24. Technology Connoisseurs in the Campus Evaluating Educational Technologies Evaluates effectiveness of technologies 4 positions in evaluating educational technologies Stated objectives Compare alternative approaches Benefits & Costs of the technology Criteria developed from a particular theory

    25. Technology Connoisseurs in the Campus Scenario Uses the fourth position Study conducted Fall 2004 at a state college in the Southeast

    26. Technology Connoisseurs in the Campus Based on Eisner’s Connoisseurship model Definition Art of appreciation 5 dimensions Intentional Structural Curricular Pedagogical Evaluative

    27. Collection Process 7 technology connoisseurs Different capacities Interview questions Based on 5 dimensions 1st Dimension Question: demographic information 2nd Dimension Question 1: Technology selection criteria Question 2: Compatibility of technology with school goals Technology Connoisseurs in the Campus

    28. Technology Connoisseurs in the Campus 3rd Dimension Question 1: Availability of technology 4th Dimension Question 1: Alignment of technology and curricular goals 5th Dimension Question 1: Learning and teaching experiences

    29. Technology Connoisseurs in the Campus Review Question Sub bullets

    30. Lilinoe with Third ET Example

    31. VR Simulator Room CompleXscope at the National Institute for Fusion Science is a projector-based, room-sized VR system built by engineers to facilitate learning in a life-like fusion reactor room. If I were to conduct a program evaluation on a VR simulator room, it would be a qualitative study based on the observations of the participant. The data collected would be the observations. NOTES: The observer can walk in a room surrounded by screens equipped with a 3D sound system of eight loudspeakers, enabling sonification in addition to visualization. In the VR room there is only a simple input device called Wanda, which has a tracking sensor in it to enable the observer to interact with the VR world. NOTE: Stereo sounds relative to the distance and velocity between the virtual object and the observer are produced automatically.

    32. Virtual Reality Program Evaluation Educational Uses of Web-based Virtual Reality Virtual reality computer-simulated environment Paper-based module traditional Tonight I am focusing on both the quantitative (survey) and qualitative (observations) methods with you. The theory behind virtual reality is that students are better able to master, retain, and generalize new knowledge when they are actively involved in constructing that knowledge in a learning-by-doing situation. My question is: what effect will virtual reality have on student achievement?

    33. Virtual Reality Program Evaluation Ways VR facilitates learning visualize abstract concepts observe at atomic or planetary scales visit and interact with distant environments Multi-sensory stimulated learning One of the unique capabilities of virtual reality is the ability to allow students to visualize abstract concepts, to observe events at atomic or planetary scales, and to visit environments and interact with events that distance, time, or safety factors make unavailable. So the observations become the data; they can be discussed in interviews or focus groups with guided questions, all of which are qualitative. Program evaluation should be conducted before, during, and after the implementation, but due to time constraints we are going to respond to a survey, which is a quantitative method of program evaluation.

    34. VR at James Cook University Navigating through virtual world Virtual tour of Great Pyramids http://www.pbs.org/wgbh/nova/pyramid/explore/khufudesclo.html Paper-based module Text-based lesson Ancient Egypt Quality pictures and diagrams Analyze: Which approach was most effective for this program? Why? Reflect: Which is most appealing to you? Why? Researchers at James Cook University School of Education compared the effectiveness of navigating through virtual reality against the use of textbooks when teaching 7th grade students about the Egyptian Pyramids. Pretest survey: How many of you think that the students performed better with the VR tour? This is where we would take the tour and make observations. Posttest survey: How many of you think that the students did better with the VR tour? Of ____ participants we have ___ respondents. ___ respondents say the students performed better with VR. Overall the students felt that the text-based information was most beneficial to their learning because it had more detail. The tour basically showed them the size and depth of the pyramid. It didn’t support the theory. NOTES: The 7th graders tested were given a virtual tour of the Great Pyramid in the Virtus Archaeological Gallery. They were also given a text-based introductory lesson on Ancient Egypt, and quality pictures and diagrams of the pyramid.
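The pretest/posttest show-of-hands surveys used throughout these examples reduce to a simple tally. A minimal Python sketch (the response lists are invented for illustration; they are not the study's actual counts, which are left blank in the slides):

```python
# Hypothetical responses to "Did students perform better with VR?"
pretest = ["yes", "yes", "no", "yes", "no"]
posttest = ["no", "no", "yes", "no", "no"]

def tally(responses):
    """Return (yes_count, total) for a show-of-hands style survey."""
    return sum(1 for r in responses if r == "yes"), len(responses)

pre_yes, n = tally(pretest)
post_yes, _ = tally(posttest)
# A drop from pretest to posttest would mirror the James Cook finding
# that students rated the text-based module more beneficial than the VR tour.
print(f"pretest: {pre_yes}/{n} yes; posttest: {post_yes}/{n} yes")
```

Comparing the two tallies before and after the demonstration is the quantitative piece of the evaluation; the observations made during the tour supply the qualitative piece.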

    35. VR at Georgia Tech University Architectural Design CDs 3D sketchpad http://www.plan3d.com/pages/homeChlgr.aspx?rd=1 Paper-based module 2D sketches Analyze: Which approach was most effective for this program? Why? Reflect: Which was most appealing to you? Researchers at Georgia Tech used CDs and 3D sketchpads to see if VR aided in the development of architectural design skills, comparing the results to designs developed using paper-based modules. Pretest survey: How many of you think that the students performed better with the VR approach? This is where we would take the tour and make observations. Posttest survey: How many of you think that the students did better with the VR approach? Of ____ participants we have ___ respondents. ___ respondents say the students performed better with VR. NOTE: After the 10-week study, there was increased spatial understanding with the VR approach because students can adjust the object orientation and position. This study supported the theory.

    36. VR & ScienceSpace researchers 3D force & energy in electric fields (VR) http://www.bbc.co.uk/schools/revisewise/science/physical/ 2D force & energy in electric fields Same content and learning activities without virtual reality Which approach was most effective for this program? Why? Which is most appealing to you? Why? ScienceSpace researchers compared 2D to 3D using the same content and learning activities, focusing on force and energy in electric fields. Pretest survey: How many of you think that the students performed better with the 3D approach? This is where we would take the tour and make observations. Posttest survey: How many of you think that the students did better with the 3D approach? Of ____ participants we have ___ respondents. ___ respondents say the students performed better with VR. NOTE: The results showed that students gained a better understanding of the concept using the 3D approach because the multi-sensory approach was stimulating and interactive. The 3D students were able to outperform the 2D students in the areas of sketching the concept, performing the concept, and predicting how changes to the source charge would affect the electric field. This study supported the theory.

    37. VR & H.I.T.L researchers AtomWorld & Mac-based chemistry cell biology http://www.youtube.com/watch?v=bPuIj0Pc0IQ Bill Nye the Science Guy - Atoms Paper-based module cell biology (2 hydrogens + 1 oxygen) Which approach was most effective for this program? Why? Which is most appealing to you? Why? HITL researchers studied the effectiveness of an immersion and interactivity approach to learning about cell biology. The task was for students to combine 2 hydrogens and 1 oxygen to form a water molecule using AtomWorld and Mac-based chemistry. Pretest survey: How many of you think that the students performed better with the VR approach? This is where we would take the tour and make observations. Posttest survey: How many of you think that the students did better with the VR approach? Of ____ participants we have ___ respondents. ___ respondents say the students performed better with VR. The results revealed that a 3D multi-sensory world can help to develop a mental model better than a 2D approach. This study supported the theory. This is where I can conduct the OUTCOME EVALUATION by asking the question, “Did it do what it said it would do?” I can conduct the COST ANALYSIS by asking the question, “Was it financially worthwhile?” In conclusion: program evaluation is a method to improve the effectiveness of a practice. The method should be implemented before, during, and after the implementation. Thank you. We are now open for questions to the group.

    38. Questions? All group members will help answer the questions. Mahalo!
