Further Research and Resources Ruth Currey ARE 6747: Assessment Seminar in Art Education
Judging Student Multimedia: Fifteen Criteria Teachers Need to Effectively Assess Kids' Projects By: Cornelia Brunner Electronic Learning, May/June 1996, pp. 14-15 APA Citation Information Brunner, C. (1996). Judging student multimedia: Fifteen criteria teachers need to effectively assess kids' projects. Electronic Learning, May/June, pp. 14-15. • Teachers and students alike are using multimedia as a learning tool • However, a student multimedia project must be judged by different criteria than a traditional paper report • The instructor must take into account the level of preparation, as well as sources, organization, navigability of the presentation, and media integration • It is important for the student to thoroughly understand the question that inspired the project, and to define the major sections of the report • The student must also be judged on how well the multimedia information has been packaged, and how easy it is to navigate through the entire report • Most importantly, the student should understand how the various elements of the presentation contribute to the project, and be able to explain the rationale for including each item
Judging Student Multimedia: Fifteen Criteria Teachers Need to Effectively Assess Kids' Projects • Can the students articulate what important question motivated their research? • Have the students defined the major sections of their report and how they relate to each other before they start to create screens? • How clear and complete is the storyboard, in which students plan how their screens will look, how they will relate to each other, and what they will contain? • How varied are the sources on which the research is based? • How appropriate are the sources? • Are the sources of text, images, sounds, animations, and movies properly referenced? • Is there an appropriate central metaphor tying together the different parts of the report? • How good is the screen design? Is there a lot of clutter? How hard is it to read the text? • How good is the design of the information? How useful is the organization of screens and buttons? • How clear are the navigation signals? • Are the links conceptual as well as navigational? • Do the students understand that their choice of illustrations (quotes, sounds, images, animations, or videos) creates a point of view? Can they articulate the assumptions and beliefs behind the materials they select to "illustrate" their topic? • Can students explain why they selected some materials and left out others, and how the various elements contribute to the meaning of the overall piece? • How appropriate is the choice of media? • Can students explain their editorial decisions? The public nature of the medium helps students become more conscious of the impact of their choices; it encourages care in technique and pushes them to solve difficult design problems.
Commentary • The article provides thoughtful questions to consider when assessing student work • However, unlike the other articles, no examples are given to demonstrate the implementation of the suggestions • This article works better as a jumping-off point or basic source of reference in the creation of the instrument than as a rubric itself
Designing Assessment for Student Multimedia Projects By: William Penuel Learning & Leading with Technology, February 2002, 29(5), pp. 46-53 APA Citation Information Penuel, W. (2002). Designing assessment for student multimedia projects. Learning & Leading with Technology, 29(5), pp. 46-53. • Today, educational uses of technology require students to develop higher-order thinking skills by authoring multimedia projects, designing simulations, or using visualization tools to master difficult concepts • Unfortunately, the corresponding assessments are not widely available to teachers and policy makers • Researchers have not paid enough attention to making visible the process of developing assessments as a way to address concerns about the effectiveness of student-centered approaches to using technology • Assessment design ideally begins with an understanding of the student skills and understandings that instructional activities are intended to develop • To draw conclusions about students' skills from their work, it is necessary to develop a way to judge the quality of students' performances • Frequently in assessment design, this step is undertaken after a task has been developed and all the data have been collected • The process designed specifically for developing a rubric to score students' work is an opportunity to involve students and their classroom teacher in developing, reflecting on, and revising standards for what counts as a quality project • The process of students defining criteria for quality work provides researchers an opportunity to see whether the assessment task would actually engage students so that they could exhibit the intended skills • The students' development and refinement of criteria illustrates the ideal interplay between making standards meaningful in terms of real work and developing appropriate and useful assessments
Designing Assessment for Student Multimedia Projects • Assessment in student multimedia design projects needs to increase its focus on involving students and teachers in designing, adapting, and using classroom assessment tools as opportunities for reflection and feedback • The benefits of project-based learning using multimedia technology in all likelihood depend on the extent to which teachers use assessment tasks such as the one described in this article to develop shared understandings of quality in their classrooms • To the extent that these assessments also provide evidence of student learning, they will fuel support for engaging students in the exciting and complex process of designing with technology Commentary • In comparison to the first article, this article provides more specific information regarding the procedures needed to develop the actual rubric for technology-based assessments • Actual rubrics were provided as examples • Suggestions for modifications and revisions are also presented in an easy-to-follow manner
Taking Aim: Tips for Evaluating Students in a Digital Age By: Caroline McCullen Technology & Learning, March 1999, 19(7), pp. 48-50 APA Citation Information McCullen, C. (1999). Taking aim: Tips for evaluating students in a digital age. Technology & Learning, 19(7), pp. 48-50. • The challenge involved in measuring such things as creativity, problem solving, and cooperative learning can be daunting • However, these sorts of "skills" are the ones that so many of us hope students will develop with help from technology • It is important to define curriculum goals and technology uses before setting up classroom procedures for evaluating students • Rubrics offer a way for every student to succeed at some level, and are particularly effective for evaluating projects completed over several days or weeks • Share your rubric or assessment tool with the students before they begin their tasks • Preparing them for the evaluation process amounts to providing a road map to guide them along their way
Taking Aim: Tips for Evaluating Students in a Digital Age • Before focusing on the tools needed to measure student achievement, it's important to have the bigger picture in mind • Begin with the standard curriculum and determine which objectives in your curriculum standards you want to target • Envision a range of desired outcomes • Evaluate possible uses of technology to achieve those outcomes • As soon as you have curriculum goals and technology uses in mind, you can begin to set up classroom procedures for keeping students accountable • Plan research strategies • Discuss assessment procedures • If carefully constructed, these procedures keep students accountable for the process as well as the product • When students feel accountable, their behavior often improves and they take the assignment more seriously • When developing the instrument, it is better to start at the top and work your way down • If we start with "perfection" and work backwards, it becomes easier to develop levels or degrees of success Commentary • Procedures are suggested, but not specifically laid out as in article #2 • An example is given based on an actual rubric, but it is not directly transferable to the classroom • Suggests student assistance in rubric creation to build accountability
Other Possible Resources (Not Listed Previously) Ohler, J. (2008). New media assessment is upon us. School Arts, 107(8), p. 16. White, H. (2006). Engaging arts assessment through technology. School Arts, 106(2), pp. 26-27. References Brunner, C. (1996). Judging student multimedia: Fifteen criteria teachers need to effectively assess kids' projects. Electronic Learning, May/June, pp. 14-15. McCullen, C. (1999). Taking aim: Tips for evaluating students in a digital age. Technology & Learning, 19(7), pp. 48-50. Penuel, W. (2002). Designing assessment for student multimedia projects. Learning & Leading with Technology, 29(5), pp. 46-53.