Peer-Reviewed Journal Details
Mandatory Fields
Seery, N., Canty, D., Phelan, P.
2012
January
International Journal of Technology and Design Education
The validity and value of peer assessment using adaptive comparative judgement in design driven practical education
Published
Optional Fields
Teacher education; Technology education; Holistic assessment; Comparative pairs; Creativity
22
205
226
This paper presents the response of the technology teacher education programmes at the University of Limerick to the assessment challenge created by the shift in philosophy of the Irish national curriculum from a craft-based focus to design-driven education. The study observes two first-year modules of the undergraduate programmes that focused on the development of subject knowledge and practical craft skills. Broadening the educational experience and perspective of students to include design-based aptitudes demanded a clear alignment of educational approaches with learning outcomes. As design is a complex, iterative learning process, it requires a dynamic assessment tool to facilitate and capture that process. Considering the critical role of assessment in the learning process, the study explored the relevance of individual student-defined assessment criteria and the validity of holistic professional judgement in assessing capability within a design activity. The kernel of the paper centres on the capacity of assessment criteria to change in response to how students align their work with evidence of capability. The approach also supported peer assessment, where student-generated performance ranks provided insight not only into how effectively students evidenced capability but also into the extent to which their peers valued it. The study investigated the performance of 137 undergraduate teachers during an activity focused on the development of design, processing and craft skills. The study validates the use of adaptive comparative judgement as a model of assessment by identifying a moderate to strong relationship with performance scores obtained by two different methods of assessment. The findings also present evidence of capability beyond traditional measures: level of engagement, diversity and problem solving were identified as significant outcomes of the approach taken. The strength of the paper centres on the capacity of student-defined criterion assessment to evidence learning, and it concludes by presenting a valid and reliable holistic assessment supported by comparative judgements.
10.1007/s10798-011-9194-0
Grant Details