van der Vleuten, Cees and Schuwirth, Lambert (2005) Assessing professional competence: from methods to programmes. Medical Education, 39 (3). pp. 309-17. ISSN 0308-0110
Abstract
INTRODUCTION
We use a utility model to illustrate that, firstly, selecting an assessment method involves context-dependent compromises, and secondly, that assessment is not a measurement problem but an instructional design problem, comprising educational, implementation and resource aspects. In the model, assessment characteristics are differently weighted depending on the purpose and context of the assessment.
EMPIRICAL AND THEORETICAL DEVELOPMENTS
Of the characteristics in the model, we focus on reliability, validity and educational impact and argue that they are not inherent qualities of any instrument. Reliability depends not on structuring or standardisation but on sampling. Key issues concerning validity are authenticity and integration of competencies. Assessment in medical education addresses complex competencies and thus requires quantitative and qualitative information from different sources as well as professional judgement. Adequate sampling across judges, instruments and contexts can ensure both validity and reliability. Despite recognition that assessment drives learning, this relationship has been little researched, possibly because of its strong context dependence.
ASSESSMENT AS INSTRUCTIONAL DESIGN
When assessment is intended to stimulate learning, and requires adequate sampling, in authentic contexts, of the performance of complex competencies that cannot be broken down into simple parts, a shift is needed from individual methods to an integral programme of assessment, intertwined with the education programme. This calls for an instructional design perspective.
IMPLICATIONS FOR DEVELOPMENT AND RESEARCH
Programmatic instructional design hinges on a careful description and justification of choices, whose effectiveness should be measured against the intended outcomes. We should not evaluate individual methods in isolation, but provide evidence of the utility of the assessment programme as a whole.
Item Type: Article
Uncontrolled Keywords: medical education; undergraduate/methods/standards; educational measurement/methods; professional competence/standards
Subjects: B900 Others in Subjects allied to Medicine; X300 Academic studies in Education
Department: Faculties > Health and Life Sciences > Social Work, Education and Community Wellbeing
Depositing User: Paul Burns
Date Deposited: 10 Mar 2015 11:57
Last Modified: 12 Oct 2019 14:39
URI: http://nrl.northumbria.ac.uk/id/eprint/21582