Assessment Drift

No assessment is completely valid. But that's not simply because we are not very good at designing assessment tasks; it's more a problem with the whole system than that. As Kahneman put it: "This is the essence of intuitive heuristics: when faced with a difficult question, we often answer an easier one instead, usually without noticing the substitution" (2011: 12). This is how it works in practice:

Let's assume that we do know what the area of practice requires (I'll use a vocational/professional example here for clarity, but the argument applies in large part to academic areas, too).

In the simplest cases this might well be what a beginning practitioner would pick up by hanging around with old hands, a.k.a. "legitimate peripheral participation" in a "community of practice". But of course there are all kinds of regulatory bodies, competing definitions of where the boundaries of the area of practice lie, and employer and perhaps professional-body interests, which (usually) serve to inflate the requirements...

The outcome of such negotiations is usually a "course", which takes place in a college, and the content and the process are utterly changed. Instead of finding out about things and how to do them as they crop up in practice, the student (as she has now become, and that is a transformation in itself) finds that all the things which happen at the same time and interact in the real world have been separated out into discrete "subjects", which are taught separately, usually in classrooms, although occasionally in workshops. Nor does the sequence of the teaching bear any relationship to the real world; it starts with the easy stuff and goes on to the more difficult, and it is often confusing because it doesn't fit together with any logic other than the teachers'. Oh yes, there are teachers too; generally they are recruited on the basis of their academic qualifications or their understanding of a "subject", never mind that you don't need to know about most of the subject, just the bits you will need to use.

A lot of effort is put in at the college to ensure that everyone learns the same material at the same pace and in the same order, and the principal means of managing this is through regular assessment. Passing assignments or examinations is the prerequisite for progression through the course, and the assessments are carefully staged to correspond with the students' growing command or "mastery" of the material. However, here too the exigencies of practice distort the process. Most assessments rely on writing, for example, and we all know how that can disadvantage a dyslexic student, or one working in a language other than her own. But beyond that, the assessment may have to be re-cast in a form amenable to a written answer, so the student writes about something rather than demonstrates it. When National Vocational Qualifications were introduced in the UK in the late '80s, much was made of their more valid approach to assessment, based on direct observation of practice and oral questioning where possible; when delivered other than in-service in the workplace, however, the need for permanent, standardised and verifiable records has led to increased reliance on portfolios of evidence, and more paperwork. It is this which is quality-assured, rather than the direct practice.

Moreover, it is sometimes not practical, not ethical, or simply too expensive to assess some areas of practice directly; or it is impossible to create consistent conditions that standardise the level of difficulty; so simulations or other indirect approaches are required. And when situations are created for the purpose of assessing performance, rather than to achieve an intrinsic goal, there is further drift.

By now, we are talking about the management of proxies: more or less adequate stand-ins or substitutes for the real thing. Step by step, under the inexorable constraints of the organisation, the teaching/learning and assessment process has diverged from what was originally intended. Each step has addressed (let's be generous and assume conscientiousness on the part of everyone involved in the process) say, 80% of the issues posed by the previous step. It has not addressed 10% of those issues, and it has added 10% of its own, including the "hidden curriculum".

Let's strip away all those intermediate accretions. Go back to the initial step, "what the area of practice actually requires", and see how the final step, "what the course actually assesses", relates to it. There may be about 15% overlap. (Do the maths on the diagram and it doesn't quite work, but then the real world is much messier than this discussion; sorry!)
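As a very rough sketch of how a figure like that might arise (my own back-of-the-envelope illustration, not part of the original argument): if each hand-off in the chain preserves only about 80% of what the previous step required, the overlap with the original area of practice shrinks multiplicatively, and eight or nine such steps are enough to bring it down to somewhere around 15%.

```python
# Back-of-the-envelope sketch (my assumption, not the author's model):
# treat each step in the chain as preserving about 80% of what the
# previous step required, so the overlap with the original area of
# practice shrinks multiplicatively with the number of hand-offs.
retention_per_step = 0.8

for steps in range(1, 10):
    overlap = retention_per_step ** steps
    print(f"after {steps} step(s): roughly {overlap:.0%} of the original requirements survives")

# After 8 steps this gives about 17%, and after 9 about 13%; a handful of
# institutional hand-offs is all it takes to leave only the roughly 15%
# overlap suggested above.
```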

Notes and References

This discussion is based on Howard Becker's classic "A School is a Lousy Place to Learn Anything in", in conjunction with ideas about situated learning and communities of practice, and informed by Hunter I (1994) Rethinking the School: subjectivity, bureaucracy, criticism Sydney; Allen and Unwin / New York; St Martin's Press, on the "social technology" of classes, classrooms and timetables.

My take on the hidden curriculum is here.

Here is an annotated presentation covering slightly wider ground, but including an earlier version of the material above.

References for the above include, in addition to those already cited:

  • Kahneman D (2011) Thinking, fast and slow London; Penguin. Well worth reading for many reasons; his experiences in the armed forces, failing to develop assessment capable of predicting the effectiveness of officers in training, are principally in chs. 17 and 18.
  • Wenger E (1998) Communities of Practice; learning, meaning and identity Cambridge; C.U.P.

(29 July 2013)




