

Transparent Assessment
Contributors Joseph Bergin, Christian Kohls, Christian Köppe, Yishay Mor, Michel Portier, Till Schümmer, Steven Warburton
Last modification June 6, 2017
Source Bergin et al. (in press 2015)[1]; Warburton et al. (2016)[2][3]
Pattern formats OPR Alexandrian

Also Known As: Open Instruments of Assessment; Visible Assessment


Ensure that your assessment scheme is visible to your students, from the criteria to the actual tools you use to apply them.


You are teaching in a structured educational situation and have a clearly defined Assessment Criteria List.

***

Students often do not reflect on or evaluate their work, either because they do not know how to do so or because they do not see the value in it. This leads to unproductive work on both the student’s and the teacher’s side: students may not have done the work right, may not have done the right work, or both.

On the other hand, students may believe they are on the right track when they are not: they think they know how they will be assessed, or they assume that the assessment will match the way they validate their work themselves. When the assessment then shows otherwise, the experience is frustrating and demotivating.

In some cases, students actually are on the right track but could still perform better; however, they often do not know how to improve their performance.


Students often have misconceptions about what you require of them. If students don’t know where you want them to go, how will they get there?

***

Therefore, share all elements of your assessment scheme with the learners, from the Assessment Criteria List, through the patterns you use to apply them, to the tools you use to implement those. Instruct learners to use these tools to self- and peer-assess their work, and encourage them to debug the tools, i.e. to point out any ambiguities or inconsistencies.

Before the course (or learning activity) starts, provide students with the Assessment Criteria List. These criteria are implemented through your Assessment Instruments: share as much of these with the students as you can without compromising the integrity of your assessment. Explicitly demonstrate the Constructive Alignment of your Assessment Criteria List and your Assessment Instruments.

In order to maintain your autonomy, you may include elements of Discretionary Bonus. Sometimes you can give students access to the actual instruments you will use. For example, if you use a Rubric, you can provide it as is. However, some instruments, such as an exam, rely on concealment. In that case, you can provide equivalent examples such as a Mock Exam.


You may expect: knowing the details of the assessment reduces student anxiety, and students will perceive the assessment as fairer.

“Sunlight is said to be the best of disinfectants”[4]: your students have a clear interest in your assessment criteria and instruments. If these are unclear, ambiguous, incoherent, inconsistent or misaligned, they are sure to point this out to you.

Understanding the rationale behind the assessment will help students align their work (their epistemic practices) with the learning objectives.

Students will reduce their misconceptions about the assessment strategy and, by extension, about the content to be learnt. Disputes over assessment will be reduced thanks to a clearer understanding of the assessment criteria and procedures.

If your Assessment Criteria List and Assessment Instruments are well thought out, they should be a good example for students to follow. By studying them, and using them for Self Assessment and Peer Review, students will develop their meta-cognitive skills.


However, if your Assessment Criteria List and Assessment Instruments are not carefully designed, exposing them will narrow the students’ learning to the cases captured in the assessment tools and reduce the transfer of knowledge to unknown situations. In some cases, students may even “game” the assessment instruments, achieving high marks but missing the learning objectives altogether.


Related Patterns

—Uses Assessment Criteria List and Constructive Alignment.

—Used by Graded Peer Assessment and Self Assessment.

—Essential for Combined Self-Peer-Tutor Assessment.


Examples

—The University of Surrey provides students with free access to Turnitin’s[5] originality-checking service.

—Christian Köppe provides students with a Mock Exam in his introductory programming course. The mock exam also contains a grading scheme similar to the one used in the real exam.

—OU UK’s course “H817: Openness and innovation in elearning”[6] included a design studio component. Students were provided with the rubric used to assess this component.

—University of Haifa’s course on “Games and Learning”[7] provided students with the scheme that the tutor would use to mark them, and asked them to use the same scheme to mark themselves. The grade they gave themselves was factored into their course grade; one possible weighting is sketched below.
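
How exactly the self-assessed grade was factored into the course grade is not specified here. As a minimal sketch, assuming a simple weighted average with a hypothetical 20% weight on the self-assessment (the function name and weight are illustrative, not taken from the course):

    # Illustrative only: the actual weighting used in the Haifa course is
    # not documented in this pattern.
    def combine_grades(tutor_grade: float, self_grade: float,
                       self_weight: float = 0.2) -> float:
        """Weighted average of a tutor grade and a student's self-assessed
        grade; self_weight is the (hypothetical) share of the self-grade."""
        if not 0.0 <= self_weight <= 1.0:
            raise ValueError("self_weight must be between 0 and 1")
        return (1.0 - self_weight) * tutor_grade + self_weight * self_grade

    # Example: the tutor marks the work at 85, the student self-assesses at 90.
    print(combine_grades(85, 90))  # 86.0

Publishing the weight alongside the marking scheme keeps this element of the assessment transparent as well.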


See Also: John Hattie’s book Visible Learning[8]


References

  1. Bergin, J., Kohls, C., Köppe, C., Mor, Y., Portier, M., Schümmer, T., & Warburton, S. (in press 2015). Assessment-Driven Course Design: Fair Play Patterns. In Proceedings of the 22nd Conference on Pattern Languages of Programs (PLoP 2015). New York: ACM.
  2. Patlet also published in Warburton, S., Mor, Y., Kohls, C., Köppe, C., & Bergin, J. (2016). Assessment driven course design: a pattern validation workshop. Presented at the 8th Biennial Conference of EARLI SIG 1: Assessment & Evaluation. Munich, Germany.
  3. Patlet also published in Warburton, S., Bergin, J., Kohls, C., Köppe, C., & Mor, Y. (2016). Dialogical Assessment Patterns for Learning from Others. In Proceedings of the 10th Travelling Conference on Pattern Languages of Programs (VikingPLoP 2016). New York: ACM.
  4. https://en.wikiquote.org/wiki/Louis_Brandeis
  5. http://turnitin.com/
  6. http://www.open.ac.uk/postgraduate/modules/h817
  7. https://sites.google.com/a/edtech.haifa.ac.il/games/
  8. Hattie, J. (2008). Visible Learning. Routledge.