Multiple Right Ways/alx

From Open Pattern Repository for Online Learning Systems


Multiple Right Ways
Contributors: Joseph Bergin, Christian Kohls, Christian Köppe, Yishay Mor, Michel Portier, Till Schümmer, Steven Warburton
Last modification: June 6, 2017
Source: Bergin et al. (in press 2015)[1]; Warburton et al. (2016)[2][3]
Pattern formats: OPR Alexandrian

Allow different solutions, or different paths to good solutions, to be correct. If some solutions or paths are better than others, make the criteria that lead to higher scores transparent.


You are assigning a task to students, whether in projects, written exams, or practice sessions, and you have to decide which answers are right. The task can be solved in multiple, identifiable ways.

***

In almost any area there are multiple ways to solve a problem. Even if you ask students to solve a problem using a specific method, there can still be variation in how they implement it.

You may be familiar with several alternative solutions, but students may come up with many unexpected approaches. Sometimes these solutions are clumsy; at other times they are innovative and even better than expected.

You may say that "anything goes", but actually some things go better! Some approaches may be plainly wrong, that is, they do not fulfill the assignment. Some approaches may be partly wrong: they lead in the right direction but have flaws in them.

Some approaches reach the goal but have negative side effects. Correct solutions should be graded better than partly correct solutions, and better solutions should be graded higher than weak ones. But it becomes tricky to grade superior solution approaches that are only partly correct.


If you allow only one solution to be correct, then you are unfair to students who have found equivalent or even better alternatives. However, some alternatives may work correctly but are still not good solutions, and grading these weaker solutions lower can seem unfair to the students who produced them.

***

Therefore, allow multiple alternatives as correct answers. Be open to unexpected approaches and evaluate each individually for correctness and quality. Clearly state the goal and expected outcome. Clearly state which additional criteria you expect the solution to satisfy (efficiency, innovation, readability, adherence to standards).

To test correctness, clearly define the expected outcome, for instance a mathematical result, a computed output, or features of an artefact. To test quality, state explicitly which requirements you expect to be fulfilled. You can use an Assessment Criteria List for that. If you use a Reference Solution, look for the aspects where different correct alternatives are (hypothetically) possible and identify the core elements of these alternatives that need to be present.

It is good to collect information on alternative correct answers and to use this knowledge in subsequent runs of the course.

If correctness and all requirements are satisfied, rate the outcome with 100% of the potential score. Provide bonus scores for innovative or high-quality solution approaches, and use the bonus scores to compensate for mistakes made in implementing that solution; a small sketch of such a scoring rule follows below.
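A minimal sketch of such a capped bonus rule, assuming a numeric point scale; the class and method names are hypothetical and not part of the pattern source:

    // Hypothetical scoring sketch: a fully correct solution earns the full score;
    // bonus points for innovation or quality may compensate for implementation
    // mistakes, but the total is capped at the maximum achievable score.
    public class FairScore {

        static double score(double correctnessPoints, double bonusPoints, double maxScore) {
            return Math.min(maxScore, correctnessPoints + bonusPoints);
        }

        public static void main(String[] args) {
            System.out.println(score(100.0, 0.0, 100.0)); // fully correct, no bonus -> 100.0
            System.out.println(score(90.0, 10.0, 100.0)); // innovative but slightly flawed -> 100.0
            System.out.println(score(70.0, 5.0, 100.0));  // weaker solution, small bonus -> 75.0
        }
    }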


You may expect: a diversity of approaches and fairer scoring of these different approaches; fewer students learning standard solutions by pure repetition or drill and practice; and encouragement for students to become more creative.


However, an unknown solution is hard to evaluate for correctness. Sometimes you have to work through the solution step by step, and even then you cannot be 100% sure whether it works in all cases. Exchanging with colleagues, and documenting such unknown solutions and how they were evaluated for future use, helps to lessen this problem.


Related Patterns

—is based on Assessment Criteria List

—needs to be taken into account when writing a Reference Solution


Examples

—In computer science, students can be assigned to write some code in order to solve a problem. The solution may require a list data structure and a traversal of the list that makes use of all its elements. The most important part here is that the students work with a list implementation and some loop structure, but there are multiple applicable list implementations (such as LinkedList, ArrayList, or a simple array) and multiple applicable loop constructs (such as a do-while, for, or while loop). All of these might be correct if applied in an appropriate way, as the sketch below illustrates.
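A minimal sketch of this example in Java, assuming the task is to sum all elements of a list; the class and method names are chosen here for illustration and are not prescribed by the pattern:

    import java.util.ArrayList;
    import java.util.Iterator;
    import java.util.LinkedList;
    import java.util.List;

    // Three equally correct ways to sum the elements of a list, differing in
    // list implementation and loop construct.
    public class MultipleRightWays {

        // Alternative 1: ArrayList with an index-based for loop.
        static int sumArrayList(ArrayList<Integer> values) {
            int sum = 0;
            for (int i = 0; i < values.size(); i++) {
                sum += values.get(i);
            }
            return sum;
        }

        // Alternative 2: LinkedList traversed with a while loop over an iterator.
        static int sumLinkedList(LinkedList<Integer> values) {
            int sum = 0;
            Iterator<Integer> it = values.iterator();
            while (it.hasNext()) {
                sum += it.next();
            }
            return sum;
        }

        // Alternative 3: a simple array with an enhanced for loop.
        static int sumArray(int[] values) {
            int sum = 0;
            for (int v : values) {
                sum += v;
            }
            return sum;
        }

        public static void main(String[] args) {
            List<Integer> data = List.of(1, 2, 3, 4);
            System.out.println(sumArrayList(new ArrayList<>(data)));   // 10
            System.out.println(sumLinkedList(new LinkedList<>(data))); // 10
            System.out.println(sumArray(new int[]{1, 2, 3, 4}));       // 10
        }
    }

All three variants produce the same expected outcome, so each would be rated as correct under this pattern; quality criteria such as readability or efficiency can then differentiate them further.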

References

  1. Bergin, J., Kohls, C., Köppe, C., Mor, Y., Portier, M., Schümmer, T., & Warburton, S. (in press 2015). Assessment-Driven Course Design - Fair Play Patterns. In Proceedings of the 22nd Conference on Pattern Languages of Programs (PLoP 2015). New York: ACM.
  2. Patlet also published in Warburton, S., Mor, Y., Kohls, C., Köppe, C., & Bergin, J. (2016). Assessment driven course design: a pattern validation workshop. Presented at the 8th Biennial Conference of EARLI SIG 1: Assessment & Evaluation. Munich, Germany.
  3. Patlet also published in Warburton, S., Bergin, J., Kohls, C., Köppe, C., & Mor, Y. (2016). Dialogical Assessment Patterns for Learning from Others. In Proceedings of the 10th Travelling Conference on Pattern Languages of Programs (VikingPLoP 2016). New York: ACM.