|Contributors||Joseph Bergin, Christian Kohls, Christian Köppe, Yishay Mor, Michel Portier, Till Schümmer, Steven Warburton|
|Last modification||June 6, 2017|
|Source||Köppe et al. (in press 2015); Bergin et al. (2015); Köppe et al. (2016); Bergin et al. (2016)|
|Pattern formats||OPR Alexandrian|
Assess for possible misconceptions of key concepts as soon as possible so that they can be corrected early.
You have taught the students a key concept that needs to be fully understood. The students’ questions about the concept have been answered and the learning objective seems to be achieved, but you need to be sure they have truly understood it.
The feeling of having grasped a concept often makes students believe that their understanding is indeed correct and no longer needs to be questioned.
Some misconceptions are common, but there can be too many to cover them all in a lecture. Most students won’t have problems, but a few might.
Both you and the students may think the concept has been understood, even though neither of you knows whether that understanding is truly correct, which increases the chance that the concept will be applied incorrectly.
Therefore, explicitly assess the students for common misconceptions of threshold concepts in order to identify necessary corrective actions.
A requirement for the implementation is to have identified and listed the most common misconceptions. This is usually based on experience from earlier iterations of the course, but such misconceptions can also be found in the literature. Special attention should be paid to key concepts, whose understanding is essential. If you don’t know which misconceptions might occur, use the list of key concepts and assess them in various ways. This might help to reveal misconceptions you weren’t aware of before.
Based on this list, choose appropriate assessment tools that help identify whether these misconceptions are present and need to be corrected. Such tools could be a short test during a lecture, small assignments, or simply targeted questions.
Applying this solution to larger groups requires a different implementation, as there is likely no time for individual tests. In that case it would be helpful to provide material that helps the students detect their misconceptions by themselves, for example through self-assessment. Another option is intelligent tutoring systems.
Applying peer-based approaches in combination with this pattern can be helpful, but also dangerous, as the peers might share the same misconceptions and reinforce them in each other. Thus, you shouldn’t rely on peer feedback alone. However, peer learning might contribute to making this problem less urgent.
It is good to assess early for misconceptions. The longer misconceptions persist, the harder they are to correct. Furthermore, if underlying concepts are not clearly grasped, there is a high chance that concepts building on them will also be poorly understood.
The assessment here can be summative or formative, with the latter being preferred.
Note also that, as a teacher, it can be hard to detect misconceptions, even when assessing the understanding of a concept. Seemingly correct answers to questions about a concept do not always indicate good understanding; misconceptions can hide behind correct answers. It is easier to assess shallow understanding than to get a grip on the desired deep understanding.
However, based on experience, teachers often know which misconceptions students might have about certain concepts, and they are also aware of the key concepts that need to be fully understood.
You may expect: Knowing which misconceptions the students still have will help you to adjust your pedagogy so that these misconceptions can be corrected.
However, even though assessing for misconceptions requires additional actions that need to be planned and may disrupt your lecture or course plan, the costs of not correcting these misconceptions are significantly higher.
In introductory programming courses, the concept of passing arguments by value (as opposed to by reference or by name) is often misunderstood by students. A simple exercise was used to assess whether the concept was understood correctly: the students were given the source code of short programs and had to determine the results by pure reasoning. The results were discussed in the group, thereby revealing misconceptions and correcting them immediately.
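A program of the kind students might be asked to trace could look like the following Python sketch (an illustrative assumption, not the original course material; the function names `increment` and `append_item` are hypothetical). It contrasts rebinding a parameter, which is invisible to the caller, with mutating the object a parameter refers to, which is visible:

```python
def increment(n):
    # n is a local name bound to the caller's object;
    # rebinding it does not change the caller's variable
    n = n + 1
    return n

def append_item(items):
    # mutating the object the parameter refers to
    # IS visible to the caller
    items.append(42)

x = 5
y = increment(x)      # y == 6, but x is still 5

data = [1, 2]
append_item(data)     # data is now [1, 2, 42]
```

A student who believes arguments are simply "passed by reference" would predict that `x` becomes 6; discussing why `x` stays 5 while `data` changes is exactly the kind of reasoning exercise that surfaces the misconception.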
- First mentioned in Köppe, C., Portier, M., Bakker, R., & Hoppenbrouwers, S. (in press 2015). Lecture Design Patterns: More Interactivity Improvement Patterns. In Proceedings of the 22nd Conference on Pattern Languages of Programs (PLoP 2015). New York: ACM.
- Bergin, J., Kohls, C., Köppe, C., Mor, Y., Portier, M., Schümmer, T., & Warburton, S. (2015). Assessment-driven course design foundational patterns. In Proceedings of the 20th European Conference on Pattern Languages of Programs (EuroPLoP 2015) (p. 31). New York: ACM.
- Patlet also published in Köppe, C., Niels, R., Bakker, R., & Hoppenbrouwers, S. (2016). Flipped Classroom Patterns: Controlling the Pace. In Proceedings of the 10th Travelling Conference on Pattern Languages of Programs (VikingPLoP 2016). New York: ACM.
- Bergin, J., Kohls, C., Köppe, C., Mor, Y., Portier, M., Schümmer, T., & Warburton, S. (2016). Student's choice of assessment. In Proceedings of the 21st European Conference on Pattern Languages of Programs (EuroPLoP 2016) (p. 22). New York: ACM.
- Larson, K. A., Trees, F. P., & Weaver, D. S. (2008). Continuous feedback pedagogical patterns. In Proceedings of the 15th Conference on Pattern Languages of Programs (PLoP 2008) (p. 12). New York: ACM.