Difference between revisions of "Personalized Problems"

From Open Pattern Repository for Online Learning Systems
m (Pinventado moved page Match problems to skill levels to Personalized problems without leaving a redirect: Changed name of pattern)
(Update pattern format and content according to feedback from PLoP 2015 shepherding)
{{Infobox_designpattern
|image=Match_problems_to_skill_levels.png
|author=[[User:Pinventado|Paul Inventado]], Peter Scupelli
|contributor=
|datasource= [[Data:ASSISTments2012-2013_problem-student_level|ASSISTments data]]
|dataanalysis=
|domain= General
|stakeholders= Teachers, Students, System developers
|evaluation = PLoP 2015 writing workshop, [[Talk:ASSISTments]]
|application = [[ASSISTments]]
|appliedevaluation = [[ASSISTments]]
}}


If students become bored or disengage from an exercise when its problems are either too easy or too difficult for them, then assign to students problems that they have the ability to solve.


==Context==
Students are asked to answer an exercise in an online learning system. Teachers design and encode the problems in the system, together with the corresponding answers and feedback, and decide on the difficulty of the problems according to a general assessment of student ability in their class.


==Problem==
Students become bored or disengage from an exercise when the problems are either too easy or too difficult for them to solve.
 
==Forces==
#'''Prior knowledge.''' Students may find it impossible to solve a problem when they have not acquired the necessary skills to solve it<ref name="Sweller2004">Sweller, J. (2004). [http://link.springer.com/article/10.1023%2FB%3ATRUC.0000021808.72598.4d Instructional design consequences of an analogy between evolution by natural selection and human cognitive architecture]. Instructional science, 32(1-2), 9-31.</ref>.
#'''Expertise reversal.''' Presenting students information they already know can impose extraneous cognitive load and interfere with additional learning<ref name="Sweller2004"></ref>.
#'''Risk taking.''' Students who are risk takers prefer challenging tasks, because they find satisfaction in maximizing their learning. However, students who are not risk takers often experience anxiety when they feel the difficulty of a learning task has exceeded their skill <ref>Meyer, D. K., and Turner, J. C. (2002). [http://www.tandfonline.com/doi/abs/10.1207/S15326985EP3702_5#.VZ1KU5NVhBc Discovering emotion in classroom motivation research]. Educational psychologist, 37(2), 107-114.</ref>.
#'''Learning rate.''' Students learn at varying rates, which could be affected by their prior knowledge, learning experience, and the quality of instruction they receive<ref name="Bloom1974">Bloom, B. S. (1974). [http://psycnet.apa.org/journals/amp/29/9/682/ Time and learning]. American psychologist, 29(9), 682.</ref>.
#'''Limited resources.''' Student attention and patience are limited resources, possibly affected by pending deadlines, upcoming tests, achievement in previous learning experiences, personal interest, quality of instruction, and other factors<ref>Arnold, A., Scheines, R., Beck, J. E., and Jerome, B. (2005). [https://oli.cmu.edu/wp-oli/wp-content/uploads/2012/05/Arnold_2005_Time_and_Attention.pdf Time and attention: Students, sessions, and tasks]. In Proceedings of the AAAI 2005 Workshop Educational Data Mining (pp. 62-66).</ref><ref name="Bloom1974"/>.


==Solution==
Therefore, assign to students problems that they have the ability to solve.<br/>A student’s capability to solve a problem can be identified using assessments of their knowledge of prerequisite skills, or with model-based predictors<ref>Yudelson, M. V., Koedinger, K. R., and Gordon, G. J. (2013). [http://link.springer.com/chapter/10.1007/978-3-642-39112-5_18 Individualized Bayesian knowledge tracing models]. In Artificial Intelligence in Education (pp. 171-180). Springer Berlin Heidelberg.</ref>.
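The model-based route can be sketched with a minimal Bayesian knowledge tracing (BKT) update. This is an illustrative sketch, not ASSISTments’ implementation; the parameter values (<code>p_slip</code>, <code>p_guess</code>, <code>p_learn</code>, and the 0.3 prior) are assumed for demonstration only:

```python
# Minimal Bayesian knowledge tracing update (illustrative parameters).
# p_known: current estimate that the student knows the skill.
# p_slip:  chance of a wrong answer despite knowing the skill.
# p_guess: chance of a correct answer without knowing the skill.
# p_learn: chance of acquiring the skill between opportunities.

def bkt_update(p_known, correct, p_slip=0.1, p_guess=0.2, p_learn=0.15):
    """Return the updated probability that the student knows the skill."""
    if correct:
        evidence = p_known * (1 - p_slip)
        posterior = evidence / (evidence + (1 - p_known) * p_guess)
    else:
        evidence = p_known * p_slip
        posterior = evidence / (evidence + (1 - p_known) * (1 - p_guess))
    # Account for learning between practice opportunities.
    return posterior + (1 - posterior) * p_learn

# Track one skill across a short answer sequence (True = correct).
p = 0.3  # assumed prior that the skill is already known
for correct in [True, True, False, True]:
    p = bkt_update(p, correct)
# A system could assign easier problems while p is low and advance the
# student once p passes a mastery threshold (e.g., 0.95).
```

The estimate rises after correct answers and falls after mistakes, which is what lets a system route students toward problems they have the ability to solve.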
 


==Consequences==

===Benefits===
#Students have enough prior knowledge to solve a problem.
#Students do not need to “slow down” to adjust to the difficulty of the exercise.
#Risk takers will get challenging problems, while non-risk takers will not be overwhelmed by overly difficult problems.
#Students will solve problems appropriate for their skill level.
#Exercises that are neither too easy nor too challenging can motivate students to spend more time performing them.
 


===Liabilities===
#Content writers will need to provide content for students with different levels of ability.
#If students’ skill level is incorrectly identified, the system can still give students problems that are too easy or too difficult.
 


==Evidence==

===Literature===
Research in different learning domains showed that personalizing content to students’ skill level produced learning gains similar to non-personalized content, but in a shorter amount of time (e.g., simulated air traffic control<ref>Salden, R. J., Paas, F., Broers, N. J., and Van Merriënboer, J. J. (2004). [http://link.springer.com/article/10.1023/B:TRUC.0000021814.03996.ff Mental effort and performance as determinants for the dynamic selection of learning tasks in air traffic control training]. Instructional Science, 32(1-2), 153-172.</ref>, algebra<ref>Cen, H., Koedinger, K. R., and Junker, B. (2007). [http://dl.acm.org/citation.cfm?id=1563681 Is Over Practice Necessary? Improving Learning Efficiency with the Cognitive Tutor through Educational Data Mining]. Frontiers in Artificial Intelligence and Applications, 158, 511.</ref>, geometry<ref>Salden, R.J.C.M., Aleven, V., Schwonke, R., and Renkl, A. (2010). [http://link.springer.com/article/10.1007/s11251-009-9107-8 The expertise reversal effect and worked examples in tutored problem solving]. Instructional Science, 38, 289-307.</ref>, and health sciences<ref>Corbalan, G., Kester, L., and van Merriënboer, J.J.G. (2008). [http://www.sciencedirect.com/science/article/pii/S0361476X08000118 Selecting learning tasks: Effects of adaptation and shared control on learning efficiency and task involvement]. Contemporary Educational Psychology, 33(4), 733-756.</ref>).
 
===Discussion===
In a meeting of ASSISTments stakeholders, Ryan Baker and his team at Teachers College, Columbia University, Neil Heffernan and his team at Worcester Polytechnic Institute, and Peter Scupelli and his team at the School of Design at Carnegie Mellon University agreed that engagement issues arise from the lack of personalization, and that personalizing content to students' skill levels may help address them.


===Data===
According to an [[Analysis:Student_affect_and_interaction_behavior_in_ASSISTments#externalresources | analysis of ASSISTments’ data]], boredom and gaming behavior correlated with problem difficulty (as evidenced by answer correctness and the number of hint requests).
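The kind of correlation check referenced above can be illustrated with a small Pearson correlation sketch. All numbers below are made-up placeholders, not ASSISTments values; only the method (correlating a per-problem difficulty proxy with an affect rate) mirrors the analysis:

```python
# Pearson correlation between a per-problem difficulty proxy (error rate)
# and the fraction of observations labeled as boredom/gaming.
# All data values below are illustrative placeholders, not real data.

def pearson(xs, ys):
    """Sample Pearson correlation coefficient of two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Hypothetical per-problem aggregates
error_rate = [0.15, 0.30, 0.45, 0.60, 0.85]   # proxy for difficulty
bored_rate = [0.20, 0.10, 0.08, 0.18, 0.35]   # fraction of bored/gaming observations

r = pearson(error_rate, bored_rate)
```

A value of <code>r</code> near +1 or −1 would indicate a strong linear relationship between difficulty and disengagement; the actual effect sizes come from the linked analysis, not this sketch.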
 
<!--===Applied evaluation===
Results from randomized controlled trials (RCTs) or similar tests that measures the pattern's effectiveness in an actual application. For example, compare student learning gains in an online learning system with and without applying the pattern. -->


==Related patterns==
This pattern applies the concept of '''Different exercise levels'''<ref>Bergin, J., Eckstein, J., Völter, M., Sipos, M., Wallingford, E., Marquardt, K., Chandler, J., Sharp, H., and Manns, M. L. (2012). [http://www.pedagogicalpatterns.org/right.html Pedagogical patterns: advice for educators]. Joseph Bergin Software Tools.</ref> in online learning systems, and '''Content personalization'''<ref>Danculovic, J., Rossi, G., Schwabe, D., and Miaton, L. (2001). [http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.89.5975&rep=rep1&type=pdf Patterns for Personalized Web Applications]. In Proceedings of the 6th European Conference on Pattern Languages of Programs, Universitaetsverlag Konstanz, Germany, 423-436.</ref> in exercise problem selection. It can be used with '''[[Just enough practice]]''' to help students master skills and select the next set of problems according to their learning progress. '''[[Worked examples]]''' can be used when students lack the skills needed to solve the problem.
 
==Example==
A teacher would encode into an online learning system a math exercise containing problems of varying difficulty. As students answer questions in their homework, the system would track their progress to classify their skill level as low (i.e., the student makes mistakes ≥ 60% of the time), medium (i.e., mistakes < 60% and ≥ 40% of the time), or high (i.e., mistakes < 40% of the time). Based on this classification, the system would serve the corresponding question type, making it more likely that students receive questions that fit their skill level.
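The bucketing above can be sketched as a small routing function. The function names and the problem-pool layout are hypothetical; only the mistake-rate cutoffs come from the example:

```python
# Classify a student's performance from their mistake rate and route
# them to a matching problem pool. Thresholds follow the example:
# >= 60% mistakes -> low, 40-60% -> medium, < 40% -> high.

def skill_level(mistakes, attempts):
    """Return "low", "medium", or "high" from the mistake rate."""
    rate = mistakes / attempts
    if rate >= 0.6:
        return "low"
    if rate >= 0.4:
        return "medium"
    return "high"

def next_problem(answer_history, problem_pools):
    """Pick the next problem from the pool matching the student's level.

    answer_history: list of booleans (True = correct answer).
    problem_pools: dict mapping "low"/"medium"/"high" to problem lists.
    """
    mistakes = sum(1 for correct in answer_history if not correct)
    level = skill_level(mistakes, len(answer_history))
    return problem_pools[level].pop(0)
```

For a student with four mistakes in five attempts (an 80% mistake rate), <code>next_problem</code> would draw from the "low" pool.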


==References==
<references/>


<includeonly>[[Category:Design_patterns]]</includeonly> [[Category:ASSISTments]]

Revision as of 09:30, 8 July 2015