Analysis:Student affect and interaction behavior in ASSISTments

From Open Pattern Repository for Online Learning Systems
Author(s): Paul Inventado, Peter Scupelli
Data set(s) used: ASSISTments problem-student-level data, ASSISTments problem-level data
Usage in design patterns: All Content In One Place, Consistent Language, Just Enough Practice, Personalized Problems, Worked Examples

Multiple studies were conducted on data collected from the ASSISTments system between September 2012 and September 2013. The goal of the analyses was to uncover relationships between content, student behavior, and student affect.

Student learning strategies

Online learning systems that host student exercises and activities sometimes provide feedback in the form of hints or additional explanations. Feedback can help students figure out the answer, but it can also be used to game the system, which hurts learning[1]. Gaming is taking advantage of system feedback to obtain the answer without actually learning, such as by continuously requesting hints or trying out all possible answers.

Such behavior was seen in student data collected from ASSISTments (i.e., quickly requesting hints to get the answer). However, further analysis showed that some students seemed to game the system to get the answer for one problem but no longer displayed such behavior on a similar problem. It is possible that some students displayed all the hints not just to get the answer, but to learn how to solve the problem.

Problem content and student affect

Relationships between content delivery design and student affect were observed. These results could be used to identify design decisions that are preferable and designs that should be avoided.

External resources

An analysis of ASSISTments' problem-level data showed a correlation between the has_text_color feature and both boredom and gaming. Upon further investigation, 99% of math problems with colored text were drill-and-practice problems taken from textbooks. More than 70% of these problems did not contain the actual problem but only referred students to it in the textbook (e.g., “Page 599 #63 Part B: Your answer should be in seconds.”). This means that to answer these problems, students had to switch between looking up the problem in their book and using the ASSISTments interface.
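The correlation step described above could be sketched as follows. The field names (has_text_color, boredom) and the toy records are illustrative assumptions, not the actual ASSISTments schema or data:

```python
from math import sqrt

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Toy problem-level records: a binary content feature and an affect estimate.
problems = [
    {"has_text_color": 1, "boredom": 0.82},
    {"has_text_color": 1, "boredom": 0.74},
    {"has_text_color": 0, "boredom": 0.31},
    {"has_text_color": 0, "boredom": 0.25},
    {"has_text_color": 1, "boredom": 0.69},
    {"has_text_color": 0, "boredom": 0.40},
]

r = pearson([p["has_text_color"] for p in problems],
            [p["boredom"] for p in problems])
print(round(r, 2))  # strongly positive on this toy data
```

With a binary feature such as has_text_color, this is the point-biserial correlation; a real analysis would also test significance rather than report the coefficient alone.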

According to Sweller (2004[2]), unnecessary processing of information imposes a cognitive load that interferes with learning. High cognitive load impairs performance, which could increase the difficulty of a learning task. When students experience too much difficulty or get stuck while trying to solve a problem, they are likely to disengage from the activity[3]. The resulting disengagement can explain the boredom and gaming behavior seen in the data.

Hint usage

An analysis of ASSISTments' problem-student-level data showed that students rapidly requested hints the first time they encountered a problem type. The next time a student encountered the same problem type, he or she was not involved in gaming behavior. Students could have used the hints as a worked example to find the solution[2]. After learning from the hints, students no longer needed to engage in this gaming-like behavior.
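A rough sketch of this encounter-ordered analysis is shown below. The log fields (action, gap) and the 5-second "rapid" threshold are assumptions for illustration, not values from the study:

```python
RAPID_GAP = 5.0  # seconds between hint requests treated as "rapid" (assumed)

def rapid_hint_run(actions):
    """True if the attempt contains at least two hint requests, all made rapidly."""
    hints = [a for a in actions if a["action"] == "hint"]
    return len(hints) >= 2 and all(a["gap"] < RAPID_GAP for a in hints)

# Toy log: one student, same problem type, two successive encounters.
encounters = [
    [{"action": "hint", "gap": 1.2}, {"action": "hint", "gap": 0.8},
     {"action": "answer", "gap": 2.0}],   # first encounter: rapid hint requests
    [{"action": "answer", "gap": 40.0}],  # second encounter: no hints needed
]

flags = [rapid_hint_run(e) for e in encounters]
print(flags)  # [True, False]
```

The pattern in the text corresponds to the first encounter being flagged and later encounters not, which is what distinguishes worked-example use of hints from persistent gaming.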


Language consistency

We use the term language to mean the style, standards, or formats used when writing problem elements, such as term usage, text formatting, color usage, notations, and visual representations. A consistent language refers to a problem whose elements are styled and formatted the same way throughout. According to ASSISTments' problem-level data, frustration correlated with problems that used language inconsistently. For example, a math problem dealing with angles used the degree notation inconsistently: “Subtract the given angle from 180°. 180 - 47 = 133.” It is possible that such inconsistencies require students to spend extra effort resolving them. When different elements are used to refer to the same thing, students must also figure out the mappings between those elements. This can result in a split-attention effect, which increases cognitive load[2]. Cognitive load added on top of the mental effort required to solve the problem can make the task too demanding for a student. This could lead to confusion, frustration, and disengagement from the activity[3].

Over-practice and student affect

Frustration is an emotion often experienced by students while learning. It is unclear whether it hinders learning or pushes students to exert more effort to learn (cf. Artino & Jones 2012[4]; Patrick et al. 1993[5]; Pekrun et al. 2002[6]). The role of frustration in learning with ASSISTments was investigated using both problem-student-level data and problem-level data that described student behavior while answering a series of questions in the system.

First, problems that likely led to student frustration were identified by ranking them on their average frustration confidence values. Manual investigation of the most frustrating problems revealed that they were usually difficult to understand and that their hints were confusing. However, the data also showed that students were frustrated even when they answered questions correctly.
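The ranking step could be sketched as follows, assuming per-observation records with a problem id and a frustration confidence in [0, 1] (the field names and values are illustrative, not the actual data):

```python
from collections import defaultdict

def rank_by_frustration(observations):
    """Return problem ids sorted by mean frustration confidence, highest first."""
    totals = defaultdict(lambda: [0.0, 0])  # problem_id -> [sum, count]
    for obs in observations:
        t = totals[obs["problem_id"]]
        t[0] += obs["frustration"]
        t[1] += 1
    means = {pid: s / n for pid, (s, n) in totals.items()}
    return sorted(means, key=means.get, reverse=True)

obs = [
    {"problem_id": "p1", "frustration": 0.9},
    {"problem_id": "p1", "frustration": 0.7},
    {"problem_id": "p2", "frustration": 0.2},
    {"problem_id": "p3", "frustration": 0.5},
]
ranking = rank_by_frustration(obs)
print(ranking)  # ['p1', 'p3', 'p2']
```

The top of such a ranking is what the text describes as the set of problems selected for manual investigation.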

The problem-level instances were investigated further by referring to the student-level data, which described students' low-level actions while answering the problem (e.g., the order of hint requests and answer attempts, answer correctness, and the time between actions). Students who were frustrated even when they answered correctly were observed to have spent a significant amount of time on the exercise while consistently getting the answers correct. This could mean that these students were frustrated by the type and sequence of problems they were asked to solve rather than by the problems themselves.
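The cross-reference between the two data levels might look something like the filter below. The field names and the 0.7 frustration / 600-second thresholds are assumptions for illustration only:

```python
def frustrated_but_correct(sessions, min_frustration=0.7, min_duration=600.0):
    """Select students with high frustration despite all answers being correct."""
    return [s["student_id"] for s in sessions
            if s["frustration"] > min_frustration
            and all(s["answers_correct"])
            and s["duration_sec"] > min_duration]

# Toy student-level sessions combining affect estimates and action summaries.
sessions = [
    {"student_id": "s1", "frustration": 0.85,
     "answers_correct": [True, True, True], "duration_sec": 1400},
    {"student_id": "s2", "frustration": 0.90,
     "answers_correct": [True, False], "duration_sec": 300},
    {"student_id": "s3", "frustration": 0.20,
     "answers_correct": [True, True], "duration_sec": 200},
]

selected = frustrated_but_correct(sessions)
print(selected)  # ['s1']
```

Sessions like s1, which are long and entirely correct yet highly frustrated, are the cases the text attributes to over-practice rather than problem difficulty.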

Skill level

An analysis of student behavior data showed that students reacted differently to the same problem. Students who were skilled in a topic rarely used hints and could answer the problem quickly. Students who were less skilled in the topic often requested hints, made multiple attempts to solve the problem, and spent more time before answering it correctly. Students skilled in one topic struggled with other topics that they might not have studied as much. Without prior knowledge, students will have difficulty answering a problem[2].

The data showed that students were often bored or disengaged from a problem when they did not have enough skill to solve it (as evidenced by answer correctness and the number of hint requests). When students experience too much difficulty or get stuck while trying to solve a problem, they are likely to disengage from the activity[3].
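A minimal sketch of inferring skill level from these behavioral signals is given below; the thresholds and field names are illustrative assumptions, not values from the study:

```python
def skill_level(attempt):
    """Classify an attempt from simple behavioral signals (assumed cutoffs)."""
    if attempt["hints"] == 0 and attempt["correct_first_try"]:
        return "skilled"
    if attempt["hints"] >= 2 or attempt["attempts"] > 3:
        return "struggling"
    return "intermediate"

levels = [skill_level(a) for a in [
    {"hints": 0, "correct_first_try": True, "attempts": 1},
    {"hints": 3, "correct_first_try": False, "attempts": 5},
    {"hints": 1, "correct_first_try": False, "attempts": 2},
]]
print(levels)  # ['skilled', 'struggling', 'intermediate']
```

A production system such as ASSISTments would use a proper knowledge-tracing model per skill rather than fixed cutoffs; the sketch only mirrors the signals named in the text.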


References

  1. Baker, R., Walonoski, J., Heffernan, N., Roll, I., Corbett, A., & Koedinger, K. (2008). Why students engage in “gaming the system” behavior in interactive learning environments. Journal of Interactive Learning Research, 19(2), 185-224.
  2. Sweller, J. (2004). Instructional design consequences of an analogy between evolution by natural selection and human cognitive architecture. Instructional Science, 32(1-2), 9-31.
  3. D’Mello, S., & Graesser, A. (2012). Dynamics of affective states during complex learning. Learning and Instruction, 22(2), 145-157.
  4. Artino, A. R., & Jones, K. D. (2012). Exploring the complex relations between achievement emotions and self-regulated learning behaviors in online learning. The Internet and Higher Education, 15(3), 170-175.
  5. Patrick, B. C., Skinner, E. A., & Connell, J. P. (1993). What motivates children's behavior and emotion? Joint effects of perceived control and autonomy in the academic domain. Journal of Personality and Social Psychology, 65(4), 781-791.
  6. Pekrun, R., Goetz, T., Titz, W., & Perry, R. P. (2002). Academic emotions in students' self-regulated learning and achievement: A program of qualitative and quantitative research. Educational Psychologist, 37(2), 91-105.