
In-Depth Analysis of How to Model Student Affective States in TREs: Case Study with MetaTutor, a TRE for Conceptual Understanding


Co-investigators:

  • Cristina Conati (U. of British Columbia, Theme 2)
  • Roger Azevedo (North Carolina State U., Theme 1)

Research Assistants:

  • Nico Petrakis (U. of British Columbia)
  • Nicholas Mudrick (North Carolina State U.)

Description:

Dr. Conati will collaborate with Dr. Azevedo on investigating how to model students’ affective reactions, and their relationship to learning, while students interact with technology-rich learning environments (TREs) designed to scaffold effective meta-cognition. The investigation will leverage and compare a variety of information sources, including gaze data, action logs, physiological signals (e.g., electrodermal activity) and facial expressions of affective states (e.g., confusion, frustration). This proposal extends the work that Conati and Azevedo have been conducting over the last two years, focusing on exploring the value of eye-tracking information for assessing student affect and learning with MetaTutor, a TRE for conceptual understanding (Bondareva et al., 2013; Jaques et al., 2014). This work was extremely innovative because, to date, gaze data has been leveraged to assess user states and traits such as expertise and problem-solving strategies, as well as student motivation and related measures of engagement/disengagement (e.g., mind wandering; Feng et al., 2013), but not to detect more detailed emotional states such as boredom and curiosity. In this proposal, we aim to deepen our understanding of the value of eye-tracking data for modeling student affect and learning by (i) working with a larger MetaTutor dataset, (ii) considering a more extensive set of emotions, (iii) investigating more sophisticated Machine Learning (ML) and Data Mining (DM) techniques to build our affective student models, and (iv) comparing and combining gaze data with additional sources of information, including action logs, facial expressions and physiological data.
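
To illustrate the general shape of the analysis described above, the sketch below shows how per-window features from several data sources might be combined and compared with a standard classifier. It is a minimal, hypothetical example: the feature names, window definition, affect labels, and randomly generated placeholder data are assumptions made for illustration only, and the classifier shown is a generic scikit-learn model, not the specific ML/DM techniques the project will investigate.

```python
# Illustrative sketch only: a minimal multimodal affect-classification pipeline.
# All features, labels, and data below are hypothetical placeholders, not the
# project's actual MetaTutor data or modeling approach.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_windows = 500  # hypothetical number of interaction windows (e.g., 10-second slices)

# Hypothetical per-window features from each information source.
gaze = rng.normal(size=(n_windows, 4))     # e.g., fixation rate, mean fixation duration, saccade length, % gaze on relevant areas
actions = rng.normal(size=(n_windows, 3))  # e.g., page visits, note-taking events, quiz attempts
physio = rng.normal(size=(n_windows, 2))   # e.g., mean electrodermal activity, EDA peak count
face = rng.normal(size=(n_windows, 2))     # e.g., facial-expression scores for confusion and frustration

# Hypothetical affect labels per window (0 = boredom, 1 = curiosity).
labels = rng.integers(0, 2, size=n_windows)

def evaluate(features, name):
    """Report 10-fold cross-validated accuracy for one feature set."""
    clf = RandomForestClassifier(n_estimators=100, random_state=0)
    scores = cross_val_score(clf, features, labels, cv=10)
    print(f"{name}: {scores.mean():.2f} mean accuracy")

# Compare a single modality (gaze only) against the combined feature set,
# mirroring the plan to compare and combine the different information sources.
evaluate(gaze, "gaze only")
evaluate(np.hstack([gaze, actions, physio, face]), "all modalities combined")
```

On real data, comparing the single-modality and combined runs in this way gives a simple baseline for asking whether action logs, physiological signals and facial expressions add predictive value beyond gaze alone.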