Affective Tutoring Systems (ATS)
It is believed that Intelligent Tutoring Systems (ITSs) would be significantly enhanced if computers could adapt in response to the emotions of students (Picard, 1997; Kort, Reilly and Picard, 2001; Alexander and Sarrafzadeh, 2004). This is the idea behind the developing field of Affective Tutoring Systems (ATSs): ATSs are ITSs that adapt to the affective state of students in the same way that effective human tutors do (Sarrafzadeh, Gholam Hosseini, Fan and Overmyer, 2003; Sarrafzadeh, Fan, Dadgostar, Alexander and Messom, 2004; de Vicente, 2003). The term “Affective Tutoring System” appears to have been coined only a few years ago (Alexander, Sarrafzadeh and Fan, 2003; de Vicente, 2003), although the broader concept of an ITS adapting to perceived emotion can be traced back at least to Rosalind Picard’s book Affective Computing (1997). To date, the only functional ATSs capable of both detecting and adapting to emotion are Eve, developed at Massey University in New Zealand in the domain of elementary mathematics tutoring (Sarrafzadeh et al., 2008), and Edu-Affe-Mikey, developed at the University of Piraeus in Greece in the domain of medical tutoring (Alepis et al., 2008). Several other groups are working towards this goal (Kort, Reilly and Picard, 2001; Litman and Forbes-Riley, 2006; D'Mello, Craig, Gholson, Franklin, Picard and Graesser, 2005).
During its brief history, the field of ATSs has faced two main barriers: 1) reliably detecting the affective state of students; and 2) knowing how best to adapt once a student’s emotions have been detected. The first of these issues has generated by far the most attention, with growing numbers of researchers investigating various forms of facial expression and gesture analysis, voice analysis, wearable computers, and predictive emotion models. In contrast, the second issue has been comparatively neglected. However, as the technical obstacles to detecting emotions are gradually overcome, the question of how to adapt to student emotion is becoming increasingly important.
Research Related to Affective Tutoring Systems
Easy with Eve [http://ngits.massey.ac.nz] (Sarrafzadeh et al., 2008) is an Affective Tutoring System (ATS) in the domain of primary school mathematics. The system adapts to students via a lifelike animated agent called Eve, who detects student emotion through facial expression analysis and can display emotion herself. Eve’s tutoring adaptations are guided by a case-based method for adapting to student states, which uses data generated by an observational study of human tutors. Easy with Eve was tested in two primary schools in New Zealand; the results showed not only that affective tutoring systems are achievable in practice, but also that they can outperform conventional intelligent tutoring systems.
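By way of illustration, a single case-based adaptation step of the general kind described above could be sketched as follows; this is a minimal, hypothetical example, and the case data, state labels and tutor actions are invented rather than taken from Eve’s implementation.

```python
# Illustrative sketch of a case-based tutoring adaptation step (not Eve's
# actual code); the case base, state names and actions are hypothetical.
from dataclasses import dataclass

@dataclass
class Case:
    student_state: str        # e.g. "frustrated", "confused", "engaged"
    last_answer_correct: bool
    tutor_action: str         # action a human tutor was observed to take

# A tiny "case base" standing in for data gathered by observing human tutors.
CASE_BASE = [
    Case("frustrated", False, "give_hint_and_encourage"),
    Case("confused",   False, "re_explain_with_example"),
    Case("engaged",    True,  "present_harder_problem"),
]

def similarity(case: Case, state: str, correct: bool) -> int:
    """Very simple similarity measure: count matching features."""
    return int(case.student_state == state) + int(case.last_answer_correct == correct)

def select_action(detected_state: str, last_answer_correct: bool) -> str:
    """Retrieve the most similar observed case and reuse its tutor action."""
    best = max(CASE_BASE, key=lambda c: similarity(c, detected_state, last_answer_correct))
    return best.tutor_action

print(select_action("confused", False))   # -> "re_explain_with_example"
```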
de Vicente (2003) has developed a method of diagnosing the level of a student’s motivation that incorporates both self-report and motivation diagnosis rules. The motivation diagnosis rules were generated by having participants study a recorded interaction between students and an ITS; the participants were instructed to make inferences about the student’s motivational state, and to give reasons for their inference. These reasons were then molded into the set of motivation diagnosis rules, which were further refined by subsequent validation testing with expert teachers. de Vicente’s motivation diagnosis methods were applied in MOODS, a simulated ITS environment, with encouraging results that showed a reasonable level of accuracy in determining the motivational state of students.
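The flavour of such motivation diagnosis rules can be conveyed with a small hypothetical sketch; the features, thresholds and labels below are invented for illustration and are not de Vicente’s actual MOODS rules.

```python
# Hypothetical illustration of rule-based motivation diagnosis; the rules,
# feature names and thresholds are invented for this sketch and do not
# reproduce de Vicente's MOODS rule set.

def diagnose_motivation(time_on_task_s: float, tasks_skipped: int,
                        self_reported_interest: int) -> str:
    """Return a coarse motivation label from simple interaction features.

    self_reported_interest is assumed to be a 1-5 self-report value.
    """
    if tasks_skipped >= 3 and time_on_task_s < 30:
        return "low"       # rushing through and skipping work
    if self_reported_interest >= 4 and time_on_task_s >= 60:
        return "high"      # reports interest and persists on tasks
    return "medium"

print(diagnose_motivation(time_on_task_s=20, tasks_skipped=4, self_reported_interest=2))
```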
Building on previous work by del Soldato (1994), MOODS includes a set of motivational planning rules and a set of affective dialogue rules that form the core of the system’s adaptations to students. However, the focus of de Vicente’s work was diagnosing motivation, not facilitating it, so the effectiveness of these rules was not tested. In addition, the rules are concerned specifically with motivation – which is only a subset of affect – so they do not explicitly consider affective states such as confusion or frustration.
The Tutoring Research Group (TRG) at the University of Memphis is adding an emotional component to its ITS, AutoTutor. AutoTutor is a natural language-based tutor that has been successfully tested on about 1000 computer literacy and physics students, with significant learning gains in deeper-level explanations as well as surface knowledge (Graesser, Lu, Jackson, Mitchell, Ventura, Olney, & Louwerse, 2004). AutoTutor has also performed well in a bystander Turing test, in which participants were unable to distinguish between AutoTutor’s responses and those of a real human tutor (D’Mello, Craig, Gholson, Franklin, Picard, and Graesser, 2005). An “emote-aloud” study of students using the system found that the most significant affective states for AutoTutor students were frustration, confusion and boredom, and further investigation determined that expert judges were considerably more accurate than untrained peers in identifying learner emotions. The group is currently working towards an “emotion classifier” that can reliably detect a student’s affective state (D’Mello, Picard, & Graesser, 2007); the plan is to gather this information using real-time facial expression and posture analysis, as well as conversational cues (as AutoTutor uses natural language). Their method will incorporate both standard and biologically motivated classifiers to optimize the accuracy of the results.
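In general terms, an affect classifier of this kind combines features from several channels (face, posture, dialogue) and learns to map them to affective states such as frustration, confusion and boredom. The sketch below is a hypothetical illustration using a standard decision tree; the features, training data and labels are invented and do not represent the AutoTutor classifier.

```python
# Hypothetical sketch of a multimodal affect classifier (not the AutoTutor
# classifier): features from face, posture and dialogue are concatenated and
# fed to a standard classifier. All data here is invented for illustration.
from sklearn.tree import DecisionTreeClassifier

# Each row: [brow_lowering, posture_pressure_change, dialogue_hedging_rate]
X_train = [
    [0.9, 0.7, 0.6],   # example labelled "frustration"
    [0.6, 0.2, 0.8],   # example labelled "confusion"
    [0.1, 0.0, 0.1],   # example labelled "boredom"
    [0.2, 0.1, 0.2],   # example labelled "boredom"
]
y_train = ["frustration", "confusion", "boredom", "boredom"]

clf = DecisionTreeClassifier().fit(X_train, y_train)
print(clf.predict([[0.7, 0.6, 0.5]]))   # e.g. ['frustration']
```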
Conati (2002) has developed a probabilistic model of student affect that has been applied in Prime Climb, an educational game for 11-year-old maths students developed by the Electronic Games for Education in Math and Science (EGEMS) group at the University of British Columbia. The model relies on a Dynamic Decision Network (DDN) to identify student affect, featuring two types of assessment: diagnostic and predictive. The diagnostic assessment focuses on the effects of student emotion – the outward signs of feelings; when fully complete, the system will use tools such as facial expression analysis and biometric sensors to feed real-time information into the DDN. Conati argues that a diagnostic assessment alone will not always be enough, so the second half of the DDN is based on a predictive assessment of student emotion. The predictive assessment is based on the Ortony, Clore and Collins (OCC) model (Ortony, Clore, & Collins, 1988), in which emotions are considered to result from an appraisal of how the current situation fits one’s goals and preferences. The DDN therefore assesses the affective state of the student based on how the student is likely to feel, given his or her particular goals and preferences, in the current situation within Prime Climb. The appraisal model of how particular goals, preferences and situations interrelate is based on relevant work in psychology, several Wizard of Oz studies, and common sense; an individual student’s preferences and goals can be assessed in real time as the student interacts with the system.
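The appraisal idea behind the predictive assessment can be illustrated with a much-simplified sketch: an event is appraised against the student’s goals, and an emotion is predicted from whether those goals are satisfied or thwarted. The goals, weights and event below are hypothetical, and the function stands in for (but does not reproduce) the probabilistic reasoning of the DDN.

```python
# Simplified, hypothetical illustration of OCC-style appraisal (not Conati's
# Dynamic Decision Network): an event is appraised against the student's
# goals, and an emotion is predicted from whether the goals are satisfied.

def appraise(goals: dict, event: dict) -> str:
    """goals: goal name -> importance (0..1); event: goal name -> achieved (bool)."""
    satisfied = sum(w for g, w in goals.items() if event.get(g, False))
    thwarted  = sum(w for g, w in goals.items() if event.get(g, True) is False)
    if satisfied > thwarted:
        return "joy"        # OCC: event is desirable with respect to one's goals
    if thwarted > satisfied:
        return "distress"   # OCC: event is undesirable with respect to one's goals
    return "neutral"

student_goals = {"advance_up_mountain": 0.8, "avoid_falling": 0.6}
event = {"advance_up_mountain": False, "avoid_falling": False}   # the student fell
print(appraise(student_goals, event))   # -> "distress"
```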
A study evaluating the accuracy of the predictive assessment and the real-time goal assessment produced encouraging, if not perfect, results (Conati & Maclaren, 2004), and it found that the main weakness of the predictive assessment is one that the diagnostic assessment should be able to compensate for. The predictive assessment has since been further refined (Conati & Maclaren, 2005), and future work is to integrate the DDN with accurate diagnostic sensors (such as biometrics), and then to integrate the whole probabilistic model with the student learning model that is being developed in parallel (Conati & Maclaren, 2004). Integration between the probabilistic model and eye tracking (as a measure of user attention patterns) is also being explored (Conati, Merten, Amershi, & Muldner, 2007; Conati & Merten, 2007).
Burleson (2006) at MIT has developed an Affective Learning Companion that can mirror several student non-verbal behaviours believed to influence persuasion and liking. The agent is attached to a game based on the Towers of Hanoi problem, and mirrors students in real time based on input from several non-verbal communication sensors: a pressure mouse, a wireless skin-conductance wristband, a posture-analysis seat and a camera that detects upper facial features (but not lower facial features, which convey expressions such as smiling and tension). To test the effects of the mirroring, the real-time mirroring agent was compared with an agent that used prerecorded behaviours instead of real-time adaptation; the prerecorded behaviours were chosen by identifying the most common inputs from the non-verbal sensors in a previous pilot test and using those as the basis for an “average” response. However, testing with 11- to 13-year-old students found no significant difference in the strength of the social bond created by the mirroring and prerecorded agents; this may have been because the prerecorded version was, after all, based on commonly occurring student affective states in the pilot study. The agent was also capable of two types of feedback, an affective response and a task-based response; the girls in the study were significantly more likely than the boys to favour the affective response.
The Affective Learning Companion can also detect frustration/help-seeking behaviour from students using input from the non-verbal sensors with an accuracy of 79%, although that result is based on offline computation. Future work will be needed to refine the system so that it can accurately sense these frustration/help-seeking behaviours in real time, allowing the agent to remain continuously aware of the student’s affective state as he or she completes the Towers of Hanoi exercise. Exactly how and when the agent chooses to intervene (or not) will be based on the real-time information from the sensors as well as the personality type of the student.
Litman and Forbes-Riley (2006) continue their work on ITSPOKE, an ITS that helps students with physics problems. They plan to make the system adapt to the student’s affective state by analysing acoustic-prosodic features of student speech in conjunction with natural language processing. However, they are still at the stage of improving the speech analyser, and research into how ITSPOKE will adapt to student emotion remains future work.
Researchers at Essex University and Shanghai Jiao Tong University plan to test a special emotion-aware tutoring system on students in China, with a view towards augmenting distance learning technology (Simonite, 2007). Students will wear a Bluetooth-enabled sensor ring on their finger, which will transmit heart rate, blood pressure and skin conductance data to the tutoring system; this data will be analysed to assess the user’s boredom and confusion levels.
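A minimal sketch of how such physiological readings might be mapped to coarse affect estimates is given below; the thresholds, baselines and labels are invented for illustration and are not those of the Essex/Shanghai Jiao Tong system.

```python
# Hypothetical sketch of mapping wearable-sensor readings to coarse affect
# estimates; the thresholds are invented and do not come from the
# Essex/Shanghai Jiao Tong system described above.

def estimate_state(heart_rate_bpm: float, skin_conductance_uS: float,
                   baseline_hr: float, baseline_sc: float) -> str:
    """Compare readings against a per-student baseline to guess a coarse state."""
    hr_delta = heart_rate_bpm - baseline_hr
    sc_delta = skin_conductance_uS - baseline_sc
    if hr_delta < -5 and sc_delta < -0.5:
        return "possible boredom"             # low arousal relative to baseline
    if hr_delta > 10 and sc_delta > 1.0:
        return "possible confusion/stress"    # elevated arousal
    return "no clear change"

print(estimate_state(72, 2.1, baseline_hr=80, baseline_sc=3.0))  # -> "possible boredom"
```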
References
Alepis, E., Virvou, M. and Kabassi, K. (2008). Requirements Analysis and Design of an Affective Bi-Modal Intelligent Tutoring System: The Case of Keyboard and Microphone, Studies in Computational Intelligence (SCI) 104, 9–24.
Alexander, S. T. V., Sarrafzadeh, A. (2004). Interfaces that Adapt Like Humans. Proceedings of Asia-Pacific Computer-Human Interaction 2004, Rotorua, New Zealand.
Alexander, S. T. V., Sarrafzadeh, A., Fan, C. (2003). Pay Attention! The Computer is Watching: Affective Tutoring Systems. Proceedings of E-Learn 2003, Phoenix, Arizona.
Burleson, W. (2006). Affective learning companions: Strategies for empathetic agents with real-time multimodal affective sensing to foster meta-cognitive and meta-affective approaches to learning, motivation, and perseverance. Ph.D. Thesis, Massachusetts Institute of Technology, Cambridge, Massachusetts.
Conati C. (2002). Probabilistic assessment of user's emotions in educational games, Applied Artificial Intelligence, 16, 7-8, 555-575.
Conati, C., & Maclaren, H. (2004). Evaluating a probabilistic model of student affect. In Intelligent Tutoring Systems. Maceio, Brazil.
Conati, C., & Maclaren, H. (2005). Data-driven refinement of a probabilistic model of user affect. In User Modeling. Edinburgh, Scotland.
Conati, C., & Merten, C. (2007). Eye-tracking for user modeling in exploratory learning environments: An empirical evaluation. Knowledge-Based Systems, special issue on Advances in Intelligent User Interfaces, 20(6), 557-574.
D'Mello, S. K., Craig, S. D., Gholson, B., Franklin, S., Picard, R. and Graesser, A. C. (2005). Integrating affect sensors in an intelligent tutoring system. In Affective Interactions: The Computer in the Affective Loop Workshop at 2005 International conference on Intelligent User Interfaces.
D’Mello, S. K., Picard, R. W., & Graesser, A. C. (2007). Toward an affect-sensitive AutoTutor. IEEE Intelligent Systems, in press.
Graesser, A. C., Lu, S., Jackson, G. T., Mitchell, H., Ventura, M., Olney, A. & Louwerse, M. M. (2004). AutoTutor: A tutor with dialogue in natural language. Behavior Research Methods, Instruments, & Computers, 36, 180-193.
Kort, B., Reilly, R. and Picard, R. W (2001). An affective model of interplay between emotions and learning: Reengineering educational pedagogy - building a learning companion. Proceedings of IEEE International Conference on Advanced Learning Technologies, pp.43-48.
Litman, D. J. and Forbes-Riley, K. (2006). Recognizing Student Emotions and Attitudes on the Basis of Utterances in Spoken Tutoring Dialogues with both Human and Computer Tutors. Speech Communication, in press.
Ortony, A., Clore, G., & Collins, A. (1988). The Cognitive Structure of Emotions. Cambridge, England: Cambridge University Press.
Picard, R. W. (1997). Affective Computing. MIT Press, Cambridge, MA.
Sarrafzadeh, A., Gholam Hosseini, H., Fan, C., Overmyer S. P. (2003). Using Machine Intelligence to Estimate Human Cognitive and Emotional States from Non-Verbal Signals, IEEE International Conference on Advanced Learning Technologies, Athens, Greece, July, 2003.
Sarrafzadeh, A., Fan, C., Dadgostar, F., Alexander, S. T. V. and Messom, C. (2004). Frown gives game away: Affect sensitive tutoring systems for elementary mathematics. Proceedings of the IEEE Conference on Systems, Man and Cybernetics, The Hague.
Sarrafzadeh, A., Alexander, S., Dadgostar, F., Fan, C., & Bigdeli, A. (2008). How do you know that I don't understand? A look at the future of intelligent tutoring systems. Computers in Human Behavior, 24(4), 1342-1363.
Simonite, T. (2007). Emotion-aware teaching software tracks student attention. Retrieved January 9, 2007, from http://www.newscientisttech.com/article.ns?id=dn10894&feedId=online-news_rss20.
Del Soldato, T. (1994). Motivation in Tutoring Systems. Technical Report CSRP 303, School of Cognitive and Computing Sciences, The University of Sussex, UK.
de Vicente, A. (2003). Towards Tutoring Systems that Detect Students' Motivation: An Investigation, Ph.D. Thesis, School of Informatics, University of Edinburgh.