Dowli, S., & Montazer, G. A. (1389 [2010]). Designing an information architecture framework for realizing the virtual university in Iran. Quarterly Journal of Information Science and Technology, 26(2), 413-440. [In Persian]
Al-Awni, A. (2016). Mood extraction using facial features to improve learning curves of students in e-learning systems. International Journal of Advanced Computer Science and Applications, 7(11), 444-453.
Ammar, M. B., Neji, M., Alimi, A. M., & Gouardères, G. (2010). The affective tutoring system. Expert Systems with Applications, 37(4), 3013-3023.
Bozorgtabar, B., Rad, M. S., Ekenel, H. K., & Thiran, J. (2019). Using photorealistic face synthesis and domain adaptation to improve facial expression analysis. In 2019 14th IEEE International Conference on Automatic Face & Gesture Recognition (FG 2019) (pp. 1-8). Lille, France. doi: 10.1109/FG.2019.8756632
D'Mello, S., Olney, A., Williams, C., & Hays, P. (2012). Gaze tutor: A gaze-reactive intelligent tutoring system. International Journal of human-computer studies, 70(5), 377-398.
Du, S., Tao, Y., & Martinez, A. M. (2014). Compound facial expressions of emotion. Proceedings of the National Academy of Sciences, 111(15), E1454-E1462.
Zhou, F., Kong, S., Fowlkes, C. C., Chen, T., & Lei, B. (2020). Fine-grained facial expression analysis using dimensional emotion model. Neurocomputing. ISSN 0925-2312. https://doi.org/10.1016/
Freitas-Magalhães, A. (2013). Facial expression of emotion: from theory to application. Leya.
Graesser, A. C., Sidney, K. D., Craig, S. D., Gholson, B., Franklin, S., & Picard, R. (2005). Integrating affect sensors in an intelligent tutoring system. In Affective Interactions: The Computer in the Affective Loop Workshop (pp. 7-13).
Hospers, M., Kroezen, E., et al. (2003). An agent-based intelligent tutoring system for nurse education. In Applications of intelligent agents in health care (pp. 143-159).
Juárez-Ramírez, R., Navarro-Almanza, R., Gomez-Tagle, Y., Licea, G., Huertas, C., & Quinto, G. (2013). Orchestrating an adaptive intelligent tutoring system: towards integrating the user profile for learning improvement. Procedia-Social and Behavioral Sciences, 106, 1986-1999.
Jin, Y., Von Seelen, W., & Sendhoff, B. (1999). On generating FC³ fuzzy rule systems from data using evolution strategies. IEEE Transactions on Systems, Man, and Cybernetics, Part B (Cybernetics), 29(6), 829-845.
Klement, M., Dostál, J., & Marešová, H. (2014). Elements of Electronic Teaching Materials with Respect to Student's Cognitive Learning Styles. Procedia-Social and Behavioral Sciences, 112, 437-446.
Kohoulat, N., Hayat, A. A., Dehghani, M. R., Kojuri, J., & Amini, M. (2017). Medical students’ academic emotions: the role of perceived learning environment. Journal of advances in medical education & professionalism, 5(2), 78.
Krithika, L. B. (2016). Student emotion recognition system (SERS) for e-learning improvement based on learner concentration metric. Procedia Computer Science, 85, 767-776.
Kumbhar, M., Jadhav, A., & Patil, M. (2012). Facial expression recognition based on image feature. International Journal of Computer and Communication Engineering, 1(2), 117.
Lin, D., & Tang, X. (2006). Recognize high resolution faces: From macrocosm to microcosm. In 2006 IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR'06) (Vol. 2, pp. 1355-1362).
Moridis, C. N., & Economides, A. A. (2012). Affective learning: Empathetic agents with emotional facial and tone of voice expressions. IEEE Transactions on Affective Computing, 3(3), 260-272.
Pekrun, R., Elliot, A. J., & Maier, M. A. (2009). Achievement goals and achievement emotions: Testing a model of their joint relations with academic performance. Journal of educational Psychology, 101(1), 115.
Pekrun, R., Goetz, T., Frenzel, A. C., Barchfeld, P., & Perry, R. P. (2011). Measuring emotions in students’ learning and performance: The Achievement Emotions Questionnaire (AEQ). Contemporary educational psychology, 36(1), 36-48.
Porta, M., Ricotti, S., & Perez, C. J. (2012). Emotional e-learning through eye tracking. In Proceedings of the 2012 IEEE Global Engineering Education Conference (EDUCON) (pp. 1-6).
Ramos, A. L. A., Dadiz, B. G., & Santos, A. B. G. (2020). Classifying emotion based on facial expression analysis using Gabor filter: A basis for adaptive effective teaching strategy. In Alfred, R., Lim, Y., Haviluddin, H., & On, C. (Eds.), Computational Science and Technology. Lecture Notes in Electrical Engineering (Vol. 603). Springer, Singapore.
Vemulapalli, R., & Agarwala, A. (2019). A compact embedding for facial expression similarity. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR) (pp. 5683-5692).
Ray, A., & Chakrabarti, A. (2012). Design and implementation of affective e-learning strategy based on facial emotion recognition. In Proceedings of the International Conference on Information Systems Design and Intelligent Applications 2012 (INDIA 2012) held in Visakhapatnam, India, January 2012 (pp. 613-622). Springer, Berlin, Heidelberg.
Rodrigo, M. M. T., Baker, R. S., Agapito, J., Nabos, J., Repalam, M. C., Reyes, S. S., & San Pedro, M. O. C. (2012). The effects of an interactive software agent on student affective dynamics while using an intelligent tutoring system. IEEE Transactions on Affective Computing, 3(2), 224-236.
Samara, A., Galway, L., Bond, R., et al. (2019). Affective state detection via facial expression analysis within a human-computer interaction context. Journal of Ambient Intelligence and Humanized Computing, 10, 2175-2184. https://doi.org/10.1007/s12652-017-0636-8
Sarrafzadeh, A., Alexander, S., Dadgostar, F., Fan, C., & Bigdeli, A. (2008). “How do you know that I don’t understand?” A look at the future of intelligent tutoring systems. Computers in Human Behavior, 24(4), 1342-1363.
Sathik, M. M., & Sofia, G. (2011). Identification of student comprehension using forehead wrinkles. In 2011 International Conference on Computer, Communication and Electrical Technology (ICCCET) (pp. 66-70).
Sivaneasharajah, L., Perera, M. A. S., & Jayasekara, P. B. (2016). 2D Facial Composite through Image Processing Techniques (Doctoral dissertation).
Wilhelm, T. (2019). Towards facial expression analysis in a driver assistance system. In 2019 14th IEEE International Conference on Automatic Face & Gesture Recognition (FG 2019) (pp. 1-4). Lille, France. doi: 10.1109/FG.2019.8756565
Tanzer, M., Weinbach, N., Mardo, E., Henik, A., & Avidan, G. (2016). Phasic alertness enhances processing of face and non-face stimuli in congenital prosopagnosia. Neuropsychologia, 89, 299-308.
Russell, J. A. (1980). A circumplex model of affect. Journal of Personality and Social Psychology, 39, 1161-1178.
Woolf, B., Burelson, W., & Arroyo, I. (2007). Emotional intelligence for computer tutors. In Workshop on Modeling and Scaffolding Affective Experiences to Impact Learning at 13th International Conference on Artificial Intelligence in Education (Vol. 616).
Wu, Y., Wang, T., & Chu, X. (2009). Affective modeling and recognition of learning emotion: Application to e-learning. Journal of Software, 4(8), 859-866.
Xu, R., Chen, J., Han, J., Tan, L., & Xu, L. (2019). Towards emotion-sensitive learning cognitive state analysis of big data in education: Deep learning-based facial expression analysis using ordinal information. Computing, 1-6.
Zhang, W., Shan, S., Qing, L., Chen, X., & Gao, W. (2009). Are Gabor phases really useless for face recognition?. Pattern Analysis and Applications, 12(3), 301-307.