Show simple item record

A Web Application to Analyze Students’ Emotions and Attention

dc.creator: Piedrahíta-Carvajal, Alejandro
dc.creator: Rodríguez-Marín, Paula Andrea
dc.creator: Terraza-Arciniegas, Daniel F.
dc.creator: Amaya-Gómez, Mauricio
dc.creator: Duque-Muñoz, Leonardo
dc.creator: Martínez-Vargas, Juan David
dc.date: 2021-05-12
dc.date.accessioned: 2021-08-19T16:21:46Z
dc.date.available: 2021-08-19T16:21:46Z
dc.identifier: https://revistas.itm.edu.co/index.php/tecnologicas/article/view/1821
dc.identifier: 10.22430/22565337.1821
dc.identifier.uri: http://test.repositoriodigital.com:8080/handle/123456789/12073
dc.description [en-US]: Analyzing and monitoring students’ attention level in virtual environments allows teachers to take actions to improve teaching-learning processes. This study introduces the integration of two models, one for emotion recognition and one for attention analysis, both aimed at monitoring the interactions of students in virtual environments. Such integration was completed on a web platform built with the Flask framework, where the artificial intelligence models used to analyze the interaction can be executed. The results obtained show that teachers, as knowledge mediators, can use the platform to understand the behavior of students in synchronous and asynchronous virtual environments and take actions to improve learning experiences. The results also highlight the advantages of employing the Model-View-Controller (MVC) pattern in web applications that use and integrate artificial intelligence techniques through the Flask framework.
dc.description [es-ES]: El análisis de emociones y el monitoreo del nivel de atención de los estudiantes en entornos virtuales permite a los docentes tomar acciones para mejorar los procesos de enseñanza-aprendizaje. Por esta razón, este trabajo presenta la integración de dos modelos: uno para el reconocimiento de emociones y otro para el análisis de atención, ambos con el objetivo de hacer monitoreo durante la interacción de un estudiante en entornos virtuales. Dicha integración se realiza en una plataforma web desarrollada en el entorno Flask, en la que se pueden ejecutar los modelos de inteligencia artificial utilizados para la interacción. Los resultados obtenidos muestran que la plataforma podría ser utilizada por docentes como mediadores del conocimiento, para entender el comportamiento de los estudiantes en entornos virtuales tanto síncronos como asíncronos, y para tomar acciones que mejoren la experiencia de aprendizaje. Como ventaja adicional, los resultados aquí mostrados resaltan las ventajas que trae utilizar el Modelo Vista Controlador (MVC) en aplicaciones web, empleando e integrando técnicas de inteligencia artificial a través del framework Flask.
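
As a rough illustration of the architecture described in the abstract (artificial intelligence models served to a web front end through a Flask controller, following the MVC pattern), the minimal Python sketch below shows one way such an endpoint could be wired up. It is not the authors' implementation: the /analyze route, the classify_emotion placeholder, and the use of OpenCV's Haar cascade (Viola-Jones) for face detection are illustrative assumptions standing in for the trained models described in the article.

import cv2
import numpy as np
from flask import Flask, jsonify, request

app = Flask(__name__)

# Model layer: face detection with OpenCV's bundled Haar cascade, plus a
# placeholder emotion classifier standing in for a trained model.
face_detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

def classify_emotion(face_img):
    # Placeholder: a real emotion-recognition model would be loaded and
    # evaluated here (e.g. a CNN returning happy/sad/neutral/...).
    return "neutral"

# Controller layer: receives one video frame, runs the models, and returns
# JSON that a view (e.g. a teacher dashboard) can render.
@app.route("/analyze", methods=["POST"])
def analyze():
    uploaded = request.files.get("frame")
    if uploaded is None:
        return jsonify({"error": "no frame uploaded"}), 400
    buffer = np.frombuffer(uploaded.read(), dtype=np.uint8)
    frame = cv2.imdecode(buffer, cv2.IMREAD_COLOR)
    if frame is None:
        return jsonify({"error": "frame could not be decoded"}), 400
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = face_detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    results = [
        {"box": [int(x), int(y), int(w), int(h)],
         "emotion": classify_emotion(gray[y:y + h, x:x + w])}
        for (x, y, w, h) in faces
    ]
    # A very crude attention proxy: is at least one face visible in the frame?
    return jsonify({"faces": results, "attentive": len(results) > 0})

if __name__ == "__main__":
    app.run(debug=True)

A client (for example, browser-side JavaScript capturing webcam frames) would POST each frame to /analyze and track the returned emotion and attention values over the course of a synchronous or asynchronous session.
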
dc.format: application/pdf
dc.format: application/zip
dc.format: text/xml
dc.format: text/html
dc.language: spa
dc.publisher [en-US]: Instituto Tecnológico Metropolitano (ITM)
dc.relation: https://revistas.itm.edu.co/index.php/tecnologicas/article/view/1821/1988
dc.relation: https://revistas.itm.edu.co/index.php/tecnologicas/article/view/1821/2060
dc.relation: https://revistas.itm.edu.co/index.php/tecnologicas/article/view/1821/1989
dc.relation: https://revistas.itm.edu.co/index.php/tecnologicas/article/view/1821/1990
dc.relation: /*ref*/ A. Gegenfurtner; S. Narciss; L. K. Fryer; S. Järvelä; J. M. Harackiewicz, “Editorial: Affective Learning in Digital Education,” Front. Psychol., vol. 11, pp. 2020–2022, Jan. 2020. https://doi.org/10.3389/fpsyg.2020.630966
dc.relation: /*ref*/ A. Puente Ferreras, Psicología contemporánea básica y aplicada. Ed. Pirámide, 2011. https://www.edicionespiramide.es/libro.php?id=2786932
dc.relation: /*ref*/ D. Hazarika; S. Poria; R. Zimmermann; R. Mihalcea, “Conversational transfer learning for emotion recognition,” Inf. Fusion, vol. 65, pp. 1–12, Jan. 2021. https://doi.org/10.1016/j.inffus.2020.06.005
dc.relation: /*ref*/ N. Ibañez, “Las emociones en el aula,” Estud. Pedagógicos, vol. 1, no. 28, pp. 31–45, 2002. https://www.redalyc.org/pdf/1735/173513847002.pdf
dc.relation: /*ref*/ A. Fernández-Castillo; M. E. Gutiérrez Rojas, “Atención selectiva, ansiedad, sintomatología depresiva y rendimiento académico en adolescentes,” Electron. J. Res. Educ. Psychol., vol. 7, no. 1, pp. 49–76, Apr. 2009. https://www.redalyc.org/articulo.oa?id=293121936004
dc.relation: /*ref*/ G. Caicedo Delgado, “La enseñanza en ingeniería,” TecnoLógicas, no. 31, pp. 9–11, Nov. 2013. https://doi.org/10.22430/22565337.95
dc.relation: /*ref*/ V. Londoño-Osorio; J. Marín-Pineda; E. I. Arango-Zuluaga, “Introduction to Artificial Vision through Laboratory Guides Using Matlab,” TecnoLógicas, pp. 591–603, 2013. https://doi.org/10.22430/22565337.350
dc.relation: /*ref*/ M. M. Bundele; R. Banerjee, “Detection of fatigue of vehicular driver using skin conductance and oximetry pulse: a neural network approach,” in 11th International Conference on Information Integration and Web-based Applications & Services, Kuala Lumpur, 2009, pp. 739–744. https://doi.org/10.1145/1806338.1806478
dc.relation: /*ref*/ C. Li; C. Xu; Z. Feng, “Analysis of physiological for emotion recognition with the IRS model,” Neurocomputing, vol. 178, pp. 103–111, Feb. 2016. https://doi.org/10.1016/j.neucom.2015.07.112
dc.relation: /*ref*/ S. K. D’Mello; S. D. Craig; A. C. Graesser, “Multimethod assessment of affective experience and expression during deep learning,” Int. J. Learn. Technol., vol. 4, no. 3/4, Oct. 2009. https://doi.org/10.1504/ijlt.2009.028805
dc.relation: /*ref*/ S. K. D’Mello; A. Graesser, “Multimodal semi-automated affect detection from conversational cues, gross body language, and facial features,” User Model. User-adapt. Interact., vol. 20, no. 2, pp. 147–187, May 2010. https://doi.org/10.1007/s11257-010-9074-4
dc.relation: /*ref*/ A. Kapoor; R. W. Picard, “Multimodal affect recognition in learning environments,” in Proceedings of the 13th ACM International Conference on Multimedia, MM 2005, pp. 677–682, Nov. 2005. https://doi.org/10.1145/1101149.1101300
dc.relation: /*ref*/ B. Mcdaniel; S. D’Mello; B. King; P. Chipman; K. Tapp; A. Graesser, “Facial Features for Affective State Detection in Learning Environments,” in UC Merced Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 29, no. 29, pp. 467–472, 2007. https://escholarship.org/content/qt9w00945d/qt9w00945d.pdf
dc.relation: /*ref*/ S. Craig; A. Graesser; J. Sullins; B. Gholson, “Affect and learning: An exploratory look into the role of affect in learning with AutoTutor,” J. Educ. Media, vol. 29, no. 3, pp. 241–250, Jul. 2010. https://doi.org/10.1080/1358165042000283101
dc.relation: /*ref*/ R. Pekrun; T. Goetz; A. C. Frenzel; P. Barchfeld; R. P. Perry, “Measuring emotions in students’ learning and performance: The Achievement Emotions Questionnaire (AEQ),” Contemp. Educ. Psychol., vol. 36, no. 1, pp. 36–48, Jan. 2011. https://doi.org/10.1016/j.cedpsych.2010.10.002
dc.relation: /*ref*/ C. Jonathan; J. P.-L. Tan; E. Koh; I. S. Caleon; S. H. Tay, “Engagement as flourishing: The contribution of positive emotions and coping to adolescents’ engagement at school and with learning,” Psychology in the Schools, vol. 45, no. 5, pp. 419–431, 2017. https://doi.org/10.1002/pits.20306
dc.relation: /*ref*/ T. E. Oliphant, “Python for scientific computing,” Comput. Sci. Eng., vol. 9, no. 3, pp. 10–20, Jun. 2007. https://doi.org/10.1109/MCSE.2007.58
dc.relation: /*ref*/ I. Challenger-Pérez; Y. Díaz-Ricardo; R. A. Becerra-García, “El lenguaje de programación Python,” Ciencias Holguín, vol. 20, no. 2, pp. 1–13, Apr. 2014. https://www.redalyc.org/pdf/1815/181531232001.pdf
dc.relation: /*ref*/ M. Anggo; La Arapu, “Face Recognition Using Fisherface Method,” in 2nd International Conference on Statistics, Mathematics, Teaching, and Research 2017, Makassar, Indonesia, 2017, pp. 998–1001, 2018. https://doi.org/10.1088/1742-6596/1028/1/012119
dc.relation: /*ref*/ W. Shen; R. Khanna, “Prolog to Face Recognition: Eigenface, Elastic Matching, and Neural Nets,” Proc. IEEE, vol. 85, no. 9, p. 1422, Sep. 1997. https://doi.org/10.1109/JPROC.1997.628711
dc.relation: /*ref*/ N. N. Mohammed; M. I. Khaleel; M. Latif; Z. Khalid, “Face Recognition Based on PCA with Weighted and Normalized Mahalanobis distance,” in International Conference on Intelligent Informatics and Biomedical Sciences (ICIIBMS), Bangkok, 2018, pp. 267–267. https://doi.org/10.1109/iciibms.2018.8549971
dc.relation: /*ref*/ I. William; D. R. Ignatius Moses Setiadi; E. H. Rachmawanto; H. A. Santoso; C. A. Sari, “Face Recognition using FaceNet (Survey, Performance Test, and Comparison),” in Proc. 2019 4th Int. Conf. Informatics Comput. ICIC, Semarang, 2019. https://doi.org/10.1109/ICIC47613.2019.8985786
dc.relation: /*ref*/ E. Winarno; I. H. Al Amin; H. Februariyanti; P. W. Adi; W. Hadikurniawati; M. T. Anwar, “Attendance System Based on Face Recognition System Using CNN-PCA Method and Real-Time Camera,” in 2019 2nd Int. Semin. Res. Inf. Technol. Intell. Syst. ISRITI, Yogyakarta, 2019, pp. 301–304. https://doi.org/10.1109/ISRITI48646.2019.9034596
dc.relation: /*ref*/ C. Li; Z. Q; N. Jia; J. Wu, “Human face detection algorithm via Haar cascade classifier combined with three additional classifiers,” in ICEMI 2017 - Proc. IEEE 13th Int. Conf. Electron. Meas. Instruments, Yangzhou, 2017, pp. 483–487. https://doi.org/10.1109/ICEMI.2017.8265863
dc.relation: /*ref*/ P. Viola; M. Jones, “Rapid object detection using a boosted cascade of simple features,” in Proc. IEEE Comput. Soc. Conf. Comput. Vis. Pattern Recognit., Kauai, 2001. https://doi.org/10.1109/cvpr.2001.990517
dc.relation: /*ref*/ L. Chen; Y. Hong Wang; Y. Ding Wang; D. Huang, “Face recognition with local binary patterns,” in Proc. 2009 Int. Conf. Mach. Learn. Cybern., Baoding, 2009, vol. 4, pp. 2433–2439. https://doi.org/10.1109/ICMLC.2009.5212189
dc.relation: /*ref*/ J. Li; T. Qiu; C. Wen; K. Xie; F. Q. Wen, “Robust face recognition using the deep C2D-CNN model based on decision-level fusion,” Sensors (Switzerland), vol. 18, no. 7, pp. 1–27, Jun. 2018. https://doi.org/10.3390/s18072080
dc.relation: /*ref*/ K. Zhang; Z. Zhang; Z. Li; Y. Qiao, “Joint Face Detection and Alignment Using Multitask Cascaded Convolutional Networks,” IEEE Signal Process. Lett., vol. 23, no. 10, pp. 1499–1503, Aug. 2016. https://doi.org/10.1109/LSP.2016.2603342
dc.relation: /*ref*/ M. R. Mufid; A. Basofi; M. U. H. Al Rasyid; I. F. Rochimansyah; A. Rokhim, “Design an MVC Model using Python for Flask Framework Development,” in 2019 International Electronics Symposium (IES), Surabaya, 2019, pp. 214–219. https://doi.org/10.1109/ELECSYM.2019.8901656
dc.relation: /*ref*/ F. A. Aslam; H. N. Mohammed; J. M. M. Munir; M. A. Gulamgaus, “Efficient Way Of Web Development Using Python And Flask,” Int. J. Adv. Res. Comput., vol. 6, no. 2, pp. 54–57, Mar. 2015. https://core.ac.uk/download/pdf/55305148.pdf
dc.rights [en-US]: Copyright (c) 2021 TecnoLógicas
dc.rights [en-US]: http://creativecommons.org/licenses/by-nc-sa/4.0
dc.source [en-US]: TecnoLógicas; Vol. 24 No. 51 (2021); e1821
dc.source [es-ES]: TecnoLógicas; Vol. 24 Núm. 51 (2021); e1821
dc.source: 2256-5337
dc.source: 0123-7799
dc.subject [en-US]: Web application
dc.subject [en-US]: Attention monitoring
dc.subject [en-US]: Emotion recognition
dc.subject [en-US]: Facial recognition
dc.subject [es-ES]: Aplicación web
dc.subject [es-ES]: Monitoreo de atención
dc.subject [es-ES]: Reconocimiento de emociones
dc.subject [es-ES]: Reconocimiento de rostros
dc.title [en-US]: A Web Application to Analyze Students’ Emotions and Attention
dc.title [es-ES]: Aplicación web para el análisis de emociones y atención de estudiantes
dc.type: info:eu-repo/semantics/article
dc.type: info:eu-repo/semantics/publishedVersion
dc.type [en-US]: Research Papers
dc.type [es-ES]: Artículos de investigación


Files in this item


No files are associated with this item.

This item appears in the following collection(s)
