Show simple item record

Gestural Based Interactions: Critical Review

dc.creatorCortés-Rico, Laura
dc.creatorPiedrahita-Solórzano, Giovanny
dc.date2019-12-05
dc.identifierhttps://revistas.itm.edu.co/index.php/tecnologicas/article/view/1512
dc.identifier10.22430/22565337.1512
dc.descriptionThis paper presents a critical review of human-computer interaction (HCI) based on gestures. Gestures, as a form of non-verbal communication, have been of interest in HCI because they enable interaction with the machine through the body, as an agent that perceives and acts in the world. The review was carried out in the most important databases in HCI, as well as some Latin American academic sources, and includes an analysis of the evolution of gesture-based interactions, current work, and future perspectives. The analysis is carried out holistically, considering both technical and human issues — psychological, social, and cultural — as well as their relationships. We present this analytical process as a scientometric description of the search results, a description of the gesture as a means of interaction, the techniques used for the different steps in the gesture recognition process, and a presentation of the applications and challenges of gesture-based interactions. It concludes with a series of questions that invite the reader to consider potential research foci in gesture-based interactions.en-US
dc.descriptionEste artículo presenta una revisión crítica de la interacción humano-computador (HCI) basada en gestos. El gesto, como una forma de comunicación no verbal, ha sido de interés para el área de HCI en la búsqueda de alternativas de interacción entre el humano y la máquina, a través del cuerpo como agente que percibe y actúa en el mundo. La revisión se hizo en las bases de datos de mayor importancia en HCI y en algunas fuentes de literatura académica latinoamericana en el área, e incluye un análisis de la evolución de las interacciones basadas en gestos, el trabajo actual y las perspectivas a futuro. El análisis se desarrolla de forma holística y abarca asuntos técnicos y humanos: psicológicos, sociales y culturales, así como su relación. Este proceso analítico se presenta como una descripción cienciométrica de los resultados de las búsquedas, a fin de exponer el gesto como medio de interacción, las técnicas utilizadas para los diferentes pasos en el proceso de reconocimiento de gestos y las aplicaciones y desafíos de las interacciones basadas en gestos. Como conclusión se formula una serie de preguntas que invitan al lector a pensar en potenciales focos de investigación en las interacciones basadas en gestos.es-ES
dc.formatapplication/pdf
dc.formattext/xml
dc.formattext/html
dc.languagespa
dc.publisherInstituto Tecnológico Metropolitano (ITM)en-US
dc.relationhttps://revistas.itm.edu.co/index.php/tecnologicas/article/view/1512/1474
dc.relationhttps://revistas.itm.edu.co/index.php/tecnologicas/article/view/1512/1565
dc.relationhttps://revistas.itm.edu.co/index.php/tecnologicas/article/view/1512/1579
dc.sourceTecnoLógicas; Vol. 22 (2019): Special issue-2019; 119-132en-US
dc.sourceTecnoLógicas; Vol. 22 (2019): Edición especial-2019; 119-132es-ES
dc.source2256-5337
dc.source0123-7799
dc.subjectGesturesen-US
dc.subjecthuman-computer interactionen-US
dc.subjectgesture recognitionen-US
dc.subjectgesture based interactionsen-US
dc.subjectGestoses-ES
dc.subjectinteracción humano-computadores-ES
dc.subjectreconocimiento de gestoses-ES
dc.subjectinteracciones basadas en gestoses-ES
dc.titleGestural Based Interactions: Critical Reviewen-US
dc.titleInteracciones basadas en gestos: revisión críticaes-ES
dc.typeinfo:eu-repo/semantics/article
dc.typeinfo:eu-repo/semantics/publishedVersion
dc.typeReview Articleen-US
dc.typeArtículos de revisiónes-ES



