Introduction to the Degree Thesis Project (Introducción al Trabajo de Título)

InCA-ActiveFloor: Touch interfaces for medium-sized animals

Modality: Memoria, Doble Titulación (dual degree)
Areas: Human-centered computing, Computing for science and engineering, Computing infrastructure, Software engineering, Artificial intelligence

Advisor
Co-advisor
Subareas: Social computing, Educational informatics, Human-computer interaction, Applications in science and engineering, Computer graphics, Robotics, Software development, Computer vision

Description


     Solutions that allow non-human animals to communicate using human words, by triggering the playback of recorded words on digital devices, have proliferated since 2015. Medium-sized animals have had success with AAC voice buttons, while smaller animals have had success with AAC applications on touch-screen devices.
     However, voice buttons fixed on a flat surface are much more limited than touch-screen interfaces for communication, and touch-screen devices are generally too small (and often too fragile) to be used with medium-sized animals.
     Since the cost of video projectors and video cameras has dropped greatly over the last decade, could one project the image of an interface onto the floor, then record and analyze videos of subjects interacting with that interface in order to generate the appropriate response?
     We propose to design, implement and validate three types of virtual interfaces that can be projected onto the floor and interacted with: 1) a simple static interface simulating a piano, with keys associated with musical notes or animal noises; 2) a more specific interface serving as a communication keyboard, with keys associated with recordings of human words used in inter-species communication; and 3) an animated interface allowing the subject to play a game of "pop the balloon".
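     As an illustration only (not part of the proposal text), the following Python sketch shows one way such a floor-projected piano interface could be driven: key zones are defined as rectangles in camera coordinates, and a background-subtraction step on the overhead camera feed flags which zone a subject is touching. The key layout, the projector/camera calibration, and the sound-playback step are all assumptions made for this example.

import cv2

# Hypothetical piano layout: each key is a rectangle (x, y, w, h) in camera
# coordinates, mapped to a note name. Real coordinates would come from a
# projector/camera calibration step, which is omitted here.
KEYS = {
    "C4": (50, 300, 100, 150),
    "D4": (160, 300, 100, 150),
    "E4": (270, 300, 100, 150),
}

def triggered_keys(foreground_mask, min_active_pixels=2000):
    """Return the keys whose zone contains enough foreground (moving) pixels."""
    active = []
    for note, (x, y, w, h) in KEYS.items():
        zone = foreground_mask[y:y + h, x:x + w]
        if cv2.countNonZero(zone) > min_active_pixels:
            active.append(note)
    return active

def main():
    camera = cv2.VideoCapture(0)                       # overhead camera watching the floor
    subtractor = cv2.createBackgroundSubtractorMOG2()  # separates the subject from the floor

    while True:
        ok, frame = camera.read()
        if not ok:
            break
        mask = subtractor.apply(frame)
        # Discard shadows/noise before counting pixels per key zone.
        _, mask = cv2.threshold(mask, 200, 255, cv2.THRESH_BINARY)
        for note in triggered_keys(mask):
            print("play", note)                        # placeholder for actual sound playback
        cv2.imshow("foreground", mask)
        if cv2.waitKey(30) & 0xFF == 27:               # Esc to quit
            break

    camera.release()
    cv2.destroyAllWindows()

if __name__ == "__main__":
    main()

     A real implementation would also need to warp between projector and camera coordinates and debounce repeated detections, but the zone-plus-foreground idea above is the core of the static interfaces.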
     In order to validate the usability of such interfaces, the time and type of each interaction will be logged on a central server for later analysis. As a preliminary validation, we will consider subjects from three distinct species and measure the timing and types of their interactions with each interface, exploring various color and cadence settings for each, in order to draw preliminary maps of 1) the intersection of each species' color-perception abilities with the color-display capabilities of the video projector used, and 2) the relation between each species' Critical Flicker Fusion Frequency (CFFF) (Mankowska et al., 2021) at different times of day and the cadence of the interfaces.
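     As a purely illustrative sketch (the endpoint URL, field names, and record schema are assumptions, not part of the proposal), each interaction could be sent to the central server as a small JSON record carrying the timestamp, species, interface type, interaction type, and the color/cadence settings active at that moment, so the preliminary maps can be built offline:

import time
import requests  # assumed HTTP client; the server endpoint below is hypothetical

LOG_ENDPOINT = "https://example.org/inca-activefloor/interactions"  # placeholder URL

def log_interaction(species, interface, interaction, color_setting, cadence_hz):
    """Send one interaction event to the central server for later analysis."""
    event = {
        "timestamp": time.time(),        # when the interaction happened (Unix seconds)
        "species": species,              # one of the three subject species
        "interface": interface,          # "piano", "keyboard" or "balloon"
        "interaction": interaction,      # e.g. which key or balloon was activated
        "color_setting": color_setting,  # color palette currently projected
        "cadence_hz": cadence_hz,        # display/animation cadence being tested
    }
    # Fail softly: a lost log entry should not interrupt the running session.
    try:
        requests.post(LOG_ENDPOINT, json=event, timeout=2)
    except requests.RequestException:
        pass

# Example: a subject pressed the "C4" key while a warm palette was shown at 30 Hz.
log_interaction("dog", "piano", "key:C4", "warm", 30)

     Keeping the color and cadence settings in every record is what would later allow the color-perception and CFFF maps to be derived directly from the interaction logs.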

 

See the extended research proposal at https://www.overleaf.com/read/mzqjbvyttzsy for more information.