Hunor Nagy

Scopus Author ID: 56689648800

Publications: 6

Improving the audio game–playing performances of people with visual impairments through multimodal training

Publication Name: Journal of Visual Impairment & Blindness

Publication Date: 2017-03-01

Volume: 111

Issue: 2

Page Range: 148-164

Description:

Introduction: As the number of people with visual impairments (that is, those who are blind or have low vision) continues to increase, rehabilitation and engineering researchers have identified the need to design sensory-substitution devices that offer assistance and guidance to these people for performing navigational tasks. Auditory and haptic cues have been shown to be effective in creating a rich spatial representation of the environment, so they are considered for inclusion in assistive tools that would enable people with visual impairments to acquire knowledge of the surrounding space in a way close to the vision-based perception of sighted individuals. However, achieving efficiency with a sensory-substitution device requires extensive training, as visually impaired users must learn to process the artificial auditory cues and convert them into spatial information.

Methods: Considering the potential advantages game-based learning can provide, we propose a new method for training the sound-localization and virtual-navigation skills of visually impaired people in a 3D audio game with hierarchical levels of difficulty. The training procedure follows a multimodal (auditory and haptic) learning approach in which subjects were asked to listen to 3D sounds while simultaneously perceiving a series of vibrations on a haptic headband corresponding to the direction of the sound source in space.

Results: In a sound-localization experiment with 10 visually impaired people, the proposed training strategy produced significant improvements in the subjects' auditory performance and navigation skills, ensuring behavioral gains in the spatial perception of the environment.

Open Access: Yes

DOI: 10.1177/0145482X1711100206

Contrasting results and effectiveness of controlled experiments with crowdsourced data in the evaluation of auditory reaction times

Publication Name: 7th IEEE International Conference on Cognitive Infocommunications (CogInfoCom 2016), Proceedings

Publication Date: 2017-01-03

Volume: Unknown

Issue: Unknown

Page Range: 421-425

Description:

We developed an application for the Android platform to test reaction times to auditory stimuli on mobile devices. During tests, users were asked to respond as quickly as possible to auditory events played back over headphones, and also to identify the directions of those events based on stereo panning. This paper presents a comparative evaluation of the data (i.e., response times in seconds and error rates) collected through controlled, supervised experiments as opposed to a crowdsourcing-based solution. It is demonstrated that in some respects crowdsourced and laboratory data show similar results within their own category, but at the same time a statistical comparison between the two measurement configurations is difficult due to the significant amount of noisy (outlier) data in the crowdsourced case.
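The paper does not spell out how the noisy crowdsourced data were screened; as an illustrative sketch only, reaction times could be filtered with a median-absolute-deviation (MAD) rule, a robust alternative to mean/standard-deviation filtering for timing data (the `mad_outlier_mask` helper, the example values, and the 3.5 threshold are assumptions, not the authors' method):

```python
import statistics

def mad_outlier_mask(samples, threshold=3.5):
    """Flag outliers via the modified z-score based on the median
    absolute deviation (MAD), which resists the extreme values
    typical of unsupervised crowdsourced reaction-time data."""
    med = statistics.median(samples)
    mad = statistics.median(abs(x - med) for x in samples)
    if mad == 0:
        return [False] * len(samples)
    # 0.6745 makes the MAD consistent with the standard deviation
    # under a normal distribution (modified z-score convention).
    return [abs(0.6745 * (x - med) / mad) > threshold for x in samples]

# Hypothetical crowdsourced reaction times in seconds, with two
# implausible entries (e.g., participants who were distracted).
times = [0.31, 0.28, 0.35, 0.30, 4.20, 0.33, 0.29, 7.90]
clean = [t for t, out in zip(times, mad_outlier_mask(times)) if not out]
```

A fixed threshold like this is simple but blunt; the statistical comparison discussed in the paper would still need to account for how much data such screening removes from each condition.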

Open Access: Yes

DOI: 10.1109/CogInfoCom.2016.7804586

Evaluation of human-Myo gesture control capabilities in continuous search and select operations

Publication Name: 7th IEEE International Conference on Cognitive Infocommunications (CogInfoCom 2016), Proceedings

Publication Date: 2017-01-03

Volume: Unknown

Issue: Unknown

Page Range: 415-420

Description:

Tactile and haptic devices can be used to control and interact with a wide range of systems, including games, virtual environments and assistive technologies. Although many psychophysical studies have measured thresholds of human sensory capabilities for interpreting haptic and tactile feedback, relatively little is known about the precision with which we are able to guide the behavior of a system using kinesthetic and myoelectric gestures. A broad study of the latter problem is important, especially now that a number of devices have appeared, such as the Leap Motion Controller and the Myo armband, that enable humans to use finger, hand and arm gestures to interact with the digital world. This paper provides a broad overview of the topic and reports a set of preliminary experiments on the extent to which the Myo armband can be used to control auditory feedback in real time. Test results are evaluated using a Bayesian statistical model of an empirical (but for the most part unambiguous) performance scale. The goal is to investigate ways in which visually impaired users could use the Myo to control the output of an assistive technology.
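The abstract mentions a Bayesian statistical model of a performance scale without giving its form; a minimal, generic sketch of a Bayesian treatment of search-and-select accuracy is the conjugate Beta-Binomial update shown below (the `beta_posterior` helper, the uniform prior, and the trial counts are illustrative assumptions, not the model used in the paper):

```python
from math import sqrt

def beta_posterior(successes, trials, a_prior=1.0, b_prior=1.0):
    """Posterior over a user's selection accuracy: a Beta(a, b) prior
    updated with binomial trial outcomes (conjugate update).
    Returns the posterior mean and standard deviation."""
    a = a_prior + successes
    b = b_prior + (trials - successes)
    mean = a / (a + b)
    var = a * b / ((a + b) ** 2 * (a + b + 1))
    return mean, sqrt(var)

# Hypothetical session: 18 correct selections in 24 trials.
mean, sd = beta_posterior(18, 24)
```

The appeal of a Bayesian summary here is that the standard deviation shrinks as trials accumulate, so per-subject uncertainty from short preliminary sessions is made explicit rather than hidden in a point estimate.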

Open Access: Yes

DOI: 10.1109/CogInfoCom.2016.7804585

Evaluation of training to improve auditory memory capabilities on a mobile device based on a serious game application

Publication Name: 142nd Audio Engineering Society International Convention (AES 2017)

Publication Date: 2017-01-01

Volume: Unknown

Issue: Unknown

Page Range: Unknown

Description:

Capabilities of the auditory memory system were tested in a serious game application developed for the Android mobile platform. Participants played the well-known game of finding pairs by flipping and remembering objects on cards arranged in a matrix structure, with the visual objects replaced by iconic auditory events (auditory icons, earcons). Total time and different error rates were recorded, and the effect of training was also evaluated. Results indicate that training contributes to better performance and that human voice samples are the easiest to remember.
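The core pairing mechanic described above can be sketched in a few lines; the board layout, icon names and helper functions below are hypothetical illustrations, not taken from the published application:

```python
import random

def make_board(icons, seed=None):
    """Lay out each auditory icon twice on a shuffled, flattened grid,
    mirroring a pairs game where cards hide sounds instead of images."""
    board = list(icons) * 2
    random.Random(seed).shuffle(board)
    return board

def is_match(board, pos_a, pos_b):
    """One move: flip two cards (play their sounds) and report
    whether the two positions hold the same auditory icon."""
    return pos_a != pos_b and board[pos_a] == board[pos_b]

# Hypothetical 2x3 board mixing a human voice sample with earcons.
board = make_board(["voice_hello", "earcon_chime", "icon_dog_bark"], seed=1)
```

In a study like the one described, the game loop would additionally log total completion time and per-move mismatch errors, which are the measures the abstract says were recorded.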

Open Access: Yes

DOI: Not available

Evaluation of response times on a touch screen using stereo panned speech command auditory feedback

Publication Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)

Publication Date: 2016-01-01

Volume: 9811 LNCS

Issue: Unknown

Page Range: 279-286

Description:

User interfaces for mobile and handheld devices usually incorporate touch screens. Fast user responses are generally not critical; however, some applications require fast and accurate reactions from users. Errors and response times depend on many factors, such as the user's abilities, the types and latencies of feedback from the device, the sizes of the buttons to press, etc. We conducted an experiment with 17 subjects to test response time and accuracy for different kinds of speech-based auditory stimuli delivered over headphones. Speech signals were spatialized using stereo amplitude panning. Results show significantly better response times for 3 directions than for 5, as well as for the native language compared to English, and more accurate judgements based on the meaning of the speech sounds than on their direction.
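The abstract states that speech was spatialized with stereo amplitude panning but does not give the pan law; the sketch below assumes the common constant-power (quarter-circle) law, in which the left/right gains trace a quarter of the unit circle so perceived loudness stays roughly constant across pan positions:

```python
from math import cos, sin, pi

def constant_power_pan(sample, pan):
    """Constant-power stereo amplitude panning (assumed pan law).

    pan ranges from -1.0 (full left) to +1.0 (full right); the gain
    pair (cos, sin) keeps left^2 + right^2 == 1 at every position."""
    theta = (pan + 1.0) * pi / 4.0  # map [-1, 1] onto [0, pi/2]
    return sample * cos(theta), sample * sin(theta)

# Centre position: both channels at cos(pi/4), i.e. about -3 dB each.
left, right = constant_power_pan(1.0, 0.0)
```

Discrete source directions, like the 3- and 5-direction conditions in the experiment, would then correspond to a small set of fixed pan values (e.g., -1.0, 0.0, +1.0 for three directions).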

Open Access: Yes

DOI: 10.1007/978-3-319-43958-7_33

A survey of assistive technologies and applications for blind users on mobile platforms: a review and foundation for research

Publication Name: Journal on Multimodal User Interfaces

Publication Date: 2015-12-01

Volume: 9

Issue: 4

Page Range: 275-286

Description:

This paper summarizes recent developments in audio and tactile feedback based assistive technologies targeting the blind community. Current technology allows applications to be efficiently distributed and run on mobile and handheld devices, even in cases where computational requirements are significant. As a result, electronic travel aids, navigational assistance modules, text-to-speech applications, as well as virtual audio displays which combine audio with haptic channels are becoming integrated into standard mobile devices. This trend, combined with the appearance of increasingly user-friendly interfaces and modes of interaction, has opened a variety of new perspectives for the rehabilitation and training of users with visual impairments. The goal of this paper is to provide an overview of these developments based on recent advances in basic research and application development. Using this overview as a foundation, an agenda is outlined for future research in mobile interaction design with respect to users with special needs, as well as ultimately in relation to sensor-bridging applications in general.

Open Access: Yes

DOI: 10.1007/s12193-015-0182-7