Bakhtiar Yusuf

55625368700

Publications - 3

A novel multimodal communication framework using robot partner for aging population

Publication Name: Expert Systems with Applications

Publication Date: 2015-06-01

Volume: 42

Issue: 9

Page Range: 4540-4555

Description:

In developed countries such as Japan, aging has become a serious issue: a disproportionately large and growing elderly population is no longer able to look after itself. To tackle this issue, we introduce a human-friendly robot partner to support elderly people in their daily life. To realize this, however, it is essential for the robot partner to communicate naturally with the human. This paper proposes a new communication framework between the human and the robot partner based on relevance theory as its knowledge basis. Relevance theory is implemented to build a mutual cognitive environment between the human and the robot partner, referred to as the informationally structured space (ISS). Inside the ISS, the robot partner employs both verbal and non-verbal communication to understand the human. For verbal communication, Rasmussen's behavior model is implemented as the basis of the conversational system. For non-verbal communication, environmental and human state data along with gesture recognition are utilized; these data serve as the perceptual input to compute the robot partner's emotion. Experimental results show the effectiveness of the proposed communication framework in establishing natural communication between the human and the robot partner.
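The idea of computing the robot partner's emotion from weighted perceptual inputs can be pictured with a minimal sketch; the function name, the decay blend, and the cue names and weights below are illustrative assumptions, not the paper's actual model:

```python
def update_emotion(emotion, percepts, decay=0.8):
    """Blend a decaying emotion value with weighted perceptual inputs.

    `percepts` maps a cue name to a (value, weight) pair, with values in
    [-1, 1] (negative = unpleasant, positive = pleasant). The previous
    emotion decays each step and is nudged toward the weighted average
    of the current cues; the result is clamped to [-1, 1].
    """
    stimulus = sum(v * w for v, w in percepts.values())
    total_w = sum(w for _, w in percepts.values()) or 1.0
    new = decay * emotion + (1 - decay) * (stimulus / total_w)
    return max(-1.0, min(1.0, new))
```

Repeated updates with a stable set of cues make the emotion value converge toward the cues' weighted average, which gives the smooth emotional dynamics such a model aims for.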

Open Access: Yes

DOI: 10.1016/j.eswa.2015.01.016

Computational intelligence for gestural communication using emotional model

Publication Name: IWACIII 2013 - 3rd International Workshop on Advanced Computational Intelligence and Intelligent Informatics

Publication Date: 2014-01-01

Volume: Unknown

Issue: Unknown

Page Range: Unknown

Description:

Natural communication requires understanding not only verbal communication but also non-verbal communication such as gestural information. By "understand" we mean not only recognizing an action, but also grasping the meaning of the gesture itself. Therefore, to understand the meaning of an action, this paper proposes an emotional model together with a gesture recognition technique. First we discuss the gesture recognition method, which uses the iPhone camera and applies a steady-state genetic algorithm, a spiking neural network, and a self-organizing map. We then use the gesture recognition result as input to the emotional model.
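The self-organizing-map step of such a recognition pipeline can be illustrated with a minimal pure-Python sketch; the 1-D grid size, learning schedule, and toy feature vectors are illustrative assumptions, not the paper's configuration:

```python
import math
import random


def train_som(data, grid_size=4, dim=2, epochs=50, lr0=0.5, sigma0=2.0):
    """Train a 1-D self-organizing map on feature vectors (illustrative)."""
    random.seed(0)
    nodes = [[random.random() for _ in range(dim)] for _ in range(grid_size)]
    for t in range(epochs):
        lr = lr0 * (1 - t / epochs)           # learning rate shrinks over time
        sigma = sigma0 * (1 - t / epochs) + 0.5  # neighbourhood radius shrinks too
        for x in data:
            # best-matching unit: the node closest to the input vector
            bmu = min(range(grid_size),
                      key=lambda i: sum((nodes[i][d] - x[d]) ** 2
                                        for d in range(dim)))
            for i in range(grid_size):
                # Gaussian neighbourhood pulls nodes near the BMU toward the input
                h = math.exp(-((i - bmu) ** 2) / (2 * sigma ** 2))
                for d in range(dim):
                    nodes[i][d] += lr * h * (x[d] - nodes[i][d])
    return nodes


def classify(nodes, x):
    """Return the index of the best-matching node for feature vector x."""
    return min(range(len(nodes)),
               key=lambda i: sum((nodes[i][d] - x[d]) ** 2
                                 for d in range(len(x))))
```

After training, similar gesture feature vectors map to the same node, so node indices act as discrete gesture categories that downstream stages (such as an emotional model) can consume.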

Open Access: Yes

DOI: Not available

Extraction of daily life log measured by smart phone sensors using neural computing

Publication Name: Procedia Computer Science

Publication Date: 2013-01-01

Volume: 22

Issue: Unknown

Page Range: 883-892

Description:

This paper deals with information extraction from a daily life log measured by smartphone sensors. Two types of neural computing are applied to estimate human activities from the time series of the measured data. Acceleration, angular velocity, and movement distance are measured by the smartphone sensors and stored as entries of the daily life log together with the activity information and a timestamp. First, growing neural gas clusters the data. Then, a spiking neural network is applied to estimate the activity. Experiments verify the effectiveness of the proposed method.
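The spiking-neural-network stage can be pictured with a leaky integrate-and-fire neuron, a common simple spiking model; the feature encoding, weights, and parameters below are illustrative assumptions, not the paper's actual network:

```python
def lif_response(inputs, weights, tau=0.9, threshold=1.0):
    """Count output spikes of a leaky integrate-and-fire neuron.

    At each time step the membrane potential decays by factor `tau`,
    accumulates the weighted input, and fires (then resets to zero)
    when it reaches `threshold`. The spike count over the series is
    the neuron's response to that activity pattern.
    """
    potential = 0.0
    spikes = 0
    for x in inputs:
        potential = tau * potential + sum(w * xi for w, xi in zip(weights, x))
        if potential >= threshold:
            spikes += 1
            potential = 0.0
    return spikes
```

A neuron weighted toward an acceleration-like feature fires often on a vigorous activity series and rarely on a quiet one, so comparing spike counts across neurons tuned to different activities yields an activity estimate.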

Open Access: Yes

DOI: 10.1016/j.procs.2013.09.171