Toru Yamaguchi

56501037800

Publications - 3

Gestural and facial communication with smart phone based robot partner using emotional model

Publication Name: World Automation Congress Proceedings

Publication Date: 2014-10-24

Volume: Unknown

Issue: Unknown

Page Range: 644-649

Description:

To conduct natural communication, a robot partner should understand not only verbal communication but also non-verbal communication such as facial and gestural information. For the robot, the word 'understand' means grasping the meaning of the gesture itself. In this paper we propose a smart-phone-based system in which an emotional model connects the facial and gestural communication of a human and a robot partner. The input of the emotional model is based on face classification and gesture recognition on the human side. Based on the emotional model, the output actions, such as gestural and facial expressions for the robot, are calculated.
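The abstract's pipeline (recognized human face/gesture as input, an internal emotional model, robot expression as output) can be sketched minimally. The sketch below places the emotion state on a valence-arousal plane; the input labels, nudge values, expression prototypes, and decay factor are all illustrative assumptions, not the authors' actual model.

```python
import math

# Assumed (valence, arousal) nudges for recognized human inputs.
INPUT_EFFECTS = {
    "smile": (0.4, 0.1),
    "frown": (-0.4, 0.2),
    "wave":  (0.2, 0.3),
    "idle":  (0.0, -0.2),
}

# Assumed prototype expressions for the robot partner.
EXPRESSIONS = {
    "happy":   (0.7, 0.5),
    "sad":     (-0.6, -0.3),
    "excited": (0.5, 0.8),
    "calm":    (0.0, -0.5),
}

class EmotionalModel:
    def __init__(self, decay=0.9):
        self.valence = 0.0
        self.arousal = 0.0
        self.decay = decay  # state relaxes toward neutral between inputs

    def update(self, observed):
        dv, da = INPUT_EFFECTS.get(observed, (0.0, 0.0))
        self.valence = max(-1.0, min(1.0, self.decay * self.valence + dv))
        self.arousal = max(-1.0, min(1.0, self.decay * self.arousal + da))

    def expression(self):
        # Output action = nearest expression prototype in the plane.
        return min(
            EXPRESSIONS,
            key=lambda e: math.hypot(EXPRESSIONS[e][0] - self.valence,
                                     EXPRESSIONS[e][1] - self.arousal),
        )

model = EmotionalModel()
for obs in ["smile", "smile", "wave"]:
    model.update(obs)
print(model.expression())
```

The decay term keeps the robot's mood from saturating on a single input, so repeated positive cues are needed before the state settles in a positive expression region.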

Open Access: Yes

DOI: 10.1109/WAC.2014.6936076

Computational intelligence for gestural communication using emotional model

Publication Name: IWACIII 2013 - 3rd International Workshop on Advanced Computational Intelligence and Intelligent Informatics

Publication Date: 2014-01-01

Volume: Unknown

Issue: Unknown

Page Range: Unknown

Description:

To conduct natural communication, non-verbal communication such as gestural information should be understood in addition to verbal communication. By the word "understand" we mean not only recognizing an action, but also grasping the meaning of the gesture itself. Therefore, in order to understand the meaning of an action, in this paper we propose an emotional model together with a gesture recognition technique. First we discuss a gesture recognition method using the iPhone camera, applying a steady-state genetic algorithm, a spiking neural network, and a self-organizing map. We then use the gesture recognition result as input data for the emotional model.
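Of the three techniques the abstract names, the self-organizing map is the most compact to sketch. The toy below trains a 1-D SOM on 2-D gesture feature vectors and then assigns new samples to the nearest unit; the map size, feature values, and schedules are illustrative assumptions, and the paper's full pipeline (steady-state genetic algorithm plus spiking neural network) is not reproduced here.

```python
import random

def train_som(data, n_units=4, epochs=50, lr0=0.5, radius0=1.0, seed=0):
    rng = random.Random(seed)
    # 1-D chain of units, each holding a 2-D weight vector.
    weights = [[rng.uniform(0, 1), rng.uniform(0, 1)] for _ in range(n_units)]
    for epoch in range(epochs):
        lr = lr0 * (1 - epoch / epochs)              # decaying learning rate
        radius = max(0.5, radius0 * (1 - epoch / epochs))
        for x in data:
            # Best-matching unit: smallest squared distance to the sample.
            bmu = min(range(n_units),
                      key=lambda i: sum((w - v) ** 2
                                        for w, v in zip(weights[i], x)))
            for i in range(n_units):
                # Units within the shrinking neighborhood move toward x.
                h = 1.0 if abs(i - bmu) <= radius else 0.0
                weights[i] = [w + lr * h * (v - w)
                              for w, v in zip(weights[i], x)]
    return weights

def classify(weights, x):
    return min(range(len(weights)),
               key=lambda i: sum((w - v) ** 2 for w, v in zip(weights[i], x)))

# Two illustrative gesture clusters in a normalized feature space.
data = [(0.1, 0.1), (0.15, 0.05), (0.9, 0.9), (0.85, 0.95)]
w = train_som(data, n_units=2)
print(classify(w, (0.12, 0.08)), classify(w, (0.88, 0.92)))
```

After training, samples near the two clusters land on different units, which is the property the recognition stage would exploit before handing a gesture label to the emotional model.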

Open Access: Yes

DOI: Not available

Extraction of daily life log measured by smart phone sensors using neural computing

Publication Name: Procedia Computer Science

Publication Date: 2013-01-01

Volume: 22

Issue: Unknown

Page Range: 883-892

Description:

This paper deals with the information extraction of a daily life log measured by smart phone sensors. Two types of neural computing are applied to estimate human activities based on the time series of the measured data. Acceleration, angular velocity, and movement distance are measured by the smart phone sensors and stored as entries of the daily life log together with the activity information and a timestamp. First, growing neural gas performs clustering on the data. Then, a spiking neural network is applied to estimate the activity. Experiments are performed to verify the effectiveness of the proposed method.
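The clustering stage of this pipeline can be sketched in miniature. The code below extracts simple features (mean and standard deviation of acceleration magnitude per window) and clusters them with a basic online scheme that spawns a new unit when a sample is far from every existing one, a much-simplified stand-in for growing neural gas; the features, thresholds, and sample windows are illustrative assumptions, and the spiking-neural-network activity estimator is not reproduced.

```python
import math

def features(window):
    """Mean and standard deviation of acceleration magnitude over a window."""
    mags = [math.sqrt(x * x + y * y + z * z) for x, y, z in window]
    mean = sum(mags) / len(mags)
    var = sum((m - mean) ** 2 for m in mags) / len(mags)
    return (mean, math.sqrt(var))

def cluster(samples, spawn_dist=0.5, lr=0.2):
    units = [list(samples[0])]
    labels = []
    for s in samples:
        d, best = min((math.dist(u, s), i) for i, u in enumerate(units))
        if d > spawn_dist:
            units.append(list(s))        # sample far from all units: grow
            best = len(units) - 1
        else:
            # Otherwise adapt the winning unit toward the sample.
            units[best] = [u + lr * (v - u) for u, v in zip(units[best], s)]
        labels.append(best)
    return units, labels

# Illustrative windows: resting (gravity only) vs. walking (oscillation).
rest = [(0.0, 0.0, 9.8)] * 10
walk = [(0.0, 0.0, 9.8 + (2.0 if i % 2 else -2.0)) for i in range(10)]
samples = [features(rest), features(walk), features(rest)]
units, labels = cluster(samples)
print(labels)
```

The two resting windows fall in the same cluster while the walking window spawns a new unit, so downstream activity estimation can operate on compact cluster indices instead of raw sensor streams.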

Open Access: Yes

DOI: 10.1016/j.procs.2013.09.171