JAMRIS 2013 Vol 7 No 2

pISSN 1897-8649 (PRINT) / eISSN 2080-2145 (ONLINE)

VOLUME 7, N° 2, 2013

www.jamris.org


JOURNAL of AUTOMATION, MOBILE ROBOTICS & INTELLIGENT SYSTEMS

Editor-in-Chief: Janusz Kacprzyk (Systems Research Institute, Polish Academy of Sciences; PIAP, Poland)

Co-Editors:
Oscar Castillo (Tijuana Institute of Technology, Mexico)
Dimitar Filev (Research & Advanced Engineering, Ford Motor Company, USA)
Kaoru Hirota (Interdisciplinary Graduate School of Science and Engineering, Tokyo Institute of Technology, Japan)
Witold Pedrycz (ECERF, University of Alberta, Canada)
Roman Szewczyk (PIAP, Warsaw University of Technology, Poland)

Executive Editor: Anna Ładan, aladan@piap.pl

Associate Editors:
Jacek Salach (Warsaw University of Technology, Poland)
Maciej Trojnacki (Warsaw University of Technology and PIAP, Poland)

Statistical Editor: Małgorzata Kaliczyńska (PIAP, Poland)

Editorial Office: Industrial Research Institute for Automation and Measurements PIAP, Al. Jerozolimskie 202, 02-486 Warsaw, POLAND, Tel. +48-22-8740109, office@jamris.org

Copyright and reprint permissions: Executive Editor

Editorial Board: Chairman: Janusz Kacprzyk (Polish Academy of Sciences; PIAP, Poland) Mariusz Andrzejczak (BUMAR, Poland) Plamen Angelov (Lancaster University, UK) Zenn Bien (Korea Advanced Institute of Science and Technology, Korea) Adam Borkowski (Polish Academy of Sciences, Poland) Wolfgang Borutzky (Fachhochschule Bonn-Rhein-Sieg, Germany) Chin Chen Chang (Feng Chia University, Taiwan) Jorge Manuel Miranda Dias (University of Coimbra, Portugal) Bogdan Gabryś (Bournemouth University, UK) Jan Jabłkowski (PIAP, Poland) Stanisław Kaczanowski (PIAP, Poland) Tadeusz Kaczorek (Warsaw University of Technology, Poland) Marian P. Kaźmierkowski (Warsaw University of Technology, Poland) Józef Korbicz (University of Zielona Góra, Poland) Krzysztof Kozłowski (Poznań University of Technology, Poland) Eckart Kramer (Fachhochschule Eberswalde, Germany) Piotr Kulczycki (Cracow University of Technology, Poland) Andrew Kusiak (University of Iowa, USA) Mark Last (Ben–Gurion University of the Negev, Israel) Anthony Maciejewski (Colorado State University, USA) Krzysztof Malinowski (Warsaw University of Technology, Poland) Andrzej Masłowski (Warsaw University of Technology, Poland)

Patricia Melin (Tijuana Institute of Technology, Mexico) Tadeusz Missala (PIAP, Poland) Fazel Naghdy (University of Wollongong, Australia) Zbigniew Nahorski (Polish Academy of Sciences, Poland) Antoni Niederliński (Silesian University of Technology, Poland) Witold Pedrycz (University of Alberta, Canada) Duc Truong Pham (Cardiff University, UK) Lech Polkowski (Polish-Japanese Institute of Information Technology, Poland) Alain Pruski (University of Metz, France) Leszek Rutkowski (Częstochowa University of Technology, Poland) Klaus Schilling (Julius-Maximilians-University Würzburg, Germany) Ryszard Tadeusiewicz (AGH University of Science and Technology in Cracow, Poland)

Stanisław Tarasiewicz (University of Laval, Canada) Piotr Tatjewski (Warsaw University of Technology, Poland) Władysław Torbicz (Polish Academy of Sciences, Poland) Leszek Trybus (Rzeszów University of Technology, Poland) René Wamkeue (University of Québec, Canada) Janusz Zalewski (Florida Gulf Coast University, USA) Marek Zaremba (University of Québec, Canada) Teresa Zielińska (Warsaw University of Technology, Poland)

Publisher: Industrial Research Institute for Automation and Measurements PIAP

If in doubt about the proper edition of contributions, please contact the Executive Editor. Articles are reviewed, excluding advertisements and descriptions of products. The Editor does not take responsibility for the contents of advertisements, inserts, etc. The Editor reserves the right to make relevant revisions, abbreviations, and adjustments to the articles.

All rights reserved ©



JOURNAL OF AUTOMATION, MOBILE ROBOTICS & INTELLIGENT SYSTEMS VOLUME 7, N° 2, 2013

CONTENTS

Guest Editors: Piotr Skrzypczynski, Andrzej Kasinski, Gurvinder S. Virk

3   Editorial

5   Enhancing Sensor Capabilities of Walking Robots Through Cooperative Exploration with Aerial Robots
    Georg Heppner, Arne Roennau, Ruediger Dillmann

12  The Influence of Drive Unit on Measurement Error of Ultrasonic Sensor in Multi-Rotor Flying Robot
    Stanisław Gardecki, Jarosław Goslinski

18  Tactile Sensing for Ground Classification
    Krzysztof Walas

24  Autonomous Person Following with 3D LIDAR in Outdoor Environment
    Andreas Beck-Greinwald, Sebastian Buck, Henrik Marks, Andreas Zell

30  Calibration of a Rotating 2D Laser Range-Finder Using Point-Plane Constraints
    Edmond Wai Yan So, Filippo Basso, Emanuele Menegatti

39  Sensory System Calibration Method for a Walking Robot
    Przemysław Łabecki, Dominik Belter

46  The Registration System for the Evaluation of Indoor Visual SLAM and Odometry Algorithms
    Adam Schmidt, Marek Kraft, Michał Fularz, Zuzanna Domagała

52  Communication Atmosphere in Humans and Robots Interaction Based on the Concept of Fuzzy Atmosfield Generated by Emotional States of Humans and Robots
    Zhen-Tao Liu, Min Wu, Dan-Yun Li, Lue-Feng Chen, Fang-Yan Dong, Yoichi Yamazaki, Kaoru Hirota

64  Motion Direction Control of a Robot Based on Chaotic Synchronization Phenomena
    Christos Volos

70  Application of Magnetovision for Detection of Dangerous Objects
    Michał Nowicki, Roman Szewczyk



This special section of JAMRIS, entitled EXTENDING AUTONOMY OF MOBILE ROBOTS: SENSORS AND MULTI-SENSOR SYSTEMS, contains seven papers that appeared at the International Workshop on Perception for Mobile Robots Autonomy (PEMRA), held in Poznan, Poland, on September 27–28, 2012. PEMRA was intended to provide a platform for researchers to present and discuss their latest achievements, future challenges, and applications of perception systems for mobile robots in general, and walking robots in particular. The event focused on sensing, perception, environment modelling, and sensor-based control in unstructured and dynamic environments. The workshop was organized by the Institute of Control and Information Engineering, Poznan University of Technology, under the auspices of CLAWAR Association Limited.

It was decided at an early stage of preparing the workshop to publish the PEMRA papers in the Journal of Automation, Mobile Robotics and Intelligent Systems. The papers included in this special section were improved and extended by the authors after the workshop, and all of them underwent the regular JAMRIS review procedure. We would therefore like to express our gratitude to the reviewers for their time and effort in evaluating the final papers. The authors of the papers accepted for journal publication also had the opportunity to take into account the questions and comments that emerged during the extensive discussions in the technical sessions of the workshop.

Fifteen papers were presented at the workshop; this second PEMRA 2012 special section published in JAMRIS contains seven of them. These papers cover various aspects of robotic perception, sensor modelling, and applications of modern sensors and multi-sensor systems. Sensors are key components of all types of autonomous robots: wheeled, legged, and aerial. Therefore, a great amount of research has been conducted in the field of sensors for robotic applications. The reported research covers principles of sensor operation, evaluation of sensor characteristics, and fusion of sensory data. However, the amount of research dealing with the performance of sensors in real autonomous mobile robots is still relatively low. In particular, progress is required in sensing and sensor integration for robots operating in challenging environments and conditions, and performing demanding or critical tasks, such as search and rescue missions. Moreover, as mobile robots are gradually introduced into application areas involving human-robot interaction, sensor-based recognition of human behaviour also becomes an important topic. The papers presented in this special section address these research areas.

The first paper, by Heppner, Roennau and Dillmann, deals with a multi-sensor system distributed over cooperating robots: a walking machine and an aerial robot. The authors show how to enhance the perception capabilities of the walking robot LAURON IVc with the lightweight ARDrone quadrocopter. In this application the walking robot acts as a base station providing localization information to the quadrocopter, while the vision system of the ARDrone greatly extends the perception range of LAURON, particularly in challenging urban search and rescue tasks, where certain important objects (e.g. human victims) might be occluded by obstacles or too far away.

The aerial robot theme is continued in the second paper, by Gardecki, Goslinski and Giernacki. They present an in-depth analysis of the impact the drive units of a quadrocopter have on the range data obtained from on-board ultrasonic sensors. The paper is based on experimental results that show the influence of the location of the ultrasonic sensors with respect to the drive units on the quality of the range measurements. The investigated sensor was placed in front of, behind, and in line with the rotating propeller of the drive unit. The results allowed the authors to identify the locations where interference from the drive units affects the range readings least. Thanks to this, ultrasonic range sensors can be placed optimally on a quadrocopter, improving the quality of the measurements on a flying robot. This, in turn, makes tasks such as in-flight obstacle avoidance much safer.


The paper by Walas, in turn, deals with the sensors of a walking robot. It describes a ground classification procedure that uses information obtained with 6-axis force/torque sensors. A statistical description of the signals yields a compact representation of the ground contact type, and the signal giving the best classification results has been identified. The obtained results provide a basis for the design of new contact sensors for the robot's feet.

The important problem of human-robot interaction in a realistic scenario is tackled in the work of Bohlmann, Beck-Greinwald and Zell. This paper presents a system for the autonomous following of a walking person in outdoor environments while avoiding static and dynamic obstacles. The sensor used in the presented system is a low-resolution 3D LIDAR, and reliable person detection requires a combination of 3D features, motion detection, and tracking with a sampling Bayesian filter. Experiments with an outdoor mobile robot demonstrate how the robot, which has car-like steering, incorporates the target's path into its own path planning around local obstacles.

Calibration of on-board sensors, another important issue in mobile robotics, is also tackled in this special section. The paper by So, Basso and Menegatti deals with the problem of recovering the parameters of the rotation axis in a 3D laser scanner built by mounting a commercial 2D scanner on a pan-tilt device. Moreover, the proposed procedure performs the extrinsic calibration between the rotation axis and a camera mounted on the robot. The method simply requires scanning several planar checkerboard patterns that are also imaged by the camera. The authors show that the line-on-plane correspondences can be modelled as point-plane constraints, so a numerical solution developed for such point-plane constraint problems in the field of robot kinematics can be applied to obtain an initial estimate of the calibration parameters. These parameters are then refined by a nonlinear optimization that minimizes the line-of-sight errors in the laser scans and the re-projection errors in the camera image.

Labecki and Belter address the problem of calibrating the sensory system of a mobile robot. In this case the walking robot is equipped with two exteroceptive sensors: a 2D laser scanner and a stereo camera. The paper presents a procedure that allows the robot to find autonomously the positions of the sensors with respect to the coordinate frame of the robot's body. The presented results demonstrate that the calibration method improves accuracy compared to using a CAD model of the robot.

Various robotic platforms and sensors are used in mobile robotics research, so we rarely have an opportunity to evaluate competing systems and algorithms on the same dataset, which is a prerequisite for a fair comparison of the developed solutions. In an attempt to change this situation, the last paper, by Schmidt, Kraft, Fularz and Domagala, presents a new benchmarking dataset aimed at facilitating the development and evaluation of visual odometry and SLAM methods. A wheeled mobile robot equipped with three cameras, an attitude and heading reference system, and odometry is used to gather data in indoor scenarios. The ground truth trajectory of the robot is obtained using a multi-camera motion tracking system. The authors plan to make this dataset publicly available for research purposes.

We hope that this second selection of PEMRA papers shows that the workshop was a successful event and an international forum for researchers to present and discuss recent ideas in the area of sensors and perception in mobile robotics.

Piotr Skrzypczynski, Poznan University of Technology, Poland
Andrzej Kasinski, Poznan University of Technology, Poland
Gurvinder S. Virk, University of Gävle, Sweden




[Pages 5–11: "Enhancing Sensor Capabilities of Walking Robots Through Cooperative Exploration with Aerial Robots" by Georg Heppner, Arne Roennau and Ruediger Dillmann. The body of this article, including its figures, equations, tables, and references, did not survive text extraction (the embedded fonts decoded to punctuation) and is not reproduced here.]


[Pages 12–17: "The Influence of Drive Unit on Measurement Error of Ultrasonic Sensor in Multi-Rotor Flying Robot" by Stanisław Gardecki and Jarosław Goslinski. Most of the article body did not survive text extraction; the equations below are the recoverable core of the measurement model.]

The distance l to an obstacle is obtained from the round-trip time of flight t and the speed of sound v:

    l = (v * t) / 2                           (1)

    v = 331.5 * sqrt(1 + theta / 273.15)      (2)

where theta is the air temperature in degrees Celsius and 331.5 m/s is the speed of sound at 0 °C.

The spread of repeated range readings is characterized by the sample standard deviation

    s = sqrt( 1/(n-1) * sum_{i=1..n} (x_i - x_mean)^2 )      (3)

where n is the number of samples and x_mean is the sample mean:

    x_mean = 1/n * sum_{i=1..n} x_i                          (4)
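As a minimal illustration of the ranging model above (distance from time of flight with a temperature-dependent speed of sound, plus the sample statistics used to characterize repeated readings), here is a hedged Python sketch; the sensor readings are invented for illustration only:

```python
import math
import statistics

def speed_of_sound(theta_c: float) -> float:
    """Speed of sound in air [m/s] at air temperature theta_c [deg C], Eq. (2)."""
    return 331.5 * math.sqrt(1.0 + theta_c / 273.15)

def distance(tof_s: float, theta_c: float) -> float:
    """Range [m] from a round-trip time of flight tof_s [s], Eq. (1)."""
    return speed_of_sound(theta_c) * tof_s / 2.0

# Hypothetical repeated readings [m] of a fixed target
readings = [1.02, 0.99, 1.01, 1.00, 0.98]
mean = statistics.mean(readings)     # x_mean, Eq. (4)
spread = statistics.stdev(readings)  # sample standard deviation, Eq. (3)

print(round(speed_of_sound(20.0), 1))   # ~343.4 m/s at 20 deg C
print(round(distance(0.006, 20.0), 2))  # ~1.03 m for a 6 ms echo
```

Increasing spread near the propellers is exactly the effect the paper quantifies: the standard deviation of repeated readings grows when the sensor sits in the airflow of a drive unit.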


[Pages 18–23: "Tactile Sensing for Ground Classification" by Krzysztof Walas. Most of the article body did not survive text extraction. Recoverable fragments include the classification results obtained with linear, Mahalanobis, and quadratic discriminant classifiers for the individual force/torque signals (values as extracted):

    Signal    Linear  Mahalanobis  Quadratic
    Fx        0.56    0.65         0.47
    Fy        0.46    0.46         0.44
    Fz        0.23    0.26         0.21
    Tx        0.39    0.58         0.32
    Ty        0.54    0.72         0.43
    Tz        0.44    0.61         0.35
    Fz & Tz   0.15    0.18         0.10

The acknowledgement records support by grant 2011/01/N/ST7/02070.]
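The discriminant-analysis classification reported above can be illustrated with a deliberately simplified sketch: a one-dimensional quadratic discriminant rule over a single hypothetical feature (here, the spread of the Fz signal per step). The data and class names are invented for illustration, not taken from the paper:

```python
import math
import random

random.seed(0)

# Synthetic per-step features (e.g. std dev of the Fz signal) for two
# hypothetical ground types; real features come from the F/T signals.
grass  = [random.gauss(0.20, 0.03) for _ in range(200)]
gravel = [random.gauss(0.35, 0.05) for _ in range(200)]

def fit(samples):
    """Estimate the class-conditional Gaussian (mean, variance)."""
    n = len(samples)
    mean = sum(samples) / n
    var = sum((x - mean) ** 2 for x in samples) / (n - 1)
    return mean, var

models = {"grass": fit(grass), "gravel": fit(gravel)}

def classify(x):
    # Quadratic discriminant for 1-D Gaussians: pick the class with the
    # highest log-likelihood (the variance term makes the rule quadratic in x).
    def loglik(x, mean, var):
        return -0.5 * math.log(2 * math.pi * var) - (x - mean) ** 2 / (2 * var)
    return max(models, key=lambda c: loglik(x, *models[c]))

print(classify(0.21))  # near the grass mean
print(classify(0.36))  # near the gravel mean
```

Because each class keeps its own variance, the decision boundary is quadratic; forcing a shared variance would reduce this to the linear discriminant case also tabulated above.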


[Pages 24–29: "Autonomous Person Following with 3D LIDAR in Outdoor Environment" by Andreas Beck-Greinwald, Sebastian Buck, Henrik Marks and Andreas Zell. Most of the article body did not survive text extraction. A recoverable equation propagates a tracked position into the current robot frame using the robot's own motion:

    l_i^k = {}^k O_{k-1} ( l_i^{k-1} + T * v_{k-1} )      (1)

where, by context, {}^k O_{k-1} is the transform between consecutive robot poses, T the sampling period, and v_{k-1} the estimated target velocity. The recoverable results table lists three outdoor test tracks of approximately 552.7 m, 881.7 m, and 2083.3 m.]
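The prediction step of the person-following tracker (propagating a tracked position by the target's velocity, then re-expressing it in the current robot frame) can be sketched in 2-D. The decomposition of the frame change into a planar rotation and translation is an assumption made for this illustration:

```python
import math

def predict(pos, vel, T, dtheta, trans):
    """Propagate a tracked 2-D position into the current robot frame:
    l_k = O(l_{k-1} + T * v_{k-1}), with O a planar rigid transform."""
    # constant-velocity motion in the previous frame
    x = pos[0] + T * vel[0]
    y = pos[1] + T * vel[1]
    # change of frame by the robot's ego-motion (rotation dtheta, translation trans)
    c, s = math.cos(dtheta), math.sin(dtheta)
    return (c * x - s * y + trans[0], s * x + c * y + trans[1])

# person 2 m ahead, walking away at 1 m/s; robot static between frames
print(predict((2.0, 0.0), (1.0, 0.0), 0.1, 0.0, (0.0, 0.0)))  # (2.1, 0.0)
```

In a full tracker this predicted position would seed the sampling Bayesian filter mentioned in the editorial, with candidate detections weighted by their distance to the prediction.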


Journal of Automation, Mobile Robotics & Intelligent Systems

VOLUME 7,

N° 2

2013

+ ) # & ' "'

* + , % * ( -

! /(

6(

4 6( 0

/(

4 /(

& .

L

&

<

4

4 ) ) M

9 = N@@

4

4

1 #

) @ ! 23 9 ! "%)

) "%)

# !

,

1

! "%)

! ) XX AA "%)

"

("%)+ !

@3 "%)

: 0I " : @88 C $, 28"Q

23 "%)

' C3" =9

# 23

D

( :

% :%9888+

( I +

# 23

@3 "%) "%) , "%) ! 30

Articles

# - 8 = :

@3 "%) 4* 96 , H 4*6

426 496

) 23 "%) $ 4>6 H A

# ! Y>

#

"%) ! @3 "%) ,


Journal of Automation, Mobile Robotics & Intelligent Systems

! "%) 4= ;6

456 ) @3 "%) 4?6 - 4*86

!

! "%)

! 9

4**6 < ,

% : (% :+ <% @ ,

,

C )

XX AA

Y;

) < 7 / 6 ) % - x

' ˆ ' ˜

'

A # A A F ! A X A Y A Z , ' A A ' B A H ! A F B F B˜ ˜=B

A ' AH ' ) * / 3 7

- L F "%) () *+ , "%) ˆ ! ( ) *+ ω

L F - "%) ! θ ! i , "%) θi Li F ,

"%) L X L Y -

L X ! φ ! j - "%) θi

Ď ij

VOLUME 7,

N° 2

2013

A

φij Li 'ij = [Ď ij φij , Ď ij φij , 0] Li F - C F

- ! k

Bk B F

,

ˆ k dk

Ď€ k ) ) # - 5 1 7 1 * / 3 7

, "%) θi 'ij # "%) L0 F # Δθ = θi − θ0 "%) B

Li L0 H

Rωˆ (Δθ) = 0

(I 3 − Rωˆ (Δθ)) 1

(*+

# R F ! R X ! ! L0 F R F !

! ! R F

L Y L Z

L0 F

) @ ˆ = [ωx , ωy , ωz ]

, ω = [ux , uy , uz ] !

L0 F R F L0 F B

L0 R H

=

Raˆ (Ďˆ) 0

−Raˆ (Ďˆ) ∗ 1

(@+

Articles

31


Journal of Automation, Mobile Robotics & Intelligent Systems

VOLUME 7,

N° 2

2013

ˆ ()

< W (ω, 4*@6 ,

ˆ ¡ ( = 0 ω ˆ Îť() Îť = 0

T

(Νω, ˆ ( 9 3 ) ! ω E < W (ω, ()

4*@6B =

(×ω ω 2

(5+

. $+ +/ 1 * -/ ' A #< #<

8 + 6 0 ∗ = [0, u∗y , u∗z ] Y − Z #< ˆ

6 ω

⎤ ⎥ 0 u x ˆ = ⎣uy − ωuxx ωy ⎌ ∗ = − ω ωx uz − ωuxx ωz ⎥ ⎤ 0 ˆ Ă— ω ˆ 1 âŽŁâˆ’Ď‰z ⎌ ˆ= = 2 ˆ ˆ Ă— ω ωy + ωz2 ω

(2+

(9+

y

−1

Ďˆ =

ˆ = −1 (ωx ) (ˆ ¡ ω)

(>+

⎥

ωx ⎢ ωy ˆ (Ďˆ) = ⎣ ωz

âˆ’Ď‰y x ωx + ωz2 ω1âˆ’Ď‰ 2 +ω 2 y

z

x âˆ’Ď‰y ωz ω1âˆ’Ď‰ 2 +ω 2 y

z

⎤ âˆ’Ď‰z x ⎼ âˆ’Ď‰y ωz ω1âˆ’Ď‰ 2 2 y +ωz ⎌ 2 1âˆ’Ď‰x ωx + ωy ω2 +ω2 y

z

(=+ # Δθ = θi −θ0 R F Li F B Li R H

Rωˆ (Δθ)Raˆ (Ďˆ) = 0

−Rωˆ (Δθ)Raˆ (Ďˆ) ∗ 1 (;+

) . # 1 # - ,/

1 (*+ !

ˆ > 3 ) C ω L F ˆ ! = +Νω Îť ∈ R 1 (@+ (;+ ∗ = [0, u∗y , u∗z ] L L Y Z @ 3 ) # ! ˆ Ă—

( = ω ˆ × ( + Νω) ˆ =ω ˆ × = ( , ( = ω 32

Articles

. ( B / /

A + / #<

1 nk ≼ 3 Bk "%)

) 2 #

Bk C H , "%) ni ≼ 2 θi

Si = {'ij } ,

Si Bk Si,k = {'ij | 'ij ∈ Bk } nk

nk 1 {Si,k } . ( '

/ ; , . ˆ () ! (ω,

# * ) 9 : * 1 @3 "%) , ! Y@ C

! Y> "%)

3

1


Journal of Automation, Mobile Robotics & Intelligent Systems

VOLUME 7,

N° 2

2013

" . 0 < *+ "%) 0 $ S0 L C H "%)

θ0 ˆ () @+ ! (ω, Ln

+ "%) C i H $ Sni "%)

θni Ln

ˆ () L0 i H + ! (ω, 0 ˆ () 2+ & L C H (ω, L0 ˆ () !

E C H (ω, {S+ i,k } ,

+ + + 0 ˆ , ( L C H ω {Si,k } {S+ i,k }

: @ ˆ () ) ! (ω,

* "%) θni , !

! ˆ ()

! Y= # (ω,

@ 0 2 L C H ˆ () (ω, + + + 0 ˆ , (

L C H ω {Si,k } ! {S+ i,k } Y;

= N+ &

ˆ () = 6+ & (ω,

2 4 4 * 7 :!#3 * -/ / ; ' / :'- * /

2 / 7 ' / :'- * /

n 'i B n π i G F () >+ , G

B G H B F , 4*26 : = 5

4*9 *>6 - 1 2 6 --/ ; 4 * 7 :!#3 * -/ , "%)

B

G F

"%)

, "%)

'ij L F , π k ,

0 ˆ () = /+ # 4 L C H (ω,

A - ˆ k , dk ) ( C F "%) 1

- 2 n = 6

2 ) % 7 / - 4 - ) 'ij B F ˆ k dk π k Articles

33


Journal of Automation, Mobile Robotics & Intelligent Systems

VOLUME 7,

˜ + / / ˆ Fijk (/, ) = C ijk / k

N° 2

2013

(**+

C C ijk 1 Ă— 10

Ëœ 10 Ă— 1 @ M ijk / Ëœ= / ! B / 2 2 2 [qw , qw qx , qw qy , qw qz , qx , qx qy , qx qz , qy , qy qz , qz 2 ] - n n

1 (**+ 1 B ˜ + / / N F (/, ) = C /

G F B ˆ k (R 'ij + ) + dk = 0

(?+

C R ∈ SO(3) ∈ R3 G B B G H F F $ 1 q = [qw , qx , qy , qz ]

R = q 1 q R(q) (?+

B ˆ ˆ Fijk (/, ) = k (R(/) 'ij ) + / / dk + / / k =0

ˆ = / M ijk / + / / k (*8+ C 1 / P3

,

1 # / / 1 M ijk 4 × 4 ! ˆ k dk 'ij E n n 1 1 (*8+

1 / , / / E

n − 3 1 / )

n = 6 2 1

2 / 5 , E M

4*=6 4*9 *>6 -

XX! C D ! AA ) n > 6 1 n

C

1 -

n = 7 8 ≼ n > 12 n ≼ 12

8 ≼ n > 12 , - A 4*9 *>6 1 (*8+

@ /B 34

Articles

(*@+

Stacking all n constraints gives C ℓ̃ + N t = 0, where C is n×10 and N is the n×3 matrix of stacked normals n̂_kᵀ, with rank(N) = 3 (Eq. (12)). Choosing N⊥, an (n−3)×n matrix whose rows span the left null space of N, eliminates the translation:

F(q) = N⊥ C ℓ̃ = 0.        (13)

˜ , :'3 q

, R(q) ! ˜ (*@+ q 2 . 6/ /7 - * /

"%) H 4*6 "%) L C H 1

5 "%) -

A 426 "%) 1 9 "%) " A 496

5 "%) #

# 4*;6

, * "%)

1

#

C

{Si } θi ,



#<

0

: <

< < H 4*6 > 2 (9+Z -

426 ? = (;+Z " 496 > 2 (9+Z Z 1 1

"%) θi Y; # 1

A 4 / - 7 1 # - ,/

/ ; 4 ( 7 / $ "%) 0 Y> L C H Lni C H

* @( + 0

"%) "%)

θ0 θni B

Lni L0 H

=

L0 C H

−1

Ln

¡ C iH

p = (2π / Δθ) ω̂ · t,        (17)

c = (I₃ − R_ω̂(Δθ))⁺ ( t − (Δθ / 2π) p ω̂ ),        (18)

where t is the translation between the laser frames {L₀} and {L_{n_i}}, p is the screw pitch per revolution, and (·)⁺ denotes the pseudoinverse, needed because (I₃ − R_ω̂(Δθ)) is rank-deficient along ω̂.

C 4 ) % -/ # 9 7 ,

Y> 2 ,

, "%) P eP jk B

Lni L0 H

L ni L0

ˆ = 0 (I 3 − Rωˆ (Δθ))

: (I 3 − Rωˆ (Δθ)) ω

!T 1 (*5+ Lni L0 H "%) θ0 θni p = 0 !

ˆ () ! "%) ) (ω, "%) i θi L L0 H ˆ ()

! (ω, C ,

ˆ + , (+ ! ω

(*9+

# 0

A

4*@6B

â€

e_P = Σ_{j,k} ( n̂_kᵀ (R ℓ_j + t) + d_k )²,        (19)

the sum of squared point-to-plane distances over all point–plane assignments (j, k).
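Evaluating this point-to-plane cost is straightforward; a minimal sketch, assuming each point is paired with its assigned plane via parallel arrays (names are illustrative):

```python
import numpy as np

def point_plane_cost(R, t, points, normals, ds):
    """e_P of Eq. (19): sum over pairs (j, k) of the squared signed
    distance n_k . (R p_j + t) + d_k of each transformed point to its
    assigned plane (n_k, d_k)."""
    p = (R @ np.asarray(points, dtype=float).T).T + t      # R p_j + t
    r = np.einsum('ij,ij->i', p, np.asarray(normals, dtype=float))
    r = r + np.asarray(ds, dtype=float)                    # n_k.(.) + d_k
    return float(np.sum(r * r))
```

A point lying exactly on its plane contributes zero, so the cost vanishes at the true calibration.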

Lni L0

1

Δθ ˆ 2Ď€ pω

(*>+ + (I 3 − Rωˆ (Δθ)) 1 (*=+

1 (*=+

XX =AA 4*@ *56 Δθ ˆ

!

ω p ˆ

ω ! 0 1 (*>+ !

ˆ ! Δθ 2Ď€ pω ! (I 3 − Rωˆ (Δθ)) Ln

E Rωˆ (Δθ) L0 i "%) θ0 θni

1

B

P - eP jk eLOS jk

# ) = π k 'j C

Each laser point is a function of its beam angle and range, ℓ_j = f(φ_j, ρ_j). The range measurements are corrupted by noise w_j ∼ N(0, σ_L²), i.e. ρ_j = ρ⁺_j + w_j, which makes the point error e_P and the "line-of-sight" error e_LOS behave differently.

) = Articles

35


e_LOS = Σ_{j,k} ‖ ℓ⁺_j − ℓ_j ‖² = Σ_{j,k} ( ρ⁺_j − ρ_j )²,        (20)

In Eq. (20), ℓ⁺_j is the point obtained by intersecting the measurement ray with the plane π_k = (n̂_k, d_k): with ℓ̂_j = ℓ_j / ‖ℓ_j‖,

ℓ⁺_j = ( (−d_k − n̂_kᵀ t) / (n̂_kᵀ R ℓ̂_j) ) ℓ̂_j.        (21)

ˆ k , dk ) 1 (@8+

Ď€ k : (

! R, ,

ˆ k , dk )

Ď€ k : ( E

C wj âˆź N (0, C Ďƒ 2 ) %j 23 )j . B

6( #< ) 5 0' 2 5 "%) 9 # ˆ k }

Y> { 3 {Ď€ k } R

) ! ˆ k } R2 {

Ďƒ 2 '+ j − 'j + C 2 + n% Ďƒ

C 2

L H,Bk H C C

L 2

Ďƒ n' L Ďƒ 2 + n% C Ďƒ 2

n'

L Ďƒ2

%j − K ¡

j,k

Bk C R

2 k )j + B C

j,k

(@@+ k R B C k 0 1 n'

'j n% %j + + ˆ , ( , ω "%) ! 'ij

θi i = 0, . . . , ni B C

Bk C H

=

B C

k

A L ) ?

%#&:#0

Ďƒ 2 '+ ij − 'ij + C 2 + n% Ďƒ

C 2

Bk L0 ˆ C H,ω,(,C H

L 2

n'

L Ďƒ2

Ďƒ + n% C Ďƒ 2

n'

L Ďƒ2

%j − K ¡

i,j,k

Bk C R

2 k )j + B C

j,k

(@2+ "%) 0 L C H

T "%) i ˆ () 1 (*+

L C H (ω,

(5+

D , /7 - # -

, "%) (: 0I " : *88+ (3 < <,$ 39=+

) ; 36

Articles

-

#!%=!A 4O ) 1 (@2+ Ďƒ = 12 " : *88

C Ďƒ = 0.5 ! ,

, @ L



) *8 , ) ** A #

< #! ,

( + #! % " : ( +

[0, −30, 160]

) [0, −27.6, 158.9]

0â—Ś 8.08

10.14â—Ś 7.24


C ! "%)

, T

) XX AA

"%)

1 2 ∗ #: " $& <3 < B J 3 * #: B

J 1 , #: B J ∗

2 L

0

* %$8! (< 6 %54 ,

$ [ : < 3'" :"# K : , : % K < 3 ) /

# 3 # %* 4 4*6 S H % <

!

( +K B ())): '. (! ! ! , ! ! ! (! ! ! ' 2 @889 @28* @28= & 2 , !

# !

! )

E * - /

! @3 "%)

! ! "%)

23 9 3 ) , "%)

4@6 0 < % 0

K B ())) (! ! ! , ! ! ! ! ! @88= >2@ >2; 426 : -

:

0

% D0 K B (! ! ' $ *??> 9;@ 9;; 496 E " P " " 3 Q 0 3 H # !

K B ())): '. (! ! ! , ! ! ! (! ! ! ' @88; 25>9 25>? 4>6 % $ C )

!

K " $ @88> 4=6 C : I " # &W F C # 23

K B (! ! ! ' $ ! @88* Articles

37


Journal of Automation, Mobile Robotics & Intelligent Systems

4;6 C : # &W F C # 23

23 ! K ! ! ' 9> 2 9 *5* *?5 @882 456 C #

"

K < 3

\ $ @885 4?6 3 : # C % : !

23

K B ())): '. (! * ! ! , ! ! ! (! ! ! ' * @88; 9*=9 9*=? 4*86 F - )

23 :"# K < 3

] < 1 )^ ^ "

@88= 4**6 ' < I I / 0

B # . K B (! ! ! ' $ ! )=$ ! @8*8 4*@6 F : C < ! ! @nd : @889 4*26 F :

K $* $ ! - @8** 4*96 0 - - " & & K B 0Eth 4 !! - ! ! , ! ! @889 >*2 >@* 4*>6 0 - - . K . ! - ! ! *@5 * *>* *>5 @88= 4*=6 3 0 ! F " 3 A: ( 6 6 ! ; ! (! ! , $ ! C ! , @nd : ) @88= 4*;6 - P : # !

K B 9st 7 # $ ! & $ ! - * ! @8*@ 4*56 F 0 C ! !# *st : # @888


$ $ ! &

. " # / '0 %

& . + 6(

A!( 4 I $ 4

A!(

#

,

1 . #

,

! A

!

() *+ 1

,

, ,

% ) ("%)+ C $%E 89"Q

1 #

4*6 , ( 4*86+

$%E ,

* @

2 Ă— 2 ,

1.5 Ă— 1.5 ( + , 1 ' 3

:, 0 , ,

, 10 Ă— 10 0.1 Ă— 0.1 , , 1

, (

) @+ , OM OR A ( ! xM yM

xR yR zM 8+ ,

C "%) OC OS , R 4 "%) R (0#3+ :

# - 8 =

.

4 , @3 "

:

-



Journal of Automation, Mobile Robotics & Intelligent Systems

VOLUME 7,

OC zC R

yC

C R

OR

S

OS yS

zR

xR

zS

yR

zM M

R

xM

OM

yM

; 4 I

N° 2

2013

F !

. 0

D , I

$ ( $+ 456 C $ , ! " 1

496 4*96

* -/ 6 * -/ , /7

- & 1 , ! #

4?6 ,

, C

@8

:, 0 98 #

-

:

4*=6 # 23

,

. : "%) 4*56 / "%) 4**6 / ! #

4*> *;6

C 1

, ! , 1 : 426 40

Articles

.

# 4*26 %#&:#0 4>6 3 !

23

"%)

0.4

0.2

II

I

[m,rad] 0 ι β zR xR,yR,γ

-0.2

-0.4 0

sample no 10

20

30

4 - )

! () 2+ yR ! ( β+ xR ! ( ι+ ) (**

) 2+

! (29

) 2+ #

, ! ) 2 , A (ι β γ +

#C%: ,

1


Journal of Automation, Mobile Robotics & Intelligent Systems

VOLUME 7,

, Q #C%: 8 >â—¦ , (z+ 3 ! α β , (xR yR zR γ+ ( 3/- / ; 3 !

< OS : 23 ,

OC , !

,

- ! ) !

xC > 1 yC > 1 :

23 - %#&:#0 4>6 !

-

! , "%)

, ! ,

A ) "%) 1 B |xS | < 0.2 , |xS | < 0.5 , yS > 0.2 , r > 0.4, (*+ r ) ' 7 9 , R 4 R

(M iS M iC + OM B M

M

N° 2

2013

N

zM i

OM ,

"%) ,

B

R 4,R

N

2 zM i,

(>+

i=0

-

4 R B " 1 (" #+

< : <

1

4=6 #

!

! B i i

, Δ i i

B R

Δx_i := Δx_i + c₁ · r₁ · (p_i − x_i) + c₂ · r₂ · (g − x_i),        (6)

where p_i is particle i's best position found so far, g is the global best, r₁ and r₂ are random numbers drawn uniformly from [0, 1], and c₁ and c₂ are acceleration coefficients (both set to 2, following [6]).

,

#

!

@>O ! , i i

B

iS = R , · R · iS ,

(@+

x_i := x_i + Δx_i .
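The two update equations amount to one iteration of standard particle swarm optimization; a compact sketch under that assumption (variable names are illustrative):

```python
import random

def pso_step(xs, vs, pbest, gbest, c1=2.0, c2=2.0, rng=random.random):
    """One PSO iteration: for every particle, update its velocity
    increment with random pulls toward its personal best (pbest) and
    the swarm's global best (gbest), then move the particle."""
    for i, x in enumerate(xs):
        for d in range(len(x)):
            r1, r2 = rng(), rng()
            vs[i][d] += c1 * r1 * (pbest[i][d] - x[d]) + c2 * r2 * (gbest[d] - x[d])
            x[d] += vs[i][d]
    return xs, vs
```

At the optimum, both pulls vanish and the swarm stays put, which makes a convenient fixed-point check.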

iC = R , · R 4 · iC .

(2+

, " 1 (" #+ 4*@6

E

& i+1

B

1 R , OM OR

iS iC

"%) / z M is M iC , -

( +

ε = √( (1/N) Σ_{i=0}^{N} z²_{Mi} ),        (4)

the RMS deviation of the matched scan points from the ground plane z_M = 0.

x_{i+1} = x_i − (H + λ · diag(H))⁻¹ g .        (7)

(;+

(5+

1 i+1 i λ ! C
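The damped update can be sketched as a standard Levenberg–Marquardt step; the exact damping term is not fully legible in this copy, so the sketch below assumes the common Jᵀ J + λ·diag(Jᵀ J) form (names are illustrative):

```python
import numpy as np

def lm_step(x, jac, resid, lam):
    """One Levenberg-Marquardt step:
    x' = x - (J^T J + lam*diag(J^T J))^-1 J^T r.
    lam -> 0 gives a Gauss-Newton step; large lam gives small,
    gradient-descent-like steps."""
    J = np.atleast_2d(jac(x))
    r = np.atleast_1d(resid(x))
    H = J.T @ J                       # Gauss-Newton Hessian approximation
    A = H + lam * np.diag(np.diag(H))
    return x - np.linalg.solve(A, J.T @ r)
```

With lam = 0 and a linear residual r(x) = x − 3, a single step lands on the minimizer.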

!

,

E



Journal of Automation, Mobile Robotics & Intelligent Systems

Îť

Îť

E

&

Îť

" # :

C .

" #

) # -

VOLUME 7,

N° 2

2013

OS ,

OM (@+ R

0#3 ) 9/ /

A zM = 0 , Îľ 0#3 5 >== - Îľ

(, *+ ) 90

zM = 0 :

# #

/

/ 0

0 =

1!3

R 1*3

R 1A3

B

Method     x [ ]    y [ ]    z [ ]    α [°]   β [°]   γ [°]   ε [ ]
Initial    0.00     202.00   175.00   45.00   0.00    0.00    8.566
LM         7.47     199.14   170.43   44.64   0.89    2.38    7.441
PSO        3.77     197.59   168.83   44.44   0.77    2.18    7.492

(units of the translation and error columns were lost in extraction)

, C ) 9 , * ) 9#

42

Articles

A

1!3

R 1*3

R 1A3

OS ) ># ,

(2+ () >/+

() >0+ /

! (

0#3 + #

() >0+ , Îľ

9 *9>


Journal of Automation, Mobile Robotics & Intelligent Systems

VOLUME 7,

@ 85>

<: *@ 5; *;> *9 @@9 ?? **9 ;; 8 99 8 =2 @ @?>

) 6 / ; , /7 ,

! , "%) , R R 4

1 (@+ (2+ #

"%) ,

< , (<,# + 4;6

$ 4@6 , ) = -

0#3 "%) ( +

1 ( +

! (

* ) =#+

(

@+ ,

) =/ ,

1

:

"%)

zM , ) ; )

S_err = ( Σ_{i=0}^{n} Σ_{j=0}^{m} | e_S^{i,j} − e_C^{i,j} | ) / N_o ,        (9)

where e_S^{i,j} and e_C^{i,j} are the compared map values at cell (i, j) and N_o is the number of overlapping cells; the obtained values were 0.014 and 0.041.

0#3 ) 4 /-/ 1 * -/ # -

, " #

C

1!3 1*3

1A3

0.2 zM [m]

" # @? *> *;? =2 @@2 *? **9 9? * 89 @ @5 @ 85>

0.1 0 1.2 1 0.8 0.6

0.6 0.4

0.4

0.2 0

0.2

yM [m]

−0.2 0

−0.4

x [m] M

0.2 zM [m]

28 88 *58 88 @28 88 **> 88 8 88 8 88 9 *9>

2013

B

!4 6 4 6 4 6 Îą4â—Ś 6 β4â—Ś 6 Îł4â—Ś 6 Îľ4 6

N° 2

0.1 0 1.2 1 0.8 0.6

0.6 0.4

0.4 yM [m]

0.2 0

0.2

−0.2 0

−0.4

x [m] M

B 1!3

1*3

#

/

" #

!



Journal of Automation, Mobile Robotics & Intelligent Systems

,

* @ ,

Ďƒd Ďƒa ,

>8 , !

2 9 )

-

(Ďƒd = 0.1 Ďƒa = 10â—Ś + =

                      σx [ ]    σy [ ]    σz [ ]    σα [°]   σβ [°]    σγ [°]
σd = 0.01, σa = 5°:   0.000004  0.000007  0.000003  0.00036  0.000053  0.00035
σd = 0.1,  σa = 10°:  0.000004  0.000005  0.000002  0.00024  0.000057  0.00044
σd = 0.2,  σa = 50°:  0.018     0.16      0.41      134.61   134.61    108.88

                      σx [ ]    σy [ ]    σz [ ]    σα [°]   σβ [°]    σγ [°]
σd = 0.01, σa = 1°:   0.00001   0.000005  0.000004  0.0004   0.00014   0.00094
σd = 0.1,  σa = 10°:  0.000007  0.000004  0.000004  0.00040  0.00013   0.00058
σd = 0.2,  σa = 50°:  0.083     0.55      0.44      38.16    72.33     34.11

. * - /

,

$

A -

/ C " 1 Îľ

" # / ! ( + , <:

, ! 44

Articles

VOLUME 7,

N° 2

2013

*>

( *888 ! *8+ )

<: ! 23 I :

% -

5 $ % 6 )7 < R $ , 0 < 2# =8 ?=> < R < B < " J *

∗ < R $ , 0 < 2# =8 ?=> < R < B 3 / J ∗

0

* %$8! (< 6 %54 ,

&0& @8**D8*D&D:,;D8@858

# 3 # %* 4 4*6 3 / < : R %

K . < @5(9+ @8** 9?; >@5 4@6 3 / < : R < : " - % % , $ <,# K B # # ( + $ - - : : @8*@ 5? ?= 426

/ / /_ $ ) # : 0 0 : C A $ / K B & ())) (! , ! ! ! ! (, : < $:# @8*@ 2*82 2*85

496 F / : , # 0 0 : K ; ' ! ! ' @8** 4>6 3 # ) F < , $ !6 ! $$ < C @882 4=6 F I % 0 < K B & ())) (! , ! ! # < # *??> *?9@ *?95 4;6 E I 3 < #% K B (! ' $ ! - = ! ! & F @88; @@> @29


Journal of Automation, Mobile Robotics & Intelligent Systems

VOLUME 7,

N° 2

2013

456 F I E : : ' : ) B " : : : 0 K (! . 28(*+ @8** >= ;? 4?6 I "

% 0 - % K B ())) (! , ! ! ! ! : $:# *??* @>=5 @>;2 4*86 < ` a 3 % R < : R , / - % 1

@3 % : K . ! * !6 - / (! ! ' > 2 @8** =; ;5 4**6 E " P " " 3 Q 0 3 H # # ! < 0

0 "

% ) $ " ) K B ())) (! , ! ! (! ! ! ' : 3 $:# @88; 25>9 25>? 4*@6 F , " 1 B K ! =28 *?;5 *8> **= 4*26 I : I C < : ) B # # K B & ())) (! , ! ! ! ! (, 11 @888 ??> *88* 4*96 E E : < E ) : # $ /

$ K B ())): '. (! , ! ! (! ! ! ' ( ' 0199 : ) $:# @8** 222> 229* 4*>6 : ) /

# < #

!

K B 9st (! 7 # $ ! & $ ! - ! 03 % @8*@ 4*=6 : -

:

0

D K B & ())) (! ! 52 ' $ * 3 $:# *??> 9;@ 9;; 4*;6 ) '

F / $ & # : ! 0 0 "

% K B ())) " ! ! & ! ! ! - ! (! ! 29(**+ @8*@ @8?; @*8; 4*56 S H % <

!

( +K B ())) (! , ! ! (! ! ! ' : F @889 @28* @28=


$ ! &

, ) & *& ! $ !

( ! 1 ( , " ) "

= ! < *$ !* '/ .

J=B%= O 1!P#=3 P

9 !

4

= ! ' $ *

,

4* @6

(:"# + 42 96 C

3 % < . 4>6

, 1

( $+

(E,+ A . 4=6 :"# , 1

! E, .

E<:

$

A

, 1 ) $ 4;6 %E/ 3 I

23 E, .

$ I

! # :"# 46

Articles

456 , 1 : E, .

! , :"# , B 1

#C%: 1

1

. ,

:"#

:"# ,

, @ 2 E, .

9 , :"#

+/ 7 4 * -/ # , - ) / , "#/ '2 4?6 () 2+ , 1

" : /

#0 =98 *88

2 >

(*88 + 640Ă— 480 Q: &: , *8 (#C%:+ C #

A

,

0' 4**6


Journal of Automation, Mobile Robotics & Intelligent Systems

VOLUME 7,

N° 2

2013

T_RCi = T_RC · T_MC⁻¹ · T_MCi        (1)

where i indexes the cameras; T_MC, T_MC1, T_MC2, and T_MC3 are the marker-frame poses, and T_RC, T_RC1, T_RC2, and T_RC3 are the corresponding robot-frame transforms.
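Composing the 4×4 homogeneous transforms of Eq. (1) is a one-liner; a minimal sketch (frame names follow the equation, the helper is illustrative):

```python
import numpy as np

def robot_camera_pose(T_RC, T_MC, T_MCi):
    """Eq. (1): T_RCi = T_RC @ inv(T_MC) @ T_MCi -- re-express camera i's
    marker-frame pose T_MCi in the robot frame via the shared central
    pose T_MC and the known robot-to-camera transform T_RC."""
    return T_RC @ np.linalg.inv(T_MC) @ T_MCi

def translation(t):
    """Helper: 4x4 homogeneous transform that only translates by t."""
    T = np.eye(4)
    T[:3, 3] = t
    return T
```

For pure translations the composition reduces to vector addition with the shared offset cancelled, which is an easy sanity check.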

4

< *$ !* '/

A , #C%: #

A # !

, (TM C +

(TRC + ,

(TM C1 TM C2 TM C3 +

() *+ #

A

5 =/ ; 4 7 , /

# *=88 @8

2 > ,

() @+ ,

0' 4**6 , A

(TCC + ,

A

(TP C1 TP C2 TP C3 TP C4 +

() 9+B TP Ci = TM i1CC −1 TM P Ci (@+ i ,

, A

. 0' ! ,

4*86

# " 1 ,

.

1 . . . ) / - 4 * 7

, /

# =98 *88

,

1 A . E, . , 1 E, .

A

:"# ,

0' :

A (TSC1 TSC2 TSC3 +



Journal of Automation, Mobile Robotics & Intelligent Systems

VOLUME 7,

TPC1

N° 2

2013

TPC2

TPC3

TPC4

TSC3

TMCC TSC1

TMPC4

TMSC4

TSC2

(Fig. 4):

T_SCi = T_MCC⁻¹ · T_MSCi        (3)

i . 9 / - ! 7 =

: 4*@6 1 * , () =+

) # ;/ ' ) 4 /0 ,

!

, A

, 1 , < , < (<,<+ <,< *>55 @88@ ( +

*>55 @885 (<,< @+ 4*26


, 3 ,

<,< ,

<,<

() ;+

. * - /

, A .

, A .

#C%:

1

, :"# # ! A

) > ,

#

! !

1


Journal of Automation, Mobile Robotics & Intelligent Systems

VOLUME 7,

Peripheral camera 1

N° 2

2013

Peripheral camera 2

Central camera

Peripheral camera 3

Peripheral camera 4

Robot camera 1

Robot camera 2

Robot camera 3

Static camera 1

Static camera 2

Static camera 3

B

L



Journal of Automation, Mobile Robotics & Intelligent Systems

Camera parameters (magnitudes as recovered from the text; signs of the distortion coefficients were lost in extraction):

Camera  fx      fy      cx      cy      k1     k2     p1     p2     k3     k4     k5     k6
RC1     640.20  640.67  332.47  258.15  0.112  0.210  0.002  0.001  1.810  0.288  0.011  1.873
RC2     542.57  542.59  319.18  234.60  0.055  0.584  0.004  0.001  1.187  0.051  0.266  1.436
RC3     651.57  652.25  355.39  226.25  0.106  2.148  0.001  0.001  1.358  0.290  1.986  0.714
CC      845.49  845.05  806.61  588.19  0.014  0.044  0.001  0.001  0.082  0.052  0.058  0.132
PC1     827.56  827.60  803.00  588.28  1.555  0.102  0.001  0.001  0.029  1.622  0.016  0.071
PC2     835.75  836.15  791.47  609.57  1.022  0.224  0.001  0.001  0.245  1.086  0.082  0.180
PC3     829.48  830.40  799.16  606.89  0.015  0.097  0.001  0.001  0.017  0.035  0.023  0.079
PC4     829.14  828.61  812.27  632.35  0.043  0.076  0.001  0.001  0.070  0.011  0.057  0.144
SC1     651.19  651.98  340.51  257.77  0.188  1.113  0.001  0.001  0.220  0.229  0.769  0.975
SC2     650.28  651.30  342.45  234.85  0.179  1.373  0.001  0.001  0.568  0.203  1.233  1.089
SC3     647.69  647.60  326.33  259.68  0.185  0.218  0.001  0.001  5.085  0.226  0.720  6.399

Reprojection errors per camera (three statistics per camera; the column labels were lost in extraction):

Camera  (1)     (2)     (3)
RC1     0.051   0.350   0.162
RC2     0.046   0.403   0.138
RC3     0.052   0.302   0.158
CC      0.080   0.375   0.189
PC1     0.066   0.240   0.132
PC2     0.077   0.312   0.168
PC3     0.113   0.304   0.218
PC4     0.057   0.255   0.142
SC1     0.070   0.257   0.132
SC2     0.049   0.300   0.117
SC3     0.053   0.382   0.149

9

= 4 ) B 4

" ∗ < R $ , 0 < 2# =8 ?=> < R < B # : J ,

# & < R $ , 0 < 2# =8 ?=> < R < B I J , " 3 $ < R $ , 0 < 2# =8 ?=> < R < B ) J - $ < R $ , 0 < 2# =8 ?=> < R < B H 3 J ∗

50

0

Articles

* %$8! (< 6 %54 # : I b )

. :

<C 3 . - A K :

5 @ @ C 0 < $ : ) ,

& : 0 3 0 @8**D8*D&D:,;D8>?98

# 3 # %* 4 4*6 3 : ) ) ' B < , ) 28 P ) K ())) ! ! - F ! *5(9+ @8** 58 ?@ 4@6 ) ) 3 : ' B < %

#


Journal of Automation, Mobile Robotics & Intelligent Systems

VOLUME 7,

N° 2

2013

K ())) ! ! - F ! *?(@+ @8*@ ;5 ?8 426 # F 3 % & :

:"# B % , : 0 :"# K ())) " ! & -( @?(=+ @88; *8>@ *8=; 496 # : # I

R , ' :"# : C ! % K ! , * $ ' ! =2;> @8*8 @=8 @=; 4>6 : 0 E ) # E 3 3 3 % 3 E : < , %#-: 3: K ! . ! @;(9+ @88? 2>2 2;* 4=6 : # 3 0 # F : H ,

:"# K B & ! 011> ())) (! ! ! , ! ! ! ! ! @88= *>9@ *>9; 4;6 F : & ) - / 3 0 , %E/ 3 :"# K B & C4* 7 #* $ ! ! ! ! $ , * ; ' ! ! ' , ! '' @8** 456 F ) , < # ) ' :"# K B & 4 - ! ! , ! ! @88? 4?6 ?iiT,ffrrrXrB7B#QiX+QK 4*86 % C 0 & " I & % K (!* ! ! . ! , $ ! *2(2+ *??9 22* 2>= 4**6 ?iiT,ffQT2M+pXQ`; 4*@6 % / - . . .

. K < 3 ,

< $ , @88; 4*26 I " F *>55 : < 0 : < &

0 : K B 3@th !! & " ! " (! &""( - ! @88@ ?5 *8>



Journal of Automation, Mobile Robotics & Intelligent Systems

VOLUME 7,

N° 2

2013

Communication Atmosphere in Humans and Robots Interaction Based on the Concept of Fuzzy Atmosfield Generated by Emotional States of Humans and Robots
Submitted: 6th September 2012; accepted 6th December 2012

Zhen-Tao Liu, Min Wu, Dan-Yun Li, Lue-Feng Chen, Fang-Yan Dong, Yoichi Yamazaki, and Kaoru Hirota

Abstract: Communication atmosphere based on the emotional states of humans and robots is modeled using the Fuzzy Atmosfield (FA), where human emotion is estimated from bimodal communication cues (i.e., speech and gesture) using weighted fusion and fuzzy logic, and robot emotion is generated by emotional expression synthesis. This makes it possible to quantitatively express the overall affective expression of individuals, and helps to facilitate smooth communication in human-robot interaction. Experiments in a household environment are performed by four humans and five eye robots, where emotion recognition of humans based on bimodal cues achieves 84% accuracy on average, an improvement of about 10% over using speech alone. Experimental results from the model of communication atmosphere based on the FA are evaluated by comparison with questionnaire surveys, from which the maximum error of 0.25 and the minimum correlation coefficient of 0.72 for the three axes of the FA confirm the validity of the proposal. In ongoing work, an atmosphere representation system is being planned for casual communication between humans and robots, taking into account multiple emotional modalities such as speech, gesture, and music. Keywords: human-robot interaction, communication atmosphere, fuzzy logic, emotion recognition

1. Introduction

c c c c c c c c c c(C% + c c c c c c

c c

c c c c c c c c c c c c c c c c c c c c c . c c c ! c c c c c4*6 c, c c c c c c c c c c c c c c c c c c4@6c426c

D c c c c496c4>6c c c ! c c c c c c c c c c c c c c4=6 c, c c c

c c c c0 c c 1 c c c c c c c c c c c c !

c c c c c

c c c c c ) c c c c c c c c c c c c c c c c c c c c c c c c ! c

c c

c c c c c c c c

c c c c c, c c c 52

Articles

c c c c c )#c c c c c c c c c c c c c c c c4;6c4@?6 c c c c c c c c c c c c c c c c

c

c 0 c c c C% c c c

c c c c c c c c c c c)#c4;6c4@?6 c c c c c c

c c c c c c c c c c c c c c c c c c c c c c c c c

c c

c c c d

ec d ec d ec d ec d ec d

ec c d ec c c c

c c c c c c# <

#

c(#<#+c c c456Tc: c c c c c c c c c c c c c c c c c c c c c 1 c)8 c c c c c c

c c c , c c c c c c c c c c c c c

c c c c c c 496 c c c !

c c4?6 c, c c c c c c c c c c)#c c c

c c c 1 c c c c c

c c c 23c c c d) C ec d" 0 ec c d0

) ec ! c 4;6 c 4@?6c c c c c c c c c c#<#c c c c c)# c c c c c

c c c c c c c c c c c c c c c)# , c c c c c c

c c !

c c c c c c c c c 1 c c c c c c c c c c c c c c c c c # c c c c c c c c c . c c c !

c c c c c c c c c c

c c c ! c c c

c c c c c c c c c c c c c c c c c c

c c c c c c c c c c c c c c c c c c c ! c c c c c c !c

c c c c c c c c c c c

c% c: c( %:+c4*86c c c c c c c c cc c c ! c d

ec d ec d ec d

ec c d ec c c c !

c c


Journal of Automation, Mobile Robotics & Intelligent Systems

Fig. 1. Mascot Robot System in a home party environment [29] c c c c c c c c ) c c !

c c c c d

ec d

ec cd ec c c c c c c c c !

c c c c c c c c c456 c, c c c c c c c c c

c c c c c4*6 c4**6c c c c4*@6 c ! c c c c c c c c

c c c)#c c c c c c . c1 c c c

c c c

c c c c %:c c c c c

c c

c 1 c c c c c c c c c@ c c c c c

c c

c c c c c c c c c

c c 2 c c 9 c c c

c c c c c c c c c c c c c )# c ! c c c c c c c )#c

c c c c c c c c c c c

c c c>

2. 6ascot #obot 4ystem

c% c: c c c c c

c c c

c c c c c c c c c c c c c c c

c c c c c

VOLUME 7,

N° 2

2013

d3 c < . c c c 0 c /

c c & ! E c % ec c c & 3 c : c c c< c c: c4*86 , c %:c c c c c c

c c c c c c c c ! c c c c

c c c c c c c c c ,' c c c c c c c c c c c c c c c

c c) c* c c c c c c c c c , c c c c c

c c c c c c c c c c c c c , c c c c c ! c c c c c c c c "#&c c c c c c c c c

c

c"#& c, c c c c c %:c c

c c) c@ c ccc% c, c c(%, +c c c c c c c c c c c %, & c 4**6 c c c %, & c c c c c c

c c c c c c c c c c c c c% c, c0 c (%,0+ c , c c c c c c %, & c c c c c c (:% + c c c c(E% + c c c c( %0 + c c

c c ( < + c c c c c c c

c c c c c c c4*@6 c, c < c c c c c c c:% c c cE% c c c c c c c c c c c c

c c

c c c c c c c c c # c c c c c c c c c c c c c c c c %0 c c !

c c c c c c c c ccc) c c %: c c c

c c

c c c c c c c c c c c c c c c) c c c c c c c c c c c c c c c c c c c !

c c c c c c c c c c

3. Emotional States of Humans and Eye Robots

3.1 Affinity-Pleasure-Arousal Emotion Space

Fig. 2. Block diagram of the Mascot Robot System

c c c c c

c c c c 1 c c c c c c c !

c c c c c c c

c c c c , c !

c c c c c c c c c c c %: c c 23c c

c c # <

#

c c c 456c c c

c c c) c2 c$ c#<#c c c c c c c

c c !

c ! c c c c c c c c c c c c c c c c c c c456 , c c c c #<#c c c c c

Articles

53


Journal of Automation, Mobile Robotics & Intelligent Systems

VOLUME 7,


Fig. 3. Affinity-Pleasure-Arousal emotion space [8]

E = (e_affinity, e_pleasure, e_arousal),  ∀e ∈ [−1, 1],        (1)

E(t) = { E_Initial + ΔE,   t = 1
       { E(t−1) + ΔE,      t > 1        (2)


Fig. 4. Emotion recognition based on bimodal information


Following [17], the emotional states "happiness", "surprise", "anger", "sadness", and "neutral" are distinguished.

c c c c c c c: c* Keywords 1


where E_Initial is the initial emotional state and ΔE the increment inferred from the bimodal cues. Each component is kept in range: if e_i(t−1) + Δe_i > 1 then e_i(t) = 1, and if e_i(t−1) + Δe_i < −1 then e_i(t) = −1, for i = affinity, pleasure, arousal.
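The clipped update of Eqs. (1)–(2) is small enough to sketch directly; a minimal illustration (the function name is hypothetical):

```python
def update_emotion(E_prev, dE):
    """E(t) = E(t-1) + dE in the Affinity-Pleasure-Arousal space,
    with each component clipped to the admissible range [-1, 1]."""
    return tuple(max(-1.0, min(1.0, e + d)) for e, d in zip(E_prev, dE))
```

For example, an increment that would push affinity above 1 or arousal below −1 is saturated at the boundary.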

3.2 Emotion Recognition of Humans Based on Bimodal Cues

c c c c c c c c !

c c c c c c c4*9 *=6 c c c %: c 1 c c c c c c c c c c c !

c c c C c c c

c c c c c c c c c c ) c 9c

c c c c c c c c c c %: c c c c c Bc Stepc*Bc c c c c c c

c c c c c c c c c c c c c c c

2013

Stepc@Bc: c c c D c c c c c c c c c #<#c c c c c

c c c c c

c c c c c c c Stepc 2Bc # c c

c c tG* c c c c c c c %: c c. c c: c* c c c. c c: c@c c c c c c c , c c c c c : c 2c c

c c c c c c c ! c c c c

c c c c c

c c c c c c

c c c c c, c c c c c c c c c c c

c c c c c . c c : c * c # c c c

c c c c c c

cEc c c c c ceafAinity cepleasure c c earousalc c c c c d# ec d<

3

ec cd#

: ec ! c d# ec ! c c c c c c c( c +c c c( c +Tcd<

3

ec ! c c

c ( c +c c

c ( c +Tc c d#

: ec ! c !

c ! c ( c +c c c( c +c4*26 ' c c c c c

c c c

c c

c c c 23c c c

c c c c c c c c cE(t−*+ cE(t+ c cE(tG*+c c) c2 c c c c c

c c c c c c c c c

c c c c c

c c c c c c

c c c c c c c c

c c c

N° 2


Fig. 5. Emotion recognition by weighted fusion

3.2.1 4eman c Cues from mo onal 4peech % c c c c c c c c c c c c c c 4*56c 4*?6 c c c c c

c c c c c

c c c c

c c c c c c c c ! c c c c c c:% c c c @ c c c c −c F c 4@86c c c c c c, c c c . c

c c 1 c c c c c c c c c c c c

c c

c4@86 , c c c c c c c c c c c c c c c c c

c c c c c c !

c c c c g c c 4@*6 c, c c c c c c c c c c c :% c c c c

c c c c c c c c, c c c c c Dc

cwc


Journal of Automation, Mobile Robotics & Intelligent Systems

c c c c c c cemotionic c c

cP(emotionihw+ c cPĂŽ48 c*6 cÎŁP(emotionihw+7* c i7d

ec d ec d ec d ec d ec d

ec c d ec P(emotionihw+c c c c c c c

c c c 1 c c c c c c c c c c c

c c c c c c4@@6 3.2.2 mo onal Gesture #ecogni on # c c c c c c c c c c c ! c c c c ! c c c c c c c c c c c c c c ! c c 4@*6 c # c c c c c c c c c c c c c4@26 c # c c c c c c c %:c c c c c c

c c c c c c c c c c c c23c c c c c c c c c c gc c c 4*@6 c , c c c c c c c c c c

c c c c c c c c c, c c c c c c c c c c) c0 1 c c c c c c c c c c c4*@6 c c c

c c

c c c c c c c cgc c c c c c c cemotioni c c

cP(emotionihg+ PĂŽ48 c*6 cÎŁP(emotionihg+7* i7d

ecd ec d ecd ecd ecd

ec cd e 3.2.3 mo on #ecogni on by Weighted 3usion - c c c c c c 1 c c

c c c c c c c c c

c c c c c c c c , c

c c c c c c c c c c c c c c c c c c c c c c c c c c4*6c c

c c c c c c c c c c c c c c c

c c c c c c c c

c

c c) c> c c c c c c c c c

c

c c c

c c c c c c* c , c c c c c c c c

c c c c c c c

c c c c c c c emotionic c c ! c c c c c

c c c c c c c

(2+ cEmotionc c c c c c c c c c c c c c cNc c c c c c c c

cMc c c c c c c c c c c c c c c c c

c c c c c c c . c c c c
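The weighted-fusion rule (Eq. (3)) selects the emotion with the maximum fused degree over the speech and gesture recognizers; a small sketch under that reading, with illustrative weights and probability tables:

```python
def fuse_emotions(p_speech, p_gesture, w_speech=0.5, w_gesture=0.5):
    """Recognized emotion = argmax over labels of
    w_speech * P(emotion|speech) + w_gesture * P(emotion|gesture)."""
    labels = set(p_speech) | set(p_gesture)
    def score(e):
        return (w_speech * p_speech.get(e, 0.0)
                + w_gesture * p_gesture.get(e, 0.0))
    return max(labels, key=score)
```

Shifting the weights toward one modality lets that recognizer dominate, which is how the fusion can recover when the other cue is unreliable.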

VOLUME 7,

N° 2

2013

, c c c c c c c c c#<#c c c c c( c@8c c 28c c c c c !c + c c c c 1 c c c c 1 c ic C c d# eDd<

eDc d#

ec c c c c c c4*26jc: c c c c c c c1 c c ! c c

c cd#

: ec c c c c c c ! c c c c c c * ! c : c @ ' c : c 2 : c 9 c > #

c = ' c #

c c ; ! c #

c c c c

c c c c c

c c *c ( *+ c @c ( 8 ==+ c 2c ( 8 22+ c 9c (8+ c >c (8 22+ c =c (8 ==+ c c;c(*+ : c c c c c c c c c c c c c c c c c c c c c c c c ! c0 c c c4@96c c c c c

c c c c cd

ec ! c c c

c c c c c c c c c c c c c c c c c c c c !c c c c

c c c c#<#c c c c c c c c c c c1 c c c c c c c c c c c c c c c c c %: c , c c c c c 1 c

c c c

c c c c c d

ec(8 >5 c8 =2 c8 >9+ cd ec(8 @; c8 9? c 8 >5+ c d ec ( 8 22 c 8 2 c 8 @*+ c d ec ( 8 =; c 8 ;@ c 8 =>+ c d ec ( 8 29 c 8 >? c 8 @;+ c c d

ec ( 8 9; c 8 99 c 8 22+ 3.2.4 5ransi on of mo onal 4tates in ' mo on 4pace < c c c c c c c c c c c c c c c

c c c c c

c c c c c c

c c4@>6 c c c c c c c c c c c c c c c c c c c c c c c c4@=6 c) c ! cE7(8 >5 c8 =2 c8 >9+c c c c

c c c

c c c c c c c c kEc c c c c c c c

c c c 1 c c cke c c c c c8Vkel8 * c c c

c c c c c 8 *lkeV8 c c c

c c c , c c kE c c c c c c

c c c 1 c F0 c c c c c c c) c c c c c c c c c c c c c c#<#c c c c c c c c c c c tG*c c c c ct c

ravolume c c cF0c c

c c c c c( c*@8cC c c c@88cC c c c288cC c c +c c

craF0 c cc c c c c c c c

ctintervalc

c ! c , c c c c c

c c c c c c c ! c c c c c#<#c c c c c c c c c

c c c) c=.

c c c c c c cΔeafAinity c Δepleasure c cΔearousalc c c c) c; c c c Articles

55



Fig. 6. Transition of emotional state by using fuzzy inference


Fig. 7. Membership functions for the acoustic features of speech and ΔE. Small (S), Medium (M), and Large (L) are used for ra_volume, ra_F0, and t_interval; Negative (N), Neutral (NT), and Positive (P) for Δe_affinity, Δe_pleasure, and Δe_arousal. The ra_volume sets are anchored at typical speech levels of about 30 dB, 40 dB, 50 dB, and 60 dB [27].

c < c (<+c c c c c cravolumec c c c c c c c c c ! c28c /c7c c98c /c7c1 c c>8c /c7c c c c=8c /c7c c

c4@;6 c$ c98c /c

c c

c c c*8c /c c c

c" c c c*8c /c c c

: c c) c;c( + c c craF0c c c

c c c c cF0 c c ! cF0c c c c c c c*88c c*>8cC TcF0c c c *;8c c @@8c C c c c c c 4@56 c 0 c c c c c c

c @8c C c c c

c " c c

c @8c C c c c

c: c c) c;c( + c c ctintervalc c c c c c c c c c c c c c

c c8 >c c c c c

c: c c c

c *c c c c c

c c c c c* >c c c c c

c" , c c c c cΔE c cΔeafAinity c Δepleasure c c Δearousalc

c c c c c

c c , c * c , c @ c c , c 2 c c , cd ) ,C &ecc cc cc cc cc c cc cc cc cc cc cc c c c c c c c c c

c c#<#c c c) c ! c c ravolume raF0

c c c c c

c c c, c c c c c c ΔeafAinity cΔepleasure c cΔearousalc c#<#c c c c c c c c c c c c

3.3 Emotional Expression Synthesis of Eye Robots

c c c c c #<#c c c c c c c

c c c c c c c c c c456 c c c c

c c

c c c c c c c c c

c c c c c c c c c c c 1 c c c c1 c c ) c c c c c c c c c c c %: c c c c c c c) c5 c , c c c c 1 c c c c c c c

c c c c c c c

c c c c c(3 ) +c c c c

c c3 ) c456 c / c c c c c c c c c c c c c c !

c c

c c c c

cd

ecd

ec d ec cd e c c c c c c c c c c c c Ec c #<#c c

c, c c c c c c c c c c c c c c

c c c c c c

c c c

c

c c c c456 c

c c c) c? c c c c c c c c c cLlc c 1 ctl c c c c c c c c c c c cLp c c c Ly c c 1 to c#c c c c c c c c c

c c c<

#

c c c c456c c c c, c9 Table 1. Fuzzy rules for

Δeaffinity me

Arousal

Sleep

ra volu

ΔE

Feature Fundamental Fuzzy Extraction Frequency F 0 Inference

Speech Signal

S M L

Table 2. Fuzzy rules for

Δepleasure S M L

56

Articles

me

ra volu

Δearousal S M L

L NT P P

Δepleasure ra F0 M NT NT P

S N N NT

S NT P P

Δeaffinity

ra F0 M N NT P

S N NT NT

Table 3. Fuzzy rules for

Fig. 8. Structure of an eye robot [8]

2013

c c c c c c c c: c2c c2 @ c c c c c c c c c ! c c c#<#c c c c c ckE c c c c c c c c c , c c c c c c c c c c

c c c c c c

c c c c ! c

  t_interval = S:   NT   P    P
  t_interval = M:   N    NT   P
  t_interval = L:   N    N    NT


Journal of Automation, Mobile Robotics & Intelligent Systems

4. Communication Atmosphere Based on Emotional States of Humans and Eye Robots

c c ! c c c c c c c4@?6 c c c c c c c c

c c c c c c c c c c c c c c c, c)#c4;6c4@?6c c c

c c c c !

c c c

c c c

c c

c c c c c c c

c c c c c c c c c c c c c c c c c c

VOLUME 7,

N° 2

2013

Table 4. Look-up table for eye motion parameters based on Pleasure-Arousal plane [8] (columns: l [N], p [N], y [N], t_l [s], t_o [s])

c *>8c *@8c ?8c =8c *t> *8 *> =t5 *9c *2c *@ *;t@8c c 98c @8c 8c @8c > *8 *>c 2 9 5 ? *9c *2c @ = *@ *;t@8 c 98c @8c * = ** *= @*c @ ; *@ *; @@c c @ >c @c * @c *c *t>c =t*8c **t*>c *>t@8c c 8 *c 8 2c 8 9c 8 >c *>t@8c **t*>c =t*8c *t>c

28c c 98c c 8c c 8 >c c *c c

c c c c cd) ccC ecd" 0 ec

cd0

) ec ! c

sual

4.2 Model of Communication Atmosphere Based on the FA for Human-Robot Interaction

Fig. 9. Twenty-five partitions of Pleasure-Arousal plane [8]

4.1 Fuzzy Atmosfield Described in 3D Space

"Atmosfield" [29] is a coined word combining "Atmosphere" and "Field".

c c ! c c c c c c c c c 3 c c c c c c c c # c c c c c

cFuzzy Atmosfield c , c )#c c c c 23c c c d) C ec d" 0 ec c d0

) ec ! c

c

c c) c*8 c , c c c c)#c c c23c c c c

c

FA = (a_friendly, a_lively, a_casual),  ∀a ∈ [−1, 1],        (4)

where FA denotes the state of the Fuzzy Atmosfield, and a_friendly, a_lively, and a_casual are its coordinates along the "friendly-hostile", "lively-calm", and "casual-formal" axes.


, c c c c )#c

c c c

c c c c c c c c c c %:c c c c c4*6c c c c) c** Stepc*Bc c c c c c c c c c c c c c c

c c c c c c c c c c !

c c Stepc @Bc , c c c c c c c c c c c c c c c c c c c c c c c !

c c c c#<#c c c c c )# c c ! c c c c c c c c c c c c c c c c c c c c c c)# Stepc 2Bc , c c c c c

c( c c c c)#+c c c c23c c c c)# , c c c c c

c c c )#c 4@?6c c c c c c %:c c c

FA(t) = { f(E_H1(t), …, E_H4(t), E_R1(t), …, E_R5(t)),                            t = 1
        { (1 − λ) · FA(t−1) · γ + λ · f(E_H1(t), …, E_H4(t), E_R1(t), …, E_R5(t)),  t = 2, 3, …, m        (5)

where f fuses the emotional states E_H(t) of the four humans and E_R(t) of the five eye robots; γ (0 ≤ γ ≤ 1) is a decay factor applied to the previous atmosphere FA(t−1); λ (0 ≤ λ ≤ 1) weights the current emotional states against the history; and m is the number of time steps.

c c c c c)#TcÎťc c c c c8cÂŁ ÎťcÂŁ * ) c c c c c c c c c c c cf ccccccccc 4

FA (t) FA (t-1) Friendly Formal

Fig. 10. Fuzzy Atmosfield expressed in 3D space [7]

∑w

5

⋅ defuzzy ( EHi (t ) R ) + ∑ wRj ⋅ defuzzy ( ERj (t ) R )c (=+ i =1 j =1 c Rc c c c c #<#c c

c c c)# c c c c c c c c c c c;>c c c4@?6 c

c c c) c*@TcE ( t )c c c c c c c c c c Tc c c c c c c c

c f :

Hi





Casual

The fusion weights are obtained by normalizing the individual speech volumes,

w_i = v_i / Σ_{j=1}^{n} v_j,   i = 1, 2, …, n,        (7)

where v_j denotes the speech volume of participant j.

, c c cÎťc c 1 c(>+c c c

Pleasure

Speech

Emotional states of 4 humans

λ = ⎧ 0,                                         if Σᵢ vᵢ(t) = 0
    ⎪ 1,                                         if Σᵢ vᵢ(t−1) = 0
    ⎨ (1/2) · ra_volume,                         if ra_volume ≤ 1
    ⎩ 1 − (1/(2σ)) · exp(−(ra_volume − 1)²),     if ra_volume > 1        (8)

Fig. 11. Framework of the FA based communication atmosphere in four humans to five eye robots interaction

Here defuzzy(·) denotes defuzzification, σ controls the saturation rate, and the weights satisfy ∀w ∈ [0, 1], Σ_{i=1}^{4} w_Hi + Σ_{j=1}^{5} w_Rj = 1.

c

5

c

c) c*@ c c c c c#<#c c c c c )#c c c c 4@?6 c , c c c c c c cd# ec ! c ( c N c N t c c P + c c c c cd<

3

ec ! c( cH cD

c L c D

c N t c L c P

c H c P

+c c d#

: ec ! c ( c H c S cL cS cN t cL cA

cH cA

+ c c, c c c c c c

! c c c)# c cE! cFr cV cFr c Fr cN t cH cV cH c cE! cH c cd) C ec ! TcE! cL c V c L c L c N t c C c V c C c c E! cC c cd" 0 ec ! TcE! cC

cV cC

cC

cN t cF cV cF c cE! cF c cd0

) ec ! c , c c c c c c

c

c c

c c c c c c c c c) c*@ c c c %: c c c c c c c c c c c c c !

c c c c c c c c c c c c c c c c c c : c c c c c c c c c c c c c c c c c c c c c c c 4@?6 c c c c c c c c ! c c

c c c c c c c ! c , c N

1 NT

-1 LD

EH

P

0 HD

1 eaffinity

1 NT LP

cc

-1

0

-0.5 HS

LS

-0.5

0.5

1 NT LA

0

1 epleasure HA

0.5

-1

Fig. 12. Fuzzy system for mapping Affinity-Pleasure-Arousal emotion space to Fuzzy Atmosfield

ra_volume = Σ_{i=1}^{n} v_i(t) / Σ_{i=1}^{n} v_i(t−1),        (9)

the ratio of the participants' total speech volume at the current time step to that at the previous one.
5. Experiments on the Communication Atmosphere in Mascot Robot System

5.1 Experimental Environment

VH

VC

-1 EF

1 earousal

ravolume =

$ c c %:c (

c % c : +c 4*86 c c c c c c c c

c c c )#c c c c c c c c ! c c c c( cC cE c * cE c@ c cE c2+c c c c c c% c*c(,'+ c% c@c( c c + c% c2c( c + c% c9c( c + c c% c>c ( c + c# c c c c c c c c1 c c c c c c c c c : !c c c c c c ! c : c*BcdE c c c c ec: c@Bcd3 c c c c ec : c 2Bc d< c ec : c H EFR VFR 1 NT FR 9Bc d- c ,' ec : c >Bc dS c c c c c ec : c 0 1 a =Bc d) c c c c VL C NT EL L 1 c ec c c c

c c c c c c c ) c *2c 1 a 0

c c c c: c@ c VCA F ECA 1 NT CA c c

c c C cE c* cE c@ c% c9 c 1 a 0

c% c>c c c c c c c c c c c, c c c: c@c c

c c) c*9 friendly

EC

∑ v (t ) = 0

n

-1

HP

n

if

ccc(5+ cĎƒ c c c c c c c c ravolumec c Îť c Ďƒ c c c @c c c c ! c c c c c

Fuzzy System

58

∑

vi

Displeasure

Arousal Affinity Sleep

-1

2013

c c c c c c c c wic c c c c c 1 c(=+c

Emotional states of 5 eye robots

Displeasure

Affinity

N° 2

lively

VF

casual


Journal of Automation, Mobile Robotics & Intelligent Systems

VOLUME 7,

) c c ( c c c c c c F c c c ' c c c @@c

c 28+c c c c c c c c c c c c c c c c c c c c c c ! c c c c

c c c c !

c c c c !c c c c c # c c c c c c c c c ! c c c c c c c c c c c c 4*@6c c c , c c c d

ec (E*+ c d c ec (E@+ c d c ec (E2+ c d c ec (E9+ c d 1 c c c c ec (E>+ c d c ec (E=+ c d ec (E;+ c c d ec (E5+ c , c c c c c c c !

c c c c c cd

ec cE* cE2 cE9 cE; c

cE5 cd

ec cE>c cE= c cd ec cE@ c # c c c c c c c c c c c c c c c 1 c c c c Tc c c c c c c c c c c c c ! c c c c !c c c c c 2 @ c c d

ec d ec d ecd

ec cd ec c c c c c c: c c c c c . c c c %:c c c c c c c c c c c c c d ec c d

ec c c !

c c c c c

c d ec c d ec c c c c c c , c c c c ! c c c c

c : &Pc C3% $Q*c c c c c cc c c c c c . c c c c c c c c

c c #' c c c ;@8 958c ! c c c28c c c, c c c c c c99*88cC c*=c c c cC c c c c c c c c c c c4*@6 c, c c c c c c c c c! c c c c ! c c c c c c c c c / c c4*@6 c

5.2 Experiments on Human Emotion Recognition

c c c c ! c c c > * c c c c c c9c c c>c c c c c c c c c

c c c c c c c c c

c c !

c ! c c c c c c !

c c c c c c c c c c c c c c c c ic c c c

c c c c c c c c

c c c ! c c c c c c c c, c>c c, c = c c c c c c c c c c c c c c c c, c c


Fig. 13. A photo of Scenario 2

c c c c c c c1 c c c c c c c c c ( c c @@c c 28c c c c c + c, c1 c cd c c c c c !

je c c c

c c cd

ecd ecd ec d

ec cd ec, c . c c c c

c

c c c c c c

c c c c c c c ;9 2=Oc c c

c c c , c> c, c=c c c c c

c c c c c c 59 *=Oc c c

c c c ! c *8Oc c c c c c c / c c c c c c ! c c c c c c !

c c c c c c c d

ec d

ec c d ec ! c c cc c c cd

ec cd

ec

c c c c c c c, c>c

c, c= c cd ec c c c c c c c

c c c c !

c c c c c

c

c c

c c c c c c c

Table 5. Confusion matrix of emotion recognition based on speech

            Happiness  Surprise  Anger  Sadness  Neutral  Recognition rate
Happiness       33         0       0       0       11        75%
Surprise         0         9       0       0        2        81.8%
Anger            0         0       3       0        1        75%
Sadness          0         0       1       2        2        40%
Neutral          0         0       0       0       10        100%
Average recognition rate                                     74.36%

Table 6. Confusion matrix of emotion recognition based on bimodal cues

            Happiness  Surprise  Anger  Sadness  Neutral  Recognition rate
Happiness       37         1       0       0        6        84%
Surprise         0         9       0       0        2        81.8%
Anger            0         0       3       0        1        75%
Sadness          0         0       0       4        1        80%
Neutral          0         0       0       0       10        100%
Average recognition rate                                     84.16%
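The rates reported in Table 5 follow directly from the confusion-matrix rows: each per-class rate is the diagonal count divided by the row total, and the average is the mean of the five per-class rates. A quick Python check, with the speech-only matrix transcribed by hand:

```python
# Recompute the Table 5 recognition rates: diagonal entry over row total,
# with the average taken as the mean of the five per-class rates.
rows = ["Happiness", "Surprise", "Anger", "Sadness", "Neutral"]
M = [
    [33, 0, 0, 0, 11],   # Happiness: 33 of 44 correct
    [0, 9, 0, 0, 2],     # Surprise:   9 of 11 correct
    [0, 0, 3, 0, 1],     # Anger:      3 of  4 correct
    [0, 0, 1, 2, 2],     # Sadness:    2 of  5 correct
    [0, 0, 0, 0, 10],    # Neutral:   10 of 10 correct
]
rates = [M[i][i] / sum(M[i]) for i in range(len(M))]
for name, r in zip(rows, rates):
    print(f"{name}: {100 * r:.1f}%")
print(f"Average: {100 * sum(rates) / len(rates):.2f}%")  # 74.36%
```

The same computation applied to Table 6 reproduces the 84.16% bimodal average.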



Host: Cheers! (toast)
Robot 4 & 5: Cheers! [happy]
Guest 1: Cheers! (toast)
Guest 2: Cheers! (toast)
Guest 1: It's delicious.
Robot 4: That's good. [happy]
Guest 2: Let's do something.
Host: Yeah.
Guest 1: What is that?
Host: Ah, that's a darts game. Do you want to play?
Guest 1: Yeah, why not?
Host: OK, let's go. (guiding)

( ): Gesture   [ ]: Emotion
Fragment 1  Fragment 2  Fragment 3  Fragment 4  Fragment 5

Fig. 14. The dialog of Scenario 2

c , c > c c c

c c d

ec c c c

c d ec c

c c c c c

c c c c c c

c c !

c c c c c c c c c c c c c c c c c c c- c c

c c c c c c c c cd

ec c c c?O c c 59O c d# ec c d

ec c !

c c c c c c c c c c c ! c c c c c c c c cd ec cd

ec c

c c c c c) c ! c c: c2 c cC c c c c c c c c

c d c no ec c c c c ( c no+ c c

c c d ec c c cd

ec c c

c c c c c ( c no+c c c ( c d c ec c cd

ec + c cd

ec c c

c c c

c c c c

5.3 Emotion Synthesis of Eye Robots

c c c 2 2c

c c c c c, c c c c c c c c

c c% c*Bcd c Tec% c@Bcd c Tec% c2Bcd c Tec% c9Bc d c Tec c% c>Bcd c e c c ! c d

ec d

ec c d ec c c c c !

c c c c , c d

ec c c c c c c c c c c c c c !

c c c c c c c c, c; c, c c c c c c c c c c c c c c c !

c c c ! c c c c c c

c c c c c c c c c c c c c c c c ) c !

c c : c @ c c C c E c * c c E c @c

c

c c c c

cdCheers ec% c9c

c% c>c cdCheersec c c c c c c c d ec c c c c c c c !


# c c c2 @ 2 c cd

ec c c# <

#

c c c c

c ( 8 9; c 8 99 c 8 22+ c c c c c c< c@*c c c) c? c# c c c c c c, c9 c c c c c c cLl728N cLp=-98N cLy=98N tl=8 > c

to=* c

c c c) c*>c( + , c c c c c c c c ! c c c c c c c c c c c c c c c c c c c c c

c c c) c*2

5.4 Experiments on the Communication Atmosphere using the FA in Human-Robot Interaction

< c c c ! c c

c c c c 4@?6 c c c c c c ! c c c c c c c c c c

c !c c( cT 7c= c cÎł c c 1 (>++c c c c ! c c c c c c ! c c

c c c c c c c c c !c c c

c c c c c, c c c c c c c c c c c c c c c c c c c c c

c c c c c c c c c c c ( c c

c c c c c c 1 (@++ c c c ! c c c c c

c c c) c*= c c ! c c: c*c

c: c= c c c c c c

c c c c c c c c c Tc c: c2c( c c c c + c c c c c c

c Tc c c ! c c c c c c c c c1 c c c

c c c c c c c: c> , c c c c c ! c c c ! c c c: c@c c c

c c ! c c c)#c c

c c c c cC c (EH+ cE c*c(EG1+ cE c@c(EG2+ c c% c9c(ER4+c c % c>c(ER5+c c c c

c c c c , c: c c c c c c

c c c ) c*9 c c c) c* c c ccheersc c c cd

ec c c c cd e c c c c c EH(8 >5 c 8 =2 c 8 >9+ c EG1(8 >5 c 8 =cc2 c 8 >9+ c EG2(8 >5 c 8 =2 c 8 >9+ c ER4(8 > c 8 = c 8 @+ c c ER5(8 > c 8 = c 8 @+ c c c )#c

c FA(8 = c 8 = c 8 2+ c c c c c c c c

c Tc c c ) c 2 c

c c c c

cEH(8 22 c8 25 c8 9@+ c EG1(8 9= c8 >* c8 @?+ cEG2(8 @ c8 *2 c8 >9+ cER4(8 2 c8 2> c 8 9+ c cER5(8 c8 c8+ c cFA(8 @ c8 *2 c8 @+c c c

Fig. 15. Eye movement for "sadness" emotion

c


(a) Atmospheres in Scenario 1 & 6
(b) Atmospheres in Scenario 2 & 4
(c) Atmospheres in Scenario 3
(d) Atmospheres in Scenario 5
(axes: Casual-Formal, Lively-Calm, Friendly-Hostile)

Fig. 16. Communication atmospheres represented by the FA in a home party

Table 7. Emotion of each eye robot

c c c c

c Tc c c c ) c> cEH(8 9= c8 =2 c8 ==+ cEG1(8 >5 c8 =2 c8 9@+ c

c c c c c c c c c

c cEG2(8 * c8 c8 @?+ cER4(8 c8 c8+ c cER5(8 c8 c8+ c c c)#c cFA(8 9 c8 > c8 2+c c !

c

c c c c c c

c , c c c ! c c c c

c

c c c)#c c c1 c c c c 4@?6 c # c c c ! c c c c c ( c c@8c c2>c c c c c c c +c c

c c c c

c c c c c c c c c c c c , c 1 c c c 4@?6 c cd c c c c c je cd c c c c c je c d c c c c

c c jec: c c c c c c c1 c) c ! c c cd D e c c c c*cic ! c c@cic c c2cic c9cic c >c ic c =ic c c c ;ic ! c c ) c c 1 c

c c c c c 1 c c c

c c c c c *c c* c c*( *+ c@( 8 ==+ c 2( 8 22+ c9(8+ c>(8 22+ c=(8 ==+ c;(*+ c, c c c c c c c1 c c c c

c c c c c c)#


, c

c c c c c c c c c c c c c c c c c

c c c c c c c c c 4286c c c , c ! c c c c c c c 8 *5 c 8 *9 c c 8 @> c c c c c8 ?@ c8 5= c c8 ;@c cd) C ec d" 0 ec cd0

) ec ! c c c c c c c c c c c c c c 1 c) c c ! c c c c c c cd ec cd ec c

c c c cd

ec cd ec, c c cd

ec c d ec c c c c c c c c c c c c c1 c c ! c c c c c

c 1 , c ! c c c c c c c c c c c c c c c c c c c c c c c c c c c c c, c c c c c c c c c

c c c c c c c c c c c c c#c c c c c c c c c c c c

c c c c c c c c ! c c

c c

c . c c c c c c c c D c c c c ) c c c c c c c c c c

c c c c c

c c c c c c

c c c c . c

c c

c c c c4*26 c c c

c c c ! c c c c c c c)#c c c c c c c c c c c

c c c 4@?6 c c c c c c c c c g c c c c c c c c c

c c c c c

c c !

c c c c c c c % c c c c c c c

c c c c c - g c c c

c c c c c c c c c c c c c c c c c c c c c c)#c c) c*@c c c .

6. Conclusion #c c c c c c c c ! c c c c c c c c c c c %:c (

c % c : +c 4*86 c /

c c c c c c )#c () c # +c 4;6c4@?6 c c c c c c 1 c c c c c c c c c c c c c c c

c c c c c c 1 c

c c c 8 *5 c 8 *9 c c 8 @> c c c c c8 ?@ c8 5= c c8 ;@c cd) C ec d" 0 ec c d0

) ec ! c c c )# c c c c # c c c ! c c c

c c 1 c c c c c c c c c c c c c c Articles


c c(“8 ;@+c c c c c c c c c c . c c c c c c c c c c c c c c c c c c c c

c c c . c c c c %: , c c c c c c c c D

c c ! c c c F c c c c c c c c c c c c23c c c % c c c c c c59Oc c

c c c*8Oc c c c c c c c c c c c c c c c d

ec c d

ec c c c ?Oc c 98O c c , c c

c c c c c c c c c c c c c

c c c c c c c c c cd ec c c c c c c c c c c c c c c c cd

ec c d

ec c c c c c c c c c c c c c c c c c c c gc c c !

c c c c . c c c c c c c c c c c c c 42*6 c c c c c c c c c c

c c c c c c c

c c c c c c c c c c c 42@6c c c c

c c c

c c c c c c c c c c

c c c c . c

c c

c

cknowledgment , c c c c c c F c : c c c < c c : c (F:<:+c c c I#I &C c @*288858c c c & c & c : c ) c c0 c c c=*@*88** c, c c c c c# c c

c c c" c, c c c c c c c c .

AUTHORS
Zhen-Tao Liu*, Lue-Feng Chen, Fang-Yan Dong, and Kaoru Hirota – Dept. of Computational Intelligence and Systems Science, Tokyo Institute of Technology, G3-49, 4259 Nagatsuta, Yokohama, Japan 226-8502.
Min Wu and Dan-Yun Li – School of Information Science and Engineering, Central South University, Changsha, Hunan 410083, China.
Yoichi Yamazaki – Kanto Gakuin University, 1-50-1 Mutsuurahigashi, Kanazawa-ku, Yokohama, Japan 236-8501.
*Corresponding author


REFERENCES

[1] c H , c " c c - c c c d c c

c 2 3c) c# c c

c c c c c e c BcIEEE Int. Conf. on Fuzzy Systems c, c, c@8** c c;;;i ;5@ [2] c < c% c0 c" c c cd# c c c c c c 1 c c c c c c e c Pattern Analysis & Applications c c? c c* c@88= c c>5i=? [3] c 3 cI — c c c# c0 cd# c c c c c ec IEEE Trans. on Robotics c c@2 c c> c@88; c c??*i*888 [4] c < c % c c c C 1 c c c d# c c

c c c c c c

c c c e c Bc Proc. of the Int. Workshop on Affective-Aware Virtual Agents and Social Robots c/ c$:# c@88? [5] c Q c " c / c 3 c c c d !

c c

c c c c c e c B Int. Conf. on Intelligent Robots and Systems c: c" c $:# c@88? [6] c % c , c P c c c c d< c c

c c c c c c c c c c c e c Jounal of Advanced Computational Intelligence and Intelligent Informatics c c*9 c c; c@8*8 c c5>@i5>? [7] c H , c " c ) P c 3 c c c d<

c c ) c # c c c !

c c c e c Bc Int. Symp. on Intelligent Systems c, cF c@8*8 [8] c P c P c P c C c c c d) c c

c c !

c c c c c# c<

#

c e cJounal of Advanced Computational Intelligence and Intelligent Informatics c c *@ c c 2 c @885 c c 289i2*2 [9] c " c 3 c % c d, c c c c ! c c !

c

c c c c e c Bc Proc. of the Doctoral Consortium at the IEEE Conf. on Affective Computing and Intelligent Interaction c # c& c@88? [10] c I cC c c) P c3 cd3 c c

c % c: c c& 3 c . e c BcProc. 4th IEEE Int. Conf. Intelligent Systems c@885 c c25i99 [11] c C c# c' cP cP c c cd c c

c c c c c c c c%,c e c BcIEEE Int. Conf. on Fuzzy Systems c, c, c@8** c c;5;i;?* [12] c P I c , c C c # c ' c c c d c c c c

c % c : c

c c 1 c c c c c 23c

c e c Journal of Advanced Computational Intelligence and Intelligent Informatics c c> c c> c@8** c c>=2i>;@ [13] c H , c " c H c c c c d c c c c c

c c c c c c

c % c : e c Bc The 9th Int. Conf. on Informatics in Control Automation and Robotics c % c c@8*@ c c>i*9


Journal of Automation, Mobile Robotics & Intelligent Systems

[14] c P c - c c " c E c d% c c c c c c e c IEEE Transactions on Multimedia c c*8 c c> c@885 c c?2=i?9= [15] c " c : c P c P c c c d#c c c c c c e c Neurocomputing c c;* c c*8 c@885 c c*?*2i *?@8 [16] c F c C c F C c C c c c d#c c c c c c c c c e cJournal of Computers c c2 c c; c @885 c c2?i9; c [17] c < c c d# c c

c je c Psychological Review c c ?? c c 2 c *??@ c c >>8i>>2 [18] c H F c0 c c0 C c- cd c c c c c c ! c e c B IEEE Int. Conf. on Multimedia and Expo c , c , c@889 [19] c / c : c E c % c c c d: c c c c c c c c c c c c : c ' c c c c c e c BcIEEE Int. Conf. on Acoustics Speech and Signal Processing cS c0 c@889 c [20] c # c" c, cI c c cd% c c c c c c c F e c Bc Proc. of Asia-PaciAic Signal and Information Processing Association c: cF c@88? [21] c 0 c - c H c 0 c c c d c c c ! c c c c c c ! c e c ACM Transactions on Asian Language Information Processing c c > c c @ c @88= c c*=>i*5@ [22] c / c: c: c% c c cdc: c c

c c c c c

e c Bc IEEE Int. Conf. on Multimedia and Expo c# c& c@88> [23] c 3 c E c # c 0 c c c d, 1 c c

c c c c c c

e c Bc IEEE Computer Society Conf. on Computer Vision and Pattern Recognition c # c#I c$:# c@885 c [24] c F c 0 c C c - c c c d<#3c c

c c !

c e c Advances in Visual Computing c Lecture Notes in Computer Science c c>2>? c@885 c c9>8i9>? [25] c F c ˜™ c c# c# cd c c c c c e cComputers in Human Behavior c c @* c @ c @88> c c 2@2i 29* [26] c : c H c H c - c c c d) c !

c

c c<#3c c c c c 0 c !

c e cAffective Computing and Intelligent Interaction Lecture Notes in Computer Science c c9;25 c@88; c c@9i2> [27] c BDD

D D D D [28] c BDD D [29] c H , c " c c - c 3 P c " c " ) c 0 c ) P c 3 c P c P c c I c C c d0 c c ) c # c c % c 0 c # c c c # c c C


% c e c Journal of Advanced Computational Intelligence and Intelligent Informatics, vol. 17, no. 1, 2013, pp. 3–17. [30] c cE c cI cI cd c c c c c c 23c c c e c Robust Speech Recognition and Understanding, I-Tech Education and Publishing, 2007, pp. 281–300.

c c e c BcIEEE Int. Conf. on Systems Man and Cybernetics c-

c$:# c @882 [32] c 0 P c 0 c 0 P c " c c c d#c c c c c c c c e c Bc Int. Computer Symp. c , c, c@8*8 c


Motion Direction Control of a Robot Based on Chaotic Synchronization Phenomena

Submitted: 19th September 2012; accepted: 9th November 2012.

Christos K. Volos

Abstract: This work presents chaotic motion direction control of a robot, and especially of a humanoid robot, in order to achieve complete coverage of the entire work terrain in an unpredictable way. The method used is based on a chaotic true random bits generator. The coexistence of two different synchronization phenomena between mutually coupled identical nonlinear circuits, the well-known complete chaotic synchronization and the inverse π-lag synchronization, is the main feature of the proposed chaotic generator. Computer simulations confirm that the proposed method can obtain very satisfactory results in regard to the fast scan of the entire robot's work terrain.

Keywords: humanoid robot, motion direction control, nonlinear circuit, chaos, true random bits generator, complete synchronization, inverse π-lag synchronization

1. Introduction

Autonomous mobile robots have attracted the keen interest of the scientific community, especially in the last two decades, because of their applications in various fields of activity, such as industrial and military missions. Therefore, many interesting applications of mobile robots, such as floor-cleaning devices, industrial transportation and fire-fighting devices [1–3], have been developed. In particular, the use of autonomous mobile robots for military applications, such as the surveillance of terrains, terrain exploration for searching (e.g. for explosives or dangerous materials) or patrolling (e.g. for intrusion into military facilities) [4–6], has become a very interesting task. For such applications many mobile robots are commercially available, which in many cases focus on features such as the perception and identification of the target, the positioning of the robot in the terrain and the updating of the terrain's map. However, the most important of all the above features is path planning, because it determines the success of the robot's missions, especially in many military tasks. Additionally, the research subject of the interaction between mobile robots and chaos theory has been studied intensively. The basic feature of these research attempts in the chaotic robot field is a motion controller, based on microcontrollers or CPUs, that ensures chaotic motion of the robot. Signals produced by chaotic systems are used to guide autonomous robots. Until now, some of the most well-known chaotic systems, such as the Chua circuit [6],


the Arnold system [7], the Standard or Taylor-Chirikov map [8] and the Lorenz system [9], have been used. The problem of patrolling a terrain with a mobile robot comes down to finding a plan that produces not only unpredictable trajectories but also a fast scan of the entire predicted region. These are the main reasons for using nonlinear dynamic systems: the chaotic behavior of such systems ensures the unpredictability of the robot's trajectories. The second aim, the fast scanning of the terrain, is the criterion studied by researchers when selecting the most suitable dynamic system. This work presents a new strategy, which generates an unpredictable trajectory by using a chaotic true random bits generator. Also, a humanoid robot is selected because this kind of robot has a specific way of movement and is used nowadays in many activities. The proposed planner of a humanoid robot's motion produces a sequence of steps in the four basic directions (forward, right, left and backward) or in eight directions (forward, diagonal forward-right, diagonal forward-left, right, left, diagonal backward-right, diagonal backward-left, backward). This paper is organized as follows. In Section 2 the basic features of chaotic systems and the synchronization phenomena, which are the basis of this work, are presented. Section 3 describes the robot's motion generator block by block. In Section 4 the statistical tests of the proposed true random bits generator are discussed. Section 5 presents the simulation results of the humanoid robot's motion and their analysis. Finally, Section 6 includes the concluding remarks of this work.

2. Chaotic Systems and Synchronization Phenomena

A dynamical system, in order to be considered as chaotic, must fulfill the following conditions [10]:
– it must be sensitive to initial conditions,
– it must be topologically mixing,
– it must have dense periodic orbits.
The most important of the three above conditions is the system's sensitivity to initial conditions or to the system's parameters. This means that a small variation of the system's initial conditions or parameters can lead to a totally different dynamic behavior, that is, to a totally different trajectory. That is why chaotic systems are very good candidates for use in robots' motion planners: their sensitivity can contribute to the robot's unpredictable trajectory, which is a necessary condition in many robotic activities, as mentioned before.

In the last two decades the study of the interaction between coupled chaotic systems has been a landmark in the evolution of the theory of chaotic synchronization [11]. The topic of synchronization between coupled nonlinear chaotic systems plays an important role in several research areas, such as biological networks, secure communication and cryptography [12–15]. The most well-known type of synchronization is the complete or full synchronization, in which the interaction between two identical coupled chaotic systems leads to a perfect coincidence of their chaotic trajectories, i.e.

x1(t) = x2(t)    (1)

where x1 and x2 are the signals of the coupled chaotic systems.

Since 2010 a new synchronization phenomenon, observed between mutually coupled identical nonlinear systems, has been reported: the inverse π-lag synchronization [16]. More precisely, this type of synchronization is observed when the coupled system is in a phase-locked (periodic) state, depending on the coupling factor, and it can be characterized by the elimination of the sum of the two relevant periodic signals (x1 and x2) when one of them is shifted by T/2, where T is the period of the signals x1 and x2:

x1(t) = –x2(t + T/2)    (2)
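Conditions (1) and (2) can be illustrated numerically. The sketch below uses synthetic periodic signals, and the function names, sampling grid and tolerances are our own choices; the point is only what each synchronization condition tests.

```python
import numpy as np

# Numerical illustration of the complete-synchronization condition (1) and
# the inverse pi-lag condition (2) on synthetic periodic signals.
T = 1.0                                        # period of the phase-locked state
n = 10000
t = np.linspace(0.0, 10.0, n, endpoint=False)  # integer number of periods
half = int(round((T / 2) / (t[1] - t[0])))     # samples in half a period

def complete_sync(x1, x2, tol=1e-9):
    """Complete synchronization: x1(t) = x2(t)."""
    return float(np.max(np.abs(x1 - x2))) < tol

def inverse_pi_lag(x1, x2, tol=1e-9):
    """Inverse pi-lag synchronization: x1(t) = -x2(t + T/2)."""
    return float(np.max(np.abs(x1 + np.roll(x2, -half)))) < tol

theta = 2 * np.pi * t / T
x1 = np.sin(theta) + 0.3 * np.sin(2 * theta)   # periodic, no half-wave symmetry
x2_full = x1.copy()                            # identical trajectory
x2_lag = -np.roll(x1, half)                    # x2(t) = -x1(t - T/2)

print(complete_sync(x1, x2_full))   # True
print(inverse_pi_lag(x1, x2_lag))   # True
print(inverse_pi_lag(x1, x2_full))  # False
```

The second harmonic in x1 is deliberate: a pure sine satisfies both conditions at once, so a signal without half-wave symmetry is needed to tell them apart.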

3. The Robot's Chaotic Motion Generator

The basic element of the proposed motion planner is a chaotic true random bits generator, which consists of three blocks. The first block includes the system of two coupled identical nonlinear circuits. The autonomous circuit (Fig. 1) which is used is the well-known circuit of the Chua oscillator, in which the nonlinearity is classically described by a piecewise-linear function. By the term "autonomous", a nonlinear circuit without any external voltage or current source is considered, as shown in Fig. 1. However, in this paper the nonlinear element NR of the circuit implements a cubic function. This type of circuit is capable of producing double-scroll chaotic attractors (Fig. 2). In this type of behavior the chaotic system has two attractors, between which the process state oscillates. So, a double-scroll oscillator needs to have at least three degrees of freedom in order to be chaotic.


The state equations describing the circuit of the Chua oscillator are as follows:

dx1/dt = (1/C1)[(1/R)(y1 − x1) − g(x1)]
dy1/dt = (1/C2)[(1/R)(x1 − y1) + z1]
dz1/dt = (1/L)[−y1 − R0 z1]    (3)

where x1 = vC1, y1 = vC2, z1 = iL, and g(x1) is the cubic function of the form:

g(x1) = −k1 x1 + k3 x1^3    (4)

where k1, k3 > 0. The absence of the second-order term guarantees the odd symmetry of the v–i characteristic. The practical circuit for realizing the cubic polynomial (4) is shown in Fig. 3. This realization was proposed for the first time by Zhong [17]. The two-terminal nonlinear resistor NR consists of one Op-Amp (LF411), two analog multipliers (AD633JN) and five resistors. Each multiplier implements the function:

w = (x1 − x2)(y1 − y2) / 10 V + z    (5)

where the factor 10 V is an inherent scaling voltage in the multiplier. The connections of the Op-Amp and the resistors R1, R2 and R3 form an equivalent negative resistor Re, when R1 = R2 and the Op-Amp operates in its linear region, in order to obtain the desired coefficients k1 and k3. The v–i characteristic of NR is as below:

i = g(v) = −k1 v + k3 v^3    (6)

where k1 = 1/R3 and k3 = (R4 + R5) / (10 V · 10 V · R3 R4).

Nevertheless, depending on the coupling factor and the chosen set of the system's initial conditions, the inverse π-lag synchronization can coexist with a complete synchronization [16]. The proposed TRBG, which is used for the motion of the humanoid robot, is based on the coexistence of these two types of synchronization, which are used as representing the states "0" and "1" in the seed generation, as will be described in detail in the next section.
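A small numerical sketch of the cubic characteristic in (4)/(6), taking the normalized values quoted below (k1 = 0.6384, k3 = 0.0252) as plain numbers; the unit handling here is our simplification.

```python
import math

# Sketch of the cubic nonlinearity g(x) = -k1*x + k3*x**3 of Eqs. (4)/(6),
# with the normalized parameter values treated as dimensionless numbers.
k1, k3 = 0.6384, 0.0252

def g(x):
    return -k1 * x + k3 * x ** 3

# Odd symmetry: with no second-order term, g(-x) = -g(x).
for x in (0.5, 1.0, 3.0):
    assert abs(g(-x) + g(x)) < 1e-12

# The non-trivial zeros sit at x = +/- sqrt(k1/k3): the "knees" between the
# inner negative-slope region and the outer branches of the characteristic.
x0 = math.sqrt(k1 / k3)
print(round(x0, 3))  # 5.033
```

The inner negative-slope region is what supplies the active (negative-resistance) behavior needed for the double-scroll attractor.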

The values of the circuit parameters are: R0 = 30 Ω, R = 1960 Ω, R1 = R2 = 2 kΩ, R3 = 1.671 kΩ, R4 = 3.01 kΩ, R5 = 7.887 kΩ, C1 = 7.4 nF, C2 = 95.8 nF, L = 19.2 mH, and the voltages of the positive and negative electrical sources are ±15 V. For these values the circuit of the Chua oscillator presents the desired chaotic behavior, according to the author's previous work [18]. So, the normalized parameters take the following values: k1 = 0.6384 mS, k3 = 0.0252 S/V3.

Fig. 1. The schematic of the Chua oscillator

Fig. 2. The double-scroll chaotic attractor


Fig. 4. The system of two bidirectionally or mutually coupled nonlinear circuits via a linear resistor

of amplitude 1 V and has a duty cycle of 4%. So, the pulse duration is 2 ms, while the period of the pulse train is 50 ms (Fig. 5a). Consequently, the first block of the proposed True Random Bits Generator (TRBG) produces the synchronization signal [x2(t) – x1(t)] of the coupled system, which varies between two states (Fig. 5b). In the first one, the signals x1(t) and x2(t) are identical and the difference [x2(t) – x1(t)] is equal to zero, because the system is in a complete synchronization mode. In the second state the signal x2(t) is the inverse of the signal x1(t) shifted by T/2 (inverse π-lag synchronization), so the signal [x2(t) – x1(t)] oscillates around the value of 2.5 V.

In the second block, the two different levels of the output signal [x2(t) – x1(t)] are quantized to "0" and "1" according to the following equation:

σi = 0, if x2(t) − x1(t) < 1 V
σi = 1, if x2(t) − x1(t) > 1 V    (7)

Therefore, if the system is in a complete synchronization state a bit "0" is produced, while if the system is in an inverse π-lag synchronization state a bit "1" is produced (Fig. 5c). The sampling period equals the period of the pulse train (T = 50 ms) and the sampling occurs at the middle of each pulse.
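The sampling-and-thresholding rule of (7) can be sketched as follows; the piecewise-constant toy signal stands in for the real circuit output, and the record sampling rate is our assumption.

```python
import numpy as np

# Sketch of the quantization block of Eq. (7): sample the difference signal
# [x2(t) - x1(t)] at the middle of each 50 ms pulse period and emit "0"
# below 1 V, "1" above it.
T = 0.050                                    # pulse-train period: 50 ms
fs = 10_000                                  # samples per second (assumption)
t = np.arange(0.0, 1.0, 1.0 / fs)            # 1 s of signal -> 20 periods

# Even periods: complete synchronization (difference ~ 0 V); odd periods:
# inverse pi-lag state, where the difference sits around 2.5 V.
period = np.floor(t / T).astype(int)
diff = np.where(period % 2 == 1, 2.5, 0.0)

mid = ((np.arange(20) + 0.5) * T * fs).astype(int)   # mid-pulse indices
bits = (diff[mid] > 1.0).astype(int)
print("".join(map(str, bits)))               # 01010101010101010101
```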

(a)

Fig. 3. The circuit which realizes the nonlinear element NR with the cubic v–i characteristic

The system of two bidirectionally or mutually coupled circuits of Chua oscillators is shown in Fig. 4. The coupling of the identical nonlinear circuits is achieved via a linear resistor RC connected between the nodes A of each circuit. For small values of the resistor RC (e.g. RC = 250 Ω) the coexistence of the two aforementioned synchronization phenomena is observed. Furthermore, the necessary perturbation p for changing the system's initial conditions, and consequently the synchronization state of the coupled system, is an external source that produces a pulse train


(b)

(c)

Fig. 5. Time-series of (a) pulses p(t), (b) difference signal [x2(t) – x1(t)] and (c) the produced bits sequence, with the



proposed technique

Finally, the third block extracts unbiased bits with the well-known de-skewing technique [19]. This technique eliminates the correlation in the output of the generator of random bits by converting the bit pair "01" into an output "0" and "10" into an output "1", while the pairs "11" and "00" are discarded.
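This de-skewing step (the von Neumann corrector of [19]) can be written directly from the description above:

```python
# Von Neumann de-skewing: read the raw stream two bits at a time, map the
# pair (0, 1) to 0 and (1, 0) to 1, and discard (0, 0) and (1, 1).
def deskew(bits):
    out = []
    for i in range(0, len(bits) - 1, 2):
        a, b = bits[i], bits[i + 1]
        if a != b:
            out.append(a)        # (0, 1) -> 0, (1, 0) -> 1
    return out

print(deskew([0, 1, 1, 0, 1, 1, 0, 0, 0, 1]))  # [0, 1, 0]
```

The price of the unbiasing is throughput: on average at least three quarters of the raw bits are consumed per output bit.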

4. Statistical Tests of the Proposed True Random Bits Generator

In this section the "randomness" of the bits sequence produced by the proposed chaotic TRBG is confirmed. So, the dynamical system (8) of the coupled circuits of Fig. 4 was solved numerically by using the fourth-order Runge-Kutta algorithm, and the signal [x2(t) – x1(t)] was used for producing the chaotic bits sequence with the procedure described in Section 3.
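The integration of the coupled system (8) can be sketched as follows. The component values are those of Section 3, but the SI reading of k1 and k3, the time step and the integration length are our assumptions, so this illustrates the scheme rather than reproducing the authors' exact bitstream.

```python
import numpy as np

# Fourth-order Runge-Kutta integration of the coupled Chua system (8).
C1, C2, L = 7.4e-9, 95.8e-9, 19.2e-3        # F, F, H
R, R0, RC = 1960.0, 30.0, 250.0             # ohms
k1, k3 = 0.6384e-3, 0.0252e-3               # cubic coefficients (assumed SI)

def g(v):
    return -k1 * v + k3 * v ** 3

def f(s):
    x1, y1, z1, x2, y2, z2 = s
    return np.array([
        ((y1 - x1) / R - g(x1) + (x2 - x1) / RC) / C1,
        ((x1 - y1) / R + z1) / C2,
        (-y1 - R0 * z1) / L,
        ((y2 - x2) / R - g(x2) + (x1 - x2) / RC) / C1,
        ((x2 - y2) / R + z2) / C2,
        (-y2 - R0 * z2) / L,
    ])

def rk4_step(s, dt):
    a = f(s)
    b = f(s + 0.5 * dt * a)
    c = f(s + 0.5 * dt * b)
    d = f(s + dt * c)
    return s + (dt / 6.0) * (a + 2 * b + 2 * c + d)

# Initial conditions quoted in the text for the two circuits.
s = np.array([0.60, 0.10, 0.05, 0.70, 0.20, -0.10])
dt = 2e-8                                    # integration step (assumption)
sync_signal = []
for _ in range(20000):
    s = rk4_step(s, dt)
    sync_signal.append(s[3] - s[0])          # the signal [x2(t) - x1(t)]
print(len(sync_signal))
```

In the full procedure this difference signal would then be sampled and quantized as in Section 3 to yield the raw bitstream.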

dx1/dt = (1/C1)[(1/R)(y1 − x1) − g(x1) + (1/RC)(x2 − x1)]
dy1/dt = (1/C2)[(1/R)(x1 − y1) + z1]
dz1/dt = (1/L)[−y1 − R0 z1]
dx2/dt = (1/C1)[(1/R)(y2 − x2) − g(x2) + (1/RC)(x1 − x2)]
dy2/dt = (1/C2)[(1/R)(x2 − y2) + z2]
dz2/dt = (1/L)[−y2 − R0 z2]    (8)

For this reason one of the most important statistical test suites is used. This is the FIPS (Federal Information Processing Standards) suite [20] of the National Institute of Standards and Technology (NIST), which comprises four statistical tests: Monobit test, Poker test, Runs test, and Long Run test. As it is known, according to the FIPS statistical tests, the examined TRBG must produce a bitstream, bi = b0, b1, b2, ..., bn−1, of length n (at least 20000 bits), which must satisfy the four above-mentioned statistical tests. Using the fact from information theory that noise has maximum entropy, the initial conditions of the first circuit (x01 = 0.60, y01 = 0.10, z01 = 0.05) and of the second circuit (x02 = 0.70, y02 = 0.20, z02 = −0.10) were chosen so that the measured entropy of the TRBG is maximal. The measure-theoretic entropy [21] of the proposed chaotic TRBG with respect to the system's parameters and initial conditions is calculated to be Hn = 0.69172 for n = 3 and Hn = 0.69189 for n = 4, where n is the length of the n-word sequences. So, by using the procedure described previously, a bits sequence of length 20000 bits has been obtained from the output of the proposed chaotic TRBG, calculated via the numerical integration of Eq. (8). Then this bits sequence was subjected to the four tests of the FIPS-140-2 suite. As a result, it has been numerically verified that the produced bits sequence satisfies all four tests of FIPS-140-2 (Table 1).

Table 1. Results of FIPS-140-2 test, for the chaotic TRBG

Test            Result                                                        Status
Monobit Test    n1 = 10018 (50.09%)                                           Passed
Poker Test      X = 2.3245                                                    Passed
Runs Test       B1 = 2565, B2 = 1253, B3 = 605, B4 = 319, B5 = 144, B6 = 149  Passed
Long Run Test   no run of 26 or more equal bits                               Passed
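Two of the four FIPS 140-2 checks can be sketched as follows. The pass bounds used (9725 < n1 < 10275 for the Monobit test; no run of 26 or more equal bits for the Long Run test) are the standard FIPS 140-2 thresholds, and a seeded PRNG stands in here for the chaotic TRBG output.

```python
import random

# Sketch of the Monobit and Long Run checks reported in Table 1.
rng = random.Random(42)
bits = [rng.randint(0, 1) for _ in range(20000)]

def monobit(bits):
    """FIPS 140-2 Monobit test: number of ones must satisfy 9725 < n1 < 10275."""
    n1 = sum(bits)
    return 9725 < n1 < 10275, n1

def long_run(bits, limit=26):
    """FIPS 140-2 Long Run test: no run of `limit` or more equal bits."""
    longest = run = 1
    for prev, cur in zip(bits, bits[1:]):
        run = run + 1 if cur == prev else 1
        longest = max(longest, run)
    return longest < limit, longest

print(monobit(bits)[0], long_run(bits)[0])
```

The Poker and Runs tests follow the same pattern, tabulating 4-bit block frequencies and run-length counts against the FIPS acceptance intervals.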

5. Simulation Results of the Robot's Motion

A humanoid robot like the commercial model Kondo KHR-2HV (Fig. 6) is adopted, because it is an interesting compromise of simplicity between control and implementation. In this work two different approaches to the humanoid's motion have been used, so as to take advantage of the abilities of this specific type of robot. In the first case, the chaotic motion planner converts the bit pairs 00, 01, 10 and 11, which are produced by the chaotic generator, into steps in the four basic directions: forward, right, left and backward. In the same way, in the second case, the bit triads 000, 001, 010, 011, 100, 101, 110 and 111 are converted into steps in the following eight directions: forward, diagonal forward-right, diagonal forward-left, right, left, diagonal backward-right, diagonal backward-left and backward.
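The four-direction mode is a lookup from bit pairs to steps; the particular pair-to-direction assignment below is our assumption, since the text fixes only the set of directions.

```python
# Sketch of the four-direction planner: consume the TRBG stream two bits
# at a time and map each pair to one of the four basic step directions
# (the concrete assignment is illustrative).
DIRECTIONS = {
    (0, 0): "forward",
    (0, 1): "right",
    (1, 0): "left",
    (1, 1): "backward",
}

def plan_steps(bits):
    return [DIRECTIONS[p] for p in zip(bits[0::2], bits[1::2])]

print(plan_steps([0, 0, 1, 1, 0, 1, 1, 0]))
# ['forward', 'backward', 'right', 'left']
```

The eight-direction mode is identical except that bits are consumed three at a time into a table of eight entries.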

Fig. 6. The humanoid robot Kondo KHR-2HV

Also, in this work, for a better understanding of the behavior of the robot's chaotic motion generator, we assume that the robot works in a flat area with boundaries, without obstacles and without any sensor. So, in the case that the proposed humanoid robot reaches the boundaries of the terrain, it waits for the next direction in order to move. In all similar works, the first step is the study of the robot's motion by using computer simulation. For this reason the terrain coverage is analyzed by using the well-known coverage rate (C). A square terrain with dimensions M = 25 x 25 = 625, in normalized unit cells, is chosen. The coverage rate (C) is given by the following equation:

C = (1/M) Σ(i=1..M) I(i)    (9)



where I(i) is the coverage situation for each cell [22]. This is defined by the following equation:

I(i) = 1, if cell i has been covered by the robot
I(i) = 0, otherwise    (10)

where i = 1, 2, ..., M. So, in the following test the motion generator produces a sequence of 10000 steps in the four basic directions (forward, right, left and backward), starting from three different initial positions on the terrain: (x0, y0) = {(5, 20), (12, 12), (20, 5)}. The results for 2000 and 10000 robot steps for (x0, y0) = (5, 20) are shown in Fig. 7. In particular, in Fig. 7b the coverage of the whole terrain can be observed. In Fig. 8 the coverage rate versus the number of steps is shown for the robot with the proposed chaotic motion generator, starting from the three above-mentioned initial positions. In the three simulations, complete coverage of the terrain was calculated. Furthermore, the robot has covered practically all the terrain (91%) after only the 4000th step.
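The coverage experiment can be reproduced in outline as follows; a seeded PRNG replaces the chaotic TRBG and the boundary behavior follows the wait-for-the-next-direction rule, so the resulting numbers are illustrative rather than the paper's.

```python
import random

# Outline of the coverage experiment: a walker takes 10000 steps in the four
# basic directions on the 25 x 25 terrain, waiting in place whenever a step
# would cross the boundary; C = (1/M) * sum(I(i)) as in Eqs. (9)-(10).
SIZE, STEPS = 25, 10000
MOVES = {"forward": (0, 1), "backward": (0, -1), "left": (-1, 0), "right": (1, 0)}

def coverage(x0, y0, seed=0):
    rng = random.Random(seed)        # stand-in for the chaotic TRBG
    x, y = x0, y0
    visited = {(x, y)}               # I(i) = 1 for every visited cell
    for _ in range(STEPS):
        dx, dy = MOVES[rng.choice(list(MOVES))]
        nx, ny = x + dx, y + dy
        if 0 <= nx < SIZE and 0 <= ny < SIZE:
            x, y = nx, ny
            visited.add((x, y))
        # else: the robot waits for the next direction
    return len(visited) / (SIZE * SIZE)   # coverage rate C

for start in [(5, 20), (12, 12), (20, 5)]:
    print(start, round(coverage(*start), 3))
```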


(motion in four directions) is shown. In conclusion, in the case of moving in eight directions the robot shows a quicker coverage of the terrain's space. In particular, for the first 2000 steps the robot has shown 20% faster terrain coverage in the second approach in regard to the first one. More precisely, by the 2000th step the robot has covered 85.6% of the total terrain. Furthermore, in the remaining 3000 steps, until the 5000th step, the robot covers only a further 10.7% of the terrain. So, finally, the total terrain coverage percentage in the second approach was calculated to be equal to 96.3%. Therefore, the robot with the capability of moving in eight directions has a better and faster coverage rate in regard to the case of moving in four directions.

Fig. 8. The coverage rate versus the number of steps, when the robot moves in four directions, starting from three different initial positions on the terrain: (x0, y0) = {(5, 20), (12, 12), (20, 5)}

(a)  (b)

Fig. 7. Terrain covering using the robot with the proposed chaotic generator in the first case, for initial position (x0, y0) = (5, 20): (a) 2000 steps and (b) 10000 steps

Fig. 9. The comparison of the two different kinematic control approaches (moving in four or eight directions)

6. Conclusion

In this work, a chaotic path planning generator for autonomous humanoid robots was presented. In contrast with other similar works, where the control unit defines the position goal in each step, here only the motion of the humanoid robot is controlled, by using the coexistence of synchronization phenomena between coupled chaotic circuits. Statistical tests of the proposed chaotic generator guarantee the "randomness" of the produced bits sequence and consequently the "randomness" of the planned path.



Furthermore, validation tests based on numerical simulations of the robot's motion direction control confirm that the proposed method can obtain very satisfactory results in regard to unpredictability and fast scanning of the robot's workplace. Finally, the use of the specific chaotic TRBG in this work provides a significant advantage over other similar works, because of the improved statistical results, which were presented in detail.

AUTHOR
Dr. Christos K. Volos – He teaches Electronics at the Laboratory of Electronics and Communications, Department of Military Sciences, University of Military Education – Greek Army Academy, Athens, GR-16673, Greece, chvolos@gmail.com.

REFERENCES
[1] J. H. Suh, Y. J. Lee, K. S. Lee, "Object Transportation Control of Cooperative AGV Systems Based on Virtual Passivity Decentralized Control Algorithm", J. Mech. Sci. Technol., vol. 19, 2005, pp. 1720–1730.
[2] J. Palacin, J. A. Salse, I. Valganon, X. Clua, "Building a Mobile Robot for a Floor-Cleaning Operation in Domestic Environments", IEEE Trans. Instrum. Meas., vol. 53, 2004, pp. 1418–1424.
[3] M. J. M. Tavera, M. S. Dutra, E. Y. V. Diaz, O. Lengerke, "Implementation of Chaotic Behaviour on a Fire Fighting Robot". In: 20th International Congress of Mechanical Engineering, Gramado, Brazil, 2009.
[4] L. S. Martins-Filho, E. E. N. Macau, "Trajectory Planning for Surveillance Missions of Mobile Robots". In: Studies in Computational Intelligence, Springer, Heidelberg, 2007, pp. 109–117.
[5] S. Marsland, U. Nehmzow, "On-line Novelty Detection for Autonomous Mobile Robots", Robot. Auton. Syst., vol. 51, 2005, pp. 191–206.
[6] P. Sooraksa, K. Klomkarn, "No-CPU Chaotic Robots: From Classroom to Commerce", IEEE Circuits Syst. Mag., vol. 10, 2010, pp. 46–53.
[7] A. A. Fahmy, "Implementation of the Chaotic Mobile Robot for the Complex Missions", Journal of Automation, Mobile Robotics & Intelligent Systems, vol. 6, 2012, pp. 8–12.
[8] L. S. Martins-Filho, E. E. N. Macau, "Patrol Mobile Robots and Chaotic Trajectories", Math. Probl. Eng., vol. 2007, 2007, p. 1.
[9] D. I. Curiac, C. Volosencu, "Developing 2D Trajectories for Monitoring an Area with Two Points of Interest". In: 10th WSEAS International Conference on Automation and Information, 2009, pp. 366–369.
[10] B. Hasselblatt, A. Katok, "A First Course in Dynamics: With a Panorama of Recent Developments", Cambridge University Press, 2003.
[11] L. M. Pecora, T. L. Carroll, "Synchronization in Chaotic Systems", Phys. Rev. Lett., vol. 64, 1990, pp. 821–824.
[12] C. K. Tse, F. Lau, "Chaos-based Digital Communication Systems: Operating Principles, Analysis Methods, and Performance Evaluation", Springer-Verlag, Berlin, New York, 2003.
[13] Ch. K. Volos, I. M. Kyprianidis, I. N. Stouboulos, "Experimental Demonstration of a Chaotic Cryptographic Scheme", WSEAS Trans. Circuits Syst., vol. 5, 2006, pp. 1654–1661.
[14] M. Mamat, Z. Salleh, M. Sanjaya, N. M. M. Noor, M. F. Ahmad, "Numerical Simulation of Unidirectional Chaotic Synchronization on Non-autonomous Circuit and its Application for Secure Communication", Adv. Studies Theor. Phys., vol. 6, 2012, pp. 497–509.
[15] J. M. Gonzalez-Miranda, "Synchronization and Control of Chaos: An Introduction for Scientists and Engineers", Imperial College Press, 2004.
[16] Ch. K. Volos, I. M. Kyprianidis, I. N. Stouboulos, "Various Synchronization Phenomena in Bidirectionally Coupled Double-Scroll Circuits", Commun. Nonlinear Sci. Numer. Simulat., vol. 16, 2011, pp. 3356–3366.
[17] G.-Q. Zhong, "Implementation of Chua's Circuit with a Cubic Nonlinearity", IEEE Trans. Circuits Syst. I, vol. 41, 1994, pp. 934–941.
[18] Ch. K. Volos, I. Laftsis, G. D. Gkogka, "Synchronization of Two Chaotic Chua-type Circuits with the Inverse System Approach". In: 1st PanHellenic Conference on Electronics and Telecommunications, Patras, Greece, 2009.
[19] J. von Neumann, "Various Techniques Used in Connection with Random Digits", in G. E. Forsythe (ed.), Applied Mathematics Series – Notes, National Bureau of Standards, vol. 12, 1951, pp. 36–38.
[20] NIST, "Security Requirements for Cryptographic Modules", FIPS PUB 140-2, 2001.
[21] A. M. Fraser, "Information and Entropy in Strange Attractors", IEEE Trans. Inf. Theory, vol. 35, 1989, pp. 245–262.
[22] S. Choset, "Coverage for Robotics – A Survey of Recent Results", Ann. Math. Artif. Intel., vol. 31, 2006, pp. 113–126.





Submitted: 26th June 2011; accepted: 23rd September 2011

Michał Nowicki, Roman Szewczyk

Abstract: A measurement system was developed to study the magnetic field vector distributions. The measurements of the Earth's field disturbances caused by ferromagnetic objects were carried out. The ability to detect such objects and determine their location was demonstrated.

1. Introduction

Magnetovision utilizes the measurement of the distribution of magnetic field induction in a particular plane or in space, presented as a 2D image (for the plane) or a 3D image (for space). The name "magnetovision" comes from an analogy with thermal imaging: the color in a magnetovision image corresponds to the magnetic flux density or the magnetic field strength at a given point. It is also possible to obtain a monochrome image in the form of isolines. The most suitable sensors for magnetovision are thin-film magnetoresistive sensors, which exhibit high sensitivity and small size, typically 1×1 mm [3]. The resolution of the images depends directly on the number of measurement points per meter. The typical imaging device is a two-dimensional XY scanning system with a Hall effect or magnetoresistive sensor [4], moving along a meandering path over a specific, usually rectangular, area. In the case of an XY system with a single sensor, the critical constraint affecting the measurement time is the number of lines along which the probe moves.
Magnetovision studies carried out previously focused on the ability to measure stress in ferromagnetic materials, exploiting the inverse magnetostrictive (Villari) effect [5]. By measuring the intensity of the magnetic field at the surface of samples subjected to mechanical stresses, good correlation between the magnetovision images and the stress distribution inside the test piece was obtained. This phenomenon opens new possibilities for non-destructive testing of fatigue processes under cyclic mechanical stresses, also in the high-frequency range. In the case of the Villari effect, no external magnetic field was applied [6].
This paper presents an application of the magnetovision method to passive detection of dangerous metal objects. This method enables obtaining magnetovision


images of unknown objects from a greater distance and over a larger surface area, which required the development of new methods for measuring and processing the results. The application of a passive magnetovision system is important because active metal detectors can provoke a reaction of specially constructed detonators. This applies particularly to newer-generation landmines, which react to the presence of active detectors and thus pose a direct threat to the minesweeper's life [1, 2].

2. Measurement Setup

For the study, an XY scanning system was designed and built, with a single tri-axial magnetoresistive Honeywell HMR2300 sensor. It is shown schematically in Figure 1. In this system, the distribution of the magnetic induction vectors in the plane of measurement was measured. On the basis of these measurements, the magnetovision image of the magnetic field distribution was calculated. Although the magnetic flux density is a vector quantity, existing magnetovision systems have omitted this fact. Application of the tri-axial sensor enabled obtaining images of the magnetic induction in the three axes of the XYZ system, which provided information about the magnetic induction vector value and its direction at each measurement point.
During the measurements, no additional magnetizing fields were applied. As a result, only background disturbances were measured, mainly disturbances of the natural Earth's magnetic field. The scanning probe transits along parallel lines with a given interval, setting the measuring plane. A testing area of 200×200 mm was adopted, with 11 parallel measurement lines. On each line there were 100 measurement points. These parameters were selected based on the desired resolution and measurement time. The results were processed in Matlab, assigning them to individual measurement lines. The obtained 100×11 matrix of results was then interpolated to 100×100 points, which allowed for a clear picture.
Since the magnetoresistive sensor measures only the three components of the flux density vector at the point in which it is physically located, a problem appeared in separating the distortion generated by a sample object from the background. The simplest laboratory solution is a differential measurement: measuring without the test object and subtracting the result from the measurement with the object.
This method gives the best results, allowing precise separation of the magnetic induction distributions of the background and the object, resulting in



low-level noise in the magnetovision image. However, this method is possible only in certain conditions, where it is possible to make measurements both with and without the object in the same plane. For this reason, a method of differential measurement was developed that minimizes the impact of the background on the measurement result, including both the Earth's magnetic field and other sources.
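The scan-grid interpolation described earlier (11 lines of 100 points interpolated to a 100×100 image) was performed in Matlab by the authors; a minimal Python sketch of the same step could look as follows. The grid sizes come from the text; the measured values and everything else are illustrative assumptions.

```python
import numpy as np
from scipy.interpolate import RegularGridInterpolator

# Grid geometry from the text: 11 scan lines, 100 points per line,
# over a 200 x 200 mm area. The measured values here are synthetic.
lines = np.linspace(0.0, 200.0, 11)    # y positions of the scan lines (mm)
points = np.linspace(0.0, 200.0, 100)  # x positions along each line (mm)
rng = np.random.default_rng(0)
b_measured = rng.normal(50.0, 0.5, size=(11, 100))  # flux density (uT), placeholder

# Interpolate the coarse 11 x 100 grid to a dense 100 x 100 image,
# analogous to the Matlab processing described in the text.
interp = RegularGridInterpolator((lines, points), b_measured)
yy, xx = np.meshgrid(np.linspace(0.0, 200.0, 100),
                     np.linspace(0.0, 200.0, 100), indexing="ij")
b_image = interp(np.stack([yy, xx], axis=-1))

print(b_image.shape)  # (100, 100)
```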


Fig. 1. Schematic diagram of the measurement setup with the tri-axial HMR2300 magnetoresistive sensor moved by the XY scanning system

In its simplest form, a differential measurement is a measurement in two planes: P1 – at the height x above the test object, and P2 – at the height x + a, where:
x – the approximately known distance between the object and the measurement plane P1,
a – the approximately known distance between the measurement plane P1 and the measurement plane P2.
The distribution of flux density lines near a ferromagnetic object placed in the Earth's magnetic field is similar to the field distribution of a bar magnet. In particular, the magnetic flux density can be described as a dipole magnetic field characterized by a magnetic dipole moment m. The induction of the magnetic field on the axis of the magnet, in a vacuum, at a distance x from its center, is expressed by the formula:

B(x) = μ0·m / (2π·x³)    (1)

where:
m – magnetic dipole moment (A·m²),
μ0 – magnetic permeability of vacuum.
Since B is reduced in proportion to the cube of the distance from the source, for a ≈ x the induction caused by the test object in the first measurement plane will be up to 8 times greater than in the second plane. If, however, the other sources of disturbance are distant, their influence on the value of magnetic induction in the reference planes P1 and P2 will be similar. Therefore:

B1 = BO1 + BT1    (2)
B2 = BO2 + BT2    (3)

where:
BO1, BO2 – the values of magnetic induction produced by the object in planes P1 and P2,
BT1, BT2 – the background magnetic induction values in planes P1 and P2.
Assuming BT1 ≈ BT2, then:

B1 − B2 ≈ BO1 − BO2    (4)
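The two-plane scheme of Eqs. (1)–(4) can be checked with a short numerical sketch. The dipole moment, distances, and background level below are illustrative assumptions, not values from the paper: a uniform background cancels in the difference, and with a = x the object's contribution in the first plane is 8 times that in the second.

```python
import numpy as np

MU0 = 4e-7 * np.pi   # magnetic permeability of vacuum (T*m/A)
M = 0.05             # magnetic dipole moment of the object (A*m^2), assumed
B_BG = 50e-6         # uniform background (Earth's field), T, assumed

def dipole_b(x):
    """On-axis induction of a magnetic dipole at distance x, Eq. (1)."""
    return MU0 * M / (2.0 * np.pi * x**3)

x = 0.05   # plane P1 is 50 mm above the object
a = 0.05   # plane separation chosen so that a = x

b1 = dipole_b(x) + B_BG      # Eq. (2): object plus background in plane P1
b2 = dipole_b(x + a) + B_BG  # Eq. (3): object plus background in plane P2

# Eq. (4): the uniform background cancels in the difference.
delta = b1 - b2

# With a = x, the object's contribution in P1 is (2x/x)^3 = 8 times
# greater than in P2, as stated in the text.
print(dipole_b(x) / dipole_b(x + a))  # approximately 8
print(delta, dipole_b(x) - dipole_b(x + a))
```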

As a result, it is possible to obtain a rough magnetovision image of a sample located a short distance from the sensor by subtracting the results of the measurement in plane P2 from the results in plane P1. The differential two-plane measurement gives the absolute value of the difference in magnetic induction between the measurement planes.
A similar method of compensating for the impact of the background on the measurement result is the gradient measurement used in astrophysics and geology (e.g., in gravity gradiometers). In general, it is based on measuring the magnetic field or gravity values at different levels and determining the field gradient on that basis. This method also yields good results, but the images obtained are distinctly different from those obtained by the differential method: they allow distinguishing between positive and negative areas of magnetic disturbance relative to the Earth's field.
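The distinction between the two methods can be illustrated with a toy one-dimensional example; the disturbance profile below is synthetic and only serves to show that the differential image keeps magnitudes, while the gradient image preserves the sign of the disturbance.

```python
import numpy as np

# Synthetic disturbance with positive and negative regions (illustration only).
xs = np.linspace(-0.1, 0.1, 201)           # positions along a scan line (m)
b_p1 = 50e-6 + 1.0e-6 * np.sin(40.0 * xs)  # plane P1: background + disturbance (T)
b_p2 = 50e-6 + 0.2e-6 * np.sin(40.0 * xs)  # plane P2: weaker disturbance, same background
a = 0.05                                   # plane separation (m)

diff_image = np.abs(b_p1 - b_p2)  # differential method: magnitude only
grad_image = (b_p1 - b_p2) / a    # gradient method: sign is preserved

# The gradient result distinguishes positive and negative disturbance areas;
# the differential result is non-negative by construction.
print(grad_image.min() < 0.0 < grad_image.max())  # True
print(bool((diff_image >= 0.0).all()))            # True
```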

3. Measurement Results

Measurements were carried out on the test stand described in the previous chapter. The following ferromagnetic samples were used for testing:
Sample 1 – steel cylinder, 80 mm diameter and 20 mm height,
Sample 2 – steel cylinder, 71 mm diameter and 35 mm height,
Sample 3 – Swiss army knife, 120 mm long (closed).
The distance between the measurement planes was set to a = 50 mm.
Figure 2 shows the three-dimensional distribution of magnetic induction vectors at the measuring points for Sample 1. Absolute values were obtained by differential measurement in a single plane, with the influence of the background removed. The distance of the sample from the plane of the measurement was x = 50 mm. Figure 3 shows a magnetovision image obtained by differential bi-plane measurement within 50 mm of the second sample. The minimization of the background impact on the result is clearly visible.




on the resulting image. It should be noted that the position of the specimen is clearly visible, which creates new possibilities for the development of security systems.

Fig. 2. Three-dimensional distribution of the magnetic induction vectors at the measuring points for Sample 1

Fig. 4. Gradient measurement results for Sample 1, with the influence of the background magnetic field removed

Fig. 3. Magnetovision image of Sample 2 obtained by differential bi-plane measurement

Figure 4 shows the gradient measurement results for Sample 1, with the influence of the background magnetic field removed. It should be emphasized that the results exhibit a distinct difference between the positive and negative areas of magnetic disturbance.
Figure 5 shows images obtained by a single measurement just 50 mm from the second sample, at different angular positions relative to the plane of measurement. At such a small distance, the effect of the background becomes negligible. The sample was rotated around the axis perpendicular to the plane of measurement, allowing the visualization of the impact of the sample position relative to the Earth's magnetic field


Fig. 5. Magnetovision images of Sample 2 at different angular positions relative to the plane of measurement



Fig. 6. Magnetovision images of the bi-plane differential measurement of Sample 1 for the individual components of the magnetic induction vector: a) Bx, b) By, c) Bz

Figure 6 shows the images of the bi-plane differential measurement of Sample 1 for the individual components of the magnetic induction vector, i.e., Bx (Fig. 6a), By (Fig. 6b), and Bz (Fig. 6c). The results clearly show that, in the magnetovision image, the (x, y) position of the sample is easiest to recognize in the image of the Bz component (Fig. 6c), perpendicular to the plane of measurement.
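The observation that the component perpendicular to the measurement plane localizes the sample can be reproduced with a simple vertical-dipole model; the moment, height, and sample position below are assumptions for illustration only.

```python
import numpy as np

MU0 = 4e-7 * np.pi
M = 0.05             # vertical dipole moment modelling the sample (A*m^2), assumed
H = 0.05             # measurement plane 50 mm above the sample, as in the experiments
X0, Y0 = 0.12, 0.08  # assumed true sample position in the plane (m)

xs = np.linspace(0.0, 0.2, 100)
ys = np.linspace(0.0, 0.2, 100)
X, Y = np.meshgrid(xs, ys, indexing="ij")
dx, dy = X - X0, Y - Y0
R = np.sqrt(dx**2 + dy**2 + H**2)

# Field components of a vertical dipole observed in the horizontal plane.
k = MU0 * M / (4.0 * np.pi)
Bx = 3.0 * k * dx * H / R**5
By = 3.0 * k * dy * H / R**5
Bz = k * (3.0 * H**2 / R**2 - 1.0) / R**3

# |Bz| peaks directly above the dipole, so its argmax recovers the
# (x, y) position of the sample, matching the observation for Fig. 6c.
i, j = np.unravel_index(np.argmax(np.abs(Bz)), Bz.shape)
print(xs[i], ys[j])  # close to (0.12, 0.08)
```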


Fig. 7. Determination of the location of the sample (Sample 3) relative to the measurement plane: a) differential bi-planar measurement, b) gradient measurement, c) position of the sample on the reference grid

Figure 7 shows the results of applying the developed measurement method to determine the location of a sample relative to the measurement plane. In addition, the results of the differential bi-planar (Fig. 7a) and gradient (Fig. 7b) methods were compared. The object subjected to the test was Sample 3 – a steel folding knife. The distance of the first




plane from the object was 50 mm. The bi-plane differential measurement results are shown in Figure 7a, whereas the gradient measurement, without removing the influence of the background, is shown in Figure 7b. The position of the sample on the reference grid is shown in Figure 7c. Based on the results, the location and size of the object can be determined, which is very useful from a practical point of view.

4. Summary

An experimental setup for planar measurements of the vector distribution of weak magnetic fields was developed. Moreover, a new measurement methodology, reducing the impact of the magnetic background on the visualization of the results, was presented. The developed methods allow visualization of the distribution of the absolute values of the magnetic induction vector, its gradient, as well as the value and direction of the magnetic flux density vector at different measurement points.
The obtained results indicate that it is possible to detect dangerous objects and determine their location. This opens the way to the use of magnetovision in public security systems, in particular for the detection of dangerous objects by police or mobile demining robots. Such a system can also be used in non-destructive testing, for the detection of structural defects in tested objects.

ACKNOWLEDGEMENTS
This work was partially supported by The National Center for Research and Development within grant no. O ROB 0015 01/ID15/1.

"5>$#4 , " => ? – Graduated from Faculty of Mechatronics, Warsaw University of Technology. PhD student in Institute of Metrology and Biomedical Engineering of the Warsaw University of Technology since February 2012. e-mail: m.nowicki@mchtr.pw.edu.pl

Roman Szewczyk – Industrial Research Institute for Automation and Measurements PIAP and Institute of Metrology and Biomedical Engineering, Warsaw University of Technology. e-mail: rszewczyk@piap.pl
*Corresponding author

REFERENCES
[1] D. Guelle, A. Smith, A. Lewis, T. Bloodworth, "Metal Detector Handbook for Humanitarian Demining", Office for Official Publications of the European Communities, 2003.
[2] S. D. Billings, C. Pasion, S. Walker, L. Beran, "Magnetic Models of Unexploded Ordnance", IEEE


" ! ! ‚ !‚ C ! ‚ ! ‚ ‚ ' ! ! , vol. 44, 2006, p. 2115. 426c , R c: cĂ”0 c . c . ec c- c< niki Warszawskiej, 2000. (in Polish) 496c , R c: cMagnetovision, McGraw-Hill, 2000. [5] Kaleta J., Zebracki J., Application of the Villari effect in a fatigue examination of nickel, Fatigue Fracture Eng. Mater. Struc., 19:, 1996, 1435–1443. [6] Mohd Ali B. B., and Moses A. J., A grain detection system for grain-oriented electrical steels, ()))‚ Trans. Magnetism, 25:, 1989, pp. 4421–4426. [7] PfĂźtzner H., “Computer mapping of grain structure in coated silicon ironâ€?, . ! ‚ ‚- ! ‚ ! ‚- ! ‚- , vol. 19, 1980, pp. 27–30 [8] Tumanski S., Stabrowski M., “The magnetovision method as a tool to investigate the quality of electrical steelâ€?, - ! ‚ ' ! ‚ ! ‚ " ! , no. 9, 1998, pp. 488–495.

