High-Fidelity Haptics in Multimodal Human-Robot Interaction / Zheng Wang
143 pages
English


Lehrstuhl für Steuerungs- und Regelungstechnik
Technische Universität München
Univ.-Prof. Dr.-Ing. (Univ. Tokio) Martin Buss
High-Fidelity Haptics in Multimodal
Human-Robot Interaction
Zheng Wang
Full reprint of the dissertation approved by the Faculty of Electrical Engineering and Information Technology of the Technische Universität München for the award of the academic degree of
Doktor-Ingenieur (Dr.-Ing.)
Chair: Univ.-Prof. Gordon Cheng, Ph.D.
Examiners of the dissertation:
1. Univ.-Prof. Dr.-Ing. (Univ. Tokio) Martin Buss
2. Univ.-Prof. Dr.-Ing., Dr.-Ing. habil. Alois Knoll
The dissertation was submitted to the Technische Universität München on 21.06.2010 and accepted by the Faculty of Electrical Engineering and Information Technology on 29.10.2010.

Foreword
This dissertation concludes four years of my research conducted at the Institute of Automatic Control Engineering (LSR), Technische Universität München. The work was supported by the European Union FP6 project Immersence.
First of all, I would like to thank mein Doktorvater, my supervisor Prof. Dr.-Ing. (Univ. Tokio) Martin Buss, who not only offered me a research position in one of the best-established robotics groups, but also showed me the path through academia to where I am today. He always shared with me his vision on research topics while leaving me enough freedom for exploration and innovation.
My next most sincere appreciation goes to Dr. Angelika Peer, with whom I shared countless scientific discussions. She always helped me with her experience and wisdom during the last two years of my research at LSR.
A special thank-you goes to Prof. Louis Phee, my current supervisor at Nanyang Technological University, Singapore, whose kindness and help made it possible for me to finish writing the dissertation.
Much of the work in this dissertation was conducted in collaboration with my Immersence partners within and outside LSR. My sincere thanks to Jens Hoelldampf, Raphaela Groten, and Ansgar Bittermann for all the inspiration and help at LSR; to Elias and Mel at UPC; to Nicola, Pasquale, Mario, and the other members of the Italian team at UNIPI, whom I cannot all name; to Benjamin, Juan, Paul, and Abder at LSC; to Christos at UBIRM; to Max and Marc at MPI; and to Javier and Manuel at UPM, for all the good collaboration and achievements in Immersence.
A brief but very special appreciation goes to Mr. Andreas Schweinberger. He offered me so much help when I first started my life in Germany, and remained a friend and supporter during my hardest times. A little kindness goes a long way.
Next I would like to thank my LSR colleagues, with whom I shared the past four years of my life. Special thanks go to Chih-Chung Chen, Tingting Xu, Tianguang Zhang, Hao Ding, and Haiyan Wu, for lunches, dinners, and countless other events and help. Thanks to Kwang-kyu Lee, my first roommate and best friend, who shared beers and wisdom with me. I would also like to thank Daniela Feth, Carolina Passenberg, Thomas Schauss, and Nikolay Stefanov for the fruitful discussions in the haptics group, and Tobias Goepel, Matthias Rungger, Iason Vittorias, and Andreas Schmid for sharing an office with me.
My next appreciation goes to Dr. Dirk Wollherr, who always kindly helped me with administrative issues, and to the secretaries and technicians of LSR: Fr. Schmid, Fr. Werner, Fr. Renner, Hr. Jaschik, Hr. Gradl, Hr. Kubick, and Hr. Lowitz. Without your help, nothing would have been possible.
I would also like to thank all the students who worked with me during the past years, especially Jun, Ziqing, Ji, Yang, Lei, Qixun, Mingxiang, Licheng, Rubens, Hong, and Shuning, who contributed their hard work to the development of the systems as well as the experiments.
My final and most sincere thanks go to my family, and to the ones who always put their faith in me, no matter how remote they might be. Xièxiè Nǐmen.
Singapore, June 2010 Zheng Wang
To life...
Contents
1 Introduction 1
1.1 Problem definition . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1
1.2 Main contributions and structure of the dissertation . . . . . . . . . . . . . 2
2 Haptic rendering of arm dynamics 1: modeling and replay 4
2.1 Physicality and challenges to haptic rendering . . . . . . . . . . . . . . . . 4
2.2 A framework for human haptic modeling . . . . . . . . . . . . . . . . . . . 5
2.2.1 Handshake: a process-oriented human motor skill . . . . . . . . . . 5
2.2.2 Related works . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 6
2.2.3 A framework for human haptic modeling . . . . . . . . . . . . . . . 6
2.3 Developing a handshake robot with realistic arm behavior: overview . . . . 8
2.3.1 Pilot study . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 8
2.3.2 Robotic interfaces . . . . . . . . . . . . . . . . . . . . . . . . . . . . 9
2.4 Basic controllers . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 10
2.4.1 The first modeling iteration . . . . . . . . . . . . . . . . . . . . . . 11
2.4.2 The second modeling iteration . . . . . . . . . . . . . . . . . . . . . 13
2.4.3 The third modeling iteration . . . . . . . . . . . . . . . . . . . . . . 14
2.5 Conclusions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 16
3 Haptic rendering of arm dynamics 2: towards an interactive controller 18
3.1 Human behavior model . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18
3.2 Interactive controller . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 20
3.2.1 Fast online parameter estimation . . . . . . . . . . . . . . . . . . . 21
3.2.2 Symbol abstraction and HMM intention estimation . . . . . . . . . 26
3.2.3 Trajectory planning and parameter adaptation . . . . . . . . . . . . 27
3.2.4 Refined trajectory planning . . . . . . . . . . . . . . . . . . . . . . 29
3.3 Performance validation tests . . . . . . . . . . . . . . . . . . . . . . . . . . 31
3.3.1 HBP estimation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 31
3.3.2 HMM estimator . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 36
3.3.3 Overall system performance . . . . . . . . . . . . . . . . . . . . . . 37
3.4 Conclusions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 38
4 Haptic rendering of hand dynamics 43
4.1 Gesture data acquisition . . . . . . . . . . . . . . . . . . . . . . . . . . . . 43
4.1.1 The CyberGlove . . . . . . . . . . . . . . . . . . . . . . . . . . . . 44
4.1.2 Pisa glove . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 45
4.2 Haptic data acquisition . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 45
4.2.1 Tactile sensing technology . . . . . . . . . . . . . . . . . . . . . . . 46
4.2.2 Glove design and implementation . . . . . . . . . . . . . . . . . . . 47
4.2.3 Measuring handshakes with TSG gloves . . . . . . . . . . . . . . . . 52
4.3 Robotic hand actuation . . . . . . . . . . . . . . . . . . . . . . . . . . 57
4.3.1 BarrettHand . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 57
4.4 Conclusions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 58
5 Visual and sound rendering 63
5.1 Visual rendering . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 63
5.1.1 The rendering workflow . . . . . . . . . . . . . . . . . . . . . . . . 64
5.1.2 Virtual human characters . . . . . . . . . . . . . . . . . . . . . . . 66
5.1.3 Virtual hand model . . . . . . . . . . . . . . . . . . . . . . . . . . . 67
5.1.4 Virtual environments . . . . . . . . . . . . . . . . . . . . . . . . . . 68
5.1.5 Events . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 69
5.2 Auditory rendering . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 70
5.3 Conclusions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 71
6 System integration and optimization 74
6.1 User dependency . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 74
6.1.1 User dependency in haptic subsystem . . . . . . . . . . . . . . . . . 75
6.1.2 User dependency in vision subsystem . . . . . . . . . . . . . . . . . 77
6.2 Real/virtual world integration . . . . . . . . . . . . . . . . . . . . . . . . . 80
6.2.1 Incorporating real world data . . . . . . . . . . . . . . . . . . . . . 80
6.2.2 Registration of real world object . . . . . . . . . . . . . . . . . . . . 82
6.3 Integration of a second human input . . . . . . . . . . . . . . . . . . . . . 83
6.3.1 Problem definition . . . . . . . . . . . . . . . . . . . . . . . . . . . 84
6.3.2 Proposed remedy . . . . . . . . . . . . . . . . . . . . . . . . . . . . 84
6.4 Optimization . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 86
6.4.1 Improving natural haptic interaction . . . . . . . . . . . . . . . . . 86
6.4.2 Minimizing computational load . . . . . . . . . . . . . . . . . . . . 87
6.4.3 Time delay . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 87
6.5 Conclusions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 89
7 Experiments and evaluation studies 90
7.1 Plausibility and questionnaire . . . . . . . . . . . . . . . . . . . . . . . . . 90
7.2 Experiment 1: Robotic control algorithms comparison . . . . . . . . . . . . 91
7.2.1 Haptic r
