3D Human Detection and Tracking
on a Mobile Platform
for Situation Awareness

Niklas Beuter
A dissertation submitted to the Technische Fakultät of Bielefeld University
for the academic degree of
Doktor der Ingenieurwissenschaften (Dr.-Ing.)

submitted by Niklas Beuter on 6 July 2011
Reviewers:
Prof. Dr.-Ing. Franz Kummert
Prof. Dr. rer. nat. Christian Wöhler

Examination committee:
Prof. Dr. Barbara Hammer
Prof. Dr.-Ing. Franz Kummert
Prof. Dr. rer. nat. Christian Wöhler
Dr.-Ing. Hendrik Koesling
Printed on ageing-resistant paper according to ISO 9706.
A few words
Many people supported me in diverse ways while I was writing this thesis, and I
want to take a few words to thank them all. First of all, I want to thank my supervisors
Franz Kummert and Christian Wöhler, who always had an open door for my requests
and who took the time to support the work on my thesis.
The inspiring work with my colleagues at Bielefeld University raised many ideas,
which resulted in several publications. It was a great time with many opportunities and
fruitful collaborations. I want to thank all my colleagues for the warm atmosphere
and the close cooperation. I will miss the good and mostly funny discussions, in which
not only research was the focus.
Most notably I want to thank my wife, who backed me up wherever it was necessary
and who is the water of my fountain. Her happiness and her wonderful zest for
life are my inspiration and, accordingly, the most important part of my life in realising the
final goal of obtaining the PhD.
Abstract
The vision of robots supporting humans in daily life has encouraged research in the area
of mobile robots in the recent past. These robots are meant to share the same environment
and to deal with the same requirements as humans in order
to be able to assist them in their tasks. However, the dynamic and highly complex
human environment makes it necessary to implement algorithms which enable the
robot to cope with the arising requirements. Such algorithms are based on
sensing and interpreting the environment. This is where the present thesis contributes,
by providing a solid foundation for the robot's situation awareness. Situation awareness can
be divided into four categories, of which the first three sense and interpret the
environment, while the fourth projects the gathered knowledge into the future in order
to adapt the intended actions. The thesis provides a solution to the first three categories,
which implement the perceptual part of situation awareness.
The first category deals with sensing the environment. Here, a complete scene
analysis is proposed that builds an articulated scene model by observing the Vista space.
The model comprises different abstract scene parts, namely the static background,
movable objects such as chairs, and moving objects such as humans, which are all
revealed by a single model-building process. The second category deals with the temporal
linking of information, which is especially difficult on a moving platform. Besides
a map, the most important information for a mobile robot consists of the positions
and walking paths of the humans present. Knowing the dynamic movements of the humans,
the robot is able to calculate a safe path through the environment. Here, a dynamic human
detection and tracking system is introduced, which is able to create temporal links between
human occurrences even in the presence of challenging ego-motion and scene changes.
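The tracking system itself is developed in Chapter 4; as a minimal illustration of the temporal-linking idea, one might compensate the robot's ego-motion by transforming detections into a world frame and then associate them to existing tracks by greedy nearest-neighbour matching. The pose representation, the distance gate of 0.8 m, and the greedy strategy are illustrative assumptions, not the thesis's actual algorithm.

```python
import math

def to_world(detection, robot_pose):
    """Transform a detection from the robot frame into the world frame.
    robot_pose = (x, y, heading); detection = (x, y) in the robot frame."""
    rx, ry, th = robot_pose
    dx, dy = detection
    return (rx + dx * math.cos(th) - dy * math.sin(th),
            ry + dx * math.sin(th) + dy * math.cos(th))

def link_detections(tracks, detections, robot_pose, gate=0.8):
    """Greedy nearest-neighbour association: each track (id -> world position)
    is linked to the closest unassigned detection within `gate` metres."""
    world = [to_world(d, robot_pose) for d in detections]
    unassigned = set(range(len(world)))
    links = {}
    for tid, (tx, ty) in tracks.items():
        best, best_dist = None, gate
        for i in unassigned:
            wx, wy = world[i]
            dist = math.hypot(wx - tx, wy - ty)
            if dist < best_dist:
                best, best_dist = i, dist
        if best is not None:
            links[tid] = world[best]
            unassigned.discard(best)
    return links
```

Because matching happens in the world frame, a detection stays linked to its track even when the robot itself has moved between frames.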
The third category of situation awareness is implemented through a top-down visual
attention system, which directs the focus of attention onto desired objects, humans or
locations. As the main purpose of a mobile robot is the interaction with humans, it
is proposed to use a human model in combination with the top-down directed visual
search. The model represents the diverse appearance of the human more precisely. This
way, the robot is able to recover desired humans or interaction partners after they have
been absent for a short time.
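The attention system is detailed in Chapter 5; purely as a sketch of top-down visual search, one could score candidate image regions against a stored appearance model of the target and direct the focus to the best match. The intensity-histogram model, the four-bin quantisation, and the Bhattacharyya similarity used here are illustrative assumptions, not the thesis's actual human model.

```python
def histogram(pixels, bins=4):
    """Normalised intensity histogram of pixel values in [0, 1)."""
    h = [0.0] * bins
    for v in pixels:
        h[min(int(v * bins), bins - 1)] += 1.0
    total = sum(h) or 1.0
    return [x / total for x in h]

def bhattacharyya(p, q):
    """Bhattacharyya coefficient: 1.0 for identical distributions, 0.0 for disjoint."""
    return sum((a * b) ** 0.5 for a, b in zip(p, q))

def focus_of_attention(regions, target_model):
    """Return the name of the region whose histogram best matches the target model."""
    return max(regions,
               key=lambda name: bhattacharyya(histogram(regions[name]), target_model))
```

A target that was briefly out of view can thus be recovered by re-scoring all current candidate regions against its stored model instead of searching the image bottom-up.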
For each category, experiments are conducted to show the performance of the proposed
solutions. The results show that the implementation of each category provides solid and
stable information, which supports the robot in achieving a broad situation awareness.

Contents
1 Motivation 1
1.1 Scenario Description . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 4
1.2 Problem and Definition . . . . . . . . . . . . . . . . . . . . . . 5
1.3 Contribution of this Thesis . . . . . . . . . . . . . . . . . . . . . . . . . . . . 7
1.4 Overview . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 8
2 Visual Basis for Situation Awareness 9
2.1 Description of Sensor Set-ups . . . . . . . . . . . . . . . . . . . . . . . . . . 10
2.1.1 Monochrome Vision . . . . . . . . . . . . . . . . . . . . . . . . . . . 10
2.1.2 Stereo Vision . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 13
2.1.3 Multi-Camera Set-up . . . . . . . . . . . . . . . . . . . . . . . . . . . 16
2.1.4 Active Cameras for 3D Vision . . . . . . . . . . . . . . . . . . . . . . 16
2.1.5 Sensor Ensemble . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18
2.2 Object Detection . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 19
2.2.1 Object Detection ideal for Static Cameras . . . . . . . . . . . . . . . 20
2.2.2 Object Detection for Static and Moving Cameras . . . . . . . . . . 21
2.3 Visual Tracking . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 28
2.3.1 Feature Tracking . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 30
2.3.2 Multiple Feature Tracking . . . . . . . . . . . . . . . . . . . . . . . . 30
2.3.3 Template Tracking . . . . . . . . . . . . . . . . . . . . . . . . . . . . 31
2.3.4 Kernel Tracking . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 31
2.3.5 Tracking using Filters . . . . . . . . . . . . . . . . . . . . . . . . . . 31
2.3.6 Tracking by Multiple Models for Adaptive Estimation . . . . . . . 37
2.3.7 Tracking by Joint Probabilistic Data Association Filter . . . . . . . 37
2.4 Summary . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 38
3 Acquiring 3D Scene Models in Vista Spaces 39
3.1 Introduction in Scene Analysis . . . . . . . . . . . . . . . . . . . . . . . . . 42
3.2 Proposed System Overview . . . . . . . . . . . . . . . . . . . . . . . . . . . 44
3.3 Preprocessing of the Input Data . . . . . . . . . . . . . . . . . . . . . . . . . 47
3.4 3D Motion Computing using Optical Flow . . . . . . . . . . . . . . . . . . 47
3.5 Detection and Tracking of Dynamic Objects . . . . . . . . . . . . . . . . . . 48
3.6 Adaptive Background Modelling . . . . . . . . . . . . . . . . . . . . . . . . 53
3.7 Experiments and Results . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 54
3.8 Conclusion . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 58
4 A 3D Tracking System on a Moving Platform 59
4.1 Request of Static and Moving Cameras . . . . . . . . . . . . . . . . . . . . 61
4.2 Tracking Systems on a Mobile Platform . . . . . . . . . . . . . . . . . . . . 62
4.2.1 Laser-Based Tracking Systems . . . . . . . . . . . . . . . . . . . . . 62
4.2.2 Person Following Tracking Systems . . . . . . . . . . . . . . . . . . 64
4.2.3 Motion Detection Tracking Systems . . . . . . . . . . . . . . . . . . 65
4.2.4 Mobile Robot Tracking Systems . . . . . . . . . . . . . . . . . . . . 69
4.2.5 Vehicle Tracking Systems . . . . . . . . . . . . . . . . . . . . . . . . 71
4.3 A Modular Person Tracking System . . . . . . . . . . . . . . . . . . . . . . 73
4.3.1 System Overview . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 74
4.3.2 Integration on a Mobile Robot . . . . . . . . . . . . . . . . . 77
4.4 Object Detection Module . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 77
4.4.1 Pre-Detection through U-V-Disparity . . . . . . . . . . . . . . . . . 77
4.4.2 Detection Verification . . . . . . . . . . . . . . . . . . . . . . . . . . 82
4.5 Hypotheses Management . . . . . . . . . . . . . . . . . . . . . . . . . . . . 84
4.6 Tracking Module . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 85
4.7 Experiments and Results . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 88
4.7.1 Evaluated Datasets . . . . . . . . . . . . . . . . . . . . . . . . . . . . 88
4.7.2 Qualitative Results of the Proposed System Approach . . . . . . . 89
4.7.3 Quantitative Analysis of the Proposed System Approach . . . . . . 94
4.7.4 Enhancement through Pre-Detection . . . . . . . . . . . . . . . . . 98
4.7.5 Analysis of the Proposed Tracking Algorithm . . . . . . . . . . . . 99
4.7.6 Comparison with State-Of-The-Art Tracking Algorithms . . . . . . 100
4.7.7 Comparison with a Laser-Based Person Tracking . . . . . . . . . . 102
4.7.8 Parameter Analysis . . . . . . . . . . . . . . . . . . . . . . . . . . . . 104
4.8 Conclusion . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 105
5 Attention Focus for Situation Awareness 107
5.1 Attention Systems in Human-Robot-Interaction . . . . . . . . . . . . . . . 109
5.2 Directing the Attention Focus . . . . . . . . . . . . . . . . . . . . . . . . . . 111
5.2.1 Bottom-Up Saliency . . . . . . . . . . . . . . . . . . . . . . . . . . . 112
5.2.2 Top-Down Attention . . . . . . . . . . . . . . . . . . . . . . . . . . . 115
5.3 Weak Object Model . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 118
5.4 Experiments and Results . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 119
5.5 Conclusion . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 121
6 Conclusion and Outlook 123
A Appendix 127
B Appendix 131
Bibliography 145