Feature level fusion of laser scanner and video data for advanced driver assistance systems [Elektronische Ressource] / von Nico Kämpchen
247 pages
English




Information

Published on 1 January 2007
Language: English
File size: 5 MB

Excerpt

Feature-Level Fusion of Laser Scanner and Video
Data for Advanced Driver Assistance Systems
DISSERTATION
zur Erlangung des akademischen Grades eines
DOKTOR-INGENIEURS
(DR.-ING.)
der Fakultät für Ingenieurwissenschaften
und Informatik der Universität Ulm
von
NICO KÄMPCHEN
aus Hannover
1. Gutachter: Prof. Dr.-Ing. K. Dietmayer
2. Gutachter: Prof. Dr.-Ing. J.-U. Varchmin
Amtierender Dekan: Prof. Dr. rer. nat. Helmuth Partsch
Datum der mündlichen Prüfung: 29. Juni 2007
Acknowledgments
First, I would like to thank Professor Dr.-Ing. Klaus Dietmayer for his exceptional
support throughout the years. In particular, I am grateful for the scientific freedom
and encouragement he gave me, enabling me to follow my own ideas.
My gratitude also goes to Professor Dr.-Ing. Jörn-Uwe Varchmin, whose suggestions
and critique proved invaluable to this thesis.
I am indebted to my colleague and good friend Kay Fürstenberg, who has been generous
with his advice, expertise and experience. Throughout the years we shared the same
office, our ideas and also the ups and downs of a PhD life.
I would like to thank the members of the ARGOS group for the many fruitful discussions
and excellent teamwork. In particular, I should like to mention Stefan Wender,
Thorsten Weiß, Mirko Mählisch, Daniel Streller, Jan Sparbert and Holger Berndt.
Special thanks also go to the students involved in the ARGOS project for their priceless
contributions. These are: Matthias Bühler, Tobias Bühler, Jörg Kibbel, Ulrich Plöckl,
Michael Schäfer, Michael Schönherr, Bruno Schiele, Alexander Skibicki, Andreas Wimmer
and Markus Zocholl.
I would like to acknowledge Franz Degenhard, Martin Nieß, Oliver Betz and Thomas
Löffler for their excellent work on maintaining the test vehicles, constructing the
experimental setups and helping with the collection of data.
My gratitude also goes to the members of the company IBEO Automobile Sensor GmbH
for the intensive and fruitful cooperation.
Special thanks go to my parents and brother for their numerous ways of encouragement.
With "Gute Laune Drops", postcards, poems, phone calls and by many other means,
they lightened up especially the last months of intense writing up.
Finally, I would like to thank my wife Katherine for her constant support,
encouragement and love. She and our son Frederik brought me happiness and were a
refreshing source of life outside the PhD.
"The important thing is not to stop questioning. Curiosity has its own reason
for existing. One cannot help but be in awe when he contemplates the mysteries
of eternity, of life, of the marvelous structure of reality. It is enough if one
tries merely to comprehend a little of this mystery every day. Never lose a
holy curiosity."
Albert Einstein
Contents
Notation ix
1 Introduction 1
2 State of the Art and Motivation 5
2.1 Advanced Driver Assistance Systems . . . . . . . . . . . . . . . . . . . . . 5
2.2 Sensors . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 9
2.2.1 Laser Scanner Data based Environment Perception . . . . . . . . . 10
2.2.2 Video Data based Environment Perception . . . . . . . . . . . . . . 12
2.3 Sensor Data Fusion . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 13
2.4 State Estimation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 15
2.5 Summary . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 17
3 Sensor Calibration 19
3.1 Sensor and Vehicle Coordinate Systems . . . . . . . . . . . . . . . . . . . . 20
3.2 Calibration of a Multi-Layer Laser Scanner . . . . . . . . . . . . . . . . . . 21
3.2.1 Estimation of Pitch and Roll Angle . . . . . . . . . . . . . . . . . . 21
3.2.2 Estimation of the Yaw Angle . . . . . . . . . . . . . . . . . . . . . 24
3.3 Calibration of a Camera . . . . . . . . . . . . . . . . . . . . . . . . . . . . 26
3.4 Worst Case Error Analysis . . . . . . . . . . . . . . . . . . . . . . . . . . . 29
3.4.1 Erroneous Measurements and Calibration Parameters . . . . . . . . 30
3.4.2 Worst Case Errors of the Estimated Angles . . . . . . . . . . . . . 31
3.4.3 Worst Case Projection Errors . . . . . . . . . . . . . . . . . . . . . 32
3.5 Experiments . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 34
3.6 Discussion . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 35
4 Sensor Data Fusion 37
4.1 Low-Level Fusion . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 37
4.1.1 Tracking . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 38
4.1.2 Classification . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 39
4.2 High-Level Fusion . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 39
4.2.1 Tracking . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 39
4.2.2 Classification . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 40
4.3 Feature-Level Fusion . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 40
4.3.1 System Architecture . . . . . . . . . . . . . . . . . . . . . . . . . . 40
4.3.2 Cuboidal Object Model . . . . . . . . . . . . . . . . . . . . . . . 42
4.3.3 Sensor Specific Modules . . . . . . . . . . . . . . . . . . . . . . . 43
4.3.4 Kalman Filter based State Estimation and Data Fusion . . . . . . . 43
4.3.5 Classification . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 52
4.3.6 Object Management . . . . . . . . . . . . . . . . . . . . . . . . . . 53
4.3.7 Conclusion . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 56
4.4 Alternative Fusion Architectures . . . . . . . . . . . . . . . . . . . . . . . . 57
4.5 System Latency . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 57
4.5.1 Object Detection Latency . . . . . . . . . . . . . . . . . . . . . . . 57
4.5.2 State Estimation Latency . . . . . . . . . . . . . . . . . . . . . . . 58
4.6 Discussion . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 59
5 Laser Scanner Data based Feature Extraction and Association 61
5.1 The Multi-Layer Laser Scanner . . . . . . . . . . . . . . . . . . . . . . . . 62
5.2 Preprocessing . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 63
5.2.1 Reflector Classification . . . . . . . . . . . . . . . . . . . . . . . . . 63
5.2.2 Ground Detection . . . . . . . . . . . . . . . . . . . . . . . . . . . . 65
5.2.3 Segmentation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 67
5.2.4 Occlusion Detection . . . . . . . . . . . . . . . . . . . . . . . . . . 68
5.3 Feature Model and Feature Measurement Equation . . . . . . . . . . . . . 68
5.3.1 Contour Model Features . . . . . . . . . . . . . . . . . . . . . . . . 68
5.3.2 Displacement Feature . . . . . . . . . . . . . . . . . . . . . . . . . . 70
5.4 Feature Extraction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 70
5.4.1 Matching of Contour Models . . . . . . . . . . . . . . . . . . . . . . 70
5.4.2 Exploiting the Echo Pulse Width Information . . . . . . . . . . . . 75
5.4.3 Estimation of Object Displacement in Consecutive Scans . . . . . . 81
5.5 Association . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 84
5.5.1 Gating . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 85
5.5.2 Segment to Track Association . . . . . . . . . . . . . . . . . . . . . 85
5.5.3 Shape Hypothesis Selection . . . . . . . . . . . . . . . . . . . . . . 85
5.6 Object Generation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 85
5.7 Discussion . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 87
6 Video Data based Feature Extraction 89
6.1 Feature Model and Measurement Equation . . . . . . . . . . . . . . . . . . 91
6.2 Classical Model based Feature Extraction Algorithms . . . . . . . . . . . . 92
6.2.1 Edge Features . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 93
6.2.2 Matching of a Deformable Synthetic Template . . . . . . . . . . . . 94
6.3 Kalman Filter based Template Tracking . . . . . . . . . . . . . . . . . . . 96
6.3.1 Feature Extraction . . . . . . . . . . . . . . . . . . . . . . . . . . . 96
6.3.2 Augmented State Representation . . . . . . . . . . . . . . . . . . . 100
6.3.3 Feature Measurement Equation . . . . . . . . . . . . . . . . . . . . 101
6.4 Discussion . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 101
7 Parameter Identification 103
7.1 Experimental Platform . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 103
7.2 Identification of the Measurement Error Probability Distributions . . . . . 103
7.2.1 Laser Scanner — Azimuth . . . . . . . . . . . . . . . . . . . . . . 103
7.2.2 Laser Scanner — Distance . . . . . . . . . . . . . . . . . . . . . . . 113
7.2.3 Laser Scanner — Position . . . . . . . . . . . . . . . . . . . . . . . 113
7.2.4 Laser Scanner — Width and Length . . . . . . . . . . . . . . . . . 114
7.2.5 Laser Scanner — Orientation . . . . . . . . . . . . . . . . . . . . . 117
7.2.6 Laser Scanner — Displacement . . . . . . . . . . . . . . . . . . . . 121
7.2.7 Camera — Template Tracking . . . . . . . . . . . . . . . . . . . . . 121
7.3 Discussion . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 124
8 Evaluation 125
8.1 Motion Model Pa
