Feature-Level Fusion of Laser Scanner and Video
Data for Advanced Driver Assistance Systems
DISSERTATION
for the attainment of the academic degree of
DOKTOR-INGENIEUR
(DR.-ING.)
at the Faculty of Engineering Sciences and Computer Science
of Universität Ulm
by
NICO KÄMPCHEN
from Hannover
First reviewer: Prof. Dr.-Ing. K. Dietmayer
Second reviewer: Prof. Dr.-Ing. J.-U. Varchmin
Acting Dean: Prof. Dr. rer. nat. Helmuth Partsch
Date of the oral examination: 29 June 2007
Acknowledgments
First, I would like to thank Professor Dr.-Ing. Klaus Dietmayer for his exceptional support throughout the years. In particular, I am grateful for the scientific freedom and encouragement he gave me, enabling me to follow my own ideas.

My gratitude also goes to Professor Dr.-Ing. Jörn-Uwe Varchmin, whose suggestions and critique proved invaluable to this thesis.

I am indebted to my colleague and good friend Kay Fürstenberg, who has been generous with his advice, expertise and experience. Throughout the years we shared the same office, our ideas and also the ups and downs of a PhD life.

I would like to thank the members of the ARGOS group for the many fruitful discussions and excellent team work. In particular, I should like to mention Stefan Wender, Thorsten Weiß, Mirko Mählisch, Daniel Streller, Jan Sparbert and Holger Berndt. Special thanks also go to the students involved in the ARGOS project for their priceless contributions. These are: Matthias Bühler, Tobias Bühler, Jörg Kibbel, Ulrich Plöckl, Michael Schäfer, Michael Schönherr, Bruno Schiele, Alexander Skibicki, Andreas Wimmer and Markus Zocholl.

I would like to acknowledge Franz Degenhard, Martin Nieß, Oliver Betz and Thomas Löffler for their excellent work on maintaining the test vehicles, constructing the experimental setups and helping with the collection of data.

My gratitude also goes to the members of the company IBEO Automobile Sensor GmbH for the intensive and fruitful cooperation.

Special thanks go to my parents and brother for their numerous ways of encouragement. With "Gute Laune Drops", postcards, poems, phone calls and by many other means, they lightened up especially the last month of intense writing up.

Finally, I would like to thank my wife Katherine for her constant support, encouragement and love. She and our son Frederik brought me happiness and were a refreshing source of life outside the PhD.
"The important thing is not to stop questioning. Curiosity has its own reason for existing. One cannot help but be in awe when he contemplates the mysteries of eternity, of life, of the marvelous structure of reality. It is enough if one tries merely to comprehend a little of this mystery every day. Never lose a holy curiosity."

Albert Einstein
Contents

Notation

1 Introduction

2 State of the Art and Motivation
  2.1 Advanced Driver Assistance Systems
  2.2 Sensors
    2.2.1 Laser Scanner Data based Environment Perception
    2.2.2 Video Data based Environment Perception
  2.3 Sensor Data Fusion
  2.4 State Estimation
  2.5 Summary

3 Sensor Calibration
  3.1 Sensor and Vehicle Coordinate Systems
  3.2 Calibration of a Multi-Layer Laser Scanner
    3.2.1 Estimation of Pitch and Roll Angle
    3.2.2 Estimation of the Yaw Angle
  3.3 Calibration of a Camera
  3.4 Worst Case Error Analysis
    3.4.1 Erroneous Measurements and Calibration Parameters
    3.4.2 Worst Case Errors of the Estimated Angles
    3.4.3 Worst Case Projection Errors
  3.5 Experiments
  3.6 Discussion

4 Sensor Data Fusion
  4.1 Low-Level Fusion
    4.1.1 Tracking
    4.1.2 Classification
  4.2 High-Level Fusion
    4.2.1 Tracking
    4.2.2 Classification
  4.3 Feature-Level Fusion
    4.3.1 System Architecture
    4.3.2 Cuboidal Object Model
    4.3.3 Sensor Specific Modules
    4.3.4 Kalman Filter based State Estimation and Data Fusion
    4.3.5 Classification
    4.3.6 Object Management
    4.3.7 Conclusion
  4.4 Alternative Fusion Architectures
  4.5 System Latency
    4.5.1 Object Detection Latency
    4.5.2 State Estimation Latency
  4.6 Discussion

5 Laser Scanner Data based Feature Extraction and Association
  5.1 The Multi-Layer Laser Scanner
  5.2 Preprocessing
    5.2.1 Reflector Classification
    5.2.2 Ground Detection
    5.2.3 Segmentation
    5.2.4 Occlusion Detection
  5.3 Feature Model and Feature Measurement Equation
    5.3.1 Contour Model Features
    5.3.2 Displacement Feature
  5.4 Feature Extraction
    5.4.1 Matching of Contour Models
    5.4.2 Exploiting the Echo Pulse Width Information
    5.4.3 Estimation of Object Displacement in Consecutive Scans
  5.5 Association
    5.5.1 Gating
    5.5.2 Segment to Track Association
    5.5.3 Shape Hypothesis Selection
  5.6 Object Generation
  5.7 Discussion

6 Video Data based Feature Extraction
  6.1 Feature Model and Measurement Equation
  6.2 Classical Model based Feature Extraction Algorithms
    6.2.1 Edge Features
    6.2.2 Matching of a Deformable Synthetic Template
  6.3 Kalman Filter based Template Tracking
    6.3.1 Feature Extraction
    6.3.2 Augmented State Representation
    6.3.3 Feature Measurement Equation
  6.4 Discussion

7 Parameter Identification
  7.1 Experimental Platform
  7.2 Identification of the Measurement Error Probability Distributions
    7.2.1 Laser Scanner — Azimuth
    7.2.2 Laser Scanner — Distance
    7.2.3 Laser Scanner — Position
    7.2.4 Laser Scanner — Width and Length
    7.2.5 Laser Scanner — Orientation
    7.2.6 Laser Scanner — Displacement
    7.2.7 Camera — Template Tracking
  7.3 Discussion

8 Evaluation
  8.1 Motion Model Parameters
  8.2 Center of Gravity Tracking
  8.3 Performance Measures for Environment Descriptions
    8.3.1 Error Measures
    8.3.2 Filter Consistency
  8.4 Evaluation of the Environment Description for Passenger Cars
    8.4.1 Experimental Setup
    8.4.2 Position Estimate
    8.4.3 Velocity Estimate
    8.4.4 Velocity Estimation Error for Stationary Roadside Objects
    8.4.5 Detection Statistics
  8.5 Evaluation of the Environment Description for Trucks
    8.5.1 Stationary Truck
    8.5.2 Moving Truck
    8.5.3 Detection Rate and Precision
  8.6 Discussion

9 Interacting Multiple Model Filter
  9.1 Dynamic Models
  9.2 The Interacting Multiple Model Algorithm
    9.2.1 Interaction
    9.2.2 Model Specific Filtering
    9.2.3 Model Probability Update
    9.2.4 Combination
  9.3 Model Set Design
    9.3.1 Choice of Dynamic Models
    9.3.2 Choice of Markov Matrix
  9.4 Evaluation of Traffic Jam Situations
    9.4.1 Model Set
    9.4.2 Experimental Setup
    9.4.3 Results for Strong Acceleration Changes
    9.4.4 Results for Traffic Jam Situations
  9.5 Evaluation of Intersection Scenarios
    9.5.1 Model Set
    9.5.2 Experimental Setup
    9.5.3 Results for Strong Acceleration Changes
    9.5.4 Results for Turning Situations
    9.5.5 Results for Rapid Steering Maneuvers
  9.6 Discussion

10 Situation Assessment for an Emergency Brake
  10.1 Kamm's Circle and Resulting Trajectories
  10.2 Collision Prediction by Kopischke
  10.3 Situation Assessment
    10.3.1 Collision Prediction Algorithm
    10.3.2 Collision Index
    10.3.3 Matrix of Collision Indices
    10.3.4 Example of a Rear End Collision
    10.3.5 Example of a Collision at an Intersection
    10.3.6 Search for the Minimal Collision Index
    10.3.7 Stationary Obstacles
  10.4 Evaluation
    10.4.1 Rear End Collision
    10.4.2 Negative Tests
  10.5 Discussion

11 Conclusions

A Proofs and Derivations
  A.1 Existence of Maximally One Solution
  A.2 Argument under the Root
  A.3 Calculation of the Sub-pixel Correction Vector
  A.4 Calculation of the Reference Position

Bibliography

Supervised Student Research Projects and Diploma Theses

Publications and Patent Applications
Notation
Abbreviations
2D Two-dimensional
3D Three-dimensional
ABS Antilock Brake System
ACC Adaptive/Active Cruise Control
ACC S&G Adaptive/Active Cruise Control Stop and Go
ANIS Average normalized innovation squared
CA Collision Avoidance
CA Constant acceleration
CAD Computer-aided design
CCD Charge coupled device
CMOS Complementary metal oxide semiconductor
COG Center of gravity
CPU Central processing unit
CT Cuboid tracking
CT+TT Cuboid tracking with template tracking
CV Constant velocity
DC Driving corridor
DIN Deutsche Industrie-Normenausschuss (engl. German Industrial Standards Authority)
DOF Degrees of freedom
DSC Dynamic Stability Control
DT Displacement tracking
EB Emergency Brake
EKF Extended Kalman filter
EPW Echo pulse width
ESP Electronic Stability Program
FMCW Frequency-modulated continuous wave
FOV Field of view
FSK Frequency shift keying
F1 Filter 1 of the IMM
F2 Filter 2 of the IMM
F3 Filter 3 of the IMM
GPS Global Positioning System
HC Heading Control
IBA Intelligent Brake Assistant
ICP Iterative closest point
IMM Interacting multiple model
LCA Lane Change Assistant
LCW Lane Change Warning
LDW Lane Departure Warning
Lidar LIght Detection And Ranging
NEES Normalized (state) estimation error squared
NIS Normalized innovation squared
Npp Normal probability plot
PA Parking Assistant
Pdf Probability density function
Pel Picture element
Radar RAdio Detection And Ranging
RMS Root mean square
ROI Region of interest
S Stationary
SKF Single Kalman filter
SPKF Sigma-point Kalman filter
UKF Unscented Kalman filter
WICP Weighted iterative closest point
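
Several of the abbreviations above (NEES, NIS and ANIS) denote standard consistency statistics for Kalman-filter-based tracking. As a point of reference only, and not quoted from the thesis body, their usual textbook definitions are:

\[
\mathrm{NEES}_k = (\mathbf{x}_k - \hat{\mathbf{x}}_{k|k})'\,\mathbf{P}_{k|k}^{-1}\,(\mathbf{x}_k - \hat{\mathbf{x}}_{k|k}),
\qquad
\mathrm{NIS}_k = \boldsymbol{\nu}_k'\,\mathbf{S}_k^{-1}\,\boldsymbol{\nu}_k,
\qquad
\mathrm{ANIS} = \frac{1}{K}\sum_{k=1}^{K} \boldsymbol{\nu}_k'\,\mathbf{S}_k^{-1}\,\boldsymbol{\nu}_k,
\]

where \(\mathbf{x}_k\) is the true state, \(\hat{\mathbf{x}}_{k|k}\) and \(\mathbf{P}_{k|k}\) are the state estimate and its covariance, \(\boldsymbol{\nu}_k\) is the innovation with covariance \(\mathbf{S}_k\), and \(K\) is the number of time steps averaged over.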
General Notations
a Scalar
a Vector
A Matrix
A′ Transpose of matrix A
A⁻¹ Inverse of matrix A
I Identity matrix
Latin Symbols
d Distance
d̄ Residual distance of the ICP algorithm
d² Mahalanobis distance
d_u Width of the object in the image
d_v Height of the object in the image
d_x Length of the object
d_y Width of the object
d_z Height of the object
f(.) State transition function
F State transition matrix
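
To illustrate how the state transition function f(.) and the state transition matrix F fit together, the following is a minimal sketch only, assuming a linear constant-velocity (CV) model with a planar state \(\mathbf{x} = (x,\, y,\, v_x,\, v_y)'\) and sampling period \(T\); both of these choices are illustrative and are not taken from the notation above:

\[
\mathbf{x}_{k+1} = f(\mathbf{x}_k) = \mathbf{F}\,\mathbf{x}_k,
\qquad
\mathbf{F} =
\begin{pmatrix}
1 & 0 & T & 0 \\
0 & 1 & 0 & T \\
0 & 0 & 1 & 0 \\
0 & 0 & 0 & 1
\end{pmatrix}.
\]

More elaborate models, such as the constant-acceleration (CA) model listed among the abbreviations, extend the state vector and the matrix F accordingly.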
