
  • En vue de l'obtention du

    DOCTORAT DE L'UNIVERSITÉ DE TOULOUSE

    Délivré par : Institut National des Sciences Appliquées de Toulouse (INSA Toulouse)

    Discipline ou spécialité : Robotique

    Présentée et soutenue par : Naveed MUHAMMAD

    Titre : Contributions to the use of 3D lidars for autonomous navigation: calibration and qualitative localization

    École doctorale : Systèmes (EDSYS)

    Unité de recherche : LAAS-CNRS

    Directeur(s) de Thèse : Simon LACROIX

    Rapporteurs : David FOFI, Fawzi NASHASHIBI

    le : mercredi 1 février 2012

    Membre(s) du jury : Rachid ALAMI, Roland CHAPUIS, Paul CHAVENT

  • Abstract

    0.1 Abstract in English

    In order to autonomously navigate in an environment, a robot has to perceive its environment correctly. Rich perception information from the environment enables the robot to perform tasks like avoiding obstacles, building terrain maps, and localizing itself. Classically, outdoor robots have perceived their environment using vision or 2D lidar sensors. The introduction of novel 3D lidar sensors such as the Velodyne device has enabled robots to rapidly acquire rich 3D data about their surroundings. These novel sensors call for the development of techniques that efficiently exploit their capabilities for autonomous navigation.

    The first part of this thesis presents a technique for the calibration of 3D lidar devices. The calibration technique is based on the comparison of acquired 3D lidar data to a ground-truth model in order to estimate the optimal values of the calibration parameters. The second part of the thesis presents a technique for qualitative localization and loop-closure detection for autonomous mobile robots, by extracting and indexing small-sized signatures from 3D lidar data. The signatures are based on histograms of local surface-normal information that is efficiently extracted from the lidar data. Experimental results illustrate the developments throughout the manuscript.
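    The calibration idea summarized above, adjusting sensor parameters so that reconstructed points best fit a known ground-truth model, can be illustrated with a toy sketch. This is not the thesis's implementation: the single beam, the planar floor, the additive range offset and all names below are simplifying assumptions, and a real Velodyne model has many more parameters per laser.

```python
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(0)

# Toy setup: one lidar beam with a fixed downward elevation angle observes
# a flat floor (the plane z = 0) from a sensor mounted 1.5 m above it.
HEIGHT = 1.5
elev = np.deg2rad(-20.0)                    # beam elevation (assumed known)
true_range = HEIGHT / np.sin(-elev)         # range at which the beam hits z = 0

# Simulated measurements corrupted by an unknown additive range offset,
# the single calibration parameter we try to recover.
TRUE_OFFSET = 0.05                          # metres
n_points = 90                               # returns over one revolution
meas = true_range + TRUE_OFFSET + rng.normal(0.0, 0.002, n_points)

def cost(offset):
    """Sum of squared point-to-plane distances after applying `offset`."""
    corrected = meas - offset
    z = HEIGHT + corrected * np.sin(elev)   # height of each reconstructed point
    return float(np.sum(z ** 2))

# Optimize the calibration parameter against the ground-truth model.
result = minimize_scalar(cost, bounds=(-0.2, 0.2), method="bounded")
estimated_offset = result.x
print(f"estimated offset: {estimated_offset:.3f} m")
```

    The thesis's method follows the same pattern, with a full geometric model per laser and a dedicated calibration environment in place of this single plane.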

    Keywords: Autonomous mobile robot, Autonomous navigation, 3D Lidar
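    The signature idea, summarizing a 3D scan by a histogram of local surface-normal directions and comparing scans by a histogram distance, can be sketched as follows. This is a simplified illustration, not the thesis's pipeline: normals here come from a brute-force PCA over nearest neighbors rather than from the lidar's beam layout, and all function names are hypothetical.

```python
import numpy as np

def estimate_normals(points, k=8):
    """Per-point surface normal: smallest-eigenvalue eigenvector of the
    covariance of the k nearest neighbors (brute-force PCA)."""
    normals = np.empty_like(points)
    for i, p in enumerate(points):
        dists = np.linalg.norm(points - p, axis=1)
        neighbors = points[np.argsort(dists)[:k]]
        eigvals, eigvecs = np.linalg.eigh(np.cov(neighbors.T))
        normals[i] = eigvecs[:, 0]          # direction of least variance
    return normals

def signature(points, bins=9):
    """Global scan signature: normalized histogram of |n_z|, the absolute
    vertical component of each estimated unit normal."""
    elevation = np.abs(estimate_normals(points)[:, 2])  # 1 for floors, 0 for walls
    hist, _ = np.histogram(elevation, bins=bins, range=(0.0, 1.0))
    return hist / hist.sum()

def signature_distance(s1, s2):
    """Total-variation distance between two signatures, in [0, 1]."""
    return 0.5 * float(np.abs(s1 - s2).sum())

rng = np.random.default_rng(1)
# Two toy "scans": a horizontal floor patch and a vertical wall patch.
floor = np.column_stack([rng.uniform(0, 1, (200, 2)), np.zeros(200)])
wall = np.column_stack([rng.uniform(0, 1, 200), np.zeros(200), rng.uniform(0, 1, 200)])

d_same = signature_distance(signature(floor), signature(floor))
d_diff = signature_distance(signature(floor), signature(wall))
print(f"same scene: {d_same:.2f}, different scene: {d_diff:.2f}")
```

    In the thesis, normals are instead extracted efficiently by exploiting the arrangement of the laser beams in the Velodyne device, and the small signatures are indexed so that loop-closure candidates can be retrieved quickly.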

  • 0.2 Résumé court en français

    Afin de permettre une navigation autonome d'un robot dans un environnement, le robot doit être capable de percevoir son environnement. Dans la littérature, d'une manière générale, les robots perçoivent leur environnement en utilisant des capteurs de type sonars, caméras et lidar 2D. L'introduction de nouveaux capteurs, nommés lidar 3D, tels que le Velodyne HDL-64E S2, a permis aux robots d'acquérir plus rapidement des données 3D à partir de leur environnement.

    La première partie de cette thèse présente une technique pour le calibrage des capteurs lidar 3D. La technique est basée sur la comparaison des données lidar à un modèle de vérité de terrain afin d'estimer les valeurs optimales des paramètres de calibrage. La deuxième partie de la thèse présente une technique pour la localisation et la détection de fermeture de boucles pour les robots autonomes. La technique est basée sur l'extraction et l'indexation de signatures de petite taille à partir de données lidar 3D. Les signatures sont basées sur les histogrammes de l'information de normales de surfaces locales extraite à partir des données lidar en exploitant la disposition des faisceaux laser dans le dispositif lidar.

    Mots Clés : Robotique mobile autonome, Navigation autonome, Lidar 3D

  • To my parents, Maqbool and Shabana

  • Acknowledgements

    I am very thankful to my advisor Dr. Simon Lacroix for his guidance, ideas, encouragement and patience throughout the course of my Ph.D. He was always there to listen to me and help me with any issues, technical or otherwise.

    I'm also very thankful to the RIA group at LAAS for providing me the opportunity to carry out my Ph.D. research here. I'd like to thank DGA and ANR for funding my research under the Action and 2RT3D projects respectively. I'd like to thank Matthieu for providing his technical help, whenever I needed it, for the robots and computing facilities at LAAS. Thanks to Natacha, Camille and Helene for always helping me with administrative things at the lab.

    I'm very grateful to my friends from LAAS: Ali, Lavi, Assia, Aamir, Umer, Amit, Red, Layale, Cyril, Gil, Bob, David, Diego, Matthieu, Ibrahim, Alhayat, Ela, Antonio, Hayat, Wassima, Xavier, Hung, Bach Van, Jean-Marie, Arnaud, Aurelien, Wassim, and Sakku for their help, care and more importantly the motivation and encouragement throughout my stay in Toulouse. I'm also very grateful to my friends Sofie, Matthieu, Solene, Merle, Bushra, Celine, Emma, Humaira, Umer, Saqlain, Saif, Tauseef, Kaleem, Nadeem, Onder, Gulfam, Sara, Tusawar, Hussnain, Saima, Ayesha, Ousama, Anne, Julie, Owais, Rosanne, Saad, and Tameez for their care, help, encouragement and the tasty dinners.

    Finally a special thanks to my parents, my brothers and sister, my cousins, uncles and aunts for always being there for me, for their support, care, encouragement, advice, prayers and love.


  • Contents

    0.1 Abstract in English
    0.2 Résumé court en français

    List of Figures
    List of Tables

    1 Introduction
    1.1 Context
    1.1.1 Environment perception for outdoor mobile robots
    1.1.2 Sensors
    1.2 Main contributions
    1.3 Thesis structure

    I Lidar Calibration

    2 Lidar in Robotics
    2.1 Lidar devices used in robotics
    2.1.1 Principle of operation
    2.1.2 Lidar devices used in robotics
    2.2 Lidar Calibration
    2.3 The Velodyne HDL-64E S2
    2.3.1 Geometric model
    2.3.2 Sensor behaviour and characteristics
    2.4 Velodyne lidar applications in robotics
    2.5 Significance of lidar in robotics

    3 Velodyne lidar calibration
    3.1 Methodology
    3.1.1 Sensor modeling
    3.1.2 Calibration environment
    3.1.3 Data segmentation
    3.1.4 Optimization objective function
    3.1.5 Optimization
    3.2 Implementation
    3.2.1 Geometric model
    3.2.2 Calibration environment
    3.2.3 Objective/Cost function
    3.2.4 Suitability Analysis
    3.2.5 Optimization
    3.3 Results
    3.3.1 Recalibrating a subset of lasers
    3.3.2 Recalibrating all 64 lasers
    3.4 Conclusion

    II Qualitative Localization

    4 The localization problem
    4.1 Importance of localization, Why localize?
    4.2 Solutions to localization
    4.2.1 Dead reckoning
    4.2.2 Simultaneous localization and mapping
    4.2.3 Absolute localization
    4.2.4 Choice of a localization method
    4.3 View-based localization
    4.3.1 Using global signatures
    4.3.2 Using local signatures
    4.4 Conclusion

    5 View-based localization using 3D lidar
    5.1 Global signatures for 3D lidar data
    5.1.1 Local vs. global signatures
    5.1.2 Surface normal extraction
    5.1.3 Signature definition
    5.1.4 Comparing the signatures
