Korea Autonomous Vehicle Contest 2013

1. Purpose of the Contest 


The purpose of the contest is to implement autonomous driving technology using a limited set of sensors. The self-driving vehicles in the competition were evaluated on perception of the surroundings, localization, situational judgment, and path planning & tracking, all with relatively low-priced sensors. Our team, named 'Smart Mobility', entered this contest to pave the way for our autonomous vehicle research, and we have continued that research since the contest.

2. Results of the Contest


We achieved satisfactory results in the contest despite a short preparation period. Spirit-1, built by the Smart Mobility team, did not succeed in every mission, but it finished the full course without an emergency stop.

  • Completed the full course
  • 5th overall
  • Lap time: 9 min 11 sec
  • Results of each mission:
    1. Mission 1 (recognition of a traffic light signal): Failure
    2. Mission 2 (recognition of a traffic light direction): Success
    3. Mission 3 (avoidance of a falling obstacle): Success
    4. Mission 4 (recognition of a speed limit sign and control of the vehicle speed): Success
    5. Mission 5 (recognition of a pedestrian and stopping in the stop zone): Failure
    6. Mission 6 (recognition of vehicles and path planning for avoidance): Success
    7. Mission 7 (recognition and avoidance of complex obstacles): Partial success (failed to detect tires)
    8. Mission 8 (recognition of an under-construction sign and path planning for avoidance): Success
    9. Mission 9 (recognition of narrow road lines and lane keeping): Failure
    10. Mission 10-1 (recognition of a crossroad and stopping at the stop line): Success
    11. Mission 10-2 (recognition of a moving vehicle at the intersection): Success

3. Technical Information

1) Sensors Allowed in the Contest:

  • GPS (1 EA): B20 (CHC)
  • LiDAR (2 EA): LMS511 (SICK)
  • Stereo Camera (1 EA): VSTS-P260 (VisionST) – Optional
  • Single Camera (1 EA): LDWS camera (MCNEX) – Optional

[Photos: GPS receiver, LMS511 LiDAR, stereo camera, single camera]

2) Perception of Surroundings:

Two kinds of sensors perceive the surroundings of the autonomous vehicle 'Spirit-1'. The first is the LiDAR (laser scanner), which was used in the contest to detect obstacles such as walls, vehicles, and pedestrians. We applied sequential segmentation to the raw LiDAR data, and then a bounding-box algorithm to detect objects within the sensing range. The second is the vision sensor (camera), which was used to recognize traffic signs and traffic lights. We used color-based algorithms to detect traffic lights and learning-based algorithms to detect traffic signs; after detection, learning-based classifiers determined the contents of each object.
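The sequential segmentation step can be sketched as follows. This is a minimal illustration, not the team's exact implementation: consecutive scan points are grouped into one segment until the Euclidean gap between neighbors exceeds a distance threshold (the 0.5 m value here is a hypothetical choice), and each segment then gets an axis-aligned bounding box.

```python
import math

def sequential_segmentation(ranges, angle_step, threshold=0.5):
    """Split one LiDAR scan into segments: a gap larger than
    `threshold` (meters) between consecutive points starts a new segment."""
    points = [
        (r * math.cos(i * angle_step), r * math.sin(i * angle_step))
        for i, r in enumerate(ranges)
    ]
    segments = [[points[0]]]
    for prev, cur in zip(points, points[1:]):
        if math.dist(prev, cur) > threshold:
            segments.append([cur])      # gap too large: start a new object
        else:
            segments[-1].append(cur)    # same object: extend current segment
    return segments

def bounding_box(segment):
    """Axis-aligned bounding box (min_x, min_y, max_x, max_y) of a segment."""
    xs = [p[0] for p in segment]
    ys = [p[1] for p in segment]
    return (min(xs), min(ys), max(xs), max(ys))
```

A nearby wall and a distant vehicle thus fall into separate segments in a single pass over the scan, which is what makes this approach cheap enough to run at scan rate.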

3) Localization:

The localization system of our autonomous vehicle was integrated using Kalman filtering. There are two ways to obtain the vehicle position: GPS, and dead reckoning from the vehicle's velocity and steering angle. Both sources contain errors, yet stable position data is necessary for successful operation, so a Kalman filter fuses the dead-reckoning prediction with the GPS measurement to produce a stable position estimate.
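The fusion described above can be sketched with a scalar Kalman filter per axis (a simplified stand-in for the team's filter; the noise values are hypothetical): dead reckoning drives the predict step, and the GPS position drives the update step.

```python
class Kalman1D:
    """Scalar Kalman filter: predict with dead-reckoning velocity,
    correct with a noisy GPS position measurement."""
    def __init__(self, x0, p0, q, r):
        self.x = x0   # position estimate
        self.p = p0   # estimate variance
        self.q = q    # process noise (dead-reckoning drift per step)
        self.r = r    # measurement noise (GPS error variance)

    def predict(self, velocity, dt):
        self.x += velocity * dt        # dead-reckoning motion model
        self.p += self.q               # uncertainty grows without a fix

    def update(self, gps_position):
        k = self.p / (self.p + self.r)            # Kalman gain
        self.x += k * (gps_position - self.x)     # blend GPS with prediction
        self.p *= (1.0 - k)                       # uncertainty shrinks
```

The estimate stays smooth between GPS fixes (predict only) and is pulled back toward truth whenever a fix arrives, which is exactly the behavior needed when the GPS signal is intermittently degraded.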

4) Decision:

Our decision system is specialized for performing the contest missions. For each mission, the system makes a decision based on the perception and localization data. A priority-based task manager provides the time-out action, the drivability map, and the target velocity; this information is delivered to the path-tracking system.
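One way such a priority-based manager with time-outs can work is sketched below. This is an illustrative assumption about the structure, not the team's code: the highest-priority active mission wins, and a mission whose time-out expires is dropped so the vehicle falls back to default driving.

```python
class MissionManager:
    """Priority-based mission manager sketch: the active mission with the
    highest priority is selected; a mission past its deadline is discarded
    (the time-out action) so control reverts to default driving."""
    def __init__(self):
        self.missions = []  # list of (priority, name, deadline)

    def activate(self, priority, name, timeout_s, now):
        self.missions.append((priority, name, now + timeout_s))

    def current(self, now):
        # Drop timed-out missions, then pick the highest priority remaining.
        self.missions = [m for m in self.missions if m[2] > now]
        if not self.missions:
            return "default_driving"
        return max(self.missions)[1]
```

The time-out is what prevents a failed detection (say, a pedestrian that is never re-acquired) from stalling the vehicle indefinitely in one mission zone.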

5) Path Planning & Tracking:

The goal of this stage is to generate a stable path and track it based on the path history. We used a conditionally weighted, potential-field-based A* algorithm for robust path generation: it builds a drivability map from the localization information and sets the target velocity for each mission. A pure-pursuit-based tracking method then steers toward a look-ahead point on the path; it does not take the look-ahead point's orientation into account.
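The pure-pursuit step can be sketched as follows, assuming a standard bicycle model (the function name and wheelbase value are illustrative). Note that, as stated above, only the look-ahead point's position enters the computation, never its orientation.

```python
import math

def pure_pursuit_steering(pose, lookahead_point, wheelbase):
    """Pure pursuit: front-wheel steering angle that drives a bicycle-model
    vehicle toward a look-ahead point. pose = (x, y, heading) in radians."""
    x, y, heading = pose
    dx = lookahead_point[0] - x
    dy = lookahead_point[1] - y
    # Angle from the vehicle's heading to the look-ahead point
    alpha = math.atan2(dy, dx) - heading
    ld = math.hypot(dx, dy)  # look-ahead distance
    # Steering angle from the pure-pursuit curvature 2*sin(alpha)/ld
    return math.atan2(2.0 * wheelbase * math.sin(alpha), ld)
```

A longer look-ahead distance smooths the tracking at the cost of cutting corners, so the look-ahead is typically scaled with the target velocity set by the decision stage.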

4. Document for Further Information

The following document is available only on this page.

 Table of Contents

1. Contest Information
  1-1. Summary of the Contest
  1-2. Preparation for Contest and Missions

2. Technical Information
  2-1. Information of Spirit-1 
  2-2. Functional Technique
    2-2-1. Perception of Surroundings – Vision and LiDAR
    2-2-2. Localization
    2-2-3. Decision
    2-2-4. Path Generation & Tracking

1. Contest Information

      What is an Autonomous Vehicle?

An autonomous car is a vehicle capable of fulfilling the human transportation capabilities of a traditional car by sensing its environment and navigating without human input.

1-1. Summary of the Contest

More Information about the contest…

* Official homepage of the contest: http://autonomous.ksae.org/

1-2. Preparation for Contest and Missions





2. Technical Information

2-1. Information of Spirit-1

2-2. Functional Technique 

  2-2-1. Perception of Surroundings – Vision and LiDAR

More Information about Perception of Surroundings…

[1] HSV color space : http://en.wikipedia.org/wiki/HSL_and_HSV

[2] PCA (with regard to face recognition) : http://docs.opencv.org/modules/contrib/doc/facerec/facerec_tutorial.html?highlight=eigenface

[3] SVM : http://docs.opencv.org/doc/tutorials/ml/introduction_to_svm/introduction_to_svm.html?highlight=svm

[4] Deterministic tracking : A. Yilmaz, O. Javed, and M. Shah, “Object tracking: A survey,” ACM Comput. Surv., vol. 38, no. 4, pp. 1–45, 2006.

[5] Cascade Classification & Haar-like feature : http://docs.opencv.org/modules/objdetect/doc/cascade_classification.html?highlight=viola%20jones 

[6] Lane Detection : http://vision.caltech.edu/malaa/software/research/caltech-lane-detection/

[7] Camera – Laser Calibration : http://www-personal.acfr.usyd.edu.au/akas9185/

  2-2-2. Localization

  2-2-3. Decision

  2-2-4. Path Generation & Tracking