A Robust Lane Detection Method Based on Vanishing Point Estimation Using the Relevance of Line Segments
- Park, Sung-Kee; Kim, Dong Hwan; Yoo, Ju Han; Lee, Seong-Whan
- Lane detection; line segment; vanishing point estimation; probabilistic voting; lane departure warning (LDW) system
- Issue Date
- IEEE Transactions on Intelligent Transportation Systems: A Publication of the IEEE Intelligent Transportation Systems Council
- VOL 18, NO 12-3266
- In this paper, a robust lane detection method based on vanishing point estimation is proposed. Estimating a vanishing point can be helpful in detecting lanes, because parallel lines converge on the vanishing point in a projected 2-D image. However, it is not easy to estimate the vanishing point correctly in an image with a complex background. Thus, a robust vanishing point estimation method is proposed that uses a probabilistic voting procedure based on the intersection points of line segments extracted from an input image. The proposed voting function is defined with a line segment strength that represents the relevance of the extracted line segments. Next, candidate line segments for lanes are selected by considering geometric constraints. Finally, the host lane is detected using the proposed score function, which is designed to remove outliers from the candidate line segments. In addition, the detected host lane is refined using an inter-frame similarity that considers the location consistency of the detected host lane and the estimated vanishing point across consecutive frames. Furthermore, to reduce the computational cost of the vanishing point estimation process, a method using a lookup table is proposed. Experimental results show that the proposed method efficiently estimates the vanishing point and detects lanes in various environments.
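The core idea in the abstract, voting for a vanishing point with the weighted intersections of line segments, can be sketched as follows. This is a minimal illustration, not the authors' implementation: the segment input, the grid-cell accumulator, and the use of segment length as the "strength" weight are all assumptions standing in for the paper's probabilistic voting function.

```python
# Hedged sketch: vanishing-point estimation by weighted voting on the
# pairwise intersections of line segments. Segment length is used as an
# illustrative stand-in for the paper's "line segment strength".
import itertools
import math

def intersect(seg_a, seg_b):
    """Intersection of the infinite lines through two segments,
    or None if the lines are (near-)parallel."""
    (x1, y1), (x2, y2) = seg_a
    (x3, y3), (x4, y4) = seg_b
    d = (x1 - x2) * (y3 - y4) - (y1 - y2) * (x3 - x4)
    if abs(d) < 1e-9:
        return None
    a = x1 * y2 - y1 * x2
    b = x3 * y4 - y3 * x4
    return ((a * (x3 - x4) - (x1 - x2) * b) / d,
            (a * (y3 - y4) - (y1 - y2) * b) / d)

def strength(seg):
    """Illustrative relevance weight: the segment's length."""
    (x1, y1), (x2, y2) = seg
    return math.hypot(x2 - x1, y2 - y1)

def estimate_vanishing_point(segments, cell=10):
    """Accumulate weighted votes from all pairwise intersections on a
    coarse grid and return the centre of the winning cell."""
    votes = {}
    for sa, sb in itertools.combinations(segments, 2):
        p = intersect(sa, sb)
        if p is None:
            continue
        key = (round(p[0] / cell), round(p[1] / cell))
        votes[key] = votes.get(key, 0.0) + strength(sa) * strength(sb)
    if not votes:
        return None
    kx, ky = max(votes, key=votes.get)
    return (kx * cell, ky * cell)

# Three segments whose extensions meet at (100, 50), plus one
# horizontal outlier; the concentrated votes win the grid cell.
segs = [((0, 0), (50, 25)),
        ((0, 100), (50, 75)),
        ((0, 150), (50, 100)),
        ((0, 60), (80, 60))]
print(estimate_vanishing_point(segs))  # → (100, 50)
```

In practice the segments would come from a line segment detector run on the image, and the paper additionally applies geometric constraints, a score function, and inter-frame refinement on top of this voting step.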
- Appears in Collections:
- KIST Publication > Article