Electronics, Communication and Automatic Control

An Autonomous Extrinsic Calibration Method for 4D Millimeter-Wave Radar Point Clouds and Visual Images

  • BI Xin,
  • WENG Caien,
  • WANG Yu,
  • HU Zaigang
  • 1. School of Automotive Studies, Tongji University, Shanghai 201804, China
    2. Chengdu Automobile Industry Academy, Chengdu 610100, Sichuan, China
    3. Chengdu Skysoft Information Technology Co., Ltd., Chengdu 610041, Sichuan, China
BI Xin (1980—), male, Ph.D., researcher; his research focuses on perception fusion and evaluation technologies for autonomous driving. E-mail: bixin@tongji.edu.cn

Received date: 2024-04-10

  Online published: 2024-06-14

Supported by

the National Key R&D Program of China (2022YFE0117100); the Major Science and Technology Innovation Project of Chengdu City (2021-YF08-00140-GX); the Basic and Applied Basic Research Foundation of Guangdong Province (2021B1515120032)

Cite this article

BI Xin, WENG Caien, WANG Yu, HU Zaigang. An autonomous extrinsic calibration method for 4D millimeter-wave radar point clouds and visual images[J]. Journal of South China University of Technology (Natural Science Edition), 2025, 53(1): 74-83. DOI: 10.12141/j.issn.1000-565X.240172

Abstract

With the rapid development of autonomous driving technology, the demand for multi-sensor fusion in environmental perception systems is increasing. Four-dimensional (4D) millimeter-wave radar has become one of the critical sensors in autonomous driving due to its stable performance under complex weather and lighting conditions. Although 4D millimeter-wave radar improves object detection accuracy by adding elevation information and increasing point cloud density, the sparsity and noise of its point clouds limit its standalone application. The fusion of 4D millimeter-wave radar with vision sensors has therefore become key to enhancing perception accuracy in autonomous driving. However, traditional extrinsic calibration methods rely on cumbersome manual operations and can hardly meet the requirements of efficient automated calibration. To address this issue, this study proposes an automatic extrinsic calibration method for 4D millimeter-wave radar and visual images based on a calibration board. The method first designs a calibration board with ChArUco markers, red circular rings, and corner reflectors, and then automatically extracts the image coordinates and radar point cloud coordinates of the calibration points using a circle detection algorithm and a corner reflector detection algorithm. Furthermore, a scheme for calibration data acquisition and validation through simulations in 3D Max and Unity is proposed. Finally, the performance of the direct linear transformation (DLT) and extrinsic calibration (EC) methods is compared experimentally to evaluate calibration accuracy. The results indicate that the designed calibration board and automatic calibration algorithm effectively reduce manual operations, and that the EC method achieves higher calibration stability and accuracy when more calibration points are used.
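The paper's own implementation is not reproduced here. As a rough illustration of the classical DLT baseline named in the abstract — estimating a 3×4 projection matrix from 3D radar-point/2D image-point correspondences — the following is a minimal NumPy sketch for the noise-free case; the function names `dlt_projection` and `project` are illustrative, not from the paper:

```python
import numpy as np

def dlt_projection(points_3d, points_2d):
    """Estimate a 3x4 projection matrix P (up to scale) from n >= 6
    3D-2D point correspondences via the direct linear transform (DLT).
    Each correspondence contributes two rows to the homogeneous system
    A p = 0, solved by the right singular vector of A with the smallest
    singular value."""
    assert len(points_3d) == len(points_2d) >= 6
    rows = []
    for (X, Y, Z), (u, v) in zip(points_3d, points_2d):
        Xh = [X, Y, Z, 1.0]
        rows.append([*Xh, 0, 0, 0, 0, *[-u * c for c in Xh]])
        rows.append([0, 0, 0, 0, *Xh, *[-v * c for c in Xh]])
    _, _, Vt = np.linalg.svd(np.asarray(rows))
    return Vt[-1].reshape(3, 4)

def project(P, points_3d):
    """Apply P to 3D points and dehomogenize to pixel coordinates."""
    X = np.hstack([np.asarray(points_3d), np.ones((len(points_3d), 1))])
    x = X @ P.T
    return x[:, :2] / x[:, 2:3]
```

In the setting the abstract describes, the 2D points would come from the detected circle centers in the image and the 3D points from the corner-reflector returns in the radar point cloud; a practical pipeline would additionally wrap the linear solve in a robust estimator to reject noisy radar detections.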
