With the rapid development of autonomous driving technology, the demand for multi-sensor fusion in environmental perception systems is increasing. Four-dimensional (4D) millimeter-wave radar has become one of the critical sensors in autonomous driving due to its stable performance under complex weather and lighting conditions. Although 4D millimeter-wave radar improves object detection accuracy by adding elevation angle information and increasing point cloud density, its sparse point clouds and noise limit its standalone application. Therefore, fusing 4D millimeter-wave radar with vision sensors has become key to enhancing perception accuracy in autonomous driving. However, traditional extrinsic calibration methods rely on cumbersome manual operations, making it difficult to meet the requirements of efficient automated calibration. To address this issue, this study proposes an automated extrinsic calibration method for 4D millimeter-wave radar and visual images based on a calibration board. The method first designs a calibration board with ChArUco markers, red circular rings, and corner reflectors, and then automatically extracts the image coordinates and radar point cloud coordinates of the calibration points using a circle detection algorithm and a corner reflector detection algorithm. Furthermore, a method for calibration data acquisition and validation is proposed using simulations in 3D Max and Unity. Finally, the performance of the direct linear transformation (DLT) and extrinsic calibration (EC) methods is compared through experiments to evaluate calibration accuracy. Experimental results indicate that the designed calibration board and automated calibration algorithm effectively reduce manual operations, and that the EC method achieves higher calibration stability and accuracy as more calibration points are involved.
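As an illustrative sketch only (not the paper's implementation), the DLT baseline mentioned above can be estimated from 3D radar-point / 2D image-point correspondences by stacking the standard homogeneous equations and solving with SVD. All names and the synthetic data below are hypothetical.

```python
import numpy as np

def dlt_projection(points_3d, points_2d):
    """Estimate a 3x4 projection matrix P from >= 6 non-degenerate
    3D-2D correspondences by solving A p = 0 via SVD (classic DLT)."""
    A = []
    for (X, Y, Z), (u, v) in zip(points_3d, points_2d):
        # Two rows per correspondence, from u = (p1.X)/(p3.X), v = (p2.X)/(p3.X)
        A.append([X, Y, Z, 1, 0, 0, 0, 0, -u*X, -u*Y, -u*Z, -u])
        A.append([0, 0, 0, 0, X, Y, Z, 1, -v*X, -v*Y, -v*Z, -v])
    _, _, Vt = np.linalg.svd(np.asarray(A, dtype=float))
    # Right singular vector of the smallest singular value, up to scale
    return Vt[-1].reshape(3, 4)

# Synthetic check: project known 3D points with a known P, then recover P.
P_true = np.array([[800.0, 0.0, 320.0, 10.0],
                   [0.0, 800.0, 240.0, 20.0],
                   [0.0, 0.0, 1.0, 1.0]])
pts3d = np.array([[0.1, 0.2, 2.0], [1.0, -0.5, 3.0], [-0.8, 0.4, 2.5],
                  [0.5, 0.9, 4.0], [-1.2, -0.3, 3.5], [0.7, -1.1, 2.2]])
h = np.c_[pts3d, np.ones(len(pts3d))] @ P_true.T
pts2d = h[:, :2] / h[:, 2:3]
P_est = dlt_projection(pts3d, pts2d)
P_est /= P_est[2, 3]  # fix the arbitrary DLT scale (P_true has P[2,3] = 1)
```

With exact (noise-free) correspondences, `P_est` matches `P_true` up to numerical precision; with real detections, the solution is a least-squares fit, which is why calibration accuracy depends on the number and spread of calibration points.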