Implicit Illumination-Based Multimodal All-Weather Perception for Autonomous Driving
Accurate, real-time perception is crucial to the safety of autonomous driving. However, in low-light conditions such as nighttime and in extreme weather such as rain, snow, sandstorms, and fog, perception systems based on visible-light cameras or LiDAR can suffer significantly higher error rates. This paper therefore investigates an all-weather, real-time perception method for autonomous driving that combines the rich detail captured by visible imagery with the strong penetration of infrared thermal imaging. To address the spatiotemporal misalignment and modality imbalance between visible and infrared sensor data, the paper first proposes a cross-modality attention mechanism that performs local-to-nonlocal feature fusion, correcting spatial positional deviations and capturing semantically complementary information for efficient multimodal fusion. Pixel-level illumination weights, implicitly estimated based on Retinex theory, then further guide the attention mechanism, balancing visible and infrared features across multi-level feature maps. Compared with previous studies, the proposed method achieves significant improvements in both perception accuracy and efficiency, providing robust technical support for vehicle safety.
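To make the fusion scheme concrete, the following is a minimal PyTorch sketch of the two ideas described above: a small network that implicitly predicts a pixel-level illumination weight map in the spirit of Retinex theory, and a cross-modality attention block in which visible and infrared features attend to each other and are then balanced by that illumination map. The class names (IlluminationEstimator, CrossModalAttentionFusion) and all layer choices are illustrative assumptions, not the paper's actual implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class IlluminationEstimator(nn.Module):
    """Predicts a pixel-wise illumination weight map in [0, 1] from the visible
    image, loosely following a Retinex-style decomposition: well-lit pixels lean
    on the visible branch, dark pixels lean on the infrared branch."""
    def __init__(self, channels=16):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, channels, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(channels, channels, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(channels, 1, 1), nn.Sigmoid(),
        )

    def forward(self, rgb):
        return self.net(rgb)  # (B, 1, H, W) illumination weights


class CrossModalAttentionFusion(nn.Module):
    """Fuses visible and infrared feature maps with cross-modality attention;
    the illumination map re-weights the two modalities pixel by pixel."""
    def __init__(self, dim, heads=4):
        super().__init__()
        self.attn_vis = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.attn_ir = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.proj = nn.Conv2d(dim, dim, 1)

    def forward(self, feat_vis, feat_ir, illum):
        B, C, H, W = feat_vis.shape
        # Flatten spatial dimensions so each pixel becomes a token.
        v = feat_vis.flatten(2).transpose(1, 2)   # (B, HW, C)
        r = feat_ir.flatten(2).transpose(1, 2)    # (B, HW, C)
        # Each modality queries the other, borrowing non-local,
        # semantically complementary context.
        v2r, _ = self.attn_vis(v, r, r)
        r2v, _ = self.attn_ir(r, v, v)
        v2r = v2r.transpose(1, 2).reshape(B, C, H, W)
        r2v = r2v.transpose(1, 2).reshape(B, C, H, W)
        # Resize the illumination map to the feature resolution and use it
        # to balance the attention-enhanced visible and infrared features.
        w = F.interpolate(illum, size=(H, W), mode="bilinear", align_corners=False)
        fused = w * (feat_vis + v2r) + (1.0 - w) * (feat_ir + r2v)
        return self.proj(fused)


if __name__ == "__main__":
    rgb = torch.rand(1, 3, 256, 256)
    feat_vis = torch.rand(1, 64, 32, 32)   # visible backbone features
    feat_ir = torch.rand(1, 64, 32, 32)    # infrared backbone features
    illum = IlluminationEstimator()(rgb)
    fused = CrossModalAttentionFusion(dim=64)(feat_vis, feat_ir, illum)
    print(fused.shape)  # torch.Size([1, 64, 32, 32])
```

In practice the paper applies this balancing across multi-level feature maps; the sketch shows a single level, and the same module could be instantiated per pyramid stage under that assumption.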