RGBT object tracking benefits from the complementary advantages of the visible (RGB) and thermal infrared modalities, which can effectively enhance a tracker's object localization capability under challenging environmental conditions. Existing works mainly focus on how to extract and fuse features from the two modalities, while neglecting the potential value of the hierarchical deep features within each modality, which play crucial roles in object localization and classification. To address this problem, a Multi-layer Feature Interaction and Modal-adaptation Fusion Network (MIMFNet) is proposed for RGB-thermal tracking. First, the algorithm extracts hierarchical features with feature extractors and adaptively calibrates them with attention mechanisms. Second, a hierarchical feature aggregation sub-network combines features from different layers in a top-down fashion, allowing low-level features to retain their spatial details while absorbing semantic information from high-level features. Finally, a multi-modal information propagation module is designed to adaptively fuse hierarchical information from both modalities, directing the model's focus toward higher-quality feature channels. Extensive experiments on multiple publicly available datasets demonstrate the strong robustness of the proposed RGBT tracking algorithm to interference. In particular, significant improvements are achieved against tracking drift caused by factors such as Scale Variation (SV), Thermal Crossover (TC), and Occlusion (OCC).
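The two core operations the abstract describes can be sketched minimally in NumPy. This is an illustrative sketch only: the function names, the nearest-neighbour upsampling, and the softmax channel weighting are assumptions standing in for the learned layers and attention mechanisms of MIMFNet, whose exact design is not given here.

```python
import numpy as np

def upsample2x(x):
    # Nearest-neighbour 2x upsampling of a (C, H, W) feature map
    # (stand-in for a learned upsampling layer).
    return x.repeat(2, axis=1).repeat(2, axis=2)

def top_down_aggregate(features):
    # features: list of (C, H, W) maps ordered shallow (high-res) to deep
    # (low-res). Each deeper map is upsampled and added into the next
    # shallower one, so low-level maps keep their spatial detail while
    # absorbing high-level semantic information.
    fused = features[-1]
    out = [fused]
    for f in reversed(features[:-1]):
        fused = f + upsample2x(fused)
        out.append(fused)
    return out[::-1]  # shallow-to-deep order, shallow maps now enriched

def channel_fusion(rgb, tir):
    # Toy channel-adaptive fusion: concatenate RGB and thermal features,
    # then reweight each channel by a softmax over its global-average
    # response, steering attention toward higher-quality channels.
    both = np.concatenate([rgb, tir], axis=0)   # (2C, H, W)
    gap = both.mean(axis=(1, 2))                # per-channel descriptor
    w = np.exp(gap - gap.max())
    w /= w.sum()                                # softmax channel weights
    return both * w[:, None, None]
```

In a real tracker the per-channel weights would come from a small learned gating network rather than a parameter-free softmax; the sketch only shows the data flow of top-down aggregation followed by channel-wise modal fusion.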