Northeast Agricultural University Researchers Target Machine Learning (Multimodal deep fusion model based on Transformer and multi-layer residuals for assessing the competitiveness of weeds in farmland ecosystems)
By a News Reporter-Staff News Editor at Robotics & Machine Learning Daily News. Investigators discuss new findings in artificial intelligence. According to news reporting out of Harbin, People's Republic of China, by NewsRx editors, research stated, "Weed competitiveness monitoring is crucial for field management at specific locations. Recent research in the fusion of multimodal data from unmanned aerial vehicles (UAVs) has propelled this advancement."

Financial supporters for this research include the National Natural Science Foundation of China.

The news journalists obtained a quote from the research from Northeast Agricultural University: "However, these studies merely stack extracted features equivalently, neglecting the full utilization of fused information. This study utilizes hyperspectral and LiDAR data collected by UAVs to propose a multimodal deep fusion model (MulDFNet) using Transformer and multi-layer residuals. It utilizes a comprehensive competitive index (CCI-A) based on multidimensional phenotypes of maize to assess the competitiveness of weeds in farmland ecosystems. To validate the effectiveness of this model, a series of ablation studies were conducted involving data from different modalities, with/without the Transformer Encoder (TE) modules, and different fusion modules (shallow residual fusion module, deep feature fusion module). Additionally, a comparison was made with early/late stacking fusion models, traditional machine learning models, and deep learning models from relevant studies. The results indicate that the multimodal deep fusion model utilizing HSI, VI, and CHM data achieved a predictive effect of R² = 0.903 (RMSE = 0.078). Notably, the best performance was observed during the five-leaf stage. The combination of shallow and deep fusion modules demonstrated better predictive performance compared to a single fusion module. The positive impact of the TE module on model performance is evident, as its multi-head attention mechanism aids in better capturing the relationships and importance between feature maps and competition indices, thereby enhancing the model's predictive capability."
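To make the architecture described in the quote more concrete, below is a minimal PyTorch sketch of a multimodal regressor built along the same general lines: modality-specific encoders for HSI, VI, and CHM features, a residual ("shallow") fusion step, a Transformer encoder whose multi-head attention operates across modality tokens, and a fused ("deep") regression head that predicts a competition index. This is not the published MulDFNet; the input dimensions, layer sizes, and module names are illustrative assumptions only.

```python
# Hypothetical sketch of a Transformer-based multimodal fusion regressor.
# It is NOT the authors' MulDFNet; feature dimensions, layer sizes, and the
# way HSI / VI / CHM inputs are combined are assumptions for illustration.
import torch
import torch.nn as nn


class ResidualBlock(nn.Module):
    """Fully connected block with a residual (skip) connection."""
    def __init__(self, dim: int):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(dim, dim), nn.ReLU(), nn.Linear(dim, dim))

    def forward(self, x):
        return torch.relu(x + self.net(x))


class MultimodalFusionRegressor(nn.Module):
    """Shallow residual fusion per modality, Transformer encoder across
    modality tokens, and a deep fusion head for regression."""
    def __init__(self, hsi_dim=200, vi_dim=10, chm_dim=1, embed_dim=64, n_heads=4):
        super().__init__()
        # Modality-specific encoders project each input to a shared embedding.
        self.enc_hsi = nn.Sequential(nn.Linear(hsi_dim, embed_dim), nn.ReLU())
        self.enc_vi = nn.Sequential(nn.Linear(vi_dim, embed_dim), nn.ReLU())
        self.enc_chm = nn.Sequential(nn.Linear(chm_dim, embed_dim), nn.ReLU())
        # "Shallow" fusion: residual block applied to each modality embedding.
        self.shallow = ResidualBlock(embed_dim)
        # Transformer encoder: multi-head attention over the three modality tokens.
        layer = nn.TransformerEncoderLayer(d_model=embed_dim, nhead=n_heads,
                                           dim_feedforward=128, batch_first=True)
        self.transformer = nn.TransformerEncoder(layer, num_layers=2)
        # "Deep" fusion head: concatenate attended tokens and regress the index.
        self.head = nn.Sequential(nn.Linear(3 * embed_dim, 64), nn.ReLU(),
                                  nn.Linear(64, 1))

    def forward(self, hsi, vi, chm):
        tokens = torch.stack([self.shallow(self.enc_hsi(hsi)),
                              self.shallow(self.enc_vi(vi)),
                              self.shallow(self.enc_chm(chm))], dim=1)  # (B, 3, D)
        fused = self.transformer(tokens)                 # attention across modalities
        return self.head(fused.flatten(1)).squeeze(-1)   # predicted competition index


if __name__ == "__main__":
    model = MultimodalFusionRegressor()
    hsi = torch.randn(8, 200)   # per-plot hyperspectral reflectance (assumed shape)
    vi = torch.randn(8, 10)     # vegetation indices (assumed shape)
    chm = torch.randn(8, 1)     # canopy height statistic from a LiDAR CHM (assumed)
    print(model(hsi, vi, chm).shape)  # torch.Size([8])
```

Treating each modality as a token lets the multi-head attention weight the contributions of spectral, index, and structural features against one another, which is the role the researchers attribute to the TE module in the quote above.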
Keywords: Northeast Agricultural University, Harbin, People's Republic of China, Asia, Cyborgs, Emerging Technologies, Machine Learning.