Inversion of SPAD Values of Rice Canopy Based on Aerial Images Taken by Unmanned Aerial Vehicle

To realize high-throughput detection of the chlorophyll content of rice from aerial images taken by an unmanned aerial vehicle (UAV), Zhaoyou 5431, a three-line indica hybrid rice variety, was used as the material. Three planting-density levels and five nitrogen-application levels were set, giving 15 treatments in total. Aerial images were acquired with a DJI Phantom 4 RTK UAV, and the SPAD values of rice leaves were measured manually at different growth stages. Seven visible-light vegetation indices significantly correlated with leaf SPAD values were selected, and inversion models of leaf SPAD values were constructed by linear regression and machine-learning methods; the optimal prediction model was then determined by accuracy validation. The results showed that, among the machine-learning models, the random forest model was more accurate than the other regression models: on the modeling set its R2 was 0.85 and RMSE 2.73, and on the validation set its R2 was 0.76 and RMSE 3.64. The machine-learning model can therefore serve as a reference for the non-destructive, rapid monitoring of rice leaf SPAD values.
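The modeling step described above (regressing leaf SPAD values on a visible-light vegetation index and scoring the fit with R2 and RMSE) can be sketched as follows. This is a minimal illustration with hypothetical plot-level data, using the well-known excess-green index ExG = 2g − r − b as one example of a visible-light index; the paper's actual indices, field data, and random forest model are not reproduced here.

```python
# Sketch (hypothetical data): regress leaf SPAD on the excess-green index
# ExG = 2g - r - b computed from normalized RGB values, then score the fit
# with R2 and RMSE, the two accuracy metrics used in the study.
import math

def exg(r, g, b):
    """Excess-green index from raw RGB digital numbers."""
    total = r + g + b
    rn, gn, bn = r / total, g / total, b / total
    return 2 * gn - rn - bn

def fit_linear(xs, ys):
    """Ordinary least squares for y = a*x + b."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    a = sxy / sxx
    return a, my - a * mx

def r2_rmse(ys, preds):
    """Coefficient of determination and root-mean-square error."""
    n = len(ys)
    my = sum(ys) / n
    ss_res = sum((y - p) ** 2 for y, p in zip(ys, preds))
    ss_tot = sum((y - my) ** 2 for y in ys)
    return 1 - ss_res / ss_tot, math.sqrt(ss_res / n)

# Hypothetical plot-level means: (R, G, B digital numbers, measured SPAD).
samples = [(95, 140, 70, 38.2), (90, 150, 65, 41.0), (100, 135, 75, 36.5),
           (85, 155, 60, 43.1), (98, 138, 72, 37.4), (88, 152, 63, 42.0)]
xs = [exg(r, g, b) for r, g, b, _ in samples]
ys = [s for _, _, _, s in samples]

a, b = fit_linear(xs, ys)
preds = [a * x + b for x in xs]
r2, rmse = r2_rmse(ys, preds)
```

In the study itself, seven such indices were fed jointly into linear-regression and machine-learning models (random forest performing best); the univariate fit above only illustrates the index-to-SPAD inversion and evaluation workflow.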

rice; chlorophyll content; SPAD value; unmanned aerial vehicle; aerial image; machine learning; random forest algorithm

CHEN Zhaozhong, DUAN Shaokun, YUE Yunkai, LI Huanqun, WU Xia, CHEN Jianfu, WANG Xiaohui, LI Xumeng


College of Agronomy, Hunan Agricultural University, Changsha 410128, Hunan, China


Hunan Provincial Key Research and Development Program

2022NK2047

2024

Hybrid Rice
China National Hybrid Rice Research and Development Center; Hunan Hybrid Rice Research Center


Indexed in: CSTPCD; Peking University Core Journals
Impact factor: 0.432
ISSN:1005-3956
Year, Volume (Issue): 2024, 39(2)