
Extraction of Cigar Tobacco Plant Counts Based on UAV Multispectral Imagery and Key Point Detection
Fund project: General Program of the National Natural Science Foundation of China (42171350) and Science and Technology Project of Hubei Tobacco Company (027Y2021-009)


Counting Cigar Tobacco Plants from UAV Multispectral Images via Key Points Detection Approach
Abstract (translated from Chinese):

To accurately identify tobacco plants in UAV remote-sensing images and to locate and count them, a new deep-learning model was proposed, with cigar tobacco plants as the study object. Unlike conventional approaches that identify targets with bounding boxes, the model learns the morphological features of tobacco plant centers from a small number of key points and uses a lightweight encoder-decoder to rapidly detect, locate, and count tobacco plants in UAV imagery. First, tailored to the morphology of tobacco plants, the model annotates each plant with a center key point and generates a probability density map with a Gaussian function, introducing additional supervision information. Second, different backbone networks were compared: with ResNet18 as the backbone, the average precision exceeded 99.5%, higher in both precision and confidence than the other backbones tested, while MobileNetV2 achieved the best runtime efficiency in a CPU environment but with a relatively lower average confidence. With a Union Loss combining Focal Loss and MSE Loss, the average precision exceeded 99.5%. Finally, training on different band combinations showed that the red-edge band helps the model converge faster and distinguishes tobacco from weeds well. Because the red-edge band is related to plant canopy structure, the average precision reached 99.6% when using the red-edge, red, and green bands. The proposed deep-learning model can accurately detect tobacco plants in UAV remote-sensing images and can provide data support for agricultural monitoring of tobacco.
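The center-point annotation step described above can be sketched as follows. The function name `gaussian_density_map`, the `sigma` value, and the use of a per-pixel maximum to combine overlapping peaks are illustrative assumptions, not the paper's exact implementation.

```python
import numpy as np

def gaussian_density_map(shape, centers, sigma=3.0):
    """Render a supervision heatmap with a 2-D Gaussian peak at each plant center.

    shape   : (H, W) of the output map
    centers : iterable of (row, col) plant-center annotations
    sigma   : spread of each Gaussian, in pixels (assumed hyperparameter)
    """
    h, w = shape
    yy, xx = np.mgrid[0:h, 0:w]                      # pixel coordinate grids
    density = np.zeros(shape, dtype=np.float32)
    for cy, cx in centers:
        peak = np.exp(-((yy - cy) ** 2 + (xx - cx) ** 2) / (2 * sigma ** 2))
        density = np.maximum(density, peak)          # keep overlapping peaks at 1.0
    return density
```

Compared with a bare point label, the Gaussian spread gives the network a gradient signal in the neighborhood of each plant center, which is the "additional supervision information" the abstract refers to.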

    Abstract:

Tobacco is an important industrial crop in China. The survival rate and growth status of tobacco plants after transplanting to the field are essential for field management and yield prediction. However, counting the live plants has traditionally been done by hand, which is time-consuming and expensive. Unmanned aerial vehicles (UAVs) are a cost-effective option for monitoring croplands and plantations, but visual inspection of such images is a challenging and bias-prone task, especially for locating and detecting plants. As the tobacco plant has a characteristic center-oriented structure, a novel deep-learning algorithm was developed to locate and count tobacco plants via a key-point detection method rather than a common bounding-box object-detection approach. The proposed algorithm was tested on cigar tobacco plants. In the algorithm, the center of each plant was first annotated with a point, and a Gaussian probability density was derived from it to provide useful information about morphological features. Second, different backbones and loss functions were evaluated. ResNet18 as the backbone gave the most accurate prediction of plant number (average precision higher than 99.5%). MobileNetV2 was the most efficient backbone, but the uncertainty of its predictions was higher than that of ResNet18. The combination of the Focal Loss and MSE Loss functions (Union Loss) reached the highest accuracy (average precision higher than 99.5%) while reducing uncertainty. Finally, evaluation of different combinations of multispectral bands showed that the red-edge, red, and green bands outperformed the red, green, and blue bands in differentiating tobacco plants from weeds, resulting in less uncertainty in tobacco plant detection. The proposed algorithm can accurately locate and count tobacco plants in UAV images, providing an effective tool and valuable data support for growing high-quality tobacco.
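The Union Loss above (Focal Loss plus MSE Loss over the predicted density map) might be sketched as below. The abstract does not give the exact formulation, so the CornerNet-style penalty-reduced focal loss, the `alpha` and `beta` focusing parameters, and the equal weighting of the two terms are all assumptions.

```python
import numpy as np

def focal_loss(pred, target, alpha=2.0, beta=4.0, eps=1e-6):
    """Pixel-wise focal loss for heatmap regression (CornerNet-style variant, assumed).

    pred, target : arrays in [0, 1]; target is 1.0 exactly at plant centers.
    """
    pred = np.clip(pred, eps, 1 - eps)               # avoid log(0)
    pos = target >= 1.0                              # exact center pixels
    neg = ~pos
    # Hard, well-classified positives contribute little; misses are penalized heavily.
    pos_loss = -((1 - pred[pos]) ** alpha) * np.log(pred[pos])
    # Negatives near a center (target close to 1) are down-weighted by (1 - target)^beta.
    neg_loss = -((1 - target[neg]) ** beta) * (pred[neg] ** alpha) * np.log(1 - pred[neg])
    n_pos = max(int(pos.sum()), 1)
    return (pos_loss.sum() + neg_loss.sum()) / n_pos

def union_loss(pred, target, w_focal=1.0, w_mse=1.0):
    """Union Loss: weighted sum of focal loss and MSE over the density map."""
    mse = np.mean((pred - target) ** 2)
    return w_focal * focal_loss(pred, target) + w_mse * mse
```

Intuitively, the MSE term supervises the full Gaussian density map while the focal term concentrates the gradient on the sparse center pixels, which is one plausible reading of why combining them both raises precision and reduces uncertainty.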

Cite this article:

RAO Xiongfei, ZHOU Longyu, YANG Chunlei, LIAO Shipeng, LI Xiaokun, LIU Shishi. Counting Cigar Tobacco Plants from UAV Multispectral Images via Key Points Detection Approach[J]. Transactions of the Chinese Society for Agricultural Machinery, 2023, 54(3): 266-273.

復(fù)制
分享
文章指標(biāo)
  • 點(diǎn)擊次數(shù):
  • 下載次數(shù):
  • HTML閱讀次數(shù):
  • 引用次數(shù):
History
  • Received: 2022-05-22
  • Published online: 2023-03-10