
Pest Identification Method in Complex Farmland Environment Based on Improved YOLO v7
CSTR:
Author: ZHAO Hui, HUANG Biao, WANG Hongjun, YUE Youjun
Affiliation:

Author biography:

Corresponding author:

CLC number:

Fund project: Tianjin Science and Technology Support Program (19YFZCSN00360)


    Abstract:

    To enable an inspection robot to efficiently and accurately identify pests that are small and densely clustered, variable in form, and numerous but unevenly distributed, a pest identification method based on an improved YOLO v7 was proposed. The method combined the CSP Bottleneck with the self-attention mechanism of the shifted-window Transformer (Swin Transformer), improving the model's ability to capture the location information of dense pest targets. A fourth detection branch was added to the path-aggregation part to improve detection of small targets. The Convolutional Block Attention Module (CBAM) was integrated into the YOLO v7 model so that the model focuses on pest regions, suppresses general feature information such as background, and identifies occluded pests more precisely. The Focal EIoU loss function was used to reduce the effect of positive-negative sample imbalance on detection results and improve identification accuracy. Experiments on a dataset built from real farmland environments showed that the precision, recall, and mean average precision of the improved algorithm were 91.6%, 82.9%, and 88.2%, respectively, 2.5, 1.2, and 3 percentage points higher than those of the original model. Comparison experiments with other mainstream models showed that the proposed method detects pests more effectively in practice and offers a useful reference for accurate pest identification in complex farmland environments.
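The CBAM step described in the abstract (channel attention followed by spatial attention, each producing a sigmoid gate over the feature map) can be sketched as follows. This is an illustrative NumPy sketch under stated simplifications, not the authors' implementation: the shared MLP weights are random placeholders rather than learned parameters, and the 7×7 convolution of the spatial branch is replaced by a plain average of the pooled maps.

```python
import numpy as np

rng = np.random.default_rng(0)

def channel_attention(x, reduction=4):
    """Channel attention: squeeze spatial dims with avg- and max-pooling,
    pass both through a shared two-layer MLP, sum, and sigmoid-gate channels.
    The MLP weights are random placeholders (learned in a real model)."""
    c, h, w = x.shape
    w1 = rng.standard_normal((c, c // reduction)) * 0.1
    w2 = rng.standard_normal((c // reduction, c)) * 0.1
    relu = lambda v: np.maximum(v, 0)
    avg = x.mean(axis=(1, 2))                       # (c,)
    mx = x.max(axis=(1, 2))                         # (c,)
    logits = relu(avg @ w1) @ w2 + relu(mx @ w1) @ w2
    gate = 1.0 / (1.0 + np.exp(-logits))            # sigmoid, (c,)
    return x * gate[:, None, None]

def spatial_attention(x):
    """Spatial attention: pool across channels, then gate each pixel.
    A simple average stands in for CBAM's 7x7 convolution here."""
    avg = x.mean(axis=0)                            # (h, w)
    mx = x.max(axis=0)                              # (h, w)
    logits = (avg + mx) / 2.0
    gate = 1.0 / (1.0 + np.exp(-logits))            # sigmoid, (h, w)
    return x * gate[None, :, :]

def cbam(x):
    # CBAM applies channel attention first, then spatial attention
    return spatial_attention(channel_attention(x))

feat = rng.standard_normal((8, 4, 4))  # toy feature map: 8 channels, 4x4
out = cbam(feat)
print(out.shape)
```

Because both gates lie in (0, 1), the module can only attenuate features; in the detector this lets pest regions keep near-unity gates while background responses are suppressed.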

    Abstract:

    In order to enable the inspection robot to efficiently and accurately identify pests that are small, dense, morphologically variable, numerous and unevenly distributed, a pest recognition method based on an improved YOLO v7 was proposed. The CSP Bottleneck was combined with the self-attention mechanism of the shifted-window Transformer (Swin Transformer), which improved the model's ability to obtain the location information of dense pest targets. A fourth detection branch was added to the path aggregation part to improve the detection performance of the model on small targets. The Convolutional Block Attention Module (CBAM) was integrated into the YOLO v7 model to make the model pay more attention to pest regions, suppress general feature information such as background, and improve the recognition precision for occluded pests. The Focal EIoU loss function was used to reduce the influence of positive-negative sample imbalance on detection results and improve recognition accuracy. Experiments on a dataset established from the actual farmland environment showed that the precision, recall and mAP of the improved algorithm were 91.6%, 82.9% and 88.2%, respectively, which were 2.5, 1.2 and 3 percentage points higher than those of the original model. Comparison experiments with other mainstream models showed that the proposed method was more effective in the actual detection of pests and had practical value for accurate pest identification in complex farmland environments.
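The Focal EIoU loss mentioned above can be written as L = IoU^γ · L_EIoU, where L_EIoU adds center-distance and width/height penalties to the plain 1 − IoU term, all normalized by the smallest enclosing box. The sketch below follows the published Focal-EIoU formulation for a single box pair; the function name `focal_eiou` and the default γ = 0.5 are illustrative assumptions, not the authors' released code.

```python
def focal_eiou(pred, gt, gamma=0.5, eps=1e-9):
    """Focal-EIoU loss for axis-aligned boxes given as (x1, y1, x2, y2).

    EIoU = 1 - IoU + center_dist^2 / diag^2 + dw^2 / cw^2 + dh^2 / ch^2,
    with cw, ch (and diag) taken from the smallest enclosing box; the
    focal factor IoU**gamma down-weights low-overlap (easy) samples.
    """
    px1, py1, px2, py2 = pred
    gx1, gy1, gx2, gy2 = gt
    # IoU from intersection and union areas
    iw = max(0.0, min(px2, gx2) - max(px1, gx1))
    ih = max(0.0, min(py2, gy2) - max(py1, gy1))
    inter = iw * ih
    union = (px2 - px1) * (py2 - py1) + (gx2 - gx1) * (gy2 - gy1) - inter
    iou = inter / (union + eps)
    # smallest enclosing box and its squared diagonal
    cw = max(px2, gx2) - min(px1, gx1)
    ch = max(py2, gy2) - min(py1, gy1)
    diag2 = cw * cw + ch * ch + eps
    # squared distance between box centers
    dcx = (px1 + px2) / 2 - (gx1 + gx2) / 2
    dcy = (py1 + py2) / 2 - (gy1 + gy2) / 2
    center_term = (dcx * dcx + dcy * dcy) / diag2
    # width/height penalties, each normalized by the enclosing side
    w_term = ((px2 - px1) - (gx2 - gx1)) ** 2 / (cw * cw + eps)
    h_term = ((py2 - py1) - (gy2 - gy1)) ** 2 / (ch * ch + eps)
    eiou = 1.0 - iou + center_term + w_term + h_term
    return (iou ** gamma) * eiou

# identical boxes: IoU ≈ 1 and every penalty term ≈ 0, so the loss is near zero
print(focal_eiou((0, 0, 2, 2), (0, 0, 2, 2)))
# partial overlap: a positive loss driven by low IoU and the center offset
print(focal_eiou((0, 0, 2, 2), (1, 1, 3, 3)))
```

The focal factor IoU^γ is what addresses the sample-imbalance problem the abstract describes: the many low-IoU background boxes contribute little gradient, so training concentrates on boxes that already overlap a pest.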

Cite this article

ZHAO Hui, HUANG Biao, WANG Hongjun, YUE Youjun. Pest Identification Method in Complex Farmland Environment Based on Improved YOLO v7[J]. Transactions of the Chinese Society for Agricultural Machinery, 2023, 54(10): 246-254.

History
  • Received: 2023-04-11
  • Published online: 2023-05-27