
Hybrid Rice Breeding Abnormal Plant Detection Method Improved on Texture Cognition Module

Fund projects: National Key R&D Program of China (2023YFD2000402), China Agriculture Research System (CARS-01), Zhejiang Province "Sannong Jiufang" Agricultural Science and Technology Cooperation Program (2023SNJF048), and Zhejiang University research project (XY2023042)


    Abstract:

Abnormal plant removal is a critical step in ensuring seed purity during hybrid rice seed production. To prevent abnormal plants from producing abnormal pollen that could compromise hybrid vigor, current removal operations require repeated manual effort, consuming significant time and labor. Automated identification of abnormal plants in the field is fundamental to achieving mechanized and automated removal. To achieve automated and precise detection of abnormal plants in hybrid rice seed production, UAV aerial images of seed production fields containing abnormal plants were collected, and high-quality, distortion-free images were obtained through center cropping. The abnormal plants in the images were annotated, and data augmentation was performed through geometric and color transformations to create a dataset of abnormal plants in hybrid rice seed production fields. To address the high similarity between abnormal and normal plants in the dataset, a novel abnormal plant detection network model, T-CenterNet2, was proposed. This model enhanced the CenterNet2 network by incorporating a texture-aware module within the feature pyramid network, which reorganized channel information to extract texture features from the feature maps, thereby increasing the feature distinction between abnormal plants and the background. Additionally, a combination of loss functions was designed, including a texture loss that measured the difference between texture features and the label ground truth to supervise the texture-aware module. DIoU was introduced as the bounding box loss, penalizing the distance between predicted and ground-truth box centers to improve the accuracy of predicted target centers, in line with the practical requirements of abnormal plant removal operations.
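The abstract describes, but does not implement, the texture-aware module's "reorganizing channel information". As a rough illustration only, a channel-shuffle-style reorganization is one plausible reading; the function names `channel_reorganize` and `texture_response`, the grouping scheme, and the standard-deviation texture response below are all assumptions for the sketch, not the paper's actual method:

```python
import numpy as np

def channel_reorganize(feat: np.ndarray, groups: int = 4) -> np.ndarray:
    """Channel-shuffle style reorganization of a (C, H, W) feature map.

    Splits the C channels into `groups` groups and interleaves them, so that
    subsequent operations see information mixed across channel groups -- one
    possible way to "reorganize channel information" for texture cues.
    """
    c, h, w = feat.shape
    assert c % groups == 0, "channel count must be divisible by groups"
    return (feat.reshape(groups, c // groups, h, w)
                .transpose(1, 0, 2, 3)
                .reshape(c, h, w))

def texture_response(feat: np.ndarray) -> np.ndarray:
    """Toy texture response: per-pixel standard deviation across the
    reorganized channels (high where channel activations disagree)."""
    return channel_reorganize(feat).std(axis=0)
```

The reorganization is a pure permutation of channels, so feature content is preserved while its grouping changes; in the paper this module additionally feeds a texture loss against the label ground truth.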
The effects of different loss function combinations on model convergence speed and detection accuracy were compared; the combination of weighted texture loss and DIoU yielded the best results, demonstrating the effectiveness of the redesigned loss function for the abnormal plant detection task. Using mAP and recall as evaluation metrics, the improved model was compared with the original CenterNet2 model and four typical models: Faster R-CNN, FCOS, YOLOX, and DETR. Experimental results showed that the improved T-CenterNet2 model achieved an mAP of 86.4%, an increase of 11.0 percentage points over the original model, and a recall of 82.5%, an increase of 11.6 percentage points over the original model. The highest mAP and recall among the typical models were only 73.1% and 66.2%, respectively. The enhanced model exhibited high detection accuracy and robustness, effectively achieving reliable abnormal plant detection.
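The DIoU bounding-box loss used above has a standard published form, DIoU = IoU − ρ²(b, b_gt)/c², where ρ is the distance between the two box centers and c is the diagonal of the smallest enclosing box; the loss is 1 − DIoU, so misaligned centers are penalized even when overlap is equal. A minimal plain-Python sketch (the function name and the (x1, y1, x2, y2) box format are illustrative choices; the paper's training code and loss weighting are not given here):

```python
def diou_loss(box_a, box_b):
    """DIoU loss for axis-aligned boxes given as (x1, y1, x2, y2).

    loss = 1 - IoU + rho^2 / c^2, where rho is the center-to-center
    distance and c is the diagonal of the smallest enclosing box.
    """
    ax1, ay1, ax2, ay2 = box_a
    bx1, by1, bx2, by2 = box_b
    # intersection and union areas
    iw = max(0.0, min(ax2, bx2) - max(ax1, bx1))
    ih = max(0.0, min(ay2, by2) - max(ay1, by1))
    inter = iw * ih
    union = (ax2 - ax1) * (ay2 - ay1) + (bx2 - bx1) * (by2 - by1) - inter
    iou = inter / union if union > 0 else 0.0
    # squared distance between box centers (the DIoU penalty numerator)
    rho2 = (((ax1 + ax2) - (bx1 + bx2)) ** 2
            + ((ay1 + ay2) - (by1 + by2)) ** 2) / 4.0
    # squared diagonal of the smallest enclosing box
    c2 = ((max(ax2, bx2) - min(ax1, bx1)) ** 2
          + (max(ay2, by2) - min(ay1, by1)) ** 2)
    return 1.0 - iou + (rho2 / c2 if c2 > 0 else 0.0)
```

Because the penalty term depends only on the centers, minimizing this loss pulls the predicted center toward the ground-truth center, which matches the stated goal of accurate target center points for mechanized removal.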

Cite this article

YANG Dunhong, WANG Yongwei, WANG Jun. Hybrid Rice Breeding Abnormal Plant Detection Method Improved on Texture Cognition Module[J]. Transactions of the Chinese Society for Agricultural Machinery, 2024, 55(s2): 286-293.

History
  • Received: 2024-08-10
  • Published online: 2024-12-10