
Recognition Method of Flat and Ridged Crop Types in Dry Fields Based on Proprioceptive Sensing Signals of Agricultural Robot

Fund Projects: National Natural Science Foundation of China (32271988), Jilin Province Science and Technology Development Program (20230508032RC), and Jilin Province Key Research and Development Program (20220202028NC)




    Abstract:

Dryland agricultural cultivation modes include flat cropping and ridge cropping, and their terrain undulation differs greatly, so accurate recognition of the crop-row cultivation mode is of great significance to the walking stability of a robot. A method was introduced for identifying flat- and ridge-cropped terrain types from the robot's own proprioceptive sensor signals. First, inertial measurement unit (IMU) signals were collected from a quadruped robot walking along the crop rows of a corn field; foot-end velocity data from the robot's left front leg served as supplementary information. Together these formed a signal dataset covering the robot's movement in flat cropping and in ridge cropping at two distinct ridge heights. Next, spatial features were extracted from the signals by a convolutional neural network (CNN), time-series features were derived by a bidirectional long short-term memory (BiLSTM) network, and self-attention (SA) was employed to compute attention scores over the feature information output by the CNN and BiLSTM. Finally, the efficacy of the proposed model in distinguishing flat and ridge crop types was validated through model comparisons and field experiments. The results indicated that the F1 score of the proposed CNN-BiLSTM-SA model reached 92%, an improvement of 10.17, 3.51, 2.57 and 1.27 percentage points over the CNN, CNN-LSTM, CNN-LSTM-SA and CNN-BiLSTM models, respectively. Embedded in the field robot, the recognition model achieved 90% accuracy in identifying the current crop row's tillage type within 1.4 s and met the classification criteria for flat and ridge categories within 4.8 s, satisfying the robot's requirements for rapid and accurate recognition across different tillage terrains. The algorithm gives the robot the ability to recognize crop-row terrain under typical dryland tillage patterns, providing technical support for improving the field stability of quadruped robots in autonomous operations.
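The sensing-to-classification pipeline the abstract describes — windowing the proprioceptive signals (IMU plus foot-end velocity) into samples and weighting feature sequences by self-attention — can be sketched minimally as follows. This is an illustrative sketch only: the function names, window length, step size, and channel count are assumptions, not from the paper, and the CNN and BiLSTM stages are omitted for brevity.

```python
import numpy as np

def sliding_windows(signal, win, step):
    """Segment a (T, C) multichannel signal (e.g. IMU axes plus
    foot-end velocity) into overlapping (win, C) training samples."""
    return np.stack([signal[i:i + win]
                     for i in range(0, len(signal) - win + 1, step)])

def self_attention(x):
    """Scaled dot-product self-attention over a (L, D) feature
    sequence; returns attention-weighted features of the same shape."""
    d = x.shape[-1]
    scores = x @ x.T / np.sqrt(d)                 # (L, L) similarity scores
    scores -= scores.max(axis=-1, keepdims=True)  # numerical stability
    w = np.exp(scores)
    w /= w.sum(axis=-1, keepdims=True)            # row-wise softmax weights
    return w @ x

# Hypothetical shapes: 200 timesteps, 9 channels (3-axis accel,
# 3-axis gyro, 3-axis foot-end velocity).
sig = np.random.default_rng(0).normal(size=(200, 9))
samples = sliding_windows(sig, win=50, step=25)
print(samples.shape)       # (7, 50, 9)
attended = self_attention(samples[0])
print(attended.shape)      # (50, 9)
```

In the paper's model the attention scores are computed over CNN and BiLSTM feature maps rather than raw windows, but the mechanism — a softmax over pairwise similarity scores used to reweight the sequence — is the same.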

Cite This Article

ZHANG Weirong, CHEN Xuegeng, QI Jiangtao, ZHOU Junbo, WEN Haojun, LIU Huili. Recognition Method of Flat and Ridged Crop Types in Dry Fields Based on Proprioceptive Sensing Signals of Agricultural Robot[J]. Transactions of the Chinese Society for Agricultural Machinery, 2025, 56(2): 164-174.

復(fù)制
分享
文章指標(biāo)
  • 點擊次數(shù):
  • 下載次數(shù):
  • HTML閱讀次數(shù):
  • 引用次數(shù):
History
  • Received: 2024-11-04
  • Published online: 2025-02-10