
Inter-rows Navigation Method of Greenhouse Robot Based on Fusion of Camera and LiDAR
Fund Project:

National Key Research and Development Program of China (2020AAA0108103), Independent Project of the Institutes for Robotics and Intelligent Manufacturing, Chinese Academy of Sciences (C2021002), and Director's Fund of Hefei Institutes of Physical Science, Chinese Academy of Sciences (YZJJZX202013)




    Abstract:

    To address the complex greenhouse environment, where the ground is uneven and branches and leaves occlude the path, an inter-row navigation method for greenhouse robots based on the fusion of camera and LiDAR data was developed. Firstly, an improved U-Net model was used to segment the road area in the image accurately and quickly. Secondly, the ground point cloud was pre-segmented by fusing the image segmentation result, reducing the tilt of the point cloud caused by ground unevenness. Then, an improved KMeans algorithm was used to cluster the crop-row point cloud rapidly, and the cluster centers were taken as points of the crop-row trunk region, reducing the influence of occluding branches and leaves on the extraction of the crop-row centerline. Finally, the RANSAC algorithm was used to fit the line equations of the crop rows on both sides and to compute the navigation line. The accuracy of the navigation line was evaluated experimentally; validation was conducted in two greenhouse scenarios at three typical operating speeds of greenhouse robots. The results showed that the segmentation quality and runtime of the image model met the requirements of the subsequent point cloud pre-segmentation; experiments on point cloud frames collected on bumpy ground showed that the method effectively corrected the tilted ground point cloud; compared with grid-based height-difference segmentation of the ground point cloud, the proposed segmentation performed better while adding only a small amount of per-frame processing time; and on the test set, the navigation line was accurately extracted from more than 94% of the data frames, with an average angle error no greater than 1.45°. The method meets the requirements for autonomous navigation of greenhouse robots along crop rows.
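    To make the pipeline described in the abstract concrete, the sketch below illustrates two of its steps under assumed conventions: filtering LiDAR points with the segmented road mask via a pinhole projection, and extracting a navigation line by clustering each crop row with KMeans and RANSAC-fitting a line through the cluster centres. This is a minimal illustration, not the authors' implementation; the coordinate frames, intrinsics (fx, fy, cx, cy), cluster count, inlier threshold, and the midline definition of the navigation line are all assumptions made for the example.

```python
# Minimal sketch (not the paper's code) of camera-LiDAR mask fusion and
# navigation-line extraction. Robot ground frame is assumed: x forward, y
# lateral, in metres; camera frame is the usual pinhole convention.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.linear_model import RANSACRegressor


def points_in_road_mask(pts_cam: np.ndarray, mask: np.ndarray,
                        fx: float, fy: float, cx: float, cy: float) -> np.ndarray:
    """Keep LiDAR points (Nx3, already in the camera frame) whose projection
    falls inside the segmented road mask (HxW boolean image)."""
    z = pts_cam[:, 2]
    valid = z > 0.1                                    # points in front of the camera
    z_safe = np.where(valid, z, 1.0)                   # avoid division by ~0 elsewhere
    u = (fx * pts_cam[:, 0] / z_safe + cx).astype(int)
    v = (fy * pts_cam[:, 1] / z_safe + cy).astype(int)
    h, w = mask.shape
    inside = valid & (u >= 0) & (u < w) & (v >= 0) & (v < h)
    keep = np.zeros(len(pts_cam), dtype=bool)
    keep[inside] = mask[v[inside], u[inside]]
    return pts_cam[keep]


def row_line(points_xy: np.ndarray, n_clusters: int = 8) -> tuple[float, float]:
    """Fit one crop row as y = slope * x + intercept using cluster centres only,
    so scattered leaf returns have less pull on the fitted line."""
    centres = KMeans(n_clusters=n_clusters, n_init=10).fit(points_xy).cluster_centers_
    ransac = RANSACRegressor(residual_threshold=0.05)  # 5 cm inlier band (assumed)
    ransac.fit(centres[:, [0]], centres[:, 1])
    return float(ransac.estimator_.coef_[0]), float(ransac.estimator_.intercept_)


def navigation_line(left_xy: np.ndarray, right_xy: np.ndarray) -> tuple[float, float, float]:
    """Midline between the two fitted rows, plus the heading error (deg) of the
    midline relative to the robot's forward (x) axis."""
    sl, il = row_line(left_xy)
    sr, ir = row_line(right_xy)
    slope, intercept = (sl + sr) / 2.0, (il + ir) / 2.0
    return slope, intercept, float(np.degrees(np.arctan(slope)))


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    x = rng.uniform(0.5, 5.0, 400)                     # forward distance, metres
    left = np.column_stack([x, -0.6 + 0.02 * x + rng.normal(0, 0.05, 400)])
    right = np.column_stack([x, 0.6 + 0.02 * x + rng.normal(0, 0.05, 400)])
    print(navigation_line(left, right))                # roughly (0.02, 0.0, ~1.1 deg)
```

    Fitting the RANSAC line through cluster centres rather than raw points is one way to realise what the abstract calls using the cluster centres as trunk-region points: averaging within each cluster keeps stray leaf returns from pulling the crop-row line away from the stems.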

Cite this article:

WANG Jie, CHEN Zhengwei, XU Zhaosheng, HUANG Zidong, JING Junsen, NIU Runxin. Inter-rows Navigation Method of Greenhouse Robot Based on Fusion of Camera and LiDAR[J]. Transactions of the Chinese Society for Agricultural Machinery, 2023, 54(3): 32-40.

History
  • Received: 2022-04-20
  • Published online: 2023-03-10