
Research on Limb Motion Command Recognition Technology of Lifting Robot

Fund projects: National Natural Science Foundation of China (51575219) and Fujian Marine Economy Innovation and Development Regional Demonstration Project (2014FJPT03)


Abstract:

Because the Kinect camera has a limited working distance for limb recognition, a large-zoom network camera is used instead, and a CNN-BP fusion network is constructed for limb motion recognition; nine groups of robot lifting commands serve as the training and recognition targets. First, the coordinates of 18 skeleton nodes are extracted with OpenPose to generate an RGB skeleton map and a skeleton vector. Then, using transfer learning, an InceptionV3 network extracts deep abstract image features from the RGB skeleton map; the training set is augmented by rotation, translation, scaling, and affine transformations to enlarge the training data and prevent overfitting. The skeleton vector is fed to a BP neural network to extract shallow features such as points, lines, and surfaces. Finally, the outputs of the InceptionV3 network and the BP neural network are fused, and a Softmax classifier yields the limb recognition result. This result is input to the robot-assisted lifting control system, where a double-verification control method completes the robot-assisted lifting operation. Experimental results show that the method guarantees both accuracy and timeliness: real-time recognition accuracy exceeds 0.99, greatly improving long-distance human-robot interaction.
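The skeleton vector fed to the BP branch can be sketched as follows. The abstract only says that 18 OpenPose skeleton-node coordinates are flattened into a vector; the neck-centred, torso-length normalisation shown here is an assumption added for illustration, using the COCO 18-keypoint layout that OpenPose outputs.

```python
# Hypothetical construction of the skeleton vector from OpenPose keypoints.
# The normalisation scheme is an assumption, not taken from the paper.
import numpy as np

NECK, R_HIP, L_HIP = 1, 8, 11   # COCO-18 keypoint indices used by OpenPose


def skeleton_vector(keypoints):
    """keypoints: (18, 2) array of (x, y) pixel coordinates.

    Returns a length-36 vector that is translation-invariant (centred on
    the neck) and scale-invariant (divided by the neck-to-hip distance).
    """
    kp = np.asarray(keypoints, dtype=np.float64)
    centred = kp - kp[NECK]                      # remove camera translation
    hip_mid = (kp[R_HIP] + kp[L_HIP]) / 2.0
    torso = np.linalg.norm(kp[NECK] - hip_mid)   # scale reference length
    return (centred / max(torso, 1e-6)).ravel()  # flatten to length 36
```

A normalisation of this kind is what lets a fixed-size BP network compare poses captured at different zoom levels of the network camera.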
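The CNN-BP fusion described in the abstract can be sketched in Keras. The abstract names only the two branches (InceptionV3 on the RGB skeleton map, a BP network on the skeleton vector), their fusion, and a Softmax output; the layer widths, the frozen ImageNet base, and concatenation as the fusion operator are assumptions in this sketch.

```python
# Hypothetical sketch of the CNN-BP fusion classifier: an InceptionV3
# branch over RGB skeleton maps fused with a small BP (fully connected)
# branch over the 18-keypoint skeleton vector. Layer sizes are assumed.
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers, models, applications

NUM_CLASSES = 9      # nine groups of lifting commands (from the abstract)
NUM_KEYPOINTS = 18   # OpenPose COCO-style skeleton nodes

# --- CNN branch: transfer learning on InceptionV3 ---
cnn_base = applications.InceptionV3(
    include_top=False, weights=None,  # use weights="imagenet" in practice
    input_shape=(299, 299, 3), pooling="avg")
cnn_base.trainable = False            # freeze the base for transfer learning

img_in = layers.Input(shape=(299, 299, 3), name="rgb_skeleton_map")
deep_feat = cnn_base(img_in)          # deep abstract image features

# --- BP branch: shallow features from the (x, y) keypoint vector ---
vec_in = layers.Input(shape=(NUM_KEYPOINTS * 2,), name="skeleton_vector")
shallow = layers.Dense(64, activation="relu")(vec_in)
shallow = layers.Dense(32, activation="relu")(shallow)

# --- Fusion + Softmax classifier over the nine lifting commands ---
fused = layers.Concatenate()([deep_feat, shallow])
out = layers.Dense(NUM_CLASSES, activation="softmax")(fused)
model = models.Model([img_in, vec_in], out)
model.compile(optimizer="adam", loss="categorical_crossentropy")
```

During training, the rotation, translation, scaling, and affine augmentations mentioned in the abstract would be applied to the RGB skeleton maps (e.g. with Keras preprocessing layers) before they reach the frozen InceptionV3 base.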

Cite this article:

NI Tao, ZOU Shaoyuan, LIU Haiqiang, HUANG Lingtao, CHEN Ning, ZHANG Hongyan. Research on Limb Motion Command Recognition Technology of Lifting Robot[J]. Transactions of the Chinese Society for Agricultural Machinery, 2019, 50(6): 405-411, 426.

History:
  • Received: 2018-11-16
  • Published online: 2019-06-10