
Detection of Moving Cows Based on an Adaptive Nonparametric Kernel Density Estimation Algorithm
Funding: Supported by the National Key R&D Program of China (2017YFD0701603), the National Natural Science Foundation of China (61473235), the Key Industrial Innovation Chain Project of Shaanxi Province (2019ZDLNY02-05), and the Fundamental Research Funds for the Central Universities (2452019027).


Detection of Moving Cows Based on Adaptive Kernel Density Estimation Algorithm

Abstract:

Accurate detection of moving cows in complex farming environments is the basis for perceiving cow behaviors such as lameness and estrus. To overcome the limitations of existing methods, which mostly use parametric models for moving-cow detection, a background modeling method based on nonparametric kernel density estimation was proposed. The probability model of each pixel was estimated from its historical samples; because redundant information in those samples made the model overly complex, a key-frame detection technique was adopted to eliminate redundant samples. This reduced the complexity of the algorithm while enabling the kernel function to acquire information from distant historical frames with a small sample set, thereby improving detection accuracy. To address incomplete target contours, the three-frame difference method was combined with the model to achieve a more complete extraction of moving targets. To verify the effectiveness of the proposed method, video samples of moving cows under different environments and disturbances were tested, and the method was compared with the Gaussian mixture model (GMM) and the kernel density estimation (KDE) model. The experimental results showed that the average foreground detection rate of the proposed algorithm was 95.65%, which was 15.56 percentage points higher than that of the Gaussian mixture model and 10.56 percentage points higher than that of the kernel density estimation model. The proposed algorithm also improved more markedly on the two baselines in complex conditions such as sunny, rainy, and night-time scenes. In addition, the average real-time indicator of the algorithm was 1.11, so the algorithm can basically realize real-time and accurate detection of moving cows. The results provide a reference for the prevention and diagnosis of cow lameness and the accurate perception of cow movement behaviors.
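The three-stage pipeline the abstract describes (key-frame sampling of the pixel history, per-pixel kernel density estimation for background subtraction, and three-frame differencing to complete the target contour) can be sketched as below. This is a minimal illustration under assumptions, not the authors' implementation: the key-frame criterion (mean absolute frame difference), the Gaussian kernel bandwidth, and all thresholds are chosen here for demonstration, and the paper operates on real video rather than toy arrays.

```python
import numpy as np

def select_key_frames(frames, diff_thresh=8.0, max_samples=20):
    # Keep a frame as a background sample only if it differs enough from the
    # last kept key frame, so near-duplicate history is discarded and a small
    # sample set can still span distant historical frames.
    keys = [frames[0].astype(np.float64)]
    for f in frames[1:]:
        if np.mean(np.abs(f.astype(np.float64) - keys[-1])) > diff_thresh:
            keys.append(f.astype(np.float64))
        if len(keys) >= max_samples:
            break
    return np.stack(keys)  # shape (N, H, W)

def kde_foreground_mask(frame, samples, bandwidth=20.0, threshold=1e-4):
    # Nonparametric per-pixel background model: estimate the density of each
    # pixel's current intensity with a Gaussian kernel over its historical
    # samples; a low density marks the pixel as foreground (moving cow).
    s = samples.astype(np.float64)
    diff = frame.astype(np.float64)[None] - s
    k = np.exp(-0.5 * (diff / bandwidth) ** 2) / (bandwidth * np.sqrt(2.0 * np.pi))
    return k.mean(axis=0) < threshold

def three_frame_difference(f0, f1, f2, thresh=15):
    # Classic three-frame difference: a pixel counts as moving only if it
    # changed between both consecutive frame pairs.
    d1 = np.abs(f1.astype(np.int16) - f0.astype(np.int16)) > thresh
    d2 = np.abs(f2.astype(np.int16) - f1.astype(np.int16)) > thresh
    return d1 & d2
```

In the spirit of the paper, the KDE mask and the three-frame-difference mask would then be fused (e.g. by a logical OR plus morphological cleanup) so that the motion mask fills in the contour regions the density model misses.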

Cite this article:

SONG Huaibo, YIN Xuqiang, WU Dihua, JIANG Bo, HE Dongjian. Detection of Moving Cows Based on Adaptive Kernel Density Estimation Algorithm[J]. Transactions of the Chinese Society for Agricultural Machinery, 2019, 50(5): 196-204.

History
  • Received: 2019-03-06
  • Published online: 2019-05-10
  • Published: 2019-05-10