

Kinect-Based Indoor Abnormal Behavior Detection

Published: 2018-10-30 20:28
[Abstract]: In recent decades, advances in science and technology have made video surveillance, especially high-definition systems, widespread, and computer vision techniques have improved accordingly. Applying computer vision to HD video in the security-monitoring domain can improve safety in public places, mainly by detecting abnormal behavior and alerting people when it occurs. As computer vision is applied ever more widely to video surveillance, efficient algorithms are needed to meet real-time requirements. In this field, researchers work to recognize and understand human behavior with computers, proceeding from foreground target detection, through target tracking and localization, to behavior understanding. Because captured video is affected by illumination, shadow, occlusion, and noise, behavior understanding remains difficult. The appearance of the Kinect brought depth images (RGB-D) to researchers' attention: the Kinect sensor is little affected by external interference, can recognize a target human body even in darkness, and can capture skeleton features with spatial properties that are useful for human behavior recognition. This has aroused strong interest among researchers and inspired new ideas, including detecting abnormal behavior on the Kinect platform. This thesis uses a Kinect device to detect indoor abnormal behavior, with RGB-D as the acquired data. The abnormal behaviors studied are those in indoor scenes that do not match people's expectations, typically falling, fighting, and chasing; detected abnormalities trigger an alarm. The thesis first describes the algorithms and features used in the three stages of human abnormal behavior detection, analyzes their advantages and disadvantages, discusses the state of research together with its problems and difficulties, and analyzes the feasibility of using image depth information and skeleton joint information. It then introduces the Kinect hardware and software architecture, explains how RGB-D information is obtained, and describes the skeleton joints; skeleton joint information is extracted from the collected data, joint-angle information is used as the feature representation, and these features are used to distinguish behaviors. Next, mainstream human behavior recognition algorithms are reviewed; the thesis adopts a dynamic warping algorithm to detect human behavior and improves the algorithm to raise its running efficiency. Finally, the research work is summarized, and future work and development trends are discussed.
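The abstract describes two technical ingredients: per-frame joint-angle features computed from Kinect skeleton joints, and a dynamic warping (DTW-style) comparison of feature sequences for behavior detection. The following is only a minimal sketch of those two ideas, not the thesis's implementation; the joint names, sample coordinates, and the choice of plain DTW are assumptions made for illustration.

```python
# Minimal sketch: joint-angle features from 3-D skeleton joints, and a
# classic dynamic time warping (DTW) distance between two feature sequences.
# Joint positions and the "falling" template below are hypothetical.
import numpy as np


def joint_angle(a, b, c):
    """Angle (radians) at joint b formed by 3-D joint positions a-b-c."""
    v1 = np.asarray(a, dtype=float) - np.asarray(b, dtype=float)
    v2 = np.asarray(c, dtype=float) - np.asarray(b, dtype=float)
    cos = np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2) + 1e-9)
    return float(np.arccos(np.clip(cos, -1.0, 1.0)))


def dtw_distance(seq_a, seq_b):
    """O(n*m) DTW between two sequences of per-frame feature vectors."""
    n, m = len(seq_a), len(seq_b)
    acc = np.full((n + 1, m + 1), np.inf)
    acc[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = np.linalg.norm(np.asarray(seq_a[i - 1]) - np.asarray(seq_b[j - 1]))
            acc[i, j] = cost + min(acc[i - 1, j], acc[i, j - 1], acc[i - 1, j - 1])
    return float(acc[n, m])


# Hypothetical usage: each frame's feature is a vector of joint angles
# (e.g. elbows, knees); an observed sequence is matched against a stored
# template of a known action such as falling, and a small DTW distance
# would indicate that the observed behavior resembles the template.
template = [[joint_angle((0, 1, 0), (0, 0, 0), (1, 0, 0))] for _ in range(10)]
observed = [[joint_angle((0, 1, 0), (0, 0, 0), (1, 0.1 * t, 0))] for t in range(12)]
print(dtw_distance(template, observed))
```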
[Degree-granting institution]: Jilin University
[Degree level]: Master
[Year of award]: 2017
[CLC classification]: TP391.41


