Installing OpenNI2 on Windows 10 and Capturing Kinect v1 Images

Copyright notice: This is an original article by the blogger and may not be reproduced without permission. https://blog.csdn.net/u011988573/article/details/56012416
  • Initial environment: Windows 10 (64-bit) + VS2015 + OpenCV 2.4.13
  • Hardware: Kinect v1
  • Final environment: Windows 10 (64-bit) + VS2015 + OpenCV 2.4.13 + Kinect SDK 1.8 + OpenNI 2.2 + NiTE 2.2

1. Setting Up the OpenNI2 Environment (Kinect SDK 1.8 + OpenNI 2.2 + NiTE 2.2)

References:
- (1) http://blog.csdn.net/chenxin_130/article/details/8580636
- (2) https://social.msdn.microsoft.com/Forums/en-US/bae80fb0-fce5-468e-a292-fe46381af3e5/how-to-install-openni-22-nite-22-kinect-sdk-18-windows-7-3264-bit?forum=kinectsdk
- (3) http://www.cnblogs.com/tornadomeet/archive/2012/11/16/2772681.html

(1) Do not plug the Kinect into the PC yet! Install Kinect for Windows SDK 1.8 first (from the official download page), then connect the Kinect. You can verify the installation in Device Manager or by running one of the SDK samples. (If you connect the Kinect before installing the SDK, Windows automatically installs an incorrect driver, and the Kinect will later appear in Device Manager only as an "Unknown device". In that case, uninstall the unknown device's driver, unplug the Kinect, install the SDK, and then plug the Kinect back in.)
(2) Once the driver is installed, install OpenNI 2.2 first (64-bit or 32-bit CSDN download links), then NiTE 2.2 (64-bit or 32-bit CSDN download links). The installers can be downloaded from reference (2) or found on CSDN. Reference (2) suggests installing both the 64-bit and 32-bit versions of OpenNI 2.2 and NiTE 2.2 on a 64-bit machine, but in my tests installing both versions made the Kinect fail to open, so install only the 64-bit or only the 32-bit OpenNI and NiTE.
(3) Every time you run an .exe, copy all files from the OpenNI2\Redist\ folder into the same directory as the .exe.
(4) If Kinect initialization fails, try opening Notepad as administrator, editing "C:\Program Files (x86)\OpenNI2\Redist\OpenNI2\Drivers\PS1080.ini", and changing ";UsbInterface=2" to "UsbInterface=0" (remember to remove the ";" in front of UsbInterface). (When testing an OpenNI2 sample such as NiViewer, edit the "OpenNI2\Drivers\PS1080.ini" file in the sample's own directory instead.)
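The edit in step (4) amounts to the following one-line change in PS1080.ini (only the UsbInterface line matters; the source does not explain what each value means beyond 0 working here):

```ini
; before: the line is commented out, so the driver default is used
;UsbInterface=2

; after: uncommented and set to 0, which often fixes Kinect initialization failures
UsbInterface=0
```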
(5) Configure the VS development environment; see reference (1).
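As a rough sketch of the project settings from reference (1): the OpenNI2 installer defines environment variables you can use in the project properties. The variable names below are the ones the 64-bit installer sets; they are an assumption here, so adjust them (e.g. to the 32-bit equivalents) to match your install:

```text
VC++ Directories -> Include Directories:    $(OPENNI2_INCLUDE64)
VC++ Directories -> Library Directories:    $(OPENNI2_LIB64)
Linker -> Input -> Additional Dependencies: OpenNI2.lib
```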

2. Capturing Kinect v1 Images

(1) Capturing color and depth images
Reference: http://blog.csdn.net/chenxin_130/article/details/8619909

(2) Capturing infrared images

The code is as follows:

#include <stdlib.h> 
#include <iostream> 
#include <string> 
#include "OpenNI.h" 
#include "opencv2/core/core.hpp" 
#include "opencv2/highgui/highgui.hpp" 
#include "opencv2/imgproc/imgproc.hpp" 
using namespace std;
using namespace cv;
using namespace openni;


void CheckOpenNIError(Status result, string status)
{
    if (result != STATUS_OK)
        cerr << status << " Error: " << OpenNI::getExtendedError() << endl;
}

int main(int argc, char** argv)
{
    Status result = STATUS_OK;

    // OpenNI2 image
    VideoFrameRef oniIrImg;

    // OpenCV image
    cv::Mat cvIrImg;

    cv::namedWindow("Ir");
    char key = 0;

    //【1】
    // initialize OpenNI2
    result = OpenNI::initialize();
    CheckOpenNIError(result, "initialize context");

    // open device
    Device device;
    result = device.open(openni::ANY_DEVICE);
    CheckOpenNIError(result, "open device");

    //【2】
    // create IR stream
    VideoStream oniIrStream;
    result = oniIrStream.create(device, openni::SENSOR_IR);
    CheckOpenNIError(result, "create IR stream");

    //【3】
    // set IR video mode
    VideoMode modeIr;
    modeIr.setResolution(640, 480);
    modeIr.setFps(30);
    modeIr.setPixelFormat(PIXEL_FORMAT_GRAY16);
    oniIrStream.setVideoMode(modeIr);
    // start IR stream
    result = oniIrStream.start();
    CheckOpenNIError(result, "start IR stream");

    while (key != 27)  // loop until ESC
    {
        if (oniIrStream.readFrame(&oniIrImg) == STATUS_OK)
        {
            // wrap the raw 16-bit IR buffer in a cv::Mat, then convert to 8 bits for display
            cv::Mat cvRawImg16U(oniIrImg.getHeight(), oniIrImg.getWidth(), CV_16UC1, (void*)oniIrImg.getData());
            cvRawImg16U.convertTo(cvIrImg, CV_8U);

            cv::imshow("Ir", cvIrImg);
        }
        key = cv::waitKey(20);
    }

    // OpenCV cleanup
    cv::destroyWindow("Ir");

    // OpenNI2 cleanup
    oniIrStream.destroy();
    device.close();
    OpenNI::shutdown();

    return 0;
}

3. Running the Program

Note: assuming the project is located at "..\Visual Studio 2015\Projects\GetKinectIRImageProject\" and the program is run in Debug x86 mode, copy all files from "C:\Program Files (x86)\OpenNI2\Redist" into the "..\Visual Studio 2015\Projects\GetKinectIRImageProject\Debug\" folder.
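Instead of copying the Redist files by hand after every rebuild, you can automate the copy with a post-build event (Project Properties -> Build Events -> Post-Build Event). This is an optional convenience, using the install path from step (4) above and VS's `$(TargetDir)` macro:

```
xcopy /y /e "C:\Program Files (x86)\OpenNI2\Redist\*" "$(TargetDir)"
```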

(Figure: infrared image captured from the Kinect v1)

4. Capturing and Saving Color (RGB) and Infrared (IR) Images

/*************************
OpenNI2 Depth, Color and Fusion Image
Author: Xin Chen, 2013.2
Blog: http://blog.csdn.net/chenxin_130
*************************/
/*************************
Add IR image getting and Image saving
Author: LuHaiyan, 2017.2
Blog: http://blog.csdn.net/u011988573
*************************/

// otherwise MSVC reports: "This function or variable may be unsafe. Consider using localtime_s instead."
#pragma warning(disable:4996)
#include <time.h>    // for time() / localtime() / strftime()
#include <direct.h>  // for _mkdir()
#include <stdlib.h>  
#include <iostream>  
#include <string>  
#include "OpenNI.h"  
#include "opencv2/core/core.hpp"  
#include "opencv2/highgui/highgui.hpp"  
#include "opencv2/imgproc/imgproc.hpp"  
using namespace std;
using namespace cv;
using namespace openni;

/*
As noted in http://www.cnblogs.com/tornadomeet/archive/2012/11/16/2772681.html,
Microsoft's recommended operating range for the Kinect is 1220 mm (4') to 3810 mm (12.5').
In his post on Kinect + OpenNI depth values, the blogger heresy measured that the
Kinect's depth precision is about 3 mm at a distance of 1.0 m and about 3 cm at 3.0 m.
So within the recommended range, precision is around 3 mm at 1.2 m and worse than
3 cm at 3.6 m: the farther the distance, the lower the depth precision.
Also note that the depth values (z coordinates) obtained through OpenNI are in mm,
which matters when programming, and depth is typically represented with 12 bits
(13 bits are actually possible), i.e. a maximum value of 4095, corresponding to 4.095 m.
To stretch captured depth data into a grayscale image, multiply by a factor of 255/4095
(usually approximated as 256/4096 for speed, so each grayscale step of 1 corresponds to
a 16 mm change in depth; at a distance of about 1 m, where the raw precision is 3 mm,
this normalization therefore reduces the effective precision further).
A depth value of 0 means no depth could be measured at that pixel.
*/
#define DEPTH_SCALE_FACTOR 256./4096.

string getTime()
{
    time_t rawtime;
    struct tm * timeinfo;
    char buffer[128];

    time(&rawtime);
    timeinfo = localtime(&rawtime);

    strftime(buffer, sizeof(buffer), "%Y_%m_%d_%H_%M_%S", timeinfo);
    string time_now = buffer;
    cout << time_now << endl;
    return time_now;
}

void CheckOpenNIError(Status result, string status)
{
    if (result != STATUS_OK)
        cerr << status << " Error: " << OpenNI::getExtendedError() << endl;
}

int main(int argc, char** argv)
{
    Status result = STATUS_OK;

    //OpenNI2 image  
    //VideoFrameRef oniDepthImg;
    VideoFrameRef oniColorImg;
    VideoFrameRef oniIrImg;

    //OpenCV image  
    //cv::Mat cvDepthImg;
    cv::Mat cvBGRImg;
    //cv::Mat cvFusionImg;
    cv::Mat cvIrImg;

    //cv::namedWindow("depth");
    cv::namedWindow("rgb");
    //cv::namedWindow("fusion");
    cv::namedWindow("Ir");

    //【1】  
    // initialize OpenNI2  
    result = OpenNI::initialize();
    CheckOpenNIError(result, "initialize context");

    // open device    
    Device device;
    result = device.open(openni::ANY_DEVICE);

    //【2】  
    //// create depth stream   
    //VideoStream oniDepthStream;
    //result = oniDepthStream.create(device, openni::SENSOR_DEPTH);
    //
    ////【3】  
    //// set depth video mode  
    //VideoMode modeDepth;
    //modeDepth.setResolution(640, 480);
    //modeDepth.setFps(30);
    //modeDepth.setPixelFormat(PIXEL_FORMAT_DEPTH_1_MM);
    //oniDepthStream.setVideoMode(modeDepth);
    //// start depth stream  
    //result = oniDepthStream.start();

    // create Ir stream 
    VideoStream oniIrStream;
    result = oniIrStream.create(device, openni::SENSOR_IR);

    //【3】 
    // set Ir video mode 
    VideoMode modeIr;
    modeIr.setResolution(640, 480);
    modeIr.setFps(30);
    modeIr.setPixelFormat(PIXEL_FORMAT_GRAY16);
    oniIrStream.setVideoMode(modeIr);
    // start Ir stream 
    //result = oniIrStream.start();

    // create color stream  
    VideoStream oniColorStream;
    result = oniColorStream.create(device, openni::SENSOR_COLOR);
    // set color video mode  
    VideoMode modeColor;
    modeColor.setResolution(640, 480);
    modeColor.setFps(30);
    modeColor.setPixelFormat(PIXEL_FORMAT_RGB888);
    oniColorStream.setVideoMode(modeColor);

    //【4】  
    // set depth and color imge registration mode  
    /*if (device.isImageRegistrationModeSupported(IMAGE_REGISTRATION_DEPTH_TO_COLOR))
    {
    device.setImageRegistrationMode(IMAGE_REGISTRATION_DEPTH_TO_COLOR);
    }*/
    // start color stream  
    result = oniColorStream.start();

    // mkdir
    /*string path_of_all = "F:\\Data_kinect\\test\\";
    string path_of_rgb_d = path_of_all + getTime();
    _mkdir(const_cast<char*>(path_of_rgb_d.c_str()));*/

    char pathOfAll[1024]= "F:\\Data_kinect\\test\\";
    strcat(pathOfAll, getTime().c_str());
    strcat(pathOfAll, "\\");
    _mkdir(pathOfAll);
    char path[200];
    char path_rgb[200];
    char path_ir[200];
    //string path_of_rgb = path_of_rgb_d + "\\rgb_image";
    //string path_of_depth = path_of_rgb_d + "\\depth_image";
    //string path_of_ir = path_of_rgb_d + "\\ir_image";
    //string path_of_fusion = path_of_rgb_d + "\\fusion_image";
    //_mkdir(const_cast<char*>(path_of_rgb.c_str()));
    //_mkdir(const_cast<char*>(path_of_depth.c_str()));
    //_mkdir(const_cast<char*>(path_of_fusion.c_str()));
    //_mkdir(const_cast<char*>(path_of_ir.c_str()));

    // image index
    int frame_index = 0;
    char key = 'a';
    while (key != 27)
    {
        // read frame  
        if (oniColorStream.readFrame(&oniColorImg) == STATUS_OK)
        {
            // convert data into OpenCV type  
            cv::Mat cvRGBImg(oniColorImg.getHeight(), oniColorImg.getWidth(), CV_8UC3, (void*)oniColorImg.getData());
            cv::cvtColor(cvRGBImg, cvBGRImg, CV_RGB2BGR);
            flip(cvBGRImg, cvBGRImg, 1);
            cv::imshow("rgb", cvBGRImg);

        }

        //if (oniDepthStream.readFrame(&oniDepthImg) == STATUS_OK)
        //{
        //  cv::Mat cvRawImg16U(oniDepthImg.getHeight(), oniDepthImg.getWidth(), CV_16UC1, (void*)oniDepthImg.getData());
        //  //cvRawImg16U.convertTo(cvDepthImg, CV_8U, 255.0 / (oniDepthStream.getMaxPixelValue()));
        //  cvRawImg16U.convertTo(cvDepthImg, CV_8U, DEPTH_SCALE_FACTOR);
        //  //cvRawImg16U.convertTo(cvDepthImg, CV_8U);

        //  //inverse depth color
        //  /*for (int i = 0; i < cvDepthImg.rows; i++)
        //  {
        //      uchar* pData = cvDepthImg.ptr<uchar>(i);
        //      for (int j = 0; j < cvDepthImg.cols ; j++)
        //      {
        //          pData[j] = cv::saturate_cast<uchar>(255-pData[j]);
        //      }
        //  }*/

        //  //【5】  
        //  // convert depth image GRAY to BGR  
        //  //cv::cvtColor(cvDepthImg, cvFusionImg, CV_GRAY2BGR);
        //  cv::imshow("depth", cvDepthImg);
        //}



        //【6】  
        //cv::addWeighted(cvBGRImg, 0.5, cvFusionImg, 1.0, 0, cvFusionImg);
        //cv::imshow("fusion", cvFusionImg);

        key = waitKey(1);
        if (key == 's') {
            //save image to certain path
            /*stringstream stream;
            stream << frame_index;
            string frame_index_str = stream.str();*/

            /*string rgb_path = path_of_rgb + "\\";
            rgb_path += frame_index_str + ".jpg";*/

            /*string depth_path = path_of_depth + "\\";
            depth_path += frame_index_str + ".jpg";*/

            //string fusion_path = path_of_fusion + "\\";
            //fusion_path += frame_index_str + ".jpg";

            /*string ir_path = path_of_ir + "\\";
            ir_path += frame_index_str + ".jpg";*/

            strcpy(path, pathOfAll);
            sprintf(path_rgb, "%04d_rgb.jpg", frame_index);
            strcat(path, path_rgb);
            cout <<"rgb:"<< path << endl;
            imwrite(path, cvBGRImg);
            //imwrite(depth_path, cvDepthImg);
            //imwrite(fusion_path, cvFusionImg);

            oniColorStream.stop();
            oniIrStream.start();
            if (oniIrStream.readFrame(&oniIrImg) == STATUS_OK)
            {
                cv::Mat cvRawImg16U(oniIrImg.getHeight(), oniIrImg.getWidth(), CV_16UC1, (void*)oniIrImg.getData());
                cvRawImg16U.convertTo(cvIrImg, CV_8U);
                flip(cvIrImg, cvIrImg, 1);
                cv::imshow("Ir", cvIrImg);

                // only save the IR frame if it was actually read;
                // imwrite on an empty Mat would throw
                strcpy(path, pathOfAll);
                sprintf(path_ir, "%04d_ir.jpg", frame_index);
                strcat(path, path_ir);
                cout << "ir:" << path << endl;
                imwrite(path, cvIrImg);
            }
            cout <<"Index "<< frame_index << " rgb and ir image saved." << endl;
            oniIrStream.stop();
            oniColorStream.start();
            frame_index++;
        }


    }

    //cv destroy  
    //cv::destroyWindow("depth");
    cv::destroyWindow("rgb");
    //cv::destroyWindow("fusion");
    cv::destroyWindow("Ir");

    //OpenNI2 destroy  
    //oniDepthStream.destroy();
    oniColorStream.destroy();
    oniIrStream.destroy();
    device.close();
    OpenNI::shutdown();

    return 0;
}
