RealSense T265 Quick Start

Our lab recently bought a new camera from Intel's RealSense family: it has two monochrome fisheye cameras, a built-in IMU, and an on-board VPU. It is quite capable and can output its pose directly; even used purely as an IMU it would not be a bad device.

Without further ado, let's get started.


System: Ubuntu 16.04

1. Installing the SDK

  • The official site has the full instructions; the SDK can either be built from source or installed from Intel's package repository. Here we take the simpler route and install from the repository.
# Add Intel server to the list of repositories :
echo 'deb http://realsense-hw-public.s3.amazonaws.com/Debian/apt-repo xenial main' | sudo tee /etc/apt/sources.list.d/realsense-public.list
# It is recommended to backup /etc/apt/sources.list.d/realsense-public.list file in case of an upgrade.
# Register the server’s public key :
sudo apt-key adv --keyserver keys.gnupg.net --recv-key 6F3EFCDE
# Refresh the list of repositories and packages available :
sudo apt-get update

# In order to run demos install:
sudo apt-get install librealsense2-dkms
sudo apt-get install librealsense2-utils

# Developers shall install additional packages:
sudo apt-get install librealsense2-dev
sudo apt-get install librealsense2-dbg

Once the installation finishes, plug in the camera and run the realsense-viewer command; you should see the image streams.
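
As a quick sanity check that the development packages (not just the viewer) are usable, the short sketch below enumerates the connected RealSense devices through rs2::context. It only assumes librealsense2-dev installed as above; save it in any .cpp file and compile it with, e.g., g++ file.cpp -lrealsense2.
#include <librealsense2/rs.hpp> // RealSense C++ API
#include <iostream>

int main() try
{
    rs2::context ctx;                               // library context
    rs2::device_list devices = ctx.query_devices(); // all connected RealSense devices
    if (devices.size() == 0)
    {
        std::cout << "No RealSense device detected." << std::endl;
        return 0;
    }
    for (rs2::device dev : devices)
    {
        // Print the model name and serial number of each device
        std::cout << "Found: " << dev.get_info(RS2_CAMERA_INFO_NAME)
                  << " (S/N " << dev.get_info(RS2_CAMERA_INFO_SERIAL_NUMBER) << ")" << std::endl;
    }
    return 0;
}
catch (const rs2::error & e)
{
    std::cerr << e.what() << std::endl;
    return 1;
}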

2. A simple test program

  • Main program (example.hpp comes from the examples directory of the librealsense repository)
#include <librealsense2/rs.hpp> // Include RealSense Cross Platform API
#include "example.hpp"          // Include short list of convenience functions for rendering
#include <iostream>             // std::cerr in the catch blocks below

// Capture example: grab the camera's video streams and render them to the screen
// (for the T265 these are the two fisheye streams)
int main(int argc, char * argv[]) try
{
    // Create a simple OpenGL window for rendering:
    window app(1280, 720, "RealSense Visualization");

    // Declare RealSense pipeline, encapsulating the actual device and sensors
    rs2::pipeline pipe;

    // Start streaming with the default recommended configuration.
    // With no explicit config the pipeline enables the device's default streams;
    // for the T265 these are the two fisheye streams and the 6-DoF pose stream.
    pipe.start();

    while (app) // Application still alive?
    {
        // Wait for the next set of frames from the camera
        rs2::frameset data = pipe.wait_for_frames();

        // The show method, when applied to a frameset, breaks it into individual frames
        // and uploads each frame into a GL texture.
        // Each texture is displayed in a different viewport according to its stream's unique id.
        app.show(data);
    }

    return EXIT_SUCCESS;
}
catch (const rs2::error & e)
{
    std::cerr << "RealSense error calling " << e.get_failed_function() << "(" << e.get_failed_args() << "):\n    " << e.what() << std::endl;
    return EXIT_FAILURE;
}
catch (const std::exception& e)
{
    std::cerr << e.what() << std::endl;
    return EXIT_FAILURE;
}
  • CMakeLists.txt
cmake_minimum_required(VERSION 3.1)
project(test)

set(CMAKE_BUILD_TYPE "Release")
set(CMAKE_CXX_STANDARD 11)

find_package(OpenGL REQUIRED)
find_package(glfw3 REQUIRED)

# Folder holding example.hpp (copied from the librealsense examples)
include_directories("../third_party")

# The trajectory / 3D-position demo (see the GitHub project mentioned below)
add_executable(main trajector.cpp)
target_link_libraries(main realsense2 glfw ${OPENGL_LIBRARIES})

# The capture program listed above
add_executable(test capture.cpp)
target_link_libraries(test realsense2 glfw ${OPENGL_LIBRARIES})
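
As mentioned in the introduction, the T265 also exposes its raw IMU data, so it can be used purely as a gyro/accelerometer source. Below is a minimal sketch of reading those streams, following the callback pattern of the official rs-motion example; the file name and run duration are arbitrary choices, and the only assumption is the librealsense2 setup above.
#include <librealsense2/rs.hpp>
#include <iostream>
#include <thread>
#include <chrono>

// Print raw gyro / accelerometer samples from the device.
int main() try
{
    rs2::pipeline pipe;
    rs2::config cfg;
    cfg.enable_stream(RS2_STREAM_GYRO,  RS2_FORMAT_MOTION_XYZ32F);
    cfg.enable_stream(RS2_STREAM_ACCEL, RS2_FORMAT_MOTION_XYZ32F);

    // Frames are delivered on a background thread through this callback
    pipe.start(cfg, [](rs2::frame frame)
    {
        auto motion = frame.as<rs2::motion_frame>();
        if (!motion) return;
        rs2_vector v = motion.get_motion_data();
        if (motion.get_profile().stream_type() == RS2_STREAM_GYRO)
            std::cout << "gyro  " << v.x << " " << v.y << " " << v.z << " rad/s" << std::endl;
        else if (motion.get_profile().stream_type() == RS2_STREAM_ACCEL)
            std::cout << "accel " << v.x << " " << v.y << " " << v.z << " m/s^2" << std::endl;
    });

    // Let the callback run for a few seconds, then stop streaming
    std::this_thread::sleep_for(std::chrono::seconds(5));
    pipe.stop();
    return 0;
}
catch (const rs2::error & e)
{
    std::cerr << e.what() << std::endl;
    return 1;
}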

There is one more demo in my GitHub project that displays the camera's 3D position; a minimal sketch of how the pose data behind it is read is shown below.
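
The sketch follows the official rs-pose example: it enables only the 6-DoF pose stream and prints the translation of each pose frame. Again the file name is arbitrary and the snippet just assumes the SDK installed above.
#include <librealsense2/rs.hpp>
#include <iostream>

// Read the T265 pose stream and print the camera translation.
int main() try
{
    rs2::pipeline pipe;
    rs2::config cfg;
    cfg.enable_stream(RS2_STREAM_POSE, RS2_FORMAT_6DOF); // pose stream only
    pipe.start(cfg);

    while (true)
    {
        rs2::frameset frames = pipe.wait_for_frames();
        // Pull the pose frame out of the frameset
        auto f = frames.first_or_default(RS2_STREAM_POSE);
        if (!f) continue;
        rs2_pose pose = f.as<rs2::pose_frame>().get_pose_data();
        std::cout << "x: " << pose.translation.x
                  << "  y: " << pose.translation.y
                  << "  z: " << pose.translation.z
                  << "  (tracker confidence " << pose.tracker_confidence << ")" << std::endl;
    }
    return 0;
}
catch (const rs2::error & e)
{
    std::cerr << e.what() << std::endl;
    return 1;
}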

3. Postscript

To close, two blog posts are recommended for further reading.

Reprinted from blog.csdn.net/qq_34935373/article/details/109445239