Robotics: Detecting Grasp Poses in a Point Cloud with Grasp Pose Detection (GPD), Writing an Extension Example

Copyright notice: this is the author's original article and may not be reproduced without permission. https://blog.csdn.net/flyfish1986/article/details/85266537

flyfish

Environment:
Ubuntu 16.04
ROS Kinetic
Tutorial: Detect Grasps With an RGBD Camera
Language: C++
Camera: astra

Add the following to CMakeLists.txt:

add_compile_options(-std=c++11)

add_executable(${PROJECT_NAME}_get_grasps src/interface/get_grasps.cpp)

target_link_libraries(${PROJECT_NAME}_get_grasps
                      ${catkin_LIBRARIES})

set_target_properties(${PROJECT_NAME}_get_grasps
                      PROPERTIES OUTPUT_NAME get_grasps
                      PREFIX "")

Example code (some of the header files included below are redundant):

//std
#include <tuple>
#include <map>
#include <vector>
#include <queue>
#include <mutex>
#include <string>

// PCL
#include <pcl/io/pcd_io.h>
#include <pcl/filters/random_sample.h>
#include <pcl/point_types.h>
#include <pcl_conversions/pcl_conversions.h>
// ROS

#include <ros/ros.h>
#include <ros/callback_queue.h>
#include <sensor_msgs/PointCloud2.h>
#include <visualization_msgs/MarkerArray.h>

#include <sensor_msgs/JointState.h>
#include <std_msgs/Float64MultiArray.h>
#include <std_msgs/Empty.h>
#include <std_srvs/SetBool.h>
#include <std_msgs/Int64.h>
// this project (services)

// GPG
#include <gpg/cloud_camera.h>

// this project (messages)
#include <gpd/CloudIndexed.h>
#include <gpd/CloudSamples.h>
#include <gpd/CloudSources.h>
#include <gpd/GraspConfig.h>
#include <gpd/GraspConfigList.h>
#include <gpd/SamplesMsg.h>
#include <gpd/detect_grasps.h>

//tf
#include <tf/transform_listener.h>
#include <tf_conversions/tf_eigen.h>
//moveit  

#include <geometry_msgs/PoseStamped.h>
#include <geometry_msgs/PoseArray.h>
#include <moveit/move_group_interface/move_group_interface.h>
#include <moveit/planning_scene_interface/planning_scene_interface.h>
#include <moveit/planning_scene_monitor/planning_scene_monitor.h>
#include <moveit/robot_state/conversions.h>
#include <moveit/ompl_interface/ompl_interface.h>
#include <moveit/dynamics_solver/dynamics_solver.h>




//action
#include <actionlib/client/simple_action_client.h>
#include <control_msgs/FollowJointTrajectoryAction.h>
#include <trajectory_msgs/JointTrajectoryPoint.h>


#include <dynamic_reconfigure/server.h>


// Explanation of GraspConfigList and GraspConfig (from gpd/GraspConfigList.msg and gpd/GraspConfig.msg):
// # This message stores a list of grasp configurations.

// # The time of acquisition, and the coordinate frame ID.
// Header header

// # The list of grasp configurations.
// gpd/GraspConfig[] grasps


// # Position
// geometry_msgs/Point bottom # centered bottom/base of the robot hand)
// geometry_msgs/Point top # centered top/fingertip of the robot hand)
// geometry_msgs/Point surface # centered position on object surface

// # Orientation represented as three axis (R = [approach binormal axis])
// geometry_msgs/Vector3 approach # The grasp approach direction
// geometry_msgs/Vector3 binormal # The binormal
// geometry_msgs/Vector3 axis # The hand axis

// geometry_msgs/Point sample # Point at which the grasp was found

// std_msgs/Float32 width # Required aperture (opening width) of the robot hand

// std_msgs/Float32 score # Score assigned to the grasp by the classifier
void detect_grasps_callback(const gpd::GraspConfigList& l)
{
  if (l.grasps.empty())  // nothing to do if no grasps were published
  {
    return;
  }

  // Take the first (highest-scoring) grasp and convert it into a pose MoveIt can use.
  gpd::GraspConfig g = l.grasps[0];

  // Convert the custom grasp representation into standard forms.
  double roll, pitch, yaw;

// File: geometry_msgs/Vector3.msg
// Raw Message Definition
// # This represents a vector in free space.
// # It is only meant to represent a direction. Therefore, it does not
// # make sense to apply a translation to it (e.g., when applying a
// # generic rigid transformation to a Vector3, tf2 will only apply the
// # rotation). If you want your data to be translatable too, use the
// # geometry_msgs/Point message instead.

// float64 x
// float64 y
// float64 z

  // Rotation matrix R = [approach binormal axis] (as columns).
  tf::Matrix3x3 m(
      g.approach.x, g.binormal.x, g.axis.x,
      g.approach.y, g.binormal.y, g.axis.y,
      g.approach.z, g.binormal.z, g.axis.z);

  m.getEulerYPR(yaw, pitch, roll);  // RPY is easier for a human to read

  std::cout << m.getRow(0).getX() << " " << m.getRow(0).getY() << " " << m.getRow(0).getZ() << std::endl;
  std::cout << m.getRow(1).getX() << " " << m.getRow(1).getY() << " " << m.getRow(1).getZ() << std::endl;
  std::cout << m.getRow(2).getX() << " " << m.getRow(2).getY() << " " << m.getRow(2).getZ() << std::endl;

  geometry_msgs::Point point;
  point.x = g.sample.x;
  point.y = g.sample.y;
  point.z = g.sample.z;

  // Print x, y, z, roll, pitch, yaw.
  ROS_INFO("output xyzrpy: %f  %f  %f  %f  %f  %f", point.x, point.y, point.z, roll, pitch, yaw);

  geometry_msgs::PoseStamped target_pose_stamped;  // quaternion form for the planner
  target_pose_stamped.pose.position.x = point.x;
  target_pose_stamped.pose.position.y = point.y;
  target_pose_stamped.pose.position.z = point.z;
  target_pose_stamped.pose.orientation = tf::createQuaternionMsgFromRollPitchYaw(roll, pitch, yaw);
}

int main(int argc, char *argv[])
{
  ros::init(argc, argv, "get_grasps");  // initialize the ROS node
  ros::NodeHandle n;                    // create the node handle

  // Subscribe to the grasps published by the GPD detection node.
  ros::Subscriber sub = n.subscribe("/detect_grasps/clustered_grasps", 1000, detect_grasps_callback);

  ros::Rate rate(1);
  while (ros::ok())
  {
    ros::spinOnce();
    rate.sleep();
  }

  return 0;
}
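
The callback above stops after building target_pose_stamped. The sketch below shows one way that pose could be handed to MoveIt, which the included headers already allow for. It is only a sketch: the planning group name "arm" is an assumption, and target_pose_stamped.header.frame_id must be set to the frame the grasp was detected in (normally the camera frame) before calling it.

// Sketch: pass the grasp pose to MoveIt for planning and execution.
// "arm" is an assumed planning group name; replace it with your robot's group.
#include <moveit/move_group_interface/move_group_interface.h>
#include <geometry_msgs/PoseStamped.h>

void plan_to_grasp(const geometry_msgs::PoseStamped& target_pose_stamped)
{
  moveit::planning_interface::MoveGroupInterface group("arm");

  // MoveIt interprets the pose in target_pose_stamped.header.frame_id,
  // so it must be filled in (e.g. with the camera frame of the point cloud).
  group.setPoseTarget(target_pose_stamped);

  moveit::planning_interface::MoveGroupInterface::Plan plan;
  if (group.plan(plan) == moveit::planning_interface::MoveItErrorCode::SUCCESS)
  {
    group.execute(plan);
  }
}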

Finally, launch everything:

1. Start the camera:
roslaunch astra_launch astra.launch

2. Start rviz:
rosrun rviz rviz

3. In rviz, load the configuration file:
gpd/tutorials/openni2.rviz

4. Start the ROS node that detects grasps:
roslaunch gpd tutorial1.launch

5. Start the custom node:
rosrun gpd get_grasps

6. Output

tutorial1 output:

==== Selected grasps ====
Grasp 0: -5757.96
Grasp 1: -5781.67
Grasp 2: -6231.01
Grasp 3: -6646.38
Grasp 4: -6646.38
[ INFO] [1545795935.616926176]: Selected the 5 highest scoring grasps.
[ INFO] [1545795935.616977212]: Published 5 highest-scoring grasps.
[ INFO] [1545795935.617042676]: Waiting for point cloud to arrive ...
[ INFO] [1545795935.635950098]: Received cloud with 307200 points.
Processing cloud with: 307200 points.
Calculating surface normals ...
Using integral images for surface normals estimation ...
 runtime (normals): 0.113144
Reversing direction of normals that do not point to at least one camera ...
 reversed 206577 normals
 runtime (reverse normals): 0.00747134
After workspace filtering: 102965 points left.
After voxelization: 38100 points left.
Subsampled 100 at random uniformly.
Estimating local reference frames ...
Estimated 100 local reference frames in 0.0043785 sec.
Finding hand poses ...
 Found 12 grasp candidate sets in 0.527498 sec.
====> HAND SEARCH TIME: 0.541366
[ INFO] [1545795936.337648733]: Generated 12 grasp candidate sets.
Creating grasp images for classifier input ...
time for computing 12 point neighborhoods with 4 threads: 0.0397289s
 Image creation time: 0.625206
 image_list: 96, valid_images: 28, valid_grasps: 28
 Prediction time: 1.04166
Selecting the 5 highest scoring grasps ...
 grasp #0, score: -2354.93, -2354.93
 grasp #1, score: -4028.55, -4028.55
 grasp #2, score: -4255.32, -4255.32
 grasp #3, score: -4424.9, -4424.9
 grasp #4, score: -5083.04, -5083.04
Total classification time: 1.04174
[ INFO] [1545795937.379457995]: Selected 5 valid grasps after predicting their scores.
[ INFO] [1545795937.379486942]: Found 0 clusters.
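
The log above shows GPD's preprocessing chain: surface-normal estimation, workspace filtering, voxelization, and uniform random subsampling of 100 candidate points. For illustration only, the rough PCL sketch below mirrors the three filtering steps; it is not GPD's internal code, and the workspace bounds and voxel leaf size are assumed values.

// Rough sketch of the preprocessing reported in the log:
// workspace filtering -> voxelization -> random subsampling.
// Not GPD's implementation; bounds and leaf size are assumptions.
#include <pcl/point_types.h>
#include <pcl/filters/crop_box.h>
#include <pcl/filters/voxel_grid.h>
#include <pcl/filters/random_sample.h>

pcl::PointCloud<pcl::PointXYZ>::Ptr preprocess(
    const pcl::PointCloud<pcl::PointXYZ>::Ptr& cloud)
{
  pcl::PointCloud<pcl::PointXYZ>::Ptr out(new pcl::PointCloud<pcl::PointXYZ>);

  // Workspace filtering: keep only points inside a box around the scene.
  pcl::CropBox<pcl::PointXYZ> crop;
  crop.setInputCloud(cloud);
  crop.setMin(Eigen::Vector4f(-1.0f, -1.0f, -1.0f, 1.0f));  // assumed bounds
  crop.setMax(Eigen::Vector4f(1.0f, 1.0f, 1.0f, 1.0f));
  crop.filter(*out);

  // Voxelization: downsample onto a regular grid.
  pcl::VoxelGrid<pcl::PointXYZ> voxel;
  voxel.setInputCloud(out);
  voxel.setLeafSize(0.003f, 0.003f, 0.003f);  // assumed leaf size
  voxel.filter(*out);

  // Uniformly subsample the candidate sample points.
  pcl::RandomSample<pcl::PointXYZ> sample;
  sample.setInputCloud(out);
  sample.setSample(100);  // matches num_samples: 100 in the parameters below
  sample.filter(*out);

  return out;
}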

get_grasps output:

0.802892 -0.452206 0.388426
-0.257031 0.325299 0.910008
-0.537866 -0.830476 0.144949
[ INFO] [1545792533.066958774]: output xyzrpy: 0.266088  0.157333  0.677000  -1.398000  0.567903  -0.309823
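
The three lines of numbers are the rows of the rotation matrix R = [approach binormal axis], and the last line is the sample position followed by roll, pitch, yaw. As a side note, the matrix-to-RPY-to-quaternion round trip in the callback can be avoided: tf can convert the rotation matrix to a quaternion directly. A small sketch of that alternative (a drop-in for the callback above):

// Sketch: convert the grasp orientation matrix straight to a quaternion,
// instead of going matrix -> RPY -> quaternion as in the callback.
#include <tf/transform_datatypes.h>
#include <geometry_msgs/Quaternion.h>
#include <gpd/GraspConfig.h>

geometry_msgs::Quaternion grasp_orientation(const gpd::GraspConfig& g)
{
  // R = [approach binormal axis] (as columns), as in the callback.
  tf::Matrix3x3 m(
      g.approach.x, g.binormal.x, g.axis.x,
      g.approach.y, g.binormal.y, g.axis.y,
      g.approach.z, g.binormal.z, g.axis.z);

  tf::Quaternion q;
  m.getRotation(q);

  geometry_msgs::Quaternion q_msg;
  tf::quaternionTFToMsg(q, q_msg);
  return q_msg;
}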

A better way to inspect the types is to print the full service definition:

rossrv show detect_grasps
[gpd/detect_grasps]:
gpd/CloudIndexed cloud_indexed
  gpd/CloudSources cloud_sources
    sensor_msgs/PointCloud2 cloud
      std_msgs/Header header
        uint32 seq
        time stamp
        string frame_id
      uint32 height
      uint32 width
      sensor_msgs/PointField[] fields
        uint8 INT8=1
        uint8 UINT8=2
        uint8 INT16=3
        uint8 UINT16=4
        uint8 INT32=5
        uint8 UINT32=6
        uint8 FLOAT32=7
        uint8 FLOAT64=8
        string name
        uint32 offset
        uint8 datatype
        uint32 count
      bool is_bigendian
      uint32 point_step
      uint32 row_step
      uint8[] data
      bool is_dense
    std_msgs/Int64[] camera_source
      int64 data
    geometry_msgs/Point[] view_points
      float64 x
      float64 y
      float64 z
  std_msgs/Int64[] indices
    int64 data
---
gpd/GraspConfigList grasp_configs
  std_msgs/Header header
    uint32 seq
    time stamp
    string frame_id
  gpd/GraspConfig[] grasps
    geometry_msgs/Point bottom
      float64 x
      float64 y
      float64 z
    geometry_msgs/Point top
      float64 x
      float64 y
      float64 z
    geometry_msgs/Point surface
      float64 x
      float64 y
      float64 z
    geometry_msgs/Vector3 approach
      float64 x
      float64 y
      float64 z
    geometry_msgs/Vector3 binormal
      float64 x
      float64 y
      float64 z
    geometry_msgs/Vector3 axis
      float64 x
      float64 y
      float64 z
    geometry_msgs/Point sample
      float64 x
      float64 y
      float64 z
    std_msgs/Float32 width
      float32 data
    std_msgs/Float32 score
      float32 data
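
Given the service definition above, grasps can also be requested on demand instead of subscribing to /detect_grasps/clustered_grasps. Below is a minimal client sketch; the service name and cloud topic are assumptions that depend on how the GPD server node is launched, and the indices field is left empty here.

// Sketch: request grasps through the gpd/detect_grasps service.
// The service name "/detect_grasps_server/detect_grasps" and the topic
// "/camera/depth_registered/points" are assumptions; adjust them to your setup.
#include <ros/ros.h>
#include <sensor_msgs/PointCloud2.h>
#include <geometry_msgs/Point.h>
#include <gpd/detect_grasps.h>

int main(int argc, char* argv[])
{
  ros::init(argc, argv, "call_detect_grasps");
  ros::NodeHandle n;

  // Grab one point cloud from the camera (blocks until a message arrives).
  sensor_msgs::PointCloud2::ConstPtr cloud =
      ros::topic::waitForMessage<sensor_msgs::PointCloud2>(
          "/camera/depth_registered/points", n);
  if (!cloud)
  {
    return 1;
  }

  gpd::detect_grasps srv;
  srv.request.cloud_indexed.cloud_sources.cloud = *cloud;

  // One view point (camera at the origin of the cloud frame); every point is
  // assigned to camera 0 (std_msgs/Int64 default-constructs its data to 0).
  srv.request.cloud_indexed.cloud_sources.view_points.resize(1);
  srv.request.cloud_indexed.cloud_sources.camera_source.resize(cloud->width * cloud->height);
  // cloud_indexed.indices could restrict where grasps are sampled; left empty here.

  ros::ServiceClient client =
      n.serviceClient<gpd::detect_grasps>("/detect_grasps_server/detect_grasps");
  if (client.call(srv) && !srv.response.grasp_configs.grasps.empty())
  {
    ROS_INFO("Received %zu grasps, best score %f",
             srv.response.grasp_configs.grasps.size(),
             static_cast<double>(srv.response.grasp_configs.grasps[0].score.data));
  }
  return 0;
}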

Parameters loaded for the detect_grasps node (from the roslaunch parameter summary):

PARAMETERS

 * /detect_grasps/angle_thresh: 0.1
 * /detect_grasps/camera_position: [0, 0, 0]
 * /detect_grasps/cloud_topic: /camera/depth_reg...
 * /detect_grasps/cloud_type: 0
 * /detect_grasps/create_image_batches: False
 * /detect_grasps/filter_grasps: False
 * /detect_grasps/filter_half_antipodal: False
 * /detect_grasps/filter_table_side_grasps: False
 * /detect_grasps/finger_width: 0.01
 * /detect_grasps/hand_depth: 0.06
 * /detect_grasps/hand_height: 0.02
 * /detect_grasps/hand_outer_diameter: 0.12
 * /detect_grasps/image_depth: 0.06
 * /detect_grasps/image_height: 0.02
 * /detect_grasps/image_num_channels: 15
 * /detect_grasps/image_outer_diameter: 0.1
 * /detect_grasps/image_size: 60
 * /detect_grasps/init_bite: 0.01
 * /detect_grasps/lenet_params_dir: /home/pumpkinking...
 * /detect_grasps/max_aperture: 0.072
 * /detect_grasps/min_aperture: 0.029
 * /detect_grasps/min_inliers: 1
 * /detect_grasps/min_score_diff: 0
 * /detect_grasps/nn_radius: 0.01
 * /detect_grasps/num_orientations: 8
 * /detect_grasps/num_samples: 100
 * /detect_grasps/num_selected: 5
 * /detect_grasps/num_threads: 4
 * /detect_grasps/plot_candidates: False
 * /detect_grasps/plot_clusters: False
 * /detect_grasps/plot_filtered_grasps: False
 * /detect_grasps/plot_normals: False
 * /detect_grasps/plot_samples: False
 * /detect_grasps/plot_selected_grasps: False
 * /detect_grasps/plot_valid_grasps: False
 * /detect_grasps/remove_outliers: False
 * /detect_grasps/remove_plane_before_image_calculation: False
 * /detect_grasps/rviz_topic: grasps_rviz
 * /detect_grasps/samples_topic: 
 * /detect_grasps/table_height: 0.0
 * /detect_grasps/table_thresh: 0.05
 * /detect_grasps/use_importance_sampling: False
 * /detect_grasps/vertical_axis: [0, 0, 1]
 * /detect_grasps/voxelize: True
 * /detect_grasps/workspace: [-1, 1, -1, 1, -1...
 * /detect_grasps/workspace_grasps: [0.55, 1.0, -0.41...
 * /rosdistro: kinetic
 * /rosversion: 1.12.14
