NtKinect: Kinect V2 C++ Programming with OpenCV on Windows10

How to recognize joint positions and palm state with Kinect V2

2016.07.16: created by

Prerequisite knowledge

Recognize the State of the Palm

The state of the palm is defined as follows in Kinect for Windows SDK 2.0.

Quoted from Kinect.h of Kinect for Windows SDK 2.0
enum _HandState {
    HandState_Unknown = 0,
    HandState_NotTracked = 1,
    HandState_Open = 2,
    HandState_Closed = 3,
    HandState_Lasso = 4
};

enum _TrackingConfidence {
    TrackingConfidence_Low = 0,
    TrackingConfidence_High = 1
};
In NtKinect, the palm state (Open, Closed, or Lasso) becomes available once the setSkeleton() function has been called.

NtKinect methods related to the hand state

type of return value  method name  description
pair<int,int>  handState(int id = 0, bool isLeft = true)

Gets the hand state of skeleton[id].
Specify true for the left hand, false for the right hand.
The returned pair holds:
hand state: HandState enum
confidence: TrackingConfidence enum

How to write program

  1. Start with the Visual Studio project KinectV2_skeleton.zip of "NtKinect: How to recognize human skeleton with Kinect V2".
  2. Change the contents of main.cpp.
  3. Call the kinect.setSkeleton() function to set skeleton information in kinect.skeleton. The first argument of the handState() function is the index into the kinect.skeleton vector, so the program loops with the control variable i and accesses both the skeleton information and the palm state through i.

    The position information of the left and right palms is held in kinect.skeleton[i][JointType_HandLeft] and kinect.skeleton[i][JointType_HandRight], respectively. The program loops over each joint with the variable j and displays its position as a small red rectangle. When j is JointType_HandLeft or JointType_HandRight, a larger rectangle is also drawn in the color representing the palm state.

    #include <iostream>
    #include <sstream>
    #include "NtKinect.h"
    using namespace std;
    void doJob() {
      NtKinect kinect;
      cv::Scalar colors[] = {
        cv::Scalar(255,0,0),    // HandState_Unknown (blue)
        cv::Scalar(0,255,0),    // HandState_NotTracked (green)
        cv::Scalar(255,255,0),  // HandState_Open (cyan)
        cv::Scalar(255,0,255),  // HandState_Closed (magenta)
        cv::Scalar(0,255,255),  // HandState_Lasso (yellow)
      };
      while (1) {
        kinect.setRGB();
        kinect.setSkeleton();
        for (int i = 0; i < kinect.skeleton.size(); i++) {
          auto person = kinect.skeleton[i];
          for (int j = 0; j < person.size(); j++) {
            Joint joint = person[j];
            if (joint.TrackingState == TrackingState_NotTracked) continue;
            // Map the camera-space joint position to color-space coordinates.
            ColorSpacePoint cp;
            kinect.coordinateMapper->MapCameraPointToColorSpace(joint.Position, &cp);
            cv::rectangle(kinect.rgbImage, cv::Rect((int)cp.X-5, (int)cp.Y-5, 10, 10), cv::Scalar(0,0,255), 2);
            if (j == JointType_HandLeft || j == JointType_HandRight) {
              pair<int, int> handState = kinect.handState(i, j == JointType_HandLeft);
              cv::rectangle(kinect.rgbImage, cv::Rect((int)cp.X-8, (int)cp.Y-8, 16, 16), colors[handState.first], 4);
            }
          }
        }
        cv::imshow("rgb", kinect.rgbImage);
        auto key = cv::waitKey(1);
        if (key == 'q') break;
      }
      cv::destroyAllWindows();
    }
    int main(int argc, char** argv) {
      try {
        doJob();
      } catch (exception &ex) {
        cout << ex.what() << endl;
        string s;
        cin >> s;
      }
      return 0;
    }
  4. When you run the program, the RGB image is displayed. Press the 'q' key to exit.
  5. Recognized joints are indicated by small red squares on the RGB image. For the hands, a larger square is drawn in the following color according to the palm state.

    palm state     color     cv::Scalar value
    Unknown        Blue      255,0,0
    Not tracked    Green     0,255,0
    Open           Cyan      255,255,0
    Closed         Magenta   255,0,255
    Lasso          Yellow    0,255,255

    (Note that cv::Scalar specifies colors in BGR order.)

  6. Please click here for this sample project: KinectV2_skeletonPalm.zip
  7. Since the above zip file may not include the latest "NtKinect.h", download the latest version from here and replace the old one with it.