With Microsoft's Visual Gesture Builder, you can generate a gesture classifier via machine learning. The generated file has the extension ".gbd" or ".gba".
In Visual Gesture Builder, one solution may contain more than one project. Building an individual project produces a ".gba" file, while building the whole solution produces a ".gbd" file. That is, ".gba" is the extension of a single gesture classifier file, and ".gbd" is the extension of a file that bundles multiple classifiers.
Gesture recognition for Visual Gesture Builder is defined in the Kinect for Windows SDK 2.0 as follows.
Quoted from Kinect.VisualGestureBuilder.h in Kinect for Windows SDK 2.0 |
---|
typedef enum _GestureType GestureType;
enum _GestureType {
    GestureType_None = 0,
    GestureType_Discrete = 1,
    GestureType_Continuous = 2
}; |
There are two kinds of gestures, "Discrete" and "Continuous": the former comes with a "confidence" value and the latter with a "progress" value. Each value ranges from 0.0 to 1.0.
A discrete gesture is determined by the AdaBoost algorithm, and the result is returned as a boolean value. You can adjust the decision criterion by referring to the "confidence" value.
A continuous gesture is determined by the Random Forest algorithm, and the result is a real number indicating how far the gesture has progressed. This value is "progress".
If you define the USE_GESTURE constant before including NtKinect.h, the NtKinect functions and variables for gesture recognition become effective.
You can use the classifier files (".gbd" or ".gba") generated by Visual Gesture Builder to recognize gestures with Kinect V2.
type of return value | function name | descriptions
---|---|---
void | setGestureFile(wstring& path) | version1.5 or later. Sets the path path to the file (*.gbd or *.gba) that defines the gestures. By default, it is set to L"SampleDatabase.gbd".
void | setGesture() | version1.5 or later. Call this function after calling setSkeleton() to recognize gestures. The results are set in the member variables listed below.
string | gesture2string(const CComPtr<IGesture>& gesture) | version1.5 or later. Returns the name of the gesture gesture.
type | variable name | descriptions |
---|---|---|
vector<pair<CComPtr<IGesture>,float>> | discreteGesture | version1.5 or later. A recognized discrete gesture is represented as a pair of the gesture itself and its confidence value, pair<CComPtr<IGesture>,float>. To handle multiple people, the type of this variable is vector<pair<CComPtr<IGesture>,float>>. |
vector<UINT64> | discreteGestureTrackingId | version1.5 or later. Vector of the skeleton trackingIds corresponding to the discrete gestures. |
vector<pair<CComPtr<IGesture>,float>> | continuousGesture | version1.5 or later. A recognized continuous gesture is represented as a pair of the gesture itself and its progress value, pair<CComPtr<IGesture>,float>. To handle multiple people, the type of this variable is vector<pair<CComPtr<IGesture>,float>>. |
vector<UINT64> | continuousGestureTrackingId | version1.5 or later. Vector of the skeleton trackingIds corresponding to the continuous gestures. |
If you want to know which skeleton performed a gesture, you can determine it with the following expressions.
In case of a discrete gesture:
    kinect.discreteGestureTrackingId[i] == kinect.skeletonTrackingId[j]
In case of a continuous gesture:
    kinect.continuousGestureTrackingId[i] == kinect.skeletonTrackingId[j]
Click the project name (KinectV2 in this example) in Solution Explorer to select it, then right-click and choose Properties from the menu.
Add the following library to the linker's "Additional Dependencies".
Kinect20.VisualGestureBuilder.lib |
Go to "Configuration Properties" -> "Build Events" -> "Post-Build Event" -> "Command Line" and add the following two lines.
xcopy "$(KINECTSDK20_DIR)Redist\VGB\x64" "$(OutDir)" /e /y /i /r
if exist "$(ProjectDir)\*.gbd" ( copy "$(ProjectDir)\*.gbd" "$(OutDir)" /y ) |
In this example, I use the following ".gbd" file distributed with the Kinect for Windows SDK.
$(KINECTSDK20_DIR)Tools\KinectStudio\databases\SampleDatabase.gbd
First, we define the USE_GESTURE constant before including "NtKinect.h".
We declare a variable "kinect" of type "NtKinect".
Call kinect.setGestureFile(wstring) to set the gesture classifier file.
After calling the kinect.setSkeleton() function, call the kinect.setGesture() function. When one or more gestures are recognized, they are set in kinect.discreteGesture or kinect.continuousGesture.
main.cpp |
Recognized gestures are displayed above the RGB image. Discrete gestures are displayed with their confidence at the upper left, and continuous gestures are displayed with their progress to the right.
Since the above zip file may not include the latest "NtKinect.h", download the latest version from here and replace the old one with it.
$(KINECTSDK20_DIR)bin\Database\Seated.gbd
First of all, using Kinect Studio, we record a video that includes both frames where the target gesture is being performed and frames where it is not.
Data types to record in the "Record" panel:
o Nui Body Frame
x Nui Body Index <-- not needed, remove check
o Nui Depth
o Nui IR
o Nui Sensor Telemetry
x Nui Title Audio <-- not needed, remove check
x Nui Uncompressed Color <-- not needed, remove check
Next, we generate a gesture classifier file from the video using Visual Gesture Builder. Manually, you mark the parts of the video where the left hand moves above the shoulder as positive, and the other parts as negative. When you let Visual Gesture Builder perform machine learning, the gesture classifier file LeftHandUp.gba is generated as a result.
Note that the name of this file is "LeftHandUp.gba", not "LeftHandUp.gbd".
Let's replace the gesture classifier file of the project in this article with LeftHandUp.gba generated this time.
kinect.setGestureFile(L"LeftHandUp.gba");
It should recognize the "raise the left hand above the shoulder" motion as a discrete gesture, and it should not recognize the "raise only the right hand above the shoulder" motion or the "raise both hands above the shoulder" motion.