Gesture Based Data Interaction Using Kinect

Processing, Kinect

An extension of Chasing Programming Languages || Extruding Circos, an Interactive 3D Data Visualization

Exhibited at MAT's End of the Year Show



Computer science is a dynamic field with rapidly changing standards and trending programming languages. To measure the evolution and popularity of these languages, I worked with George Legrady’s Making Visible the Invisible Seattle Public Library dataset. Seattle is an excellent site for this, as it is home to heavyweight companies like Microsoft and Amazon, amongst others. This visualization allows individuals who are not familiar with the Seattle Public Library dataset to interact with the data and explore the trends in programming languages.

Description of Representation: 

Check-in and check-out events for popular programming languages were represented as arcs through space. These arcs climb a staircase-like structure that represents time, with each step of the staircase being a month. The height and density of the arcs, together with the color coding that identifies each language, allow a viewer to intuitively see trends in programming languages. Audio tones provide feedback for certain commands.
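The staircase layout can be sketched as a pair of simple mapping functions, shown here in a Java-flavored form close to Processing. The step dimensions are illustrative assumptions, not the project's actual tuned values:

```java
// Sketch of the staircase layout: each month is one step, and an arc for a
// library event climbs from one step to another. The constants below are
// illustrative assumptions.
public class Staircase {
    static final float STEP_HEIGHT = 10.0f; // vertical rise per month (assumed)
    static final float STEP_DEPTH  = 25.0f; // horizontal run per month (assumed)

    // Vertical position of a step, counting months from the dataset's start.
    static float stepY(int monthIndex) {
        return monthIndex * STEP_HEIGHT;
    }

    // Position of a step along the staircase's depth axis.
    static float stepZ(int monthIndex) {
        return monthIndex * STEP_DEPTH;
    }
}
```

An arc for a check-out/check-in pair would then span from `stepY(checkoutMonth)` to `stepY(checkinMonth)`, colored by language.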

Gesture Interaction: 

A continuing area of research in this project has been developing a gesture-based interaction system built on the Microsoft Kinect.

The software currently supports the following gestures:

Two-hand mode: activated when both hands are above the neck.

Span to zoom:

To zoom into the structure, the user spreads their hands apart. Walking toward the data in this position zooms in further; conversely, moving away from the Kinect zooms out.
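The span-to-zoom mapping could be sketched as below; the scale factors and the minimum-depth clamp are assumptions for illustration, not the project's tuned values:

```java
// Span-to-zoom sketch: hand spread sets the base zoom level, and walking
// toward the Kinect (smaller torso depth) zooms in further.
public class SpanZoom {
    // handSpread: distance between the hands, in meters.
    // torsoDepth: distance from the Kinect to the torso, in meters.
    static float zoom(float handSpread, float torsoDepth) {
        float spreadZoom = 1.0f + handSpread;                  // wider spread -> closer view
        float walkZoom   = 2.0f / Math.max(torsoDepth, 0.5f);  // walking in -> zoom in (clamped)
        return spreadZoom * walkZoom;
    }
}
```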

Twist along the Z-axis to revolve:

To revolve the structure and access the other side of the data, the user increases the z distance between their hands: in other words, moving one hand forward toward the data while pulling the other back. This increase in distance revolves the structure.
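The twist gesture reduces to mapping the z offset between the hands to a revolve angle. A minimal sketch, where the gain and the jitter dead zone are assumed values:

```java
// Maps the z-axis offset between the hands to a revolve angle around the
// structure's vertical axis.
public class TwistRevolve {
    static final float GAIN = 2.0f;       // radians per meter of z offset (assumed)
    static final float DEAD_ZONE = 0.05f; // ignore jitter below 5 cm (assumed)

    static float revolveAngle(float leftHandZ, float rightHandZ) {
        float dz = rightHandZ - leftHandZ; // one hand forward, the other pulled back
        if (Math.abs(dz) < DEAD_ZONE) return 0.0f;
        return dz * GAIN;
    }
}
```

The dead zone keeps small tracking noise from revolving the structure when the hands are roughly level.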

Tilt along the Y-axis:

To tilt the structure to its side, the user increases the y distance between their hands: in other words, one hand goes higher while the other goes lower. The resulting increase in y distance rotates the structure in the direction of the lower hand.
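The tilt is the same idea along the y axis: the vertical offset between the hands drives a rotation toward the lower hand. A sketch with an assumed gain:

```java
// Maps the vertical offset between the hands to a tilt angle; the structure
// rotates toward whichever hand is lower.
public class TiltGesture {
    static final float GAIN = 1.5f; // radians per meter of y offset (assumed)

    // Sign convention (assumed): positive tilts toward the right hand when
    // the right hand is lower, negative toward the left hand.
    static float tiltAngle(float leftHandY, float rightHandY) {
        return (leftHandY - rightHandY) * GAIN;
    }
}
```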

One hand mode:

One-hand mode is detected when one hand is above the neck while the other is below; either hand can trigger it. It is primarily used to interact with the graphical user interface. In this mode the user can choose which languages are displayed, enabling comparisons. In addition, the user can modify the structure or turn it on and off, as well as take snapshots of points of interest.

Technical details: 

My design was inspired by Circos, a data visualization tool used to visualize relationships, and I focused on the languages Java, JavaScript, Objective-C, Python, and Ruby. ControlP5 provided the graphical user interface for interaction, and PeasyCam enabled exploration in 3D space. A stand-alone application, KinectA, was used to process the depth data and perform skeleton tracking. This data was converted into OSC messages and sent to Processing for further handling.

Relevant Links: