Investigating the Leap Motion Hand Model

I wanted to get a proper understanding of the gestures and inputs available to the browser from the Leap Motion Controller for an upcoming project, so I sat down for an afternoon and played with the huge array of data available.

In a nutshell, the hand model provides (among other things) the hand's current position, orientation and shape. Position and orientation are pretty self-explanatory, but the hand's shape is cleverly described using a virtual sphere that is placed and sized to fit the curvature of the hand: when the hand is closed the sphere is at its smallest, and when open it is at its largest.

This sphere is what I am going to use to represent the hand model in the demo. To do this I will be experimenting with a few key parameters:

  • sphereRadius (for size data)
  • palmPosition (for positional data)
  • rotation (for orientation data)
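The three parameters can be read straight off the hand object that LeapJS hands to the frame loop. Here is a minimal sketch; the field names (`sphereRadius` in millimetres, `palmPosition` as an `[x, y, z]` array) follow the LeapJS hand model, but treat the exact shape as an assumption to check against the LeapJS docs for your SDK version.

```javascript
// Pull the size and position parameters out of a LeapJS hand object.
function readHand(hand) {
  return {
    radius: hand.sphereRadius,           // size of the virtual sphere (mm)
    position: hand.palmPosition.slice()  // [x, y, z] palm centre (mm)
  };
}

// In the real demo this runs inside the frame loop, e.g.:
// Leap.loop(function (frame) {
//   if (frame.hands.length > 0) {
//     var data = readHand(frame.hands[0]);
//     // ...update the three.js sphere from data...
//   }
// });
```

The orientation data is covered separately below, as it is relative to a previous frame rather than a plain per-frame value.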

Mainly raw data values are being used with very little post-processing. This should allow for a more direct link to the controller's output, so any future data issues can be spotted and overcome if necessary.

I have no doubt that post-processing will eventually be needed to give the best user experience; hopefully this example will help demonstrate that fact and pave the way to a more fluid overall experience.

The following demos show a virtual sphere created using three.js that reacts to the values from the given parameters. The values coming directly from the controller do not have a huge range, so they have been scaled up to work more effectively on screen.
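The scaling itself is just a linear re-map from the controller's range into scene units. The input ranges here (e.g. palm height of roughly 100–500 mm above the device) are illustrative assumptions for this demo, not values from the SDK:

```javascript
// Linearly map a value from an input range to an output range,
// clamping so a glitchy reading cannot fling the sphere off screen.
function mapRange(value, inMin, inMax, outMin, outMax) {
  var t = (value - inMin) / (inMax - inMin);
  t = Math.min(1, Math.max(0, t));
  return outMin + t * (outMax - outMin);
}

// e.g. map an assumed 100-500 mm palm height into a -200..200 scene range:
// mapRange(300, 100, 500, -200, 200) === 0
```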

You can play with the full final demo here or keep reading for a more detailed breakdown.


The ‘sphereRadius’ parameter is pretty self-explanatory and gives the current size of the sphere in the hand. It is a very useful value to know and, for the most part, is very stable.

The values do sometimes jump around a bit; for example, if you try to make the sphere as large as possible in the demo by opening your hand out, you will notice that some glitches start to creep in.

This is not such an issue at the smaller sizes and might still be improved in the core LeapJS library over time. Either way, it will need to be smoothed out.
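One simple smoothing option is an exponential moving average over the incoming radius values. An alpha close to 1 tracks the raw data tightly; closer to 0 smooths harder but adds lag. The 0.2 below is just a starting guess to tune by eye, not a recommended value:

```javascript
// Returns a function that exponentially smooths successive readings.
function makeSmoother(alpha) {
  var last = null;
  return function (raw) {
    last = (last === null) ? raw : last + alpha * (raw - last);
    return last;
  };
}

// One smoother per parameter, fed once per frame:
var smoothRadius = makeSmoother(0.2);
```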


Initially I used the ‘sphereCenter’ values for positional data but, due to the glitches in the ‘sphereRadius’ data, the center of the sphere would jump around a lot, so I opted to use the ‘palmPosition’ data instead.

It is a lot more stable, especially when the hand is out flat (i.e. when the ‘sphereRadius’ data experiences most of its glitches).

The positional data can still be a little glitchy at the extremes of the controller’s view, but it is not too bad, and since the final production model will have a greater field of view this should not pose any major issues.
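Feeding the palm position into the scene is then just a coordinate conversion. LeapJS reports millimetres with y measured up from the device, so the vertical axis needs re-centring; the 250 mm offset and the 1:1 scale below are assumptions chosen for this sketch, not SDK values:

```javascript
// Convert a palmPosition reading into scene coordinates.
function palmToScene(palmPosition) {
  return {
    x: palmPosition[0],
    y: palmPosition[1] - 250, // assume ~250 mm above the device maps to the scene origin
    z: palmPosition[2]
  };
}

// Then, each frame, the three.js mesh is moved with:
// var p = palmToScene(hand.palmPosition);
// sphere.position.set(p.x, p.y, p.z);
```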


The ‘rotation’ data allows the sphere to move on all three axes, so any orientation is possible.

The data being fed through from the controller is based on the current position of the sphere, which is great, but when there has been a previous glitch the orientation can get a little lost and the sphere can appear to rotate on the wrong axis.

This can be seen in the demos a few times, where the Y axis gets about 90 degrees out, so a ‘yaw’ rotation of the hand results in a ‘roll’ of the sphere. This is not a huge issue and can be solved with some checks on the front end – but it is worth noting.
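To see why a 90-degree axis error turns a yaw into a roll, here is a plain axis-angle rotation (Rodrigues' formula) applied to a vector – a generic illustration, not the controller's own maths. Rotating the x unit vector about y (yaw) versus about z (roll) gives completely different results, so a frame whose axis estimate has drifted by 90 degrees visibly rotates the wrong way:

```javascript
// Rotate vector v about unit axis k by the given angle (radians).
function rotate(v, k, angle) {
  var c = Math.cos(angle), s = Math.sin(angle);
  var dot = k[0] * v[0] + k[1] * v[1] + k[2] * v[2];
  var cross = [
    k[1] * v[2] - k[2] * v[1],
    k[2] * v[0] - k[0] * v[2],
    k[0] * v[1] - k[1] * v[0]
  ];
  return [
    v[0] * c + cross[0] * s + k[0] * dot * (1 - c),
    v[1] * c + cross[1] * s + k[1] * dot * (1 - c),
    v[2] * c + cross[2] * s + k[2] * dot * (1 - c)
  ];
}

// Yaw (about y): rotate([1,0,0], [0,1,0], Math.PI/2) -> ~[0,0,-1]
// Roll (about z): rotate([1,0,0], [0,0,1], Math.PI/2) -> ~[0,1,0]
```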

Bringing it all together

After bringing all the different parameters together, the whole scene displays well in 3D space.

The data will indeed need to be weighted and filtered due to the high speed at which it comes through. Without additional processing the results can contain anomalous frames or positional errors that translate into quite noticeable glitches on the front end, especially if you have a large visual element that needs fine control.

You tend to notice when the sphere you are “holding” in the demo suddenly shoots over to the other side of the screen and back again!
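A first line of defence against those shoot-across-the-screen frames is simple jump rejection: if a new sample is implausibly far from the last accepted one, hold the old value for that frame. The 100 mm-per-frame threshold below is an assumed tuning value, and a real filter would need to eventually accept a persistent jump (otherwise a genuinely fast move would be rejected forever):

```javascript
// Returns a filter that rejects single-frame positional spikes.
function makeJumpFilter(maxStep) {
  var last = null;
  return function (p) {
    if (last !== null) {
      var dx = p[0] - last[0], dy = p[1] - last[1], dz = p[2] - last[2];
      if (Math.sqrt(dx * dx + dy * dy + dz * dz) > maxStep) {
        return last; // anomalous frame: keep the previous position
      }
    }
    last = p;
    return p;
  };
}

var filterPalm = makeJumpFilter(100); // assumed max plausible movement per frame (mm)
```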

But overall I am happy with the current demo as a basis to build on; with some post-processing and adaptive data filtering it should be possible to produce a very accurate virtual representation of the hand model. There is certainly enough data available!

The next step

The minor glitches from each parameter do seem to compound each other somewhat, but all in all it works really well, especially considering that only raw data values are being used.

So next up has to be investigating some post-processing options to filter out any potential errors in the data. Once those are minimized, proper fine control can be achieved – I can’t wait!

Sneaky Peek…

Finally, before I finish up, I just wanted to give you a heads-up that I am working on an extension for the Chrome browser that will enable Leap Motion control.

Scrolling, back/forward pages, navigating links and zooming/scaling are all on the cards at the moment, so hopefully it will produce a pretty comprehensive way of navigating the web via the Leap Motion controller.

Here is a quick teaser of the scrolling in action – it was early days at this point, but you get the idea: