Kinect gestures for computers take a step forward every day.
We have already seen complete integration with Windows 7 and Minority Report-esque 12-point motion tracking. Now that the SDK is out, open source developers like Kevin Connolly are opening new possibilities for multi-screen Kinect setups.
The KinectNUI uses a Kinect peripheral hooked up to a Windows 7 PC displaying across six monitors. Connolly shows off what his two-point multi-touch system can do, including scrolling, dragging and dropping, and zooming windows across all the screens. The extra touches we spot are windows moving in tandem with the user and application selection through a pie, or radial, menu.
The KinectNUI build is obviously still a work in progress, with the system hiccuping and occasionally misreading gestures. But it should be interesting to see how these Kinect-PC hybrids develop, because they offer a preview of how we might interact with computers once the mouse and keyboard are finally retired.