Session

Hand Gesture-Based Language: A New Era of Computer Interaction Has Arrived

Computers are constantly changing the way we work, live, and interact with technology. One of the most innovative approaches to computer interaction arrived with the introduction of the Kinect in 2010, which opened the door to more interactive applications built on body- and hand-tracking algorithms in areas like gaming and healthcare. Although the sensor's development and manufacturing have been discontinued, its technology lives on in products like HoloLens and in a recently released new version of the sensor designed to be used primarily with Azure (Kinect isn't dead!). With that in mind, and with the disruption AI has brought in recent years, developers today have a responsibility to improve the quality, speed, and functionality of their applications while also being more inclusive, giving users more ways to interact with them. The mouse and keyboard are not the only means of interacting with computers and applications; gestures work, too!

In this session, we'll dive into the Project Gesture SDK and its hand gesture-based language, cover its core features, and learn how to develop and integrate custom hand gestures into apps using Visual Studio, XAML, and C# with a Kinect v2 sensor, along the lines of the sketch below.
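To give a taste of what the session covers, here is a minimal sketch of defining and registering a custom gesture in C#. It follows the pattern from the publicly documented Microsoft.Gestures samples: a gesture is declared as a sequence of hand poses, and the detection service raises an event when the user performs that sequence. The type and method names (HandPose, FingerPose, GesturesServiceEndpointFactory, and so on) are taken from the SDK's published samples, but treat the exact signatures as assumptions to verify against the SDK version you install.

```csharp
using System;
using System.Threading.Tasks;
using Microsoft.Gestures;
using Microsoft.Gestures.Endpoint;

public static class RotateGestureDemo
{
    public static async Task Main()
    {
        // Pose 1: thumb and index finger open and pointing forward, not touching.
        var rotateSet = new HandPose("RotateSet",
            new FingerPose(new[] { Finger.Thumb, Finger.Index },
                           FingerFlexion.Open, PoseDirection.Forward),
            new FingertipDistanceRelation(Finger.Index,
                           RelativeDistance.NotTouching, Finger.Thumb));

        // Pose 2: same fingers, but the index fingertip has rotated to the
        // right of the thumb.
        var rotateGo = new HandPose("RotateGo",
            new FingerPose(new[] { Finger.Thumb, Finger.Index },
                           FingerFlexion.Open, PoseDirection.Forward),
            new FingertipPlacementRelation(Finger.Index,
                           RelativePlacement.Right, Finger.Thumb));

        // A gesture is a small state machine over poses: moving from the
        // "set" pose to the "go" pose triggers it.
        var rotateGesture = new Gesture("RotateRight", rotateSet, rotateGo);
        rotateGesture.Triggered += (sender, args) =>
            Console.WriteLine("Rotate gesture detected!");

        // Connect to the local gesture-detection service and register
        // the gesture so the runtime starts looking for it.
        var gesturesService = GesturesServiceEndpointFactory.Create();
        await gesturesService.ConnectAsync();
        await gesturesService.RegisterGesture(rotateGesture);

        Console.ReadLine(); // keep the app alive while detections stream in
    }
}
```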

Roberto González

"Never argue with the data." - Sheen, Jimmy Neutron

Managua, Nicaragua
