Roberto González

Accedo Services, Web Developer

Microsoft Student Partner, B.S. in Computer Engineering, Web Developer, and AI enthusiast

Current sessions

Hand Gesture-Based Language: a new era of computer interaction has arrived.

Computers are constantly changing the way we work, live, and interact with technology. One of the most innovative forms of computer interaction arrived with the introduction of the Kinect in 2010, whose body- and hand-tracking algorithms opened up new kinds of interactive applications in areas like games and healthcare. Although its development and manufacturing were discontinued, its technology lives on in products like HoloLens, and a new version of the Kinect sensor, designed primarily for use with Azure, was recently announced (Kinect isn't dead!). With that in mind, and given the AI disruption of recent years, developers today have a responsibility to improve the quality, speed, and functionality of their applications and to be more inclusive, giving users more ways to interact with them. The mouse and keyboard are not the only input devices we can use to interact with computers and applications; gestures work too! In this session, we'll dive into the Project Gesture SDK and its hand gesture-based language, explore its core features, and learn how to develop and integrate custom hand gestures into apps using Visual Studio, XAML, and C# with a Kinect v2 sensor.
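
As a taste of what the session covers, here is a minimal C# sketch of defining and registering a custom gesture with the Project Gesture SDK (also known as Project Prague). It assumes the Microsoft.Gestures packages, the gesture detection runtime, and a Kinect v2 sensor are installed; the type and method names follow Microsoft's published samples, but treat the exact signatures as illustrative rather than authoritative.

using System;
using System.Threading.Tasks;
using Microsoft.Gestures;
using Microsoft.Gestures.Endpoint;

public static class GestureDemo
{
    public static async Task Main()
    {
        // Connect to the local gestures detection service
        // (assumes the Project Gesture runtime is running on this machine).
        var gesturesService = GesturesServiceEndpointFactory.Create();
        await gesturesService.ConnectAsync();

        // A "release" gesture: the hand moves from a closed fist to an open palm.
        // Each HandPose declaratively describes a hand state; the gesture is the
        // transition between them.
        var fist = new HandPose("Fist",
            new FingerPose(new AllFingersContext(), FingerFlexion.Folded));
        var open = new HandPose("Open",
            new FingerPose(new AllFingersContext(), FingerFlexion.Open, PoseDirection.Forward));

        var releaseGesture = new Gesture("Release", fist, open);
        releaseGesture.Triggered += (s, e) => Console.WriteLine("Release gesture detected!");

        // Register the gesture; the service now raises Triggered
        // whenever the user performs it.
        await gesturesService.RegisterGesture(releaseGesture);

        Console.ReadLine(); // keep listening until the user presses Enter
    }
}

The key idea the session builds on: poses describe hand states declaratively, and a gesture is a sequence of poses, so the SDK, not your application, does the heavy lifting of skeletal tracking and pose matching. Your app only reacts to the Triggered event.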