MIT researchers are fine-tuning algorithms to consider arm positions and hand positions separately, which drastically reduces the computational complexity of the task.
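
To see why that decoupling helps, here is a rough Python sketch (an illustration only, not the MIT team's code): if every arm pose must be scored against every hand pose, the work grows with the product of the two label sets, whereas scoring them separately grows only with their sum. The pose labels below are hypothetical.

from itertools import product

ARM_POSES = ["up", "down", "out", "crossed"]        # hypothetical arm-pose labels
HAND_POSES = ["open", "fist", "thumb_up", "point"]  # hypothetical hand-pose labels

def joint_search(score):
    # Score every (arm, hand) combination: len(ARM_POSES) * len(HAND_POSES) evaluations.
    return max(product(ARM_POSES, HAND_POSES), key=lambda pair: score(*pair))

def factored_search(arm_score, hand_score):
    # Score arms and hands independently: len(ARM_POSES) + len(HAND_POSES) evaluations.
    return max(ARM_POSES, key=arm_score), max(HAND_POSES, key=hand_score)

With four arm labels and four hand labels, the joint search makes 16 evaluations versus 8 for the factored one, and the gap widens quickly as the label sets grow.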

The problem of interpreting hand signals has two distinct parts. The first is simply inferring the body pose of the signaler from a digital image: Are the hands up or down, the elbows in or out? The second is determining which specific gesture is depicted in a series of images. The MIT researchers are chiefly concerned with the second problem; they present their solution in the March issue of the journal ACM Transactions on Interactive Intelligent Systems. But to test their approach, they also had to address the first problem, which they did in work presented at last year’s IEEE International Conference on Automatic Face and Gesture Recognition.
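
The two-stage split can be sketched as follows; this is only an illustration under assumed inputs, not the published system. Stage one turns each video frame into a body-pose estimate, and stage two labels the gesture shown by the resulting sequence of poses. The Pose features, the estimate_pose stub, and the gesture templates are all hypothetical.

from dataclasses import dataclass
from typing import List, Sequence

@dataclass
class Pose:
    elbow_angle: float   # degrees; stand-in features for whatever a real estimator returns
    hand_height: float   # normalized 0 (waist) to 1 (overhead)

def estimate_pose(frame) -> Pose:
    # Stage 1: infer body pose from a single image.
    # Placeholder only; a real system would run a pose-estimation model here.
    raise NotImplementedError

# Hypothetical gesture templates: each gesture is a short sequence of key poses.
GESTURE_TEMPLATES = {
    "wave": [Pose(90.0, 0.9), Pose(130.0, 0.9), Pose(90.0, 0.9)],
    "stop": [Pose(180.0, 0.8), Pose(180.0, 0.8), Pose(180.0, 0.8)],
}

def pose_distance(a: Pose, b: Pose) -> float:
    return abs(a.elbow_angle - b.elbow_angle) / 180.0 + abs(a.hand_height - b.hand_height)

def classify_gesture(poses: Sequence[Pose]) -> str:
    # Stage 2: label the observed pose sequence by its nearest gesture template.
    def cost(template: List[Pose]) -> float:
        n = min(len(template), len(poses))
        return sum(pose_distance(poses[i], template[i]) for i in range(n)) / n
    return min(GESTURE_TEMPLATES, key=lambda name: cost(GESTURE_TEMPLATES[name]))

A nearest-template matcher is used here only to keep the example short; sequence models such as hidden Markov models or conditional random fields are more typical choices for the second stage.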

