MIT researchers are fine-tuning algorithms to consider arm and hand positions separately, which drastically reduces the computational complexity of the task.
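
The payoff from treating the two separately is combinatorial. A minimal sketch of the idea, with hypothetical pose labels (the article does not give the actual label sets):

# Why factoring a signal into arm and hand components cuts the work:
# scoring every (arm, hand) combination jointly grows multiplicatively,
# while scoring each part on its own grows additively. The label sets
# below are illustrative, not the researchers' actual categories.

ARM_POSES = ["up", "down", "out", "crossed"]   # hypothetical arm labels
HAND_POSES = ["open", "closed", "thumb_up"]    # hypothetical hand labels

# Joint classification: one class per (arm, hand) pair.
joint = [(a, h) for a in ARM_POSES for h in HAND_POSES]
print(len(joint))                              # 4 * 3 = 12 classes

# Factored classification: arm and hand scored independently.
print(len(ARM_POSES) + len(HAND_POSES))        # 4 + 3 = 7 classes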

The problem of interpreting hand signals has two distinct parts. The first is simply inferring the body pose of the signaler from a digital image: Are the hands up or down, the elbows in or out? The second is determining which specific gesture is depicted in a series of images. The MIT researchers are chiefly concerned with the second problem; they present their solution in the March issue of the journal ACM Transactions on Interactive Intelligent Systems. But to test their approach, they also had to address the first problem, which they did in work presented at last year’s IEEE International Conference on Automatic Face and Gesture Recognition.
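
As a sketch of how those two parts compose, assuming nothing about the researchers' actual models: a first stage maps each image to a pose estimate, and a second stage matches the resulting pose sequence against gesture templates. Everything here (the Pose fields, the gesture names, the matching rule) is illustrative.

from dataclasses import dataclass
from typing import Sequence

@dataclass(frozen=True)
class Pose:
    arm: str    # e.g. "up", "out"; stage 1 would infer this from an image
    hand: str   # e.g. "open", "closed"

# Hypothetical gesture templates: each gesture is a short sequence of poses.
GESTURES = {
    "wave_off":   [Pose("up", "open"), Pose("out", "open"), Pose("up", "open")],
    "come_ahead": [Pose("out", "open"), Pose("up", "open"), Pose("out", "open")],
}

def classify_gesture(poses: Sequence[Pose]) -> str:
    """Stage 2 (toy rule): return the template agreeing with the most frames."""
    def score(template):
        return sum(p == t for p, t in zip(poses, template))
    return max(GESTURES, key=lambda name: score(GESTURES[name]))

# Stage 1 is stubbed out here: a real system would estimate these poses
# from digital images rather than constructing them by hand.
observed = [Pose("up", "open"), Pose("out", "open"), Pose("up", "open")]
print(classify_gesture(observed))   # -> "wave_off"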
