
MIT researchers are fine-tuning algorithms that consider arm positions and hand positions separately, which drastically cuts the computational complexity of their task.
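
To see why factoring the pose this way helps, consider a toy search over candidate poses. This is a minimal sketch, not the MIT group's code: the pose labels, scores, and function names below are invented for illustration. When the match score decomposes into an arm term plus a hand term, the factored search needs only |ARMS| + |HANDS| evaluations instead of |ARMS| × |HANDS|.

```python
from itertools import product

# Illustrative candidate sets with made-up per-candidate match scores;
# a real system would score each candidate against the image.
ARM_SCORES = {"both_up": 0.9, "both_out": 0.2, "left_up": 0.4, "right_up": 0.1}
HAND_SCORES = {"open": 0.7, "closed": 0.1, "thumb_up": 0.8, "thumb_down": 0.3}

def best_pose_joint():
    """Naive search over every (arm, hand) combination."""
    return max(product(ARM_SCORES, HAND_SCORES),
               key=lambda ah: ARM_SCORES[ah[0]] + HAND_SCORES[ah[1]])

def best_pose_factored():
    """Because the score is additive, each factor is maximized alone."""
    arm = max(ARM_SCORES, key=ARM_SCORES.get)
    hand = max(HAND_SCORES, key=HAND_SCORES.get)
    return arm, hand

if __name__ == "__main__":
    # The two searches agree, but the factored one does far less work
    # as the candidate sets grow.
    assert best_pose_joint() == best_pose_factored()
    print(best_pose_factored())  # ('both_up', 'thumb_up')
```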

The problem of interpreting hand signals has two distinct parts. The first is simply inferring the body pose of the signaler from a digital image: Are the hands up or down, the elbows in or out? The second is determining which specific gesture is depicted in a series of images. The MIT researchers are chiefly concerned with the second problem; they present their solution in the March issue of the journal ACM Transactions on Interactive Intelligent Systems. But to test their approach, they also had to address the first problem, which they did in work presented at last year’s IEEE International Conference on Automatic Face and Gesture Recognition.
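
Composed, the two stages form a simple pipeline: a per-frame pose estimator feeds a sequence-level gesture classifier. The sketch below is illustrative only, not the paper's method: the pose estimator is a stand-in, and the sequence step uses dynamic time warping (DTW) against labeled gesture templates, one common way to compare trajectories that unfold at different speeds.

```python
from math import dist  # Euclidean distance, Python 3.8+

def dtw_cost(seq_a, seq_b):
    """Classic O(len(a) * len(b)) dynamic time warping between two
    sequences of equal-length feature vectors."""
    n, m = len(seq_a), len(seq_b)
    INF = float("inf")
    cost = [[INF] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = dist(seq_a[i - 1], seq_b[j - 1])
            cost[i][j] = d + min(cost[i - 1][j],      # skip a frame in a
                                 cost[i][j - 1],      # skip a frame in b
                                 cost[i - 1][j - 1])  # match frames
    return cost[n][m]

def estimate_pose(frame):
    """Stage 1 stand-in: a real system would infer arm and hand
    configuration from an image; here a frame already is a feature vector."""
    return frame

def classify_gesture(frames, templates):
    """Stage 2: label the pose trajectory with its nearest template."""
    poses = [estimate_pose(f) for f in frames]
    return min(templates, key=lambda name: dtw_cost(poses, templates[name]))

if __name__ == "__main__":
    # Toy 1-D "arm height" trajectories for two made-up gestures.
    templates = {
        "wave_up":   [(0.0,), (0.5,), (1.0,), (1.0,)],
        "wave_down": [(1.0,), (0.5,), (0.0,), (0.0,)],
    }
    observed = [(0.1,), (0.4,), (0.9,), (1.0,), (1.0,)]
    print(classify_gesture(observed, templates))  # -> wave_up
```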
