MIT researchers are fine-tuning algorithms to consider arm positions and hand positions separately, which drastically cuts down on the computational complexity of their task.
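
The benefit of that separation can be shown with a toy calculation: with |A| candidate arm configurations and |H| candidate hand configurations, searching the joint space costs |A| × |H| evaluations, while searching each factor independently costs only |A| + |H|. The Python sketch below is a minimal illustration of that saving; the label sets and the scoring function are stand-in assumptions, not the researchers' implementation.

```python
# Toy illustration of factored pose search (hypothetical; not the MIT code).
# Scoring every (arm, hand) combination jointly costs |A| * |H| evaluations,
# while scoring the two factors independently costs only |A| + |H|.
from itertools import product

# Hypothetical label sets; the actual pose vocabulary is not given in the article.
ARM_POSES = ["raised", "lowered", "extended", "bent"]
HAND_POSES = ["open", "closed", "thumb_up", "pointing"]

def score(observation: str, label: str) -> int:
    """Stand-in for a real pose-likelihood model; purely illustrative."""
    return sum(ord(c) for c in observation + label) % 97

def classify_joint(observation: str) -> tuple:
    # Exhaustive search over the joint space: |A| * |H| score calls.
    return max(product(ARM_POSES, HAND_POSES),
               key=lambda pair: score(observation, pair[0] + pair[1]))

def classify_factored(observation: str) -> tuple:
    # Two independent searches: only |A| + |H| score calls in total.
    arm = max(ARM_POSES, key=lambda a: score(observation, a))
    hand = max(HAND_POSES, key=lambda h: score(observation, h))
    return arm, hand

print(classify_factored("frame_001"))
```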

The problem of interpreting hand signals has two distinct parts. The first is simply inferring the body pose of the signaler from a digital image: Are the hands up or down, the elbows in or out? The second is determining which specific gesture is depicted in a series of images. The MIT researchers are chiefly concerned with the second problem; they present their solution in the March issue of the journal ACM Transactions on Interactive Intelligent Systems. But to test their approach, they also had to address the first problem, which they did in work presented at last year’s IEEE International Conference on Automatic Face and Gesture Recognition.
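
Concretely, that two-part decomposition suggests a pipeline: a per-frame pose estimator feeding a sequence-level gesture classifier. The sketch below assumes a hypothetical pose vector per frame and naive template matching over pose sequences; the names (recognize_gesture, estimate_pose) and the distance measure are illustrative assumptions, not the method from the paper.

```python
from typing import Callable, Dict, List, Sequence

Pose = List[float]   # hypothetical: a vector of joint angles per frame
Frame = bytes        # placeholder for raw image data

def recognize_gesture(frames: Sequence[Frame],
                      estimate_pose: Callable[[Frame], Pose],
                      templates: Dict[str, List[Pose]]) -> str:
    """Two-stage pipeline: (1) infer a body pose from each image,
    (2) classify the resulting pose sequence as a gesture."""
    # Stage 1: body-pose inference, one pose vector per frame.
    poses = [estimate_pose(f) for f in frames]

    # Stage 2: match the pose sequence against labeled gesture templates.
    def distance(seq_a: List[Pose], seq_b: List[Pose]) -> float:
        # Naive frame-wise squared L2 distance over the overlapping prefix;
        # a real system would use something alignment-aware (e.g. dynamic
        # time warping or an HMM), since signalers gesture at varying speeds.
        n = min(len(seq_a), len(seq_b))
        return sum(sum((x - y) ** 2 for x, y in zip(a, b))
                   for a, b in zip(seq_a[:n], seq_b[:n]))

    return min(templates, key=lambda label: distance(poses, templates[label]))

# Tiny synthetic demo with 2-D "poses" (entirely made up).
templates = {
    "brake":   [[0.0, 1.0], [0.0, 1.0], [0.0, 1.0]],
    "proceed": [[1.0, 0.0], [0.5, 0.5], [0.0, 1.0]],
}
frames = [b"f0", b"f1", b"f2"]
estimator = lambda f: [0.0, 1.0]   # pretend every frame shows the "brake" pose
print(recognize_gesture(frames, estimator, templates))   # -> brake
```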

