Description and recognition of complex spatial configurations of object pairs with Force Banner 2D features

A major challenge in scene understanding is the handling of spatial relations between objects or object parts. Several descriptors dedicated to this task already exist, such as the force histogram, which is a typical example of a relative position descriptor. By computing the interaction between two objects for a given force in all directions, it gives a good overview of the configuration, and it has useful properties that can make it invariant to the 2D viewpoint. Considering that using complementary forces (negative for repulsion, positive for attraction) should improve the description of complex spatial configurations, we propose to extend the force histogram to a panel of forces so as to make it a more complete descriptor. This yields a 2D descriptor that we call the "(discrete) Force Banner", which can be used as input to a classical Convolutional Neural Network (CNN), benefiting from its powerful performance, or reduced to more compact spatial features for use in another system. As an illustration of its ability to describe spatial configurations, we used it to solve a classification problem aiming to discriminate simple spatial relations with variable configuration complexities. Experimental results obtained on datasets of synthetic and natural images with various shapes highlight the interest of this approach, in particular for complex spatial configurations.

© 2021 Elsevier Ltd. All rights reserved.
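
The idea summarized in the abstract can be illustrated with a minimal sketch: every pair of points taken from the two objects contributes a force of magnitude 1/d^r along the direction joining them, these magnitudes are accumulated into directional bins (a force histogram), and stacking the histograms obtained for a panel of exponents r (negative and positive) yields a 2D map. This is only an assumption-laden illustration, not the authors' implementation: the paper's descriptor builds on the established segment-based force-histogram algorithms, whereas the brute-force point-pairwise approximation, the function names `force_histogram` / `force_banner`, and the chosen panel of exponents below are hypothetical choices made for the example.

```python
import numpy as np

def force_histogram(mask_a, mask_b, r, n_dirs=180):
    """Naive point-pairwise approximation of a force histogram F_r.

    Every pair (a in A, b in B) exerts a force of magnitude 1/d^r along
    the direction from a to b; magnitudes are accumulated into n_dirs
    directional bins. Brute force, for illustration only.
    """
    pts_a = np.argwhere(mask_a)            # (row, col) pixels of object A
    pts_b = np.argwhere(mask_b)            # (row, col) pixels of object B
    hist = np.zeros(n_dirs)
    for a in pts_a:
        diff = pts_b - a                   # vectors from a to every point of B
        d = np.hypot(diff[:, 0], diff[:, 1])
        theta = np.arctan2(-diff[:, 0], diff[:, 1])   # angle, with y pointing up
        bins = ((theta + np.pi) / (2 * np.pi) * n_dirs).astype(int) % n_dirs
        valid = d > 0                      # skip coincident points
        np.add.at(hist, bins[valid], d[valid] ** (-r))
    return hist

def force_banner(mask_a, mask_b, r_values, n_dirs=180):
    """Stack the histograms of a panel of force exponents r into a 2D map
    (one row per exponent), normalised row-wise."""
    rows = []
    for r in r_values:
        h = force_histogram(mask_a, mask_b, r, n_dirs)
        rows.append(h / h.max() if h.max() > 0 else h)
    return np.stack(rows)

# Toy usage (hypothetical data): two small squares, B to the right of A.
A = np.zeros((32, 32), dtype=bool); A[10:16, 4:10] = True
B = np.zeros((32, 32), dtype=bool); B[10:16, 20:26] = True
banner = force_banner(A, B, r_values=np.linspace(-1.0, 2.0, 16))
print(banner.shape)   # (16, 180): a 2D feature map that could feed a small CNN
```

In this sketch, rows with negative exponents weight distant point pairs more heavily (repulsion-like behaviour) while positive exponents emphasise close interactions, which reflects the intuition behind the panel of complementary forces described in the abstract.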

Keywords: Image analysis; Spatial relations; Relative position; Features and descriptors; Force histogram; Scene understanding

Delearde, Robin; Kurtz, Camille; Wendling, Laurent

Univ Paris

2022

Pattern Recognition

EI, SCI
ISSN: 0031-3203
Year, Volume (Issue): 2022, 123