CVPR 2013: Symmetry Detection from Real World Images -- A Competition
Symmetry is a pervasive phenomenon present in all forms and scales in natural and man-made environments. It is not surprising, therefore, that humans, animals and insects have evolved an innate ability to perceive and take advantage of symmetry. What IS surprising is that perception and recognition of symmetry have yet to be fully explored in machine intelligence, in particular computer vision. Despite an understanding of how the concept of repeated patterns is generalized by the mathematics of group theory, and despite attempts over four decades to design algorithms that seek symmetry from digital data, there are very few effective computational tools for automated symmetry analysis available today. The goal of this competition is to benchmark state-of-the-art symmetry detection algorithms (previously published and new algorithms) on real images.
1. Keynote Speaker:
2. Organizing Committee:
3. Advisory Committee:
4. Results and Paper Submissions:
We are releasing the training datasets containing images and hand-labeled ground truth, representative of the types of symmetry that will be found in the test set. After the release of the testing dataset (mid April), competitors are required to submit results (in the same format as our ground truth) for evaluation, along with an associated paper describing the algorithm(s) used. We encourage both new symmetry detection algorithms and evaluations of previously published algorithms on our benchmark dataset. A special issue on the competition results in the journal Machine Vision and Applications is planned.
5. Important Dates:
6. Important Links:
-- Yanxi Liu, Hagit Hel-Or, Craig S. Kaplan, and Luc Van Gool, Computational Symmetry in Computer Vision and Computer Graphics, Foundations and Trends in Computer Graphics and Vision, Vol. 5, No. 1-2, 199 pages, 2010. [pdf]
7. Data Download:
We invite all participants to submit a 4-page workshop paper using the CVPR12 template, describing their algorithms as well as their detection results on the testing dataset, in the same format as the ground-truth labels we release for the training data.
We also need a normalized confidence score (between 0 and 1) associated with each detected symmetry pattern. We recommend that participants include detection results (>= 3 per image) even with low confidence scores, so that we can obtain a more complete precision/recall curve for evaluation.
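To illustrate why confidence scores matter for evaluation, the sketch below shows how scored detections yield a precision/recall curve by sweeping a threshold over the confidence values. This is only an illustrative sketch: the function name, the `(confidence, is_correct)` input layout, and the assumption that each detection has already been matched against ground truth are ours, not the competition's official evaluation protocol.

```python
# Illustrative sketch (not the official evaluation code): a confidence
# score per detection lets the evaluator trace a precision/recall curve
# instead of a single operating point.

def precision_recall_curve(detections, num_ground_truth):
    """detections: list of (confidence, is_correct) pairs, one per detected
    symmetry pattern; is_correct marks whether that detection matched a
    hand-labeled ground-truth symmetry (matching criterion assumed given).
    Returns (precision, recall) points, one per detection, obtained by
    lowering the confidence threshold one detection at a time."""
    # Rank detections by confidence, highest first.
    ranked = sorted(detections, key=lambda d: d[0], reverse=True)
    points = []
    true_pos = 0
    for k, (conf, is_correct) in enumerate(ranked, start=1):
        if is_correct:
            true_pos += 1
        precision = true_pos / k                 # correct among top-k
        recall = true_pos / num_ground_truth     # correct among all labels
        points.append((precision, recall))
    return points

# Example: four detections on an image with two labeled symmetries.
# Including the low-confidence detections extends the curve toward
# higher recall, which is why >= 3 detections per image are requested.
pts = precision_recall_curve(
    [(0.9, True), (0.7, False), (0.5, True), (0.2, False)],
    num_ground_truth=2)
```

Note that without the two low-confidence detections, the curve would stop at recall 0.5; reporting them (even if wrong) is what makes the full curve recoverable.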