Overview

The main objective of the 2016 IEEE Scene Background Modeling Contest (SBMC 2016) is to challenge research teams from around the world to test their scene background modeling algorithms on our dataset of 8 video categories, namely:

  • Basic (16 basic videos),
  • Intermittent Motion (16 videos subject to "ghosting" artifacts),
  • Clutter (11 videos containing foreground moving objects occluding a large portion of the background),
  • Jitter (9 videos shot by a moving camera),
  • Illumination Changes (6 videos with strong illumination variations),
  • Background Motion (6 videos containing background motion),
  • Very Long (5 videos containing thousands of frames),
  • Very Short (10 videos of only a few frames).

The videos have been selected to cover a wide range of scene background modeling challenges and are representative of typical visual data captured today in surveillance, smart environment, and video database scenarios. This dataset aims to provide a rigorous academic benchmarking facility for testing and validating existing and new algorithms for scene background modeling.

The best-performing algorithms submitted to the contest will be invited for oral presentation. Papers will be published in the 2016 ICPR Contest Proceedings. All submissions that meet minimum standards will be reported on the dataset website and in an overview paper associated with the contest. The contest will also include an invited talk.

Key Dates (NEW DATES)

  • 16 May (Mon): Registration for the competition opens.
  • 16 May (Mon): Publication of the dataset.
  • 5 August (Fri): Deadline to submit contest results (postponed from 18 July).
  • 5 August (Fri): Deadline to submit the contest paper (postponed from 19 July).
  • 12 August (Fri): Notification of acceptance (postponed from 5 August).
  • 12 Sep (Mon): Deadline to submit the camera-ready contest paper.

Rules for participation

  • Researchers from both academia and industry are welcome to submit results.
  • Results must be reported for each video of each category.
  • Only one set of tuning parameters should be used for all videos.
  • Numerical scores can be computed using the MATLAB or Python programs available under UTILITIES. Both programs take the output produced by an algorithm and the available ground-truth color image, and compute the performance metrics described on the RESULTS page (an illustrative sketch is given after this list).
  • To have a method ranked on this website, upload your results via the UPLOAD page.
  • Previously published methods can be submitted as long as an extensive evaluation over all 8 video categories is performed.
  • Paper length: max 6 pages.
  • LaTeX/Word templates: 3 files available HERE.
  • Please submit your paper HERE.
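
For reference, the following is a minimal sketch of how several of these metrics (AGE, pEPs, pCEPs and PSNR) could be computed in Python. It is not the official UTILITIES script: the error-pixel threshold of 20 and the 4-connected definition of clustered error pixels are assumptions, and MS-SSIM and CQM are omitted because they require more elaborate implementations.

    # Minimal sketch of SBMnet-style metrics between an estimated background image
    # and the ground-truth color image. Not the official UTILITIES code; the
    # threshold (20) and the "clustered error pixel" definition are assumptions.
    import numpy as np
    from PIL import Image

    def load_gray(path):
        """Load an image and convert it to a float grayscale array."""
        return np.asarray(Image.open(path).convert("L"), dtype=np.float64)

    def background_metrics(estimated_path, groundtruth_path, threshold=20):
        est = load_gray(estimated_path)
        gt = load_gray(groundtruth_path)
        diff = np.abs(est - gt)

        age = diff.mean()                 # Average Gray-level Error (AGE)
        error_mask = diff > threshold     # error pixels
        peps = error_mask.mean()          # percentage of Error Pixels (pEPs)

        # Clustered error pixels: error pixels whose 4-connected neighbours
        # are also error pixels (assumed definition).
        padded = np.pad(error_mask, 1, constant_values=False)
        clustered = (error_mask
                     & padded[:-2, 1:-1] & padded[2:, 1:-1]
                     & padded[1:-1, :-2] & padded[1:-1, 2:])
        pceps = clustered.mean()          # percentage of Clustered Error Pixels (pCEPs)

        mse = np.mean(diff ** 2)
        psnr = 10 * np.log10(255.0 ** 2 / mse) if mse > 0 else float("inf")
        return {"AGE": age, "pEPs": peps, "pCEPs": pceps, "PSNR": psnr}

Called as background_metrics("result.png", "groundtruth.png"), the function returns a dictionary with the four scores; the file names here are placeholders for an algorithm's output and the corresponding ground-truth image.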

Program

14:00 - 14:20 Opening remarks and description of the challenge
14:20 - 14:40 "Evaluation of the Background Modeling Method Auto-Adaptive Parallel Neural Network Architecture in the SBMnet Dataset", M. Chacon-Murguia, G. Ramirez-Alonso and J. Ramirez-Quintana
14:40 - 15:00 "Rejection based Multipath Reconstruction for Background Estimation in SBMnet 2016 dataset", D. Ortego, J. C. Sanmiguel and J. M. Martínez
15:00 - 15:30 Coffee Break
15:30 - 15:50 "CNN-based Initial Background Estimation", I. Halfaoui, F. Bouzaraa and O. Urfalioglu
15:50 - 16:10 "Background Initialization based on Bidirectional Analysis and Consensus Voting", T. Minematsu, A. Shimada and R.-I. Taniguchi
16:10 - 16:30 "Scene Background Estimation Based on Temporal Median Filter with Gaussian Filtering", W. Liu, Y. Cai, M. Zhang, H. Li and H. Gu
16:30 - 16:40 Break
16:40 - 17:00 "Motion-Aware Graph Regularized RPCA for Background Modeling of Complex Scene", S. Javed, A. Mahmood, T. Bouwmans and S.-K. Jung
17:00 - 17:20 "Extracting a Background Image by a Multi-modal Scene Background Model", L. Maddalena and A. Petrosino
17:20 - 17:40 "LaBGen-P: A Pixel-Level Stationary Background Generation Method Based on LaBGen", B. Laugraud, S. Piérard and M. Van Droogenbroeck
17:40 - 18:00 Conclusion and future work

Contest Organizers

Dataset contributors

SBMC 2016 Program Committee

  • Thierry Bouwmans, Université de La Rochelle (France)

  • Maurizio Giordano, National Research Council (Italy)

  • Pierre-Marc Jodoin, University of Sherbrooke (Canada)

  • Zhiming Luo, University of Sherbrooke (Canada)

  • Lucia Maddalena, National Research Council (Italy)

  • Alfredo Petrosino, University of Naples "Parthenope" (Italy)

  • Sébastien Piérard, University of Liège (Belgium)

  • Yi Wang, University of Sherbrooke (Canada)

Acknowledgment

  • Yi Wang, Ph.D. student, Université de Sherbrooke, Canada
    Webmaster, software developer

  • Martin Cousineau, Université de Sherbrooke, Canada
    Webmaster, software developer

Results (September 12, 2016)



Picture of the winner! From left to right: Lucia Maddalena, Benjamin Laugraud (winner!), Pierre-Marc Jodoin


Results, all categories combined.

Click on method name for more details.

Method  Average ranking  Average ranking across categories  Average AGE  Average pEPs  Average pCEPs  Average MS-SSIM  Average PSNR  Average CQM
LaBGen [6] 2.00 4.75 6.7090 0.0631 0.0265 0.9266 28.6396 29.4668
LaBGen-P [7] 2.83 5.38 7.0738 0.0706 0.0319 0.9278 28.4660 29.3196
Photomontage [3] 3.33 6.50 7.1950 0.0686 0.0257 0.9189 28.0113 28.8719
SC-SOBS-C4 [9] 4.33 6.00 7.5183 0.0711 0.0242 0.9160 27.6533 28.5601
MAGRPCA [10] 5.83 6.88 8.3132 0.0994 0.0567 0.9401 28.4556 29.3152
Temporal median filter [2] 7.17 5.50 8.2761 0.0984 0.0546 0.9130 27.5364 28.4434
BE-AAPSA [14] 7.17 7.88 7.9086 0.0873 0.0447 0.9127 27.0714 27.9811
Bidirectional Analysis [13] 7.67 6.63 8.3449 0.0756 0.0181 0.9085 26.1722 27.1637
Bidirectional Analysis and Consensus Voting [12] 8.67 7.75 8.5816 0.0724 0.0257 0.9078 26.1018 27.1000
TMFG [11] 10.00 6.25 7.4020 0.1051 0.0566 0.9043 27.1347 28.0530
FC-FlowNet [5] 10.17 9.00 9.1131 0.1128 0.0599 0.9162 26.9559 27.8767
RSL2011 [4] 11.17 10.25 9.0443 0.1008 0.0497 0.8891 25.8051 26.7986
AAPSA [1] 12.17 10.88 9.2044 0.1057 0.0523 0.9000 25.3947 26.3021
RMR [8] 12.50 10.00 9.5363 0.1176 0.0582 0.8790 26.5217 27.4549
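
The two ranking columns are aggregate statistics rather than raw metric values. The sketch below illustrates one plausible way such an average ranking could be derived: rank every method on each metric (lower is better for AGE, pEPs and pCEPs; higher is better for MS-SSIM, PSNR and CQM), then average the per-metric ranks. This aggregation is an assumption rather than the official contest procedure, and with only three methods included the printed numbers will not reproduce the table above, which ranks all fourteen submissions.

    # Assumed rank-aggregation sketch (not the official SBMC ranking code):
    # rank each method on every metric, then average the per-metric ranks.
    import numpy as np

    methods = ["LaBGen", "LaBGen-P", "Photomontage"]  # subset of the table above
    scores = {
        "AGE":     [6.7090, 7.0738, 7.1950],     # lower is better
        "pEPs":    [0.0631, 0.0706, 0.0686],     # lower is better
        "pCEPs":   [0.0265, 0.0319, 0.0257],     # lower is better
        "MS-SSIM": [0.9266, 0.9278, 0.9189],     # higher is better
        "PSNR":    [28.6396, 28.4660, 28.0113],  # higher is better
        "CQM":     [29.4668, 29.3196, 28.8719],  # higher is better
    }
    higher_is_better = {"MS-SSIM", "PSNR", "CQM"}

    def ranks(values, descending):
        """Return 1-based ranks, rank 1 going to the best value."""
        order = np.argsort([-v if descending else v for v in values])
        r = np.empty(len(values))
        r[order] = np.arange(1, len(values) + 1)
        return r

    per_metric_ranks = np.array([
        ranks(vals, metric in higher_is_better) for metric, vals in scores.items()
    ])
    average_ranking = per_metric_ranks.mean(axis=0)  # one aggregate rank per method
    for name, avg in zip(methods, average_ranking):
        print(f"{name}: average ranking {avg:.2f}")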

Results for the basic category.

Click on method name for more details.

Method  Average ranking  Average AGE  Average pEPs  Average pCEPs  Average MS-SSIM  Average PSNR  Average CQM
LaBGen [6] 4.33 3.9012 0.0154 0.0045 0.9742 32.9009 33.5733
LaBGen-P [7] 4.00 3.9712 0.0156 0.0041 0.9749 33.2445 33.8797
Photomontage [3] 6.83 4.4856 0.0226 0.0039 0.9719 32.3208 32.9621
SC-SOBS-C4 [9] 6.33 4.3598 0.0200 0.0033 0.9728 32.1766 32.8665
MAGRPCA [10] 12.50 8.9576 0.1124 0.0882 0.9669 29.6430 30.4461
Temporal median filter [2] 2.33 3.8269 0.0139 0.0034 0.9804 33.7085 34.3342
BE-AAPSA [14] 11.83 5.6842 0.0472 0.0273 0.9626 30.1101 30.9698
Bidirectional Analysis [13] 4.67 4.1075 0.0161 0.0017 0.9736 32.3092 33.0339
Bidirectional Analysis and Consensus Voting [12] 6.00 4.1421 0.0159 0.0019 0.9723 31.9277 32.6950
TMFG [11] 1.50 3.8063 0.0131 0.0031 0.9803 33.7483 34.3541
FC-FlowNet [5] 10.17 5.5856 0.0327 0.0142 0.9636 30.4121 31.2285
RSL2011 [4] 9.17 4.5546 0.0269 0.0081 0.9660 31.6767 32.4654
AAPSA [1] 12.50 5.6500 0.0449 0.0222 0.9575 29.4528 30.1252
RMR [8] 12.83 5.8867 0.0480 0.0167 0.9400 29.6069 30.4044

Results for the intermittent motion category.

Click on method name for more details.

Method  Average ranking  Average AGE  Average pEPs  Average pCEPs  Average MS-SSIM  Average PSNR  Average CQM
LaBGen [6] 4.67 4.4333 0.0293 0.0164 0.9597 29.4757 30.3373
LaBGen-P [7] 2.00 4.1278 0.0225 0.0115 0.9712 31.9974 32.7260
Photomontage [3] 11.83 7.1460 0.0639 0.0427 0.9138 24.8941 25.8682
SC-SOBS-C4 [9] 8.33 6.2583 0.0487 0.0238 0.9255 25.9249 26.9569
MAGRPCA [10] 8.83 8.3106 0.1391 0.1018 0.9710 29.7923 30.6718
Temporal median filter [2] 11.83 6.8003 0.0615 0.0421 0.9150 24.7016 25.7573
BE-AAPSA [14] 8.50 6.6997 0.0547 0.0369 0.9345 26.5072 27.5251
Bidirectional Analysis [13] 4.83 5.0569 0.0245 0.0082 0.9533 28.6439 29.5925
Bidirectional Analysis and Consensus Voting [12] 2.67 4.5966 0.0198 0.0079 0.9657 30.2170 31.1074
TMFG [11] 10.67 6.7602 0.0609 0.0409 0.9167 24.8249 25.8626
FC-FlowNet [5] 8.67 6.7811 0.0599 0.0347 0.9312 27.0272 27.9086
RSL2011 [4] 6.67 5.1116 0.0399 0.0244 0.9532 28.0795 29.0599
AAPSA [1] 13.50 8.2573 0.0768 0.0522 0.9009 24.2814 25.2087
RMR [8] 2.00 4.3606 0.0213 0.0091 0.9730 31.1372 31.9285

Results for the clutter category.

Click on method name for more details.

Method  Average ranking  Average AGE  Average pEPs  Average pCEPs  Average MS-SSIM  Average PSNR  Average CQM
LaBGen [6] 6.83 8.0579 0.1035 0.0740 0.8834 26.7690 27.7986
LaBGen-P [7] 4.83 7.8947 0.0986 0.0678 0.8967 28.1140 29.1305
Photomontage [3] 2.83 6.8195 0.0543 0.0294 0.8892 28.5554 29.4882
SC-SOBS-C4 [9] 4.00 7.0590 0.0644 0.0304 0.8939 28.0077 29.0737
MAGRPCA [10] 4.83 8.1589 0.0647 0.0294 0.9446 26.6872 27.5988
Temporal median filter [2] 9.83 12.5316 0.1590 0.1108 0.8185 26.1441 27.1507
BE-AAPSA [14] 11.50 12.3049 0.1775 0.1205 0.8526 23.7151 24.8666
Bidirectional Analysis [13] 3.17 6.6565 0.0497 0.0177 0.9243 26.4376 27.5267
Bidirectional Analysis and Consensus Voting [12] 5.00 7.2284 0.0546 0.0243 0.9206 25.9467 27.0419
TMFG [11] 10.00 11.7469 0.1397 0.0952 0.8144 25.9395 27.0378
FC-FlowNet [5] 10.83 12.5556 0.1719 0.1185 0.8762 25.3201 26.4707
RSL2011 [4] 4.83 7.3013 0.0701 0.0375 0.9087 27.9304 28.9763
AAPSA [1] 13.50 15.7186 0.2577 0.1863 0.8240 22.8853 23.9875
RMR [8] 13.00 15.3119 0.1819 0.1202 0.7306 23.3608 24.5341

Results for the jitter category.

Click on method name for more details.

Method  Average ranking  Average AGE  Average pEPs  Average pCEPs  Average MS-SSIM  Average PSNR  Average CQM
LaBGen [6] 5.33 9.7096 0.1108 0.0420 0.8487 25.3282 26.3608
LaBGen-P [7] 4.33 9.6487 0.1100 0.0410 0.8504 25.2447 26.2834
Photomontage [3] 9.17 10.1272 0.1210 0.0441 0.8390 24.3478 25.4186
SC-SOBS-C4 [9] 7.17 10.0232 0.1186 0.0420 0.8403 24.5562 25.6570
MAGRPCA [10] 8.00 10.9525 0.1131 0.0419 0.8503 24.4999 25.6199
Temporal median filter [2] 1.50 9.0892 0.1063 0.0404 0.8556 25.5526 26.6236
BE-AAPSA [14] 10.00 10.1994 0.1246 0.0584 0.8373 24.5489 25.6301
Bidirectional Analysis [13] 7.00 10.1835 0.1090 0.0341 0.8385 23.6701 24.8364
Bidirectional Analysis and Consensus Voting [12] 7.50 10.0040 0.1098 0.0365 0.8369 23.5048 24.6902
TMFG [11] 2.67 9.2013 0.1069 0.0408 0.8543 25.4488 26.5210
FC-FlowNet [5] 7.67 10.2805 0.1138 0.0446 0.8489 25.0240 26.0705
RSL2011 [4] 12.50 10.5876 0.1237 0.0493 0.8059 22.6947 23.8316
AAPSA [1] 8.17 10.2185 0.1202 0.0382 0.8545 23.2861 24.2624
RMR [8] 14.00 11.5991 0.1468 0.0624 0.7806 22.1418 23.2998

Results for the illumination changes category.

Click on method name for more details.

Method  Average ranking  Average AGE  Average pEPs  Average pCEPs  Average MS-SSIM  Average PSNR  Average CQM
LaBGen [6] 2.67 6.1922 0.0440 0.0206 0.9725 29.7108 30.5510
LaBGen-P [7] 5.83 7.4945 0.0611 0.0376 0.9630 25.2155 26.3522
Photomontage [3] 1.50 5.2668 0.0329 0.0155 0.9743 30.2102 31.0393
SC-SOBS-C4 [9] 8.33 10.3591 0.1005 0.0574 0.9075 26.2190 27.0837
MAGRPCA [10] 5.00 7.7987 0.1158 0.0804 0.9760 31.9554 32.7325
Temporal median filter [2] 10.17 12.2205 0.2322 0.1783 0.9400 24.3156 25.3760
BE-AAPSA [14] 5.83 7.0447 0.0694 0.0451 0.9613 27.4897 28.2828
Bidirectional Analysis [13] 12.00 16.8302 0.1833 0.0417 0.8252 19.3961 20.6819
Bidirectional Analysis and Consensus Voting [12] 11.17 16.1236 0.1192 0.0668 0.8653 21.4330 22.5940
TMFG [11] 13.50 22.0886 0.3025 0.2119 0.8612 20.8670 22.0259
FC-FlowNet [5] 10.83 13.3662 0.2584 0.1906 0.9452 24.2047 25.2306
RSL2011 [4] 9.50 9.1963 0.0996 0.0669 0.9349 23.9579 25.1714
AAPSA [1] 3.67 6.7259 0.0562 0.0250 0.9728 28.5080 29.2912
RMR [8] 5.00 7.1869 0.0656 0.0403 0.9485 28.7961 29.8049

Results for the background motion category.

Click on method name for more details.

Method  Average ranking  Average AGE  Average pEPs  Average pCEPs  Average MS-SSIM  Average PSNR  Average CQM
LaBGen [6] 7.50 10.4996 0.1356 0.0334 0.8465 25.4982 26.1616
LaBGen-P [7] 7.67 10.5858 0.1350 0.0336 0.8442 25.7879 26.5748
Photomontage [3] 12.67 12.0930 0.1589 0.0410 0.8244 23.5420 24.5253
SC-SOBS-C4 [9] 8.67 10.7280 0.1481 0.0302 0.8486 24.5806 25.5603
MAGRPCA [10] 4.50 10.0742 0.1318 0.0299 0.8748 25.8756 26.7832
Temporal median filter [2] 4.17 9.6479 0.1262 0.0300 0.8566 25.9277 26.7931
BE-AAPSA [14] 2.17 9.3755 0.1266 0.0259 0.8766 26.0041 26.9062
Bidirectional Analysis [13] 7.83 10.7772 0.1350 0.0273 0.8574 24.0802 25.0203
Bidirectional Analysis and Consensus Voting [12] 6.17 10.5260 0.1316 0.0269 0.8553 25.2106 26.1305
TMFG [11] 1.33 9.0773 0.1185 0.0228 0.8706 26.4052 27.2770
FC-FlowNet [5] 5.33 10.0539 0.1440 0.0362 0.8612 26.0302 26.9014
RSL2011 [4] 13.50 13.2090 0.1604 0.0531 0.8040 23.8834 24.7017
AAPSA [1] 10.67 11.1404 0.1488 0.0381 0.8422 24.4876 25.4679
RMR [8] 12.83 12.2932 0.1682 0.0464 0.8151 23.9116 24.7040

Results for the very long category.

Click on method name for more details.

Method  Average ranking  Average AGE  Average pEPs  Average pCEPs  Average MS-SSIM  Average PSNR  Average CQM
LaBGen [6] 4.67 5.5689 0.0327 0.0075 0.9807 29.2590 30.1356
LaBGen-P [7] 10.00 7.3028 0.0803 0.0395 0.9760 28.2974 29.1409
Photomontage [3] 7.33 6.6446 0.0629 0.0259 0.9838 29.2081 30.0166
SC-SOBS-C4 [9] 4.50 6.0638 0.0355 0.0021 0.9837 29.2615 30.1014
MAGRPCA [10] 2.17 3.9164 0.0184 0.0040 0.9861 31.3101 32.0265
Temporal median filter [2] 5.83 6.9588 0.0557 0.0199 0.9843 29.3160 30.2107
BE-AAPSA [14] 1.33 3.8745 0.0148 0.0010 0.9844 32.5501 33.1853
Bidirectional Analysis [13] 7.33 6.7502 0.0395 0.0012 0.9675 27.8260 28.7710
Bidirectional Analysis and Consensus Voting [12] 9.00 7.0049 0.0429 0.0045 0.9558 27.6839 28.6442
TMFG [11] 7.50 7.3530 0.0671 0.0274 0.9851 29.2432 30.1213
FC-FlowNet [5] 9.33 7.7727 0.0673 0.0189 0.9690 28.6840 29.5294
RSL2011 [4] 13.67 13.2990 0.1980 0.1170 0.8544 23.9750 25.0400
AAPSA [1] 9.00 6.6297 0.0547 0.0129 0.9614 27.4929 28.3462
RMR [8] 13.33 13.2459 0.2547 0.1571 0.9115 25.1309 26.0738

Results for the very short category.

Click on method name for more details.

Method  Average ranking  Average AGE  Average pEPs  Average pCEPs  Average MS-SSIM  Average PSNR  Average CQM
LaBGen [6] 5.33 5.3093 0.0332 0.0136 0.9474 30.1754 30.8162
LaBGen-P [7] 6.50 5.5653 0.0417 0.0199 0.9461 29.8264 30.4694
Photomontage [3] 1.50 4.9770 0.0327 0.0030 0.9548 31.0117 31.6568
SC-SOBS-C4 [9] 3.17 5.2953 0.0330 0.0044 0.9556 30.4997 31.1813
MAGRPCA [10] 10.33 8.3363 0.0999 0.0778 0.9511 27.8815 28.6430
Temporal median filter [2] 2.50 5.1336 0.0324 0.0120 0.9537 30.6255 31.3012
BE-AAPSA [14] 10.83 8.0857 0.0832 0.0429 0.8921 25.6458 26.5128
Bidirectional Analysis [13] 8.17 6.3972 0.0480 0.0126 0.9279 27.0145 27.8472
Bidirectional Analysis and Consensus Voting [12] 11.83 9.0271 0.0850 0.0366 0.8902 22.8906 23.8972
TMFG [11] 2.83 5.1825 0.0321 0.0106 0.9520 30.6010 31.2242
FC-FlowNet [5] 8.00 6.5094 0.0541 0.0211 0.9345 28.9447 29.6743
RSL2011 [4] 12.50 9.0950 0.0877 0.0410 0.8859 24.2435 25.1426
AAPSA [1] 13.33 9.2952 0.0860 0.0438 0.8870 22.7636 23.7275
RMR [8] 8.17 6.4059 0.0541 0.0137 0.9329 28.0882 28.8900