Checking for the submission file ...
Clearing the temporary folder ...
Unzipping the submission file to the temporary folder ...
Calling: 7z x -o/var/www/html/scanrefer/tmp/extracted_results /var/www/html/scanrefer/server/uploads/2_submission_143_upload_2-8845d66a6ce24c0950bc2aa8b089ae93.zip
Running evaluations ...

Captioning Evaluation
----------------------Evaluation @ 0 IoU-----------------------
[BLEU-1]  Precision: 0.3759, Recall: 0.7761, Max: 1.0000, Min: 0.2355
[BLEU-2]  Precision: 0.3089, Recall: 0.6378, Max: 1.0000, Min: 0.1839
[BLEU-3]  Precision: 0.2395, Recall: 0.4944, Max: 1.0000, Min: 0.0000
[BLEU-4]  Precision: 0.1725, Recall: 0.3561, Max: 1.0000, Min: 0.0000
[CIDEr]   Precision: 0.2998, Recall: 0.6189, Max: 4.2669, Min: 0.0000
[ROUGE-L] Precision: 0.3112, Recall: 0.6424, Max: 1.0000, Min: 0.3313
[METEOR]  Precision: 0.1471, Recall: 0.3038, Max: 1.0000, Min: 0.1274
----------------------Evaluation @ 0.25 IoU-----------------------
[BLEU-1]  Precision: 0.3100, Recall: 0.6400, Max: 1.0000, Min: 0.2355
[BLEU-2]  Precision: 0.2578, Recall: 0.5321, Max: 1.0000, Min: 0.1839
[BLEU-3]  Precision: 0.2033, Recall: 0.4197, Max: 1.0000, Min: 0.0000
[BLEU-4]  Precision: 0.1518, Recall: 0.3134, Max: 1.0000, Min: 0.0000
[CIDEr]   Precision: 0.2786, Recall: 0.5751, Max: 4.2669, Min: 0.0000
[ROUGE-L] Precision: 0.2588, Recall: 0.5343, Max: 1.0000, Min: 0.3313
[METEOR]  Precision: 0.1248, Recall: 0.2576, Max: 1.0000, Min: 0.1274
----------------------Evaluation @ 0.5 IoU-----------------------
[BLEU-1]  Precision: 0.2806, Recall: 0.5793, Max: 1.0000, Min: 0.2355
[BLEU-2]  Precision: 0.2337, Recall: 0.4824, Max: 1.0000, Min: 0.1839
[BLEU-3]  Precision: 0.1847, Recall: 0.3813, Max: 1.0000, Min: 0.0000
[BLEU-4]  Precision: 0.1380, Recall: 0.2849, Max: 1.0000, Min: 0.0000
[CIDEr]   Precision: 0.2565, Recall: 0.5294, Max: 4.2669, Min: 0.0000
[ROUGE-L] Precision: 0.2347, Recall: 0.4845, Max: 1.0000, Min: 0.3313
[METEOR]  Precision: 0.1134, Recall: 0.2340, Max: 1.0000, Min: 0.1274

---------- iou_thresh: 0.250000 ----------
eval cabinet Average Precision: 0.637899
eval bed Average Precision: 0.869994
eval chair Average Precision: 0.627260
eval sofa Average Precision: 0.557498
eval table Average Precision: 0.581820
eval door Average Precision: 0.492248
eval window Average Precision: 0.600972
eval bookshelf Average Precision: 0.620513
eval picture Average Precision: 0.467614
eval counter Average Precision: 0.660624
eval desk Average Precision: 0.772809
eval curtain Average Precision: 0.612857
eval refrigerator Average Precision: 0.005578
eval shower curtain Average Precision: 0.928571
eval toilet Average Precision: 0.226163
eval sink Average Precision: 0.713849
eval bathtub Average Precision: 1.000000
eval others Average Precision: 0.298132
eval mAP: 0.593022
eval cabinet Recall: 0.823529
eval bed Recall: 1.000000
eval chair Recall: 0.952880
eval sofa Recall: 0.807692
eval table Recall: 0.792683
eval door Recall: 0.926471
eval window Recall: 0.744681
eval bookshelf Recall: 0.846154
eval picture Recall: 0.727273
eval counter Recall: 1.000000
eval desk Recall: 0.861111
eval curtain Recall: 0.700000
eval refrigerator Recall: 0.866667
eval shower curtain Recall: 1.000000
eval toilet Recall: 0.714286
eval sink Recall: 0.913043
eval bathtub Recall: 1.000000
eval others Recall: 0.734146
eval AR: 0.856145

---------- iou_thresh: 0.500000 ----------
eval cabinet Average Precision: 0.517060
eval bed Average Precision: 0.811413
eval chair Average Precision: 0.580341
eval sofa Average Precision: 0.493700
eval table Average Precision: 0.522213
eval door Average Precision: 0.415938
eval window Average Precision: 0.361756
eval bookshelf Average Precision: 0.620513
eval picture Average Precision: 0.467614
eval counter Average Precision: 0.484848
eval desk Average Precision: 0.570378
eval curtain Average Precision: 0.367143
eval refrigerator Average Precision: 0.003300
eval shower curtain Average Precision: 0.416667
eval toilet Average Precision: 0.226163
eval sink Average Precision: 0.623904
eval bathtub Average Precision: 1.000000
eval others Average Precision: 0.257903
eval mAP: 0.485603
eval cabinet Recall: 0.705882
eval bed Recall: 0.900000
eval chair Recall: 0.895288
eval sofa Recall: 0.730769
eval table Recall: 0.719512
eval door Recall: 0.823529
eval window Recall: 0.553191
eval bookshelf Recall: 0.846154
eval picture Recall: 0.727273
eval counter Recall: 0.818182
eval desk Recall: 0.722222
eval curtain Recall: 0.500000
eval refrigerator Recall: 0.666667
eval shower curtain Recall: 0.500000
eval toilet Recall: 0.714286
eval sink Recall: 0.913043
eval bathtub Recall: 1.000000
eval others Recall: 0.668293
eval AR: 0.744683

|             | METEOR: 0.1500 | METEOR: 0.3000 | METEOR: 0.4500 | METEOR: 0.6000 | METEOR: 0.7500 |
| IoU: 0.1000 |         0.2253 |         0.2233 |         0.2057 |         0.1173 |         0.0184 |
| IoU: 0.2000 |         0.2129 |         0.2109 |         0.1947 |         0.1117 |         0.0180 |
| IoU: 0.3000 |         0.1997 |         0.1977 |         0.1821 |         0.1055 |         0.0177 |
| IoU: 0.4000 |         0.1912 |         0.1893 |         0.1743 |         0.1004 |         0.0168 |
| IoU: 0.5000 |         0.1775 |         0.1758 |         0.1623 |         0.0940 |         0.0154 |

InsertResults
connected to database
#localization-results = 3
submitting_to_test_set
updated
submitted to db
Evaluations done, finishing up ...
Done.
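Note: the "Evaluation @ k IoU" blocks above only credit a caption when its predicted box sufficiently overlaps a ground-truth box. Below is a minimal sketch of that gating step, assuming axis-aligned 3D boxes in (xmin, ymin, zmin, xmax, ymax, zmax) form; the function names (`box3d_iou`, `gate_caption_scores`) and the exact rule (zeroing the score of any prediction whose best IoU falls below the threshold) are illustrative assumptions, not the benchmark server's actual implementation.

```python
import numpy as np

def box3d_iou(box_a, box_b):
    """Axis-aligned 3D IoU for boxes given as (xmin, ymin, zmin, xmax, ymax, zmax)."""
    lo = np.maximum(box_a[:3], box_b[:3])          # lower corner of the intersection
    hi = np.minimum(box_a[3:], box_b[3:])          # upper corner of the intersection
    inter = np.prod(np.clip(hi - lo, 0.0, None))   # zero volume if the boxes do not overlap
    vol_a = np.prod(box_a[3:] - box_a[:3])
    vol_b = np.prod(box_b[3:] - box_b[:3])
    union = vol_a + vol_b - inter
    return float(inter / union) if union > 0 else 0.0

def gate_caption_scores(pred_boxes, gt_boxes, caption_scores, iou_thresh):
    """Zero out per-sample caption scores (e.g. METEOR) for predictions whose
    best-matching ground-truth box has IoU below the threshold (assumed gating rule)."""
    gated = []
    for box, score in zip(pred_boxes, caption_scores):
        best_iou = max((box3d_iou(box, gt) for gt in gt_boxes), default=0.0)
        gated.append(score if best_iou >= iou_thresh else 0.0)
    return np.asarray(gated)

if __name__ == "__main__":
    # toy example: one well-localized prediction, one far off
    preds = np.array([[0, 0, 0, 1, 1, 1], [5, 5, 5, 6, 6, 6]], dtype=float)
    gts = np.array([[0.1, 0.1, 0.1, 1.1, 1.1, 1.1]], dtype=float)
    scores = np.array([0.30, 0.25])
    print(gate_caption_scores(preds, gts, scores, iou_thresh=0.5))  # -> [0.3 0.0]
```

Averaging the gated scores over all predictions versus over all ground-truth objects would give precision-like and recall-like aggregates, which is one plausible reading of the Precision/Recall columns reported above.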