Checking for the submission file ...
Clearing the temporary folder ...
Unzipping the submission file to the temporary folder ...
Calling: 7z x -o/var/www/html/scanrefer/tmp/extracted_results /var/www/html/scanrefer/server/uploads/2_submission_142_upload_1-5aa9f1f95a0710ba0118a4b7d9358463.zip
Running evaluations ...
Captioning Evaluation
----------------------Evaluation @ 0 IoU-----------------------
[BLEU-1]  Precision: 0.1600, Recall: 0.7836, Max: 1.0000, Min: 0.2355
[BLEU-2]  Precision: 0.1323, Recall: 0.6483, Max: 1.0000, Min: 0.1839
[BLEU-3]  Precision: 0.1035, Recall: 0.5072, Max: 1.0000, Min: 0.0000
[BLEU-4]  Precision: 0.0762, Recall: 0.3735, Max: 1.0000, Min: 0.0000
[CIDEr]   Precision: 0.1397, Recall: 0.6842, Max: 4.2669, Min: 0.0000
[ROUGE-L] Precision: 0.1330, Recall: 0.6516, Max: 1.0000, Min: 0.3313
[METEOR]  Precision: 0.0636, Recall: 0.3116, Max: 1.0000, Min: 0.1295
----------------------Evaluation @ 0.25 IoU-----------------------
[BLEU-1]  Precision: 0.1510, Recall: 0.7395, Max: 1.0000, Min: 0.2355
[BLEU-2]  Precision: 0.1254, Recall: 0.6141, Max: 1.0000, Min: 0.1839
[BLEU-3]  Precision: 0.0986, Recall: 0.4828, Max: 1.0000, Min: 0.0000
[BLEU-4]  Precision: 0.0733, Recall: 0.3592, Max: 1.0000, Min: 0.0000
[CIDEr]   Precision: 0.1352, Recall: 0.6624, Max: 4.2669, Min: 0.0000
[ROUGE-L] Precision: 0.1257, Recall: 0.6159, Max: 1.0000, Min: 0.3313
[METEOR]  Precision: 0.0603, Recall: 0.2954, Max: 1.0000, Min: 0.1295
----------------------Evaluation @ 0.5 IoU-----------------------
[BLEU-1]  Precision: 0.1359, Recall: 0.6657, Max: 1.0000, Min: 0.2355
[BLEU-2]  Precision: 0.1130, Recall: 0.5534, Max: 1.0000, Min: 0.1839
[BLEU-3]  Precision: 0.0890, Recall: 0.4360, Max: 1.0000, Min: 0.0000
[BLEU-4]  Precision: 0.0664, Recall: 0.3254, Max: 1.0000, Min: 0.0000
[CIDEr]   Precision: 0.1236, Recall: 0.6055, Max: 4.2669, Min: 0.0000
[ROUGE-L] Precision: 0.1134, Recall: 0.5557, Max: 1.0000, Min: 0.3313
[METEOR]  Precision: 0.0546, Recall: 0.2675, Max: 1.0000, Min: 0.1295
---------- iou_thresh: 0.250000 ----------
eval cabinet Average Precision: 0.668315
eval bed Average Precision: 0.909384
eval chair Average Precision: 0.642603
eval sofa Average Precision: 0.656509
eval table Average Precision: 0.636833
eval door Average Precision: 0.502103
eval window Average Precision: 0.656194
eval bookshelf Average Precision: 0.620513
eval picture Average Precision: 0.489605
eval counter Average Precision: 0.660624
eval desk Average Precision: 0.810507
eval curtain Average Precision: 0.638664
eval refrigerator Average Precision: 0.002363
eval shower curtain Average Precision: 0.928571
eval toilet Average Precision: 0.225525
eval sink Average Precision: 0.713849
eval bathtub Average Precision: 1.000000
eval others Average Precision: 0.327747
eval mAP: 0.616106
eval cabinet Recall: 0.931373
eval bed Recall: 1.000000
eval chair Recall: 0.984293
eval sofa Recall: 1.000000
eval table Recall: 0.939024
eval door Recall: 0.955882
eval window Recall: 0.893617
eval bookshelf Recall: 0.846154
eval picture Recall: 0.818182
eval counter Recall: 1.000000
eval desk Recall: 0.944444
eval curtain Recall: 0.800000
eval refrigerator Recall: 0.866667
eval shower curtain Recall: 1.000000
eval toilet Recall: 0.714286
eval sink Recall: 0.913043
eval bathtub Recall: 1.000000
eval others Recall: 0.919512
eval AR: 0.918138
---------- iou_thresh: 0.500000 ----------
eval cabinet Average Precision: 0.552645
eval bed Average Precision: 0.849413
eval chair Average Precision: 0.596056
eval sofa Average Precision: 0.584431
eval table Average Precision: 0.568197
eval door Average Precision: 0.431520
eval window Average Precision: 0.395440
eval bookshelf Average Precision: 0.620513
eval picture Average Precision: 0.483718
eval counter Average Precision: 0.484848
eval desk Average Precision: 0.601325
eval curtain Average Precision: 0.386498
eval refrigerator Average Precision: 0.001399
eval shower curtain Average Precision: 0.416667
eval toilet Average Precision: 0.225525
eval sink Average Precision: 0.655688
eval bathtub Average Precision: 1.000000
eval others Average Precision: 0.283074
eval mAP: 0.507609
eval cabinet Recall: 0.813725
eval bed Recall: 0.950000
eval chair Recall: 0.937173
eval sofa Recall: 0.923077
eval table Recall: 0.865854
eval door Recall: 0.882353
eval window Recall: 0.680851
eval bookshelf Recall: 0.846154
eval picture Recall: 0.787879
eval counter Recall: 0.818182
eval desk Recall: 0.805556
eval curtain Recall: 0.600000
eval refrigerator Recall: 0.666667
eval shower curtain Recall: 0.500000
eval toilet Recall: 0.714286
eval sink Recall: 0.913043
eval bathtub Recall: 1.000000
eval others Recall: 0.831707
eval AR: 0.807584
|             | METEOR: 0.1500 | METEOR: 0.3000 | METEOR: 0.4500 | METEOR: 0.6000 | METEOR: 0.7500 |
| IoU: 0.1000 | 0.2432         | 0.2411         | 0.2228         | 0.1264         | 0.0198         |
| IoU: 0.2000 | 0.2303         | 0.2283         | 0.2108         | 0.1204         | 0.0193         |
| IoU: 0.3000 | 0.2156         | 0.2136         | 0.1968         | 0.1135         | 0.0188         |
| IoU: 0.4000 | 0.2060         | 0.2040         | 0.1879         | 0.1078         | 0.0178         |
| IoU: 0.5000 | 0.1915         | 0.1897         | 0.1752         | 0.1010         | 0.0163         |
InsertResults connected to database
#localization-results = 3
submitting_to_test_set updated
submitted to db
Evaluations done, finishing up ...
Done.
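Note: the aggregate detection scores in the log are consistent with an unweighted mean over the 18 evaluated classes. The sketch below is a hypothetical sanity check (not the server's code) that reproduces the IoU-0.25 "eval mAP" and "eval AR" lines directly from the per-class values printed above.

    # Per-class values copied verbatim from the IoU 0.25 section of the log,
    # in the order cabinet ... others.
    ap_025 = [0.668315, 0.909384, 0.642603, 0.656509, 0.636833, 0.502103,
              0.656194, 0.620513, 0.489605, 0.660624, 0.810507, 0.638664,
              0.002363, 0.928571, 0.225525, 0.713849, 1.000000, 0.327747]
    recall_025 = [0.931373, 1.000000, 0.984293, 1.000000, 0.939024, 0.955882,
                  0.893617, 0.846154, 0.818182, 1.000000, 0.944444, 0.800000,
                  0.866667, 1.000000, 0.714286, 0.913043, 1.000000, 0.919512]

    # Unweighted class means reproduce the logged aggregates.
    mAP = sum(ap_025) / len(ap_025)         # -> 0.616106 ("eval mAP")
    AR  = sum(recall_025) / len(recall_025) # -> 0.918138 ("eval AR")
    print(f"mAP@0.25: {mAP:.6f}  AR@0.25: {AR:.6f}")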
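Note: the extraction step logged at the top ("Calling: 7z x -o...") is an ordinary 7-Zip command line. A minimal sketch of issuing the same call from Python is shown below; the subprocess wrapper is an assumption for illustration only (the server's actual implementation is not shown in the log), while the paths and flags are copied from the logged command.

    import subprocess

    # Paths as they appear in the log above.
    zip_path = ("/var/www/html/scanrefer/server/uploads/"
                "2_submission_142_upload_1-5aa9f1f95a0710ba0118a4b7d9358463.zip")
    out_dir = "/var/www/html/scanrefer/tmp/extracted_results"

    # "7z x" extracts with full paths; "-o<dir>" sets the output directory
    # (no space between -o and the path).
    subprocess.run(["7z", "x", f"-o{out_dir}", zip_path], check=True)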