Checking for the submission file ...
Clearing the temporary folder ...
Unzipping the submission file to the temporary folder ...
Calling: 7z x -o/var/www/html/scanrefer/tmp/extracted_results /var/www/html/scanrefer/server/uploads/2_submission_141_upload_1-d4d048ff7eba72b226fb990ddc2d10c5.zip
Running evaluations ...

Captioning Evaluation

---------------------- Evaluation @ 0 IoU ----------------------
[BLEU-1]  Precision: 0.0891, Recall: 0.7840, Max: 1.0000, Min: 0.2355
[BLEU-2]  Precision: 0.0738, Recall: 0.6487, Max: 1.0000, Min: 0.1839
[BLEU-3]  Precision: 0.0577, Recall: 0.5075, Max: 1.0000, Min: 0.0000
[BLEU-4]  Precision: 0.0425, Recall: 0.3736, Max: 1.0000, Min: 0.0000
[CIDEr]   Precision: 0.0781, Recall: 0.6872, Max: 4.2669, Min: 0.0000
[ROUGE-L] Precision: 0.0741, Recall: 0.6519, Max: 1.0000, Min: 0.3313
[METEOR]  Precision: 0.0355, Recall: 0.3119, Max: 1.0000, Min: 0.1295

---------------------- Evaluation @ 0.25 IoU ----------------------
[BLEU-1]  Precision: 0.0849, Recall: 0.7467, Max: 1.0000, Min: 0.2355
[BLEU-2]  Precision: 0.0704, Recall: 0.6195, Max: 1.0000, Min: 0.1839
[BLEU-3]  Precision: 0.0553, Recall: 0.4864, Max: 1.0000, Min: 0.0000
[BLEU-4]  Precision: 0.0411, Recall: 0.3610, Max: 1.0000, Min: 0.0000
[CIDEr]   Precision: 0.0761, Recall: 0.6692, Max: 4.2669, Min: 0.0000
[ROUGE-L] Precision: 0.0707, Recall: 0.6219, Max: 1.0000, Min: 0.3313
[METEOR]  Precision: 0.0339, Recall: 0.2984, Max: 1.0000, Min: 0.1295

---------------------- Evaluation @ 0.5 IoU ----------------------
[BLEU-1]  Precision: 0.0767, Recall: 0.6745, Max: 1.0000, Min: 0.2355
[BLEU-2]  Precision: 0.0637, Recall: 0.5603, Max: 1.0000, Min: 0.1839
[BLEU-3]  Precision: 0.0501, Recall: 0.4410, Max: 1.0000, Min: 0.0000
[BLEU-4]  Precision: 0.0373, Recall: 0.3284, Max: 1.0000, Min: 0.0000
[CIDEr]   Precision: 0.0700, Recall: 0.6155, Max: 4.2669, Min: 0.0000
[ROUGE-L] Precision: 0.0640, Recall: 0.5633, Max: 1.0000, Min: 0.3313
[METEOR]  Precision: 0.0308, Recall: 0.2712, Max: 1.0000, Min: 0.1295
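The three captioning blocks above report the same metrics at increasingly strict box-overlap thresholds (0, 0.25, and 0.5 IoU): a generated caption is only scored against a ground-truth description when its predicted box overlaps the ground-truth box by at least that IoU. For axis-aligned 3D boxes, the overlap test reduces to per-axis interval intersection. A minimal sketch, assuming a hypothetical `(cx, cy, cz, dx, dy, dz)` center/size box format (the benchmark's real evaluation may use a different parameterization, e.g. corners or oriented boxes):

```python
def box3d_iou_aabb(box_a, box_b):
    """IoU of two axis-aligned 3D boxes given as (cx, cy, cz, dx, dy, dz).

    Illustrative sketch only; not the benchmark's actual implementation.
    """
    inter = 1.0
    for axis in range(3):
        a_min = box_a[axis] - box_a[axis + 3] / 2.0
        a_max = box_a[axis] + box_a[axis + 3] / 2.0
        b_min = box_b[axis] - box_b[axis + 3] / 2.0
        b_max = box_b[axis] + box_b[axis + 3] / 2.0
        # Overlap length along this axis, clamped at zero when disjoint.
        inter *= max(0.0, min(a_max, b_max) - max(a_min, b_min))
    vol_a = box_a[3] * box_a[4] * box_a[5]
    vol_b = box_b[3] * box_b[4] * box_b[5]
    return inter / (vol_a + vol_b - inter)

# Two unit-offset 2x2x2 cubes share a 1x2x2 slab: IoU = 4 / (8 + 8 - 4) = 1/3.
print(box3d_iou_aabb((0, 0, 0, 2, 2, 2), (1, 0, 0, 2, 2, 2)))
```

This also makes the pattern in the numbers plausible: as the IoU threshold rises from 0 to 0.5, fewer predicted boxes qualify as matches, so every metric's precision and recall decrease monotonically.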
---------- iou_thresh: 0.250000 ----------
eval cabinet Average Precision: 0.668315
eval bed Average Precision: 0.909384
eval chair Average Precision: 0.642603
eval sofa Average Precision: 0.656509
eval table Average Precision: 0.636833
eval door Average Precision: 0.502103
eval window Average Precision: 0.656194
eval bookshelf Average Precision: 0.620513
eval picture Average Precision: 0.489605
eval counter Average Precision: 0.660624
eval desk Average Precision: 0.810507
eval curtain Average Precision: 0.638664
eval refrigerator Average Precision: 0.002363
eval shower curtain Average Precision: 0.928571
eval toilet Average Precision: 0.225525
eval sink Average Precision: 0.713849
eval bathtub Average Precision: 1.000000
eval others Average Precision: 0.327747
eval mAP: 0.616106
eval cabinet Recall: 0.931373
eval bed Recall: 1.000000
eval chair Recall: 0.984293
eval sofa Recall: 1.000000
eval table Recall: 0.939024
eval door Recall: 0.955882
eval window Recall: 0.893617
eval bookshelf Recall: 0.846154
eval picture Recall: 0.818182
eval counter Recall: 1.000000
eval desk Recall: 0.944444
eval curtain Recall: 0.800000
eval refrigerator Recall: 0.866667
eval shower curtain Recall: 1.000000
eval toilet Recall: 0.714286
eval sink Recall: 0.913043
eval bathtub Recall: 1.000000
eval others Recall: 0.919512
eval AR: 0.918138
---------- iou_thresh: 0.500000 ----------
eval cabinet Average Precision: 0.552645
eval bed Average Precision: 0.849413
eval chair Average Precision: 0.596056
eval sofa Average Precision: 0.584431
eval table Average Precision: 0.568197
eval door Average Precision: 0.431520
eval window Average Precision: 0.395440
eval bookshelf Average Precision: 0.620513
eval picture Average Precision: 0.483718
eval counter Average Precision: 0.484848
eval desk Average Precision: 0.601325
eval curtain Average Precision: 0.386498
eval refrigerator Average Precision: 0.001399
eval shower curtain Average Precision: 0.416667
eval toilet Average Precision: 0.225525
eval sink Average Precision: 0.655688
eval bathtub Average Precision: 1.000000
eval others Average Precision: 0.283074
eval mAP: 0.507609
eval cabinet Recall: 0.813725
eval bed Recall: 0.950000
eval chair Recall: 0.937173
eval sofa Recall: 0.923077
eval table Recall: 0.865854
eval door Recall: 0.882353
eval window Recall: 0.680851
eval bookshelf Recall: 0.846154
eval picture Recall: 0.787879
eval counter Recall: 0.818182
eval desk Recall: 0.805556
eval curtain Recall: 0.600000
eval refrigerator Recall: 0.666667
eval shower curtain Recall: 0.500000
eval toilet Recall: 0.714286
eval sink Recall: 0.913043
eval bathtub Recall: 1.000000
eval others Recall: 0.831707
eval AR: 0.807584

|             | METEOR: 0.1500 | METEOR: 0.3000 | METEOR: 0.4500 | METEOR: 0.6000 | METEOR: 0.7500 |
| IoU: 0.1000 | 0.2447         | 0.2426         | 0.2243         | 0.1277         | 0.0200         |
| IoU: 0.2000 | 0.2319         | 0.2298         | 0.2124         | 0.1217         | 0.0194         |
| IoU: 0.3000 | 0.2174         | 0.2154         | 0.1986         | 0.1148         | 0.0189         |
| IoU: 0.4000 | 0.2074         | 0.2054         | 0.1896         | 0.1091         | 0.0180         |
| IoU: 0.5000 | 0.1927         | 0.1909         | 0.1766         | 0.1021         | 0.0165         |

InsertResults
connected to database
#localization-results = 3
submitting_to_test_set updated
submitted to db
Evaluations done, finishing up ...
Done.
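As a sanity check on the detection section, the reported `eval mAP` (and likewise `eval AR`) at each threshold is the unweighted mean of the 18 per-class values. For example, at 0.25 IoU:

```python
# Per-class Average Precision at iou_thresh 0.25, copied from the log above.
ap_025 = {
    "cabinet": 0.668315, "bed": 0.909384, "chair": 0.642603,
    "sofa": 0.656509, "table": 0.636833, "door": 0.502103,
    "window": 0.656194, "bookshelf": 0.620513, "picture": 0.489605,
    "counter": 0.660624, "desk": 0.810507, "curtain": 0.638664,
    "refrigerator": 0.002363, "shower curtain": 0.928571,
    "toilet": 0.225525, "sink": 0.713849, "bathtub": 1.000000,
    "others": 0.327747,
}

# Unweighted mean over the 18 classes reproduces the reported mAP.
mAP = sum(ap_025.values()) / len(ap_025)
print(round(mAP, 6))  # 0.616106, matching "eval mAP" in the log
```

The unweighted mean also explains why a single near-zero class (refrigerator AP 0.002363 here, likely a class-confusion or evaluation artifact given its recall is 0.866667) drags the headline mAP down by roughly 1/18 of an otherwise typical class score.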