Checking for the submission file ...
Clearing the temporary folder ...
Unzipping the submission file to the temporary folder ...
Calling: 7z x -o/var/www/html/scanrefer/tmp/extracted_results /var/www/html/scanrefer/server/uploads/2_submission_143_upload_1-8845d66a6ce24c0950bc2aa8b089ae93.zip
Running evaluations ...

Captioning Evaluation
----------------------Evaluation @ 0 IoU-----------------------
[BLEU-1] Precision: 0.2931, Recall: 0.7799, Max: 1.0000, Min: 0.2355
[BLEU-2] Precision: 0.2421, Recall: 0.6441, Max: 1.0000, Min: 0.1839
[BLEU-3] Precision: 0.1885, Recall: 0.5016, Max: 1.0000, Min: 0.0000
[BLEU-4] Precision: 0.1375, Recall: 0.3660, Max: 1.0000, Min: 0.0000
[CIDEr] Precision: 0.2454, Recall: 0.6529, Max: 4.2669, Min: 0.0000
[ROUGE-L] Precision: 0.2434, Recall: 0.6477, Max: 1.0000, Min: 0.3313
[METEOR] Precision: 0.1154, Recall: 0.3071, Max: 1.0000, Min: 0.1171
----------------------Evaluation @ 0.25 IoU-----------------------
[BLEU-1] Precision: 0.2595, Recall: 0.6905, Max: 1.0000, Min: 0.2355
[BLEU-2] Precision: 0.2157, Recall: 0.5739, Max: 1.0000, Min: 0.1839
[BLEU-3] Precision: 0.1699, Recall: 0.4520, Max: 1.0000, Min: 0.0000
[BLEU-4] Precision: 0.1267, Recall: 0.3371, Max: 1.0000, Min: 0.0000
[CIDEr] Precision: 0.2329, Recall: 0.6197, Max: 4.2669, Min: 0.0000
[ROUGE-L] Precision: 0.2165, Recall: 0.5760, Max: 1.0000, Min: 0.3313
[METEOR] Precision: 0.1038, Recall: 0.2763, Max: 1.0000, Min: 0.1171
----------------------Evaluation @ 0.5 IoU-----------------------
[BLEU-1] Precision: 0.2356, Recall: 0.6270, Max: 1.0000, Min: 0.2355
[BLEU-2] Precision: 0.1961, Recall: 0.5217, Max: 1.0000, Min: 0.1839
[BLEU-3] Precision: 0.1547, Recall: 0.4117, Max: 1.0000, Min: 0.0000
[BLEU-4] Precision: 0.1155, Recall: 0.3073, Max: 1.0000, Min: 0.0000
[CIDEr] Precision: 0.2152, Recall: 0.5727, Max: 4.2669, Min: 0.0000
[ROUGE-L] Precision: 0.1969, Recall: 0.5238, Max: 1.0000, Min: 0.3313
[METEOR] Precision: 0.0947, Recall: 0.2520, Max: 1.0000, Min: 0.1171
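The Precision / Recall rows above suggest a per-box caption score aggregated under an IoU gate: each metric is averaged once over predicted boxes (Precision) and once over ground-truth objects (Recall), with predictions below the IoU threshold contributing nothing. The server's exact matching and averaging are not shown in this log, so the following is only a minimal sketch of that style of aggregation; the function name, the (iou, score) pair format, and the toy inputs are all hypothetical.

```python
# Minimal sketch, NOT the evaluation server's code: illustrate IoU-gated
# precision / recall aggregation of a per-box caption score.
def aggregate_caption_metric(preds, num_gt, iou_thresh):
    """preds: hypothetical list of (iou_with_matched_gt, caption_score) pairs,
    one per predicted box; num_gt: number of ground-truth objects."""
    # Assumed gating: a prediction keeps its caption score only if its box
    # overlaps its matched ground-truth object by at least iou_thresh.
    gated = [score if iou >= iou_thresh else 0.0 for iou, score in preds]
    raw_scores = [score for _, score in preds]
    return {
        "Precision": sum(gated) / len(preds),  # averaged over predicted boxes
        "Recall": sum(gated) / num_gt,         # averaged over ground-truth objects
        "Max": max(raw_scores),                # per-box extremes (assumed ungated,
        "Min": min(raw_scores),                # since the logged Min stays > 0)
    }

if __name__ == "__main__":
    # Toy example: three predicted boxes with (IoU, METEOR-style score).
    preds = [(0.60, 0.30), (0.30, 0.20), (0.10, 0.25)]
    for thresh in (0.0, 0.25, 0.5):
        print(thresh, aggregate_caption_metric(preds, num_gt=2, iou_thresh=thresh))
```

As in the log, raising the IoU threshold can only lower Precision and Recall, since more predictions are zeroed out while the denominators stay fixed.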
---------- iou_thresh: 0.250000 ----------
eval cabinet Average Precision: 0.654543
eval bed Average Precision: 0.909384
eval chair Average Precision: 0.632521
eval sofa Average Precision: 0.581523
eval table Average Precision: 0.626704
eval door Average Precision: 0.500116
eval window Average Precision: 0.633006
eval bookshelf Average Precision: 0.620513
eval picture Average Precision: 0.476291
eval counter Average Precision: 0.660624
eval desk Average Precision: 0.795712
eval curtain Average Precision: 0.612857
eval refrigerator Average Precision: 0.004342
eval shower curtain Average Precision: 0.928571
eval toilet Average Precision: 0.225803
eval sink Average Precision: 0.713849
eval bathtub Average Precision: 1.000000
eval others Average Precision: 0.314480
eval mAP: 0.605047
eval cabinet Recall: 0.872549
eval bed Recall: 1.000000
eval chair Recall: 0.973822
eval sofa Recall: 0.923077
eval table Recall: 0.902439
eval door Recall: 0.955882
eval window Recall: 0.851064
eval bookshelf Recall: 0.846154
eval picture Recall: 0.757576
eval counter Recall: 1.000000
eval desk Recall: 0.916667
eval curtain Recall: 0.700000
eval refrigerator Recall: 0.866667
eval shower curtain Recall: 1.000000
eval toilet Recall: 0.714286
eval sink Recall: 0.913043
eval bathtub Recall: 1.000000
eval others Recall: 0.819512
eval AR: 0.889597
---------- iou_thresh: 0.500000 ----------
eval cabinet Average Precision: 0.536274
eval bed Average Precision: 0.849413
eval chair Average Precision: 0.585607
eval sofa Average Precision: 0.515549
eval table Average Precision: 0.563222
eval door Average Precision: 0.426780
eval window Average Precision: 0.378049
eval bookshelf Average Precision: 0.620513
eval picture Average Precision: 0.476291
eval counter Average Precision: 0.484848
eval desk Average Precision: 0.588787
eval curtain Average Precision: 0.367143
eval refrigerator Average Precision: 0.002569
eval shower curtain Average Precision: 0.416667
eval toilet Average Precision: 0.225803
eval sink Average Precision: 0.655688
eval bathtub Average Precision: 1.000000
eval others Average Precision: 0.272951
eval mAP: 0.498120
eval cabinet Recall: 0.754902
eval bed Recall: 0.950000
eval chair Recall: 0.916230
eval sofa Recall: 0.846154
eval table Recall: 0.829268
eval door Recall: 0.867647
eval window Recall: 0.638298
eval bookshelf Recall: 0.846154
eval picture Recall: 0.757576
eval counter Recall: 0.818182
eval desk Recall: 0.777778
eval curtain Recall: 0.500000
eval refrigerator Recall: 0.666667
eval shower curtain Recall: 0.500000
eval toilet Recall: 0.714286
eval sink Recall: 0.913043
eval bathtub Recall: 1.000000
eval others Recall: 0.751220
eval AR: 0.780411

|             | METEOR: 0.1500 | METEOR: 0.3000 | METEOR: 0.4500 | METEOR: 0.6000 | METEOR: 0.7500 |
| IoU: 0.1000 | 0.2351 | 0.2331 | 0.2149 | 0.1221 | 0.0191 |
| IoU: 0.2000 | 0.2228 | 0.2208 | 0.2039 | 0.1165 | 0.0186 |
| IoU: 0.3000 | 0.2087 | 0.2067 | 0.1906 | 0.1098 | 0.0182 |
| IoU: 0.4000 | 0.1997 | 0.1978 | 0.1823 | 0.1046 | 0.0172 |
| IoU: 0.5000 | 0.1857 | 0.1840 | 0.1700 | 0.0980 | 0.0158 |

InsertResults
connected to database
#localization-results = 3
submitting_to_test_set
updated
submitted to db
Evaluations done, finishing up ...
Done.
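As a sanity check on the detection summary above: the reported mAP and AR are the unweighted means of the 18 per-class values (the 18 ScanNet benchmark classes, including "others"). The short snippet below just re-averages the 0.25-IoU numbers copied from the log; it is not part of the evaluation server.

```python
# Reproduce the 0.25-IoU summary numbers from the per-class values in the log.
ap_at_025 = [
    0.654543, 0.909384, 0.632521, 0.581523, 0.626704, 0.500116,
    0.633006, 0.620513, 0.476291, 0.660624, 0.795712, 0.612857,
    0.004342, 0.928571, 0.225803, 0.713849, 1.000000, 0.314480,
]
recall_at_025 = [
    0.872549, 1.000000, 0.973822, 0.923077, 0.902439, 0.955882,
    0.851064, 0.846154, 0.757576, 1.000000, 0.916667, 0.700000,
    0.866667, 1.000000, 0.714286, 0.913043, 1.000000, 0.819512,
]
print(f"mAP@0.25: {sum(ap_at_025) / len(ap_at_025):.6f}")          # 0.605047
print(f"AR@0.25:  {sum(recall_at_025) / len(recall_at_025):.6f}")  # 0.889597
```

The same unweighted averaging reproduces the 0.5-IoU summary (mAP 0.498120, AR 0.780411) from its per-class rows.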