3D Semantic Label with Limited Reconstructions Benchmark
The 3D semantic labeling task is to predict a semantic label for every vertex of a 3D scan mesh.
Evaluation and metrics

Our evaluation ranks all methods according to the PASCAL VOC intersection-over-union metric (IoU): IoU = TP/(TP+FP+FN), where TP, FP, and FN are the numbers of true positive, false positive, and false negative predictions, respectively. Predicted labels are evaluated per vertex on the corresponding 3D scan mesh; 3D approaches that operate on other representations, such as grids or points, should map their predicted labels onto the mesh vertices (an example of mapping grid labels to mesh vertices is provided in the evaluation helpers).
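The per-class IoU defined above can be accumulated with a confusion matrix over mesh vertices. A minimal sketch in NumPy (the helper names and the 3-class toy example are illustrative, not part of the benchmark code):

```python
import numpy as np

def confusion_matrix(gt, pred, num_classes):
    """Accumulate a num_classes x num_classes confusion matrix
    from per-vertex ground-truth and predicted labels."""
    mask = (gt >= 0) & (gt < num_classes)  # skip unlabeled vertices
    return np.bincount(
        num_classes * gt[mask] + pred[mask],
        minlength=num_classes ** 2,
    ).reshape(num_classes, num_classes)

def per_class_iou(conf):
    """IoU = TP / (TP + FP + FN) per class; NaN where a class is absent."""
    tp = np.diag(conf)
    fp = conf.sum(axis=0) - tp  # predicted as class c but wrong
    fn = conf.sum(axis=1) - tp  # class c missed by the prediction
    denom = tp + fp + fn
    return np.where(denom > 0, tp / np.maximum(denom, 1), np.nan)

# Toy example with 3 classes and 5 vertices.
gt = np.array([0, 0, 1, 1, 2])
pred = np.array([0, 1, 1, 1, 2])
conf = confusion_matrix(gt, pred, 3)
iou = per_class_iou(conf)
avg_iou = np.nanmean(iou)  # the "avg iou" column averages over classes
```

The benchmark's "avg iou" column is the mean of the 20 per-class IoU values computed this way.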
The table below lists the benchmark results for the 3D semantic label with limited reconstructions scenario. Each cell shows the per-class IoU followed by the method's rank for that class in parentheses.
| Method | Info | avg IoU | bathtub | bed | bookshelf | cabinet | chair | counter | curtain | desk | door | floor | otherfurniture | picture | refrigerator | shower curtain | sink | sofa | table | toilet | wall | window |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| WS3D_LR_Sem | Kangcheng Liu: WS3D: Weakly Supervised 3D Scene Segmentation with Region-Level Boundary Awareness and Instance Discrimination. ECCV 2022 | 0.685 (1) | 0.871 (1) | 0.769 (1) | 0.779 (1) | 0.647 (1) | 0.806 (1) | 0.453 (1) | 0.802 (1) | 0.577 (1) | 0.588 (1) | 0.945 (1) | 0.460 (1) | 0.223 (1) | 0.539 (1) | 0.793 (1) | 0.732 (1) | 0.766 (1) | 0.614 (1) | 0.904 (1) | 0.823 (1) | 0.604 (1) |
| DE-3DLearner LR | Ping-Chung Yu, Cheng Sun, Min Sun: Data Efficient 3D Learner via Knowledge Transferred from 2D Model. ECCV 2022 | 0.263 (4) | 0.000 (2) | 0.547 (2) | 0.235 (8) | 0.184 (5) | 0.566 (3) | 0.165 (5) | 0.249 (6) | 0.196 (4) | 0.309 (2) | 0.938 (2) | 0.070 (8) | 0.186 (3) | 0.069 (7) | 0.000 (2) | 0.000 (8) | 0.368 (6) | 0.356 (2) | 0.000 (3) | 0.698 (4) | 0.118 (4) |
| CSC_LR_SEM | | 0.270 (3) | 0.000 (2) | 0.528 (3) | 0.331 (5) | 0.139 (7) | 0.535 (7) | 0.118 (8) | 0.326 (4) | 0.222 (3) | 0.292 (3) | 0.921 (4) | 0.089 (6) | 0.163 (4) | 0.129 (2) | 0.000 (2) | 0.131 (3) | 0.463 (3) | 0.278 (8) | 0.000 (3) | 0.699 (3) | 0.033 (7) |
| NWSYY | | 0.286 (2) | 0.000 (2) | 0.515 (4) | 0.322 (6) | 0.247 (2) | 0.618 (2) | 0.219 (3) | 0.304 (5) | 0.174 (8) | 0.268 (4) | 0.926 (3) | 0.117 (2) | 0.162 (5) | 0.065 (8) | 0.000 (2) | 0.079 (5) | 0.514 (2) | 0.345 (4) | 0.028 (2) | 0.701 (2) | 0.125 (2) |
| Viewpoint_BN_LR_AIR | | 0.256 (6) | 0.000 (2) | 0.479 (5) | 0.377 (2) | 0.204 (4) | 0.551 (5) | 0.205 (4) | 0.219 (7) | 0.235 (2) | 0.224 (7) | 0.903 (7) | 0.092 (4) | 0.088 (6) | 0.122 (3) | 0.000 (2) | 0.003 (7) | 0.354 (7) | 0.354 (3) | 0.000 (3) | 0.676 (7) | 0.034 (6) |
| Scratch_LR_SEM | | 0.251 (8) | 0.000 (2) | 0.457 (6) | 0.238 (7) | 0.205 (3) | 0.528 (8) | 0.123 (7) | 0.419 (2) | 0.195 (5) | 0.246 (6) | 0.905 (6) | 0.086 (7) | 0.048 (8) | 0.103 (4) | 0.000 (2) | 0.132 (2) | 0.331 (8) | 0.308 (6) | 0.000 (3) | 0.675 (8) | 0.015 (8) |
| PointContrast_LR_SEM | | 0.253 (7) | 0.000 (2) | 0.412 (7) | 0.347 (4) | 0.137 (8) | 0.564 (4) | 0.140 (6) | 0.361 (3) | 0.187 (6) | 0.249 (5) | 0.914 (5) | 0.092 (4) | 0.055 (7) | 0.102 (5) | 0.000 (2) | 0.048 (6) | 0.392 (5) | 0.302 (7) | 0.000 (3) | 0.697 (5) | 0.056 (5) |
| CSG_3DSegNet | | 0.258 (5) | 0.000 (2) | 0.335 (8) | 0.368 (3) | 0.169 (6) | 0.549 (6) | 0.229 (2) | 0.158 (8) | 0.182 (7) | 0.208 (8) | 0.898 (8) | 0.105 (3) | 0.190 (2) | 0.093 (6) | 0.000 (2) | 0.093 (4) | 0.448 (4) | 0.342 (5) | 0.000 (3) | 0.679 (6) | 0.119 (3) |
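Since the benchmark scores predictions per mesh vertex, methods that predict on point clouds or grids must first transfer their labels to the vertices. A minimal nearest-neighbor sketch (the function name and brute-force search are illustrative assumptions; the benchmark's own evaluation helpers show a grid-to-vertex example, and a KD-tree is advisable for full-size scans):

```python
import numpy as np

def labels_to_vertices(point_xyz, point_labels, vertex_xyz):
    """Assign each mesh vertex the label of its nearest predicted point.

    point_xyz:    (N, 3) coordinates where the method predicted labels
    point_labels: (N,)   predicted class ids
    vertex_xyz:   (M, 3) mesh vertex coordinates
    """
    # Brute-force squared distances, (M, N); fine for small examples,
    # swap in a KD-tree for full ScanNet-scale meshes.
    d2 = ((vertex_xyz[:, None, :] - point_xyz[None, :, :]) ** 2).sum(axis=-1)
    return point_labels[d2.argmin(axis=1)]

# Toy usage: two predicted points, two nearby mesh vertices.
points = np.array([[0.0, 0.0, 0.0], [1.0, 1.0, 1.0]])
labels = np.array([5, 7])
vertices = np.array([[0.1, 0.0, 0.0], [0.9, 1.0, 1.0]])
vertex_labels = labels_to_vertices(points, labels, vertices)
```

The resulting per-vertex labels are what the IoU evaluation above consumes.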