The 3D semantic labeling task involves predicting a semantic label for each vertex of a 3D scan mesh.

Evaluation and metrics

Our evaluation ranks all methods according to the PASCAL VOC intersection-over-union metric (IoU): IoU = TP/(TP+FP+FN), where TP, FP, and FN are the numbers of true positive, false positive, and false negative vertices, respectively. Predicted labels are evaluated per vertex over the respective 3D scan mesh; 3D approaches that operate on other representations, such as voxel grids or point clouds, should map their predicted labels onto the mesh vertices (an example mapping from grid to mesh vertices is provided in the evaluation helpers).
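
For reference, the snippet below is a minimal sketch of how per-class IoU and a simple nearest-neighbor label transfer onto mesh vertices could be computed. It assumes NumPy/SciPy, per-vertex integer label arrays, a 20-class label set, and an ignore label of -1; the function names and constants are illustrative and this is not the benchmark's official evaluation code.

```python
import numpy as np
from scipy.spatial import cKDTree

NUM_CLASSES = 20      # the 20 benchmark classes (wall, floor, ..., otherfurniture)
IGNORE_LABEL = -1     # vertices without a valid ground-truth annotation are skipped


def per_class_iou(gt, pred, num_classes=NUM_CLASSES, ignore_label=IGNORE_LABEL):
    """Per-class IoU = TP / (TP + FP + FN), computed over mesh vertices."""
    gt, pred = np.asarray(gt), np.asarray(pred)
    valid = gt != ignore_label
    gt, pred = gt[valid], pred[valid]
    ious = np.full(num_classes, np.nan)  # NaN for classes absent from this scan
    for c in range(num_classes):
        tp = np.sum((gt == c) & (pred == c))
        fp = np.sum((gt != c) & (pred == c))
        fn = np.sum((gt == c) & (pred != c))
        denom = tp + fp + fn
        if denom > 0:
            ious[c] = tp / denom
    return ious


def map_to_mesh_vertices(source_coords, source_labels, mesh_vertices):
    """Transfer labels predicted on points or voxel centers to mesh vertices
    by nearest-neighbor lookup (one simple way to realize the required mapping)."""
    _, nearest = cKDTree(source_coords).query(mesh_vertices)
    return np.asarray(source_labels)[nearest]


# Example: average IoU over the classes present in the ground truth
# ious = per_class_iou(gt_labels, pred_labels)
# avg_iou = np.nanmean(ious)
```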



This table lists the benchmark results for the 3D semantic label with limited reconstructions scenario. Each cell shows the per-class IoU with the method's rank for that class in parentheses; rows are sorted by the wall column.




Method | avg IoU | bathtub | bed | bookshelf | cabinet | chair | counter | curtain | desk | door | floor | otherfurniture | picture | refrigerator | shower curtain | sink | sofa | table | toilet | wall | window
WS3D_LR_Sem | 0.685 (1) | 0.871 (1) | 0.769 (1) | 0.779 (1) | 0.647 (1) | 0.806 (1) | 0.453 (1) | 0.802 (1) | 0.577 (1) | 0.588 (1) | 0.945 (1) | 0.460 (1) | 0.223 (1) | 0.539 (1) | 0.793 (1) | 0.732 (1) | 0.766 (1) | 0.614 (1) | 0.904 (1) | 0.823 (1) | 0.604 (1)
NWSYY | 0.286 (2) | 0.000 (2) | 0.515 (4) | 0.322 (6) | 0.247 (2) | 0.618 (2) | 0.219 (3) | 0.304 (5) | 0.174 (8) | 0.268 (4) | 0.926 (3) | 0.117 (2) | 0.162 (5) | 0.065 (8) | 0.000 (2) | 0.079 (5) | 0.514 (2) | 0.345 (4) | 0.028 (2) | 0.701 (2) | 0.125 (2)
CSC_LR_SEM | 0.270 (3) | 0.000 (2) | 0.528 (3) | 0.331 (5) | 0.139 (7) | 0.535 (7) | 0.118 (8) | 0.326 (4) | 0.222 (3) | 0.292 (3) | 0.921 (4) | 0.089 (6) | 0.163 (4) | 0.129 (2) | 0.000 (2) | 0.131 (3) | 0.463 (3) | 0.278 (8) | 0.000 (3) | 0.699 (3) | 0.033 (7)
DE-3DLearner LR | 0.263 (4) | 0.000 (2) | 0.547 (2) | 0.235 (8) | 0.184 (5) | 0.566 (3) | 0.165 (5) | 0.249 (6) | 0.196 (4) | 0.309 (2) | 0.938 (2) | 0.070 (8) | 0.186 (3) | 0.069 (7) | 0.000 (2) | 0.000 (8) | 0.368 (6) | 0.356 (2) | 0.000 (3) | 0.698 (4) | 0.118 (4)
PointContrast_LR_SEM | 0.253 (7) | 0.000 (2) | 0.412 (7) | 0.347 (4) | 0.137 (8) | 0.564 (4) | 0.140 (6) | 0.361 (3) | 0.187 (6) | 0.249 (5) | 0.914 (5) | 0.092 (4) | 0.055 (7) | 0.102 (5) | 0.000 (2) | 0.048 (6) | 0.392 (5) | 0.302 (7) | 0.000 (3) | 0.697 (5) | 0.056 (5)
CSG_3DSegNet | 0.258 (5) | 0.000 (2) | 0.335 (8) | 0.368 (3) | 0.169 (6) | 0.549 (6) | 0.229 (2) | 0.158 (8) | 0.182 (7) | 0.208 (8) | 0.898 (8) | 0.105 (3) | 0.190 (2) | 0.093 (6) | 0.000 (2) | 0.093 (4) | 0.448 (4) | 0.342 (5) | 0.000 (3) | 0.679 (6) | 0.119 (3)
Viewpoint_BN_LR_AIR | 0.256 (6) | 0.000 (2) | 0.479 (5) | 0.377 (2) | 0.204 (4) | 0.551 (5) | 0.205 (4) | 0.219 (7) | 0.235 (2) | 0.224 (7) | 0.903 (7) | 0.092 (4) | 0.088 (6) | 0.122 (3) | 0.000 (2) | 0.003 (7) | 0.354 (7) | 0.354 (3) | 0.000 (3) | 0.676 (7) | 0.034 (6)
Scratch_LR_SEM | 0.251 (8) | 0.000 (2) | 0.457 (6) | 0.238 (7) | 0.205 (3) | 0.528 (8) | 0.123 (7) | 0.419 (2) | 0.195 (5) | 0.246 (6) | 0.905 (6) | 0.086 (7) | 0.048 (8) | 0.103 (4) | 0.000 (2) | 0.132 (2) | 0.331 (8) | 0.308 (6) | 0.000 (3) | 0.675 (8) | 0.015 (8)

References:
WS3D_LR_Sem: Kangcheng Liu. WS3D: Weakly Supervised 3D Scene Segmentation with Region-Level Boundary Awareness and Instance Discrimination. European Conference on Computer Vision (ECCV), 2022.
DE-3DLearner LR: Ping-Chung Yu, Cheng Sun, Min Sun. Data Efficient 3D Learner via Knowledge Transferred from 2D Model. European Conference on Computer Vision (ECCV), 2022.