The 3D semantic labeling task involves predicting a semantic label for every vertex of a 3D scan mesh.

Evaluation and metrics

Our evaluation ranks all methods according to the PASCAL VOC intersection-over-union metric (IoU). IoU = TP/(TP+FP+FN), where TP, FP, and FN are the numbers of true positive, false positive, and false negative vertices, respectively. Predicted labels are evaluated per-vertex over the respective 3D scan mesh; for 3D approaches that operate on other representations like grids or points, the predicted labels should be mapped onto the mesh vertices (e.g., one such example for grid to mesh vertices is provided in the evaluation helpers).
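To make the metric concrete, the sketch below computes per-class IoU from per-vertex labels and shows one possible way to transfer predictions made on points or voxel centers onto mesh vertices via nearest-neighbor lookup. The function names and the nearest-neighbor transfer are illustrative assumptions, not the official evaluation helpers.

```python
import numpy as np
from scipy.spatial import cKDTree

def per_class_iou(pred, gt, num_classes):
    """IoU = TP / (TP + FP + FN) per class, computed over mesh vertices.
    Vertices without a valid ground-truth label should be filtered out first."""
    ious = np.full(num_classes, np.nan)
    for c in range(num_classes):
        tp = np.sum((pred == c) & (gt == c))
        fp = np.sum((pred == c) & (gt != c))
        fn = np.sum((pred != c) & (gt == c))
        denom = tp + fp + fn
        if denom > 0:
            ious[c] = tp / denom
    return ious  # the benchmark ranks by the mean over the evaluated classes

def labels_to_mesh_vertices(sample_xyz, sample_labels, vertex_xyz):
    """Map labels predicted on points or voxel centers onto mesh vertices by
    assigning each vertex the label of its nearest predicted sample."""
    _, nearest = cKDTree(sample_xyz).query(vertex_xyz, k=1)
    return sample_labels[nearest]
```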



The table below lists the benchmark results for the 3D semantic label with limited reconstructions scenario.




Method | avg IoU | bathtub | bed | bookshelf | cabinet | chair | counter | curtain | desk | door | floor | otherfurniture | picture | refrigerator | shower curtain | sink | sofa | table | toilet | wall | window
Each cell shows the IoU value followed by the method's rank in that column (in parentheses).
WS3D_LR_Sem | 0.682 (1) | 0.864 (2) | 0.766 (4) | 0.785 (1) | 0.645 (1) | 0.805 (7) | 0.443 (2) | 0.793 (1) | 0.600 (1) | 0.588 (1) | 0.944 (7) | 0.457 (1) | 0.222 (1) | 0.537 (3) | 0.773 (2) | 0.724 (1) | 0.755 (2) | 0.622 (1) | 0.907 (2) | 0.821 (3) | 0.597 (3)
Kangcheng Liu: WS3D: Weakly Supervised 3D Scene Segmentation with Region-Level Boundary Awareness and Instance Discrimination. European Conference on Computer Vision (ECCV), 2022
NWSYY | 0.670 (2) | 0.745 (4) | 0.784 (2) | 0.780 (3) | 0.611 (3) | 0.829 (5) | 0.463 (1) | 0.692 (2) | 0.585 (3) | 0.542 (3) | 0.948 (2) | 0.448 (2) | 0.219 (2) | 0.554 (2) | 0.828 (1) | 0.702 (2) | 0.758 (1) | 0.573 (3) | 0.844 (7) | 0.825 (2) | 0.674 (1)
CSG_3DSegNet | 0.594 (8) | 0.732 (8) | 0.755 (5) | 0.697 (7) | 0.548 (5) | 0.790 (8) | 0.363 (7) | 0.432 (8) | 0.576 (4) | 0.448 (8) | 0.933 (8) | 0.414 (3) | 0.168 (5) | 0.486 (7) | 0.621 (6) | 0.574 (6) | 0.727 (4) | 0.539 (6) | 0.807 (8) | 0.779 (8) | 0.498 (6)
DE-3DLearner LR | 0.663 (3) | 0.851 (3) | 0.770 (3) | 0.760 (4) | 0.615 (2) | 0.830 (4) | 0.439 (3) | 0.670 (3) | 0.546 (5) | 0.587 (2) | 0.955 (1) | 0.406 (4) | 0.177 (4) | 0.627 (1) | 0.758 (3) | 0.606 (4) | 0.740 (3) | 0.549 (4) | 0.888 (3) | 0.840 (1) | 0.652 (2)
Ping-Chung Yu, Cheng Sun, Min Sun: Data Efficient 3D Learner via Knowledge Transferred from 2D Model. ECCV 2022
PointContrast_LR_SEM | 0.603 (6) | 0.740 (6) | 0.700 (8) | 0.700 (6) | 0.546 (6) | 0.843 (2) | 0.419 (4) | 0.592 (6) | 0.462 (7) | 0.513 (5) | 0.946 (4) | 0.374 (5) | 0.104 (8) | 0.530 (5) | 0.687 (4) | 0.571 (8) | 0.694 (7) | 0.519 (7) | 0.850 (4) | 0.781 (7) | 0.484 (8)
Scratch_LR_SEM | 0.596 (7) | 0.745 (4) | 0.722 (7) | 0.783 (2) | 0.486 (8) | 0.834 (3) | 0.414 (5) | 0.667 (4) | 0.398 (8) | 0.492 (7) | 0.948 (2) | 0.359 (6) | 0.124 (7) | 0.406 (8) | 0.636 (5) | 0.574 (6) | 0.708 (6) | 0.503 (8) | 0.849 (6) | 0.784 (6) | 0.490 (7)
CSC_LR_SEM | 0.612 (5) | 0.739 (7) | 0.794 (1) | 0.687 (8) | 0.564 (4) | 0.850 (1) | 0.347 (8) | 0.590 (7) | 0.587 (2) | 0.521 (4) | 0.945 (6) | 0.358 (7) | 0.140 (6) | 0.522 (6) | 0.496 (8) | 0.627 (3) | 0.725 (5) | 0.598 (2) | 0.850 (4) | 0.792 (4) | 0.508 (5)
Viewpoint_BN_LR_AIR | 0.625 (4) | 0.873 (1) | 0.727 (6) | 0.709 (5) | 0.535 (7) | 0.820 (6) | 0.402 (6) | 0.643 (5) | 0.540 (6) | 0.501 (6) | 0.946 (4) | 0.352 (8) | 0.181 (3) | 0.535 (4) | 0.594 (7) | 0.596 (5) | 0.685 (8) | 0.543 (5) | 0.927 (1) | 0.792 (4) | 0.592 (4)
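For reference, the avg IoU column used for ranking is consistent with the unweighted mean of the 20 class IoUs in each row; a quick check against the first row (the numeric literals are copied from the table above):

```python
# Class IoUs for WS3D_LR_Sem, in the column order of the table above.
class_ious = [0.864, 0.766, 0.785, 0.645, 0.805, 0.443, 0.793, 0.600, 0.588, 0.944,
              0.457, 0.222, 0.537, 0.773, 0.724, 0.755, 0.622, 0.907, 0.821, 0.597]

avg_iou = sum(class_ious) / len(class_ious)
print(round(avg_iou, 3))  # 0.682, matching the reported avg IoU for WS3D_LR_Sem
```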