The 3D semantic labeling task involves predicting a semantic label for every vertex of a 3D scan mesh.

Evaluation and metrics

Our evaluation ranks all methods according to the PASCAL VOC intersection-over-union metric (IoU): IoU = TP/(TP+FP+FN), where TP, FP, and FN are the numbers of true positive, false positive, and false negative vertices, respectively. Predicted labels are evaluated per-vertex over the respective 3D scan mesh; for 3D approaches that operate on other representations such as grids or points, the predicted labels should be mapped onto the mesh vertices (the evaluation helpers include one such example, mapping grid labels to mesh vertices).
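The per-class IoU computation can be sketched in a few lines. The snippet below is an illustrative sketch, not the official evaluation script; the toy labels and the brute-force nearest-neighbor mapping helper (a stand-in for the grid-to-mesh example in the evaluation helpers) are assumptions for illustration.

```python
import numpy as np

def per_class_iou(pred, gt, num_classes):
    """IoU = TP / (TP + FP + FN), computed per class over vertex labels."""
    ious = []
    for c in range(num_classes):
        tp = int(np.sum((pred == c) & (gt == c)))  # true positives for class c
        fp = int(np.sum((pred == c) & (gt != c)))  # false positives
        fn = int(np.sum((pred != c) & (gt == c)))  # false negatives
        denom = tp + fp + fn
        ious.append(tp / denom if denom > 0 else float("nan"))
    return ious

def labels_to_vertices(vertex_xyz, point_xyz, point_labels):
    """Map labels predicted on a point set onto mesh vertices by nearest
    neighbor (hypothetical helper; for real scans use a KD-tree instead of
    this O(V*P) brute force)."""
    dists = np.linalg.norm(vertex_xyz[:, None, :] - point_xyz[None, :, :], axis=-1)
    return point_labels[np.argmin(dists, axis=1)]

# Toy example: 6 vertices, 2 classes.
gt   = np.array([0, 0, 1, 1, 1, 0])
pred = np.array([0, 1, 1, 1, 0, 0])
ious = per_class_iou(pred, gt, num_classes=2)
# Each class here has TP=2, FP=1, FN=1, so IoU = 2/4 = 0.5.
```

The benchmark's average IoU is then the mean of the per-class values (ignoring classes absent from both prediction and ground truth).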



This table lists the benchmark results for the 3D semantic labeling with limited reconstructions scenario.




Methods are sorted by average IoU; the remaining columns give the per-class IoU.

| Method | avg IoU | bathtub | bed | bookshelf | cabinet | chair | counter | curtain | desk | door | floor | otherfurniture | picture | refrigerator | shower curtain | sink | sofa | table | toilet | wall | window |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| WS3D_LR_Sem | 0.682 | 0.863 | 0.765 | 0.782 | 0.648 | 0.803 | 0.438 | 0.793 | 0.607 | 0.589 | 0.944 | 0.455 | 0.223 | 0.536 | 0.768 | 0.726 | 0.758 | 0.623 | 0.906 | 0.821 | 0.596 |
| NWSYY | 0.678 | 0.779 | 0.782 | 0.774 | 0.637 | 0.827 | 0.491 | 0.736 | 0.597 | 0.561 | 0.947 | 0.438 | 0.206 | 0.610 | 0.758 | 0.667 | 0.773 | 0.594 | 0.880 | 0.824 | 0.673 |
| DE-3DLearner LR | 0.608 | 0.853 | 0.689 | 0.593 | 0.483 | 0.830 | 0.466 | 0.652 | 0.528 | 0.482 | 0.954 | 0.288 | 0.250 | 0.448 | 0.595 | 0.532 | 0.748 | 0.503 | 0.822 | 0.806 | 0.647 |
| CSC_LR_SEM | 0.575 | 0.671 | 0.740 | 0.727 | 0.445 | 0.847 | 0.380 | 0.602 | 0.512 | 0.447 | 0.942 | 0.291 | 0.184 | 0.353 | 0.468 | 0.508 | 0.745 | 0.602 | 0.855 | 0.765 | 0.420 |
| CSG_3DSegNet | 0.570 | 0.717 | 0.730 | 0.697 | 0.521 | 0.823 | 0.377 | 0.419 | 0.531 | 0.452 | 0.935 | 0.316 | 0.147 | 0.359 | 0.551 | 0.551 | 0.692 | 0.513 | 0.797 | 0.764 | 0.508 |
| Viewpoint_BN_LR_AIR | 0.566 | 0.780 | 0.659 | 0.677 | 0.484 | 0.799 | 0.419 | 0.636 | 0.480 | 0.432 | 0.940 | 0.238 | 0.124 | 0.396 | 0.609 | 0.432 | 0.735 | 0.527 | 0.787 | 0.752 | 0.423 |
| PointContrast_LR_SEM | 0.555 | 0.711 | 0.668 | 0.622 | 0.425 | 0.830 | 0.433 | 0.552 | 0.273 | 0.440 | 0.938 | 0.287 | 0.096 | 0.470 | 0.576 | 0.612 | 0.687 | 0.438 | 0.781 | 0.785 | 0.474 |
| Scratch_LR_SEM | 0.531 | 0.750 | 0.666 | 0.553 | 0.409 | 0.816 | 0.387 | 0.487 | 0.285 | 0.368 | 0.938 | 0.310 | 0.074 | 0.388 | 0.564 | 0.468 | 0.698 | 0.448 | 0.804 | 0.761 | 0.454 |

Method references:

DE-3DLearner LR — Ping-Chung Yu, Cheng Sun, Min Sun: Data Efficient 3D Learner via Knowledge Transferred from 2D Model. ECCV 2022.
WS3D_LR_Sem — Kangcheng Liu: WS3D: Weakly Supervised 3D Scene Segmentation with Region-Level Boundary Awareness and Instance Discrimination. ECCV 2022.