Submitted by Aljaz Bozic.

Submission data

Full name: DynamicFusion: Reconstruction and Tracking of Non-rigid Scenes in Real-Time
Description: We present the first dense SLAM system capable of reconstructing non-rigidly deforming scenes in real time by fusing together RGB-D scans captured from commodity sensors. Our DynamicFusion approach reconstructs scene geometry while simultaneously estimating a dense volumetric 6D motion field that warps the estimated geometry into the live frame. Like KinectFusion, our system produces increasingly denoised, detailed, and complete reconstructions as more measurements are fused, and displays the updated model in real time. Because we do not require a template or other prior scene model, the approach is applicable to a wide range of moving objects and scenes.
Publication title: DynamicFusion: Reconstruction and Tracking of Non-rigid Scenes in Real-Time
Publication authors: Richard Newcombe, Dieter Fox, Steve Seitz
Publication venue: CVPR 2015
Publication URL: https://grail.cs.washington.edu/projects/dynamicfusion/papers/DynamicFusion.pdf
Input data types: Uses color input
Programming language(s): C++ with CUDA
Hardware: GeForce GTX 1080 Ti
Website: https://grail.cs.washington.edu/projects/dynamicfusion/
Submission creation date: 17 Mar, 2020
Last edited: 20 Mar, 2020

Non-rigid Reconstruction

Method          Geometry error (cm)   Deformation error (cm)
DynamicFusion   1.078                 6.179