DetailGen3D: Generative 3D Geometry Enhancement via Data-Dependent Flow

arXiv 2024

Abstract

Modern 3D generation methods can rapidly create shapes from sparse or single views, but their outputs often lack geometric detail due to computational constraints. We present DetailGen3D, a generative approach specifically designed to enhance these generated 3D shapes. Our key insight is to model the coarse-to-fine transformation directly through data-dependent flows in latent space, avoiding the computational overhead of large-scale 3D generative models. We introduce a token matching strategy that ensures accurate spatial correspondence during refinement, enabling local detail synthesis while preserving global structure. By carefully designing our training data to match the characteristics of synthesized coarse shapes, our method can effectively enhance shapes produced by various 3D generation and reconstruction approaches, from single-view to sparse multi-view inputs. Extensive experiments demonstrate that DetailGen3D achieves high-fidelity geometric detail synthesis while maintaining training efficiency.
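
As a rough, hypothetical illustration of the data-dependent flow idea (this is not the authors' released code; the module names vae and dit, the dit call signature, and the linear interpolation path are assumptions), a flow-matching training step that transports coarse-shape latents toward the matching fine-shape latents might look like the following PyTorch-style sketch:

import torch
import torch.nn.functional as F

def flow_matching_step(vae, dit, coarse_pts, fine_pts, image_feat, optimizer):
    """One hypothetical training step: learn a velocity field that moves
    coarse-shape latent tokens toward the matching fine-shape latent tokens."""
    with torch.no_grad():
        z_coarse = vae.encode(coarse_pts)  # latent tokens of the coarse geometry
        z_fine = vae.encode(fine_pts)      # latent tokens of the ground-truth fine geometry

    # Sample a time t and linearly interpolate between the two latents
    # (a rectified-flow-style path; the exact schedule is an assumption).
    t = torch.rand(z_coarse.shape[0], 1, 1, device=z_coarse.device)
    z_t = (1.0 - t) * z_coarse + t * z_fine
    target_velocity = z_fine - z_coarse

    # The DiT predicts the velocity, conditioned on image features of the prompt.
    pred_velocity = dit(z_t, t.squeeze(-1).squeeze(-1), cond=image_feat)
    loss = F.mse_loss(pred_velocity, target_velocity)

    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()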

Method Overview

(1) Inference pipeline. The FPS-VAE encoder extracts tokens from the generated or reconstructed coarse geometry; these coarse tokens, together with DINO features of the image prompt, are fed into the DiT. After denoising, the predicted tokens are decoded by the FPS-VAE decoder to obtain the refined geometry. Inference takes only a few seconds. (2) Training data. As coarse geometry, we use LRM reconstructions produced from multi-view images rendered from the fine geometry. (3) Token matching. The left side illustrates the token matching process. On the right, the top row is decoded from only a subset of query points located in the first quadrant, while the bottom row uses the full set of query points; this shows that each token represents the space around its corresponding query points.
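
A minimal sketch of the inference pipeline above, assuming placeholder modules fps_vae, dino, and dit and a simple Euler integration of the learned flow (the actual implementation may differ):

import torch

@torch.no_grad()
def refine_coarse_shape(fps_vae, dino, dit, coarse_pts, prompt_image, num_steps=25):
    """Refine a coarse generated/reconstructed shape into detailed geometry."""
    # 1) Encode the coarse geometry into latent tokens with the FPS-VAE encoder.
    z = fps_vae.encode(coarse_pts)

    # 2) Extract DINO features from the image prompt as conditioning.
    image_feat = dino(prompt_image)

    # 3) Integrate the learned flow (here: simple Euler steps) so the coarse
    #    tokens are transported toward fine-detail tokens; token matching keeps
    #    each token tied to the same spatial query region throughout.
    dt = 1.0 / num_steps
    for i in range(num_steps):
        t = torch.full((z.shape[0],), i * dt, device=z.device)
        velocity = dit(z, t, cond=image_feat)
        z = z + dt * velocity

    # 4) Decode the refined tokens back to geometry with the FPS-VAE decoder.
    return fps_vae.decode(z)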

Refinement Results

[Gallery: for each example, the image prompt, normal map, coarse mesh, and refined fine mesh are shown, followed by more results.]

BibTeX

@misc{deng2024detailgen3dgenerative3dgeometry,
  title={DetailGen3D: Generative 3D Geometry Enhancement via Data-Dependent Flow},
  author={Ken Deng and Yuanchen Guo and Jingxiang Sun and Zixin Zou and Yangguang Li and Xin Cai and Yanpei Cao and Yebin Liu and Ding Liang},
  year={2024},
  eprint={2411.16820},
  archivePrefix={arXiv},
  primaryClass={cs.CV},
  url={https://arxiv.org/abs/2411.16820},
}