THOM: Generating Physically Plausible Hand-Object Meshes From Text

1UNIST, 2University of Birmingham, 3POSTECH

THOM is a novel text-to-3D HOI generation pipeline that produces photorealistic results and generalizes to diverse objects, hands, and interactions.

Abstract

The generation of 3D hand-object interactions (HOIs) from text is crucial for dexterous robotic grasping and VR/AR content generation, requiring both high visual fidelity and physical plausibility. However, mesh extraction from text-generated Gaussians is an ill-posed problem, and physics-based optimization on the resulting erroneous meshes poses further challenges. To address these issues, we introduce THOM, a training-free framework that generates photorealistic, physically plausible 3D HOI meshes without the need for a template object mesh. THOM employs a two-stage pipeline: it first generates the hand and object Gaussians, then performs physics-based HOI optimization. Our new mesh extraction method and vertex-to-Gaussian mapping explicitly assign Gaussian elements to mesh vertices, allowing topology-aware regularization. Furthermore, we improve the physical plausibility of interactions through VLM-guided translation refinement and contact-aware optimization. Comprehensive experiments demonstrate that THOM consistently surpasses state-of-the-art methods in terms of text alignment, visual realism, and interaction plausibility.
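The vertex-to-Gaussian mapping mentioned above can be illustrated with a minimal nearest-neighbor sketch: each mesh vertex is assigned the index of its closest Gaussian center, so per-Gaussian attributes can then be regularized using the mesh's topology. The function name and the nearest-center criterion are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def map_vertices_to_gaussians(vertices, gaussian_centers):
    """Assign each mesh vertex to its nearest Gaussian center.

    vertices: (V, 3) mesh vertex positions.
    gaussian_centers: (G, 3) Gaussian means.
    Returns a (V,) index array mapping each vertex to one Gaussian,
    enabling topology-aware regularization of Gaussian attributes.
    """
    # (V, 1, 3) - (1, G, 3) broadcasts to (V, G, 3) pairwise differences
    diff = vertices[:, None, :] - gaussian_centers[None, :, :]
    dist2 = np.einsum("vgk,vgk->vg", diff, diff)  # (V, G) squared distances
    return dist2.argmin(axis=1)

# Toy example: 3 vertices, 2 Gaussian centers
verts = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0], [0.9, 0.1, 0.0]])
centers = np.array([[0.0, 0.0, 0.1], [1.0, 0.0, 0.1]])
mapping = map_vertices_to_gaussians(verts, centers)  # -> [0, 1, 1]
```

For large vertex and Gaussian counts, a k-d tree (e.g. `scipy.spatial.cKDTree`) would avoid the quadratic distance matrix; the brute-force version above keeps the idea explicit.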

Framework

Framework Figure

The THOM framework adopts a two-stage pipeline for generating realistic hand-object interactions. In the first stage, object and hand meshes are independently generated with high visual realism. In the second stage, we jointly optimize their interaction parameters using physics-based regularization losses to ensure plausible contacts and minimal penetration.
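The second-stage objectives can be sketched with two toy regularizers: a penetration term that penalizes hand points lying inside the object, and a contact term that pulls near-surface points onto the object. The signed-distance formulation and the threshold value here are illustrative assumptions, not the paper's exact losses.

```python
import numpy as np

def interaction_losses(hand_pts, object_sdf, contact_band=0.05):
    """Toy physics-based regularizers for hand-object optimization.

    hand_pts: (N, 3) sampled hand surface points.
    object_sdf: callable mapping (N, 3) points to (N,) signed distances
                to the object surface (negative inside). Both losses are
                illustrative, not the paper's exact formulation.
    """
    d = object_sdf(hand_pts)                    # (N,) signed distances
    penetration = np.clip(-d, 0.0, None).sum()  # penalize points inside
    near = d[(d > 0) & (d < contact_band)]      # candidate contact points
    attraction = near.sum() if near.size else 0.0  # pull them to the surface
    return penetration, attraction

# Example object: unit sphere centered at the origin
sphere_sdf = lambda p: np.linalg.norm(p, axis=-1) - 1.0
pts = np.array([[0.9, 0.0, 0.0],    # inside: contributes to penetration
                [1.02, 0.0, 0.0]])  # just outside: contributes to attraction
pen, att = interaction_losses(pts, sphere_sdf)  # pen ~= 0.1, att ~= 0.02
```

In practice both terms would be weighted and minimized jointly with the interaction parameters (e.g. hand pose and relative translation) via gradient descent; the sketch only shows how the two penalties separate interior and near-surface points.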

Comparison with Text-To-3D Methods

Comparison Figure

Qualitative comparison of our method with Text-to-3D generation methods.

Comparison with Text-To-HOI Methods

HOI Comparison Figure

Qualitative comparison of our method with Text-to-HOI generation methods.

BibTeX

@article{
        Coming soon
}