
TRACE: 5D Temporal Regression of Avatars With Dynamic Cameras in 3D Environments

2023

Conference Paper



Although the estimation of 3D human pose and shape (HPS) is rapidly progressing, current methods still cannot reliably estimate moving humans in global coordinates, which is critical for many applications. This is particularly challenging when the camera is also moving, entangling human and camera motion. To address these issues, we adopt a novel 5D representation (space, time, and identity) that enables end-to-end reasoning about people in scenes. Our method, called TRACE, introduces several novel architectural components. Most importantly, it uses two new "maps" to reason about the 3D trajectory of people over time in camera and world coordinates. An additional memory unit enables persistent tracking of people even during long occlusions. TRACE is the first one-stage method to jointly recover and track 3D humans in global coordinates from dynamic cameras. By training end-to-end and using full image information, TRACE achieves state-of-the-art performance on tracking and HPS benchmarks. The code and dataset are released for research purposes.
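
The abstract emphasizes disentangling human and camera motion so that trajectories can be expressed in world rather than camera coordinates. The snippet below is a minimal illustrative sketch of that coordinate change, not the released TRACE code: the function camera_to_world, the camera rotation R_wc, the camera position c_w, and all numbers are assumptions made up for this example.

    import numpy as np

    def camera_to_world(t_cam, R_wc, c_w):
        """Map a 3D point given in camera coordinates to world coordinates."""
        return R_wc @ t_cam + c_w

    traj_world = []
    for t in range(5):
        # The person stays 3 m in front of the camera in the camera frame.
        t_cam = np.array([0.0, 0.0, 3.0])
        # Hypothetical camera motion: 5 degrees of yaw and 10 cm of translation per frame.
        yaw = np.deg2rad(5 * t)
        R_wc = np.array([[np.cos(yaw),  0.0, np.sin(yaw)],
                         [0.0,          1.0, 0.0],
                         [-np.sin(yaw), 0.0, np.cos(yaw)]])
        c_w = np.array([0.1 * t, 0.0, 0.0])
        traj_world.append(camera_to_world(t_cam, R_wc, c_w))

    # Even though the person is static in camera coordinates, the recovered
    # world-coordinate trajectory changes frame to frame because the camera moves.
    print(np.round(np.stack(traj_world), 3))

A person who is perfectly still in camera coordinates appears to move in world coordinates once the camera pans and translates; estimating both frames jointly, as TRACE's camera- and world-coordinate maps do, is what makes global trajectories recoverable from dynamic cameras.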

Author(s): Sun, Yu and Bao, Qian and Liu, Wu and Mei, Tao and Black, Michael J.
Book Title: IEEE/CVF Conf. on Computer Vision and Pattern Recognition (CVPR)
Pages: 8856--8866
Year: 2023
Month: June

Department(s): Perceiving Systems
Bibtex Type: Conference Paper (inproceedings)

Event Name: CVPR 2023
Event Place: Vancouver

Links: pdf
supp
code
video

BibTeX

@inproceedings{Sun:CVPR:2023,
  title = {{TRACE}: {5D} Temporal Regression of Avatars With Dynamic Cameras in {3D} Environments},
  author = {Sun, Yu and Bao, Qian and Liu, Wu and Mei, Tao and Black, Michael J.},
  booktitle = {IEEE/CVF Conf.~on Computer Vision and Pattern Recognition (CVPR)},
  pages = {8856--8866},
  month = jun,
  year = {2023},
  doi = {},
  month_numeric = {6}
}