

OpenGait is a flexible and extensible gait analysis project provided by the Shiqi Yu Group and supported in part by WATRIX.AI. The corresponding paper has been accepted by CVPR2023 as a highlight paper. The extension paper has been accepted to TPAMI2025.

What's New

  • [Sep 2025] BiggerGait has been accepted to NeurIPS2025🎉 and is available here. Checkpoints are available here.
  • [Jun 2025] Scoliosis1K-Pose has been accepted to MICCAI2025🎉. It extends ScoNet by introducing pose annotations and clinical priors for interpretable scoliosis screening. The dataset is available on the project homepage.
  • [Jun 2025] LidarGait++ has been accepted to CVPR2025🎉 and open-sourced in configs/lidargaitv2.
  • [Jun 2025] The extension paper of OpenGait, further strengthened by the advancements of DeepGaitV2, SkeletonGait, and SkeletonGait++, has been accepted for publication in TPAMI🎉. We sincerely acknowledge the valuable contributions and continuous support from the OpenGait community.
  • [Feb 2025] The diffusion-based DenoisingGait has been accepted to CVPR2025🎉 Congratulations to Dongyang! This is his SECOND paper!
  • [Feb 2025] Chao successfully defended his Ph.D. thesis in Oct. 2024🎉🎉🎉 You can access the full text in Chao's Thesis in English or 樊超的学位论文(中文版).
  • [Dec 2024] The multimodal MultiGait++ has been accepted to AAAI2025🎉 Congratulations to Dongyang! This is his FIRST paper!
  • [Jun 2024] The first large-scale gait-based scoliosis screening benchmark ScoNet is accepted to MICCAI2024🎉 Congratulations to Zirui! This is his FIRST paper! The code is released here, and you can refer to project homepage for details.
  • [May 2024] The code of the Large Vision Model-based method BigGait is available here, along with CCPG checkpoints.
  • [Apr 2024] Our team's latest checkpoints for projects such as DeepGaitv2, SkeletonGait, SkeletonGait++, and SwinGait will be released on Hugging Face; previously released checkpoints will also gradually be made available there.
  • [Mar 2024] Chao gave a talk about 'Progress in Gait Recognition'. The video and slides are both available😊
  • [Mar 2024] The code of SkeletonGait++ is released here, and you can refer to readme for details.
  • [Mar 2024] BigGait has been accepted to CVPR2024🎉 Congratulations to Dingqiang! This is his FIRST paper!
  • [Jan 2024] The code of the transformer-based SwinGait is available here.

Our Works

A Real Gait Recognition System: All-in-One-Gait

probe1-After

The workflow of All-in-One-Gait involves pedestrian tracking, segmentation, and recognition. See here for details.
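The three stages above can be sketched as a simple pipeline. This is an illustrative outline only; the function and class names below (`track`, `segment`, `recognize`) are hypothetical stand-ins, not the actual All-in-One-Gait API:

```python
# Hypothetical sketch of the All-in-One-Gait workflow:
# tracking -> segmentation -> recognition. Names are illustrative.
from dataclasses import dataclass
from typing import List


@dataclass
class Track:
    track_id: int
    frames: List[str]  # cropped pedestrian frames for one person


def track(video_frames: List[str]) -> List[Track]:
    # Stage 1: pedestrian tracking — group detections into per-person tracklets.
    # (A real system would run a detector + tracker here.)
    return [Track(track_id=0, frames=video_frames)]


def segment(t: Track) -> List[str]:
    # Stage 2: segmentation — extract a binary silhouette from each crop.
    return [f"silhouette({f})" for f in t.frames]


def recognize(silhouettes: List[str]) -> int:
    # Stage 3: recognition — embed the silhouette sequence and match a gallery.
    # Placeholder identity id in lieu of a gait model + gallery search.
    return hash(tuple(silhouettes)) % 1000


def all_in_one_gait(video_frames: List[str]) -> List[int]:
    # One identity prediction per tracked pedestrian.
    return [recognize(segment(t)) for t in track(video_frames)]
```

The stages are deliberately decoupled, so any detector, segmenter, or gait model with compatible inputs and outputs could be swapped in.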

Highlighted features

Getting Started

Quick Start (uv)

```shell
# Install dependencies
uv sync

# Train
CUDA_VISIBLE_DEVICES=0,1 uv run python -m torch.distributed.launch --nproc_per_node=2 opengait/main.py --cfgs ./configs/baseline/baseline.yaml --phase train

# Test
CUDA_VISIBLE_DEVICES=0,1 uv run python -m torch.distributed.launch --nproc_per_node=2 opengait/main.py --cfgs ./configs/baseline/baseline.yaml --phase test
```

Note: The --nproc_per_node argument must exactly match the number of GPUs specified in CUDA_VISIBLE_DEVICES. For single-GPU evaluation, use CUDA_VISIBLE_DEVICES=0 and --nproc_per_node=1 with the DDP launcher.
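One way to keep the two values in sync is to derive the process count from the variable itself. This is an illustrative shell helper, not part of OpenGait:

```shell
export CUDA_VISIBLE_DEVICES=0,1
# Count the comma-separated device ids; this is the value --nproc_per_node needs
NPROC=$(echo "$CUDA_VISIBLE_DEVICES" | tr ',' '\n' | wc -l)
echo "$NPROC"
```

You can then launch with `--nproc_per_node=$NPROC` instead of a hard-coded count.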

Resume Tip: To recover from interrupted training runs, set trainer_cfg.resume_every_iter to a non-zero value and, optionally, trainer_cfg.auto_resume_latest: true. OpenGait will then keep output/.../checkpoints/latest.pt updated for crash recovery.
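A minimal sketch of the resume settings described above, assuming the standard OpenGait YAML layout (key names come from the tip; exact defaults and placement may differ in your config):

```yaml
trainer_cfg:
  resume_every_iter: 1000    # write latest.pt every 1000 iterations (0 disables)
  auto_resume_latest: true   # pick up output/.../checkpoints/latest.pt on restart
```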

Please see 0.get_started.md. We also provide the following tutorials for your reference:

Model Zoo

You can find all the checkpoint files at Hugging Face Models!

The result list of appearance-based gait recognition is available here.

The result list of pose-based gait recognition is available here.

Authors:

OpenGait is now mainly maintained by Dongyang Jin (金冬阳), 11911221@mail.sustech.edu.cn

Acknowledgement

Citation

@InProceedings{Fan_2023_CVPR,
    author    = {Fan, Chao and Liang, Junhao and Shen, Chuanfu and Hou, Saihui and Huang, Yongzhen and Yu, Shiqi},
    title     = {OpenGait: Revisiting Gait Recognition Towards Better Practicality},
    booktitle = {Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)},
    month     = {June},
    year      = {2023},
    pages     = {9707-9716}
}

@ARTICLE{fan2025opengait,
  author={Fan, Chao and Hou, Saihui and Liang, Junhao and Shen, Chuanfu and Ma, Jingzhe and Jin, Dongyang and Huang, Yongzhen and Yu, Shiqi},
  journal={IEEE Transactions on Pattern Analysis and Machine Intelligence}, 
  title={OpenGait: A Comprehensive Benchmark Study for Gait Recognition Towards Better Practicality}, 
  year={2025},
  volume={},
  number={},
  pages={1-18},
  doi={10.1109/TPAMI.2025.3576283}
}

Note: This code is for academic purposes only; it may not be used for anything that might be considered commercial use.
