add gaitedge and reorganize repo
@@ -0,0 +1,68 @@
# Get Started

## Installation

1. Clone this repo:

```
git clone https://github.com/ShiqiYu/OpenGait.git
```
2. Install the dependencies:
- pytorch >= 1.6
- torchvision
- pyyaml
- tensorboard
- opencv-python
- tqdm
- py7zr

Install the dependencies with [Anaconda](https://conda.io/projects/conda/en/latest/user-guide/install/index.html):

```
conda install tqdm pyyaml tensorboard opencv py7zr
conda install pytorch==1.6.0 torchvision -c pytorch
```

Or install them with pip:

```
pip install tqdm pyyaml tensorboard opencv-python py7zr
pip install torch==1.6.0 torchvision==0.7.0
```
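To verify the environment afterwards, a quick importability check can help (a minimal sketch; the module list mirrors the dependencies above, noting that pyyaml imports as `yaml` and opencv-python as `cv2`):

```python
import importlib.util

def find_missing(modules):
    """Return the subset of `modules` that is not importable in this environment."""
    return [m for m in modules if importlib.util.find_spec(m) is None]

# pyyaml is imported as `yaml`, opencv-python as `cv2`.
required = ["torch", "torchvision", "yaml", "tensorboard", "cv2", "tqdm", "py7zr"]
print("missing:", find_missing(required))
```

An empty list means all dependencies resolved; otherwise install whatever is reported missing.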

## Prepare dataset

See [prepare dataset](docs/0.prepare_dataset.md).

## Get the trained model

- Option 1:

```
python misc/download_pretrained_model.py
```

- Option 2: Go to the [release page](https://github.com/ShiqiYu/OpenGait/releases/), download the model file, and uncompress it into [output](output).

## Train

Train a model with:

```
CUDA_VISIBLE_DEVICES=0,1 python -m torch.distributed.launch --nproc_per_node=2 opengait/main.py --cfgs ./config/baseline/baseline.yaml --phase train
```

- `python -m torch.distributed.launch`: the [DDP](https://pytorch.org/tutorials/intermediate/ddp_tutorial.html) launcher.
- `--nproc_per_node`: the number of GPUs to use; it must equal the number of devices in `CUDA_VISIBLE_DEVICES`.
- `--cfgs`: the path to the config file.
- `--phase`: specified as `train`.
<!-- - `--iter`: specify a number of iterations, or use `restore_hint` in the config file to resume training from there. -->
- `--log_to_file`: if specified, the terminal log is also written to disk.

You can run the commands in [train.sh](train.sh) to train different models.
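Since `--nproc_per_node` must match the number of visible devices, one way to keep the two in sync is to derive the count from `CUDA_VISIBLE_DEVICES` (a sketch, not part of the repo's scripts):

```shell
# Derive the process count from CUDA_VISIBLE_DEVICES so the two never drift apart.
export CUDA_VISIBLE_DEVICES=0,1
NPROC=$(echo "$CUDA_VISIBLE_DEVICES" | tr ',' '\n' | wc -l)
echo "nproc_per_node=$NPROC"
# python -m torch.distributed.launch --nproc_per_node=$NPROC opengait/main.py \
#     --cfgs ./config/baseline/baseline.yaml --phase train
```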

## Test

Evaluate the trained model with:

```
CUDA_VISIBLE_DEVICES=0,1 python -m torch.distributed.launch --nproc_per_node=2 opengait/main.py --cfgs ./config/baseline/baseline.yaml --phase test
```

- `--phase`: specified as `test`.
- `--iter`: specify an iteration checkpoint.

**Tip**: The other arguments are the same as in the train phase.

You can run the commands in [test.sh](test.sh) to test different models.

## Customize

1. Read the [detailed config](docs/1.detailed_config.md) to learn the usage of each setting item;
2. See [how to create your model](docs/2.how_to_create_your_model.md);
3. For advanced usages, please refer to [advanced usages](docs/3.advanced_usages.md).

## Warning

- In `DDP` mode, zombie processes may be left behind when the program terminates abnormally. You can run [sh misc/clean_process.sh](./misc/clean_process.sh) to clear them.
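A hypothetical sketch of such a cleanup (assumption: the stray workers' command lines contain `opengait/main.py`; see the actual [clean_process.sh](./misc/clean_process.sh) for what the repo really does):

```shell
# Count processes whose command line matches a pattern (pgrep never matches itself).
count_stale() {
    pgrep -f "$1" | wc -l
}

STALE=$(count_stale "opengait/main.py")
echo "stale training processes: $STALE"
# pkill -9 -f "opengait/main.py"    # uncomment to actually kill them
```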
@@ -0,0 +1,39 @@

# Model Zoo

## [CASIA-B](http://www.cbsr.ia.ac.cn/english/Gait%20Databases.asp)

| Model | NM | BG | CL | Configuration | Input Size | Inference Time | Model Size |
| :---: | :---: | :---: | :---: | :---: | :---: | :---: | :---: |
| Baseline | 96.3 | 92.2 | 77.6 | [baseline.yaml](../configs/baseline/baseline.yaml) | 64x44 | 12s | 3.78M |
| [GaitSet(AAAI2019)](https://arxiv.org/pdf/1811.06186.pdf) | 95.8(95.0) | 90.0(87.2) | 75.4(70.4) | [gaitset.yaml](../configs/gaitset/gaitset.yaml) | 64x44 | 13s | 2.59M |
| [GaitPart(CVPR2020)](http://home.ustc.edu.cn/~saihui/papers/cvpr2020_gaitpart.pdf) | 96.1(96.2) | 90.7(91.5) | 78.7(78.7) | [gaitpart.yaml](../configs/gaitpart/gaitpart.yaml) | 64x44 | 56s | 1.20M |
| [GLN*(ECCV2020)](http://home.ustc.edu.cn/~saihui/papers/eccv2020_gln.pdf) | 96.4(95.6) | 93.1(92.0) | 81.0(77.2) | [gln_phase1.yaml](../configs/gln/gln_phase1.yaml), [gln_phase2.yaml](../configs/gln/gln_phase2.yaml) | 128x88 | 47s/46s | 8.54M / 14.70M |
| [GaitGL(ICCV2021)](https://openaccess.thecvf.com/content/ICCV2021/papers/Lin_Gait_Recognition_via_Effective_Global-Local_Feature_Representation_and_Local_Temporal_ICCV_2021_paper.pdf) | 97.4(97.4) | 94.5(94.5) | 83.8(83.6) | [gaitgl.yaml](../configs/gaitgl/gaitgl.yaml) | 64x44 | 38s | 3.10M |

## [OUMVLP](http://www.am.sanken.osaka-u.ac.jp/BiometricDB/GaitMVLP.html)

| Model | Rank@1 | Configuration | Input Size | Inference Time | Model Size |
| :---: | :---: | :---: | :---: | :---: | :---: |
| Baseline | 86.7 | [baseline_OUMVLP.yaml](../configs/baseline/baseline_OUMVLP.yaml) | 64x44 | 1m13s | 44.11M |
| [GaitSet(AAAI2019)](https://arxiv.org/pdf/1811.06186.pdf) | 87.2(87.1) | [gaitset_OUMVLP.yaml](../configs/gaitset/gaitset_OUMVLP.yaml) | 64x44 | 1m26s | 6.31M |
| [GaitPart(CVPR2020)](http://home.ustc.edu.cn/~saihui/papers/cvpr2020_gaitpart.pdf) | 88.6(88.7) | [gaitpart_OUMVLP.yaml](../configs/gaitpart/gaitpart_OUMVLP.yaml) | 64x44 | 8m04s | 3.78M |
| [GaitGL(ICCV2021)](https://openaccess.thecvf.com/content/ICCV2021/papers/Lin_Gait_Recognition_via_Effective_Global-Local_Feature_Representation_and_Local_Temporal_ICCV_2021_paper.pdf) | 89.9(89.7) | [gaitgl_OUMVLP.yaml](../configs/gaitgl/gaitgl_OUMVLP.yaml) | 64x44 | 5m23s | 95.62M |

## [GREW](https://www.grew-benchmark.org)

| Model | Rank@1 | Configuration | Input Size | Inference Time | Model Size |
| :---: | :---: | :---: | :---: | :---: | :---: |
| Baseline | 48.5 | [baseline_GREW.yaml](../configs/baseline/baseline_GREW.yaml) | 64x44 | 2m23s | 84.12M |
| [GaitSet(AAAI2019)](https://arxiv.org/pdf/1811.06186.pdf) | 48.4 | [gaitset_GREW.yaml](../configs/gaitset/gaitset_GREW.yaml) | 64x44 | - | - |
| [GaitPart(CVPR2020)](http://home.ustc.edu.cn/~saihui/papers/cvpr2020_gaitpart.pdf) | 47.6 | [gaitpart_GREW.yaml](../configs/gaitpart/gaitpart_GREW.yaml) | 64x44 | - | - |
| [GaitGL(ICCV2021)](https://openaccess.thecvf.com/content/ICCV2021/papers/Lin_Gait_Recognition_via_Effective_Global-Local_Feature_Representation_and_Local_Temporal_ICCV_2021_paper.pdf) | 41.5 | [gaitgl_GREW.yaml](../configs/gaitgl/gaitgl_GREW.yaml) | 64x44 | - | - |
| [GaitGL(BNNeck)(ICCV2021)](https://openaccess.thecvf.com/content/ICCV2021/papers/Lin_Gait_Recognition_via_Effective_Global-Local_Feature_Representation_and_Local_Temporal_ICCV_2021_paper.pdf) | 51.7 | [gaitgl_GREW_BNNeck.yaml](../configs/gaitgl/gaitgl_GREW_BNNeck.yaml) | 64x44 | - | - |
| [RealGait(arXiv 2022)](https://arxiv.org/pdf/2201.04806.pdf) | (54.1) | - | - | - | - |

------------------------------------------

The results in parentheses are those reported in the papers.

**Note**:
- All results are Rank@1 accuracy, excluding identical-view cases.
- The GLN result shown is obtained without the compact block.
- Two RTX 3090 GPUs are used for inference on CASIA-B, and eight for OUMVLP.
@@ -1,5 +1,5 @@

# Prepare dataset
-Suppose you have downloaded the original dataset, we need to preprocess the data and save it as pickle file. Remember to set your path to the root of processed dataset in [config/*.yaml](config/).
+Suppose you have downloaded the original dataset; we need to preprocess the data and save it as pickle files. Remember to set the path to the root of the processed dataset in [configs/*.yaml](configs/).

## Preprocess
**CASIA-B**

@@ -170,4 +170,4 @@ python datasets/pretreatment.py --input_path Path_of_GREW-rearranged --output_pa
```
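The pretreatment step stores each sequence as a pickle file; below is a minimal sketch of reading one back (the file name and the stand-in payload are assumptions for illustration, not the repo's guaranteed format — the real files hold silhouette arrays):

```python
import os
import pickle
import tempfile

def load_sequence(pkl_path):
    """Load one preprocessed sequence from its pickle file."""
    with open(pkl_path, "rb") as f:
        return pickle.load(f)

# Round-trip demo with a stand-in "sequence" (hypothetical file name).
demo = [[0, 1], [1, 0]]
path = os.path.join(tempfile.mkdtemp(), "000-nm-01-090.pkl")
with open(path, "wb") as f:
    pickle.dump(demo, f)
print(load_sequence(path) == demo)  # → True
```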

## Split dataset
-You can use the partition file in dataset folder directly, or you can create yours. Remember to set your path to the partition file in [config/*.yaml](config/).
+You can use the partition file in the dataset folder directly, or create your own. Remember to set the path to the partition file in [configs/*.yaml](configs/).
@@ -37,7 +37,7 @@

* Model to be trained
> * Args
> * model : Model type; please refer to the [Model Library](../opengait/modeling/models) for the supported values.
-> * **others** : Please refer to the [Training Configuration File of Corresponding Model](../config).
+> * **others** : Please refer to the [Training Configuration File of Corresponding Model](../configs).

----
### evaluator_cfg
* Evaluator configuration

@@ -78,7 +78,7 @@
> * **others**: Please refer to `evaluator_cfg`.

---
**Note**:
-- All the config items will be merged into [default.yaml](../config/default.yaml), and the current config is preferable.
+- All the config items will be merged into [default.yaml](../configs/default.yaml), and the current config takes precedence.
- The output directory, which includes the log, checkpoint and summary files, depends on the `dataset_name`, `model` and `save_name` settings, like `output/${dataset_name}/${model}/${save_name}`.
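The two notes above can be sketched together: a recursive merge in which the current config wins over `default.yaml`, and the output path assembled from the merged values (the example values here are hypothetical):

```python
def merge_cfg(default, current):
    """Recursively merge `current` over `default`; current values take precedence."""
    merged = dict(default)
    for key, val in current.items():
        if isinstance(val, dict) and isinstance(merged.get(key), dict):
            merged[key] = merge_cfg(merged[key], val)
        else:
            merged[key] = val
    return merged

default = {"data_cfg": {"dataset_name": "CASIA-B", "num_workers": 1}, "save_name": "default"}
current = {"data_cfg": {"num_workers": 4}, "model": "Baseline", "save_name": "baseline"}
cfg = merge_cfg(default, current)

# Per the note above: output/${dataset_name}/${model}/${save_name}
output_dir = "output/{}/{}/{}".format(
    cfg["data_cfg"]["dataset_name"], cfg["model"], cfg["save_name"]
)
print(output_dir)  # → output/CASIA-B/Baseline/baseline
```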

# Example

@@ -1,8 +1,8 @@

# Advanced Usages
### Cross-Dataset Evaluation
-> You can conduct cross-dataset evalution by just modifying several arguments in your [data_cfg](../config/baseline/baseline.yaml#L1).
+> You can conduct cross-dataset evaluation by just modifying several arguments in your [data_cfg](../configs/baseline/baseline.yaml#L1).
>
-> Take [baseline.yaml](../config/baseline/baseline.yaml) as an example:
+> Take [baseline.yaml](../configs/baseline/baseline.yaml) as an example:
> ```yaml
> data_cfg:
>   dataset_name: CASIA-B
@@ -65,7 +65,7 @@

>> ])
>> return transform
>> ```
-> * *Step2*: Reset the [`transform`](../config/baseline.yaml#L100) arguments in your config file:
+> * *Step2*: Reset the [`transform`](../configs/baseline.yaml#L100) arguments in your config file:
>> ```yaml
>> transform:
>> - type: TransformDemo
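A hypothetical minimal implementation in the spirit of the `TransformDemo` entry above (the repo's real transforms live in its own transform module; this only illustrates the callable-class pattern such a config entry would instantiate):

```python
class TransformDemo:
    """Toy frame-level transform: scales pixel values (illustrative only)."""

    def __init__(self, scale=1.0):
        self.scale = scale

    def __call__(self, seq):
        # seq: an iterable of frames, each frame an iterable of pixel values.
        return [[v * self.scale for v in frame] for frame in seq]

t = TransformDemo(scale=2.0)
print(t([[1, 2], [3, 4]]))  # → [[2.0, 4.0], [6.0, 8.0]]
```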