GREW supported

@@ -8,7 +8,7 @@ OpenGait is a flexible and extensible gait recognition project provided by the [

## What's New

- [Mar 2022] Dataset [GREW](https://www.grew-benchmark.org) is supported.
- [Mar 2022] [HID](http://hid2022.iapr-tc4.org/) support is ready in [misc/HID](./misc/HID).

## Highlighted features

@@ -19,7 +19,7 @@ OpenGait is a flexible and extensible gait recognition project provided by the [

## Model Zoo

### [CASIA-B](http://www.cbsr.ia.ac.cn/english/Gait%20Databases.asp)

| Model | NM | BG | CL | Configuration | Input Size | Inference Time | Model Size |
| :---: | :---: | :---: | :---: | :--- | :---: | :---: | :---: |
| Baseline | 96.3 | 92.2 | 77.6 | [baseline.yaml](config/baseline.yaml) | 64x44 | 12s | 3.78M |
@@ -28,7 +28,7 @@ OpenGait is a flexible and extensible gait recognition project provided by the [
| [GLN*(ECCV2020)](http://home.ustc.edu.cn/~saihui/papers/eccv2020_gln.pdf) | 96.4(95.6) | 93.1(92.0) | 81.0(77.2) | [gln_phase1.yaml](config/gln/gln_phase1.yaml), [gln_phase2.yaml](config/gln/gln_phase2.yaml) | 128x88 | 47s/46s | 8.54M / 14.70M |
| [GaitGL(ICCV2021)](https://openaccess.thecvf.com/content/ICCV2021/papers/Lin_Gait_Recognition_via_Effective_Global-Local_Feature_Representation_and_Local_Temporal_ICCV_2021_paper.pdf) | 97.4(97.4) | 94.5(94.5) | 83.8(83.6) | [gaitgl.yaml](config/gaitgl.yaml) | 64x44 | 38s | 3.10M |

### [OUMVLP](http://www.am.sanken.osaka-u.ac.jp/BiometricDB/GaitMVLP.html)

| Model | Rank@1 | Configuration | Input Size | Inference Time | Model Size |
| :---: | :---: | :---: | :---: | :--- | :---: |
| Baseline | 86.7 | [baseline.yaml](config/baseline_OUMVLP.yaml) | 64x44 | 1m13s | 44.11M |
@@ -36,7 +36,7 @@ OpenGait is a flexible and extensible gait recognition project provided by the [
| [GaitPart(CVPR2020)](http://home.ustc.edu.cn/~saihui/papers/cvpr2020_gaitpart.pdf) | 88.6(88.7) | [gaitpart.yaml](config/gaitpart_OUMVLP.yaml) | 64x44 | 8m04s | 3.78M |
| [GaitGL(ICCV2021)](https://openaccess.thecvf.com/content/ICCV2021/papers/Lin_Gait_Recognition_via_Effective_Global-Local_Feature_Representation_and_Local_Temporal_ICCV_2021_paper.pdf) | 89.9(89.7) | [gaitgl.yaml](config/gaitgl_OUMVLP.yaml) | 64x44 | 5m23s | 95.62M |

### [GREW](https://www.grew-benchmark.org)

| Model | Rank@1 | Configuration | Input Size | Inference Time | Model Size |
| :---: | :---: | :---: | :---: | :--- | :---: |
| Baseline | 43.3 | [baseline.yaml](config/baseline_GREW.yaml) | 64x44 | 2m23s | 84.12M |

@@ -131,7 +131,7 @@ You can run commands in [test.sh](test.sh) for testing different models.

## Acknowledgement

- GLN: [Saihui Hou (侯赛辉)](http://home.ustc.edu.cn/~saihui/index_english.html)
- GaitGL: [Beibei Lin (林贝贝)](https://scholar.google.com/citations?user=KyvHam4AAAAJ&hl=en&oi=ao)
- GREW: [GREW TEAM](https://www.grew-benchmark.org)

<!-- ## Citation
```
``` -->

@@ -133,7 +133,7 @@ Step2: [Unzip](https://github.com/GREW-Benchmark/GREW-Benchmark) the dataset, yo
...
...


Step 3: To rearrange the GREW dataset directory into the id-type-view structure, run
```
python misc/rearrange_GREW.py --input_path Path_of_GREW-raw --output_path Path_of_GREW-rearranged
```

@@ -0,0 +1,78 @@

# GREW Tutorial

<!--  -->

This tutorial is for the [GREW-Benchmark](https://github.com/GREW-Benchmark/GREW-Benchmark); we report a result of 48% with the baseline model. To help participants take the first step more easily, it walks through how to use OpenGait for GREW.

## Preprocess the dataset

Download the raw dataset from the [official link](https://www.grew-benchmark.org/download.html). You will get three compressed files: `train.zip`, `test.zip`, and `distractor.zip`.

Step 1: Unzip the train and test sets:

```shell
# replace "password" with the password you obtained from the GREW team
unzip -P password train.zip
tar -xzvf train.tgz
cd train
ls *.tgz | xargs -n1 tar xzvf
```

```shell
# replace "password" with the password you obtained from the GREW team
unzip -P password test.zip
tar -xzvf test.tgz
cd test/gallery
ls *.tgz | xargs -n1 tar xzvf
cd ../probe
ls *.tgz | xargs -n1 tar xzvf
```

After unpacking these compressed files, continue with the following steps.

Step 2: To rearrange the GREW dataset directory into the id-type-view structure, run
```
python misc/rearrange_GREW.py --input_path Path_of_GREW-raw --output_path Path_of_GREW-rearranged
```

Step 3: To transform the images into pickle files, run
```
python misc/pretreatment.py --input_path Path_of_GREW-rearranged --output_path Path_of_GREW-pkl
```

Then you will see a directory structure like this:

- Processed

```
GREW-pkl
├── 00001train (subject in training set)
│   ├── 00
│       ├── 4XPn5Z28
│           ├── 4XPn5Z28.pkl
│       ├── 5TXe8svE
│           ├── 5TXe8svE.pkl
│       ......
├── 00001 (subject in testing set)
│   ├── 01
│       ├── 79XJefi8
│           ├── 79XJefi8.pkl
│   ├── 02
│       ├── t16VLaQf
│           ├── t16VLaQf.pkl
├── probe
│   ├── etaGVnWf
│       ├── etaGVnWf.pkl
│   ├── eT1EXpgZ
│       ├── eT1EXpgZ.pkl
│   ...
...
```

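A quick way to verify the preprocessing is to open one sequence in Python. This is a minimal sketch, assuming each `.pkl` stores a pickled numpy array of stacked silhouette frames (the format written by `misc/pretreatment.py`); the path below is one of the example sequences from the tree above.

```python
import pickle

# Load one preprocessed sequence; we assume it is a pickled numpy
# array of shape (num_frames, height, width).
with open('GREW-pkl/00001train/00/4XPn5Z28/4XPn5Z28.pkl', 'rb') as f:
    seq = pickle.load(f)

print(type(seq), getattr(seq, 'shape', None))  # e.g. <class 'numpy.ndarray'> (90, 64, 44)
```
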

## Train the dataset

Modify the `dataset_root` in `./config/baseline_GREW.yaml`, and then run this command:

```shell
CUDA_VISIBLE_DEVICES=0,1,2,3 python -m torch.distributed.launch --nproc_per_node=4 lib/main.py --cfgs ./config/baseline_GREW.yaml --phase train
```

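If you prefer to script the config edit, the sketch below uses PyYAML. The `data_cfg`/`dataset_root` key layout is an assumption based on the baseline configs shipped with OpenGait, so verify it against your copy of `./config/baseline_GREW.yaml` before running.

```python
import yaml  # PyYAML

cfg_path = './config/baseline_GREW.yaml'
with open(cfg_path) as f:
    cfg = yaml.safe_load(f)

# Point the loader at the pickled dataset produced in Step 3.
cfg['data_cfg']['dataset_root'] = '/path/to/GREW-pkl'

with open(cfg_path, 'w') as f:
    yaml.safe_dump(cfg, f, sort_keys=False)
```
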
<!-- You can also download the [trained model](https://github.com/ShiqiYu/OpenGait/releases/download/v1.1/pretrained_hid_model.zip) and place it in `output` after unzipping. -->

## Get the submission file

```shell
CUDA_VISIBLE_DEVICES=0,1,2,3 python -m torch.distributed.launch --nproc_per_node=4 lib/main.py --cfgs ./config/baseline_GREW.yaml --phase test
```

The result file will be generated in your working directory; rename and compress it according to the competition requirements before submitting.
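For illustration only, compressing the result might look like the sketch below; every file name here is a placeholder, since the required naming and archive layout are defined on the GREW competition page.

```python
import zipfile

# Placeholder names: check the GREW competition page for the required
# file name and archive structure before submitting.
with zipfile.ZipFile('submission.zip', 'w', zipfile.ZIP_DEFLATED) as zf:
    zf.write('GREW_result.csv', arcname='GREW_result.csv')
```
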

## Evaluation locally

The original GREW protocol treats both seq_01 and seq_02 as the gallery, and no ground truth is provided for the probe, so results are normally obtained by uploading the submission file to the GREW competition site. To evaluate locally, we instead split the test set so that seq_01 serves as the gallery and seq_02 as the probe. Modify `eval_func` in `./config/baseline_GREW.yaml` to `identification_real_scene`, and you can then obtain results locally, similar to the OUMVLP setting.
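The snippet below is not the `identification_real_scene` implementation itself, only a simplified sketch of the rank-1 protocol it follows under the seq_01-gallery / seq_02-probe split: each probe embedding is assigned the identity of its nearest gallery embedding.

```python
import numpy as np

def rank1_accuracy(probe_feats, probe_ids, gallery_feats, gallery_ids):
    """Rank-1 identification: match each probe to its nearest gallery
    embedding by Euclidean distance and check the identity."""
    # Pairwise squared Euclidean distances, shape [n_probe, n_gallery].
    dists = (
        (probe_feats ** 2).sum(1, keepdims=True)
        - 2.0 * probe_feats @ gallery_feats.T
        + (gallery_feats ** 2).sum(1)
    )
    nearest = dists.argmin(axis=1)
    return float((gallery_ids[nearest] == probe_ids).mean())

# Toy usage: 5 identities, probes are near-duplicates of the gallery.
rng = np.random.default_rng(0)
gallery = rng.normal(size=(5, 64))
probe = gallery + 0.01 * rng.normal(size=(5, 64))
print(rank1_accuracy(probe, np.arange(5), gallery, np.arange(5)))  # -> 1.0
```
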