Add DRF Scoliosis1K pipeline and optional wandb logging

2026-03-07 17:49:19 +08:00
parent 654409ff50
commit 51eee70a4b
12 changed files with 1257 additions and 151 deletions
@@ -83,3 +83,55 @@ datasets/pretreatment_heatmap.py \
--save_root=<output_path> \
--dataset_name=OUMVLP
```
## DRF Preprocessing
For the DRF model, OpenGait expects a combined runtime dataset with:
* `0_heatmap.pkl`: the two-channel skeleton map sequence
* `1_pav.pkl`: the paper-style Postural Asymmetry Vector (PAV), repeated along the sequence axis so it matches OpenGait's multi-input loader contract
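The repeated-PAV contract can be sketched as follows (a minimal illustration, assuming NumPy arrays; the function name and shapes are hypothetical, not the script's exact API):

```python
import numpy as np

def tile_pav(pav, seq_len):
    """Repeat a per-sequence PAV along the sequence axis.

    pav: 1-D Postural Asymmetry Vector of shape (D,).
    seq_len: number of frames in the matching heatmap sequence.
    Returns an array of shape (seq_len, D) so both pickles share a
    sequence dimension, as the multi-input loader expects.
    """
    pav = np.asarray(pav, dtype=np.float32)
    return np.repeat(pav[None, :], seq_len, axis=0)

# Example: a 24-dim PAV tiled to match a 30-frame heatmap sequence.
tiled = tile_pav(np.zeros(24), 30)
assert tiled.shape == (30, 24)
```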
The PAV pass follows the paper:
1. convert pose to COCO17 if needed
2. pad missing joints
3. pelvis-center and height normalize the sequence
4. compute vertical, midline, and angular deviations for the 8 symmetric joint pairs
5. apply IQR filtering per metric
6. average over time
7. min-max normalize across the dataset, or across `TRAIN_SET` when `--stats_partition` is provided
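Steps 2–6 above can be sketched roughly as below. This is an illustrative outline only: the joint-pair indices, the forward-fill padding strategy, and the exact metric formulas are assumptions, not the script's implementation.

```python
import numpy as np

# COCO17 symmetric (left, right) pairs: eyes, ears, shoulders,
# elbows, wrists, hips, knees, ankles -- 8 pairs in total.
PAIRS = [(1, 2), (3, 4), (5, 6), (7, 8), (9, 10), (11, 12), (13, 14), (15, 16)]

def pav_from_sequence(seq):
    """Sketch of steps 2-6 for one pose sequence.

    seq: (T, 17, 2) COCO17 keypoints; missing joints are NaN.
    Returns the un-normalized PAV (step 7 needs dataset statistics).
    """
    seq = np.asarray(seq, dtype=np.float64).copy()
    # 2. pad missing joints by carrying the last valid frame forward
    for j in range(seq.shape[1]):
        valid = ~np.isnan(seq[:, j, 0])
        if valid.any():
            idx = np.where(valid, np.arange(len(seq)), 0)
            idx = np.maximum.accumulate(idx)
            seq[:, j] = seq[idx, j]
    # 3. pelvis-center (hip midpoint) and height-normalize each frame
    pelvis = (seq[:, 11] + seq[:, 12]) / 2.0
    seq = seq - pelvis[:, None, :]
    height = np.linalg.norm(seq[:, 0] - (seq[:, 15] + seq[:, 16]) / 2.0, axis=-1)
    seq = seq / np.maximum(height, 1e-6)[:, None, None]
    feats = []
    for l, r in PAIRS:
        left, right = seq[:, l], seq[:, r]
        vert = left[:, 1] - right[:, 1]   # 4a. vertical deviation
        mid = left[:, 0] + right[:, 0]    # 4b. midline deviation (pelvis at origin)
        ang = np.arctan2(left[:, 1] - right[:, 1],
                         left[:, 0] - right[:, 0])  # 4c. angular deviation
        for m in (vert, mid, ang):
            # 5. IQR filtering per metric, then 6. average over time
            q1, q3 = np.percentile(m, [25, 75])
            iqr = q3 - q1
            kept = m[(m >= q1 - 1.5 * iqr) & (m <= q3 + 1.5 * iqr)]
            feats.append(kept.mean() if kept.size else 0.0)
    return np.array(feats)  # shape (24,): 8 pairs x 3 metrics
```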
Run:
```bash
uv run python datasets/pretreatment_scoliosis_drf.py \
--pose_data_path=<path_to_pose_pkl> \
--output_path=<path_to_drf_pkl> \
--stats_partition=./datasets/Scoliosis1K/Scoliosis1K_1116.json
```
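Step 7's partition-aware normalization can be sketched like this (a hypothetical helper; the `TRAIN_SET` key name is taken from the text above and the data layout is assumed):

```python
import json
import numpy as np

def minmax_stats(pav_by_id, partition_json=None):
    """Per-dimension min/max for PAV normalization.

    pav_by_id: dict mapping subject ID -> (D,) PAV array.
    partition_json: optional path to a Scoliosis1K-style partition
    file; when given, stats are computed over its TRAIN_SET IDs only.
    """
    ids = list(pav_by_id)
    if partition_json is not None:
        with open(partition_json) as f:
            ids = [i for i in json.load(f)["TRAIN_SET"] if i in pav_by_id]
    stack = np.stack([pav_by_id[i] for i in ids])
    return stack.min(axis=0), stack.max(axis=0)

def minmax_normalize(pav, lo, hi):
    # Guard against zero-range dimensions.
    return (pav - lo) / np.maximum(hi - lo, 1e-8)
```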
The output layout is:
```text
<path_to_drf_pkl>/
├── pav_stats.pkl
├── 00000/
│ ├── Positive/
│ │ ├── 000_180/
│ │ │ ├── 0_heatmap.pkl
│ │ │ └── 1_pav.pkl
```
Point `configs/drf/drf_scoliosis1k.yaml:data_cfg.dataset_root` to this output directory before training or testing.
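For reference, the relevant key lives under `data_cfg` (a minimal sketch; the sibling key shown is illustrative and may not match the shipped config exactly):

```yaml
data_cfg:
  dataset_name: Scoliosis1K
  dataset_root: <path_to_drf_pkl>   # the DRF preprocessing output above
```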
## DRF Training and Testing
```bash
CUDA_VISIBLE_DEVICES=0,1,2,3 \
uv run python -m torch.distributed.launch --nproc_per_node=4 \
opengait/main.py --cfgs configs/drf/drf_scoliosis1k.yaml --phase train
CUDA_VISIBLE_DEVICES=0,1,2,3 \
uv run python -m torch.distributed.launch --nproc_per_node=4 \
opengait/main.py --cfgs configs/drf/drf_scoliosis1k.yaml --phase test
```