From de8a8c63ec1bf64b76b09da67496d24f3c6db9de Mon Sep 17 00:00:00 2001
From: noahshen98 <77523610+noahshen98@users.noreply.github.com>
Date: Thu, 24 Mar 2022 15:46:03 +0800
Subject: [PATCH] Update README.md

---
 misc/GREW/README.md | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)

diff --git a/misc/GREW/README.md b/misc/GREW/README.md
index e314e5a..985781f 100644
--- a/misc/GREW/README.md
+++ b/misc/GREW/README.md
@@ -70,9 +70,9 @@ CUDA_VISIBLE_DEVICES=0,1,2,3 python -m torch.distributed.launch --nproc_per_node
 ## Get the submission file
 
 ```shell
-CUDA_VISIBLE_DEVICES=0,1,2,3 python -m torch.distributed.launch --nproc_per_node=4 lib/main.py --cfgs ./misc/HID/baseline_hid.yaml --phase test
+CUDA_VISIBLE_DEVICES=0,1,2,3 python -m torch.distributed.launch --nproc_per_node=4 lib/main.py --cfgs ./config/baseline_GREW.yaml --phase test
 ```
 The result will be generated in your working directory, you must rename and compress it as the requirements before submitting.
 
 ## Evaluation locally
-While the original grew treat both seq_01 and seq_02 as gallery, but there is no ground truth for probe. Therefore, it is nessesary to upload the submission file on grew competitation. We seperate test set to: seq_01 as gallery, seq_02 as probe. Then you can modify `eval_func` in the `./config/baseline_GREW.yaml` to `identification_real_scene`, you can obtain result localy like setting of OUMVLP.
\ No newline at end of file
+While the original GREW protocol treats both seq_01 and seq_02 as gallery, there is no ground truth for the probe, so the submission file must be uploaded to the GREW competition. We separate the test set so that seq_01 serves as gallery and seq_02 as probe. If you then set `eval_func` in `./config/baseline_GREW.yaml` to `identification_real_scene`, you can obtain results locally, as in the OUMVLP setting.
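The local-evaluation protocol this patch describes (seq_01 as gallery, seq_02 as probe, matched by subject identity) amounts to nearest-neighbor rank-1 identification over the extracted gait embeddings. A minimal sketch of that idea — not the repository's actual `identification_real_scene` implementation, and with hypothetical toy features:

```python
import math

def rank1_identification(gallery, probe):
    """Rank-1 identification accuracy.

    gallery / probe: lists of (subject_id, feature_vector) pairs,
    e.g. gallery built from seq_01 embeddings and probe from seq_02.
    A probe counts as correct when its nearest gallery embedding
    (Euclidean distance) belongs to the same subject.
    """
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

    correct = 0
    for pid, pfeat in probe:
        # nearest-neighbor match against the whole gallery
        best_id = min(gallery, key=lambda g: dist(g[1], pfeat))[0]
        correct += (best_id == pid)
    return correct / len(probe)

# Toy example with made-up 2-D features: two of three probes match.
gallery = [("A", [0.0, 0.0]), ("B", [1.0, 1.0])]
probe = [("A", [0.1, 0.0]), ("B", [0.9, 1.0]), ("A", [1.0, 0.9])]
print(rank1_identification(gallery, probe))
```

In the real pipeline the feature vectors would come from the trained model's test-phase embeddings; only the matching rule is illustrated here.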