diff --git a/README.md b/README.md
index 56de424..d8fb2e0 100644
--- a/README.md
+++ b/README.md
@@ -1,40 +1,49 @@
-**Note:**
-This code is only used for **academic purposes**, people cannot use this code for anything that might be considered commercial use.
+
-# OpenGait
+
-OpenGait is a flexible and extensible gait recognition project provided by the [Shiqi Yu Group](https://faculty.sustech.edu.cn/yusq/) and supported in part by [WATRIX.AI](http://www.watrix.ai). Just the pre-beta version is released now, and more documentations as well as the reproduced methods will be offered as soon as possible.
+------------------------------------------
+
+OpenGait is a flexible and extensible gait recognition project provided by the [Shiqi Yu Group](https://faculty.sustech.edu.cn/yusq/) and supported in part by [WATRIX.AI](http://www.watrix.ai).
**Highlighted features:**
-- **Multiple Models Support**: We reproduced several SOTA methods, and reached the same or even better performance.
-- **DDP Support**: The officially recommended [`Distributed Data Parallel (DDP)`](https://pytorch.org/tutorials/intermediate/ddp_tutorial.html) mode is used during the training and testing phases.
+- **Multiple Models Support**: We reproduced several SOTA methods, and reached the same or even better performance.
+- **DDP Support**: The officially recommended [`Distributed Data Parallel (DDP)`](https://pytorch.org/tutorials/intermediate/ddp_tutorial.html) mode is used during both the training and testing phases.
- **AMP Support**: The [`Auto Mixed Precision (AMP)`](https://pytorch.org/tutorials/recipes/recipes/amp_recipe.html?highlight=amp) option is available.
- **Nice log**: We use [`tensorboard`](https://pytorch.org/docs/stable/tensorboard.html) and `logging` to log everything, which looks pretty.
-# Model Zoo
+## Model Zoo
+### CASIA-B
| Model | NM | BG | CL | Configuration | Input Size | Inference Time | Model Size |
| :-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------: | :--------: | :--------: | :--------: | :------------------------------------------------------------------------------------------- | :--------: | :------------: | :------------: |
| Baseline | 96.3 | 92.2 | 77.6 | [baseline.yaml](config/baseline.yaml) | 64x44 | 12s | 3.78M |
-| [GaitSet(AAAI2019)](https://arxiv.org/pdf/1811.06186.pdf) | 95.8(95.0) | 90.0(87.2) | 75.4(70.4) | [gaitset.yaml](config/gaitset.yaml) | 64x44 | 11s | 2.59M |
-| [GaitPart(CVPR2020)](http://home.ustc.edu.cn/~saihui/papers/cvpr2020_gaitpart.pdf) | 96.1(96.2) | 90.7(91.5) | 78.7(78.7) | [gaitpart.yaml](config/gaitpart.yaml) | 64x44 | 22s | 1.20M |
-| [GLN*(ECCV2020)](http://home.ustc.edu.cn/~saihui/papers/eccv2020_gln.pdf) | 96.4(95.6) | 93.1(92.0) | 81.0(77.2) | [gln_phase1.yaml](config/gln/gln_phase1.yaml), [gln_phase2.yaml](config/gln/gln_phase2.yaml) | 128x88 | 14s | 8.54M / 14.70M |
-| [GaitGL(ICCV2021)](https://openaccess.thecvf.com/content/ICCV2021/papers/Lin_Gait_Recognition_via_Effective_Global-Local_Feature_Representation_and_Local_Temporal_ICCV_2021_paper.pdf) | 97.4(97.4) | 94.5(94.5) | 83.8(83.6) | [gaitgl.yaml](config/gaitgl.yaml) | 64x44 | 31s | 3.10M |
+| [GaitSet(AAAI2019)](https://arxiv.org/pdf/1811.06186.pdf) | 95.8(95.0) | 90.0(87.2) | 75.4(70.4) | [gaitset.yaml](config/gaitset.yaml) | 64x44 | 13s | 2.59M |
+| [GaitPart(CVPR2020)](http://home.ustc.edu.cn/~saihui/papers/cvpr2020_gaitpart.pdf) | 96.1(96.2) | 90.7(91.5) | 78.7(78.7) | [gaitpart.yaml](config/gaitpart.yaml) | 64x44 | 56s | 1.20M |
+| [GLN*(ECCV2020)](http://home.ustc.edu.cn/~saihui/papers/eccv2020_gln.pdf) | 96.4(95.6) | 93.1(92.0) | 81.0(77.2) | [gln_phase1.yaml](config/gln/gln_phase1.yaml), [gln_phase2.yaml](config/gln/gln_phase2.yaml) | 128x88 | 47s/46s | 8.54M / 14.70M |
+| [GaitGL(ICCV2021)](https://openaccess.thecvf.com/content/ICCV2021/papers/Lin_Gait_Recognition_via_Effective_Global-Local_Feature_Representation_and_Local_Temporal_ICCV_2021_paper.pdf) | 97.4(97.4) | 94.5(94.5) | 83.8(83.6) | [gaitgl.yaml](config/gaitgl.yaml) | 64x44 | 38s | 3.10M |
+
+### OUMVLP
+| Model | Rank@1 | Configuration | Input Size | Inference Time | Model Size |
+| :-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------: | :--------: | :------------------------------------------: | :--------: | :------------: | :--------: |
+| Baseline | 86.7 | [baseline.yaml](config/OUMVLP/baseline.yaml) | 64x44 | 1m13s | 44.11M |
+| [GaitSet(AAAI2019)](https://arxiv.org/pdf/1811.06186.pdf) | 87.2(87.1) | [gaitset.yaml](config/OUMVLP/gaitset.yaml) | 64x44 | 1m26s | 6.31M |
+| [GaitPart(CVPR2020)](http://home.ustc.edu.cn/~saihui/papers/cvpr2020_gaitpart.pdf) | 88.6(88.7) | [gaitpart.yaml](config/OUMVLP/gaitpart.yaml) | 64x44 | 8m04s | 3.78M |
+| [GaitGL(ICCV2021)](https://openaccess.thecvf.com/content/ICCV2021/papers/Lin_Gait_Recognition_via_Effective_Global-Local_Feature_Representation_and_Local_Temporal_ICCV_2021_paper.pdf) | 89.9(89.7) | [gaitgl.yaml](config/OUMVLP/gaitgl.yaml) | 64x44 | 5m23s | 95.62M |
+
The results in the parentheses are mentioned in the papers
-
**Note**:
-- All the models were tested on [CASIA-B](http://www.cbsr.ia.ac.cn/english/Gait%20Databases.asp) (Rank@1, excluding identical-view cases).
+- All results are Rank@1, excluding identical-view cases.
- The shown result of GLN is implemented without compact block.
-- Only 2 RTX6000 are used during the inference phase.
-- The results on [OUMVLP](http://www.am.sanken.osaka-u.ac.jp/BiometricDB/GaitMVLP.html) will be released soon.
-It's inference process just cost about 90 secs(Baseline & 8 RTX6000).
+- Only two RTX 3090 GPUs are used for inference on CASIA-B, and eight for OUMVLP.
-# Get Started
-## Installation
+
+## Get Started
+### Installation
1. clone this repo.
```
git clone https://github.com/ShiqiYu/OpenGait.git
@@ -57,56 +66,66 @@ It's inference process just cost about 90 secs(Baseline & 8 RTX6000).
pip install tqdm pyyaml tensorboard opencv-python
pip install torch==1.6.0 torchvision==0.7.0
```
-## Prepare dataset
-See [prepare dataset](doc/prepare_dataset.md).
+### Prepare dataset
+See [prepare dataset](docs/0.prepare_dataset.md).
-## Get trained model
+### Get trained model
- Option 1:
```
python misc/download_pretrained_model.py
```
-- Option 2: Go to the [release page](https://github.com/ShiqiYu/OpenGait/releases/), then download the model file and uncompress it to `output`.
+- Option 2: Go to the [release page](https://github.com/ShiqiYu/OpenGait/releases/), then download the model file and uncompress it to [output](output).
-## Train
+### Train
Train a model by
```
CUDA_VISIBLE_DEVICES=0,1 python -m torch.distributed.launch --nproc_per_node=2 lib/main.py --cfgs ./config/baseline.yaml --phase train
```
-- `python -m torch.distributed.launch` Our implementation uses DistributedDataParallel.
-- `--nproc_per_node` The number of gpu to use, it must equal the length of `CUDA_VISIBLE_DEVICES`.
-- `--cfgs` The path of config file.
+- `python -m torch.distributed.launch` The [DDP](https://pytorch.org/tutorials/intermediate/ddp_tutorial.html) launch command.
+- `--nproc_per_node` The number of GPUs to use; it must equal the number of devices in `CUDA_VISIBLE_DEVICES`.
+- `--cfgs` The path to the config file.
- `--phase` Specified as `train`.
-- `--iter` You can specify a number of iterations or use `restore_hint` in the configuration file and resume training from there.
-- `--log_to_file` If specified, log will be written on disk simultaneously.
+
+- `--log_to_file` If specified, the terminal log will also be written to disk.
You can run commands in [train.sh](train.sh) for training different models.
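Since `--nproc_per_node` must match `CUDA_VISIBLE_DEVICES`, one convenient pattern is to derive the GPU count from the variable itself (a shell sketch, not part of the repo):

```shell
# Derive the GPU count from CUDA_VISIBLE_DEVICES so the two values never drift apart.
export CUDA_VISIBLE_DEVICES=0,1
NPROC=$(echo "$CUDA_VISIBLE_DEVICES" | tr ',' '\n' | wc -l)

# The launch command then becomes:
# python -m torch.distributed.launch --nproc_per_node=$NPROC lib/main.py --cfgs ./config/baseline.yaml --phase train
```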
-## Test
-Use trained model to evaluate by
+### Test
+Evaluate the trained model by
```
CUDA_VISIBLE_DEVICES=0,1 python -m torch.distributed.launch --nproc_per_node=2 lib/main.py --cfgs ./config/baseline.yaml --phase test
```
- `--phase` Specified as `test`.
-- `--iter` You can specify a number of iterations or or use `restore_hint` in the configuration file and restore model from there.
+- `--iter` Specify an iteration checkpoint to restore.
**Tip**: Other arguments are the same as train phase.
You can run commands in [test.sh](test.sh) for testing different models.
+
## Customize
-1. First, you need to read the [config documentation](doc/detailed_config.md) to figure out the usage of every item.
-2. If you want create your own model, see [here](doc/how_to_create_your_model.md).
+1. Read the [detailed config](docs/1.detailed_config.md) to understand each configuration item;
+2. See [how to create your model](docs/2.how_to_create_your_model.md);
+3. For advanced features, please refer to [advanced usages](docs/3.advanced_usages.md).
-# Warning
+## Warning
- Some models may not be compatible with `AMP`, you can disable it by setting `enable_float16` **False**.
-- In `DDP` mode, zombie processes may be generated when the program terminates abnormally. You can use this command `kill $(ps aux | grep main.py | grep -v grep | awk '{print $2}')` to clear them.
+- In `DDP` mode, zombie processes may be generated when the program terminates abnormally. You can run [misc/clean_process.sh](./misc/clean_process.sh) to clear them.
- We implemented the functionality about testing while training, but it slightly affected the results. None of our published models use this functionality. You can disable it by setting `with_test` **False**.
+- Recommended PyTorch version: 1.6-1.8.
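For instance, the AMP and test-while-training behaviors called out above are controlled by two `trainer_cfg` keys (a config fragment using the same keys as [config/baseline.yaml](config/baseline.yaml)):

```yaml
trainer_cfg:
  enable_float16: false  # disable AMP for models incompatible with it
  with_test: false       # disable testing while training
```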
-# Authors:
+## Authors:
**Open Gait Team (OGT)**
-- [Chao Fan (樊超)](https://faculty.sustech.edu.cn/?p=128578&tagid=yusq&cat=2&iscss=1&snapid=1&orderby=date)
-- [Chuanfu Shen (沈川福)](https://faculty.sustech.edu.cn/?p=95396&tagid=yusq&cat=2&iscss=1&snapid=1&orderby=date)
-- [Junhao Liang (梁峻豪)](https://faculty.sustech.edu.cn/?p=95401&tagid=yusq&cat=2&iscss=1&snapid=1&orderby=date)
+- [Chao Fan (樊超)](https://faculty.sustech.edu.cn/?p=128578&tagid=yusq&cat=2&iscss=1&snapid=1&orderby=date), 12131100@mail.sustech.edu.cn
+- [Chuanfu Shen (沈川福)](https://faculty.sustech.edu.cn/?p=95396&tagid=yusq&cat=2&iscss=1&snapid=1&orderby=date), 11950016@mail.sustech.edu.cn
+- [Junhao Liang (梁峻豪)](https://faculty.sustech.edu.cn/?p=95401&tagid=yusq&cat=2&iscss=1&snapid=1&orderby=date), 12132342@mail.sustech.edu.cn
-# Acknowledgement
+## Acknowledgement
- GLN: [Saihui Hou (侯赛辉)](http://home.ustc.edu.cn/~saihui/index_english.html)
-- GaitGL: Beibei Lin (林贝贝)
+- GaitGL: [Beibei Lin (林贝贝)](https://scholar.google.com/citations?user=KyvHam4AAAAJ&hl=en&oi=ao)
+
+
+
+**Note:**
+This code is intended for **academic purposes** only and must not be used for anything that might be considered commercial use.
diff --git a/assets/bg.gif b/assets/bg.gif
new file mode 100644
index 0000000..32dc4be
Binary files /dev/null and b/assets/bg.gif differ
diff --git a/assets/cl.gif b/assets/cl.gif
new file mode 100644
index 0000000..cac5ffd
Binary files /dev/null and b/assets/cl.gif differ
diff --git a/assets/logo1.png b/assets/logo1.png
new file mode 100644
index 0000000..f1bb323
Binary files /dev/null and b/assets/logo1.png differ
diff --git a/assets/logo2.png b/assets/logo2.png
new file mode 100644
index 0000000..b0cdd42
Binary files /dev/null and b/assets/logo2.png differ
diff --git a/assets/nm.gif b/assets/nm.gif
new file mode 100644
index 0000000..35ef6aa
Binary files /dev/null and b/assets/nm.gif differ
diff --git a/assets/pipeline.png b/assets/pipeline.png
new file mode 100644
index 0000000..6e1b71f
Binary files /dev/null and b/assets/pipeline.png differ
diff --git a/config/baseline.yaml b/config/baseline.yaml
index de8345b..f8b6a50 100644
--- a/config/baseline.yaml
+++ b/config/baseline.yaml
@@ -1,6 +1,6 @@
data_cfg:
dataset_name: CASIA-B
- dataset_root: your_path
+ dataset_root: your_path
dataset_partition: ./misc/partitions/CASIA-B_include_005.json
num_workers: 1
remove_no_gallery: false # Remove probe if no gallery for it
@@ -18,9 +18,9 @@ evaluator_cfg:
sample_type: all_ordered # all indicates whole sequence used to test, while ordered means input sequence by its natural order; Other options: fixed_unordered
frames_all_limit: 720 # limit the number of sampled frames to prevent out of memory
metric: euc # cos
- # transform:
- # - type: BaseSilCuttingTransform
- # img_w: 128
+ transform:
+ - type: BaseSilCuttingTransform
+ img_w: 64
loss_cfg:
- loss_term_weights: 1.0
@@ -79,7 +79,7 @@ scheduler_cfg:
scheduler: MultiStepLR
trainer_cfg:
enable_float16: true # half_percesion float for memory reduction and speedup
- fix_BN: false
+ fix_BN: true
log_iter: 100
restore_ckpt_strict: true
restore_hint: 0
@@ -97,6 +97,6 @@ trainer_cfg:
frames_num_min: 25 # min frames number for unfixed traing
sample_type: fixed_unordered # fixed control input frames number, unordered for controlling order of input tensor; Other options: unfixed_ordered or all_ordered
type: TripletSampler
- # transform:
- # - type: BaseSilCuttingTransform
- # img_w: 128
+ transform:
+ - type: BaseSilCuttingTransform
+ img_w: 64
diff --git a/config/baseline_OUMVLP.yaml b/config/baseline_OUMVLP.yaml
new file mode 100644
index 0000000..c0883f2
--- /dev/null
+++ b/config/baseline_OUMVLP.yaml
@@ -0,0 +1,102 @@
+data_cfg:
+ dataset_name: OUMVLP
+ dataset_root: your_path
+ dataset_partition: ./misc/partitions/OUMVLP.json
+ num_workers: 1
+ remove_no_gallery: false # Remove probe if no gallery for it
+ test_dataset_name: OUMVLP
+
+evaluator_cfg:
+ enable_float16: true
+ restore_ckpt_strict: true
+ restore_hint: 150000
+ save_name: Baseline
+ eval_func: identification
+ sampler:
+ batch_shuffle: false
+ batch_size: 4
+ sample_type: all_ordered # all indicates whole sequence used to test, while ordered means input sequence by its natural order; Other options: fixed_unordered
+ frames_all_limit: 720 # limit the number of sampled frames to prevent out of memory
+ metric: euc # cos
+ # transform:
+ # - type: BaseSilCuttingTransform
+ # img_w: 128
+
+loss_cfg:
+ - loss_term_weights: 1.0
+ margin: 0.2
+ type: TripletLoss
+ log_prefix: triplet
+ - loss_term_weights: 0.1
+ scale: 16
+ type: CrossEntropyLoss
+ log_prefix: softmax
+ log_accuracy: true
+
+model_cfg:
+ model: Baseline
+ backbone_cfg:
+ in_channels: 1
+ layers_cfg: # Layers configuration for automatically model construction
+ - BC-32
+ - BC-32
+ - M
+ - BC-64
+ - BC-64
+ - M
+ - BC-128
+ - BC-128
+ - BC-256
+ - BC-256
+ type: Plain
+ SeparateFCs:
+ in_channels: 256
+ out_channels: 256
+ parts_num: 31
+ SeparateBNNecks:
+ class_num: 5153
+ in_channels: 256
+ parts_num: 31
+ bin_num:
+ - 16
+ - 8
+ - 4
+ - 2
+ - 1
+
+optimizer_cfg:
+ lr: 0.1
+ momentum: 0.9
+ solver: SGD
+ weight_decay: 0.0005
+
+scheduler_cfg:
+ gamma: 0.1
+ milestones: # Learning Rate Reduction at each milestones
+ - 50000
+ - 100000
+ scheduler: MultiStepLR
+trainer_cfg:
+ enable_float16: true # half-precision float for memory reduction and speedup
+ fix_BN: false
+ log_iter: 100
+ with_test: true
+ restore_ckpt_strict: true
+ restore_hint: 0
+ save_iter: 10000
+ save_name: Baseline
+ sync_BN: true
+ total_iter: 150000
+ sampler:
+ batch_shuffle: true
+ batch_size:
+ - 32 # TripletSampler, batch_size[0] indicates the number of identities
+ - 16 # batch_size[1] indicates the number of sequences sampled for each identity
+ frames_num_fixed: 30 # fixed frames number for training
+ frames_num_max: 50 # max frames number for unfixed training
+ frames_num_min: 25 # min frames number for unfixed training
+ sample_type: fixed_unordered # fixed control input frames number, unordered for controlling order of input tensor; Other options: unfixed_ordered or all_ordered
+ type: TripletSampler
+ # transform:
+ # - type: BaseSilCuttingTransform
+ # img_w: 128
\ No newline at end of file
diff --git a/config/gaitgl.yaml b/config/gaitgl.yaml
index c6e3169..6ea78ea 100644
--- a/config/gaitgl.yaml
+++ b/config/gaitgl.yaml
@@ -49,7 +49,6 @@ scheduler_cfg:
trainer_cfg:
enable_distributed: true
enable_float16: false
- fix_BN: false
log_iter: 100
restore_ckpt_strict: true
restore_hint: 0
diff --git a/config/gaitgl_OUMVLP.yaml b/config/gaitgl_OUMVLP.yaml
new file mode 100644
index 0000000..ea3caca
--- /dev/null
+++ b/config/gaitgl_OUMVLP.yaml
@@ -0,0 +1,69 @@
+# Note: *** batch_size should equal the number of GPUs at the test phase! ***
+data_cfg:
+ dataset_name: OUMVLP
+ dataset_root: your_path
+ dataset_partition: ./misc/partitions/OUMVLP.json
+ num_workers: 1
+ remove_no_gallery: false
+ test_dataset_name: OUMVLP
+
+evaluator_cfg:
+ enable_distributed: true
+ enable_float16: false
+ restore_ckpt_strict: true
+ restore_hint: 210000
+ save_name: GaitGL
+ sampler:
+ batch_size: 2
+ sample_type: all_ordered
+ type: InferenceSampler
+
+loss_cfg:
+ - loss_term_weights: 1.0
+ margin: 0.2
+ type: TripletLoss
+ log_prefix: triplet
+ - loss_term_weights: 1.0
+ scale: 1
+ type: CrossEntropyLoss
+ log_accuracy: true
+ label_smooth: true
+ log_prefix: softmax
+
+model_cfg:
+ model: GaitGL
+ channels: [32, 64, 128, 256]
+ class_num: 5153
+
+optimizer_cfg:
+ lr: 1.0e-4
+ solver: Adam
+ weight_decay: 0
+
+scheduler_cfg:
+ gamma: 0.1
+ milestones:
+ - 150000
+ - 200000
+ scheduler: MultiStepLR
+
+trainer_cfg:
+ enable_distributed: true
+ enable_float16: true
+ with_test: false
+ log_iter: 100
+ restore_ckpt_strict: true
+ restore_hint: 0
+ save_iter: 10000
+ save_name: GaitGL
+ sync_BN: true
+ total_iter: 210000
+ sampler:
+ batch_shuffle: true
+ batch_size:
+ - 32
+ - 8
+ frames_num_fixed: 30
+ frames_skip_num: 0
+ sample_type: fixed_ordered
+ type: TripletSampler
diff --git a/config/gaitpart.yaml b/config/gaitpart.yaml
index 02c2a78..4d700dc 100644
--- a/config/gaitpart.yaml
+++ b/config/gaitpart.yaml
@@ -58,7 +58,6 @@ scheduler_cfg:
trainer_cfg:
enable_float16: true
- fix_BN: false
log_iter: 100
restore_ckpt_strict: true
restore_hint: 0
diff --git a/config/gaitpart_OUMVLP.yaml b/config/gaitpart_OUMVLP.yaml
new file mode 100644
index 0000000..73512be
--- /dev/null
+++ b/config/gaitpart_OUMVLP.yaml
@@ -0,0 +1,82 @@
+data_cfg:
+ dataset_name: OUMVLP
+ dataset_root: your_path
+ dataset_partition: ./misc/partitions/OUMVLP.json
+ num_workers: 4
+ remove_no_gallery: false
+ test_dataset_name: OUMVLP
+
+evaluator_cfg:
+ enable_float16: false
+ restore_ckpt_strict: true
+ restore_hint: 250000
+ save_name: GaitPart
+ sampler:
+ batch_size: 4
+ sample_type: all_ordered
+ type: InferenceSampler
+ metric: euc # cos
+
+loss_cfg:
+ loss_term_weights: 1.0
+ margin: 0.2
+ type: TripletLoss
+ log_prefix: triplet
+
+model_cfg:
+ model: GaitPart
+ backbone_cfg:
+ in_channels: 1
+ layers_cfg:
+ - BC-32
+ - BC-32
+ - M
+ - BC-64
+ - BC-64
+ - M
+ - FC-128-3
+ - FC-128-3
+ - FC-256-3
+ - FC-256-3
+ type: Plain
+ SeparateFCs:
+ in_channels: 256
+ out_channels: 256
+ parts_num: 16
+ bin_num:
+ - 16
+
+optimizer_cfg:
+ lr: 0.0001
+ momentum: 0.9
+ solver: Adam
+ weight_decay: 0.0
+
+scheduler_cfg:
+ gamma: 0.1
+ milestones:
+ - 150000
+ scheduler: MultiStepLR
+
+trainer_cfg:
+ enable_float16: true
+ fix_BN: false
+ log_iter: 100
+ with_test: true
+ restore_ckpt_strict: true
+ restore_hint: 0
+ save_iter: 10000
+ save_name: GaitPart
+ sync_BN: false
+ total_iter: 250000
+ sampler:
+ batch_shuffle: false
+ batch_size:
+ - 32
+ - 16
+ frames_num_fixed: 30
+ frames_num_max: 50
+ frames_num_min: 25
+ frames_skip_num: 10
+ sample_type: fixed_ordered
+ type: TripletSampler
diff --git a/config/gaitset.yaml b/config/gaitset.yaml
index e34c770..b159d43 100644
--- a/config/gaitset.yaml
+++ b/config/gaitset.yaml
@@ -56,7 +56,6 @@ scheduler_cfg:
trainer_cfg:
enable_float16: true
- fix_BN: false
log_iter: 100
restore_ckpt_strict: true
restore_hint: 0
diff --git a/config/gaitset_OUMVLP.yaml b/config/gaitset_OUMVLP.yaml
new file mode 100644
index 0000000..8722eb8
--- /dev/null
+++ b/config/gaitset_OUMVLP.yaml
@@ -0,0 +1,76 @@
+data_cfg:
+ dataset_name: OUMVLP
+ dataset_root: your_path
+ dataset_partition: ./misc/partitions/OUMVLP.json
+ num_workers: 4
+ remove_no_gallery: false
+ test_dataset_name: OUMVLP
+
+evaluator_cfg:
+ enable_float16: true
+ restore_ckpt_strict: true
+ restore_hint: 250000
+ save_name: GaitSet
+ sampler:
+ batch_size: 4
+ sample_type: all_ordered
+ type: InferenceSampler
+ metric: euc # cos
+
+loss_cfg:
+ loss_term_weights: 1.0
+ margin: 0.2
+ type: TripletLoss
+ log_prefix: triplet
+
+model_cfg:
+ model: GaitSet
+ in_channels:
+ - 1
+ - 64
+ - 128
+ - 256
+ SeparateFCs:
+ in_channels: 256
+ out_channels: 256
+ parts_num: 62
+ bin_num:
+ - 16
+ - 8
+ - 4
+ - 2
+ - 1
+
+optimizer_cfg:
+ lr: 0.0001
+ momentum: 0.9
+ solver: Adam
+ weight_decay: 0
+
+scheduler_cfg:
+ gamma: 0.1
+ milestones:
+ - 150000
+ scheduler: MultiStepLR
+
+trainer_cfg:
+ enable_float16: true
+ fix_BN: false
+ with_test: true
+ log_iter: 100
+ restore_ckpt_strict: true
+ restore_hint: 0
+ save_iter: 10000
+ save_name: GaitSet
+ sync_BN: false
+ total_iter: 250000
+ sampler:
+ batch_shuffle: false
+ batch_size:
+ - 32
+ - 16
+ frames_num_fixed: 30
+ frames_num_max: 50
+ frames_num_min: 25
+ sample_type: fixed_unordered
+ type: TripletSampler
\ No newline at end of file
diff --git a/config/gln/gln_phase1.yaml b/config/gln/gln_phase1.yaml
index fe7359d..6acc2cb 100644
--- a/config/gln/gln_phase1.yaml
+++ b/config/gln/gln_phase1.yaml
@@ -76,7 +76,7 @@ scheduler_cfg:
trainer_cfg:
enable_distributed: true
enable_float16: true
- fix_BN: false
+ fix_layers: false
with_test: false
log_iter: 100
optimizer_reset: false
diff --git a/config/gln/gln_phase2.yaml b/config/gln/gln_phase2.yaml
index 56171aa..4766864 100644
--- a/config/gln/gln_phase2.yaml
+++ b/config/gln/gln_phase2.yaml
@@ -71,7 +71,7 @@ scheduler_cfg:
trainer_cfg:
enable_distributed: true
enable_float16: true
- fix_BN: false
+ fix_layers: false
log_iter: 100
optimizer_reset: true
scheduler_reset: true
diff --git a/doc/how_to_create_your_model.md b/doc/how_to_create_your_model.md
deleted file mode 100644
index de60051..0000000
--- a/doc/how_to_create_your_model.md
+++ /dev/null
@@ -1,4 +0,0 @@
-# How to Create Your Own Model
-This section of documentation will be **refined in the future**. For now, you can refer these files: [default config](../config/default.yaml), [baseline config](../config/baseline.yaml), [loss aggregator](../lib/modeling/loss_aggregator.py), [base_model](../lib/modeling/base_model.py), and [baseline model](../lib/modeling/models/baseline.py).
-
-Then, you can write your own model in `lib\modeling\models`, and use it in configuration file.
diff --git a/doc/prepare_dataset.md b/doc/prepare_dataset.md
deleted file mode 100644
index 04d1c5f..0000000
--- a/doc/prepare_dataset.md
+++ /dev/null
@@ -1,36 +0,0 @@
-# Prepare dataset
-Suppose you have downloaded the original dataset, we need to preprocess the data and save it as pickle file. Remember to set your path to the root of processed dataset in [config/*.yaml](config/).
-
-## Preprocess
-**CASIA-B**
-
-Download URL: http://www.cbsr.ia.ac.cn/GaitDatasetB-silh.zip
-
-- Original
- ```
- CASIA-B
- 001 (subject)
- bg-01 (type)
- 000 (view)
- 001-bg-01-000-001.png (frame)
- 001-bg-01-000-002.png (frame)
- ......
- ......
- ......
- ......
- ```
-- Run `python misc/pretreatment.py --input_path CASIA-B --output_path CASIA-B-pkl`
-- Processed
- ```
- CASIA-B-pkl
- 001 (subject)
- bg-01 (type)
- 000 (view)
- 000.pkl (contains all frames)
- ......
- ......
- ......
- ```
-
-## Split dataset
-You can use the partition file in [misc/partitions](misc/partitions/) directly, or you can create yours. Remember to set your path to the partition file in [config/*.yaml](config/).
\ No newline at end of file
diff --git a/docs/0.prepare_dataset.md b/docs/0.prepare_dataset.md
new file mode 100644
index 0000000..a50002a
--- /dev/null
+++ b/docs/0.prepare_dataset.md
@@ -0,0 +1,101 @@
+# Prepare dataset
+Once you have downloaded the original dataset, the data need to be preprocessed and saved as pickle files. Remember to set the path to the root of the processed dataset in [config/*.yaml](config/).
+
+## Preprocess
+**CASIA-B**
+
+Download URL: http://www.cbsr.ia.ac.cn/GaitDatasetB-silh.zip
+- Original
+ ```
+ CASIA-B
+ 001 (subject)
+ bg-01 (type)
+ 000 (view)
+ 001-bg-01-000-001.png (frame)
+ 001-bg-01-000-002.png (frame)
+ ......
+ ......
+ ......
+ ......
+ ```
+- Run `python misc/pretreatment.py --input_path CASIA-B --output_path CASIA-B-pkl`
+- Processed
+ ```
+ CASIA-B-pkl
+ 001 (subject)
+ bg-01 (type)
+ 000 (view)
+ 000.pkl (contains all frames)
+ ......
+ ......
+ ......
+ ```
+**OUMVLP**
+
+Step 1: Download URL: http://www.am.sanken.osaka-u.ac.jp/BiometricDB/GaitMVLP.html
+
+Step 2: Unzip the dataset; you will get a directory structure like:
+
+- Original
+ ```
+ OUMVLP-raw
+ Silhouette_000-00 (view-sequence)
+ 00001 (subject)
+ 0001.png (frame)
+ 0002.png (frame)
+ ......
+ 00002
+ 0001.png (frame)
+ 0002.png (frame)
+ ......
+ ......
+ Silhouette_000-01
+ 00001
+ 0001.png (frame)
+ 0002.png (frame)
+ ......
+ 00002
+ 0001.png (frame)
+ 0002.png (frame)
+ ......
+ ......
+ Silhouette_015-00
+ ......
+ Silhouette_015-01
+ ......
+ ......
+ ```
+Step 3: Rearrange the OUMVLP directory into the id-type-view structure by running
+```
+python misc/rearrange_OUMVLP.py --input_path OUMVLP-raw --output_path OUMVLP-rearrange
+```
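Conceptually, this rearrangement is a pure path remapping from `Silhouette_<view>-<seq>/<subject>` to `<subject>/<seq>/<view>`; a minimal Python sketch of that mapping (an illustration, not the repo script) is:

```python
import re

def remap(raw_path):
    # "Silhouette_000-00/00001" -> "00001/00/000" (view-seq/subject -> subject/seq/view)
    m = re.fullmatch(r"Silhouette_(\d{3})-(\d{2})/(\d{5})", raw_path)
    view, seq, subject = m.groups()
    return f"{subject}/{seq}/{view}"

print(remap("Silhouette_015-01/00002"))  # -> 00002/01/015
```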
+
+Step 4: Transform the images into pickle files by running
+```
+python misc/pretreatment.py --input_path OUMVLP-rearrange --output_path OUMVLP-pkl
+```
+
+- Processed
+ ```
+ OUMVLP-pkl
+ 00001 (subject)
+ 00 (sequence)
+ 000 (view)
+ 000.pkl (contains all frames)
+ 015 (view)
+ 015.pkl (contains all frames)
+ ...
+ 01 (sequence)
+ 000 (view)
+ 000.pkl (contains all frames)
+ 015 (view)
+ 015.pkl (contains all frames)
+ ......
+ 00002 (subject)
+ ......
+ ......
+ ```
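As the trees above suggest, pretreatment collapses each view folder of per-frame images into one pickle holding all frames. A minimal sketch of that idea, using synthetic arrays in place of decoded PNGs (the actual logic lives in `misc/pretreatment.py`):

```python
import io
import pickle
import numpy as np

# Stand-ins for the decoded silhouette PNGs of one (subject, seq, view) folder.
frames = [np.zeros((64, 64), dtype=np.uint8) for _ in range(30)]

# Assumed storage format: all frames stacked into one array, dumped as <view>.pkl.
stacked = np.stack(frames)          # shape: (num_frames, H, W)
buf = io.BytesIO()                  # stands in for e.g. open("000.pkl", "wb")
pickle.dump(stacked, buf)

buf.seek(0)
restored = pickle.load(buf)
print(restored.shape)               # -> (30, 64, 64)
```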
+
+
+## Split dataset
+You can use the partition file in [misc/partitions](misc/partitions/) directly, or you can create yours. Remember to set your path to the partition file in [config/*.yaml](config/).
\ No newline at end of file
diff --git a/doc/detailed_config.md b/docs/1.detailed_config.md
similarity index 71%
rename from doc/detailed_config.md
rename to docs/1.detailed_config.md
index 6a9e17c..43f53d3 100644
--- a/doc/detailed_config.md
+++ b/docs/1.detailed_config.md
@@ -4,7 +4,7 @@
* Data configuration
>
> * Args
-> * dataset_name: Dataset name. Only support `CASIA-B`.
+> * dataset_name: Only `CASIA-B` and `OUMVLP` are supported now.
> * dataset_root: The path of storing your dataset.
> * num_workers: The number of workers to collect data.
> * dataset_partition: The path of storing your dataset partition file. It splits the dataset to two parts, including train set and test set.
@@ -15,7 +15,7 @@
### loss_cfg
* Loss function
> * Args
-> * type: Loss function type, support `TripletLoss` and `CrossEntropyLoss`
+> * type: Loss function type, support `TripletLoss` and `CrossEntropyLoss`.
> * loss_term_weights: loss weight.
> * log_prefix: the prefix of loss log.
@@ -23,36 +23,36 @@
### optimizer_cfg
* Optimizer
> * Args
-> * solver: Optimizer type, example: `SGD`, `Adam`
-> * **others**: Please refer to `torch.optim`
+> * solver: Optimizer type, example: `SGD`, `Adam`.
+> * **others**: Please refer to `torch.optim`.
### scheduler_cfg
* Learning rate scheduler
> * Args
-> * scheduler : Learning rate scheduler, example: `MultiStepLR`
-> * **others** : Please refer to `torch.optim.lr_scheduler`
+> * scheduler : Learning rate scheduler, example: `MultiStepLR`.
+> * **others** : Please refer to `torch.optim.lr_scheduler`.
----
### model_cfg
* Model to be trained
> * Args
-> * model : Model type, please refer to [Model Library](../lib/modeling/models) for the supported values
-> * **others** : Please refer to [Training Configuration File of Corresponding Model](../config)
+> * model : Model type, please refer to [Model Library](../lib/modeling/models) for the supported values.
+> * **others** : Please refer to the [Training Configuration File of Corresponding Model](../config).
----
### evaluator_cfg
* Evaluator configuration
> * Args
-> * enable_float16: If `True`, enable auto mixed precision.
-> * restore_ckpt_strict: If `True`, check whether the checkpoint is the same as the model.
-> * restore_hint: `int` value indicates the iteration number of restored checkpoint; `str` value indicates the path of restored checkpoint.
+> * enable_float16: If `True`, enable the auto mixed precision mode.
+> * restore_ckpt_strict: If `True`, check whether the checkpoint is the same as the defined model.
+> * restore_hint: `int` value indicates the iteration number of restored checkpoint; `str` value indicates the path to restored checkpoint.
> * save_name: The name of the experiment.
> * eval_func: The function name of evaluation. For `CASIA-B`, choose `identification`.
> * sampler:
-> - type: The name of sampler. Choose `InferenceSampler`
+> - type: The name of sampler. Choose `InferenceSampler`.
> - sample_type: In general, we use `all_ordered` to input all frames by its natural order, which makes sure the tests are consistent.
-> - batch_size: In general, it should equal to the number of utilized GPU.
+> - batch_size: An `int` value.
> - **others**: Please refer to [data.sampler](../lib/data/sampler.py) and [data.collate_fn](../lib/data/collate_fn.py)
-> * transform: support `BaseSilCuttingTransform`, `BaseSilTransform`. The difference between them is `BaseSilCuttingTransform` cut the pixels on both sides horizontally.
+> * transform: Support `BaseSilCuttingTransform` and `BaseSilTransform`. The difference is that `BaseSilCuttingTransform` cuts out the black pixels on both sides horizontally.
> * metric: `euc` or `cos`, generally, `euc` performs better.
----
@@ -60,25 +60,25 @@
* Trainer configuration
> * Args
> * fix_BN: If `True`, we fix the weight of all `BatchNorm` layers.
-> * log_iter: Every `log_iter` iterations, log the information.
-> * save_iter: Every `save_iter` iterations, save the model.
-> * with_test: If `True`, we test the model every `save_iter` iterations. A bit of performance impact.(*To Be Fixed*)
+> * log_iter: Log the information every `log_iter` iterations.
+> * save_iter: Save the checkpoint every `save_iter` iterations.
+> * with_test: If `True`, we test the model every `save_iter` iterations, with a slight performance impact. (*Disabled by default*)
> * optimizer_reset: If `True` and `restore_hint!=0`, reset the optimizer while restoring the model.
> * scheduler_reset: If `True` and `restore_hint!=0`, reset the scheduler while restoring the model.
-> * sync_BN: If `True`, applies Batch Normalization as described in the paper [Batch Normalization: Accelerating Deep Network Training by Reducing Internal Covariate Shift](https://arxiv.org/abs/1502.03167).
-> * total_iter: The total number of training iterations.
+> * sync_BN: If `True`, applies Batch Normalization synchronously across GPUs.
+> * total_iter: The total number of training iterations, an `int` value.
> * sampler:
-> - type: The name of sampler. Choose `TripletSampler`
+> - type: The name of sampler. Choose `TripletSampler`.
> - sample_type: `[all, fixed, unfixed]` indicates the number of frames used to test, while `[unordered, ordered]` means whether input sequence by its natural order. Example: `fixed_unordered` means selecting fixed number of frames randomly.
-> - batch_size: *[P,K]*\
-> **example**:
+> - batch_size: *[P, K]*, where `P` denotes the number of subjects in a training batch and `K` the number of sequences sampled per subject. **Example**:
> - 8
> - 16
-> - **others**: Please refer to [data.sampler](../lib/data/sampler.py) and [data.collate_fn](../lib/data/collate_fn.py)
-> * **others**: Please refer to `evaluator_cfg`
+> - **others**: Please refer to [data.sampler](../lib/data/sampler.py) and [data.collate_fn](../lib/data/collate_fn.py).
+> * **others**: Please refer to `evaluator_cfg`.
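The *[P, K]* triplet batch described above can be illustrated with a toy sampler (an illustration only, not the repo's `TripletSampler`):

```python
import random

def sample_batch(dataset, p, k):
    # Pick P subjects, then K sequences from each: a batch of P * K sequences.
    subjects = random.sample(sorted(dataset), p)
    return [(s, seq) for s in subjects for seq in random.sample(dataset[s], k)]

# Toy dataset: 20 subjects with 20 sequences each.
toy = {f"{i:03d}": [f"seq{j:02d}" for j in range(20)] for i in range(20)}
batch = sample_batch(toy, p=8, k=16)
print(len(batch))  # -> 128
```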
---
-**Note**: All configuatrarion items will merged into [default.yaml](../config/default.yaml), and the current configuration is preferable.
-
+**Note**:
+- All the config items will be merged into [default.yaml](../config/default.yaml), and the current config takes precedence.
+- The output directory, which contains the log, checkpoint and summary files, depends on the `dataset_name`, `model` and `save_name` settings, i.e. `output/${dataset_name}/${model}/${save_name}`.
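For instance, the output path can be reproduced with a one-liner; the concrete `save_name` value below is only a hypothetical example:

```python
import os.path as osp

# How the output directory is assembled from the config values;
# 'Baseline' as save_name is a hypothetical example value.
dataset_name, model, save_name = 'CASIA-B', 'Baseline', 'Baseline'
output_dir = osp.join('output', dataset_name, model, save_name)
print(output_dir)  # output/CASIA-B/Baseline/Baseline
```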
# Example
```yaml
@@ -102,7 +102,9 @@ evaluator_cfg:
sample_type: all_ordered # all indicates whole sequence used to test, while ordered means input sequence by its natural order; Other options: fixed_unordered
frames_all_limit: 720 # limit the number of sampled frames to prevent out of memory
metric: euc # cos
-
+ transform:
+ - type: BaseSilCuttingTransform
+ img_w: 64
loss_cfg:
- loss_term_weights: 1.0
@@ -128,6 +130,9 @@ model_cfg:
- M
- BC-256
- BC-256
+ # - M
+ # - BC-512
+ # - BC-512
type: Plain
SeparateFCs:
in_channels: 256
@@ -158,7 +163,7 @@ scheduler_cfg:
scheduler: MultiStepLR
trainer_cfg:
   enable_float16: true # half_precision float for memory reduction and speedup
   fix_BN: false
log_iter: 100
restore_ckpt_strict: true
restore_hint: 0
@@ -176,6 +181,8 @@ trainer_cfg:
   frames_num_min: 25 # min frames number for unfixed training
sample_type: fixed_unordered # fixed control input frames number, unordered for controlling order of input tensor; Other options: unfixed_ordered or all_ordered
type: TripletSampler
-
+ transform:
+ - type: BaseSilCuttingTransform
+ img_w: 64
```
diff --git a/docs/2.how_to_create_your_model.md b/docs/2.how_to_create_your_model.md
new file mode 100644
index 0000000..4826e4e
--- /dev/null
+++ b/docs/2.how_to_create_your_model.md
@@ -0,0 +1,86 @@
+# How to Create Your Own Model
+## Pipeline
+
+
+## A new model
+If you want to design a new model, you need to write a class inherited from `BaseModel`, e.g., `NewModel` in `newmodel.py`:
+```python
+from ..base_model import BaseModel
+
+class NewModel(BaseModel):
+ def __init__(self, cfgs, is_training):
+ super().__init__(cfgs, is_training)
+
+ def build_network(self, model_cfg):
+ self.encoder = ...
+
+ def forward(self, inputs):
+ ipts, labs, typs, viws, seqL = inputs
+ sils = ipts[0]
+ if len(sils.size()) == 4:
+ sils = sils.unsqueeze(2)
+ del ipts
+ n, s, c, h, w = sils.size()
+
+ embed_1, logits, embed = self.encoder(sils)
+
+ return {
+ 'training_feat': {
+ 'triplet': {'embeddings': embed_1, 'labels': labs},
+ 'softmax': {'logits': logits, 'labels': labs}
+ },
+ 'visual_summary': {
+ 'image/sils': sils.view(n*s, 1, h, w)
+ },
+ 'inference_feat': {
+ 'embeddings': embed
+ }
+ }
+
+```
+ In your model class, you need to implement at least the `build_network()` and `forward()` functions. The former builds the network and returns nothing, while the latter computes the features and must return a `dict` in the fixed format below:
+
+> `training_feat` is for the loss computing, and it must be a `dict` object.
+>
+> `visual_summary` is for visualization, and it must be a `dict` object.
+>
+> `inference_feat` is for the inference, and it must be a `dict` object.
+>
+> `triplet` and `softmax` are the prefixes (or names) of the loss function.
+>
+> `embeddings`, `logits` and `labels` are the input arguments of the loss function.
+
+For more information, see [base_model.py](../lib/modeling/base_model.py) and [loss_aggregator.py](../lib/modeling/loss_aggregator.py).
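To make the contract concrete, here is a pure-Python sketch of the dispatch described above, with stand-in loss functions and made-up return values; the real implementation lives in [loss_aggregator.py](../lib/modeling/loss_aggregator.py):

```python
# Pure-Python sketch of how `training_feat` is dispatched: each key selects
# a loss by its log_prefix, and the inner dict supplies that loss's keyword
# arguments. The stand-in losses below just return fixed numbers.
def triplet(embeddings, labels):
    return 0.5   # placeholder value

def softmax(logits, labels):
    return 0.25  # placeholder value

loss_funcs = {'triplet': triplet, 'softmax': softmax}

training_feat = {
    'triplet': {'embeddings': [[0.1, 0.2]], 'labels': [0]},
    'softmax': {'logits': [[0.9, 0.1]], 'labels': [0]},
}

loss_sum = sum(loss_funcs[name](**kwargs)
               for name, kwargs in training_feat.items())
print(loss_sum)  # 0.75
```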
+
+After finishing the model file, you have two steps left to do:
+
+**Step 1**: Put your newmodel.py under `lib/modeling/models`.
+
+**Step 2**: Specify the model name in a yaml file:
+```yaml
+model_cfg:
+ model: NewModel
+ param1: ...
+ param2: ...
+ param3: ...
+```
+
+
+## A new loss
+If you want to write a new loss, you need to write a class inherited from `BaseLoss` in `lib/modeling/losses`, like this:
+```python
+from .base import BaseLoss, gather_and_scale_wrapper
+
+class NewLoss(BaseLoss):
+ def __init__(self, *args, **kwargs):
+        super(NewLoss, self).__init__(*args, **kwargs)
+
+ @gather_and_scale_wrapper
+ def forward(self, embeddings, labels):
+ pass
+```
+Remember to use `gather_and_scale_wrapper` to wrap your forward function if your loss is computed by pairs, like `triplet`. This gathers all features onto one GPU card and scales the loss by the number of GPUs.
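The gather-and-scale idea can be illustrated without GPUs at all. The sketch below is a pure-Python stand-in: the real wrapper in `lib/modeling/losses/base.py` uses `torch.distributed`, while here the world size and the all-gather are faked with list repetition, and `toy_pair_loss` is a made-up loss:

```python
import functools

# Fake "distributed" setup: pretend there are two cards and that
# every card holds the same batch, so gathering doubles the inputs.
WORLD_SIZE = 2

def ddp_all_gather(v):
    # stand-in for the real torch.distributed all-gather
    return v * WORLD_SIZE

def gather_and_scale_wrapper(func):
    @functools.wraps(func)
    def inner(*args, **kwds):
        kwds = {k: ddp_all_gather(v) for k, v in kwds.items()}
        loss, loss_info = func(*args, **kwds)
        return loss * WORLD_SIZE, loss_info
    return inner

@gather_and_scale_wrapper
def toy_pair_loss(embeddings=None, labels=None):
    # stand-in pair-based loss: mean absolute embedding value
    vals = [abs(x) for x in embeddings]
    return sum(vals) / len(vals), {'pairs': len(vals)}

loss, info = toy_pair_loss(embeddings=[1.0, -1.0], labels=[0, 1])
print(loss, info)  # 2.0 {'pairs': 4}
```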
+
+Then, put your loss in `lib/modeling/losses` so that you can use it in the config file.
+
+Moreover, refer to [loss_aggregator.py](../lib/modeling/loss_aggregator.py) to see how your defined loss works in the model.
diff --git a/docs/3.advanced_usages.md b/docs/3.advanced_usages.md
new file mode 100644
index 0000000..b194437
--- /dev/null
+++ b/docs/3.advanced_usages.md
@@ -0,0 +1,88 @@
+# Advanced Usages
+### Cross-Dataset Evaluation
+> You can conduct cross-dataset evaluation by just modifying several arguments in your [data_cfg](../config/baseline.yaml#L1).
+>
+> Take [baseline.yaml](../config/baseline.yaml) as an example:
+> ```yaml
+> data_cfg:
+> dataset_name: CASIA-B
+> dataset_root: your_path
+> dataset_partition: ./misc/partitions/CASIA-B_include_005.json
+> num_workers: 1
+> remove_no_gallery: false # Remove probe if no gallery for it
+> test_dataset_name: CASIA-B
+> ```
+> Now, suppose we get the model trained on [CASIA-B](http://www.cbsr.ia.ac.cn/english/Gait%20Databases.asp), and then we want to test it on [OUMVLP](http://www.am.sanken.osaka-u.ac.jp/BiometricDB/GaitMVLP.html).
+>
+> We should alter the `dataset_root`, `dataset_partition` and `test_dataset_name`, just like:
+> ```yaml
+> data_cfg:
+> dataset_name: CASIA-B
+> dataset_root: your_OUMVLP_path
+> dataset_partition: ./misc/partitions/OUMVLP.json
+> num_workers: 1
+> remove_no_gallery: false # Remove probe if no gallery for it
+> test_dataset_name: OUMVLP
+> ```
+---
+
+
+### Data Augmentation
+> In OpenGait, there is a basic transform class called by almost all the models, [BaseSilCuttingTransform](../lib/data/transform.py#L20), which is used to cut the input silhouettes.
+>
+> Accordingly, by referring to this implementation, you can easily customize the data augmentation in just two steps:
+> * *Step1*: Define the transform function or class in [transform.py](../lib/data/transform.py), and make sure it is callable. The style of [torchvision.transforms](https://pytorch.org/vision/stable/_modules/torchvision/transforms/transforms.html) is recommended; the following shows a demo:
+>> ```python
+>> import torchvision.transforms as T
+>> class demo1():
+>> def __init__(self, args):
+>> pass
+>>
+>> def __call__(self, seqs):
+>> '''
+>> seqs: with dimension of [sequence, height, width]
+>> '''
+>> pass
+>> return seqs
+>>
+>> class demo2():
+>> def __init__(self, args):
+>> pass
+>>
+>> def __call__(self, seqs):
+>> pass
+>> return seqs
+>>
+>> def TransformDemo(base_args, demo1_args, demo2_args):
+>> transform = T.Compose([
+>> BaseSilCuttingTransform(**base_args),
+>> demo1(args=demo1_args),
+>> demo2(args=demo2_args)
+>> ])
+>> return transform
+>> ```
+> * *Step2*: Reset the [`transform`](../config/baseline.yaml#L100) arguments in your config file:
+>> ```yaml
+>> transform:
+>> - type: TransformDemo
+>> base_args: {'img_w': 64}
+>> demo1_args: false
+>> demo2_args: false
+>> ```
+
+### Visualization
+> To understand how the model works, you sometimes need to visualize the intermediate results.
+>
+> For this purpose, we have defined a built-in instantiation of [`torch.utils.tensorboard.SummaryWriter`](https://pytorch.org/docs/stable/tensorboard.html), namely [`self.msg_mgr.writer`](../lib/utils/msg_manager.py#L24), so that you can log intermediate information anywhere you want.
+>
+> Demo: if we want to visualize the output feature of [baseline's backbone](../lib/modeling/models/baseline.py#L27), we could just insert the following code at [baseline.py#L28](../lib/modeling/models/baseline.py#L28):
+>> ```python
+>> summary_writer = self.msg_mgr.writer
+>> if torch.distributed.get_rank() == 0 and self.training and self.iteration % 100==0:
+>> summary_writer.add_video('outs', outs.mean(2).unsqueeze(2), self.iteration)
+>> ```
+> Note that this example requires the [`moviepy`](https://github.com/Zulko/moviepy) package, and hence you should run `pip install moviepy` first.
\ No newline at end of file
diff --git a/lib/data/dataset.py b/lib/data/dataset.py
index 59d9825..a9ceca8 100644
--- a/lib/data/dataset.py
+++ b/lib/data/dataset.py
@@ -52,14 +52,14 @@ class DataSet(tordata.Dataset):
def __getitem__(self, idx):
if not self.cache:
- data_lst = self.__loader__(self.seqs_info[idx][-1])
+ data_list = self.__loader__(self.seqs_info[idx][-1])
elif self.seqs_data[idx] is None:
- data_lst = self.__loader__(self.seqs_info[idx][-1])
- self.seqs_data[idx] = data_lst
+ data_list = self.__loader__(self.seqs_info[idx][-1])
+ self.seqs_data[idx] = data_list
else:
- data_lst = self.seqs_data[idx]
+ data_list = self.seqs_data[idx]
seq_info = self.seqs_info[idx]
- return data_lst, seq_info
+ return data_list, seq_info
def __load_all_data(self):
for idx in range(len(self)):
diff --git a/lib/modeling/backbones/plain.py b/lib/modeling/backbones/plain.py
index 926e9d0..e66da97 100644
--- a/lib/modeling/backbones/plain.py
+++ b/lib/modeling/backbones/plain.py
@@ -1,7 +1,28 @@
+"""The plain backbone.
+
+  The plain backbone contains only the BasicConv2d, FocalConv2d, MaxPool2d and LeakyReLU layers.
+"""
+
import torch.nn as nn
from ..modules import BasicConv2d, FocalConv2d
+
class Plain(nn.Module):
+ """
+ The Plain backbone class.
+
+    An implicit LeakyReLU is appended to each layer except max pooling.
+    The kernel size, stride and padding of the first convolution layer are 5, 1, 2; those of the other layers are 3, 1, 1.
+
+    Typical usage:
+    - BC-64: Basic conv2d with output channel 64. The input channel is the output channel of the previous layer.
+
+    - M: nn.MaxPool2d(kernel_size=2, stride=2).
+
+    - FC-128-1: Focal conv2d with output channel 128 and halving 1 (divided into 2^1=2 parts).
+
+ Use it in your configuration file.
+ """
def __init__(self, layers_cfg, in_channels=1):
super(Plain, self).__init__()
@@ -13,9 +34,11 @@ class Plain(nn.Module):
def forward(self, seqs):
out = self.feature(seqs)
return out
-
- # torchvision/models/vgg.py
+
def make_layers(self):
+ """
+ Reference: torchvision/models/vgg.py
+ """
def get_layer(cfg, in_c, kernel_size, stride, padding):
cfg = cfg.split('-')
typ = cfg[0]
@@ -27,7 +50,8 @@ class Plain(nn.Module):
return BasicConv2d(in_c, out_c, kernel_size=kernel_size, stride=stride, padding=padding)
return FocalConv2d(in_c, out_c, kernel_size=kernel_size, stride=stride, padding=padding, halving=int(cfg[2]))
- Layers = [get_layer(self.layers_cfg[0], self.in_channels, 5, 1, 2), nn.LeakyReLU(inplace=True)]
+ Layers = [get_layer(self.layers_cfg[0], self.in_channels,
+ 5, 1, 2), nn.LeakyReLU(inplace=True)]
in_c = int(self.layers_cfg[0].split('-')[1])
for cfg in self.layers_cfg[1:]:
if cfg == 'M':
@@ -37,6 +61,3 @@ class Plain(nn.Module):
Layers += [conv2d, nn.LeakyReLU(inplace=True)]
in_c = int(cfg.split('-')[1])
return nn.Sequential(*Layers)
-
-
-
diff --git a/lib/modeling/base_model.py b/lib/modeling/base_model.py
index 962457a..7b7a0df 100644
--- a/lib/modeling/base_model.py
+++ b/lib/modeling/base_model.py
@@ -1,3 +1,14 @@
+"""The base model definition.
+
+This module defines the abstract meta model class and base model class. In the base model,
+ we define the basic model functions, like get_loader, build_network, and run_train, etc.
+ The APIs of the base model are run_train and run_test, which are used in `lib/main.py`.
+
+Typical usage:
+
+BaseModel.run_train(model)
+BaseModel.run_test(model)
+"""
import torch
import numpy as np
import os.path as osp
@@ -13,7 +24,6 @@ from abc import abstractmethod
from . import backbones
from .loss_aggregator import LossAggregator
-from modeling.modules import fix_BN
from data.transform import get_transform
from data.collate_fn import CollateFn
from data.dataset import DataSet
@@ -28,80 +38,97 @@ __all__ = ['BaseModel']
class MetaModel(metaclass=ABCMeta):
+ """The necessary functions for the base model.
+    This class defines the necessary functions for the base model; they are all implemented in the base model.
+ """
@abstractmethod
def get_loader(self, data_cfg):
- '''
- Build your data Loader here.
- Inputs: data_cfg, dict
- Return: Loader
- '''
+ """Based on the given data_cfg, we get the data loader."""
raise NotImplementedError
@abstractmethod
def build_network(self, model_cfg):
- '''
- Build your Model here.
- Inputs: model_cfg, dict
- Return: Network, nn.Module(s)
- '''
+ """Build your network here."""
raise NotImplementedError
@abstractmethod
def init_parameters(self):
+ """Initialize the parameters of your network."""
raise NotImplementedError
@abstractmethod
def get_optimizer(self, optimizer_cfg):
- '''
- Build your Optimizer here.
- Inputs: optimizer_cfg, dict
- Return: Optimizer, a optimizer object
- '''
+ """Based on the given optimizer_cfg, we get the optimizer."""
raise NotImplementedError
@abstractmethod
def get_scheduler(self, scheduler_cfg):
- '''
- Build your Scheduler.
- Inputs: scheduler_cfg, dict
- Optimizer, your optimizer
- Return: Scheduler, a scheduler object
- '''
+ """Based on the given scheduler_cfg, we get the scheduler."""
raise NotImplementedError
@abstractmethod
def save_ckpt(self, iteration):
+ """Save the checkpoint, including model parameter, optimizer and scheduler."""
raise NotImplementedError
@abstractmethod
def resume_ckpt(self, restore_hint):
+ """Resume the model from the checkpoint, including model parameter, optimizer and scheduler."""
raise NotImplementedError
@abstractmethod
def inputs_pretreament(self, inputs):
+ """Transform the input data based on transform setting."""
raise NotImplementedError
@abstractmethod
def train_step(self, loss_num) -> bool:
+ """Do one training step."""
raise NotImplementedError
@abstractmethod
def inference(self):
+ """Do inference (calculate features.)."""
raise NotImplementedError
@abstractmethod
def run_train(model):
+ """Run a whole train schedule."""
raise NotImplementedError
@abstractmethod
def run_test(model):
+ """Run a whole test schedule."""
raise NotImplementedError
class BaseModel(MetaModel, nn.Module):
+ """Base model.
+
+    This class inherits the MetaModel class, and implements the basic model functions, like get_loader, build_network, etc.
+
+    Attributes:
+        msg_mgr: the message manager.
+        cfgs: the configs.
+        iteration: the current iteration of the model.
+        engine_cfg: the configs of the engine (train or test).
+        save_path: the path to save the checkpoints.
+
+ """
def __init__(self, cfgs, training):
+ """Initialize the base model.
+
+ Complete the model initialization, including the data loader, the network, the optimizer, the scheduler, the loss.
+
+ Args:
+ cfgs:
+ All of the configs.
+ training:
+ Whether the model is in training mode.
+ """
+
super(BaseModel, self).__init__()
self.msg_mgr = get_msg_mgr()
self.cfgs = cfgs
@@ -132,8 +159,6 @@ class BaseModel(MetaModel, nn.Module):
"cuda", self.device))
if training:
- if cfgs['trainer_cfg']['fix_BN']:
- fix_BN(self)
self.loss_aggregator = LossAggregator(cfgs['loss_cfg'])
self.optimizer = self.get_optimizer(self.cfgs['optimizer_cfg'])
self.scheduler = self.get_scheduler(cfgs['scheduler_cfg'])
@@ -142,7 +167,12 @@ class BaseModel(MetaModel, nn.Module):
if restore_hint != 0:
self.resume_ckpt(restore_hint)
+ if training:
+ if cfgs['trainer_cfg']['fix_BN']:
+ self.fix_BN()
+
def get_backbone(self, model_cfg):
+ """Get the backbone of the model."""
def _get_backbone(backbone_cfg):
if is_dict(backbone_cfg):
Backbone = get_attr_from([backbones], backbone_cfg['type'])
@@ -266,7 +296,20 @@ class BaseModel(MetaModel, nn.Module):
"Error type for -Restore_Hint-, supported: int or string.")
self._load_ckpt(save_name)
+ def fix_BN(self):
+ for module in self.modules():
+ classname = module.__class__.__name__
+ if classname.find('BatchNorm') != -1:
+ module.eval()
+
def inputs_pretreament(self, inputs):
+ """Conduct transforms on input data.
+
+ Args:
+ inputs: the input data.
+ Returns:
+ tuple: training data including inputs, labels, and some meta data.
+ """
seqs_batch, labs_batch, typs_batch, vies_batch, seqL_batch = inputs
trf_cfgs = self.engine_cfg['transform']
seq_trfs = get_transform(trf_cfgs)
@@ -293,9 +336,13 @@ class BaseModel(MetaModel, nn.Module):
return ipts, labs, typs, vies, seqL
def train_step(self, loss_sum) -> bool:
- '''
- Conduct loss_sum.backward(), self.optimizer.step() and self.scheduler.step().
- '''
+ """Conduct loss_sum.backward(), self.optimizer.step() and self.scheduler.step().
+
+ Args:
+            loss_sum: The loss of the current batch.
+ Returns:
+ bool: True if the training is finished, False otherwise.
+ """
self.optimizer.zero_grad()
if loss_sum <= 1e-9:
@@ -322,6 +369,13 @@ class BaseModel(MetaModel, nn.Module):
return True
def inference(self, rank):
+ """Inference all the test data.
+
+ Args:
+            rank: the rank of the current process.
+ Returns:
+ Odict: contains the inference results.
+ """
total_size = len(self.test_loader)
if rank == 0:
pbar = tqdm(total=total_size, desc='Transforming')
@@ -355,9 +409,7 @@ class BaseModel(MetaModel, nn.Module):
@ staticmethod
def run_train(model):
- '''
- Accept the instance object(model) here, and then run the train loop handler.
- '''
+ """Accept the instance object(model) here, and then run the train loop."""
for inputs in model.train_loader:
ipts = model.inputs_pretreament(inputs)
with autocast(enabled=model.engine_cfg['enable_float16']):
@@ -390,6 +442,8 @@ class BaseModel(MetaModel, nn.Module):
@ staticmethod
def run_test(model):
+ """Accept the instance object(model) here, and then run the test loop."""
+
rank = torch.distributed.get_rank()
with torch.no_grad():
info_dict = model.inference(rank)
diff --git a/lib/modeling/loss_aggregator.py b/lib/modeling/loss_aggregator.py
index 0c67edc..7ccebb6 100644
--- a/lib/modeling/loss_aggregator.py
+++ b/lib/modeling/loss_aggregator.py
@@ -1,3 +1,5 @@
+"""The loss aggregator."""
+
import torch
from . import losses
from utils import is_dict, get_attr_from, get_valid_args, is_tensor, get_ddp_module
@@ -6,18 +8,48 @@ from utils import get_msg_mgr
class LossAggregator():
+ """The loss aggregator.
+
+ This class is used to aggregate the losses.
+    For example, if you have two losses, one is the triplet loss and the other is the cross-entropy loss,
+    you can aggregate them as follows:
+        loss_sum = triplet_loss + cross_entropy_loss
+
+ Attributes:
+ losses: A dict of losses.
+ """
+
def __init__(self, loss_cfg) -> None:
+ """
+ Initialize the loss aggregator.
+
+ Args:
+ loss_cfg: Config of losses. List for multiple losses.
+ """
self.losses = {loss_cfg['log_prefix']: self._build_loss_(loss_cfg)} if is_dict(loss_cfg) \
else {cfg['log_prefix']: self._build_loss_(cfg) for cfg in loss_cfg}
def _build_loss_(self, loss_cfg):
+ """Build the losses from loss_cfg.
+
+ Args:
+ loss_cfg: Config of loss.
+ """
Loss = get_attr_from([losses], loss_cfg['type'])
valid_loss_arg = get_valid_args(
- Loss, loss_cfg, ['type', 'pair_based_loss'])
- loss = get_ddp_module(Loss(**valid_loss_arg))
+ Loss, loss_cfg, ['type', 'gather_and_scale'])
+ loss = get_ddp_module(Loss(**valid_loss_arg).cuda())
return loss
def __call__(self, training_feats):
+ """Compute the sum of all losses.
+
+        The input is a dict of features. The key is the name of a loss and the value holds its features and labels. If a key is not among the
+        built losses and its value is a torch.Tensor, the value is treated as a precomputed loss and added to loss_sum.
+
+ Args:
+ training_feats: A dict of features. The same as the output["training_feat"] of the model.
+ """
loss_sum = .0
loss_info = Odict()
@@ -28,14 +60,12 @@ class LossAggregator():
for name, value in info.items():
loss_info['scalar/%s/%s' % (k, name)] = value
loss = loss.mean() * loss_func.loss_term_weights
- if loss_func.pair_based_loss:
- loss = loss * torch.distributed.get_world_size()
loss_sum += loss
else:
if isinstance(v, dict):
raise ValueError(
- "The key %s in -Trainng-Feat- should be stated as the log_prefix of a certain loss defined in your loss_cfg."
+                    "The key %s in -Training-Feat- should be stated as the log_prefix of a certain loss defined in your loss_cfg." % k
)
elif is_tensor(v):
_ = v.mean()
diff --git a/lib/modeling/losses/base.py b/lib/modeling/losses/base.py
index 0578c60..fbdd7d6 100644
--- a/lib/modeling/losses/base.py
+++ b/lib/modeling/losses/base.py
@@ -1,13 +1,54 @@
+from ctypes import ArgumentError
import torch.nn as nn
+import torch
from utils import Odict
+import functools
+from utils import ddp_all_gather
-class BasicLoss(nn.Module):
- def __init__(self, loss_term_weights=1.0):
- super(BasicLoss, self).__init__()
- self.loss_term_weights = loss_term_weights
- self.pair_based_loss = True
- self.info = Odict()
-
+def gather_and_scale_wrapper(func):
+    """Internal wrapper: gather the input from multiple cards to one card, and scale the loss by the number of cards.
+ """
+
+ @functools.wraps(func)
+ def inner(*args, **kwds):
+ try:
+
+ for k, v in kwds.items():
+ kwds[k] = ddp_all_gather(v)
+
+ loss, loss_info = func(*args, **kwds)
+ loss *= torch.distributed.get_world_size()
+ return loss, loss_info
+ except:
+ raise ArgumentError
+ return inner
+
+
+class BaseLoss(nn.Module):
+ """
+ Base class for all losses.
+
+ Your loss should also subclass this class.
+
+    Attributes:
+ loss_term_weights: the weight of the loss.
+ info: the loss info.
+ """
+ loss_term_weights = 1.0
+ info = Odict()
+
def forward(self, logits, labels):
- raise NotImplementedError
+ """
+ The default forward function.
+
+ This function should be overridden by the subclass.
+
+ Args:
+ logits: the logits of the model.
+ labels: the labels of the data.
+
+ Returns:
+ tuple of loss and info.
+ """
+ return .0, self.info
diff --git a/lib/modeling/losses/softmax.py b/lib/modeling/losses/softmax.py
index 99819bf..ddf2293 100644
--- a/lib/modeling/losses/softmax.py
+++ b/lib/modeling/losses/softmax.py
@@ -1,10 +1,10 @@
import torch
import torch.nn.functional as F
-from .base import BasicLoss
+from .base import BaseLoss
-class CrossEntropyLoss(BasicLoss):
+class CrossEntropyLoss(BaseLoss):
def __init__(self, scale=2**4, label_smooth=True, eps=0.1, loss_term_weights=1.0, log_accuracy=False):
super(CrossEntropyLoss, self).__init__()
self.scale = scale
@@ -13,7 +13,6 @@ class CrossEntropyLoss(BasicLoss):
self.log_accuracy = log_accuracy
self.loss_term_weights = loss_term_weights
- self.pair_based_loss = False
def forward(self, logits, labels):
"""
@@ -26,7 +25,7 @@ class CrossEntropyLoss(BasicLoss):
one_hot_labels = self.label2one_hot(
labels, c).unsqueeze(0).repeat(p, 1, 1) # [p, n, c]
loss = self.compute_loss(log_preds, one_hot_labels)
- self.info.update({'loss': loss})
+ self.info.update({'loss': loss.detach().clone()})
if self.log_accuracy:
pred = logits.argmax(dim=-1) # [p, n]
accu = (pred == labels.unsqueeze(0)).float().mean()
diff --git a/lib/modeling/losses/triplet.py b/lib/modeling/losses/triplet.py
index ef0b521..4b5dac2 100644
--- a/lib/modeling/losses/triplet.py
+++ b/lib/modeling/losses/triplet.py
@@ -1,22 +1,19 @@
import torch
import torch.nn.functional as F
-from .base import BasicLoss
-from utils import ddp_all_gather
+from .base import BaseLoss, gather_and_scale_wrapper
-class TripletLoss(BasicLoss):
+class TripletLoss(BaseLoss):
def __init__(self, margin, loss_term_weights=1.0):
super(TripletLoss, self).__init__()
self.margin = margin
self.loss_term_weights = loss_term_weights
- self.pair_based_loss = True
+ @gather_and_scale_wrapper
def forward(self, embeddings, labels):
# embeddings: [n, p, c], label: [n]
- embeddings = ddp_all_gather(embeddings)
- labels = ddp_all_gather(labels)
embeddings = embeddings.permute(
1, 0, 2).contiguous() # [n, p, c] -> [p, n, c]
embeddings = embeddings.float()
@@ -32,10 +29,10 @@ class TripletLoss(BasicLoss):
loss_avg, loss_num = self.AvgNonZeroReducer(loss)
self.info.update({
- 'loss': loss_avg,
- 'hard_loss': hard_loss,
- 'loss_num': loss_num,
- 'mean_dist': mean_dist})
+ 'loss': loss_avg.detach().clone(),
+ 'hard_loss': hard_loss.detach().clone(),
+ 'loss_num': loss_num.detach().clone(),
+ 'mean_dist': mean_dist.detach().clone()})
return loss_avg, self.info
diff --git a/lib/modeling/models/gaitgl.py b/lib/modeling/models/gaitgl.py
index 6a0bc05..7057d8b 100644
--- a/lib/modeling/models/gaitgl.py
+++ b/lib/modeling/models/gaitgl.py
@@ -63,8 +63,8 @@ class GeMHPP(nn.Module):
class GaitGL(BaseModel):
"""
- Title: Gait Recognition via Effective Global-Local Feature Representation and Local Temporal Aggregation
- ICCV2021: https://openaccess.thecvf.com/content/ICCV2021/papers/Lin_Gait_Recognition_via_Effective_Global-Local_Feature_Representation_and_Local_Temporal_ICCV_2021_paper.pdf
+ GaitGL: Gait Recognition via Effective Global-Local Feature Representation and Local Temporal Aggregation
+ Arxiv : https://arxiv.org/pdf/2011.01461.pdf
"""
def __init__(self, *args, **kargs):
@@ -73,31 +73,71 @@ class GaitGL(BaseModel):
def build_network(self, model_cfg):
in_c = model_cfg['channels']
class_num = model_cfg['class_num']
+ dataset_name = self.cfgs['data_cfg']['dataset_name']
- # For CASIA-B
- self.conv3d = nn.Sequential(
- BasicConv3d(1, in_c[0], kernel_size=(3, 3, 3),
- stride=(1, 1, 1), padding=(1, 1, 1)),
- nn.LeakyReLU(inplace=True)
- )
- self.LTA = nn.Sequential(
- BasicConv3d(in_c[0], in_c[0], kernel_size=(
- 3, 1, 1), stride=(3, 1, 1), padding=(0, 0, 0)),
- nn.LeakyReLU(inplace=True)
- )
+ if dataset_name == 'OUMVLP':
+ # For OUMVLP
+ self.conv3d = nn.Sequential(
+ BasicConv3d(1, in_c[0], kernel_size=(3, 3, 3),
+ stride=(1, 1, 1), padding=(1, 1, 1)),
+ nn.LeakyReLU(inplace=True),
+ BasicConv3d(in_c[0], in_c[0], kernel_size=(3, 3, 3),
+ stride=(1, 1, 1), padding=(1, 1, 1)),
+ nn.LeakyReLU(inplace=True),
+ )
+ self.LTA = nn.Sequential(
+ BasicConv3d(in_c[0], in_c[0], kernel_size=(
+ 3, 1, 1), stride=(3, 1, 1), padding=(0, 0, 0)),
+ nn.LeakyReLU(inplace=True)
+ )
- self.GLConvA0 = GLConv(in_c[0], in_c[1], halving=3, fm_sign=False, kernel_size=(
- 3, 3, 3), stride=(1, 1, 1), padding=(1, 1, 1))
- self.MaxPool0 = nn.MaxPool3d(kernel_size=(1, 2, 2), stride=(1, 2, 2))
+ self.GLConvA0 = nn.Sequential(
+ GLConv(in_c[0], in_c[1], halving=1, fm_sign=False, kernel_size=(
+ 3, 3, 3), stride=(1, 1, 1), padding=(1, 1, 1)),
+ GLConv(in_c[1], in_c[1], halving=1, fm_sign=False, kernel_size=(
+ 3, 3, 3), stride=(1, 1, 1), padding=(1, 1, 1)),
+ )
+ self.MaxPool0 = nn.MaxPool3d(
+ kernel_size=(1, 2, 2), stride=(1, 2, 2))
- self.GLConvA1 = GLConv(in_c[1], in_c[2], halving=3, fm_sign=False, kernel_size=(
- 3, 3, 3), stride=(1, 1, 1), padding=(1, 1, 1))
- self.GLConvB2 = GLConv(in_c[2], in_c[2], halving=3, fm_sign=True, kernel_size=(
- 3, 3, 3), stride=(1, 1, 1), padding=(1, 1, 1))
+ self.GLConvA1 = nn.Sequential(
+ GLConv(in_c[1], in_c[2], halving=1, fm_sign=False, kernel_size=(
+ 3, 3, 3), stride=(1, 1, 1), padding=(1, 1, 1)),
+ GLConv(in_c[2], in_c[2], halving=1, fm_sign=False, kernel_size=(
+ 3, 3, 3), stride=(1, 1, 1), padding=(1, 1, 1)),
+ )
+ self.GLConvB2 = nn.Sequential(
+ GLConv(in_c[2], in_c[3], halving=1, fm_sign=False, kernel_size=(
+ 3, 3, 3), stride=(1, 1, 1), padding=(1, 1, 1)),
+ GLConv(in_c[3], in_c[3], halving=1, fm_sign=True, kernel_size=(
+ 3, 3, 3), stride=(1, 1, 1), padding=(1, 1, 1)),
+ )
+ else:
+ # For CASIA-B or other unstated datasets.
+ self.conv3d = nn.Sequential(
+ BasicConv3d(1, in_c[0], kernel_size=(3, 3, 3),
+ stride=(1, 1, 1), padding=(1, 1, 1)),
+ nn.LeakyReLU(inplace=True)
+ )
+ self.LTA = nn.Sequential(
+ BasicConv3d(in_c[0], in_c[0], kernel_size=(
+ 3, 1, 1), stride=(3, 1, 1), padding=(0, 0, 0)),
+ nn.LeakyReLU(inplace=True)
+ )
- self.Head0 = SeparateFCs(64, in_c[2], in_c[2])
- self.Bn = nn.BatchNorm1d(in_c[2])
- self.Head1 = SeparateFCs(64, in_c[2], class_num)
+ self.GLConvA0 = GLConv(in_c[0], in_c[1], halving=3, fm_sign=False, kernel_size=(
+ 3, 3, 3), stride=(1, 1, 1), padding=(1, 1, 1))
+ self.MaxPool0 = nn.MaxPool3d(
+ kernel_size=(1, 2, 2), stride=(1, 2, 2))
+
+ self.GLConvA1 = GLConv(in_c[1], in_c[2], halving=3, fm_sign=False, kernel_size=(
+ 3, 3, 3), stride=(1, 1, 1), padding=(1, 1, 1))
+ self.GLConvB2 = GLConv(in_c[2], in_c[2], halving=3, fm_sign=True, kernel_size=(
+ 3, 3, 3), stride=(1, 1, 1), padding=(1, 1, 1))
+
+ self.Head0 = SeparateFCs(64, in_c[-1], in_c[-1])
+ self.Bn = nn.BatchNorm1d(in_c[-1])
+ self.Head1 = SeparateFCs(64, in_c[-1], class_num)
self.TP = PackSequenceWrapper(torch.max)
self.HPP = GeMHPP()
@@ -105,7 +145,9 @@ class GaitGL(BaseModel):
def forward(self, inputs):
ipts, labs, _, _, seqL = inputs
seqL = None if not self.training else seqL
-
+ if not self.training and len(labs) != 1:
+ raise ValueError(
+ 'The input size of each GPU must be 1 in testing mode, but got {}!'.format(len(labs)))
sils = ipts[0].unsqueeze(1)
del ipts
n, _, s, h, w = sils.size()
diff --git a/lib/modeling/modules.py b/lib/modeling/modules.py
index e9e1487..34c5663 100644
--- a/lib/modeling/modules.py
+++ b/lib/modeling/modules.py
@@ -191,10 +191,3 @@ def RmBN2dAffine(model):
if isinstance(m, nn.BatchNorm2d):
m.weight.requires_grad = False
m.bias.requires_grad = False
-
-
-def fix_BN(model):
- for module in model.modules():
- classname = module.__class__.__name__
- if classname.find('BatchNorm2d') != -1:
- module.eval()
diff --git a/lib/utils/__init__.py b/lib/utils/__init__.py
index 211d424..8f72cd7 100644
--- a/lib/utils/__init__.py
+++ b/lib/utils/__init__.py
@@ -1,7 +1,7 @@
from .common import get_ddp_module, ddp_all_gather
from .common import Odict, Ntuple
from .common import get_valid_args
-from .common import is_list_or_tuple, is_str, is_list, is_dict, is_tensor, is_array, config_loader, init_seeds, handler, params_count
+from .common import is_list_or_tuple, is_bool, is_str, is_list, is_dict, is_tensor, is_array, config_loader, init_seeds, handler, params_count
from .common import ts2np, ts2var, np2var, list2var
from .common import mkdir, clones
from .common import MergeCfgsDict
diff --git a/lib/utils/common.py b/lib/utils/common.py
index f52fb15..72b75a7 100644
--- a/lib/utils/common.py
+++ b/lib/utils/common.py
@@ -74,6 +74,10 @@ def is_list_or_tuple(x):
return isinstance(x, (list, tuple))
+def is_bool(x):
+ return isinstance(x, bool)
+
+
def is_str(x):
return isinstance(x, str)
diff --git a/misc/clean_process.sh b/misc/clean_process.sh
new file mode 100644
index 0000000..c3df0ae
--- /dev/null
+++ b/misc/clean_process.sh
@@ -0,0 +1 @@
+kill $(ps aux | grep main.py | grep -v grep | awk '{print $2}')
diff --git a/misc/partitions/OUMVLP.json b/misc/partitions/OUMVLP.json
new file mode 100644
index 0000000..d701f22
--- /dev/null
+++ b/misc/partitions/OUMVLP.json
@@ -0,0 +1,10313 @@
+{
+ "TRAIN_SET": [
+ "00001",
+ "00003",
+ "00005",
+ "00007",
+ "00009",
+ "00011",
+ "00013",
+ "00015",
+ "00017",
+ "00019",
+ "00021",
+ "00023",
+ "00025",
+ "00027",
+ "00029",
+ "00031",
+ "00033",
+ "00035",
+ "00037",
+ "00039",
+ "00041",
+ "00043",
+ "00045",
+ "00047",
+ "00049",
+ "00051",
+ "00053",
+ "00055",
+ "00057",
+ "00059",
+ "00061",
+ "00063",
+ "00065",
+ "00067",
+ "00069",
+ "00071",
+ "00073",
+ "00075",
+ "00077",
+ "00079",
+ "00081",
+ "00083",
+ "00085",
+ "00087",
+ "00089",
+ "00091",
+ "00093",
+ "00095",
+ "00097",
+ "00099",
+ "00101",
+ "00103",
+ "00105",
+ "00107",
+ "00109",
+ "00111",
+ "00113",
+ "00115",
+ "00117",
+ "00119",
+ "00121",
+ "00123",
+ "00125",
+ "00127",
+ "00129",
+ "00131",
+ "00133",
+ "00135",
+ "00137",
+ "00139",
+ "00141",
+ "00143",
+ "00145",
+ "00147",
+ "00149",
+ "00151",
+ "00153",
+ "00155",
+ "00157",
+ "00159",
+ "00161",
+ "00163",
+ "00165",
+ "00167",
+ "00169",
+ "00171",
+ "00173",
+ "00175",
+ "00177",
+ "00179",
+ "00181",
+ "00183",
+ "00185",
+ "00187",
+ "00189",
+ "00191",
+ "00193",
+ "00195",
+ "00197",
+ "00199",
+ "00201",
+ "00203",
+ "00205",
+ "00207",
+ "00209",
+ "00211",
+ "00213",
+ "00215",
+ "00217",
+ "00219",
+ "00221",
+ "00223",
+ "00225",
+ "00227",
+ "00229",
+ "00231",
+ "00233",
+ "00235",
+ "00237",
+ "00239",
+ "00241",
+ "00243",
+ "00245",
+ "00247",
+ "00249",
+ "00251",
+ "00253",
+ "00255",
+ "00257",
+ "00259",
+ "00261",
+ "00263",
+ "00265",
+ "00267",
+ "00269",
+ "00271",
+ "00273",
+ "00275",
+ "00277",
+ "00279",
+ "00281",
+ "00283",
+ "00285",
+ "00287",
+ "00289",
+ "00291",
+ "00293",
+ "00295",
+ "00297",
+ "00299",
+ "00301",
+ "00303",
+ "00305",
+ "00307",
+ "00309",
+ "00311",
+ "00313",
+ "00315",
+ "00317",
+ "00319",
+ "00321",
+ "00323",
+ "00325",
+ "00327",
+ "00329",
+ "00331",
+ "00333",
+ "00335",
+ "00337",
+ "00339",
+ "00341",
+ "00343",
+ "00345",
+ "00347",
+ "00349",
+ "00351",
+ "00353",
+ "00355",
+ "00357",
+ "00359",
+ "00361",
+ "00363",
+ "00365",
+ "00367",
+ "00369",
+ "00371",
+ "00373",
+ "00375",
+ "00377",
+ "00379",
+ "00381",
+ "00383",
+ "00385",
+ "00387",
+ "00389",
+ "00391",
+ "00393",
+ "00395",
+ "00397",
+ "00399",
+ "00401",
+ "00403",
+ "00405",
+ "00407",
+ "00409",
+ "00411",
+ "00413",
+ "00415",
+ "00417",
+ "00419",
+ "00421",
+ "00423",
+ "00425",
+ "00427",
+ "00429",
+ "00431",
+ "00433",
+ "00435",
+ "00437",
+ "00439",
+ "00441",
+ "00443",
+ "00445",
+ "00447",
+ "00449",
+ "00451",
+ "00453",
+ "00455",
+ "00457",
+ "00459",
+ "00461",
+ "00463",
+ "00465",
+ "00467",
+ "00469",
+ "00471",
+ "00473",
+ "00475",
+ "00477",
+ "00479",
+ "00481",
+ "00483",
+ "00485",
+ "00487",
+ "00489",
+ "00491",
+ "00493",
+ "00495",
+ "00497",
+ "00499",
+ "00501",
+ "00503",
+ "00505",
+ "00507",
+ "00509",
+ "00511",
+ "00513",
+ "00515",
+ "00517",
+ "00519",
+ "00521",
+ "00523",
+ "00525",
+ "00527",
+ "00529",
+ "00531",
+ "00533",
+ "00535",
+ "00537",
+ "00539",
+ "00541",
+ "00543",
+ "00545",
+ "00547",
+ "00549",
+ "00551",
+ "00553",
+ "00555",
+ "00557",
+ "00559",
+ "00561",
+ "00563",
+ "00565",
+ "00567",
+ "00569",
+ "00571",
+ "00573",
+ "00575",
+ "00577",
+ "00579",
+ "00581",
+ "00583",
+ "00585",
+ "00587",
+ "00589",
+ "00591",
+ "00593",
+ "00595",
+ "00597",
+ "00599",
+ "00601",
+ "00603",
+ "00605",
+ "00607",
+ "00609",
+ "00611",
+ "00613",
+ "00615",
+ "00617",
+ "00619",
+ "00621",
+ "00623",
+ "00625",
+ "00627",
+ "00629",
+ "00631",
+ "00633",
+ "00635",
+ "00637",
+ "00639",
+ "00641",
+ "00643",
+ "00645",
+ "00647",
+ "00649",
+ "00651",
+ "00653",
+ "00655",
+ "00657",
+ "00659",
+ "00661",
+ "00663",
+ "00665",
+ "00667",
+ "00669",
+ "00671",
+ "00673",
+ "00675",
+ "00677",
+ "00679",
+ "00681",
+ "00683",
+ "00685",
+ "00687",
+ "00689",
+ "00691",
+ "00693",
+ "00695",
+ "00697",
+ "00699",
+ "00701",
+ "00703",
+ "00705",
+ "00707",
+ "00709",
+ "00711",
+ "00713",
+ "00715",
+ "00717",
+ "00719",
+ "00721",
+ "00723",
+ "00725",
+ "00727",
+ "00729",
+ "00731",
+ "00733",
+ "00735",
+ "00737",
+ "00739",
+ "00741",
+ "00743",
+ "00745",
+ "00747",
+ "00749",
+ "00751",
+ "00753",
+ "00755",
+ "00757",
+ "00759",
+ "00761",
+ "00763",
+ "00765",
+ "00767",
+ "00769",
+ "00771",
+ "00773",
+ "00775",
+ "00777",
+ "00779",
+ "00781",
+ "00783",
+ "00785",
+ "00787",
+ "00789",
+ "00791",
+ "00793",
+ "00795",
+ "00797",
+ "00799",
+ "00801",
+ "00803",
+ "00805",
+ "00807",
+ "00809",
+ "00811",
+ "00813",
+ "00815",
+ "00817",
+ "00819",
+ "00821",
+ "00823",
+ "00825",
+ "00827",
+ "00829",
+ "00831",
+ "00833",
+ "00835",
+ "00837",
+ "00839",
+ "00841",
+ "00843",
+ "00845",
+ "00847",
+ "00849",
+ "00851",
+ "00853",
+ "00855",
+ "00857",
+ "00859",
+ "00861",
+ "00863",
+ "00865",
+ "00867",
+ "00869",
+ "00871",
+ "00873",
+ "00875",
+ "00877",
+ "00879",
+ "00881",
+ "00883",
+ "00885",
+ "00887",
+ "00889",
+ "00891",
+ "00893",
+ "00895",
+ "00897",
+ "00899",
+ "00901",
+ "00903",
+ "00905",
+ "00907",
+ "00909",
+ "00911",
+ "00913",
+ "00915",
+ "00917",
+ "00919",
+ "00921",
+ "00923",
+ "00925",
+ "00927",
+ "00929",
+ "00931",
+ "00933",
+ "00935",
+ "00937",
+ "00939",
+ "00941",
+ "00943",
+ "00945",
+ "00947",
+ "00949",
+ "00951",
+ "00953",
+ "00955",
+ "00957",
+ "00959",
+ "00961",
+ "00963",
+ "00965",
+ "00967",
+ "00969",
+ "00971",
+ "00973",
+ "00975",
+ "00977",
+ "00979",
+ "00981",
+ "00983",
+ "00985",
+ "00987",
+ "00989",
+ "00991",
+ "00993",
+ "00995",
+ "00997",
+ "00999",
+ "01001",
+ "01003",
+ "01005",
+ "01007",
+ "01009",
+ "01011",
+ "01013",
+ "01015",
+ "01017",
+ "01019",
+ "01021",
+ "01023",
+ "01025",
+ "01027",
+ "01029",
+ "01031",
+ "01033",
+ "01035",
+ "01037",
+ "01039",
+ "01041",
+ "01043",
+ "01045",
+ "01047",
+ "01049",
+ "01051",
+ "01053",
+ "01055",
+ "01057",
+ "01059",
+ "01061",
+ "01063",
+ "01065",
+ "01067",
+ "01069",
+ "01071",
+ "01073",
+ "01075",
+ "01077",
+ "01079",
+ "01081",
+ "01083",
+ "01085",
+ "01087",
+ "01089",
+ "01091",
+ "01093",
+ "01095",
+ "01097",
+ "01099",
+ "01101",
+ "01103",
+ "01105",
+ "01107",
+ "01109",
+ "01111",
+ "01113",
+ "01115",
+ "01117",
+ "01119",
+ "01121",
+ "01123",
+ "01125",
+ "01127",
+ "01129",
+ "01131",
+ "01133",
+ "01135",
+ "01137",
+ "01139",
+ "01141",
+ "01143",
+ "01145",
+ "01147",
+ "01149",
+ "01151",
+ "01153",
+ "01155",
+ "01157",
+ "01159",
+ "01161",
+ "01163",
+ "01165",
+ "01167",
+ "01169",
+ "01171",
+ "01173",
+ "01175",
+ "01177",
+ "01179",
+ "01181",
+ "01183",
+ "01185",
+ "01187",
+ "01189",
+ "01191",
+ "01193",
+ "01195",
+ "01197",
+ "01199",
+ "01201",
+ "01203",
+ "01205",
+ "01207",
+ "01209",
+ "01211",
+ "01213",
+ "01215",
+ "01217",
+ "01219",
+ "01221",
+ "01223",
+ "01225",
+ "01227",
+ "01229",
+ "01231",
+ "01233",
+ "01235",
+ "01237",
+ "01239",
+ "01241",
+ "01243",
+ "01245",
+ "01247",
+ "01249",
+ "01251",
+ "01253",
+ "01255",
+ "01257",
+ "01259",
+ "01261",
+ "01263",
+ "01265",
+ "01267",
+ "01269",
+ "01271",
+ "01273",
+ "01275",
+ "01277",
+ "01279",
+ "01281",
+ "01283",
+ "01285",
+ "01287",
+ "01289",
+ "01291",
+ "01293",
+ "01295",
+ "01297",
+ "01299",
+ "01301",
+ "01303",
+ "01305",
+ "01307",
+ "01309",
+ "01311",
+ "01313",
+ "01315",
+ "01317",
+ "01319",
+ "01321",
+ "01323",
+ "01325",
+ "01327",
+ "01329",
+ "01331",
+ "01333",
+ "01335",
+ "01337",
+ "01339",
+ "01341",
+ "01343",
+ "01345",
+ "01347",
+ "01349",
+ "01351",
+ "01353",
+ "01355",
+ "01357",
+ "01359",
+ "01361",
+ "01363",
+ "01365",
+ "01367",
+ "01369",
+ "01371",
+ "01373",
+ "01375",
+ "01377",
+ "01379",
+ "01381",
+ "01383",
+ "01385",
+ "01387",
+ "01389",
+ "01391",
+ "01393",
+ "01395",
+ "01397",
+ "01399",
+ "01401",
+ "01403",
+ "01405",
+ "01407",
+ "01409",
+ "01411",
+ "01413",
+ "01415",
+ "01417",
+ "01419",
+ "01421",
+ "01423",
+ "01425",
+ "01427",
+ "01429",
+ "01431",
+ "01433",
+ "01435",
+ "01437",
+ "01439",
+ "01441",
+ "01443",
+ "01445",
+ "01447",
+ "01449",
+ "01451",
+ "01453",
+ "01455",
+ "01457",
+ "01459",
+ "01461",
+ "01463",
+ "01465",
+ "01467",
+ "01469",
+ "01471",
+ "01473",
+ "01475",
+ "01477",
+ "01479",
+ "01481",
+ "01483",
+ "01485",
+ "01487",
+ "01489",
+ "01491",
+ "01493",
+ "01495",
+ "01497",
+ "01499",
+ "01501",
+ "01503",
+ "01505",
+ "01507",
+ "01509",
+ "01511",
+ "01513",
+ "01515",
+ "01517",
+ "01519",
+ "01521",
+ "01523",
+ "01525",
+ "01527",
+ "01529",
+ "01531",
+ "01533",
+ "01535",
+ "01537",
+ "01539",
+ "01541",
+ "01543",
+ "01545",
+ "01547",
+ "01549",
+ "01551",
+ "01553",
+ "01555",
+ "01557",
+ "01559",
+ "01561",
+ "01563",
+ "01565",
+ "01567",
+ "01569",
+ "01571",
+ "01573",
+ "01575",
+ "01577",
+ "01579",
+ "01581",
+ "01583",
+ "01585",
+ "01587",
+ "01589",
+ "01591",
+ "01593",
+ "01595",
+ "01597",
+ "01599",
+ "01601",
+ "01603",
+ "01605",
+ "01607",
+ "01609",
+ "01611",
+ "01613",
+ "01615",
+ "01617",
+ "01619",
+ "01621",
+ "01623",
+ "01625",
+ "01627",
+ "01629",
+ "01631",
+ "01633",
+ "01635",
+ "01637",
+ "01639",
+ "01641",
+ "01643",
+ "01645",
+ "01647",
+ "01649",
+ "01651",
+ "01653",
+ "01655",
+ "01657",
+ "01659",
+ "01661",
+ "01663",
+ "01665",
+ "01667",
+ "01669",
+ "01671",
+ "01673",
+ "01675",
+ "01677",
+ "01679",
+ "01681",
+ "01683",
+ "01685",
+ "01687",
+ "01689",
+ "01691",
+ "01693",
+ "01695",
+ "01697",
+ "01699",
+ "01701",
+ "01703",
+ "01705",
+ "01707",
+ "01709",
+ "01711",
+ "01713",
+ "01715",
+ "01717",
+ "01719",
+ "01721",
+ "01723",
+ "01725",
+ "01727",
+ "01729",
+ "01731",
+ "01733",
+ "01735",
+ "01737",
+ "01739",
+ "01741",
+ "01743",
+ "01745",
+ "01747",
+ "01749",
+ "01751",
+ "01753",
+ "01755",
+ "01757",
+ "01759",
+ "01761",
+ "01763",
+ "01765",
+ "01767",
+ "01769",
+ "01771",
+ "01773",
+ "01775",
+ "01777",
+ "01779",
+ "01781",
+ "01783",
+ "01785",
+ "01787",
+ "01789",
+ "01791",
+ "01793",
+ "01795",
+ "01797",
+ "01799",
+ "01801",
+ "01803",
+ "01805",
+ "01807",
+ "01809",
+ "01811",
+ "01813",
+ "01815",
+ "01817",
+ "01819",
+ "01821",
+ "01823",
+ "01825",
+ "01827",
+ "01829",
+ "01831",
+ "01833",
+ "01835",
+ "01837",
+ "01839",
+ "01841",
+ "01843",
+ "01845",
+ "01847",
+ "01849",
+ "01851",
+ "01853",
+ "01855",
+ "01857",
+ "01859",
+ "01861",
+ "01863",
+ "01865",
+ "01867",
+ "01869",
+ "01871",
+ "01873",
+ "01875",
+ "01877",
+ "01879",
+ "01881",
+ "01883",
+ "01885",
+ "01887",
+ "01889",
+ "01891",
+ "01893",
+ "01895",
+ "01897",
+ "01899",
+ "01901",
+ "01903",
+ "01905",
+ "01907",
+ "01909",
+ "01911",
+ "01913",
+ "01915",
+ "01917",
+ "01919",
+ "01921",
+ "01923",
+ "01925",
+ "01927",
+ "01929",
+ "01931",
+ "01933",
+ "01935",
+ "01937",
+ "01939",
+ "01941",
+ "01943",
+ "01945",
+ "01947",
+ "01949",
+ "01951",
+ "01953",
+ "01955",
+ "01957",
+ "01959",
+ "01961",
+ "01963",
+ "01965",
+ "01967",
+ "01969",
+ "01971",
+ "01973",
+ "01975",
+ "01977",
+ "01979",
+ "01981",
+ "01983",
+ "01985",
+ "01987",
+ "01989",
+ "01991",
+ "01993",
+ "01995",
+ "01997",
+ "01999",
+ "02001",
+ "02003",
+ "02005",
+ "02007",
+ "02009",
+ "02011",
+ "02013",
+ "02015",
+ "02017",
+ "02019",
+ "02021",
+ "02023",
+ "02025",
+ "02027",
+ "02029",
+ "02031",
+ "02033",
+ "02035",
+ "02037",
+ "02039",
+ "02041",
+ "02043",
+ "02045",
+ "02047",
+ "02049",
+ "02051",
+ "02053",
+ "02055",
+ "02057",
+ "02059",
+ "02061",
+ "02063",
+ "02065",
+ "02067",
+ "02069",
+ "02071",
+ "02073",
+ "02075",
+ "02077",
+ "02079",
+ "02081",
+ "02083",
+ "02085",
+ "02087",
+ "02089",
+ "02091",
+ "02093",
+ "02095",
+ "02097",
+ "02099",
+ "02101",
+ "02103",
+ "02105",
+ "02107",
+ "02109",
+ "02111",
+ "02113",
+ "02115",
+ "02117",
+ "02119",
+ "02121",
+ "02123",
+ "02125",
+ "02127",
+ "02129",
+ "02131",
+ "02133",
+ "02135",
+ "02137",
+ "02139",
+ "02141",
+ "02143",
+ "02145",
+ "02147",
+ "02149",
+ "02151",
+ "02153",
+ "02155",
+ "02157",
+ "02159",
+ "02161",
+ "02163",
+ "02165",
+ "02167",
+ "02169",
+ "02171",
+ "02173",
+ "02175",
+ "02177",
+ "02179",
+ "02181",
+ "02183",
+ "02185",
+ "02187",
+ "02189",
+ "02191",
+ "02193",
+ "02195",
+ "02197",
+ "02199",
+ "02201",
+ "02203",
+ "02205",
+ "02207",
+ "02209",
+ "02211",
+ "02213",
+ "02215",
+ "02217",
+ "02219",
+ "02221",
+ "02223",
+ "02225",
+ "02227",
+ "02229",
+ "02231",
+ "02233",
+ "02235",
+ "02237",
+ "02239",
+ "02241",
+ "02243",
+ "02245",
+ "02247",
+ "02249",
+ "02251",
+ "02253",
+ "02255",
+ "02257",
+ "02259",
+ "02261",
+ "02263",
+ "02265",
+ "02267",
+ "02269",
+ "02271",
+ "02273",
+ "02275",
+ "02277",
+ "02279",
+ "02281",
+ "02283",
+ "02285",
+ "02287",
+ "02289",
+ "02291",
+ "02293",
+ "02295",
+ "02297",
+ "02299",
+ "02301",
+ "02303",
+ "02305",
+ "02307",
+ "02309",
+ "02311",
+ "02313",
+ "02315",
+ "02317",
+ "02319",
+ "02321",
+ "02323",
+ "02325",
+ "02327",
+ "02329",
+ "02331",
+ "02333",
+ "02335",
+ "02337",
+ "02339",
+ "02341",
+ "02343",
+ "02345",
+ "02347",
+ "02349",
+ "02351",
+ "02353",
+ "02355",
+ "02357",
+ "02359",
+ "02361",
+ "02363",
+ "02365",
+ "02367",
+ "02369",
+ "02371",
+ "02373",
+ "02375",
+ "02377",
+ "02379",
+ "02381",
+ "02383",
+ "02385",
+ "02387",
+ "02389",
+ "02391",
+ "02393",
+ "02395",
+ "02397",
+ "02399",
+ "02401",
+ "02403",
+ "02405",
+ "02407",
+ "02409",
+ "02411",
+ "02413",
+ "02415",
+ "02417",
+ "02419",
+ "02421",
+ "02423",
+ "02425",
+ "02427",
+ "02429",
+ "02431",
+ "02433",
+ "02435",
+ "02437",
+ "02439",
+ "02441",
+ "02443",
+ "02445",
+ "02447",
+ "02449",
+ "02451",
+ "02453",
+ "02455",
+ "02457",
+ "02459",
+ "02461",
+ "02463",
+ "02465",
+ "02467",
+ "02469",
+ "02471",
+ "02473",
+ "02475",
+ "02477",
+ "02479",
+ "02481",
+ "02483",
+ "02485",
+ "02487",
+ "02489",
+ "02491",
+ "02493",
+ "02495",
+ "02497",
+ "02499",
+ "02501",
+ "02503",
+ "02505",
+ "02507",
+ "02509",
+ "02511",
+ "02513",
+ "02515",
+ "02517",
+ "02519",
+ "02521",
+ "02523",
+ "02525",
+ "02527",
+ "02529",
+ "02531",
+ "02533",
+ "02535",
+ "02537",
+ "02539",
+ "02541",
+ "02543",
+ "02545",
+ "02547",
+ "02549",
+ "02551",
+ "02553",
+ "02555",
+ "02557",
+ "02559",
+ "02561",
+ "02563",
+ "02565",
+ "02567",
+ "02569",
+ "02571",
+ "02573",
+ "02575",
+ "02577",
+ "02579",
+ "02581",
+ "02583",
+ "02585",
+ "02587",
+ "02589",
+ "02591",
+ "02593",
+ "02595",
+ "02597",
+ "02599",
+ "02601",
+ "02603",
+ "02605",
+ "02607",
+ "02609",
+ "02611",
+ "02613",
+ "02615",
+ "02617",
+ "02619",
+ "02621",
+ "02623",
+ "02625",
+ "02627",
+ "02629",
+ "02631",
+ "02633",
+ "02635",
+ "02637",
+ "02639",
+ "02641",
+ "02643",
+ "02645",
+ "02647",
+ "02649",
+ "02651",
+ "02653",
+ "02655",
+ "02657",
+ "02659",
+ "02661",
+ "02663",
+ "02665",
+ "02667",
+ "02669",
+ "02671",
+ "02673",
+ "02675",
+ "02677",
+ "02679",
+ "02681",
+ "02683",
+ "02685",
+ "02687",
+ "02689",
+ "02691",
+ "02693",
+ "02695",
+ "02697",
+ "02699",
+ "02701",
+ "02703",
+ "02705",
+ "02707",
+ "02709",
+ "02711",
+ "02713",
+ "02715",
+ "02717",
+ "02719",
+ "02721",
+ "02723",
+ "02725",
+ "02727",
+ "02729",
+ "02731",
+ "02733",
+ "02735",
+ "02737",
+ "02739",
+ "02741",
+ "02743",
+ "02745",
+ "02747",
+ "02749",
+ "02751",
+ "02753",
+ "02755",
+ "02757",
+ "02759",
+ "02761",
+ "02763",
+ "02765",
+ "02767",
+ "02769",
+ "02771",
+ "02773",
+ "02775",
+ "02777",
+ "02779",
+ "02781",
+ "02783",
+ "02785",
+ "02787",
+ "02789",
+ "02791",
+ "02793",
+ "02795",
+ "02797",
+ "02799",
+ "02801",
+ "02803",
+ "02805",
+ "02807",
+ "02809",
+ "02811",
+ "02813",
+ "02815",
+ "02817",
+ "02819",
+ "02821",
+ "02823",
+ "02825",
+ "02827",
+ "02829",
+ "02831",
+ "02833",
+ "02835",
+ "02837",
+ "02839",
+ "02841",
+ "02843",
+ "02845",
+ "02847",
+ "02849",
+ "02851",
+ "02853",
+ "02855",
+ "02857",
+ "02859",
+ "02861",
+ "02863",
+ "02865",
+ "02867",
+ "02869",
+ "02871",
+ "02873",
+ "02875",
+ "02877",
+ "02879",
+ "02881",
+ "02883",
+ "02885",
+ "02887",
+ "02889",
+ "02891",
+ "02893",
+ "02895",
+ "02897",
+ "02899",
+ "02901",
+ "02903",
+ "02905",
+ "02907",
+ "02909",
+ "02911",
+ "02913",
+ "02915",
+ "02917",
+ "02919",
+ "02921",
+ "02923",
+ "02925",
+ "02927",
+ "02929",
+ "02931",
+ "02933",
+ "02935",
+ "02937",
+ "02939",
+ "02941",
+ "02943",
+ "02945",
+ "02947",
+ "02949",
+ "02951",
+ "02953",
+ "02955",
+ "02957",
+ "02959",
+ "02961",
+ "02963",
+ "02965",
+ "02967",
+ "02969",
+ "02971",
+ "02973",
+ "02975",
+ "02977",
+ "02979",
+ "02981",
+ "02983",
+ "02985",
+ "02987",
+ "02989",
+ "02991",
+ "02993",
+ "02995",
+ "02997",
+ "02999",
+ "03001",
+ "03003",
+ "03005",
+ "03007",
+ "03009",
+ "03011",
+ "03013",
+ "03015",
+ "03017",
+ "03019",
+ "03021",
+ "03023",
+ "03025",
+ "03027",
+ "03029",
+ "03031",
+ "03033",
+ "03035",
+ "03037",
+ "03039",
+ "03041",
+ "03043",
+ "03045",
+ "03047",
+ "03049",
+ "03051",
+ "03053",
+ "03055",
+ "03057",
+ "03059",
+ "03061",
+ "03063",
+ "03065",
+ "03067",
+ "03069",
+ "03071",
+ "03073",
+ "03075",
+ "03077",
+ "03079",
+ "03081",
+ "03083",
+ "03085",
+ "03087",
+ "03089",
+ "03091",
+ "03093",
+ "03095",
+ "03097",
+ "03099",
+ "03101",
+ "03103",
+ "03105",
+ "03107",
+ "03109",
+ "03111",
+ "03113",
+ "03115",
+ "03117",
+ "03119",
+ "03121",
+ "03123",
+ "03125",
+ "03127",
+ "03129",
+ "03131",
+ "03133",
+ "03135",
+ "03137",
+ "03139",
+ "03141",
+ "03143",
+ "03145",
+ "03147",
+ "03149",
+ "03151",
+ "03153",
+ "03155",
+ "03157",
+ "03159",
+ "03161",
+ "03163",
+ "03165",
+ "03167",
+ "03169",
+ "03171",
+ "03173",
+ "03175",
+ "03177",
+ "03179",
+ "03181",
+ "03183",
+ "03185",
+ "03187",
+ "03189",
+ "03191",
+ "03193",
+ "03195",
+ "03197",
+ "03199",
+ "03201",
+ "03203",
+ "03205",
+ "03207",
+ "03209",
+ "03211",
+ "03213",
+ "03215",
+ "03217",
+ "03219",
+ "03221",
+ "03223",
+ "03225",
+ "03227",
+ "03229",
+ "03231",
+ "03233",
+ "03235",
+ "03237",
+ "03239",
+ "03241",
+ "03243",
+ "03245",
+ "03247",
+ "03249",
+ "03251",
+ "03253",
+ "03255",
+ "03257",
+ "03259",
+ "03261",
+ "03263",
+ "03265",
+ "03267",
+ "03269",
+ "03271",
+ "03273",
+ "03275",
+ "03277",
+ "03279",
+ "03281",
+ "03283",
+ "03285",
+ "03287",
+ "03289",
+ "03291",
+ "03293",
+ "03295",
+ "03297",
+ "03299",
+ "03301",
+ "03303",
+ "03305",
+ "03307",
+ "03309",
+ "03311",
+ "03313",
+ "03315",
+ "03317",
+ "03319",
+ "03321",
+ "03323",
+ "03325",
+ "03327",
+ "03329",
+ "03331",
+ "03333",
+ "03335",
+ "03337",
+ "03339",
+ "03341",
+ "03343",
+ "03345",
+ "03347",
+ "03349",
+ "03351",
+ "03353",
+ "03355",
+ "03357",
+ "03359",
+ "03361",
+ "03363",
+ "03365",
+ "03367",
+ "03369",
+ "03371",
+ "03373",
+ "03375",
+ "03377",
+ "03379",
+ "03381",
+ "03383",
+ "03385",
+ "03387",
+ "03389",
+ "03391",
+ "03393",
+ "03395",
+ "03397",
+ "03399",
+ "03401",
+ "03403",
+ "03405",
+ "03407",
+ "03409",
+ "03411",
+ "03413",
+ "03415",
+ "03417",
+ "03419",
+ "03421",
+ "03423",
+ "03425",
+ "03427",
+ "03429",
+ "03431",
+ "03433",
+ "03435",
+ "03437",
+ "03439",
+ "03441",
+ "03443",
+ "03445",
+ "03447",
+ "03449",
+ "03451",
+ "03453",
+ "03455",
+ "03457",
+ "03459",
+ "03461",
+ "03463",
+ "03465",
+ "03467",
+ "03469",
+ "03471",
+ "03473",
+ "03475",
+ "03477",
+ "03479",
+ "03481",
+ "03483",
+ "03485",
+ "03487",
+ "03489",
+ "03491",
+ "03493",
+ "03495",
+ "03497",
+ "03499",
+ "03501",
+ "03503",
+ "03505",
+ "03507",
+ "03509",
+ "03511",
+ "03513",
+ "03515",
+ "03517",
+ "03519",
+ "03521",
+ "03523",
+ "03525",
+ "03527",
+ "03529",
+ "03531",
+ "03533",
+ "03535",
+ "03537",
+ "03539",
+ "03541",
+ "03543",
+ "03545",
+ "03547",
+ "03549",
+ "03551",
+ "03553",
+ "03555",
+ "03557",
+ "03559",
+ "03561",
+ "03563",
+ "03565",
+ "03567",
+ "03569",
+ "03571",
+ "03573",
+ "03575",
+ "03577",
+ "03579",
+ "03581",
+ "03583",
+ "03585",
+ "03587",
+ "03589",
+ "03591",
+ "03593",
+ "03595",
+ "03597",
+ "03599",
+ "03601",
+ "03603",
+ "03605",
+ "03607",
+ "03609",
+ "03611",
+ "03613",
+ "03615",
+ "03617",
+ "03619",
+ "03621",
+ "03623",
+ "03625",
+ "03627",
+ "03629",
+ "03631",
+ "03633",
+ "03635",
+ "03637",
+ "03639",
+ "03641",
+ "03643",
+ "03645",
+ "03647",
+ "03649",
+ "03651",
+ "03653",
+ "03655",
+ "03657",
+ "03659",
+ "03661",
+ "03663",
+ "03665",
+ "03667",
+ "03669",
+ "03671",
+ "03673",
+ "03675",
+ "03677",
+ "03679",
+ "03681",
+ "03683",
+ "03685",
+ "03687",
+ "03689",
+ "03691",
+ "03693",
+ "03695",
+ "03697",
+ "03699",
+ "03701",
+ "03703",
+ "03705",
+ "03707",
+ "03709",
+ "03711",
+ "03713",
+ "03715",
+ "03717",
+ "03719",
+ "03721",
+ "03723",
+ "03725",
+ "03727",
+ "03729",
+ "03731",
+ "03733",
+ "03735",
+ "03737",
+ "03739",
+ "03741",
+ "03743",
+ "03745",
+ "03747",
+ "03749",
+ "03751",
+ "03753",
+ "03755",
+ "03757",
+ "03759",
+ "03761",
+ "03763",
+ "03765",
+ "03767",
+ "03769",
+ "03771",
+ "03773",
+ "03775",
+ "03777",
+ "03779",
+ "03781",
+ "03783",
+ "03785",
+ "03787",
+ "03789",
+ "03791",
+ "03793",
+ "03795",
+ "03797",
+ "03799",
+ "03801",
+ "03803",
+ "03805",
+ "03807",
+ "03809",
+ "03811",
+ "03813",
+ "03815",
+ "03817",
+ "03819",
+ "03821",
+ "03823",
+ "03825",
+ "03827",
+ "03829",
+ "03831",
+ "03833",
+ "03835",
+ "03837",
+ "03839",
+ "03841",
+ "03843",
+ "03845",
+ "03847",
+ "03849",
+ "03851",
+ "03853",
+ "03855",
+ "03857",
+ "03859",
+ "03861",
+ "03863",
+ "03865",
+ "03867",
+ "03869",
+ "03871",
+ "03873",
+ "03875",
+ "03877",
+ "03879",
+ "03881",
+ "03883",
+ "03885",
+ "03887",
+ "03889",
+ "03891",
+ "03893",
+ "03895",
+ "03897",
+ "03899",
+ "03901",
+ "03903",
+ "03905",
+ "03907",
+ "03909",
+ "03911",
+ "03913",
+ "03915",
+ "03917",
+ "03919",
+ "03921",
+ "03923",
+ "03925",
+ "03927",
+ "03929",
+ "03931",
+ "03933",
+ "03935",
+ "03937",
+ "03939",
+ "03941",
+ "03943",
+ "03945",
+ "03947",
+ "03949",
+ "03951",
+ "03953",
+ "03955",
+ "03957",
+ "03959",
+ "03961",
+ "03963",
+ "03965",
+ "03967",
+ "03969",
+ "03971",
+ "03973",
+ "03975",
+ "03977",
+ "03979",
+ "03981",
+ "03983",
+ "03985",
+ "03987",
+ "03989",
+ "03991",
+ "03993",
+ "03995",
+ "03997",
+ "03999",
+ "04001",
+ "04003",
+ "04005",
+ "04007",
+ "04009",
+ "04011",
+ "04013",
+ "04015",
+ "04017",
+ "04019",
+ "04021",
+ "04023",
+ "04025",
+ "04027",
+ "04029",
+ "04031",
+ "04033",
+ "04035",
+ "04037",
+ "04039",
+ "04041",
+ "04043",
+ "04045",
+ "04047",
+ "04049",
+ "04051",
+ "04053",
+ "04055",
+ "04057",
+ "04059",
+ "04061",
+ "04063",
+ "04065",
+ "04067",
+ "04069",
+ "04071",
+ "04073",
+ "04075",
+ "04077",
+ "04079",
+ "04081",
+ "04083",
+ "04085",
+ "04087",
+ "04089",
+ "04091",
+ "04093",
+ "04095",
+ "04097",
+ "04099",
+ "04101",
+ "04103",
+ "04105",
+ "04107",
+ "04109",
+ "04111",
+ "04113",
+ "04115",
+ "04117",
+ "04119",
+ "04121",
+ "04123",
+ "04125",
+ "04127",
+ "04129",
+ "04131",
+ "04133",
+ "04135",
+ "04137",
+ "04139",
+ "04141",
+ "04143",
+ "04145",
+ "04147",
+ "04149",
+ "04151",
+ "04153",
+ "04155",
+ "04157",
+ "04159",
+ "04161",
+ "04163",
+ "04165",
+ "04167",
+ "04169",
+ "04171",
+ "04173",
+ "04175",
+ "04177",
+ "04179",
+ "04181",
+ "04183",
+ "04185",
+ "04187",
+ "04189",
+ "04191",
+ "04193",
+ "04195",
+ "04197",
+ "04199",
+ "04201",
+ "04203",
+ "04205",
+ "04207",
+ "04209",
+ "04211",
+ "04213",
+ "04215",
+ "04217",
+ "04219",
+ "04221",
+ "04223",
+ "04225",
+ "04227",
+ "04229",
+ "04231",
+ "04233",
+ "04235",
+ "04237",
+ "04239",
+ "04241",
+ "04243",
+ "04245",
+ "04247",
+ "04249",
+ "04251",
+ "04253",
+ "04255",
+ "04257",
+ "04259",
+ "04261",
+ "04263",
+ "04265",
+ "04267",
+ "04269",
+ "04271",
+ "04273",
+ "04275",
+ "04277",
+ "04279",
+ "04281",
+ "04283",
+ "04285",
+ "04287",
+ "04289",
+ "04291",
+ "04293",
+ "04295",
+ "04297",
+ "04299",
+ "04301",
+ "04303",
+ "04305",
+ "04307",
+ "04309",
+ "04311",
+ "04313",
+ "04315",
+ "04317",
+ "04319",
+ "04321",
+ "04323",
+ "04325",
+ "04327",
+ "04329",
+ "04331",
+ "04333",
+ "04335",
+ "04337",
+ "04339",
+ "04341",
+ "04343",
+ "04345",
+ "04347",
+ "04349",
+ "04351",
+ "04353",
+ "04355",
+ "04357",
+ "04359",
+ "04361",
+ "04363",
+ "04365",
+ "04367",
+ "04369",
+ "04371",
+ "04373",
+ "04375",
+ "04377",
+ "04379",
+ "04381",
+ "04383",
+ "04385",
+ "04387",
+ "04389",
+ "04391",
+ "04393",
+ "04395",
+ "04397",
+ "04399",
+ "04401",
+ "04403",
+ "04405",
+ "04407",
+ "04409",
+ "04411",
+ "04413",
+ "04415",
+ "04417",
+ "04419",
+ "04421",
+ "04423",
+ "04425",
+ "04427",
+ "04429",
+ "04431",
+ "04433",
+ "04435",
+ "04437",
+ "04439",
+ "04441",
+ "04443",
+ "04445",
+ "04447",
+ "04449",
+ "04451",
+ "04453",
+ "04455",
+ "04457",
+ "04459",
+ "04461",
+ "04463",
+ "04465",
+ "04467",
+ "04469",
+ "04471",
+ "04473",
+ "04475",
+ "04477",
+ "04479",
+ "04481",
+ "04483",
+ "04485",
+ "04487",
+ "04489",
+ "04491",
+ "04493",
+ "04495",
+ "04497",
+ "04499",
+ "04501",
+ "04503",
+ "04505",
+ "04507",
+ "04509",
+ "04511",
+ "04513",
+ "04515",
+ "04517",
+ "04519",
+ "04521",
+ "04523",
+ "04525",
+ "04527",
+ "04529",
+ "04531",
+ "04533",
+ "04535",
+ "04537",
+ "04539",
+ "04541",
+ "04543",
+ "04545",
+ "04547",
+ "04549",
+ "04551",
+ "04553",
+ "04555",
+ "04557",
+ "04559",
+ "04561",
+ "04563",
+ "04565",
+ "04567",
+ "04569",
+ "04571",
+ "04573",
+ "04575",
+ "04577",
+ "04579",
+ "04581",
+ "04583",
+ "04585",
+ "04587",
+ "04589",
+ "04591",
+ "04593",
+ "04595",
+ "04597",
+ "04599",
+ "04601",
+ "04603",
+ "04605",
+ "04607",
+ "04609",
+ "04611",
+ "04613",
+ "04615",
+ "04617",
+ "04619",
+ "04621",
+ "04623",
+ "04625",
+ "04627",
+ "04629",
+ "04631",
+ "04633",
+ "04635",
+ "04637",
+ "04639",
+ "04641",
+ "04643",
+ "04645",
+ "04647",
+ "04649",
+ "04651",
+ "04653",
+ "04655",
+ "04657",
+ "04659",
+ "04661",
+ "04663",
+ "04665",
+ "04667",
+ "04669",
+ "04671",
+ "04673",
+ "04675",
+ "04677",
+ "04679",
+ "04681",
+ "04683",
+ "04685",
+ "04687",
+ "04689",
+ "04691",
+ "04693",
+ "04695",
+ "04697",
+ "04699",
+ "04701",
+ "04703",
+ "04705",
+ "04707",
+ "04709",
+ "04711",
+ "04713",
+ "04715",
+ "04717",
+ "04719",
+ "04721",
+ "04723",
+ "04725",
+ "04727",
+ "04729",
+ "04731",
+ "04733",
+ "04735",
+ "04737",
+ "04739",
+ "04741",
+ "04743",
+ "04745",
+ "04747",
+ "04749",
+ "04751",
+ "04753",
+ "04755",
+ "04757",
+ "04759",
+ "04761",
+ "04763",
+ "04765",
+ "04767",
+ "04769",
+ "04771",
+ "04773",
+ "04775",
+ "04777",
+ "04779",
+ "04781",
+ "04783",
+ "04785",
+ "04787",
+ "04789",
+ "04791",
+ "04793",
+ "04795",
+ "04797",
+ "04799",
+ "04801",
+ "04803",
+ "04805",
+ "04807",
+ "04809",
+ "04811",
+ "04813",
+ "04815",
+ "04817",
+ "04819",
+ "04821",
+ "04823",
+ "04825",
+ "04827",
+ "04829",
+ "04831",
+ "04833",
+ "04835",
+ "04837",
+ "04839",
+ "04841",
+ "04843",
+ "04845",
+ "04847",
+ "04849",
+ "04851",
+ "04853",
+ "04855",
+ "04857",
+ "04859",
+ "04861",
+ "04863",
+ "04865",
+ "04867",
+ "04869",
+ "04871",
+ "04873",
+ "04875",
+ "04877",
+ "04879",
+ "04881",
+ "04883",
+ "04885",
+ "04887",
+ "04889",
+ "04891",
+ "04893",
+ "04895",
+ "04897",
+ "04899",
+ "04901",
+ "04903",
+ "04905",
+ "04907",
+ "04909",
+ "04911",
+ "04913",
+ "04915",
+ "04917",
+ "04919",
+ "04921",
+ "04923",
+ "04925",
+ "04927",
+ "04929",
+ "04931",
+ "04933",
+ "04935",
+ "04937",
+ "04939",
+ "04941",
+ "04943",
+ "04945",
+ "04947",
+ "04949",
+ "04951",
+ "04953",
+ "04955",
+ "04957",
+ "04959",
+ "04961",
+ "04963",
+ "04965",
+ "04967",
+ "04969",
+ "04971",
+ "04973",
+ "04975",
+ "04977",
+ "04979",
+ "04981",
+ "04983",
+ "04985",
+ "04987",
+ "04989",
+ "04991",
+ "04993",
+ "04995",
+ "04997",
+ "04999",
+ "05001",
+ "05003",
+ "05005",
+ "05007",
+ "05009",
+ "05011",
+ "05013",
+ "05015",
+ "05017",
+ "05019",
+ "05021",
+ "05023",
+ "05025",
+ "05027",
+ "05029",
+ "05031",
+ "05033",
+ "05035",
+ "05037",
+ "05039",
+ "05041",
+ "05043",
+ "05045",
+ "05047",
+ "05049",
+ "05051",
+ "05053",
+ "05055",
+ "05057",
+ "05059",
+ "05061",
+ "05063",
+ "05065",
+ "05067",
+ "05069",
+ "05071",
+ "05073",
+ "05075",
+ "05077",
+ "05079",
+ "05081",
+ "05083",
+ "05085",
+ "05087",
+ "05089",
+ "05091",
+ "05093",
+ "05095",
+ "05097",
+ "05099",
+ "05101",
+ "05103",
+ "05105",
+ "05107",
+ "05109",
+ "05111",
+ "05113",
+ "05115",
+ "05117",
+ "05119",
+ "05121",
+ "05123",
+ "05125",
+ "05127",
+ "05129",
+ "05131",
+ "05133",
+ "05135",
+ "05137",
+ "05139",
+ "05141",
+ "05143",
+ "05145",
+ "05147",
+ "05149",
+ "05151",
+ "05153",
+ "05155",
+ "05157",
+ "05159",
+ "05161",
+ "05163",
+ "05165",
+ "05167",
+ "05169",
+ "05171",
+ "05173",
+ "05175",
+ "05177",
+ "05179",
+ "05181",
+ "05183",
+ "05185",
+ "05187",
+ "05189",
+ "05191",
+ "05193",
+ "05195",
+ "05197",
+ "05199",
+ "05201",
+ "05203",
+ "05205",
+ "05207",
+ "05209",
+ "05211",
+ "05213",
+ "05215",
+ "05217",
+ "05219",
+ "05221",
+ "05223",
+ "05225",
+ "05227",
+ "05229",
+ "05231",
+ "05233",
+ "05235",
+ "05237",
+ "05239",
+ "05241",
+ "05243",
+ "05245",
+ "05247",
+ "05249",
+ "05251",
+ "05253",
+ "05255",
+ "05257",
+ "05259",
+ "05261",
+ "05263",
+ "05265",
+ "05267",
+ "05269",
+ "05271",
+ "05273",
+ "05275",
+ "05277",
+ "05279",
+ "05281",
+ "05283",
+ "05285",
+ "05287",
+ "05289",
+ "05291",
+ "05293",
+ "05295",
+ "05297",
+ "05299",
+ "05301",
+ "05303",
+ "05305",
+ "05307",
+ "05309",
+ "05311",
+ "05313",
+ "05315",
+ "05317",
+ "05319",
+ "05321",
+ "05323",
+ "05325",
+ "05327",
+ "05329",
+ "05331",
+ "05333",
+ "05335",
+ "05337",
+ "05339",
+ "05341",
+ "05343",
+ "05345",
+ "05347",
+ "05349",
+ "05351",
+ "05353",
+ "05355",
+ "05357",
+ "05359",
+ "05361",
+ "05363",
+ "05365",
+ "05367",
+ "05369",
+ "05371",
+ "05373",
+ "05375",
+ "05377",
+ "05379",
+ "05381",
+ "05383",
+ "05385",
+ "05387",
+ "05389",
+ "05391",
+ "05393",
+ "05395",
+ "05397",
+ "05399",
+ "05401",
+ "05403",
+ "05405",
+ "05407",
+ "05409",
+ "05411",
+ "05413",
+ "05415",
+ "05417",
+ "05419",
+ "05421",
+ "05423",
+ "05425",
+ "05427",
+ "05429",
+ "05431",
+ "05433",
+ "05435",
+ "05437",
+ "05439",
+ "05441",
+ "05443",
+ "05445",
+ "05447",
+ "05449",
+ "05451",
+ "05453",
+ "05455",
+ "05457",
+ "05459",
+ "05461",
+ "05463",
+ "05465",
+ "05467",
+ "05469",
+ "05471",
+ "05473",
+ "05475",
+ "05477",
+ "05479",
+ "05481",
+ "05483",
+ "05485",
+ "05487",
+ "05489",
+ "05491",
+ "05493",
+ "05495",
+ "05497",
+ "05499",
+ "05501",
+ "05503",
+ "05505",
+ "05507",
+ "05509",
+ "05511",
+ "05513",
+ "05515",
+ "05517",
+ "05519",
+ "05521",
+ "05523",
+ "05525",
+ "05527",
+ "05529",
+ "05531",
+ "05533",
+ "05535",
+ "05537",
+ "05539",
+ "05541",
+ "05543",
+ "05545",
+ "05547",
+ "05549",
+ "05551",
+ "05553",
+ "05555",
+ "05557",
+ "05559",
+ "05561",
+ "05563",
+ "05565",
+ "05567",
+ "05569",
+ "05571",
+ "05573",
+ "05575",
+ "05577",
+ "05579",
+ "05581",
+ "05583",
+ "05585",
+ "05587",
+ "05589",
+ "05591",
+ "05593",
+ "05595",
+ "05597",
+ "05599",
+ "05601",
+ "05603",
+ "05605",
+ "05607",
+ "05609",
+ "05611",
+ "05613",
+ "05615",
+ "05617",
+ "05619",
+ "05621",
+ "05623",
+ "05625",
+ "05627",
+ "05629",
+ "05631",
+ "05633",
+ "05635",
+ "05637",
+ "05639",
+ "05641",
+ "05643",
+ "05645",
+ "05647",
+ "05649",
+ "05651",
+ "05653",
+ "05655",
+ "05657",
+ "05659",
+ "05661",
+ "05663",
+ "05665",
+ "05667",
+ "05669",
+ "05671",
+ "05673",
+ "05675",
+ "05677",
+ "05679",
+ "05681",
+ "05683",
+ "05685",
+ "05687",
+ "05689",
+ "05691",
+ "05693",
+ "05695",
+ "05697",
+ "05699",
+ "05701",
+ "05703",
+ "05705",
+ "05707",
+ "05709",
+ "05711",
+ "05713",
+ "05715",
+ "05717",
+ "05719",
+ "05721",
+ "05723",
+ "05725",
+ "05727",
+ "05729",
+ "05731",
+ "05733",
+ "05735",
+ "05737",
+ "05739",
+ "05741",
+ "05743",
+ "05745",
+ "05747",
+ "05749",
+ "05751",
+ "05753",
+ "05755",
+ "05757",
+ "05759",
+ "05761",
+ "05763",
+ "05765",
+ "05767",
+ "05769",
+ "05771",
+ "05773",
+ "05775",
+ "05777",
+ "05779",
+ "05781",
+ "05783",
+ "05785",
+ "05787",
+ "05789",
+ "05791",
+ "05793",
+ "05795",
+ "05797",
+ "05799",
+ "05801",
+ "05803",
+ "05805",
+ "05807",
+ "05809",
+ "05811",
+ "05813",
+ "05815",
+ "05817",
+ "05819",
+ "05821",
+ "05823",
+ "05825",
+ "05827",
+ "05829",
+ "05831",
+ "05833",
+ "05835",
+ "05837",
+ "05839",
+ "05841",
+ "05843",
+ "05845",
+ "05847",
+ "05849",
+ "05851",
+ "05853",
+ "05855",
+ "05857",
+ "05859",
+ "05861",
+ "05863",
+ "05865",
+ "05867",
+ "05869",
+ "05871",
+ "05873",
+ "05875",
+ "05877",
+ "05879",
+ "05881",
+ "05883",
+ "05885",
+ "05887",
+ "05889",
+ "05891",
+ "05893",
+ "05895",
+ "05897",
+ "05899",
+ "05901",
+ "05903",
+ "05905",
+ "05907",
+ "05909",
+ "05911",
+ "05913",
+ "05915",
+ "05917",
+ "05919",
+ "05921",
+ "05923",
+ "05925",
+ "05927",
+ "05929",
+ "05931",
+ "05933",
+ "05935",
+ "05937",
+ "05939",
+ "05941",
+ "05943",
+ "05945",
+ "05947",
+ "05949",
+ "05951",
+ "05953",
+ "05955",
+ "05957",
+ "05959",
+ "05961",
+ "05963",
+ "05965",
+ "05967",
+ "05969",
+ "05971",
+ "05973",
+ "05975",
+ "05977",
+ "05979",
+ "05981",
+ "05983",
+ "05985",
+ "05987",
+ "05989",
+ "05991",
+ "05993",
+ "05995",
+ "05997",
+ "05999",
+ "06001",
+ "06003",
+ "06005",
+ "06007",
+ "06009",
+ "06011",
+ "06013",
+ "06015",
+ "06017",
+ "06019",
+ "06021",
+ "06023",
+ "06025",
+ "06027",
+ "06029",
+ "06031",
+ "06033",
+ "06035",
+ "06037",
+ "06039",
+ "06041",
+ "06043",
+ "06045",
+ "06047",
+ "06049",
+ "06051",
+ "06053",
+ "06055",
+ "06057",
+ "06059",
+ "06061",
+ "06063",
+ "06065",
+ "06067",
+ "06069",
+ "06071",
+ "06073",
+ "06075",
+ "06077",
+ "06079",
+ "06081",
+ "06083",
+ "06085",
+ "06087",
+ "06089",
+ "06091",
+ "06093",
+ "06095",
+ "06097",
+ "06099",
+ "06101",
+ "06103",
+ "06105",
+ "06107",
+ "06109",
+ "06111",
+ "06113",
+ "06115",
+ "06117",
+ "06119",
+ "06121",
+ "06123",
+ "06125",
+ "06127",
+ "06129",
+ "06131",
+ "06133",
+ "06135",
+ "06137",
+ "06139",
+ "06141",
+ "06143",
+ "06145",
+ "06147",
+ "06149",
+ "06151",
+ "06153",
+ "06155",
+ "06157",
+ "06159",
+ "06161",
+ "06163",
+ "06165",
+ "06167",
+ "06169",
+ "06171",
+ "06173",
+ "06175",
+ "06177",
+ "06179",
+ "06181",
+ "06183",
+ "06185",
+ "06187",
+ "06189",
+ "06191",
+ "06193",
+ "06195",
+ "06197",
+ "06199",
+ "06201",
+ "06203",
+ "06205",
+ "06207",
+ "06209",
+ "06211",
+ "06213",
+ "06215",
+ "06217",
+ "06219",
+ "06221",
+ "06223",
+ "06225",
+ "06227",
+ "06229",
+ "06231",
+ "06233",
+ "06235",
+ "06237",
+ "06239",
+ "06241",
+ "06243",
+ "06245",
+ "06247",
+ "06249",
+ "06251",
+ "06253",
+ "06255",
+ "06257",
+ "06259",
+ "06261",
+ "06263",
+ "06265",
+ "06267",
+ "06269",
+ "06271",
+ "06273",
+ "06275",
+ "06277",
+ "06279",
+ "06281",
+ "06283",
+ "06285",
+ "06287",
+ "06289",
+ "06291",
+ "06293",
+ "06295",
+ "06297",
+ "06299",
+ "06301",
+ "06303",
+ "06305",
+ "06307",
+ "06309",
+ "06311",
+ "06313",
+ "06315",
+ "06317",
+ "06319",
+ "06321",
+ "06323",
+ "06325",
+ "06327",
+ "06329",
+ "06331",
+ "06333",
+ "06335",
+ "06337",
+ "06339",
+ "06341",
+ "06343",
+ "06345",
+ "06347",
+ "06349",
+ "06351",
+ "06353",
+ "06355",
+ "06357",
+ "06359",
+ "06361",
+ "06363",
+ "06365",
+ "06367",
+ "06369",
+ "06371",
+ "06373",
+ "06375",
+ "06377",
+ "06379",
+ "06381",
+ "06383",
+ "06385",
+ "06387",
+ "06389",
+ "06391",
+ "06393",
+ "06395",
+ "06397",
+ "06399",
+ "06401",
+ "06403",
+ "06405",
+ "06407",
+ "06409",
+ "06411",
+ "06413",
+ "06415",
+ "06417",
+ "06419",
+ "06421",
+ "06423",
+ "06425",
+ "06427",
+ "06429",
+ "06431",
+ "06433",
+ "06435",
+ "06437",
+ "06439",
+ "06441",
+ "06443",
+ "06445",
+ "06447",
+ "06449",
+ "06451",
+ "06453",
+ "06455",
+ "06457",
+ "06459",
+ "06461",
+ "06463",
+ "06465",
+ "06467",
+ "06469",
+ "06471",
+ "06473",
+ "06475",
+ "06477",
+ "06479",
+ "06481",
+ "06483",
+ "06485",
+ "06487",
+ "06489",
+ "06491",
+ "06493",
+ "06495",
+ "06497",
+ "06499",
+ "06501",
+ "06503",
+ "06505",
+ "06507",
+ "06509",
+ "06511",
+ "06513",
+ "06515",
+ "06517",
+ "06519",
+ "06521",
+ "06523",
+ "06525",
+ "06527",
+ "06529",
+ "06531",
+ "06533",
+ "06535",
+ "06537",
+ "06539",
+ "06541",
+ "06543",
+ "06545",
+ "06547",
+ "06549",
+ "06551",
+ "06553",
+ "06555",
+ "06557",
+ "06559",
+ "06561",
+ "06563",
+ "06565",
+ "06567",
+ "06569",
+ "06571",
+ "06573",
+ "06575",
+ "06577",
+ "06579",
+ "06581",
+ "06583",
+ "06585",
+ "06587",
+ "06589",
+ "06591",
+ "06593",
+ "06595",
+ "06597",
+ "06599",
+ "06601",
+ "06603",
+ "06605",
+ "06607",
+ "06609",
+ "06611",
+ "06613",
+ "06615",
+ "06617",
+ "06619",
+ "06621",
+ "06623",
+ "06625",
+ "06627",
+ "06629",
+ "06631",
+ "06633",
+ "06635",
+ "06637",
+ "06639",
+ "06641",
+ "06643",
+ "06645",
+ "06647",
+ "06649",
+ "06651",
+ "06653",
+ "06655",
+ "06657",
+ "06659",
+ "06661",
+ "06663",
+ "06665",
+ "06667",
+ "06669",
+ "06671",
+ "06673",
+ "06675",
+ "06677",
+ "06679",
+ "06681",
+ "06683",
+ "06685",
+ "06687",
+ "06689",
+ "06691",
+ "06693",
+ "06695",
+ "06697",
+ "06699",
+ "06701",
+ "06703",
+ "06705",
+ "06707",
+ "06709",
+ "06711",
+ "06713",
+ "06715",
+ "06717",
+ "06719",
+ "06721",
+ "06723",
+ "06725",
+ "06727",
+ "06729",
+ "06731",
+ "06733",
+ "06735",
+ "06737",
+ "06739",
+ "06741",
+ "06743",
+ "06745",
+ "06747",
+ "06749",
+ "06751",
+ "06753",
+ "06755",
+ "06757",
+ "06759",
+ "06761",
+ "06763",
+ "06765",
+ "06767",
+ "06769",
+ "06771",
+ "06773",
+ "06775",
+ "06777",
+ "06779",
+ "06781",
+ "06783",
+ "06785",
+ "06787",
+ "06789",
+ "06791",
+ "06793",
+ "06795",
+ "06797",
+ "06799",
+ "06801",
+ "06803",
+ "06805",
+ "06807",
+ "06809",
+ "06811",
+ "06813",
+ "06815",
+ "06817",
+ "06819",
+ "06821",
+ "06823",
+ "06825",
+ "06827",
+ "06829",
+ "06831",
+ "06833",
+ "06835",
+ "06837",
+ "06839",
+ "06841",
+ "06843",
+ "06845",
+ "06847",
+ "06849",
+ "06851",
+ "06853",
+ "06855",
+ "06857",
+ "06859",
+ "06861",
+ "06863",
+ "06865",
+ "06867",
+ "06869",
+ "06871",
+ "06873",
+ "06875",
+ "06877",
+ "06879",
+ "06881",
+ "06883",
+ "06885",
+ "06887",
+ "06889",
+ "06891",
+ "06893",
+ "06895",
+ "06897",
+ "06899",
+ "06901",
+ "06903",
+ "06905",
+ "06907",
+ "06909",
+ "06911",
+ "06913",
+ "06915",
+ "06917",
+ "06919",
+ "06921",
+ "06923",
+ "06925",
+ "06927",
+ "06929",
+ "06931",
+ "06933",
+ "06935",
+ "06937",
+ "06939",
+ "06941",
+ "06943",
+ "06945",
+ "06947",
+ "06949",
+ "06951",
+ "06953",
+ "06955",
+ "06957",
+ "06959",
+ "06961",
+ "06963",
+ "06965",
+ "06967",
+ "06969",
+ "06971",
+ "06973",
+ "06975",
+ "06977",
+ "06979",
+ "06981",
+ "06983",
+ "06985",
+ "06987",
+ "06989",
+ "06991",
+ "06993",
+ "06995",
+ "06997",
+ "06999",
+ "07001",
+ "07003",
+ "07005",
+ "07007",
+ "07009",
+ "07011",
+ "07013",
+ "07015",
+ "07017",
+ "07019",
+ "07021",
+ "07023",
+ "07025",
+ "07027",
+ "07029",
+ "07031",
+ "07033",
+ "07035",
+ "07037",
+ "07039",
+ "07041",
+ "07043",
+ "07045",
+ "07047",
+ "07049",
+ "07051",
+ "07053",
+ "07055",
+ "07057",
+ "07059",
+ "07061",
+ "07063",
+ "07065",
+ "07067",
+ "07069",
+ "07071",
+ "07073",
+ "07075",
+ "07077",
+ "07079",
+ "07081",
+ "07083",
+ "07085",
+ "07087",
+ "07089",
+ "07091",
+ "07093",
+ "07095",
+ "07097",
+ "07099",
+ "07101",
+ "07103",
+ "07105",
+ "07107",
+ "07109",
+ "07111",
+ "07113",
+ "07115",
+ "07117",
+ "07119",
+ "07121",
+ "07123",
+ "07125",
+ "07127",
+ "07129",
+ "07131",
+ "07133",
+ "07135",
+ "07137",
+ "07139",
+ "07141",
+ "07143",
+ "07145",
+ "07147",
+ "07149",
+ "07151",
+ "07153",
+ "07155",
+ "07157",
+ "07159",
+ "07161",
+ "07163",
+ "07165",
+ "07167",
+ "07169",
+ "07171",
+ "07173",
+ "07175",
+ "07177",
+ "07179",
+ "07181",
+ "07183",
+ "07185",
+ "07187",
+ "07189",
+ "07191",
+ "07193",
+ "07195",
+ "07197",
+ "07199",
+ "07201",
+ "07203",
+ "07205",
+ "07207",
+ "07209",
+ "07211",
+ "07213",
+ "07215",
+ "07217",
+ "07219",
+ "07221",
+ "07223",
+ "07225",
+ "07227",
+ "07229",
+ "07231",
+ "07233",
+ "07235",
+ "07237",
+ "07239",
+ "07241",
+ "07243",
+ "07245",
+ "07247",
+ "07249",
+ "07251",
+ "07253",
+ "07255",
+ "07257",
+ "07259",
+ "07261",
+ "07263",
+ "07265",
+ "07267",
+ "07269",
+ "07271",
+ "07273",
+ "07275",
+ "07277",
+ "07279",
+ "07281",
+ "07283",
+ "07285",
+ "07287",
+ "07289",
+ "07291",
+ "07293",
+ "07295",
+ "07297",
+ "07299",
+ "07301",
+ "07303",
+ "07305",
+ "07307",
+ "07309",
+ "07311",
+ "07313",
+ "07315",
+ "07317",
+ "07319",
+ "07321",
+ "07323",
+ "07325",
+ "07327",
+ "07329",
+ "07331",
+ "07333",
+ "07335",
+ "07337",
+ "07339",
+ "07341",
+ "07343",
+ "07345",
+ "07347",
+ "07349",
+ "07351",
+ "07353",
+ "07355",
+ "07357",
+ "07359",
+ "07361",
+ "07363",
+ "07365",
+ "07367",
+ "07369",
+ "07371",
+ "07373",
+ "07375",
+ "07377",
+ "07379",
+ "07381",
+ "07383",
+ "07385",
+ "07387",
+ "07389",
+ "07391",
+ "07393",
+ "07395",
+ "07397",
+ "07399",
+ "07401",
+ "07403",
+ "07405",
+ "07407",
+ "07409",
+ "07411",
+ "07413",
+ "07415",
+ "07417",
+ "07419",
+ "07421",
+ "07423",
+ "07425",
+ "07427",
+ "07429",
+ "07431",
+ "07433",
+ "07435",
+ "07437",
+ "07439",
+ "07441",
+ "07443",
+ "07445",
+ "07447",
+ "07449",
+ "07451",
+ "07453",
+ "07455",
+ "07457",
+ "07459",
+ "07461",
+ "07463",
+ "07465",
+ "07467",
+ "07469",
+ "07471",
+ "07473",
+ "07475",
+ "07477",
+ "07479",
+ "07481",
+ "07483",
+ "07485",
+ "07487",
+ "07489",
+ "07491",
+ "07493",
+ "07495",
+ "07497",
+ "07499",
+ "07501",
+ "07503",
+ "07505",
+ "07507",
+ "07509",
+ "07511",
+ "07513",
+ "07515",
+ "07517",
+ "07519",
+ "07521",
+ "07523",
+ "07525",
+ "07527",
+ "07529",
+ "07531",
+ "07533",
+ "07535",
+ "07537",
+ "07539",
+ "07541",
+ "07543",
+ "07545",
+ "07547",
+ "07549",
+ "07551",
+ "07553",
+ "07555",
+ "07557",
+ "07559",
+ "07561",
+ "07563",
+ "07565",
+ "07567",
+ "07569",
+ "07571",
+ "07573",
+ "07575",
+ "07577",
+ "07579",
+ "07581",
+ "07583",
+ "07585",
+ "07587",
+ "07589",
+ "07591",
+ "07593",
+ "07595",
+ "07597",
+ "07599",
+ "07601",
+ "07603",
+ "07605",
+ "07607",
+ "07609",
+ "07611",
+ "07613",
+ "07615",
+ "07617",
+ "07619",
+ "07621",
+ "07623",
+ "07625",
+ "07627",
+ "07629",
+ "07631",
+ "07633",
+ "07635",
+ "07637",
+ "07639",
+ "07641",
+ "07643",
+ "07645",
+ "07647",
+ "07649",
+ "07651",
+ "07653",
+ "07655",
+ "07657",
+ "07659",
+ "07661",
+ "07663",
+ "07665",
+ "07667",
+ "07669",
+ "07671",
+ "07673",
+ "07675",
+ "07677",
+ "07679",
+ "07681",
+ "07683",
+ "07685",
+ "07687",
+ "07689",
+ "07691",
+ "07693",
+ "07695",
+ "07697",
+ "07699",
+ "07701",
+ "07703",
+ "07705",
+ "07707",
+ "07709",
+ "07711",
+ "07713",
+ "07715",
+ "07717",
+ "07719",
+ "07721",
+ "07723",
+ "07725",
+ "07727",
+ "07729",
+ "07731",
+ "07733",
+ "07735",
+ "07737",
+ "07739",
+ "07741",
+ "07743",
+ "07745",
+ "07747",
+ "07749",
+ "07751",
+ "07753",
+ "07755",
+ "07757",
+ "07759",
+ "07761",
+ "07763",
+ "07765",
+ "07767",
+ "07769",
+ "07771",
+ "07773",
+ "07775",
+ "07777",
+ "07779",
+ "07781",
+ "07783",
+ "07785",
+ "07787",
+ "07789",
+ "07791",
+ "07793",
+ "07795",
+ "07797",
+ "07799",
+ "07801",
+ "07803",
+ "07805",
+ "07807",
+ "07809",
+ "07811",
+ "07813",
+ "07815",
+ "07817",
+ "07819",
+ "07821",
+ "07823",
+ "07825",
+ "07827",
+ "07829",
+ "07831",
+ "07833",
+ "07835",
+ "07837",
+ "07839",
+ "07841",
+ "07843",
+ "07845",
+ "07847",
+ "07849",
+ "07851",
+ "07853",
+ "07855",
+ "07857",
+ "07859",
+ "07861",
+ "07863",
+ "07865",
+ "07867",
+ "07869",
+ "07871",
+ "07873",
+ "07875",
+ "07877",
+ "07879",
+ "07881",
+ "07883",
+ "07885",
+ "07887",
+ "07889",
+ "07891",
+ "07893",
+ "07895",
+ "07897",
+ "07899",
+ "07901",
+ "07903",
+ "07905",
+ "07907",
+ "07909",
+ "07911",
+ "07913",
+ "07915",
+ "07917",
+ "07919",
+ "07921",
+ "07923",
+ "07925",
+ "07927",
+ "07929",
+ "07931",
+ "07933",
+ "07935",
+ "07937",
+ "07939",
+ "07941",
+ "07943",
+ "07945",
+ "07947",
+ "07949",
+ "07951",
+ "07953",
+ "07955",
+ "07957",
+ "07959",
+ "07961",
+ "07963",
+ "07965",
+ "07967",
+ "07969",
+ "07971",
+ "07973",
+ "07975",
+ "07977",
+ "07979",
+ "07981",
+ "07983",
+ "07985",
+ "07987",
+ "07989",
+ "07991",
+ "07993",
+ "07995",
+ "07997",
+ "07999",
+ "08001",
+ "08003",
+ "08005",
+ "08007",
+ "08009",
+ "08011",
+ "08013",
+ "08015",
+ "08017",
+ "08019",
+ "08021",
+ "08023",
+ "08025",
+ "08027",
+ "08029",
+ "08031",
+ "08033",
+ "08035",
+ "08037",
+ "08039",
+ "08041",
+ "08043",
+ "08045",
+ "08047",
+ "08049",
+ "08051",
+ "08053",
+ "08055",
+ "08057",
+ "08059",
+ "08061",
+ "08063",
+ "08065",
+ "08067",
+ "08069",
+ "08071",
+ "08073",
+ "08075",
+ "08077",
+ "08079",
+ "08081",
+ "08083",
+ "08085",
+ "08087",
+ "08089",
+ "08091",
+ "08093",
+ "08095",
+ "08097",
+ "08099",
+ "08101",
+ "08103",
+ "08105",
+ "08107",
+ "08109",
+ "08111",
+ "08113",
+ "08115",
+ "08117",
+ "08119",
+ "08121",
+ "08123",
+ "08125",
+ "08127",
+ "08129",
+ "08131",
+ "08133",
+ "08135",
+ "08137",
+ "08139",
+ "08141",
+ "08143",
+ "08145",
+ "08147",
+ "08149",
+ "08151",
+ "08153",
+ "08155",
+ "08157",
+ "08159",
+ "08161",
+ "08163",
+ "08165",
+ "08167",
+ "08169",
+ "08171",
+ "08173",
+ "08175",
+ "08177",
+ "08179",
+ "08181",
+ "08183",
+ "08185",
+ "08187",
+ "08189",
+ "08191",
+ "08193",
+ "08195",
+ "08197",
+ "08199",
+ "08201",
+ "08203",
+ "08205",
+ "08207",
+ "08209",
+ "08211",
+ "08213",
+ "08215",
+ "08217",
+ "08219",
+ "08221",
+ "08223",
+ "08225",
+ "08227",
+ "08229",
+ "08231",
+ "08233",
+ "08235",
+ "08237",
+ "08239",
+ "08241",
+ "08243",
+ "08245",
+ "08247",
+ "08249",
+ "08251",
+ "08253",
+ "08255",
+ "08257",
+ "08259",
+ "08261",
+ "08263",
+ "08265",
+ "08267",
+ "08269",
+ "08271",
+ "08273",
+ "08275",
+ "08277",
+ "08279",
+ "08281",
+ "08283",
+ "08285",
+ "08287",
+ "08289",
+ "08291",
+ "08293",
+ "08295",
+ "08297",
+ "08299",
+ "08301",
+ "08303",
+ "08305",
+ "08307",
+ "08309",
+ "08311",
+ "08313",
+ "08315",
+ "08317",
+ "08319",
+ "08321",
+ "08323",
+ "08325",
+ "08327",
+ "08329",
+ "08331",
+ "08333",
+ "08335",
+ "08337",
+ "08339",
+ "08341",
+ "08343",
+ "08345",
+ "08347",
+ "08349",
+ "08351",
+ "08353",
+ "08355",
+ "08357",
+ "08359",
+ "08361",
+ "08363",
+ "08365",
+ "08367",
+ "08369",
+ "08371",
+ "08373",
+ "08375",
+ "08377",
+ "08379",
+ "08381",
+ "08383",
+ "08385",
+ "08387",
+ "08389",
+ "08391",
+ "08393",
+ "08395",
+ "08397",
+ "08399",
+ "08401",
+ "08403",
+ "08405",
+ "08407",
+ "08409",
+ "08411",
+ "08413",
+ "08415",
+ "08417",
+ "08419",
+ "08421",
+ "08423",
+ "08425",
+ "08427",
+ "08429",
+ "08431",
+ "08433",
+ "08435",
+ "08437",
+ "08439",
+ "08441",
+ "08443",
+ "08445",
+ "08447",
+ "08449",
+ "08451",
+ "08453",
+ "08455",
+ "08457",
+ "08459",
+ "08461",
+ "08463",
+ "08465",
+ "08467",
+ "08469",
+ "08471",
+ "08473",
+ "08475",
+ "08477",
+ "08479",
+ "08481",
+ "08483",
+ "08485",
+ "08487",
+ "08489",
+ "08491",
+ "08493",
+ "08495",
+ "08497",
+ "08499",
+ "08501",
+ "08503",
+ "08505",
+ "08507",
+ "08509",
+ "08511",
+ "08513",
+ "08515",
+ "08517",
+ "08519",
+ "08521",
+ "08523",
+ "08525",
+ "08527",
+ "08529",
+ "08531",
+ "08533",
+ "08535",
+ "08537",
+ "08539",
+ "08541",
+ "08543",
+ "08545",
+ "08547",
+ "08549",
+ "08551",
+ "08553",
+ "08555",
+ "08557",
+ "08559",
+ "08561",
+ "08563",
+ "08565",
+ "08567",
+ "08569",
+ "08571",
+ "08573",
+ "08575",
+ "08577",
+ "08579",
+ "08581",
+ "08583",
+ "08585",
+ "08587",
+ "08589",
+ "08591",
+ "08593",
+ "08595",
+ "08597",
+ "08599",
+ "08601",
+ "08603",
+ "08605",
+ "08607",
+ "08609",
+ "08611",
+ "08613",
+ "08615",
+ "08617",
+ "08619",
+ "08621",
+ "08623",
+ "08625",
+ "08627",
+ "08629",
+ "08631",
+ "08633",
+ "08635",
+ "08637",
+ "08639",
+ "08641",
+ "08643",
+ "08645",
+ "08647",
+ "08649",
+ "08651",
+ "08653",
+ "08655",
+ "08657",
+ "08659",
+ "08661",
+ "08663",
+ "08665",
+ "08667",
+ "08669",
+ "08671",
+ "08673",
+ "08675",
+ "08677",
+ "08679",
+ "08681",
+ "08683",
+ "08685",
+ "08687",
+ "08689",
+ "08691",
+ "08693",
+ "08695",
+ "08697",
+ "08699",
+ "08701",
+ "08703",
+ "08705",
+ "08707",
+ "08709",
+ "08711",
+ "08713",
+ "08715",
+ "08717",
+ "08719",
+ "08721",
+ "08723",
+ "08725",
+ "08727",
+ "08729",
+ "08731",
+ "08733",
+ "08735",
+ "08737",
+ "08739",
+ "08741",
+ "08743",
+ "08745",
+ "08747",
+ "08749",
+ "08751",
+ "08753",
+ "08755",
+ "08757",
+ "08759",
+ "08761",
+ "08763",
+ "08765",
+ "08767",
+ "08769",
+ "08771",
+ "08773",
+ "08775",
+ "08777",
+ "08779",
+ "08781",
+ "08783",
+ "08785",
+ "08787",
+ "08789",
+ "08791",
+ "08793",
+ "08795",
+ "08797",
+ "08799",
+ "08801",
+ "08803",
+ "08805",
+ "08807",
+ "08809",
+ "08811",
+ "08813",
+ "08815",
+ "08817",
+ "08819",
+ "08821",
+ "08823",
+ "08825",
+ "08827",
+ "08829",
+ "08831",
+ "08833",
+ "08835",
+ "08837",
+ "08839",
+ "08841",
+ "08843",
+ "08845",
+ "08847",
+ "08849",
+ "08851",
+ "08853",
+ "08855",
+ "08857",
+ "08859",
+ "08861",
+ "08863",
+ "08865",
+ "08867",
+ "08869",
+ "08871",
+ "08873",
+ "08875",
+ "08877",
+ "08879",
+ "08881",
+ "08883",
+ "08885",
+ "08887",
+ "08889",
+ "08891",
+ "08893",
+ "08895",
+ "08897",
+ "08899",
+ "08901",
+ "08903",
+ "08905",
+ "08907",
+ "08909",
+ "08911",
+ "08913",
+ "08915",
+ "08917",
+ "08919",
+ "08921",
+ "08923",
+ "08925",
+ "08927",
+ "08929",
+ "08931",
+ "08933",
+ "08935",
+ "08937",
+ "08939",
+ "08941",
+ "08943",
+ "08945",
+ "08947",
+ "08949",
+ "08951",
+ "08953",
+ "08955",
+ "08957",
+ "08959",
+ "08961",
+ "08963",
+ "08965",
+ "08967",
+ "08969",
+ "08971",
+ "08973",
+ "08975",
+ "08977",
+ "08979",
+ "08981",
+ "08983",
+ "08985",
+ "08987",
+ "08989",
+ "08991",
+ "08993",
+ "08995",
+ "08997",
+ "08999",
+ "09001",
+ "09003",
+ "09005",
+ "09007",
+ "09009",
+ "09011",
+ "09013",
+ "09015",
+ "09017",
+ "09019",
+ "09021",
+ "09023",
+ "09025",
+ "09027",
+ "09029",
+ "09031",
+ "09033",
+ "09035",
+ "09037",
+ "09039",
+ "09041",
+ "09043",
+ "09045",
+ "09047",
+ "09049",
+ "09051",
+ "09053",
+ "09055",
+ "09057",
+ "09059",
+ "09061",
+ "09063",
+ "09065",
+ "09067",
+ "09069",
+ "09071",
+ "09073",
+ "09075",
+ "09077",
+ "09079",
+ "09081",
+ "09083",
+ "09085",
+ "09087",
+ "09089",
+ "09091",
+ "09093",
+ "09095",
+ "09097",
+ "09099",
+ "09101",
+ "09103",
+ "09105",
+ "09107",
+ "09109",
+ "09111",
+ "09113",
+ "09115",
+ "09117",
+ "09119",
+ "09121",
+ "09123",
+ "09125",
+ "09127",
+ "09129",
+ "09131",
+ "09133",
+ "09135",
+ "09137",
+ "09139",
+ "09141",
+ "09143",
+ "09145",
+ "09147",
+ "09149",
+ "09151",
+ "09153",
+ "09155",
+ "09157",
+ "09159",
+ "09161",
+ "09163",
+ "09165",
+ "09167",
+ "09169",
+ "09171",
+ "09173",
+ "09175",
+ "09177",
+ "09179",
+ "09181",
+ "09183",
+ "09185",
+ "09187",
+ "09189",
+ "09191",
+ "09193",
+ "09195",
+ "09197",
+ "09199",
+ "09201",
+ "09203",
+ "09205",
+ "09207",
+ "09209",
+ "09211",
+ "09213",
+ "09215",
+ "09217",
+ "09219",
+ "09221",
+ "09223",
+ "09225",
+ "09227",
+ "09229",
+ "09231",
+ "09233",
+ "09235",
+ "09237",
+ "09239",
+ "09241",
+ "09243",
+ "09245",
+ "09247",
+ "09249",
+ "09251",
+ "09253",
+ "09255",
+ "09257",
+ "09259",
+ "09261",
+ "09263",
+ "09265",
+ "09267",
+ "09269",
+ "09271",
+ "09273",
+ "09275",
+ "09277",
+ "09279",
+ "09281",
+ "09283",
+ "09285",
+ "09287",
+ "09289",
+ "09291",
+ "09293",
+ "09295",
+ "09297",
+ "09299",
+ "09301",
+ "09303",
+ "09305",
+ "09307",
+ "09309",
+ "09311",
+ "09313",
+ "09315",
+ "09317",
+ "09319",
+ "09321",
+ "09323",
+ "09325",
+ "09327",
+ "09329",
+ "09331",
+ "09333",
+ "09335",
+ "09337",
+ "09339",
+ "09341",
+ "09343",
+ "09345",
+ "09347",
+ "09349",
+ "09351",
+ "09353",
+ "09355",
+ "09357",
+ "09359",
+ "09361",
+ "09363",
+ "09365",
+ "09367",
+ "09369",
+ "09371",
+ "09373",
+ "09375",
+ "09377",
+ "09379",
+ "09381",
+ "09383",
+ "09385",
+ "09387",
+ "09389",
+ "09391",
+ "09393",
+ "09395",
+ "09397",
+ "09399",
+ "09401",
+ "09403",
+ "09405",
+ "09407",
+ "09409",
+ "09411",
+ "09413",
+ "09415",
+ "09417",
+ "09419",
+ "09421",
+ "09423",
+ "09425",
+ "09427",
+ "09429",
+ "09431",
+ "09433",
+ "09435",
+ "09437",
+ "09439",
+ "09441",
+ "09443",
+ "09445",
+ "09447",
+ "09449",
+ "09451",
+ "09453",
+ "09455",
+ "09457",
+ "09459",
+ "09461",
+ "09463",
+ "09465",
+ "09467",
+ "09469",
+ "09471",
+ "09473",
+ "09475",
+ "09477",
+ "09479",
+ "09481",
+ "09483",
+ "09485",
+ "09487",
+ "09489",
+ "09491",
+ "09493",
+ "09495",
+ "09497",
+ "09499",
+ "09501",
+ "09503",
+ "09505",
+ "09507",
+ "09509",
+ "09511",
+ "09513",
+ "09515",
+ "09517",
+ "09519",
+ "09521",
+ "09523",
+ "09525",
+ "09527",
+ "09529",
+ "09531",
+ "09533",
+ "09535",
+ "09537",
+ "09539",
+ "09541",
+ "09543",
+ "09545",
+ "09547",
+ "09549",
+ "09551",
+ "09553",
+ "09555",
+ "09557",
+ "09559",
+ "09561",
+ "09563",
+ "09565",
+ "09567",
+ "09569",
+ "09571",
+ "09573",
+ "09575",
+ "09577",
+ "09579",
+ "09581",
+ "09583",
+ "09585",
+ "09587",
+ "09589",
+ "09591",
+ "09593",
+ "09595",
+ "09597",
+ "09599",
+ "09601",
+ "09603",
+ "09605",
+ "09607",
+ "09609",
+ "09611",
+ "09613",
+ "09615",
+ "09617",
+ "09619",
+ "09621",
+ "09623",
+ "09625",
+ "09627",
+ "09629",
+ "09631",
+ "09633",
+ "09635",
+ "09637",
+ "09639",
+ "09641",
+ "09643",
+ "09645",
+ "09647",
+ "09649",
+ "09651",
+ "09653",
+ "09655",
+ "09657",
+ "09659",
+ "09661",
+ "09663",
+ "09665",
+ "09667",
+ "09669",
+ "09671",
+ "09673",
+ "09675",
+ "09677",
+ "09679",
+ "09681",
+ "09683",
+ "09685",
+ "09687",
+ "09689",
+ "09691",
+ "09693",
+ "09695",
+ "09697",
+ "09699",
+ "09701",
+ "09703",
+ "09705",
+ "09707",
+ "09709",
+ "09711",
+ "09713",
+ "09715",
+ "09717",
+ "09719",
+ "09721",
+ "09723",
+ "09725",
+ "09727",
+ "09729",
+ "09731",
+ "09733",
+ "09735",
+ "09737",
+ "09739",
+ "09741",
+ "09743",
+ "09745",
+ "09747",
+ "09749",
+ "09751",
+ "09753",
+ "09755",
+ "09757",
+ "09759",
+ "09761",
+ "09763",
+ "09765",
+ "09767",
+ "09769",
+ "09771",
+ "09773",
+ "09775",
+ "09777",
+ "09779",
+ "09781",
+ "09783",
+ "09785",
+ "09787",
+ "09789",
+ "09791",
+ "09793",
+ "09795",
+ "09797",
+ "09799",
+ "09801",
+ "09803",
+ "09805",
+ "09807",
+ "09809",
+ "09811",
+ "09813",
+ "09815",
+ "09817",
+ "09819",
+ "09821",
+ "09823",
+ "09825",
+ "09827",
+ "09829",
+ "09831",
+ "09833",
+ "09835",
+ "09837",
+ "09839",
+ "09841",
+ "09843",
+ "09845",
+ "09847",
+ "09849",
+ "09851",
+ "09853",
+ "09855",
+ "09857",
+ "09859",
+ "09861",
+ "09863",
+ "09865",
+ "09867",
+ "09869",
+ "09871",
+ "09873",
+ "09875",
+ "09877",
+ "09879",
+ "09881",
+ "09883",
+ "09885",
+ "09887",
+ "09889",
+ "09891",
+ "09893",
+ "09895",
+ "09897",
+ "09899",
+ "09901",
+ "09903",
+ "09905",
+ "09907",
+ "09909",
+ "09911",
+ "09913",
+ "09915",
+ "09917",
+ "09919",
+ "09921",
+ "09923",
+ "09925",
+ "09927",
+ "09929",
+ "09931",
+ "09933",
+ "09935",
+ "09937",
+ "09939",
+ "09941",
+ "09943",
+ "09945",
+ "09947",
+ "09949",
+ "09951",
+ "09953",
+ "09955",
+ "09957",
+ "09959",
+ "09961",
+ "09963",
+ "09965",
+ "09967",
+ "09969",
+ "09971",
+ "09973",
+ "09975",
+ "09977",
+ "09979",
+ "09981",
+ "09983",
+ "09985",
+ "09987",
+ "09989",
+ "09991",
+ "09993",
+ "09995",
+ "09997",
+ "09999",
+ "10001",
+ "10003",
+ "10005",
+ "10007",
+ "10009",
+ "10011",
+ "10013",
+ "10015",
+ "10017",
+ "10019",
+ "10021",
+ "10023",
+ "10025",
+ "10027",
+ "10029",
+ "10031",
+ "10033",
+ "10035",
+ "10037",
+ "10039",
+ "10041",
+ "10043",
+ "10045",
+ "10047",
+ "10049",
+ "10051",
+ "10053",
+ "10055",
+ "10057",
+ "10059",
+ "10061",
+ "10063",
+ "10065",
+ "10067",
+ "10069",
+ "10071",
+ "10073",
+ "10075",
+ "10077",
+ "10079",
+ "10081",
+ "10083",
+ "10085",
+ "10087",
+ "10089",
+ "10091",
+ "10093",
+ "10095",
+ "10097",
+ "10099",
+ "10101",
+ "10103",
+ "10105",
+ "10107",
+ "10109",
+ "10111",
+ "10113",
+ "10115",
+ "10117",
+ "10119",
+ "10121",
+ "10123",
+ "10125",
+ "10127",
+ "10129",
+ "10131",
+ "10133",
+ "10135",
+ "10137",
+ "10139",
+ "10141",
+ "10143",
+ "10145",
+ "10147",
+ "10149",
+ "10151",
+ "10153",
+ "10155",
+ "10157",
+ "10159",
+ "10161",
+ "10163",
+ "10165",
+ "10167",
+ "10169",
+ "10171",
+ "10173",
+ "10175",
+ "10177",
+ "10179",
+ "10181",
+ "10183",
+ "10185",
+ "10187",
+ "10189",
+ "10191",
+ "10193",
+ "10195",
+ "10197",
+ "10199",
+ "10201",
+ "10203",
+ "10205",
+ "10207",
+ "10209",
+ "10211",
+ "10213",
+ "10215",
+ "10217",
+ "10219",
+ "10221",
+ "10223",
+ "10225",
+ "10227",
+ "10229",
+ "10231",
+ "10233",
+ "10235",
+ "10237",
+ "10239",
+ "10241",
+ "10243",
+ "10245",
+ "10247",
+ "10249",
+ "10251",
+ "10253",
+ "10255",
+ "10257",
+ "10259",
+ "10261",
+ "10263",
+ "10265",
+ "10267",
+ "10269",
+ "10271",
+ "10273",
+ "10275",
+ "10277",
+ "10279",
+ "10281",
+ "10283",
+ "10285",
+ "10287",
+ "10289",
+ "10291",
+ "10293",
+ "10295",
+ "10297",
+ "10299",
+ "10301",
+ "10303",
+ "10305"
+ ],
+ "TEST_SET": [
+ "00002",
+ "00004",
+ "00006",
+ "00008",
+ "00010",
+ "00012",
+ "00014",
+ "00016",
+ "00018",
+ "00020",
+ "00022",
+ "00024",
+ "00026",
+ "00028",
+ "00030",
+ "00032",
+ "00034",
+ "00036",
+ "00038",
+ "00040",
+ "00042",
+ "00044",
+ "00046",
+ "00048",
+ "00050",
+ "00052",
+ "00054",
+ "00056",
+ "00058",
+ "00060",
+ "00062",
+ "00064",
+ "00066",
+ "00068",
+ "00070",
+ "00072",
+ "00074",
+ "00076",
+ "00078",
+ "00080",
+ "00082",
+ "00084",
+ "00086",
+ "00088",
+ "00090",
+ "00092",
+ "00094",
+ "00096",
+ "00098",
+ "00100",
+ "00102",
+ "00104",
+ "00106",
+ "00108",
+ "00110",
+ "00112",
+ "00114",
+ "00116",
+ "00118",
+ "00120",
+ "00122",
+ "00124",
+ "00126",
+ "00128",
+ "00130",
+ "00132",
+ "00134",
+ "00136",
+ "00138",
+ "00140",
+ "00142",
+ "00144",
+ "00146",
+ "00148",
+ "00150",
+ "00152",
+ "00154",
+ "00156",
+ "00158",
+ "00160",
+ "00162",
+ "00164",
+ "00166",
+ "00168",
+ "00170",
+ "00172",
+ "00174",
+ "00176",
+ "00178",
+ "00180",
+ "00182",
+ "00184",
+ "00186",
+ "00188",
+ "00190",
+ "00192",
+ "00194",
+ "00196",
+ "00198",
+ "00200",
+ "00202",
+ "00204",
+ "00206",
+ "00208",
+ "00210",
+ "00212",
+ "00214",
+ "00216",
+ "00218",
+ "00220",
+ "00222",
+ "00224",
+ "00226",
+ "00228",
+ "00230",
+ "00232",
+ "00234",
+ "00236",
+ "00238",
+ "00240",
+ "00242",
+ "00244",
+ "00246",
+ "00248",
+ "00250",
+ "00252",
+ "00254",
+ "00256",
+ "00258",
+ "00260",
+ "00262",
+ "00264",
+ "00266",
+ "00268",
+ "00270",
+ "00272",
+ "00274",
+ "00276",
+ "00278",
+ "00280",
+ "00282",
+ "00284",
+ "00286",
+ "00288",
+ "00290",
+ "00292",
+ "00294",
+ "00296",
+ "00298",
+ "00300",
+ "00302",
+ "00304",
+ "00306",
+ "00308",
+ "00310",
+ "00312",
+ "00314",
+ "00316",
+ "00318",
+ "00320",
+ "00322",
+ "00324",
+ "00326",
+ "00328",
+ "00330",
+ "00332",
+ "00334",
+ "00336",
+ "00338",
+ "00340",
+ "00342",
+ "00344",
+ "00346",
+ "00348",
+ "00350",
+ "00352",
+ "00354",
+ "00356",
+ "00358",
+ "00360",
+ "00362",
+ "00364",
+ "00366",
+ "00368",
+ "00370",
+ "00372",
+ "00374",
+ "00376",
+ "00378",
+ "00380",
+ "00382",
+ "00384",
+ "00386",
+ "00388",
+ "00390",
+ "00392",
+ "00394",
+ "00396",
+ "00398",
+ "00400",
+ "00402",
+ "00404",
+ "00406",
+ "00408",
+ "00410",
+ "00412",
+ "00414",
+ "00416",
+ "00418",
+ "00420",
+ "00422",
+ "00424",
+ "00426",
+ "00428",
+ "00430",
+ "00432",
+ "00434",
+ "00436",
+ "00438",
+ "00440",
+ "00442",
+ "00444",
+ "00446",
+ "00448",
+ "00450",
+ "00452",
+ "00454",
+ "00456",
+ "00458",
+ "00460",
+ "00462",
+ "00464",
+ "00466",
+ "00468",
+ "00470",
+ "00472",
+ "00474",
+ "00476",
+ "00478",
+ "00480",
+ "00482",
+ "00484",
+ "00486",
+ "00488",
+ "00490",
+ "00492",
+ "00494",
+ "00496",
+ "00498",
+ "00500",
+ "00502",
+ "00504",
+ "00506",
+ "00508",
+ "00510",
+ "00512",
+ "00514",
+ "00516",
+ "00518",
+ "00520",
+ "00522",
+ "00524",
+ "00526",
+ "00528",
+ "00530",
+ "00532",
+ "00534",
+ "00536",
+ "00538",
+ "00540",
+ "00542",
+ "00544",
+ "00546",
+ "00548",
+ "00550",
+ "00552",
+ "00554",
+ "00556",
+ "00558",
+ "00560",
+ "00562",
+ "00564",
+ "00566",
+ "00568",
+ "00570",
+ "00572",
+ "00574",
+ "00576",
+ "00578",
+ "00580",
+ "00582",
+ "00584",
+ "00586",
+ "00588",
+ "00590",
+ "00592",
+ "00594",
+ "00596",
+ "00598",
+ "00600",
+ "00602",
+ "00604",
+ "00606",
+ "00608",
+ "00610",
+ "00612",
+ "00614",
+ "00616",
+ "00618",
+ "00620",
+ "00622",
+ "00624",
+ "00626",
+ "00628",
+ "00630",
+ "00632",
+ "00634",
+ "00636",
+ "00638",
+ "00640",
+ "00642",
+ "00644",
+ "00646",
+ "00648",
+ "00650",
+ "00652",
+ "00654",
+ "00656",
+ "00658",
+ "00660",
+ "00662",
+ "00664",
+ "00666",
+ "00668",
+ "00670",
+ "00672",
+ "00674",
+ "00676",
+ "00678",
+ "00680",
+ "00682",
+ "00684",
+ "00686",
+ "00688",
+ "00690",
+ "00692",
+ "00694",
+ "00696",
+ "00698",
+ "00700",
+ "00702",
+ "00704",
+ "00706",
+ "00708",
+ "00710",
+ "00712",
+ "00714",
+ "00716",
+ "00718",
+ "00720",
+ "00722",
+ "00724",
+ "00726",
+ "00728",
+ "00730",
+ "00732",
+ "00734",
+ "00736",
+ "00738",
+ "00740",
+ "00742",
+ "00744",
+ "00746",
+ "00748",
+ "00750",
+ "00752",
+ "00754",
+ "00756",
+ "00758",
+ "00760",
+ "00762",
+ "00764",
+ "00766",
+ "00768",
+ "00770",
+ "00772",
+ "00774",
+ "00776",
+ "00778",
+ "00780",
+ "00782",
+ "00784",
+ "00786",
+ "00788",
+ "00790",
+ "00792",
+ "00794",
+ "00796",
+ "00798",
+ "00800",
+ "00802",
+ "00804",
+ "00806",
+ "00808",
+ "00810",
+ "00812",
+ "00814",
+ "00816",
+ "00818",
+ "00820",
+ "00822",
+ "00824",
+ "00826",
+ "00828",
+ "00830",
+ "00832",
+ "00834",
+ "00836",
+ "00838",
+ "00840",
+ "00842",
+ "00844",
+ "00846",
+ "00848",
+ "00850",
+ "00852",
+ "00854",
+ "00856",
+ "00858",
+ "00860",
+ "00862",
+ "00864",
+ "00866",
+ "00868",
+ "00870",
+ "00872",
+ "00874",
+ "00876",
+ "00878",
+ "00880",
+ "00882",
+ "00884",
+ "00886",
+ "00888",
+ "00890",
+ "00892",
+ "00894",
+ "00896",
+ "00898",
+ "00900",
+ "00902",
+ "00904",
+ "00906",
+ "00908",
+ "00910",
+ "00912",
+ "00914",
+ "00916",
+ "00918",
+ "00920",
+ "00922",
+ "00924",
+ "00926",
+ "00928",
+ "00930",
+ "00932",
+ "00934",
+ "00936",
+ "00938",
+ "00940",
+ "00942",
+ "00944",
+ "00946",
+ "00948",
+ "00950",
+ "00952",
+ "00954",
+ "00956",
+ "00958",
+ "00960",
+ "00962",
+ "00964",
+ "00966",
+ "00968",
+ "00970",
+ "00972",
+ "00974",
+ "00976",
+ "00978",
+ "00980",
+ "00982",
+ "00984",
+ "00986",
+ "00988",
+ "00990",
+ "00992",
+ "00994",
+ "00996",
+ "00998",
+ "01000",
+ "01002",
+ "01004",
+ "01006",
+ "01008",
+ "01010",
+ "01012",
+ "01014",
+ "01016",
+ "01018",
+ "01020",
+ "01022",
+ "01024",
+ "01026",
+ "01028",
+ "01030",
+ "01032",
+ "01034",
+ "01036",
+ "01038",
+ "01040",
+ "01042",
+ "01044",
+ "01046",
+ "01048",
+ "01050",
+ "01052",
+ "01054",
+ "01056",
+ "01058",
+ "01060",
+ "01062",
+ "01064",
+ "01066",
+ "01068",
+ "01070",
+ "01072",
+ "01074",
+ "01076",
+ "01078",
+ "01080",
+ "01082",
+ "01084",
+ "01086",
+ "01088",
+ "01090",
+ "01092",
+ "01094",
+ "01096",
+ "01098",
+ "01100",
+ "01102",
+ "01104",
+ "01106",
+ "01108",
+ "01110",
+ "01112",
+ "01114",
+ "01116",
+ "01118",
+ "01120",
+ "01122",
+ "01124",
+ "01126",
+ "01128",
+ "01130",
+ "01132",
+ "01134",
+ "01136",
+ "01138",
+ "01140",
+ "01142",
+ "01144",
+ "01146",
+ "01148",
+ "01150",
+ "01152",
+ "01154",
+ "01156",
+ "01158",
+ "01160",
+ "01162",
+ "01164",
+ "01166",
+ "01168",
+ "01170",
+ "01172",
+ "01174",
+ "01176",
+ "01178",
+ "01180",
+ "01182",
+ "01184",
+ "01186",
+ "01188",
+ "01190",
+ "01192",
+ "01194",
+ "01196",
+ "01198",
+ "01200",
+ "01202",
+ "01204",
+ "01206",
+ "01208",
+ "01210",
+ "01212",
+ "01214",
+ "01216",
+ "01218",
+ "01220",
+ "01222",
+ "01224",
+ "01226",
+ "01228",
+ "01230",
+ "01232",
+ "01234",
+ "01236",
+ "01238",
+ "01240",
+ "01242",
+ "01244",
+ "01246",
+ "01248",
+ "01250",
+ "01252",
+ "01254",
+ "01256",
+ "01258",
+ "01260",
+ "01262",
+ "01264",
+ "01266",
+ "01268",
+ "01270",
+ "01272",
+ "01274",
+ "01276",
+ "01278",
+ "01280",
+ "01282",
+ "01284",
+ "01286",
+ "01288",
+ "01290",
+ "01292",
+ "01294",
+ "01296",
+ "01298",
+ "01300",
+ "01302",
+ "01304",
+ "01306",
+ "01308",
+ "01310",
+ "01312",
+ "01314",
+ "01316",
+ "01318",
+ "01320",
+ "01322",
+ "01324",
+ "01326",
+ "01328",
+ "01330",
+ "01332",
+ "01334",
+ "01336",
+ "01338",
+ "01340",
+ "01342",
+ "01344",
+ "01346",
+ "01348",
+ "01350",
+ "01352",
+ "01354",
+ "01356",
+ "01358",
+ "01360",
+ "01362",
+ "01364",
+ "01366",
+ "01368",
+ "01370",
+ "01372",
+ "01374",
+ "01376",
+ "01378",
+ "01380",
+ "01382",
+ "01384",
+ "01386",
+ "01388",
+ "01390",
+ "01392",
+ "01394",
+ "01396",
+ "01398",
+ "01400",
+ "01402",
+ "01404",
+ "01406",
+ "01408",
+ "01410",
+ "01412",
+ "01414",
+ "01416",
+ "01418",
+ "01420",
+ "01422",
+ "01424",
+ "01426",
+ "01428",
+ "01430",
+ "01432",
+ "01434",
+ "01436",
+ "01438",
+ "01440",
+ "01442",
+ "01444",
+ "01446",
+ "01448",
+ "01450",
+ "01452",
+ "01454",
+ "01456",
+ "01458",
+ "01460",
+ "01462",
+ "01464",
+ "01466",
+ "01468",
+ "01470",
+ "01472",
+ "01474",
+ "01476",
+ "01478",
+ "01480",
+ "01482",
+ "01484",
+ "01486",
+ "01488",
+ "01490",
+ "01492",
+ "01494",
+ "01496",
+ "01498",
+ "01500",
+ "01502",
+ "01504",
+ "01506",
+ "01508",
+ "01510",
+ "01512",
+ "01514",
+ "01516",
+ "01518",
+ "01520",
+ "01522",
+ "01524",
+ "01526",
+ "01528",
+ "01530",
+ "01532",
+ "01534",
+ "01536",
+ "01538",
+ "01540",
+ "01542",
+ "01544",
+ "01546",
+ "01548",
+ "01550",
+ "01552",
+ "01554",
+ "01556",
+ "01558",
+ "01560",
+ "01562",
+ "01564",
+ "01566",
+ "01568",
+ "01570",
+ "01572",
+ "01574",
+ "01576",
+ "01578",
+ "01580",
+ "01582",
+ "01584",
+ "01586",
+ "01588",
+ "01590",
+ "01592",
+ "01594",
+ "01596",
+ "01598",
+ "01600",
+ "01602",
+ "01604",
+ "01606",
+ "01608",
+ "01610",
+ "01612",
+ "01614",
+ "01616",
+ "01618",
+ "01620",
+ "01622",
+ "01624",
+ "01626",
+ "01628",
+ "01630",
+ "01632",
+ "01634",
+ "01636",
+ "01638",
+ "01640",
+ "01642",
+ "01644",
+ "01646",
+ "01648",
+ "01650",
+ "01652",
+ "01654",
+ "01656",
+ "01658",
+ "01660",
+ "01662",
+ "01664",
+ "01666",
+ "01668",
+ "01670",
+ "01672",
+ "01674",
+ "01676",
+ "01678",
+ "01680",
+ "01682",
+ "01684",
+ "01686",
+ "01688",
+ "01690",
+ "01692",
+ "01694",
+ "01696",
+ "01698",
+ "01700",
+ "01702",
+ "01704",
+ "01706",
+ "01708",
+ "01710",
+ "01712",
+ "01714",
+ "01716",
+ "01718",
+ "01720",
+ "01722",
+ "01724",
+ "01726",
+ "01728",
+ "01730",
+ "01732",
+ "01734",
+ "01736",
+ "01738",
+ "01740",
+ "01742",
+ "01744",
+ "01746",
+ "01748",
+ "01750",
+ "01752",
+ "01754",
+ "01756",
+ "01758",
+ "01760",
+ "01762",
+ "01764",
+ "01766",
+ "01768",
+ "01770",
+ "01772",
+ "01774",
+ "01776",
+ "01778",
+ "01780",
+ "01782",
+ "01784",
+ "01786",
+ "01788",
+ "01790",
+ "01792",
+ "01794",
+ "01796",
+ "01798",
+ "01800",
+ "01802",
+ "01804",
+ "01806",
+ "01808",
+ "01810",
+ "01812",
+ "01814",
+ "01816",
+ "01818",
+ "01820",
+ "01822",
+ "01824",
+ "01826",
+ "01828",
+ "01830",
+ "01832",
+ "01834",
+ "01836",
+ "01838",
+ "01840",
+ "01842",
+ "01844",
+ "01846",
+ "01848",
+ "01850",
+ "01852",
+ "01854",
+ "01856",
+ "01858",
+ "01860",
+ "01862",
+ "01864",
+ "01866",
+ "01868",
+ "01870",
+ "01872",
+ "01874",
+ "01876",
+ "01878",
+ "01880",
+ "01882",
+ "01884",
+ "01886",
+ "01888",
+ "01890",
+ "01892",
+ "01894",
+ "01896",
+ "01898",
+ "01900",
+ "01902",
+ "01904",
+ "01906",
+ "01908",
+ "01910",
+ "01912",
+ "01914",
+ "01916",
+ "01918",
+ "01920",
+ "01922",
+ "01924",
+ "01926",
+ "01928",
+ "01930",
+ "01932",
+ "01934",
+ "01936",
+ "01938",
+ "01940",
+ "01942",
+ "01944",
+ "01946",
+ "01948",
+ "01950",
+ "01952",
+ "01954",
+ "01956",
+ "01958",
+ "01960",
+ "01962",
+ "01964",
+ "01966",
+ "01968",
+ "01970",
+ "01972",
+ "01974",
+ "01976",
+ "01978",
+ "01980",
+ "01982",
+ "01984",
+ "01986",
+ "01988",
+ "01990",
+ "01992",
+ "01994",
+ "01996",
+ "01998",
+ "02000",
+ "02002",
+ "02004",
+ "02006",
+ "02008",
+ "02010",
+ "02012",
+ "02014",
+ "02016",
+ "02018",
+ "02020",
+ "02022",
+ "02024",
+ "02026",
+ "02028",
+ "02030",
+ "02032",
+ "02034",
+ "02036",
+ "02038",
+ "02040",
+ "02042",
+ "02044",
+ "02046",
+ "02048",
+ "02050",
+ "02052",
+ "02054",
+ "02056",
+ "02058",
+ "02060",
+ "02062",
+ "02064",
+ "02066",
+ "02068",
+ "02070",
+ "02072",
+ "02074",
+ "02076",
+ "02078",
+ "02080",
+ "02082",
+ "02084",
+ "02086",
+ "02088",
+ "02090",
+ "02092",
+ "02094",
+ "02096",
+ "02098",
+ "02100",
+ "02102",
+ "02104",
+ "02106",
+ "02108",
+ "02110",
+ "02112",
+ "02114",
+ "02116",
+ "02118",
+ "02120",
+ "02122",
+ "02124",
+ "02126",
+ "02128",
+ "02130",
+ "02132",
+ "02134",
+ "02136",
+ "02138",
+ "02140",
+ "02142",
+ "02144",
+ "02146",
+ "02148",
+ "02150",
+ "02152",
+ "02154",
+ "02156",
+ "02158",
+ "02160",
+ "02162",
+ "02164",
+ "02166",
+ "02168",
+ "02170",
+ "02172",
+ "02174",
+ "02176",
+ "02178",
+ "02180",
+ "02182",
+ "02184",
+ "02186",
+ "02188",
+ "02190",
+ "02192",
+ "02194",
+ "02196",
+ "02198",
+ "02200",
+ "02202",
+ "02204",
+ "02206",
+ "02208",
+ "02210",
+ "02212",
+ "02214",
+ "02216",
+ "02218",
+ "02220",
+ "02222",
+ "02224",
+ "02226",
+ "02228",
+ "02230",
+ "02232",
+ "02234",
+ "02236",
+ "02238",
+ "02240",
+ "02242",
+ "02244",
+ "02246",
+ "02248",
+ "02250",
+ "02252",
+ "02254",
+ "02256",
+ "02258",
+ "02260",
+ "02262",
+ "02264",
+ "02266",
+ "02268",
+ "02270",
+ "02272",
+ "02274",
+ "02276",
+ "02278",
+ "02280",
+ "02282",
+ "02284",
+ "02286",
+ "02288",
+ "02290",
+ "02292",
+ "02294",
+ "02296",
+ "02298",
+ "02300",
+ "02302",
+ "02304",
+ "02306",
+ "02308",
+ "02310",
+ "02312",
+ "02314",
+ "02316",
+ "02318",
+ "02320",
+ "02322",
+ "02324",
+ "02326",
+ "02328",
+ "02330",
+ "02332",
+ "02334",
+ "02336",
+ "02338",
+ "02340",
+ "02342",
+ "02344",
+ "02346",
+ "02348",
+ "02350",
+ "02352",
+ "02354",
+ "02356",
+ "02358",
+ "02360",
+ "02362",
+ "02364",
+ "02366",
+ "02368",
+ "02370",
+ "02372",
+ "02374",
+ "02376",
+ "02378",
+ "02380",
+ "02382",
+ "02384",
+ "02386",
+ "02388",
+ "02390",
+ "02392",
+ "02394",
+ "02396",
+ "02398",
+ "02400",
+ "02402",
+ "02404",
+ "02406",
+ "02408",
+ "02410",
+ "02412",
+ "02414",
+ "02416",
+ "02418",
+ "02420",
+ "02422",
+ "02424",
+ "02426",
+ "02428",
+ "02430",
+ "02432",
+ "02434",
+ "02436",
+ "02438",
+ "02440",
+ "02442",
+ "02444",
+ "02446",
+ "02448",
+ "02450",
+ "02452",
+ "02454",
+ "02456",
+ "02458",
+ "02460",
+ "02462",
+ "02464",
+ "02466",
+ "02468",
+ "02470",
+ "02472",
+ "02474",
+ "02476",
+ "02478",
+ "02480",
+ "02482",
+ "02484",
+ "02486",
+ "02488",
+ "02490",
+ "02492",
+ "02494",
+ "02496",
+ "02498",
+ "02500",
+ "02502",
+ "02504",
+ "02506",
+ "02508",
+ "02510",
+ "02512",
+ "02514",
+ "02516",
+ "02518",
+ "02520",
+ "02522",
+ "02524",
+ "02526",
+ "02528",
+ "02530",
+ "02532",
+ "02534",
+ "02536",
+ "02538",
+ "02540",
+ "02542",
+ "02544",
+ "02546",
+ "02548",
+ "02550",
+ "02552",
+ "02554",
+ "02556",
+ "02558",
+ "02560",
+ "02562",
+ "02564",
+ "02566",
+ "02568",
+ "02570",
+ "02572",
+ "02574",
+ "02576",
+ "02578",
+ "02580",
+ "02582",
+ "02584",
+ "02586",
+ "02588",
+ "02590",
+ "02592",
+ "02594",
+ "02596",
+ "02598",
+ "02600",
+ "02602",
+ "02604",
+ "02606",
+ "02608",
+ "02610",
+ "02612",
+ "02614",
+ "02616",
+ "02618",
+ "02620",
+ "02622",
+ "02624",
+ "02626",
+ "02628",
+ "02630",
+ "02632",
+ "02634",
+ "02636",
+ "02638",
+ "02640",
+ "02642",
+ "02644",
+ "02646",
+ "02648",
+ "02650",
+ "02652",
+ "02654",
+ "02656",
+ "02658",
+ "02660",
+ "02662",
+ "02664",
+ "02666",
+ "02668",
+ "02670",
+ "02672",
+ "02674",
+ "02676",
+ "02678",
+ "02680",
+ "02682",
+ "02684",
+ "02686",
+ "02688",
+ "02690",
+ "02692",
+ "02694",
+ "02696",
+ "02698",
+ "02700",
+ "02702",
+ "02704",
+ "02706",
+ "02708",
+ "02710",
+ "02712",
+ "02714",
+ "02716",
+ "02718",
+ "02720",
+ "02722",
+ "02724",
+ "02726",
+ "02728",
+ "02730",
+ "02732",
+ "02734",
+ "02736",
+ "02738",
+ "02740",
+ "02742",
+ "02744",
+ "02746",
+ "02748",
+ "02750",
+ "02752",
+ "02754",
+ "02756",
+ "02758",
+ "02760",
+ "02762",
+ "02764",
+ "02766",
+ "02768",
+ "02770",
+ "02772",
+ "02774",
+ "02776",
+ "02778",
+ "02780",
+ "02782",
+ "02784",
+ "02786",
+ "02788",
+ "02790",
+ "02792",
+ "02794",
+ "02796",
+ "02798",
+ "02800",
+ "02802",
+ "02804",
+ "02806",
+ "02808",
+ "02810",
+ "02812",
+ "02814",
+ "02816",
+ "02818",
+ "02820",
+ "02822",
+ "02824",
+ "02826",
+ "02828",
+ "02830",
+ "02832",
+ "02834",
+ "02836",
+ "02838",
+ "02840",
+ "02842",
+ "02844",
+ "02846",
+ "02848",
+ "02850",
+ "02852",
+ "02854",
+ "02856",
+ "02858",
+ "02860",
+ "02862",
+ "02864",
+ "02866",
+ "02868",
+ "02870",
+ "02872",
+ "02874",
+ "02876",
+ "02878",
+ "02880",
+ "02882",
+ "02884",
+ "02886",
+ "02888",
+ "02890",
+ "02892",
+ "02894",
+ "02896",
+ "02898",
+ "02900",
+ "02902",
+ "02904",
+ "02906",
+ "02908",
+ "02910",
+ "02912",
+ "02914",
+ "02916",
+ "02918",
+ "02920",
+ "02922",
+ "02924",
+ "02926",
+ "02928",
+ "02930",
+ "02932",
+ "02934",
+ "02936",
+ "02938",
+ "02940",
+ "02942",
+ "02944",
+ "02946",
+ "02948",
+ "02950",
+ "02952",
+ "02954",
+ "02956",
+ "02958",
+ "02960",
+ "02962",
+ "02964",
+ "02966",
+ "02968",
+ "02970",
+ "02972",
+ "02974",
+ "02976",
+ "02978",
+ "02980",
+ "02982",
+ "02984",
+ "02986",
+ "02988",
+ "02990",
+ "02992",
+ "02994",
+ "02996",
+ "02998",
+ "03000",
+ "03002",
+ "03004",
+ "03006",
+ "03008",
+ "03010",
+ "03012",
+ "03014",
+ "03016",
+ "03018",
+ "03020",
+ "03022",
+ "03024",
+ "03026",
+ "03028",
+ "03030",
+ "03032",
+ "03034",
+ "03036",
+ "03038",
+ "03040",
+ "03042",
+ "03044",
+ "03046",
+ "03048",
+ "03050",
+ "03052",
+ "03054",
+ "03056",
+ "03058",
+ "03060",
+ "03062",
+ "03064",
+ "03066",
+ "03068",
+ "03070",
+ "03072",
+ "03074",
+ "03076",
+ "03078",
+ "03080",
+ "03082",
+ "03084",
+ "03086",
+ "03088",
+ "03090",
+ "03092",
+ "03094",
+ "03096",
+ "03098",
+ "03100",
+ "03102",
+ "03104",
+ "03106",
+ "03108",
+ "03110",
+ "03112",
+ "03114",
+ "03116",
+ "03118",
+ "03120",
+ "03122",
+ "03124",
+ "03126",
+ "03128",
+ "03130",
+ "03132",
+ "03134",
+ "03136",
+ "03138",
+ "03140",
+ "03142",
+ "03144",
+ "03146",
+ "03148",
+ "03150",
+ "03152",
+ "03154",
+ "03156",
+ "03158",
+ "03160",
+ "03162",
+ "03164",
+ "03166",
+ "03168",
+ "03170",
+ "03172",
+ "03174",
+ "03176",
+ "03178",
+ "03180",
+ "03182",
+ "03184",
+ "03186",
+ "03188",
+ "03190",
+ "03192",
+ "03194",
+ "03196",
+ "03198",
+ "03200",
+ "03202",
+ "03204",
+ "03206",
+ "03208",
+ "03210",
+ "03212",
+ "03214",
+ "03216",
+ "03218",
+ "03220",
+ "03222",
+ "03224",
+ "03226",
+ "03228",
+ "03230",
+ "03232",
+ "03234",
+ "03236",
+ "03238",
+ "03240",
+ "03242",
+ "03244",
+ "03246",
+ "03248",
+ "03250",
+ "03252",
+ "03254",
+ "03256",
+ "03258",
+ "03260",
+ "03262",
+ "03264",
+ "03266",
+ "03268",
+ "03270",
+ "03272",
+ "03274",
+ "03276",
+ "03278",
+ "03280",
+ "03282",
+ "03284",
+ "03286",
+ "03288",
+ "03290",
+ "03292",
+ "03294",
+ "03296",
+ "03298",
+ "03300",
+ "03302",
+ "03304",
+ "03306",
+ "03308",
+ "03310",
+ "03312",
+ "03314",
+ "03316",
+ "03318",
+ "03320",
+ "03322",
+ "03324",
+ "03326",
+ "03328",
+ "03330",
+ "03332",
+ "03334",
+ "03336",
+ "03338",
+ "03340",
+ "03342",
+ "03344",
+ "03346",
+ "03348",
+ "03350",
+ "03352",
+ "03354",
+ "03356",
+ "03358",
+ "03360",
+ "03362",
+ "03364",
+ "03366",
+ "03368",
+ "03370",
+ "03372",
+ "03374",
+ "03376",
+ "03378",
+ "03380",
+ "03382",
+ "03384",
+ "03386",
+ "03388",
+ "03390",
+ "03392",
+ "03394",
+ "03396",
+ "03398",
+ "03400",
+ "03402",
+ "03404",
+ "03406",
+ "03408",
+ "03410",
+ "03412",
+ "03414",
+ "03416",
+ "03418",
+ "03420",
+ "03422",
+ "03424",
+ "03426",
+ "03428",
+ "03430",
+ "03432",
+ "03434",
+ "03436",
+ "03438",
+ "03440",
+ "03442",
+ "03444",
+ "03446",
+ "03448",
+ "03450",
+ "03452",
+ "03454",
+ "03456",
+ "03458",
+ "03460",
+ "03462",
+ "03464",
+ "03466",
+ "03468",
+ "03470",
+ "03472",
+ "03474",
+ "03476",
+ "03478",
+ "03480",
+ "03482",
+ "03484",
+ "03486",
+ "03488",
+ "03490",
+ "03492",
+ "03494",
+ "03496",
+ "03498",
+ "03500",
+ "03502",
+ "03504",
+ "03506",
+ "03508",
+ "03510",
+ "03512",
+ "03514",
+ "03516",
+ "03518",
+ "03520",
+ "03522",
+ "03524",
+ "03526",
+ "03528",
+ "03530",
+ "03532",
+ "03534",
+ "03536",
+ "03538",
+ "03540",
+ "03542",
+ "03544",
+ "03546",
+ "03548",
+ "03550",
+ "03552",
+ "03554",
+ "03556",
+ "03558",
+ "03560",
+ "03562",
+ "03564",
+ "03566",
+ "03568",
+ "03570",
+ "03572",
+ "03574",
+ "03576",
+ "03578",
+ "03580",
+ "03582",
+ "03584",
+ "03586",
+ "03588",
+ "03590",
+ "03592",
+ "03594",
+ "03596",
+ "03598",
+ "03600",
+ "03602",
+ "03604",
+ "03606",
+ "03608",
+ "03610",
+ "03612",
+ "03614",
+ "03616",
+ "03618",
+ "03620",
+ "03622",
+ "03624",
+ "03626",
+ "03628",
+ "03630",
+ "03632",
+ "03634",
+ "03636",
+ "03638",
+ "03640",
+ "03642",
+ "03644",
+ "03646",
+ "03648",
+ "03650",
+ "03652",
+ "03654",
+ "03656",
+ "03658",
+ "03660",
+ "03662",
+ "03664",
+ "03666",
+ "03668",
+ "03670",
+ "03672",
+ "03674",
+ "03676",
+ "03678",
+ "03680",
+ "03682",
+ "03684",
+ "03686",
+ "03688",
+ "03690",
+ "03692",
+ "03694",
+ "03696",
+ "03698",
+ "03700",
+ "03702",
+ "03704",
+ "03706",
+ "03708",
+ "03710",
+ "03712",
+ "03714",
+ "03716",
+ "03718",
+ "03720",
+ "03722",
+ "03724",
+ "03726",
+ "03728",
+ "03730",
+ "03732",
+ "03734",
+ "03736",
+ "03738",
+ "03740",
+ "03742",
+ "03744",
+ "03746",
+ "03748",
+ "03750",
+ "03752",
+ "03754",
+ "03756",
+ "03758",
+ "03760",
+ "03762",
+ "03764",
+ "03766",
+ "03768",
+ "03770",
+ "03772",
+ "03774",
+ "03776",
+ "03778",
+ "03780",
+ "03782",
+ "03784",
+ "03786",
+ "03788",
+ "03790",
+ "03792",
+ "03794",
+ "03796",
+ "03798",
+ "03800",
+ "03802",
+ "03804",
+ "03806",
+ "03808",
+ "03810",
+ "03812",
+ "03814",
+ "03816",
+ "03818",
+ "03820",
+ "03822",
+ "03824",
+ "03826",
+ "03828",
+ "03830",
+ "03832",
+ "03834",
+ "03836",
+ "03838",
+ "03840",
+ "03842",
+ "03844",
+ "03846",
+ "03848",
+ "03850",
+ "03852",
+ "03854",
+ "03856",
+ "03858",
+ "03860",
+ "03862",
+ "03864",
+ "03866",
+ "03868",
+ "03870",
+ "03872",
+ "03874",
+ "03876",
+ "03878",
+ "03880",
+ "03882",
+ "03884",
+ "03886",
+ "03888",
+ "03890",
+ "03892",
+ "03894",
+ "03896",
+ "03898",
+ "03900",
+ "03902",
+ "03904",
+ "03906",
+ "03908",
+ "03910",
+ "03912",
+ "03914",
+ "03916",
+ "03918",
+ "03920",
+ "03922",
+ "03924",
+ "03926",
+ "03928",
+ "03930",
+ "03932",
+ "03934",
+ "03936",
+ "03938",
+ "03940",
+ "03942",
+ "03944",
+ "03946",
+ "03948",
+ "03950",
+ "03952",
+ "03954",
+ "03956",
+ "03958",
+ "03960",
+ "03962",
+ "03964",
+ "03966",
+ "03968",
+ "03970",
+ "03972",
+ "03974",
+ "03976",
+ "03978",
+ "03980",
+ "03982",
+ "03984",
+ "03986",
+ "03988",
+ "03990",
+ "03992",
+ "03994",
+ "03996",
+ "03998",
+ "04000",
+ "04002",
+ "04004",
+ "04006",
+ "04008",
+ "04010",
+ "04012",
+ "04014",
+ "04016",
+ "04018",
+ "04020",
+ "04022",
+ "04024",
+ "04026",
+ "04028",
+ "04030",
+ "04032",
+ "04034",
+ "04036",
+ "04038",
+ "04040",
+ "04042",
+ "04044",
+ "04046",
+ "04048",
+ "04050",
+ "04052",
+ "04054",
+ "04056",
+ "04058",
+ "04060",
+ "04062",
+ "04064",
+ "04066",
+ "04068",
+ "04070",
+ "04072",
+ "04074",
+ "04076",
+ "04078",
+ "04080",
+ "04082",
+ "04084",
+ "04086",
+ "04088",
+ "04090",
+ "04092",
+ "04094",
+ "04096",
+ "04098",
+ "04100",
+ "04102",
+ "04104",
+ "04106",
+ "04108",
+ "04110",
+ "04112",
+ "04114",
+ "04116",
+ "04118",
+ "04120",
+ "04122",
+ "04124",
+ "04126",
+ "04128",
+ "04130",
+ "04132",
+ "04134",
+ "04136",
+ "04138",
+ "04140",
+ "04142",
+ "04144",
+ "04146",
+ "04148",
+ "04150",
+ "04152",
+ "04154",
+ "04156",
+ "04158",
+ "04160",
+ "04162",
+ "04164",
+ "04166",
+ "04168",
+ "04170",
+ "04172",
+ "04174",
+ "04176",
+ "04178",
+ "04180",
+ "04182",
+ "04184",
+ "04186",
+ "04188",
+ "04190",
+ "04192",
+ "04194",
+ "04196",
+ "04198",
+ "04200",
+ "04202",
+ "04204",
+ "04206",
+ "04208",
+ "04210",
+ "04212",
+ "04214",
+ "04216",
+ "04218",
+ "04220",
+ "04222",
+ "04224",
+ "04226",
+ "04228",
+ "04230",
+ "04232",
+ "04234",
+ "04236",
+ "04238",
+ "04240",
+ "04242",
+ "04244",
+ "04246",
+ "04248",
+ "04250",
+ "04252",
+ "04254",
+ "04256",
+ "04258",
+ "04260",
+ "04262",
+ "04264",
+ "04266",
+ "04268",
+ "04270",
+ "04272",
+ "04274",
+ "04276",
+ "04278",
+ "04280",
+ "04282",
+ "04284",
+ "04286",
+ "04288",
+ "04290",
+ "04292",
+ "04294",
+ "04296",
+ "04298",
+ "04300",
+ "04302",
+ "04304",
+ "04306",
+ "04308",
+ "04310",
+ "04312",
+ "04314",
+ "04316",
+ "04318",
+ "04320",
+ "04322",
+ "04324",
+ "04326",
+ "04328",
+ "04330",
+ "04332",
+ "04334",
+ "04336",
+ "04338",
+ "04340",
+ "04342",
+ "04344",
+ "04346",
+ "04348",
+ "04350",
+ "04352",
+ "04354",
+ "04356",
+ "04358",
+ "04360",
+ "04362",
+ "04364",
+ "04366",
+ "04368",
+ "04370",
+ "04372",
+ "04374",
+ "04376",
+ "04378",
+ "04380",
+ "04382",
+ "04384",
+ "04386",
+ "04388",
+ "04390",
+ "04392",
+ "04394",
+ "04396",
+ "04398",
+ "04400",
+ "04402",
+ "04404",
+ "04406",
+ "04408",
+ "04410",
+ "04412",
+ "04414",
+ "04416",
+ "04418",
+ "04420",
+ "04422",
+ "04424",
+ "04426",
+ "04428",
+ "04430",
+ "04432",
+ "04434",
+ "04436",
+ "04438",
+ "04440",
+ "04442",
+ "04444",
+ "04446",
+ "04448",
+ "04450",
+ "04452",
+ "04454",
+ "04456",
+ "04458",
+ "04460",
+ "04462",
+ "04464",
+ "04466",
+ "04468",
+ "04470",
+ "04472",
+ "04474",
+ "04476",
+ "04478",
+ "04480",
+ "04482",
+ "04484",
+ "04486",
+ "04488",
+ "04490",
+ "04492",
+ "04494",
+ "04496",
+ "04498",
+ "04500",
+ "04502",
+ "04504",
+ "04506",
+ "04508",
+ "04510",
+ "04512",
+ "04514",
+ "04516",
+ "04518",
+ "04520",
+ "04522",
+ "04524",
+ "04526",
+ "04528",
+ "04530",
+ "04532",
+ "04534",
+ "04536",
+ "04538",
+ "04540",
+ "04542",
+ "04544",
+ "04546",
+ "04548",
+ "04550",
+ "04552",
+ "04554",
+ "04556",
+ "04558",
+ "04560",
+ "04562",
+ "04564",
+ "04566",
+ "04568",
+ "04570",
+ "04572",
+ "04574",
+ "04576",
+ "04578",
+ "04580",
+ "04582",
+ "04584",
+ "04586",
+ "04588",
+ "04590",
+ "04592",
+ "04594",
+ "04596",
+ "04598",
+ "04600",
+ "04602",
+ "04604",
+ "04606",
+ "04608",
+ "04610",
+ "04612",
+ "04614",
+ "04616",
+ "04618",
+ "04620",
+ "04622",
+ "04624",
+ "04626",
+ "04628",
+ "04630",
+ "04632",
+ "04634",
+ "04636",
+ "04638",
+ "04640",
+ "04642",
+ "04644",
+ "04646",
+ "04648",
+ "04650",
+ "04652",
+ "04654",
+ "04656",
+ "04658",
+ "04660",
+ "04662",
+ "04664",
+ "04666",
+ "04668",
+ "04670",
+ "04672",
+ "04674",
+ "04676",
+ "04678",
+ "04680",
+ "04682",
+ "04684",
+ "04686",
+ "04688",
+ "04690",
+ "04692",
+ "04694",
+ "04696",
+ "04698",
+ "04700",
+ "04702",
+ "04704",
+ "04706",
+ "04708",
+ "04710",
+ "04712",
+ "04714",
+ "04716",
+ "04718",
+ "04720",
+ "04722",
+ "04724",
+ "04726",
+ "04728",
+ "04730",
+ "04732",
+ "04734",
+ "04736",
+ "04738",
+ "04740",
+ "04742",
+ "04744",
+ "04746",
+ "04748",
+ "04750",
+ "04752",
+ "04754",
+ "04756",
+ "04758",
+ "04760",
+ "04762",
+ "04764",
+ "04766",
+ "04768",
+ "04770",
+ "04772",
+ "04774",
+ "04776",
+ "04778",
+ "04780",
+ "04782",
+ "04784",
+ "04786",
+ "04788",
+ "04790",
+ "04792",
+ "04794",
+ "04796",
+ "04798",
+ "04800",
+ "04802",
+ "04804",
+ "04806",
+ "04808",
+ "04810",
+ "04812",
+ "04814",
+ "04816",
+ "04818",
+ "04820",
+ "04822",
+ "04824",
+ "04826",
+ "04828",
+ "04830",
+ "04832",
+ "04834",
+ "04836",
+ "04838",
+ "04840",
+ "04842",
+ "04844",
+ "04846",
+ "04848",
+ "04850",
+ "04852",
+ "04854",
+ "04856",
+ "04858",
+ "04860",
+ "04862",
+ "04864",
+ "04866",
+ "04868",
+ "04870",
+ "04872",
+ "04874",
+ "04876",
+ "04878",
+ "04880",
+ "04882",
+ "04884",
+ "04886",
+ "04888",
+ "04890",
+ "04892",
+ "04894",
+ "04896",
+ "04898",
+ "04900",
+ "04902",
+ "04904",
+ "04906",
+ "04908",
+ "04910",
+ "04912",
+ "04914",
+ "04916",
+ "04918",
+ "04920",
+ "04922",
+ "04924",
+ "04926",
+ "04928",
+ "04930",
+ "04932",
+ "04934",
+ "04936",
+ "04938",
+ "04940",
+ "04942",
+ "04944",
+ "04946",
+ "04948",
+ "04950",
+ "04952",
+ "04954",
+ "04956",
+ "04958",
+ "04960",
+ "04962",
+ "04964",
+ "04966",
+ "04968",
+ "04970",
+ "04972",
+ "04974",
+ "04976",
+ "04978",
+ "04980",
+ "04982",
+ "04984",
+ "04986",
+ "04988",
+ "04990",
+ "04992",
+ "04994",
+ "04996",
+ "04998",
+ "05000",
+ "05002",
+ "05004",
+ "05006",
+ "05008",
+ "05010",
+ "05012",
+ "05014",
+ "05016",
+ "05018",
+ "05020",
+ "05022",
+ "05024",
+ "05026",
+ "05028",
+ "05030",
+ "05032",
+ "05034",
+ "05036",
+ "05038",
+ "05040",
+ "05042",
+ "05044",
+ "05046",
+ "05048",
+ "05050",
+ "05052",
+ "05054",
+ "05056",
+ "05058",
+ "05060",
+ "05062",
+ "05064",
+ "05066",
+ "05068",
+ "05070",
+ "05072",
+ "05074",
+ "05076",
+ "05078",
+ "05080",
+ "05082",
+ "05084",
+ "05086",
+ "05088",
+ "05090",
+ "05092",
+ "05094",
+ "05096",
+ "05098",
+ "05100",
+ "05102",
+ "05104",
+ "05106",
+ "05108",
+ "05110",
+ "05112",
+ "05114",
+ "05116",
+ "05118",
+ "05120",
+ "05122",
+ "05124",
+ "05126",
+ "05128",
+ "05130",
+ "05132",
+ "05134",
+ "05136",
+ "05138",
+ "05140",
+ "05142",
+ "05144",
+ "05146",
+ "05148",
+ "05150",
+ "05152",
+ "05154",
+ "05156",
+ "05158",
+ "05160",
+ "05162",
+ "05164",
+ "05166",
+ "05168",
+ "05170",
+ "05172",
+ "05174",
+ "05176",
+ "05178",
+ "05180",
+ "05182",
+ "05184",
+ "05186",
+ "05188",
+ "05190",
+ "05192",
+ "05194",
+ "05196",
+ "05198",
+ "05200",
+ "05202",
+ "05204",
+ "05206",
+ "05208",
+ "05210",
+ "05212",
+ "05214",
+ "05216",
+ "05218",
+ "05220",
+ "05222",
+ "05224",
+ "05226",
+ "05228",
+ "05230",
+ "05232",
+ "05234",
+ "05236",
+ "05238",
+ "05240",
+ "05242",
+ "05244",
+ "05246",
+ "05248",
+ "05250",
+ "05252",
+ "05254",
+ "05256",
+ "05258",
+ "05260",
+ "05262",
+ "05264",
+ "05266",
+ "05268",
+ "05270",
+ "05272",
+ "05274",
+ "05276",
+ "05278",
+ "05280",
+ "05282",
+ "05284",
+ "05286",
+ "05288",
+ "05290",
+ "05292",
+ "05294",
+ "05296",
+ "05298",
+ "05300",
+ "05302",
+ "05304",
+ "05306",
+ "05308",
+ "05310",
+ "05312",
+ "05314",
+ "05316",
+ "05318",
+ "05320",
+ "05322",
+ "05324",
+ "05326",
+ "05328",
+ "05330",
+ "05332",
+ "05334",
+ "05336",
+ "05338",
+ "05340",
+ "05342",
+ "05344",
+ "05346",
+ "05348",
+ "05350",
+ "05352",
+ "05354",
+ "05356",
+ "05358",
+ "05360",
+ "05362",
+ "05364",
+ "05366",
+ "05368",
+ "05370",
+ "05372",
+ "05374",
+ "05376",
+ "05378",
+ "05380",
+ "05382",
+ "05384",
+ "05386",
+ "05388",
+ "05390",
+ "05392",
+ "05394",
+ "05396",
+ "05398",
+ "05400",
+ "05402",
+ "05404",
+ "05406",
+ "05408",
+ "05410",
+ "05412",
+ "05414",
+ "05416",
+ "05418",
+ "05420",
+ "05422",
+ "05424",
+ "05426",
+ "05428",
+ "05430",
+ "05432",
+ "05434",
+ "05436",
+ "05438",
+ "05440",
+ "05442",
+ "05444",
+ "05446",
+ "05448",
+ "05450",
+ "05452",
+ "05454",
+ "05456",
+ "05458",
+ "05460",
+ "05462",
+ "05464",
+ "05466",
+ "05468",
+ "05470",
+ "05472",
+ "05474",
+ "05476",
+ "05478",
+ "05480",
+ "05482",
+ "05484",
+ "05486",
+ "05488",
+ "05490",
+ "05492",
+ "05494",
+ "05496",
+ "05498",
+ "05500",
+ "05502",
+ "05504",
+ "05506",
+ "05508",
+ "05510",
+ "05512",
+ "05514",
+ "05516",
+ "05518",
+ "05520",
+ "05522",
+ "05524",
+ "05526",
+ "05528",
+ "05530",
+ "05532",
+ "05534",
+ "05536",
+ "05538",
+ "05540",
+ "05542",
+ "05544",
+ "05546",
+ "05548",
+ "05550",
+ "05552",
+ "05554",
+ "05556",
+ "05558",
+ "05560",
+ "05562",
+ "05564",
+ "05566",
+ "05568",
+ "05570",
+ "05572",
+ "05574",
+ "05576",
+ "05578",
+ "05580",
+ "05582",
+ "05584",
+ "05586",
+ "05588",
+ "05590",
+ "05592",
+ "05594",
+ "05596",
+ "05598",
+ "05600",
+ "05602",
+ "05604",
+ "05606",
+ "05608",
+ "05610",
+ "05612",
+ "05614",
+ "05616",
+ "05618",
+ "05620",
+ "05622",
+ "05624",
+ "05626",
+ "05628",
+ "05630",
+ "05632",
+ "05634",
+ "05636",
+ "05638",
+ "05640",
+ "05642",
+ "05644",
+ "05646",
+ "05648",
+ "05650",
+ "05652",
+ "05654",
+ "05656",
+ "05658",
+ "05660",
+ "05662",
+ "05664",
+ "05666",
+ "05668",
+ "05670",
+ "05672",
+ "05674",
+ "05676",
+ "05678",
+ "05680",
+ "05682",
+ "05684",
+ "05686",
+ "05688",
+ "05690",
+ "05692",
+ "05694",
+ "05696",
+ "05698",
+ "05700",
+ "05702",
+ "05704",
+ "05706",
+ "05708",
+ "05710",
+ "05712",
+ "05714",
+ "05716",
+ "05718",
+ "05720",
+ "05722",
+ "05724",
+ "05726",
+ "05728",
+ "05730",
+ "05732",
+ "05734",
+ "05736",
+ "05738",
+ "05740",
+ "05742",
+ "05744",
+ "05746",
+ "05748",
+ "05750",
+ "05752",
+ "05754",
+ "05756",
+ "05758",
+ "05760",
+ "05762",
+ "05764",
+ "05766",
+ "05768",
+ "05770",
+ "05772",
+ "05774",
+ "05776",
+ "05778",
+ "05780",
+ "05782",
+ "05784",
+ "05786",
+ "05788",
+ "05790",
+ "05792",
+ "05794",
+ "05796",
+ "05798",
+ "05800",
+ "05802",
+ "05804",
+ "05806",
+ "05808",
+ "05810",
+ "05812",
+ "05814",
+ "05816",
+ "05818",
+ "05820",
+ "05822",
+ "05824",
+ "05826",
+ "05828",
+ "05830",
+ "05832",
+ "05834",
+ "05836",
+ "05838",
+ "05840",
+ "05842",
+ "05844",
+ "05846",
+ "05848",
+ "05850",
+ "05852",
+ "05854",
+ "05856",
+ "05858",
+ "05860",
+ "05862",
+ "05864",
+ "05866",
+ "05868",
+ "05870",
+ "05872",
+ "05874",
+ "05876",
+ "05878",
+ "05880",
+ "05882",
+ "05884",
+ "05886",
+ "05888",
+ "05890",
+ "05892",
+ "05894",
+ "05896",
+ "05898",
+ "05900",
+ "05902",
+ "05904",
+ "05906",
+ "05908",
+ "05910",
+ "05912",
+ "05914",
+ "05916",
+ "05918",
+ "05920",
+ "05922",
+ "05924",
+ "05926",
+ "05928",
+ "05930",
+ "05932",
+ "05934",
+ "05936",
+ "05938",
+ "05940",
+ "05942",
+ "05944",
+ "05946",
+ "05948",
+ "05950",
+ "05952",
+ "05954",
+ "05956",
+ "05958",
+ "05960",
+ "05962",
+ "05964",
+ "05966",
+ "05968",
+ "05970",
+ "05972",
+ "05974",
+ "05976",
+ "05978",
+ "05980",
+ "05982",
+ "05984",
+ "05986",
+ "05988",
+ "05990",
+ "05992",
+ "05994",
+ "05996",
+ "05998",
+ "06000",
+ "06002",
+ "06004",
+ "06006",
+ "06008",
+ "06010",
+ "06012",
+ "06014",
+ "06016",
+ "06018",
+ "06020",
+ "06022",
+ "06024",
+ "06026",
+ "06028",
+ "06030",
+ "06032",
+ "06034",
+ "06036",
+ "06038",
+ "06040",
+ "06042",
+ "06044",
+ "06046",
+ "06048",
+ "06050",
+ "06052",
+ "06054",
+ "06056",
+ "06058",
+ "06060",
+ "06062",
+ "06064",
+ "06066",
+ "06068",
+ "06070",
+ "06072",
+ "06074",
+ "06076",
+ "06078",
+ "06080",
+ "06082",
+ "06084",
+ "06086",
+ "06088",
+ "06090",
+ "06092",
+ "06094",
+ "06096",
+ "06098",
+ "06100",
+ "06102",
+ "06104",
+ "06106",
+ "06108",
+ "06110",
+ "06112",
+ "06114",
+ "06116",
+ "06118",
+ "06120",
+ "06122",
+ "06124",
+ "06126",
+ "06128",
+ "06130",
+ "06132",
+ "06134",
+ "06136",
+ "06138",
+ "06140",
+ "06142",
+ "06144",
+ "06146",
+ "06148",
+ "06150",
+ "06152",
+ "06154",
+ "06156",
+ "06158",
+ "06160",
+ "06162",
+ "06164",
+ "06166",
+ "06168",
+ "06170",
+ "06172",
+ "06174",
+ "06176",
+ "06178",
+ "06180",
+ "06182",
+ "06184",
+ "06186",
+ "06188",
+ "06190",
+ "06192",
+ "06194",
+ "06196",
+ "06198",
+ "06200",
+ "06202",
+ "06204",
+ "06206",
+ "06208",
+ "06210",
+ "06212",
+ "06214",
+ "06216",
+ "06218",
+ "06220",
+ "06222",
+ "06224",
+ "06226",
+ "06228",
+ "06230",
+ "06232",
+ "06234",
+ "06236",
+ "06238",
+ "06240",
+ "06242",
+ "06244",
+ "06246",
+ "06248",
+ "06250",
+ "06252",
+ "06254",
+ "06256",
+ "06258",
+ "06260",
+ "06262",
+ "06264",
+ "06266",
+ "06268",
+ "06270",
+ "06272",
+ "06274",
+ "06276",
+ "06278",
+ "06280",
+ "06282",
+ "06284",
+ "06286",
+ "06288",
+ "06290",
+ "06292",
+ "06294",
+ "06296",
+ "06298",
+ "06300",
+ "06302",
+ "06304",
+ "06306",
+ "06308",
+ "06310",
+ "06312",
+ "06314",
+ "06316",
+ "06318",
+ "06320",
+ "06322",
+ "06324",
+ "06326",
+ "06328",
+ "06330",
+ "06332",
+ "06334",
+ "06336",
+ "06338",
+ "06340",
+ "06342",
+ "06344",
+ "06346",
+ "06348",
+ "06350",
+ "06352",
+ "06354",
+ "06356",
+ "06358",
+ "06360",
+ "06362",
+ "06364",
+ "06366",
+ "06368",
+ "06370",
+ "06372",
+ "06374",
+ "06376",
+ "06378",
+ "06380",
+ "06382",
+ "06384",
+ "06386",
+ "06388",
+ "06390",
+ "06392",
+ "06394",
+ "06396",
+ "06398",
+ "06400",
+ "06402",
+ "06404",
+ "06406",
+ "06408",
+ "06410",
+ "06412",
+ "06414",
+ "06416",
+ "06418",
+ "06420",
+ "06422",
+ "06424",
+ "06426",
+ "06428",
+ "06430",
+ "06432",
+ "06434",
+ "06436",
+ "06438",
+ "06440",
+ "06442",
+ "06444",
+ "06446",
+ "06448",
+ "06450",
+ "06452",
+ "06454",
+ "06456",
+ "06458",
+ "06460",
+ "06462",
+ "06464",
+ "06466",
+ "06468",
+ "06470",
+ "06472",
+ "06474",
+ "06476",
+ "06478",
+ "06480",
+ "06482",
+ "06484",
+ "06486",
+ "06488",
+ "06490",
+ "06492",
+ "06494",
+ "06496",
+ "06498",
+ "06500",
+ "06502",
+ "06504",
+ "06506",
+ "06508",
+ "06510",
+ "06512",
+ "06514",
+ "06516",
+ "06518",
+ "06520",
+ "06522",
+ "06524",
+ "06526",
+ "06528",
+ "06530",
+ "06532",
+ "06534",
+ "06536",
+ "06538",
+ "06540",
+ "06542",
+ "06544",
+ "06546",
+ "06548",
+ "06550",
+ "06552",
+ "06554",
+ "06556",
+ "06558",
+ "06560",
+ "06562",
+ "06564",
+ "06566",
+ "06568",
+ "06570",
+ "06572",
+ "06574",
+ "06576",
+ "06578",
+ "06580",
+ "06582",
+ "06584",
+ "06586",
+ "06588",
+ "06590",
+ "06592",
+ "06594",
+ "06596",
+ "06598",
+ "06600",
+ "06602",
+ "06604",
+ "06606",
+ "06608",
+ "06610",
+ "06612",
+ "06614",
+ "06616",
+ "06618",
+ "06620",
+ "06622",
+ "06624",
+ "06626",
+ "06628",
+ "06630",
+ "06632",
+ "06634",
+ "06636",
+ "06638",
+ "06640",
+ "06642",
+ "06644",
+ "06646",
+ "06648",
+ "06650",
+ "06652",
+ "06654",
+ "06656",
+ "06658",
+ "06660",
+ "06662",
+ "06664",
+ "06666",
+ "06668",
+ "06670",
+ "06672",
+ "06674",
+ "06676",
+ "06678",
+ "06680",
+ "06682",
+ "06684",
+ "06686",
+ "06688",
+ "06690",
+ "06692",
+ "06694",
+ "06696",
+ "06698",
+ "06700",
+ "06702",
+ "06704",
+ "06706",
+ "06708",
+ "06710",
+ "06712",
+ "06714",
+ "06716",
+ "06718",
+ "06720",
+ "06722",
+ "06724",
+ "06726",
+ "06728",
+ "06730",
+ "06732",
+ "06734",
+ "06736",
+ "06738",
+ "06740",
+ "06742",
+ "06744",
+ "06746",
+ "06748",
+ "06750",
+ "06752",
+ "06754",
+ "06756",
+ "06758",
+ "06760",
+ "06762",
+ "06764",
+ "06766",
+ "06768",
+ "06770",
+ "06772",
+ "06774",
+ "06776",
+ "06778",
+ "06780",
+ "06782",
+ "06784",
+ "06786",
+ "06788",
+ "06790",
+ "06792",
+ "06794",
+ "06796",
+ "06798",
+ "06800",
+ "06802",
+ "06804",
+ "06806",
+ "06808",
+ "06810",
+ "06812",
+ "06814",
+ "06816",
+ "06818",
+ "06820",
+ "06822",
+ "06824",
+ "06826",
+ "06828",
+ "06830",
+ "06832",
+ "06834",
+ "06836",
+ "06838",
+ "06840",
+ "06842",
+ "06844",
+ "06846",
+ "06848",
+ "06850",
+ "06852",
+ "06854",
+ "06856",
+ "06858",
+ "06860",
+ "06862",
+ "06864",
+ "06866",
+ "06868",
+ "06870",
+ "06872",
+ "06874",
+ "06876",
+ "06878",
+ "06880",
+ "06882",
+ "06884",
+ "06886",
+ "06888",
+ "06890",
+ "06892",
+ "06894",
+ "06896",
+ "06898",
+ "06900",
+ "06902",
+ "06904",
+ "06906",
+ "06908",
+ "06910",
+ "06912",
+ "06914",
+ "06916",
+ "06918",
+ "06920",
+ "06922",
+ "06924",
+ "06926",
+ "06928",
+ "06930",
+ "06932",
+ "06934",
+ "06936",
+ "06938",
+ "06940",
+ "06942",
+ "06944",
+ "06946",
+ "06948",
+ "06950",
+ "06952",
+ "06954",
+ "06956",
+ "06958",
+ "06960",
+ "06962",
+ "06964",
+ "06966",
+ "06968",
+ "06970",
+ "06972",
+ "06974",
+ "06976",
+ "06978",
+ "06980",
+ "06982",
+ "06984",
+ "06986",
+ "06988",
+ "06990",
+ "06992",
+ "06994",
+ "06996",
+ "06998",
+ "07000",
+ "07002",
+ "07004",
+ "07006",
+ "07008",
+ "07010",
+ "07012",
+ "07014",
+ "07016",
+ "07018",
+ "07020",
+ "07022",
+ "07024",
+ "07026",
+ "07028",
+ "07030",
+ "07032",
+ "07034",
+ "07036",
+ "07038",
+ "07040",
+ "07042",
+ "07044",
+ "07046",
+ "07048",
+ "07050",
+ "07052",
+ "07054",
+ "07056",
+ "07058",
+ "07060",
+ "07062",
+ "07064",
+ "07066",
+ "07068",
+ "07070",
+ "07072",
+ "07074",
+ "07076",
+ "07078",
+ "07080",
+ "07082",
+ "07084",
+ "07086",
+ "07088",
+ "07090",
+ "07092",
+ "07094",
+ "07096",
+ "07098",
+ "07100",
+ "07102",
+ "07104",
+ "07106",
+ "07108",
+ "07110",
+ "07112",
+ "07114",
+ "07116",
+ "07118",
+ "07120",
+ "07122",
+ "07124",
+ "07126",
+ "07128",
+ "07130",
+ "07132",
+ "07134",
+ "07136",
+ "07138",
+ "07140",
+ "07142",
+ "07144",
+ "07146",
+ "07148",
+ "07150",
+ "07152",
+ "07154",
+ "07156",
+ "07158",
+ "07160",
+ "07162",
+ "07164",
+ "07166",
+ "07168",
+ "07170",
+ "07172",
+ "07174",
+ "07176",
+ "07178",
+ "07180",
+ "07182",
+ "07184",
+ "07186",
+ "07188",
+ "07190",
+ "07192",
+ "07194",
+ "07196",
+ "07198",
+ "07200",
+ "07202",
+ "07204",
+ "07206",
+ "07208",
+ "07210",
+ "07212",
+ "07214",
+ "07216",
+ "07218",
+ "07220",
+ "07222",
+ "07224",
+ "07226",
+ "07228",
+ "07230",
+ "07232",
+ "07234",
+ "07236",
+ "07238",
+ "07240",
+ "07242",
+ "07244",
+ "07246",
+ "07248",
+ "07250",
+ "07252",
+ "07254",
+ "07256",
+ "07258",
+ "07260",
+ "07262",
+ "07264",
+ "07266",
+ "07268",
+ "07270",
+ "07272",
+ "07274",
+ "07276",
+ "07278",
+ "07280",
+ "07282",
+ "07284",
+ "07286",
+ "07288",
+ "07290",
+ "07292",
+ "07294",
+ "07296",
+ "07298",
+ "07300",
+ "07302",
+ "07304",
+ "07306",
+ "07308",
+ "07310",
+ "07312",
+ "07314",
+ "07316",
+ "07318",
+ "07320",
+ "07322",
+ "07324",
+ "07326",
+ "07328",
+ "07330",
+ "07332",
+ "07334",
+ "07336",
+ "07338",
+ "07340",
+ "07342",
+ "07344",
+ "07346",
+ "07348",
+ "07350",
+ "07352",
+ "07354",
+ "07356",
+ "07358",
+ "07360",
+ "07362",
+ "07364",
+ "07366",
+ "07368",
+ "07370",
+ "07372",
+ "07374",
+ "07376",
+ "07378",
+ "07380",
+ "07382",
+ "07384",
+ "07386",
+ "07388",
+ "07390",
+ "07392",
+ "07394",
+ "07396",
+ "07398",
+ "07400",
+ "07402",
+ "07404",
+ "07406",
+ "07408",
+ "07410",
+ "07412",
+ "07414",
+ "07416",
+ "07418",
+ "07420",
+ "07422",
+ "07424",
+ "07426",
+ "07428",
+ "07430",
+ "07432",
+ "07434",
+ "07436",
+ "07438",
+ "07440",
+ "07442",
+ "07444",
+ "07446",
+ "07448",
+ "07450",
+ "07452",
+ "07454",
+ "07456",
+ "07458",
+ "07460",
+ "07462",
+ "07464",
+ "07466",
+ "07468",
+ "07470",
+ "07472",
+ "07474",
+ "07476",
+ "07478",
+ "07480",
+ "07482",
+ "07484",
+ "07486",
+ "07488",
+ "07490",
+ "07492",
+ "07494",
+ "07496",
+ "07498",
+ "07500",
+ "07502",
+ "07504",
+ "07506",
+ "07508",
+ "07510",
+ "07512",
+ "07514",
+ "07516",
+ "07518",
+ "07520",
+ "07522",
+ "07524",
+ "07526",
+ "07528",
+ "07530",
+ "07532",
+ "07534",
+ "07536",
+ "07538",
+ "07540",
+ "07542",
+ "07544",
+ "07546",
+ "07548",
+ "07550",
+ "07552",
+ "07554",
+ "07556",
+ "07558",
+ "07560",
+ "07562",
+ "07564",
+ "07566",
+ "07568",
+ "07570",
+ "07572",
+ "07574",
+ "07576",
+ "07578",
+ "07580",
+ "07582",
+ "07584",
+ "07586",
+ "07588",
+ "07590",
+ "07592",
+ "07594",
+ "07596",
+ "07598",
+ "07600",
+ "07602",
+ "07604",
+ "07606",
+ "07608",
+ "07610",
+ "07612",
+ "07614",
+ "07616",
+ "07618",
+ "07620",
+ "07622",
+ "07624",
+ "07626",
+ "07628",
+ "07630",
+ "07632",
+ "07634",
+ "07636",
+ "07638",
+ "07640",
+ "07642",
+ "07644",
+ "07646",
+ "07648",
+ "07650",
+ "07652",
+ "07654",
+ "07656",
+ "07658",
+ "07660",
+ "07662",
+ "07664",
+ "07666",
+ "07668",
+ "07670",
+ "07672",
+ "07674",
+ "07676",
+ "07678",
+ "07680",
+ "07682",
+ "07684",
+ "07686",
+ "07688",
+ "07690",
+ "07692",
+ "07694",
+ "07696",
+ "07698",
+ "07700",
+ "07702",
+ "07704",
+ "07706",
+ "07708",
+ "07710",
+ "07712",
+ "07714",
+ "07716",
+ "07718",
+ "07720",
+ "07722",
+ "07724",
+ "07726",
+ "07728",
+ "07730",
+ "07732",
+ "07734",
+ "07736",
+ "07738",
+ "07740",
+ "07742",
+ "07744",
+ "07746",
+ "07748",
+ "07750",
+ "07752",
+ "07754",
+ "07756",
+ "07758",
+ "07760",
+ "07762",
+ "07764",
+ "07766",
+ "07768",
+ "07770",
+ "07772",
+ "07774",
+ "07776",
+ "07778",
+ "07780",
+ "07782",
+ "07784",
+ "07786",
+ "07788",
+ "07790",
+ "07792",
+ "07794",
+ "07796",
+ "07798",
+ "07800",
+ "07802",
+ "07804",
+ "07806",
+ "07808",
+ "07810",
+ "07812",
+ "07814",
+ "07816",
+ "07818",
+ "07820",
+ "07822",
+ "07824",
+ "07826",
+ "07828",
+ "07830",
+ "07832",
+ "07834",
+ "07836",
+ "07838",
+ "07840",
+ "07842",
+ "07844",
+ "07846",
+ "07848",
+ "07850",
+ "07852",
+ "07854",
+ "07856",
+ "07858",
+ "07860",
+ "07862",
+ "07864",
+ "07866",
+ "07868",
+ "07870",
+ "07872",
+ "07874",
+ "07876",
+ "07878",
+ "07880",
+ "07882",
+ "07884",
+ "07886",
+ "07888",
+ "07890",
+ "07892",
+ "07894",
+ "07896",
+ "07898",
+ "07900",
+ "07902",
+ "07904",
+ "07906",
+ "07908",
+ "07910",
+ "07912",
+ "07914",
+ "07916",
+ "07918",
+ "07920",
+ "07922",
+ "07924",
+ "07926",
+ "07928",
+ "07930",
+ "07932",
+ "07934",
+ "07936",
+ "07938",
+ "07940",
+ "07942",
+ "07944",
+ "07946",
+ "07948",
+ "07950",
+ "07952",
+ "07954",
+ "07956",
+ "07958",
+ "07960",
+ "07962",
+ "07964",
+ "07966",
+ "07968",
+ "07970",
+ "07972",
+ "07974",
+ "07976",
+ "07978",
+ "07980",
+ "07982",
+ "07984",
+ "07986",
+ "07988",
+ "07990",
+ "07992",
+ "07994",
+ "07996",
+ "07998",
+ "08000",
+ "08002",
+ "08004",
+ "08006",
+ "08008",
+ "08010",
+ "08012",
+ "08014",
+ "08016",
+ "08018",
+ "08020",
+ "08022",
+ "08024",
+ "08026",
+ "08028",
+ "08030",
+ "08032",
+ "08034",
+ "08036",
+ "08038",
+ "08040",
+ "08042",
+ "08044",
+ "08046",
+ "08048",
+ "08050",
+ "08052",
+ "08054",
+ "08056",
+ "08058",
+ "08060",
+ "08062",
+ "08064",
+ "08066",
+ "08068",
+ "08070",
+ "08072",
+ "08074",
+ "08076",
+ "08078",
+ "08080",
+ "08082",
+ "08084",
+ "08086",
+ "08088",
+ "08090",
+ "08092",
+ "08094",
+ "08096",
+ "08098",
+ "08100",
+ "08102",
+ "08104",
+ "08106",
+ "08108",
+ "08110",
+ "08112",
+ "08114",
+ "08116",
+ "08118",
+ "08120",
+ "08122",
+ "08124",
+ "08126",
+ "08128",
+ "08130",
+ "08132",
+ "08134",
+ "08136",
+ "08138",
+ "08140",
+ "08142",
+ "08144",
+ "08146",
+ "08148",
+ "08150",
+ "08152",
+ "08154",
+ "08156",
+ "08158",
+ "08160",
+ "08162",
+ "08164",
+ "08166",
+ "08168",
+ "08170",
+ "08172",
+ "08174",
+ "08176",
+ "08178",
+ "08180",
+ "08182",
+ "08184",
+ "08186",
+ "08188",
+ "08190",
+ "08192",
+ "08194",
+ "08196",
+ "08198",
+ "08200",
+ "08202",
+ "08204",
+ "08206",
+ "08208",
+ "08210",
+ "08212",
+ "08214",
+ "08216",
+ "08218",
+ "08220",
+ "08222",
+ "08224",
+ "08226",
+ "08228",
+ "08230",
+ "08232",
+ "08234",
+ "08236",
+ "08238",
+ "08240",
+ "08242",
+ "08244",
+ "08246",
+ "08248",
+ "08250",
+ "08252",
+ "08254",
+ "08256",
+ "08258",
+ "08260",
+ "08262",
+ "08264",
+ "08266",
+ "08268",
+ "08270",
+ "08272",
+ "08274",
+ "08276",
+ "08278",
+ "08280",
+ "08282",
+ "08284",
+ "08286",
+ "08288",
+ "08290",
+ "08292",
+ "08294",
+ "08296",
+ "08298",
+ "08300",
+ "08302",
+ "08304",
+ "08306",
+ "08308",
+ "08310",
+ "08312",
+ "08314",
+ "08316",
+ "08318",
+ "08320",
+ "08322",
+ "08324",
+ "08326",
+ "08328",
+ "08330",
+ "08332",
+ "08334",
+ "08336",
+ "08338",
+ "08340",
+ "08342",
+ "08344",
+ "08346",
+ "08348",
+ "08350",
+ "08352",
+ "08354",
+ "08356",
+ "08358",
+ "08360",
+ "08362",
+ "08364",
+ "08366",
+ "08368",
+ "08370",
+ "08372",
+ "08374",
+ "08376",
+ "08378",
+ "08380",
+ "08382",
+ "08384",
+ "08386",
+ "08388",
+ "08390",
+ "08392",
+ "08394",
+ "08396",
+ "08398",
+ "08400",
+ "08402",
+ "08404",
+ "08406",
+ "08408",
+ "08410",
+ "08412",
+ "08414",
+ "08416",
+ "08418",
+ "08420",
+ "08422",
+ "08424",
+ "08426",
+ "08428",
+ "08430",
+ "08432",
+ "08434",
+ "08436",
+ "08438",
+ "08440",
+ "08442",
+ "08444",
+ "08446",
+ "08448",
+ "08450",
+ "08452",
+ "08454",
+ "08456",
+ "08458",
+ "08460",
+ "08462",
+ "08464",
+ "08466",
+ "08468",
+ "08470",
+ "08472",
+ "08474",
+ "08476",
+ "08478",
+ "08480",
+ "08482",
+ "08484",
+ "08486",
+ "08488",
+ "08490",
+ "08492",
+ "08494",
+ "08496",
+ "08498",
+ "08500",
+ "08502",
+ "08504",
+ "08506",
+ "08508",
+ "08510",
+ "08512",
+ "08514",
+ "08516",
+ "08518",
+ "08520",
+ "08522",
+ "08524",
+ "08526",
+ "08528",
+ "08530",
+ "08532",
+ "08534",
+ "08536",
+ "08538",
+ "08540",
+ "08542",
+ "08544",
+ "08546",
+ "08548",
+ "08550",
+ "08552",
+ "08554",
+ "08556",
+ "08558",
+ "08560",
+ "08562",
+ "08564",
+ "08566",
+ "08568",
+ "08570",
+ "08572",
+ "08574",
+ "08576",
+ "08578",
+ "08580",
+ "08582",
+ "08584",
+ "08586",
+ "08588",
+ "08590",
+ "08592",
+ "08594",
+ "08596",
+ "08598",
+ "08600",
+ "08602",
+ "08604",
+ "08606",
+ "08608",
+ "08610",
+ "08612",
+ "08614",
+ "08616",
+ "08618",
+ "08620",
+ "08622",
+ "08624",
+ "08626",
+ "08628",
+ "08630",
+ "08632",
+ "08634",
+ "08636",
+ "08638",
+ "08640",
+ "08642",
+ "08644",
+ "08646",
+ "08648",
+ "08650",
+ "08652",
+ "08654",
+ "08656",
+ "08658",
+ "08660",
+ "08662",
+ "08664",
+ "08666",
+ "08668",
+ "08670",
+ "08672",
+ "08674",
+ "08676",
+ "08678",
+ "08680",
+ "08682",
+ "08684",
+ "08686",
+ "08688",
+ "08690",
+ "08692",
+ "08694",
+ "08696",
+ "08698",
+ "08700",
+ "08702",
+ "08704",
+ "08706",
+ "08708",
+ "08710",
+ "08712",
+ "08714",
+ "08716",
+ "08718",
+ "08720",
+ "08722",
+ "08724",
+ "08726",
+ "08728",
+ "08730",
+ "08732",
+ "08734",
+ "08736",
+ "08738",
+ "08740",
+ "08742",
+ "08744",
+ "08746",
+ "08748",
+ "08750",
+ "08752",
+ "08754",
+ "08756",
+ "08758",
+ "08760",
+ "08762",
+ "08764",
+ "08766",
+ "08768",
+ "08770",
+ "08772",
+ "08774",
+ "08776",
+ "08778",
+ "08780",
+ "08782",
+ "08784",
+ "08786",
+ "08788",
+ "08790",
+ "08792",
+ "08794",
+ "08796",
+ "08798",
+ "08800",
+ "08802",
+ "08804",
+ "08806",
+ "08808",
+ "08810",
+ "08812",
+ "08814",
+ "08816",
+ "08818",
+ "08820",
+ "08822",
+ "08824",
+ "08826",
+ "08828",
+ "08830",
+ "08832",
+ "08834",
+ "08836",
+ "08838",
+ "08840",
+ "08842",
+ "08844",
+ "08846",
+ "08848",
+ "08850",
+ "08852",
+ "08854",
+ "08856",
+ "08858",
+ "08860",
+ "08862",
+ "08864",
+ "08866",
+ "08868",
+ "08870",
+ "08872",
+ "08874",
+ "08876",
+ "08878",
+ "08880",
+ "08882",
+ "08884",
+ "08886",
+ "08888",
+ "08890",
+ "08892",
+ "08894",
+ "08896",
+ "08898",
+ "08900",
+ "08902",
+ "08904",
+ "08906",
+ "08908",
+ "08910",
+ "08912",
+ "08914",
+ "08916",
+ "08918",
+ "08920",
+ "08922",
+ "08924",
+ "08926",
+ "08928",
+ "08930",
+ "08932",
+ "08934",
+ "08936",
+ "08938",
+ "08940",
+ "08942",
+ "08944",
+ "08946",
+ "08948",
+ "08950",
+ "08952",
+ "08954",
+ "08956",
+ "08958",
+ "08960",
+ "08962",
+ "08964",
+ "08966",
+ "08968",
+ "08970",
+ "08972",
+ "08974",
+ "08976",
+ "08978",
+ "08980",
+ "08982",
+ "08984",
+ "08986",
+ "08988",
+ "08990",
+ "08992",
+ "08994",
+ "08996",
+ "08998",
+ "09000",
+ "09002",
+ "09004",
+ "09006",
+ "09008",
+ "09010",
+ "09012",
+ "09014",
+ "09016",
+ "09018",
+ "09020",
+ "09022",
+ "09024",
+ "09026",
+ "09028",
+ "09030",
+ "09032",
+ "09034",
+ "09036",
+ "09038",
+ "09040",
+ "09042",
+ "09044",
+ "09046",
+ "09048",
+ "09050",
+ "09052",
+ "09054",
+ "09056",
+ "09058",
+ "09060",
+ "09062",
+ "09064",
+ "09066",
+ "09068",
+ "09070",
+ "09072",
+ "09074",
+ "09076",
+ "09078",
+ "09080",
+ "09082",
+ "09084",
+ "09086",
+ "09088",
+ "09090",
+ "09092",
+ "09094",
+ "09096",
+ "09098",
+ "09100",
+ "09102",
+ "09104",
+ "09106",
+ "09108",
+ "09110",
+ "09112",
+ "09114",
+ "09116",
+ "09118",
+ "09120",
+ "09122",
+ "09124",
+ "09126",
+ "09128",
+ "09130",
+ "09132",
+ "09134",
+ "09136",
+ "09138",
+ "09140",
+ "09142",
+ "09144",
+ "09146",
+ "09148",
+ "09150",
+ "09152",
+ "09154",
+ "09156",
+ "09158",
+ "09160",
+ "09162",
+ "09164",
+ "09166",
+ "09168",
+ "09170",
+ "09172",
+ "09174",
+ "09176",
+ "09178",
+ "09180",
+ "09182",
+ "09184",
+ "09186",
+ "09188",
+ "09190",
+ "09192",
+ "09194",
+ "09196",
+ "09198",
+ "09200",
+ "09202",
+ "09204",
+ "09206",
+ "09208",
+ "09210",
+ "09212",
+ "09214",
+ "09216",
+ "09218",
+ "09220",
+ "09222",
+ "09224",
+ "09226",
+ "09228",
+ "09230",
+ "09232",
+ "09234",
+ "09236",
+ "09238",
+ "09240",
+ "09242",
+ "09244",
+ "09246",
+ "09248",
+ "09250",
+ "09252",
+ "09254",
+ "09256",
+ "09258",
+ "09260",
+ "09262",
+ "09264",
+ "09266",
+ "09268",
+ "09270",
+ "09272",
+ "09274",
+ "09276",
+ "09278",
+ "09280",
+ "09282",
+ "09284",
+ "09286",
+ "09288",
+ "09290",
+ "09292",
+ "09294",
+ "09296",
+ "09298",
+ "09300",
+ "09302",
+ "09304",
+ "09306",
+ "09308",
+ "09310",
+ "09312",
+ "09314",
+ "09316",
+ "09318",
+ "09320",
+ "09322",
+ "09324",
+ "09326",
+ "09328",
+ "09330",
+ "09332",
+ "09334",
+ "09336",
+ "09338",
+ "09340",
+ "09342",
+ "09344",
+ "09346",
+ "09348",
+ "09350",
+ "09352",
+ "09354",
+ "09356",
+ "09358",
+ "09360",
+ "09362",
+ "09364",
+ "09366",
+ "09368",
+ "09370",
+ "09372",
+ "09374",
+ "09376",
+ "09378",
+ "09380",
+ "09382",
+ "09384",
+ "09386",
+ "09388",
+ "09390",
+ "09392",
+ "09394",
+ "09396",
+ "09398",
+ "09400",
+ "09402",
+ "09404",
+ "09406",
+ "09408",
+ "09410",
+ "09412",
+ "09414",
+ "09416",
+ "09418",
+ "09420",
+ "09422",
+ "09424",
+ "09426",
+ "09428",
+ "09430",
+ "09432",
+ "09434",
+ "09436",
+ "09438",
+ "09440",
+ "09442",
+ "09444",
+ "09446",
+ "09448",
+ "09450",
+ "09452",
+ "09454",
+ "09456",
+ "09458",
+ "09460",
+ "09462",
+ "09464",
+ "09466",
+ "09468",
+ "09470",
+ "09472",
+ "09474",
+ "09476",
+ "09478",
+ "09480",
+ "09482",
+ "09484",
+ "09486",
+ "09488",
+ "09490",
+ "09492",
+ "09494",
+ "09496",
+ "09498",
+ "09500",
+ "09502",
+ "09504",
+ "09506",
+ "09508",
+ "09510",
+ "09512",
+ "09514",
+ "09516",
+ "09518",
+ "09520",
+ "09522",
+ "09524",
+ "09526",
+ "09528",
+ "09530",
+ "09532",
+ "09534",
+ "09536",
+ "09538",
+ "09540",
+ "09542",
+ "09544",
+ "09546",
+ "09548",
+ "09550",
+ "09552",
+ "09554",
+ "09556",
+ "09558",
+ "09560",
+ "09562",
+ "09564",
+ "09566",
+ "09568",
+ "09570",
+ "09572",
+ "09574",
+ "09576",
+ "09578",
+ "09580",
+ "09582",
+ "09584",
+ "09586",
+ "09588",
+ "09590",
+ "09592",
+ "09594",
+ "09596",
+ "09598",
+ "09600",
+ "09602",
+ "09604",
+ "09606",
+ "09608",
+ "09610",
+ "09612",
+ "09614",
+ "09616",
+ "09618",
+ "09620",
+ "09622",
+ "09624",
+ "09626",
+ "09628",
+ "09630",
+ "09632",
+ "09634",
+ "09636",
+ "09638",
+ "09640",
+ "09642",
+ "09644",
+ "09646",
+ "09648",
+ "09650",
+ "09652",
+ "09654",
+ "09656",
+ "09658",
+ "09660",
+ "09662",
+ "09664",
+ "09666",
+ "09668",
+ "09670",
+ "09672",
+ "09674",
+ "09676",
+ "09678",
+ "09680",
+ "09682",
+ "09684",
+ "09686",
+ "09688",
+ "09690",
+ "09692",
+ "09694",
+ "09696",
+ "09698",
+ "09700",
+ "09702",
+ "09704",
+ "09706",
+ "09708",
+ "09710",
+ "09712",
+ "09714",
+ "09716",
+ "09718",
+ "09720",
+ "09722",
+ "09724",
+ "09726",
+ "09728",
+ "09730",
+ "09732",
+ "09734",
+ "09736",
+ "09738",
+ "09740",
+ "09742",
+ "09744",
+ "09746",
+ "09748",
+ "09750",
+ "09752",
+ "09754",
+ "09756",
+ "09758",
+ "09760",
+ "09762",
+ "09764",
+ "09766",
+ "09768",
+ "09770",
+ "09772",
+ "09774",
+ "09776",
+ "09778",
+ "09780",
+ "09782",
+ "09784",
+ "09786",
+ "09788",
+ "09790",
+ "09792",
+ "09794",
+ "09796",
+ "09798",
+ "09800",
+ "09802",
+ "09804",
+ "09806",
+ "09808",
+ "09810",
+ "09812",
+ "09814",
+ "09816",
+ "09818",
+ "09820",
+ "09822",
+ "09824",
+ "09826",
+ "09828",
+ "09830",
+ "09832",
+ "09834",
+ "09836",
+ "09838",
+ "09840",
+ "09842",
+ "09844",
+ "09846",
+ "09848",
+ "09850",
+ "09852",
+ "09854",
+ "09856",
+ "09858",
+ "09860",
+ "09862",
+ "09864",
+ "09866",
+ "09868",
+ "09870",
+ "09872",
+ "09874",
+ "09876",
+ "09878",
+ "09880",
+ "09882",
+ "09884",
+ "09886",
+ "09888",
+ "09890",
+ "09892",
+ "09894",
+ "09896",
+ "09898",
+ "09900",
+ "09902",
+ "09904",
+ "09906",
+ "09908",
+ "09910",
+ "09912",
+ "09914",
+ "09916",
+ "09918",
+ "09920",
+ "09922",
+ "09924",
+ "09926",
+ "09928",
+ "09930",
+ "09932",
+ "09934",
+ "09936",
+ "09938",
+ "09940",
+ "09942",
+ "09944",
+ "09946",
+ "09948",
+ "09950",
+ "09952",
+ "09954",
+ "09956",
+ "09958",
+ "09960",
+ "09962",
+ "09964",
+ "09966",
+ "09968",
+ "09970",
+ "09972",
+ "09974",
+ "09976",
+ "09978",
+ "09980",
+ "09982",
+ "09984",
+ "09986",
+ "09988",
+ "09990",
+ "09992",
+ "09994",
+ "09996",
+ "09998",
+ "10000",
+ "10002",
+ "10004",
+ "10006",
+ "10008",
+ "10010",
+ "10012",
+ "10014",
+ "10016",
+ "10018",
+ "10020",
+ "10022",
+ "10024",
+ "10026",
+ "10028",
+ "10030",
+ "10032",
+ "10034",
+ "10036",
+ "10038",
+ "10040",
+ "10042",
+ "10044",
+ "10046",
+ "10048",
+ "10050",
+ "10052",
+ "10054",
+ "10056",
+ "10058",
+ "10060",
+ "10062",
+ "10064",
+ "10066",
+ "10068",
+ "10070",
+ "10072",
+ "10074",
+ "10076",
+ "10078",
+ "10080",
+ "10082",
+ "10084",
+ "10086",
+ "10088",
+ "10090",
+ "10092",
+ "10094",
+ "10096",
+ "10098",
+ "10100",
+ "10102",
+ "10104",
+ "10106",
+ "10108",
+ "10110",
+ "10112",
+ "10114",
+ "10116",
+ "10118",
+ "10120",
+ "10122",
+ "10124",
+ "10126",
+ "10128",
+ "10130",
+ "10132",
+ "10134",
+ "10136",
+ "10138",
+ "10140",
+ "10142",
+ "10144",
+ "10146",
+ "10148",
+ "10150",
+ "10152",
+ "10154",
+ "10156",
+ "10158",
+ "10160",
+ "10162",
+ "10164",
+ "10166",
+ "10168",
+ "10170",
+ "10172",
+ "10174",
+ "10176",
+ "10178",
+ "10180",
+ "10182",
+ "10184",
+ "10186",
+ "10188",
+ "10190",
+ "10192",
+ "10194",
+ "10196",
+ "10198",
+ "10200",
+ "10202",
+ "10204",
+ "10206",
+ "10208",
+ "10210",
+ "10212",
+ "10214",
+ "10216",
+ "10218",
+ "10220",
+ "10222",
+ "10224",
+ "10226",
+ "10228",
+ "10230",
+ "10232",
+ "10234",
+ "10236",
+ "10238",
+ "10240",
+ "10242",
+ "10244",
+ "10246",
+ "10248",
+ "10250",
+ "10252",
+ "10254",
+ "10256",
+ "10258",
+ "10260",
+ "10262",
+ "10264",
+ "10266",
+ "10268",
+ "10270",
+ "10272",
+ "10274",
+ "10276",
+ "10278",
+ "10280",
+ "10282",
+ "10284",
+ "10286",
+ "10288",
+ "10290",
+ "10292",
+ "10294",
+ "10296",
+ "10298",
+ "10300",
+ "10302",
+ "10304",
+ "10306",
+ "10307"
+ ]
+}
\ No newline at end of file
diff --git a/misc/rearrange_OUMVLP.py b/misc/rearrange_OUMVLP.py
new file mode 100644
index 0000000..6e155ff
--- /dev/null
+++ b/misc/rearrange_OUMVLP.py
@@ -0,0 +1,59 @@
+import os
+import shutil
+from tqdm import tqdm
+import argparse
+
+
+parser = argparse.ArgumentParser(
+    description='Rearrange the raw OUMVLP dataset into an id/seq/view layout.')
+parser.add_argument('--input_path', default='/home1/data/OUMVLP_raw', type=str,
+                    help='Root path of raw dataset.')
+parser.add_argument('--output_path', default='/home1/data/OUMVLP_rearranged', type=str,
+                    help='Root path for output.')
+
+
+opt = parser.parse_args()
+
+INPUT_PATH = opt.input_path
+OUTPUT_PATH = opt.output_path
+
+
+def mv_dir(src, dst):
+    # Unused helper; note that shutil.copytree copies the tree rather than moving it.
+    shutil.copytree(src, dst)
+    print(src, dst)
+
+
+sils_name_list = os.listdir(INPUT_PATH)
+name_space = 'Silhouette_'
+views = sorted(
+    set(each.replace(name_space, '').split('-')[0] for each in sils_name_list))
+seqs = sorted(
+    set(each.replace(name_space, '').split('-')[1] for each in sils_name_list))
+ids = list()
+for each in sils_name_list:
+    ids.extend(os.listdir(os.path.join(INPUT_PATH, each)))
+
+
+progress = tqdm(total=len(set(ids)))
+
+
+for _id in sorted(set(ids)):
+    progress.update(1)
+    for _view in views:
+        for _seq in seqs:
+            seq_info = [_id, _seq, _view]
+            name = name_space + _view + '-' + _seq + '/' + _id
+            src = os.path.join(INPUT_PATH, name)
+            dst = os.path.join(OUTPUT_PATH, *seq_info)
+            if os.path.exists(src):
+                try:
+                    os.makedirs(dst, exist_ok=True)
+                    for subfile in os.listdir(src):
+                        os.symlink(os.path.join(src, subfile),
+                                   os.path.join(dst, subfile))
+                except OSError as err:
+                    print(err)
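The rearrangement the script above performs amounts to a simple path mapping: a raw folder named `Silhouette_<view>-<seq>`, which contains one subfolder per subject id, is relinked as `<id>/<seq>/<view>` under the output root. A minimal sketch of just that mapping (folder names here are illustrative, matching the OUMVLP naming the script expects):

```python
import os

def rearranged_path(raw_dir, subject_id):
    """Map a raw OUMVLP folder name plus a subject id to the rearranged layout."""
    # raw_dir looks like "Silhouette_<view>-<seq>", e.g. "Silhouette_000-01"
    view, seq = raw_dir.replace("Silhouette_", "").split("-")
    # rearranged layout: <id>/<seq>/<view>
    return os.path.join(subject_id, seq, view)

print(rearranged_path("Silhouette_000-01", "00001"))  # 00001/01/000
```

Because the script populates the destination with symlinks rather than copies, the rearranged tree costs almost no extra disk space, and the raw dataset stays untouched.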
diff --git a/test.sh b/test.sh
index cfb44de..20e7db1 100644
--- a/test.sh
+++ b/test.sh
@@ -1,5 +1,6 @@
+# # **************** For CASIA-B ****************
# # Baseline
-# CUDA_VISIBLE_DEVICES=0,1 python -m torch.distributed.launch --nproc_per_node=2 lib/main.py --cfgs ./config/baseline.yaml --phase test
+CUDA_VISIBLE_DEVICES=0,1 python -m torch.distributed.launch --nproc_per_node=2 lib/main.py --cfgs ./config/baseline.yaml --phase test
# # GaitSet
# CUDA_VISIBLE_DEVICES=0,1 python -m torch.distributed.launch --nproc_per_node=2 lib/main.py --cfgs ./config/gaitset.yaml --phase test
@@ -8,10 +9,24 @@
# CUDA_VISIBLE_DEVICES=0,1 python -m torch.distributed.launch --nproc_per_node=2 lib/main.py --cfgs ./config/gaitpart.yaml --phase test
# GaitGL
-# CUDA_VISIBLE_DEVICES=0,1,2,3 python -m torch.distributed.launch --master_port 12345 --nproc_per_node=4 lib/main.py --cfgs ./config/gaitgl.yaml --iter 80000 --phase test
+# CUDA_VISIBLE_DEVICES=0,1,2,3 python -m torch.distributed.launch --master_port 12345 --nproc_per_node=4 lib/main.py --cfgs ./config/gaitgl.yaml --phase test
# # GLN
# # Phase 1
-CUDA_VISIBLE_DEVICES=3,4 python -m torch.distributed.launch --master_port 12345 --nproc_per_node=2 lib/main.py --cfgs ./config/gln/gln_phase1.yaml --phase test
+# CUDA_VISIBLE_DEVICES=3,4 python -m torch.distributed.launch --master_port 12345 --nproc_per_node=2 lib/main.py --cfgs ./config/gln/gln_phase1.yaml --phase test
# # Phase 2
# CUDA_VISIBLE_DEVICES=2,5 python -m torch.distributed.launch --nproc_per_node=2 lib/main.py --cfgs ./config/gln/gln_phase2.yaml --phase test
+
+
+# # **************** For OUMVLP ****************
+# # Baseline
+# CUDA_VISIBLE_DEVICES=0,1,2,3,4,5,6,7 python -m torch.distributed.launch --nproc_per_node=8 lib/main.py --cfgs ./config/baseline_OUMVLP.yaml --phase test
+
+# # GaitSet
+# CUDA_VISIBLE_DEVICES=0,1,2,3,4,5,6,7 python -m torch.distributed.launch --nproc_per_node=8 lib/main.py --cfgs ./config/gaitset_OUMVLP.yaml --phase test
+
+# # GaitPart
+# CUDA_VISIBLE_DEVICES=0,1,2,3,4,5,6,7 python -m torch.distributed.launch --nproc_per_node=8 lib/main.py --cfgs ./config/gaitpart_OUMVLP.yaml --phase test
+
+# # GaitGL
+# CUDA_VISIBLE_DEVICES=0,1,2,3,4,5,6,7 python -m torch.distributed.launch --nproc_per_node=8 lib/main.py --cfgs ./config/gaitgl_OUMVLP.yaml --phase test
diff --git a/train.sh b/train.sh
index 2724955..b2b1718 100644
--- a/train.sh
+++ b/train.sh
@@ -1,5 +1,6 @@
+# # **************** For CASIA-B ****************
# # Baseline
-# CUDA_VISIBLE_DEVICES=0,1 python -m torch.distributed.launch --nproc_per_node=2 lib/main.py --cfgs ./config/baseline.yaml --phase train
+CUDA_VISIBLE_DEVICES=0,1 python -m torch.distributed.launch --nproc_per_node=2 lib/main.py --cfgs ./config/baseline.yaml --phase train
# # GaitSet
# CUDA_VISIBLE_DEVICES=0,1 python -m torch.distributed.launch --nproc_per_node=2 lib/main.py --cfgs ./config/gaitset.yaml --phase train
@@ -8,10 +9,24 @@
# CUDA_VISIBLE_DEVICES=0,1 python -m torch.distributed.launch --nproc_per_node=2 lib/main.py --cfgs ./config/gaitpart.yaml --phase train
# GaitGL
-#CUDA_VISIBLE_DEVICES=0,1,2,3 python -m torch.distributed.launch --nproc_per_node=4 lib/main.py --cfgs ./config/gaitgl.yaml --phase train
+# CUDA_VISIBLE_DEVICES=0,1,2,3 python -m torch.distributed.launch --nproc_per_node=4 lib/main.py --cfgs ./config/gaitgl.yaml --phase train
# # GLN
# # Phase 1
-CUDA_VISIBLE_DEVICES=2,5,6,7 python -m torch.distributed.launch --nproc_per_node=4 lib/main.py --cfgs ./config/gln/gln_phase1.yaml --phase train
+# CUDA_VISIBLE_DEVICES=2,5,6,7 python -m torch.distributed.launch --nproc_per_node=4 lib/main.py --cfgs ./config/gln/gln_phase1.yaml --phase train
# # Phase 2
-CUDA_VISIBLE_DEVICES=2,5,6,7 python -m torch.distributed.launch --nproc_per_node=4 lib/main.py --cfgs ./config/gln/gln_phase2.yaml --phase train
+# CUDA_VISIBLE_DEVICES=2,5,6,7 python -m torch.distributed.launch --nproc_per_node=4 lib/main.py --cfgs ./config/gln/gln_phase2.yaml --phase train
+
+
+# # **************** For OUMVLP ****************
+# # Baseline
+# CUDA_VISIBLE_DEVICES=0,1,2,3,4,5,6,7 python -m torch.distributed.launch --nproc_per_node=8 lib/main.py --cfgs ./config/baseline_OUMVLP.yaml --phase train
+
+# # GaitSet
+# CUDA_VISIBLE_DEVICES=0,1,2,3,4,5,6,7 python -m torch.distributed.launch --nproc_per_node=8 lib/main.py --cfgs ./config/gaitset_OUMVLP.yaml --phase train
+
+# # GaitPart
+# CUDA_VISIBLE_DEVICES=0,1,2,3,4,5,6,7 python -m torch.distributed.launch --nproc_per_node=8 lib/main.py --cfgs ./config/gaitpart_OUMVLP.yaml --phase train
+
+# # GaitGL
+# CUDA_VISIBLE_DEVICES=0,1,2,3,4,5,6,7 python -m torch.distributed.launch --nproc_per_node=8 lib/main.py --cfgs ./config/gaitgl_OUMVLP.yaml --phase train
\ No newline at end of file