update readme

This commit is contained in:
Iridoudou
2021-08-09 12:45:23 +08:00
parent 44d1610296
commit 60d5f303f6

pose2smpl
=========
The [SMPL](http://smpl.is.tue.mpg.de) human body [\[1\]](#references) layer for [PyTorch](https://pytorch.org/) (tested with v0.4 and v1.x) is a differentiable module that deterministically maps pose and shape parameters to human body joints and vertices.
It can therefore be integrated into any architecture as a differentiable layer to predict body meshes.
The code is adapted from the [manopth](https://github.com/hassony2/manopth) repository by [Yana Hasson](https://github.com/hassony2).
### Fitting SMPL Parameters from 3D-Pose Key-Points
This repository provides a tool to fit **SMPL parameters** to **3D-pose** datasets that contain human body key-points.
The SMPL human body layer for PyTorch comes from the [smplpytorch](https://github.com/gulvarol/smplpytorch) repository.
<p align="center">
<img src="assets/image.png" alt="smpl" width="300"/>
<img src="assets/fit.gif" width="350"/>
<img src="assets/gt.gif" width="350"/>
</p>
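To make the parameterization concrete, here is a small sketch (my own illustration in plain NumPy, independent of the SMPL model files) of the shapes involved: the 72 pose parameters are 24 joints × 3 axis-angle components, and the 10 shape parameters are PCA shape coefficients:

```python
import numpy as np

# SMPL parameter layout (values here are random placeholders):
# - pose: 72 = 24 joints x 3 axis-angle components
#   (the first 3 values are the global root orientation)
# - shape: 10 PCA coefficients ("betas") controlling body shape
rng = np.random.default_rng(0)

pose_params = 0.1 * rng.normal(size=72)   # small angles, in radians
shape_params = rng.normal(size=10)

axis_angles = pose_params.reshape(24, 3)  # one 3-vector per joint
print(axis_angles.shape)                  # (24, 3)
```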
## Setup
### 1. The `smplpytorch` package
* **Run without installing:** You will need to install the dependencies listed in [environment.yml](environment.yml):
* `conda env update -f environment.yml` in an existing environment, or
* `conda env create -f environment.yml`, for a new `smplpytorch` environment
* **Install:** To import `SMPL_Layer` in another project with `from smplpytorch.pytorch.smpl_layer import SMPL_Layer` do one of the following.
* Option 1: This should automatically install the dependencies.
``` bash
git clone https://github.com/gulvarol/smplpytorch.git
```
### 2. Download SMPL pickle files
* Download the models from the [SMPL website](http://smpl.is.tue.mpg.de/) by choosing "SMPL for Python users". Note that you need to comply with the [SMPL model license](http://smpl.is.tue.mpg.de/license_model).
* Extract and copy the `models` folder into the `smplpytorch/native/` folder (or set the `model_root` parameter accordingly).
### 3. Download Dataset
- Download the datasets you want to fit. Currently supported datasets:
  - [HumanAct12](https://ericguo5513.github.io/action-to-motion/)
  - [UTD-MHAD](https://personal.utdallas.edu/~kehtar/UTD-MHAD.html)
- Set **DATASET.PATH** in the corresponding configuration file to the location of the dataset.

## Demo
Forward pass randomly created pose and shape parameters through the SMPL layer and display the human body mesh and joints:

`python demo.py`

## Acknowledgements
The code **largely** builds on the [manopth](https://github.com/hassony2/manopth) repository from [Yana Hasson](https://github.com/hassony2), which implements the [MANO](http://mano.is.tue.mpg.de) hand model [\[2\]](#references) layer.

The code is a PyTorch port of the original [SMPL](http://smpl.is.tue.mpg.de) model from [chumpy](https://github.com/mattloper/chumpy). It builds on the work of [Loper](https://github.com/mattloper) et al. [\[1\]](#references).
The code also [reuses](https://github.com/gulvarol/smpl/pytorch/rodrigues_layer.py) [part of the code](https://github.com/MandyMo/pytorch_HMR/blob/master/src/util.py) by [Zhang Xiong](https://github.com/MandyMo) to compute the rotation utilities.

If you find this code useful for your research, please cite the original [SMPL](http://smpl.is.tue.mpg.de) publication:

```
@article{SMPL:2015,
  author = {Loper, Matthew and Mahmood, Naureen and Romero, Javier and Pons-Moll, Gerard and Black, Michael J.},
  title = {{SMPL}: A Skinned Multi-Person Linear Model},
  journal = {ACM Trans. Graphics (Proc. SIGGRAPH Asia)},
  number = {6},
  pages = {248:1--248:16},
  volume = {34},
  year = {2015}
}
```

## Fitting
### 1. Executing Code
Start the fitting procedure with the following command; the configuration file in *fit/configs* corresponding to the dataset name will be loaded:

```
python fit/tools/main.py --dataset_name [DATASET NAME] --dataset_path [DATASET PATH]
```
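The underlying idea can be sketched as follows. This toy example is my own illustration, not the repository's code: it treats the forward model as a generic differentiable map from 72 pose parameters to 24 3D key-points and fits the parameters by gradient descent on the squared key-point error, with a random linear map standing in for the real SMPL layer so the sketch runs without model files:

```python
import numpy as np

# Toy sketch of the fitting idea (NOT the repository's actual code):
# minimize the squared distance between predicted and target 3D
# key-points over the 72 pose parameters, by gradient descent.
rng = np.random.default_rng(0)
M = rng.normal(size=(72, 72)) / 72.0      # stand-in "joint regressor"
base = rng.normal(size=72)                # stand-in rest key-points, flattened

def keypoints(pose):
    """Map 72 pose parameters to 24 x 3 key-points (toy linear model)."""
    return (base + M @ pose).reshape(24, 3)

target = keypoints(rng.normal(size=72))   # key-points "from the dataset"

pose = np.zeros(72)                       # initialization
lr = 0.5
init_err = np.linalg.norm(keypoints(pose) - target)
for _ in range(2000):
    residual = (keypoints(pose) - target).reshape(-1)
    pose -= lr * (2.0 * M.T @ residual)   # analytic gradient of the squared error
final_err = np.linalg.norm(keypoints(pose) - target)
```

The real fitting replaces the toy linear map with the differentiable SMPL layer and lets PyTorch's autograd supply the gradient, but the optimization loop has the same shape.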
### 2. Output
- **Directory**: The output SMPL parameters will be stored in *fit/output*.
- **Format:** The outputs are *.pkl* files with the following data format:

```
{
    "label": [the label of the action],
    "pose_params": pose parameters of SMPL (shape = [frame_num, 72]),
    "shape_params": shape parameters of SMPL (shape = [frame_num, 10]),
    "Jtr": key-point coordinates of the SMPL model (shape = [frame_num, 24, 3])
}
```

## References
\[1\] Matthew Loper, Naureen Mahmood, Javier Romero, Gerard Pons-Moll, and Michael J. Black, "SMPL: A Skinned Multi-Person Linear Model," SIGGRAPH Asia, 2015.

\[2\] Javier Romero, Dimitrios Tzionas, and Michael J. Black, "Embodied Hands: Modeling and Capturing Hands and Bodies Together," SIGGRAPH Asia, 2017.
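As a usage sketch (my own illustration, assuming the output data format documented above; the file name and values are hypothetical), such *.pkl* records can be written and read back with Python's `pickle`:

```python
import pickle
import tempfile
from pathlib import Path

import numpy as np

# Build a record in the documented output format (placeholder data).
frame_num = 5
record = {
    "label": "walk",                           # action label (hypothetical)
    "pose_params": np.zeros((frame_num, 72)),  # SMPL pose per frame
    "shape_params": np.zeros((frame_num, 10)), # SMPL shape per frame
    "Jtr": np.zeros((frame_num, 24, 3)),       # 24 key-points per frame
}

# Round-trip through a .pkl file, as a consumer of fit/output might do.
out = Path(tempfile.mkdtemp()) / "example.pkl"
with open(out, "wb") as f:
    pickle.dump(record, f)
with open(out, "rb") as f:
    loaded = pickle.load(f)

print(loaded["Jtr"].shape)  # (5, 24, 3)
```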