diff --git a/README.md b/README.md
new file mode 100644
index 0000000000000000000000000000000000000000..f5abd15b2d8510a6bd670de200eac2244ac29bbd
--- /dev/null
+++ b/README.md
@@ -0,0 +1,93 @@
+# Comprehensive Vulnerability Evaluation of Face Recognition Systems to Template Inversion Attacks via 3D Face Reconstruction 
+
+This package is part of the signal-processing and machine learning toolbox [Bob](https://www.idiap.ch/software/bob).
+It contains the source code to reproduce the following paper:
+```BibTeX
+@inproceedings{iccv2023ti3d,
+  author    = {Hatef Otroshi Shahreza and S{\'e}bastien Marcel},
+  title     = {Template Inversion Attack against Face Recognition Systems using 3D Face Reconstruction},
+  booktitle = {Proceedings of the IEEE/CVF International Conference on Computer Vision (ICCV)},
+  pages     = {19662--19672},
+  month     = {October},
+  year      = {2023}
+}
+```
+
+## Installation
+The installation instructions are based on [conda](https://conda.io/) and work on **Linux systems
+only**. Therefore, please [install conda](https://conda.io/docs/install/quick.html#linux-miniconda-install) before continuing.
+
+To install, clone the repository, then create and activate a conda environment with the following commands:
+
+```sh
+$ git clone https://gitlab.idiap.ch/bob/bob.paper.iccv2023_face_ti
+$ cd bob.paper.iccv2023_face_ti
+
+# create the environment
+$ conda create --name bob.paper.iccv2023_face_ti --file package-list.txt
+# or 
+# $ conda env create -f environment.yml
+
+# activate the environment
+$ conda activate bob.paper.iccv2023_face_ti  
+
+# install paper package
+$ pip install ./ --no-build-isolation  
+```
+We use [EG3D](https://github.com/NVlabs/eg3d) as a pretrained face generator network based on generative neural radiance fields (GNeRF). Therefore, you need to clone its git repository and download a [pretrained model](https://catalog.ngc.nvidia.com/orgs/nvidia/teams/research/models/eg3d):
+```sh
+$ git clone https://github.com/NVlabs/eg3d.git
+```
+We use the `ffhq512-128.pkl` [checkpoint](https://catalog.ngc.nvidia.com/orgs/nvidia/teams/research/models/eg3d/files) in our experiments.
+
+## Downloading the datasets
+In our experiments, we use the [FFHQ](https://github.com/NVlabs/ffhq-dataset) dataset for training our face reconstruction network.
+We also use the [MOBIO](https://www.idiap.ch/dataset/mobio) and [LFW](http://vis-www.cs.umass.edu/lfw/) datasets for evaluation.
+All of these datasets are publicly available. To download them, please refer to their websites:
+- [FFHQ](https://github.com/NVlabs/ffhq-dataset)
+- [MOBIO](https://www.idiap.ch/dataset/mobio)
+- [LFW](http://vis-www.cs.umass.edu/lfw/)
+
+
+## Configuring the directories of the datasets
+Now that you have downloaded the three databases, you need to set their paths in the
+configuration files. [Bob](https://www.idiap.ch/software/bob) supports a configuration file
+(`~/.bobrc`) in your home directory to specify where the
+databases are located. Please specify the paths for the databases as below:
+```sh
+# Setup FFHQ directory
+$ bob config set  bob.db.ffhq.directory [YOUR_FFHQ_IMAGE_DIRECTORY]
+
+# Setup MOBIO directories
+$ bob config set  bob.db.mobio.directory [YOUR_MOBIO_IMAGE_DIRECTORY]
+$ bob config set  bob.db.mobio.annotation_directory [YOUR_MOBIO_ANNOTATION_DIRECTORY]
+
+# Setup LFW directories
+$ bob config set  bob.db.lfw.directory [YOUR_LFW_IMAGE_DIRECTORY]
+$ bob config set  bob.bio.face.lfw.annotation_directory [YOUR_LFW_ANNOTATION_DIRECTORY]
+```
+
+## Running the Experiments
+### Step 1: Training face reconstruction model
+You can train the face reconstruction model by running `train.py`. For example, for a blackbox attack against `ElasticFace` using `ArcFace` in the loss function, you can use the following command:
+```sh
+python train.py --path_eg3d_repo <path_eg3d_repo>  --path_eg3d_checkpoint <path_eg3d_checkpoint>       \
+                --FR_system ElasticFace   --FR_loss  ArcFace  --path_ffhq_dataset <path_ffhq_dataset>
+```
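The `--FR_loss` option selects the face recognition model (here ArcFace) whose embeddings drive the identity term of the training loss. As a rough illustration only (the actual loss in `train.py` combines several terms, and the exact formulation is not reproduced here), an identity loss of the assumed cosine-similarity form can be sketched as:

```python
import numpy as np

def identity_loss(t_target, t_reconstructed):
    """1 - cosine similarity between the target template and the template
    extracted (e.g. by the ArcFace loss model) from the rendered reconstruction.
    Illustrative sketch only, not the paper's full loss."""
    t_target = np.asarray(t_target, dtype=float)
    t_reconstructed = np.asarray(t_reconstructed, dtype=float)
    cos = np.dot(t_target, t_reconstructed) / (
        np.linalg.norm(t_target) * np.linalg.norm(t_reconstructed)
    )
    return 1.0 - cos
```

A perfect reconstruction (templates collinear) yields a loss of 0, while orthogonal templates yield a loss of 1.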
+### Step 2: Evaluation
+After the model is trained, you can use it to run the evaluation.
+For evaluation, you can use the `evaluation_pipeline.py` script on an evaluation dataset (MOBIO/LFW). For example, to evaluate a face reconstruction model trained against ElasticFace when attacking the same system on the MOBIO dataset, you can use the following command:
+```sh
+python evaluation_pipeline.py --path_eg3d_repo <path_eg3d_repo>  --path_eg3d_checkpoint <path_eg3d_checkpoint>    \
+                --FR_system ElasticFace  --FR_target  ElasticFace --attack GaFaR  --checkpoint <path_checkpoint>  \
+                --dataset MOBIO
+```
+After running the evaluation pipeline, you can use `eval_SAR_TMR.py` to calculate the vulnerability in terms of Success Attack Rate (SAR).
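As a sketch of what such a vulnerability metric computes (assuming the decision threshold is set from impostor scores at a fixed false-match rate; the exact protocol of `eval_SAR_TMR.py` may differ):

```python
import numpy as np

def success_attack_rate(attack_scores, impostor_scores, fmr=1e-3):
    """Fraction of attack (reconstructed-face vs. enrolled-template) comparison
    scores accepted at the threshold where the impostor false-match rate is `fmr`.
    Illustrative sketch of an SAR-style metric, not the repository's script."""
    # Threshold chosen so that roughly a fraction `fmr` of impostor scores pass
    threshold = np.quantile(np.asarray(impostor_scores, dtype=float), 1.0 - fmr)
    return float(np.mean(np.asarray(attack_scores, dtype=float) >= threshold))
```

An SAR of 1.0 means every reconstructed face was accepted as the target identity at the chosen operating point.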
+
+
+## Project Page
+You can find general information about this work, including a general block diagram of the proposed method and experiments, on the [project page](https://www.idiap.ch/paper/gafar/).
+
+## Contact
+For questions or reporting issues with this software package, please contact the first author (hatef.otroshi@idiap.ch) or our development [mailing list](https://www.idiap.ch/software/bob/discuss).
\ No newline at end of file