This fork should allow us to run inference using the Fashionpedia project. Because this code base has an unusual layout, we need to be a little careful about how we use it.
Let's use Anaconda to prepare an appropriate Python environment:
conda create -n tpu python=3.10
conda activate tpu
Then install the required dependencies:
pip install tensorflow-gpu==2.11.0 Pillow==9.5.0 pyyaml opencv-python-headless tqdm pycocotools
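Optionally, confirm that the pinned GPU build of TensorFlow came up correctly before going further. The quick check below only assumes a working CUDA setup:

# Optional sanity check: print TensorFlow's version and GPU visibility.
import tensorflow as tf

print(tf.__version__)                          # expected: 2.11.0
print(tf.config.list_physical_devices("GPU"))  # non-empty list if a GPU is visible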
Next, navigate to the detection directory:
cd path/to/tpu/models/official/detection
In that directory, we have placed inference_fashion.py. That file adjusts PYTHONPATH at runtime so that all the imports resolve.
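If you need to replicate that trick in a script of your own, the adjustment amounts to something like the sketch below; the exact lines in inference_fashion.py may differ.

import os
import sys

# Sketch only: prepend the repository's models/ root (and this directory) to
# sys.path so that "official.*" and "projects.*" imports resolve when running
# from models/official/detection. The real inference_fashion.py may differ.
_HERE = os.path.dirname(os.path.abspath(__file__))
_MODELS_ROOT = os.path.abspath(os.path.join(_HERE, "..", ".."))  # .../models
for _path in (_MODELS_ROOT, _HERE):
    if _path not in sys.path:
        sys.path.insert(0, _path)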
Download model weights:
curl https://storage.googleapis.com/cloud-tpu-checkpoints/detection/projects/fashionpedia/fashionpedia-spinenet-143.tar.gz --output fashionpedia-spinenet-143.tar.gz
tar -xf fashionpedia-spinenet-143.tar.gz
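To make sure the archive extracted cleanly, you can list a few variables from the checkpoint. This is optional and only assumes the fashionpedia-spinenet-143/model.ckpt prefix used in the command further below:

# Optional: confirm the checkpoint is readable by printing a few variable names.
import tensorflow as tf

for name, shape in tf.train.list_variables("fashionpedia-spinenet-143/model.ckpt")[:5]:
    print(name, shape)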
Finally, start the service:
python serve.py
Alternatively, perform inference “manually”:
python inference_fashion.py \
  --model="attribute_mask_rcnn" \
  --image_size="640" \
  --checkpoint_path="fashionpedia-spinenet-143/model.ckpt" \
  --label_map_file="projects/fashionpedia/dataset/fashionpedia_label_map.csv" \
  --image_file_pattern="path/to/sized-images.tar" \
  --output_html="out.html" \
  --max_boxes_to_draw=8 \
  --min_score_threshold=0.05 \
  --config_file="projects/fashionpedia/configs/yaml/spinenet143_amrcnn.yaml" \
  --output_file="output.npy"
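The --image_file_pattern flag points at a tar archive of images. Below is a minimal, hypothetical helper for building such an archive, assuming the images should be resized to roughly the 640-pixel input size beforehand (hence the "sized-images" name); check inference_fashion.py for its exact expectations before relying on this.

# Hypothetical helper for building sized-images.tar: resizes every image in a
# folder so its longer side is 640 px and packs the results into a tar archive.
# Adjust or drop the resizing if inference_fashion.py already handles it.
import io
import os
import tarfile

from PIL import Image

SRC_DIR = "path/to/raw-images"        # hypothetical input folder
OUT_TAR = "path/to/sized-images.tar"  # matches --image_file_pattern above
TARGET = 640                          # matches --image_size above

with tarfile.open(OUT_TAR, "w") as tar:
    for name in sorted(os.listdir(SRC_DIR)):
        if not name.lower().endswith((".jpg", ".jpeg", ".png")):
            continue
        img = Image.open(os.path.join(SRC_DIR, name)).convert("RGB")
        scale = TARGET / max(img.size)
        img = img.resize((round(img.width * scale), round(img.height * scale)))
        buf = io.BytesIO()
        img.save(buf, format="JPEG")
        info = tarfile.TarInfo(name=os.path.splitext(name)[0] + ".jpg")
        info.size = buf.getbuffer().nbytes
        buf.seek(0)
        tar.addfile(info, buf)

The detections end up in output.npy; numpy.load("output.npy", allow_pickle=True) should recover them, though the exact structure depends on what inference_fashion.py writes.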
This repository is a collection of reference models and tools used with Cloud TPUs.
The fastest way to get started training a model on a Cloud TPU is by following the tutorial, which can be launched in Google Cloud Shell.
Note: This repository is a public mirror; pull requests will not be accepted. Please file an issue if you have a feature request or bug report.
To run models in the models subdirectory, you may need to add the top-level /models folder to the Python path with the command:
export PYTHONPATH="$PYTHONPATH:/path/to/models"