TMP ssd-mobilenet

Running the ssd-mobilenet test on TensorFlow

Ensure the virtual environment is prepared as described in TMP MLPerf Inference.
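Activating the environment will look something like the following (a minimal sketch; ~/mlperf-venv is a placeholder for wherever the environment was created when following that page):

source ~/mlperf-venv/bin/activate   # placeholder path; substitute the actual location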


Activate the virtual environment first, and then run:

cd ~
mkdir models
cd models
ck install package --tags=object-detection,dataset,coco,val
wget http://download.tensorflow.org/models/mobilenet_v1_2018_08_02/mobilenet_v1_1.0_224.tgz
tar -zxf mobilenet_v1_1.0_224.tgz
wget http://download.tensorflow.org/models/object_detection/ssd_mobilenet_v1_coco_2018_01_28.tar.gz
tar -zxf ssd_mobilenet_v1_coco_2018_01_28.tar.gz
cp ssd_mobilenet_v1_coco_2018_01_28/frozen_inference_graph.pb ssd_mobilenet_v1_coco_2018_01_28.pb
cd ~/src/inference/vision/classification_and_detection
export DATA_DIR=~/CK-TOOLS/dataset-coco-2017-val
export MODEL_DIR=${HOME}/models
./run_local.sh tf ssd-mobilenet
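Before the final run_local.sh step, it can help to confirm that the frozen graph and the dataset directory are where the exported variables point (a quick sanity check, assuming the paths set above):

ls "$MODEL_DIR"/ssd_mobilenet_v1_coco_2018_01_28.pb
ls "$DATA_DIR" | head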

If the cherry-pick described in the MLPerf Inference setup has not been applied, this fails with:

tensorflow.python.framework.errors_impl.InvalidArgumentError: Input 0 of node ToFloat was passed float from image_tensor:0 incompatible with expected uint8.

See this known issue
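The message indicates a dtype mismatch at the graph's input. If it helps with diagnosis, the dtype the frozen graph actually expects at its image_tensor node can be inspected like this (a minimal sketch, run inside the virtual environment; it assumes the model path used above and a TensorFlow version that provides tf.compat.v1):

python - <<'EOF'
import os
import tensorflow as tf

# Load the frozen graph produced by the copy step above (placeholder path; adjust if needed)
path = os.path.expanduser('~/models/ssd_mobilenet_v1_coco_2018_01_28.pb')
graph_def = tf.compat.v1.GraphDef()
with open(path, 'rb') as f:
    graph_def.ParseFromString(f.read())

# Print the declared dtype of the input placeholder
for node in graph_def.node:
    if node.name == 'image_tensor':
        print(node.name, tf.dtypes.as_dtype(node.attr['dtype'].type))
EOF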

Alternatively, if you are on a machine in Linaro's Cambridge colo, you can do the following.

Mount the NFS drive with the already downloaded data, as described in TMP Cambridge Colo data server.
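A rough sketch of the mount (the server address and export path here are placeholders; the real details are on the TMP Cambridge Colo data server page):

sudo mkdir -p /mnt/datasets
sudo mount -t nfs <data-server>:/datasets /mnt/datasets   # placeholder server and export path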


export DATA_DIR=/mnt/datasets/data/CK-TOOLS/dataset-coco-2017-val
export MODEL_DIR=/mnt/datasets/data/models
cd ~/src/inference/vision/classification_and_detection
./run_local.sh tf ssd-mobilenet