TensorFlow Lite
TensorFlow is a free and open-source AI/ML framework from Google used for training and inference of neural networks. TensorFlow Lite is a smaller subset of TensorFlow aimed at deployment on mobile and edge devices.
Requirements
Clone cpuinfo from https://github.com/pytorch/cpuinfo
Clone pthreadpool from https://github.com/Maratyszcza/pthreadpool
Clone XNNPACK from https://github.com/everton1984/XNNPACK and use branch woa_enablement
Clone tensorflow from https://github.com/everton1984/tensorflow and use branch woa_enablement2 (example clone commands follow this list)
Download LLVM for Windows on Arm (WoA), version 18.1.0 or later
Download Bazel for WoA, version 6.5.0
Take a look at XNNPACK's and tensorflow's workspace files and update the paths for cpuinfo, pthreadpool and XNNPACK to point to your respective local clones
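As an illustration, the clones above could be obtained with commands along these lines (the repository URLs are inferred from the GitHub projects named in the list; clone them wherever is convenient on your system):

git clone https://github.com/pytorch/cpuinfo.git
git clone https://github.com/Maratyszcza/pthreadpool.git
git clone --branch woa_enablement https://github.com/everton1984/XNNPACK.git
git clone --branch woa_enablement2 https://github.com/everton1984/tensorflow.git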
Steps
Make sure to set the environment variables BAZEL_LLVM and BAZEL_VC to the respective directories on your system, and TF_PYTHON_VERSION to the Python version you are using.
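For example, in a Windows command prompt the variables might be set along these lines (the paths and Python version below are placeholders assuming a typical LLVM and Visual Studio installation; adjust them to your setup):

set BAZEL_LLVM=C:\Program Files\LLVM
set BAZEL_VC=C:\Program Files\Microsoft Visual Studio\2022\Community\VC
set TF_PYTHON_VERSION=3.11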
Go to the XNNPACK folder and open the WORKSPACE file. Fix the directories of the local repositories so they match your own clones of pthreadpool and cpuinfo. Do the same for TensorFlow under tensorflow\workspace2.bzl.
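As a rough sketch, assuming the dependency names in XNNPACK's WORKSPACE are cpuinfo and pthreadpool, the entries could be pointed at your local clones with local_repository rules like these (the paths are placeholders):

local_repository(
    name = "cpuinfo",
    path = "C:/src/cpuinfo",
)

local_repository(
    name = "pthreadpool",
    path = "C:/src/pthreadpool",
)

The exact rule names and locations in tensorflow\workspace2.bzl differ, but the idea is the same: replace the remote addresses for XNNPACK, cpuinfo and pthreadpool with the paths to your local clones.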
Now run:
bazel.exe build //tensorflow/lite:tensorflowlite --cpu=arm64_windows --compiler=clang-cl --copt="/clang:-march=armv8-a+dotprod+fp16+i8mm" --cxxopt="/clang:-march=armv8-a+dotprod+fp16+i8mm"
You will find tensorflow.dll inside the bazel-bin\tensorflow directory.