Introduction
TensorFlow* is a leading deep learning and machine learning framework, and as of May 2017 it integrates optimizations for Intel® Xeon® processors and Intel® Xeon Phi™ processors. This is the first in a series of tutorials providing information for developers who want to build, install, and explore TensorFlow optimized on Intel architecture from sources available in the GitHub* repository.
Resources
The TensorFlow website is a key resource for learning about the framework, providing informative overviews, tutorials, and technical information on its various components. This is the first stop for developers interested in understanding the full extent of what TensorFlow has to offer in the area of deep learning.
The article TensorFlow Optimizations on Modern Intel® Architecture introduces the specific graph optimizations, performance experiments, and details for building and installing TensorFlow with CPU optimizations. This article is highly recommended for developers who want to understand how to fully optimize TensorFlow for different topologies, and the performance improvements that can be achieved by doing so.
Installation Overview
The installation steps presented in this document are distilled from information provided in the Installing TensorFlow from Sources guide on the TensorFlow website. The steps outlined below are provided to give a quick overview of the installation process; however, since third-party information is subject to change over time, it is recommended that you also review the information provided on the TensorFlow website.
The installation guidelines presented in this document focus on installing TensorFlow with CPU support only. The target operating system and Python* distribution are Ubuntu* 16.04 and Python 2.7, respectively.
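To confirm that your system matches these targets before you begin, you can check the operating system release and Python version from a terminal:

lsb_release -a      # should report Ubuntu 16.04
python --version    # should report Python 2.7.x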
Installing the Bazel* Build Tool
Bazel* is the publicly available build tool from Google*. If Bazel is already installed on your system you can skip this section. Otherwise, enter the following commands to add the Bazel distribution URI, perform the installation, and update Bazel on your system:
echo "deb [arch=amd64] http://storage.googleapis.com/bazel-apt stable jdk1.8" | sudo tee /etc/apt/sources.list.d/bazel.list
sudo apt install curl
curl https://bazel.build/bazel-release.pub.gpg | sudo apt-key add -
sudo apt-get update && sudo apt-get install bazel
sudo apt-get upgrade bazel
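To confirm that Bazel was installed successfully, you can print its version before moving on:

bazel version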
Installing Python* Dependencies
If the Python dependencies are already installed on your system you can skip this section. To install the required packages for Python 2.7, enter the following command:
sudo apt-get install python-numpy python-dev python-pip python-wheel
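If you want to verify that the packages are available, you can check the pip version and import NumPy from the command line:

pip --version
python -c "import numpy; print(numpy.__version__)"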
Building a TensorFlow* Pip Package for Installation
If the program Git* is not currently installed on your system, issue the following command:
sudo apt install git
Clone the GitHub repository by issuing the following command:
git clone https://github.com/tensorflow/tensorflow
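Cloning checks out the latest development code. If you would rather build the specific release used in this tutorial, you can check out its release tag instead; the v1.2.0-rc1 tag below is an assumption based on the wheel name shown later, so substitute whichever release you want to build:

git -C tensorflow checkout v1.2.0-rc1    # optional: build a specific release instead of the latest code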
The tensorflow directory created during cloning contains a script named configure that must be executed prior to creating the pip package and installing TensorFlow. This script lets you specify the Python path, dependencies, and other build configuration options. For TensorFlow optimized on Intel architecture, it also lets you set up Intel® Math Kernel Library (Intel® MKL) related environment settings. Execute the following commands:
cd tensorflow
./configure
Important: Select ‘Y’ to build TensorFlow with Intel MKL support, and ‘Y’ to download MKL LIB from the web. Select the default settings for the other configuration parameters. When the script has finished running, issue the following commands to build TensorFlow and create the pip package:
bazel build --config=mkl --copt="-DEIGEN_USE_VML" -c opt //tensorflow/tools/pip_package:build_pip_package
bazel-bin/tensorflow/tools/pip_package/build_pip_package /tmp/tensorflow_pkg
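The second command writes the wheel file to /tmp/tensorflow_pkg. You can list that directory to confirm the package was created and to note its exact filename for the installation steps that follow:

ls /tmp/tensorflow_pkg/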
Installing TensorFlow—Native Pip Option
At this point in the process the newly created pip package is located in /tmp/tensorflow_pkg. The next step is to install TensorFlow, which can be done either as a native pip installation, or in an Anaconda* virtual environment as described in the next section. For a native pip installation, simply enter the following command:
sudo pip install /tmp/tensorflow_pkg/tensorflow-1.2.0rc1-cp27-cp27mu-linux_x86_64.whl
(Note: The name of the wheel file may differ for your particular build.)
Once these steps have been completed, be sure to validate the installation before proceeding to the next section. Note: When running the Python validation script from the TensorFlow website, be sure to change out of the tensorflow source directory first, for example:
cd ..
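As an additional quick check from outside the source tree, you can import TensorFlow and print its version (a minimal check; the exact version string depends on your build):

python -c "import tensorflow as tf; print(tf.__version__)"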
Installing TensorFlow—Conda* Environment Option
Note: If you already have Anaconda installed on your system you can skip this step.
Download Anaconda from the Anaconda download page and follow the directions to run the installer script. (For this tutorial, we used the 64-bit, x86, Python 2.7 version of Anaconda.) During the installation you need to agree to the license terms, accept the defaults, and answer 'yes' when asked to add Anaconda to your path. Once the installation is complete, close the terminal and open a new one.
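The installer is a shell script that you run with bash. The filename below is illustrative only; it will vary with the Anaconda release you download:

bash Anaconda2-<version>-Linux-x86_64.sh    # replace <version> with the release you downloaded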
Next, we will create a conda environment and install TensorFlow from the newly created pip package located in /tmp/tensorflow_pkg. Issue the following commands to create a TensorFlow environment called "inteltf" and install the package into it:
conda create -n inteltf
source activate inteltf
pip install /tmp/tensorflow_pkg/tensorflow-1.2.0rc1-cp27-cp27mu-linux_x86_64.whl
(Note: The name of the wheel file may differ for your particular build.)
source deactivate inteltf
Close the terminal and open a new one before proceeding.
Restart the inteltf environment and validate the TensorFlow installation by running the following Python code from the website:
source activate inteltf
python
>>> import tensorflow as tf
>>> hello = tf.constant('Hello, TensorFlow!')
>>> sess = tf.Session()
>>> print(sess.run(hello))
The Python program should output “Hello, TensorFlow!” if the installation was successful.
Coming Up
The next article in the series describes how to install TensorFlow Serving*, a high-performance serving system for machine learning models designed for production environments.