Installing the Intel® Computer Vision SDK 2018 on Linux, without FPGA

NOTE: These steps apply to Ubuntu*, CentOS*, and Yocto*. 

The Intel® Computer Vision SDK (Intel® CV SDK) is a comprehensive toolkit for quickly deploying applications and solutions that emulate human vision. Based on Convolutional Neural Networks (CNN), the SDK extends CV workloads across Intel® hardware, maximizing performance. The Intel® CV SDK includes the Deep Learning Deployment Toolkit.

The version of the Intel® CV SDK that you downloaded:

  • Enables CNN-based deep learning inference on the edge
  • Supports heterogeneous execution across Intel CV accelerators: CPU, Intel® Integrated Graphics, Intel® Movidius™ Neural Compute Stick
  • Speeds time-to-market via an easy-to-use library of CV functions and pre-optimized kernels
  • Includes optimized calls for CV standards including OpenCV*, OpenCL™, and OpenVX*

The installation package is free and comes as an archive that contains the software and installation scripts.

These instructions describe:

  • What is included in the free download
  • System requirements
  • Software dependencies
  • Installing the Intel® CV SDK on Linux* OS
  • Next steps

What's Included

  • Deep Learning Model Optimizer: Model import tool. Imports trained models and converts them to the Intermediate Representation (IR) format for use by the Deep Learning Inference Engine. Part of the Intel® Deep Learning Deployment Toolkit.
  • Deep Learning Inference Engine: Unified API to integrate the inference with application logic. Part of the Intel® Deep Learning Deployment Toolkit.
  • Drivers and runtimes for OpenCL™ version 2.1: Enables OpenCL 1.2 on the GPU/CPU for Intel® processors.
  • Intel® Media SDK: Offers access to hardware-accelerated video codecs and frame processing.
  • OpenCV* version 3.4.1: OpenCV* community version compiled for Intel® hardware. Includes PVL libraries for computer vision.
  • OpenVX* version 1.1: Intel's implementation of OpenVX* 1.1, optimized for running on Intel® hardware (CPU, GPU, IPU).
  • Documents and tutorials: https://software.intel.com/en-us/computer-vision-sdk/documentation/featured

Where to Download This Release

https://software.intel.com/en-us/computer-vision-sdk/choose-download/free-download-linux

System Requirements

This guide covers the Linux* version of the Intel® Computer Vision SDK that does not include FPGA support. For the Intel Computer Vision SDK with FPGA support, see https://software.intel.com/en-us/articles/CVSDK-Install-FPGA.

Development and Target Platform

The development and target platforms have the same requirements, but you can select different components during the installation, based on your intended use.

Processors

  • 6th-8th Generation Intel® Core™
  • Intel® Xeon® v5 family, Xeon® v6 family
  • Intel® Pentium® processor N4200/5, N3350/5, N3450/5 with Intel® HD Graphics
  • Intel® Movidius™ Neural Compute Stick

Processor Notes:

  • Processor graphics are not included in all processors. See https://ark.intel.com/ for information about your processor.
  • A chipset that supports processor graphics is required for Intel® Xeon® processors.

Operating Systems:

  • Ubuntu* 16.04.3 long-term support (LTS), 64-bit
  • CentOS* 7.4, 64 bit
  • Yocto Project* Poky Jethro* v2.0.3, 64-bit (intended for target only)
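
If you are unsure whether your system meets these requirements, you can quickly check the processor model and OS release with standard Linux utilities (shown here as a convenience; these commands are not part of the Intel® CV SDK package):

# Identify the processor; look it up at https://ark.intel.com/ to confirm processor graphics support
lscpu | grep "Model name"
# Confirm the operating system release
cat /etc/os-release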

Pre-Installation

Use these steps to prepare your development machine for the Intel® CV SDK software.

  1. Download the Intel® CV SDK. By default, the file is saved as l_intel_cv_sdk_p_2018.0.<version>.tgz.
  2. Unpack the .tgz file:
    tar -xf l_intel_cv_sdk_p_2018.0.<version>.tgz
    The files are unpacked to a directory named l_intel_cv_sdk_p_2018.0.<version>.
  3. Go to the l_intel_cv_sdk_p_2018.0.<version> directory:
    cd l_intel_cv_sdk_p_2018.0.<version>

External Software Dependencies

These dependencies are the packages required for Intel-optimized OpenCV 3.4, the Deep Learning Inference Engine, and the Deep Learning Model Optimizer tools. Before installing the Intel® CV SDK, install these dependencies by running the script from the installation package directory l_intel_cv_sdk_p_2018.0.<version>:

./install_cv_sdk_dependencies.sh

Installation Steps

  1. Go to the l_intel_cv_sdk_p_2018.0.<version> directory and start the GUI-based installation wizard:
    ./install_GUI.sh

    Or the CLI installer:

    ./install.sh
  2. The GUI installation wizard or the command-line installation prompts appear. The steps below are the same for both; the only difference is that the command-line installer is text-based.
  3. The Prerequisites screen tells you if you are missing any required or recommended components, and the effect the missing component has on installing or using the product.
  4. Click Next to begin the installation, or make final changes to your component selections and choose your installation directory.
  5. The Installation summary screen shows you the options that will be installed if you make no changes.
  6. Click Customize if you want to change the selected components or specify the installation directory; otherwise, click Install to proceed with the standard installation.
  7. The Complete screen indicates the software is installed. Click Finish to close the wizard and open the Getting Started page in your browser.
  8. Go to the install directory. For example, for the default installation in sudo mode:
    cd /opt/intel/computer_vision_sdk_2018.0.<version>/

Set Environment Variables

Updates to several environment variables are required to compile and run Intel® CV SDK applications. You can set these variables permanently, following your system's conventions, or temporarily, lasting only as long as the current shell session, by sourcing the script below. For a standard Intel CV SDK installation:

source /opt/intel/computer_vision_sdk_2018.0.<version>/bin/setupvars.sh
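
If you prefer to make these settings permanent for your user account, one option (a minimal sketch for bash on Ubuntu; adjust <version> to match your installation) is to append the same command to your ~/.bashrc:

# Source the Intel CV SDK environment script in every new shell
echo "source /opt/intel/computer_vision_sdk_2018.0.<version>/bin/setupvars.sh" >> ~/.bashrc
source ~/.bashrc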

Post-Installation

Set External Software Dependencies (Processor Graphics)

Installation automatically creates the install_dependencies directory under /opt/intel/computer_vision_sdk with additional scripts to enable components to utilize processor graphics on your system:

  • The install_4_14_kernel.sh script contains steps to move your kernel forward to 4.14. This is the minimum supported kernel and the validated configuration, but you can choose newer kernels as well. Note that these are mainline kernels, not the kernels officially validated with your OS. To check whether you need to run this script, check your kernel version with uname -r.
  • The install_NEO_OCL_driver.sh script installs the OpenCL™ NEO driver components needed to use the clDNN GPU plugin and write custom layers for GPU.  For full functionality from this driver you must be running a 4.14 or newer kernel.
  • The ./MediaStack/install_media.sh script installs the Intel® Media SDK. This SDK offers access to hardware accelerated video codecs and frame processing.  This version of Intel Media SDK also requires a 4.14 or newer kernel.  For more information, see https://github.com/Intel-Media-SDK/MediaSDK/releases.  

NOTE: After running the scripts, a reboot is required.

Run these scripts and then reboot:

cd /opt/intel/computer_vision_sdk/install_dependencies/
sudo -E su
./install_4_14_kernel.sh
./install_NEO_OCL_driver.sh
MediaStack/install_media.sh
reboot
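
After the system comes back up, you can confirm that the new kernel is active; it should report 4.14 or newer:

uname -r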

The OpenCL components installed here can be made more useful by installing header files that allow compiling new code. These can be obtained from https://github.com/KhronosGroup/OpenCL-Headers.git.
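
As an illustration only, you could clone the headers and add them to your compiler's include path; the clone location used here is an arbitrary choice, not a directory created by the installer:

git clone https://github.com/KhronosGroup/OpenCL-Headers.git ~/OpenCL-Headers
# When compiling new OpenCL code, add the appropriate header directory from the clone
# to your compiler's include path, for example with g++ -I <path-to-headers>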

To make libOpenCL easier to find, you may also want to add symbolic links:

sudo ln -s /usr/lib/x86_64-linux-gnu/libOpenCL.so.1 /usr/lib/x86_64-linux-gnu/libOpenCL.so
sudo ln -s /usr/lib/x86_64-linux-gnu/libOpenCL.so.1 /opt/intel/opencl/libOpenCL.so

To run clDNN, libOpenCL.so.1 must be in the library search path (LD_LIBRARY_PATH). You can set this in any of the standard ways of setting environment variables, or update the /opt/intel/computer_vision_sdk_2018.0.<version>/bin/setupvars.sh script to include the directory where libOpenCL.so.1 is found.
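
For example, a temporary, per-shell way to do this (assuming libOpenCL.so.1 lives in /usr/lib/x86_64-linux-gnu, as in the symbolic links above):

export LD_LIBRARY_PATH=/usr/lib/x86_64-linux-gnu:$LD_LIBRARY_PATH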

USB Rules (Intel® Movidius™ Neural Compute Stick)

To perform inference on Intel® Movidius™ Neural Compute Stick, install USB rules by running the following commands:

# Create a temporary udev rules file for the Intel Movidius Neural Compute Stick (vendor ID 03e7)
cat <<EOF > 97-usbboot.rules
SUBSYSTEM=="usb", ATTRS{idProduct}=="2150", ATTRS{idVendor}=="03e7", GROUP="users", MODE="0666", ENV{ID_MM_DEVICE_IGNORE}="1"
SUBSYSTEM=="usb", ATTRS{idProduct}=="f63b", ATTRS{idVendor}=="03e7", GROUP="users", MODE="0666", ENV{ID_MM_DEVICE_IGNORE}="1"
EOF
# Install the rules and reload udev so the device is accessible without root privileges
sudo cp 97-usbboot.rules /etc/udev/rules.d/
sudo udevadm control --reload-rules
sudo udevadm trigger
sudo ldconfig
# Remove the temporary copy
rm 97-usbboot.rules
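
To confirm that a plugged-in stick is visible over USB, you can list USB devices and look for the vendor ID 03e7 used in the rules above (lsusb is provided by the usbutils package on Ubuntu):

lsusb | grep 03e7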

Next Steps

Learn About the Intel® CV SDK

Before using the Intel® CV SDK, read through the product overview to gain a better understanding of how the product works.

Compile the Extensions Library

Some topology-specific layers, like DetectionOutput used in SSD*, are delivered as source code, so the extensions library must be compiled and loaded before they can be used. The extensions are required for inference with the pre-trained models. While you can build the library manually, the best way to compile the extensions library is to run the demo scripts described below.

Run the Demonstration Applications

To verify the installation, run the demo apps in <INSTALL_DIR>/deployment_tools/demo.

The demo apps and their functions are:

  • demo_squeezenet_download_convert_run.sh. This demo illustrates the basic steps used to convert a model and run it, using the Intel® Deep Learning Deployment Toolkit to perform a classification task with the SqueezeNet model. This demo:
    • Downloads a public SqueezeNet model.
    • Installs all prerequisites needed to run the Model Optimizer.
    • Converts the model to an Intermediate Representation.
    • Builds the Inference Engine Image Classification Sample from <INSTALL_DIR>/deployment_tools/inference_engine/samples/classification_sample.
    • Runs the sample using cars.png from the demo folder.
    • Shows the label and confidence for the top-10 categories.
  • demo_security_barrier_camera_sample.sh. This demo shows an inference pipeline using three of the pre-trained models included with the Intel CV SDK. The region found by one model becomes the input to the next. Vehicle regions found by object recognition in the first phase become the input to the vehicle attributes model, which locates the license plate. The region identified in this step becomes the input to a license plate character recognition model. This demo:
    • Builds the Inference Engine Security Barrier Camera Sample from the <INSTALL_DIR>/deployment_tools/inference_engine/samples/security_barrier_camera_sample.
    • Runs the sample using car_1.bmp from the demo folder.
    • Displays the resulting frame with detections rendered as bounding boxes and text.

For documentation on the demo apps, see the README.txt file in the <INSTALL_DIR>/deployment_tools/demo folder.
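
For example, a minimal end-to-end check (assuming the default installation directory; adjust <version> to match your installation) is to set up the environment and run the classification demo:

source /opt/intel/computer_vision_sdk_2018.0.<version>/bin/setupvars.sh
cd /opt/intel/computer_vision_sdk_2018.0.<version>/deployment_tools/demo
./demo_squeezenet_download_convert_run.sh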

Other Important Information

  • See the <INSTALL_DIR>/deployment_tools/inference_engine/samples/ folder and the samples overview documentation to learn about the range of samples available for the Inference Engine.
  • Before using the Model Optimizer to start working with your trained model, make sure your Caffe*, TensorFlow*, or MXNet* framework is prepared for any custom layers you have in place.
  • For developer guides, API references, tutorials, and other online documentation, see the Intel® CV SDK documentation.

Helpful Links

NOTE: Links open in a new window.

Intel® CV SDK Home Page: https://software.intel.com/en-us/computer-vision-sdk

Intel® CV SDK Documentation: https://software.intel.com/en-us/computer-vision-sdk/documentation/featured

Legal Information

You may not use or facilitate the use of this document in connection with any infringement or other legal analysis concerning Intel products described herein. You agree to grant Intel a non-exclusive, royalty-free license to any patent claim thereafter drafted which includes subject matter disclosed herein.

No license (express or implied, by estoppel or otherwise) to any intellectual property rights is granted by this document.

All information provided here is subject to change without notice. Contact your Intel representative to obtain the latest Intel product specifications and roadmaps.

The products described may contain design defects or errors known as errata which may cause the product to deviate from published specifications. Current characterized errata are available on request.

Intel technologies’ features and benefits depend on system configuration and may require enabled hardware, software or service activation. Learn more at http://www.intel.com/ or from the OEM or retailer.

No computer system can be absolutely secure.

Intel, Arria, Core, Movidius, Pentium, Xeon, and the Intel logo are trademarks of Intel Corporation in the U.S. and/or other countries.

OpenCL and the OpenCL logo are trademarks of Apple Inc. used by permission by Khronos.

*Other names and brands may be claimed as the property of others.

Copyright © 2018, Intel Corporation. All rights reserved.

