Channel: Intel Developer Zone Articles

Installing the OpenVINO™ Toolkit for Windows*


NOTE: The OpenVINO™ toolkit was formerly known as the Intel® Computer Vision SDK.
These steps apply to Windows* 10. For Linux* instructions, see the Linux installation guide.

Introduction

The OpenVINO™ toolkit quickly deploys applications and solutions that emulate human vision. Based on Convolutional Neural Networks (CNN), the toolkit extends computer vision (CV) workloads across Intel® hardware, maximizing performance. The OpenVINO™ toolkit includes the Intel® Deep Learning Deployment Toolkit. For more information, see the OpenVINO™ overview information on the Web site.

The OpenVINO™ toolkit:

  • Enables CNN-based deep learning inference on the edge
  • Supports heterogeneous execution across a CPU, Intel® Integrated Graphics, and Intel® Movidius™ Neural Compute Stick
  • Speeds time-to-market via an easy-to-use library of computer vision functions and pre-optimized kernels
  • Includes optimized calls for computer vision standards including OpenCV*, OpenCL™, and OpenVX*

Included with Installation

  • Deep Learning Model Optimizer: Model import tool. Imports trained models and converts them to IR format for use by the Deep Learning Inference Engine. This is part of the Intel® Deep Learning Deployment Toolkit.
  • Deep Learning Inference Engine: Unified API to integrate the inference with application logic. This is part of the Intel® Deep Learning Deployment Toolkit.
  • Drivers and runtimes for OpenCL™ version 2.1: Enables OpenCL 1.2 on the GPU/CPU for Intel® processors.
  • Intel® Media SDK: Offers access to hardware-accelerated video codecs and frame processing.
  • OpenCV* version 3.4.1: OpenCV community version compiled for Intel® hardware. Includes PVL libraries for computer vision.
  • OpenVX* version 1.1: Intel's implementation of OpenVX* 1.1 optimized for running on Intel® hardware (CPU, GPU, IPU).
  • Sample Applications: A set of simple console applications demonstrating how to use the Intel® Deep Learning Inference Engine in your applications. Additional information about building and running the samples can be found in the Inference Engine Developer Guide.
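To make the Model Optimizer's role concrete: it produces an Intermediate Representation (IR) consisting of a topology .xml file plus a weights .bin file, which the Inference Engine then loads. The sketch below parses a deliberately simplified, hypothetical topology XML to list its layers; the element layout here is an illustration only, not the exact IR schema the toolkit emits.

```python
import xml.etree.ElementTree as ET

# A simplified, hypothetical stand-in for an IR topology .xml file.
# Real files produced by the Model Optimizer carry far more detail
# (ports, shapes, weight offsets into the companion .bin file).
IR_XML = """
<net name="squeezenet" version="2">
  <layers>
    <layer id="0" name="data" type="Input"/>
    <layer id="1" name="conv1" type="Convolution"/>
    <layer id="2" name="prob" type="SoftMax"/>
  </layers>
</net>
"""

def list_layers(xml_text):
    """Return (name, type) pairs for each layer in the topology."""
    root = ET.fromstring(xml_text)
    return [(layer.get("name"), layer.get("type")) for layer in root.iter("layer")]

print(list_layers(IR_XML))
```

Inspecting the .xml side of an IR this way can help confirm that a conversion produced the layers you expect before handing the model to the Inference Engine.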

System Requirements

This guide includes only information related to Microsoft Windows* 10 64-bit. See the Linux installation guide for Linux information and instructions.

NOTE: Only the CPU and Intel® Integrated Graphics processor options are available. Linux is required for using the FPGA or Intel® Movidius™ Myriad™ 2 VPU options.

Development and Target Platforms

The development and target platforms have the same requirements, but you can select different components during the installation, based on your intended use.

Processor

  • 6th-8th Generation Intel® Core™
  • Intel® Xeon® v5 family, Intel® Xeon® v6 family

Processor Notes:

  • Processor graphics are not included in all processors. See https://ark.intel.com/ for information about your processor.
  • A chipset that supports processor graphics is required for Intel® Xeon® processors.

Operating System

Microsoft Windows* 10 64-bit

Installation

The steps in this guide assume you have already downloaded a copy of the OpenVINO™ toolkit for Windows*. If you do not have a copy of the package, you can download the latest version here, then return to this guide to proceed with installation.

Install External Software Dependencies

Install Core Components

  1. Download the OpenVINO™ toolkit. By default, the file is saved to Downloads as w_openvino_toolkit_p_2018.1.<version>.exe.
  2. Go to the Downloads folder.
  3. Double-click w_openvino_toolkit_p_2018.1.<version>.exe. A screen displays with options to choose your installation directory and components.
  4. Click Next.
  5. The next screen warns you about any missing components and the effect each missing component has on installing or using the OpenVINO toolkit.
  6. If you are missing a critical component, click Cancel, resolve the issue, and then restart the installation.
  7. When the installation completes, click Finish to close the wizard and open the Getting Started Guide in a browser.
  8. Make sure the installation directory is populated with sub-folders. The default installation location is C:\Intel\computer_vision_sdk_2018.1.<version>.
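The final check in step 8 can be scripted. The sketch below looks for a few sub-folders under the install root; both the root path and the folder names are assumptions based on the default layout, so adjust them to match what your installation actually contains.

```python
import os

# Hypothetical default install root; the version suffix varies per release.
DEFAULT_ROOT = r"C:\Intel\computer_vision_sdk_2018.1.0"

# Sub-folders one might expect after a successful install (assumed names).
EXPECTED = ["bin", "deployment_tools", "opencv"]

def missing_subfolders(root, expected=EXPECTED):
    """Return the expected sub-folders that are absent under root."""
    return [d for d in expected if not os.path.isdir(os.path.join(root, d))]

if __name__ == "__main__":
    gone = missing_subfolders(DEFAULT_ROOT)
    if gone:
        print("Missing:", ", ".join(gone))
    else:
        print("Installation directory looks populated.")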

Next Steps

IMPORTANT: Before using the Model Optimizer to work with your trained model, make sure your Caffe*, TensorFlow*, or MXNet* framework is prepared for any custom layers you have in place. The following information will help you get started.

Learn About the OpenVINO™ Toolkit

Before using the OpenVINO™ toolkit, read through the product overview information on the Web site to gain a better understanding of how the product works.

Compile the Extensions Library

Some topology-specific layers, like the DetectionOutput layer used in SSD*, are delivered as source code that assumes the extensions library is compiled and loaded. The extensions are required for inference with the pre-trained models. While you can build the library manually, the easiest way to compile the extensions library is to run the demo scripts.

Run the Demonstration Applications

To verify the installation, run the demo apps in <INSTALL_FOLDER>\deployment_tools\demo. For demo app documentation, see the README.txt in <INSTALL_FOLDER>\deployment_tools\demo.

The demo apps and their functions are:

  • demo_squeezenet_download_convert_run.bat. This demo illustrates the basic steps used to convert a model and run it, enabling the Intel® Deep Learning Deployment Toolkit to perform a classification task with the SqueezeNet model. This demo:
    • Downloads a public SqueezeNet model.
    • Installs all prerequisites to run the Model Optimizer.
    • Converts the model to an Intermediate Representation.
    • Builds the Inference Engine Image Classification Sample from <INSTALL_FOLDER>\deployment_tools\inference_engine\samples\classification_sample.
    • Runs the sample using cars.png from the demo folder.
    • Shows the label and confidence for the top-10 categories.
  • demo_security_barrier_camera_sample.bat. This demo shows an inference pipeline using three of the pre-trained models included with the OpenVINO toolkit. The region found by one model becomes the input to the next: vehicle regions found by object recognition in the first phase become the input to the vehicle attributes model, which locates the license plate. The region identified in this step becomes the input to a license plate character recognition model. This demo:
    • Builds the Inference Engine Security Barrier Camera Sample from <INSTALL_FOLDER>\deployment_tools\inference_engine\samples\security_barrier_camera_sample.
    • Runs the sample using car_1.bmp from the demo folder.
    • Displays the resulting frame with detections rendered as bounding boxes and text.
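The "top-10 categories" output of the SqueezeNet demo is essentially a ranking of the network's class scores. A minimal, framework-free sketch of that post-processing step is below; the labels and scores are made-up illustration data, not real model output.

```python
# Minimal sketch of top-N post-processing, independent of the
# Inference Engine: rank raw class scores and report label + confidence.
def top_n(scores, labels, n=10):
    """Return the n (label, score) pairs with the highest scores."""
    ranked = sorted(zip(labels, scores), key=lambda pair: pair[1], reverse=True)
    return ranked[:n]

# Made-up scores and labels standing in for a classifier's output.
scores = [0.02, 0.71, 0.05, 0.22]
labels = ["cab", "sports car", "minivan", "convertible"]
for label, conf in top_n(scores, labels, n=3):
    print(f"{label}: {conf:.2f}")
```

The classification sample performs the same ranking over the model's full set of categories before printing each label with its confidence.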

 

Legal Information

You may not use or facilitate the use of this document in connection with any infringement or other legal analysis concerning Intel products described herein. You agree to grant Intel a non-exclusive, royalty-free license to any patent claim thereafter drafted which includes subject matter disclosed herein.

No license (express or implied, by estoppel or otherwise) to any intellectual property rights is granted by this document.

All information provided here is subject to change without notice. Contact your Intel representative to obtain the latest Intel product specifications and roadmaps.

The products described may contain design defects or errors known as errata which may cause the product to deviate from published specifications. Current characterized errata are available on request.

Intel technologies’ features and benefits depend on system configuration and may require enabled hardware, software or service activation. Learn more at http://www.intel.com/ or from the OEM or retailer.

No computer system can be absolutely secure.

Intel, Arria, Core, Movidius, Xeon, OpenVINO, and the Intel logo are trademarks of Intel Corporation in the U.S. and/or other countries.

OpenCL and the OpenCL logo are trademarks of Apple Inc. used by permission by Khronos.

*Other names and brands may be claimed as the property of others.

Copyright © 2018, Intel Corporation. All rights reserved.

