Introduction
This article investigates using the Intel® Edison platform for small-scale autonomous vehicles and remote surveillance.
Materials
● Hardware
Here we use the Devastator Tank* Mobile Robot Platform from DFRobot*, driven by a Romeo* board powered by the Intel® Edison module.
The kit includes two servomotors, a UVC-compliant webcam, a passive infrared (PIR) sensor, and an ultrasonic range finder.
● Software
The client device software is written in Node.js* using the Intel® XDK, and leverages the mraa libraries for peripheral access (motors, lights, etc).
Objectives
● Construct an Odometer
To assist in navigation and distance tracking, we will construct an odometer using a magnetically actuated switch called a reed switch, coupled with a small neodymium magnet.
● Push Images to the Edge Device
Using Secure Copy (SCP), we will push surveillance images from the robot to an edge device, and walk through how to utilize ImageMagick*, an open-source software application, to scan for an object’s presence.
Recommended Reading
Before proceeding, it might be helpful to reference one of the following articles, which describe construction of the robot discussed in this article, the addition of a Wi-Fi enabled camera, and programming the tank using the Intel® XDK and mraa libraries.
Building a Robotic Platform Using the Intel® Edison Module
https://software.intel.com/en-us/articles/overview-of-intel-edison-based-robotics-platform
Using the Intel® Edison module to control robots
https://software.intel.com/en-us/articles/using-the-intel-edison-module-to-control-robots
Programming Robotics using the Intel® XDK, Node.js, and mraa library
Software Design for Basic Autonomous System
Before we talk about hardware, let’s first go through the design approach for our autonomous robot. Using the Intel® XDK, Node.js*, and the mraa libraries we will construct an application that will abstract the lower level complexities involved with hardware communication. We will also review each of the robot sensors, and how each might be modeled in software. Sample code for each class and JavaScript* file can be found in the appendix at the end of this document.
Our goal is to design a robot that can independently navigate a course, offer obstacle avoidance, and provide remote surveillance by uploading images for offline analysis.
Using a collection of sensors, and knowledge of the course dimensions, we can build a software solution that allows autonomous navigation while collecting and providing constant feedback on location and surrounding objects. Image capture can be triggered programmatically when a destination has been reached, and the images uploaded to a local gateway device for real-time or offline analysis.
We divide the application into separate JavaScript* files, each of which represents an object in our working solution. There are files for:
1) Robot
2) Light
3) Camera
4) Object detector
5) Distance detector
6) Navigation
7) CourseLeg
8) Odometer
The entry point of the application is main.js, which is responsible for instantiating the robot and the course, responding to heartbeat timers and sensor events, and kicking off course execution. A constants file provides centralized configuration that can be referenced from any other module. The heart of the autonomous capability lies in the Navigation.js module, which is responsible for all directional motor control, course tracking, and exposure of course and robot status by way of properties and triggered events. The relationship of the classes can be seen in the class diagram below.
PROGRAM EXECUTION FLOW
The application is designed to execute asynchronously, responding to both timer-driven and event-driven triggers that come from the robot. main.js contains one main timer, called heartbeat, that is started when the application launches; the default callback simply displays robot status. The Navigation class also has one timer, which executes much faster and is responsible for tracking robot movement through the course as well as issuing navigation commands. This timer is also stopped when a course ends. Wheel rotation events are likewise handled in the Navigation class: the Odometer class fires an external event, 'WheelRevolutionEvent', on every edge trigger detected on the odometer's GPIO pin.
Summary of configurable timer objects:
Main.js
timerHeartBeat
Navigation.js
timerPolling
Callbacks:
Navigation.js
odometer.SensorEvent.on('WheelRevolutionEvent')
onEndOfLegEvent()
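Both timers use the standard Node.js setInterval/clearInterval pattern, and the sensor events ride on the Node.js EventEmitter. The following minimal sketch shows how the pieces relate; it is simplified from main.js and Navigation.js, and assumes odometer is an instance of the Odometer class described later:

var constants = require("./constants.js");

// Heartbeat: slow status poll owned by main.js
setInterval(function _onHeartBeat() {
    if (constants.DEBUG) console.info("[HEARTBEAT] robot status check");
}, constants.POLLINGHEARTBEAT_MS);

// Navigation poll: faster timer owned by Navigation.js,
// stopped with clearInterval(timerPolling) when the course ends
var timerPolling = setInterval(function _onNavigationCheck() {
    // check for end of leg, turning state, and when to start the next leg
}, constants.POLLINGNAV_MS);

// Wheel revolutions: event driven rather than polled
odometer.SensorEvent.on('WheelRevolutionEvent', function () {
    // increment the revolution count for the current course leg
});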
MAIN.JS
At the heart of the application, the main.js script facilitates:
- Setting up the Robot, Distance, Presence, and main Navigation object.
- Listening for heartbeat timer ticks to check overall health
- Executing response listeners for Object and Distance detection
- Communicating directly with the Navigation object for inquiry and control
While most of the logic for the application occurs within the Navigation class, main.js is still responsible for the highest-level health status checks using a timer-based polling mechanism. It also handles events triggered by the distance-detection and object-presence objects. Finally, main.js is responsible for enabling/disabling the on-board camera and transmitting images to a remote server.
Constants.js
The constants file provides a single place for all configuration data. Any value that might need to change should be placed in this file. Throughout the code, these constants can then easily be referenced by 1.) adding a require statement to include the constants.js file, and 2.) referencing the desired constant.
Example
1.) var constants = require("./constants.js");
2.) var objectDetector = new ObjectDetector(constants.GPIO_PIR_SENSOR, false);
Robot.js
The Robot object encapsulates properties and operations relevant to the robot itself, without consideration of any external sensors or functions. The constructor takes parameters for the most important fields, such as I2C* details and physical characteristics. The lights are added by way of composition and can easily be toggled through an external interface. Finally, the servomotors are created as individual instances, with public methods exposed for directional control of the camera and distance-finder modules.
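For illustration, here is how main.js constructs and exercises the robot object (the constructor arguments, bus 0, address 4, and the 0x55/0xAA message headers, are taken from the main.js listing in the appendix):

var Robot = require("./Robot.js");
var constants = require("./constants.js");

// I2C bus 0, device address 4, message headers 0x55/0xAA, plus physical dimensions
var thisBot = new Robot(0, 4, 0x55, 0xaa, "SurveyOne",
                        constants.TRACK_LENGTHCM, constants.SPROCKET_DIAMETERMM);

thisBot.TurnLightsOn();   // toggles both LEDs through the composed Light objects
thisBot.Look(90, 90);     // centers the pan and tilt servos
thisBot.TurnLightsOff();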
Light.js
The Light object is very straightforward. It indicates which GPIO pin the LED is attached to and provides an external interface for toggling the light on or off. The lights are part of the Robot object composition.
ObjectDetector.js
The ObjectDetector class subscribes to changes in the sensor and broadcasts them, using an event emitter, to any interested clients. It also exposes methods for toggling listening, resetting the trigger, and checking the current value. Finally, a SensorEvent is triggered each time an object is detected.
Listening and reacting to the PIR events is accomplished by setting up a subscription and listener in the main.js file. Alternatively, or in addition, the value of the sensor can be checked on demand using a polling mechanism. The code example below shows both approaches, both of which are housed in our main.js script.
(From main.js)
var ObjectDetector = require("./ObjectDetector.js");
var objectDetector = new ObjectDetector(constants.GPIO_PIR_SENSOR, false);

//Start Listening
objectDetector.StartListening();

//Check using Polling
if (objectDetector.Listening){
    objectDetector.CheckForObject();
}

//Event Listener - Only called when status changes on the device
objectDetector.SensorEvent.on('ObjectDetected', function _onPIRSensorTriggered() {
    //PIR sensor triggered
    if (constants.DEBUG)
        console.info("[OBJECT DETECTION]. Something detected.");
});
DistanceDetector.js
The DistanceDetector class creates an interface to the ultrasonic range finder and rebroadcasts events when an object is detected. To save power, the device does not scan continuously; it can be queried at any time using the public methods:
GetDistance() - Returns the distance in centimeters to the closest object.
IsObjectClose(thresholdCM) - Returns true/false depending on whether an object is within the threshold distance.
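As a sketch, an on-demand distance check might look like the following; the UART number comes from constants.GPIO_DIST_UART, while the 9600 baud rate and the 20 cm threshold are illustrative assumptions:

var DistanceDetector = require("./DistanceDetector.js");
var constants = require("./constants.js");

// UART 0 per the constants file; 9600 baud is an assumed module default
var distanceDetector = new DistanceDetector(constants.GPIO_DIST_UART, 9600);

var cm = distanceDetector.GetDistance();
console.info("Closest object: " + cm + " cm");

if (distanceDetector.IsObjectClose(20)) {  // hypothetical 20 cm threshold
    console.warn("Obstacle ahead - stopping.");
}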
Odometer.js
The odometer class is responsible for triggering external events when an Interrupt Service Routine (ISR) is triggered on the configured GPIO pin.
Setting up the ISR to detect the continuity trigger is accomplished by simply calling the isr() method on the gpioPin object and creating a listener. Inside the handler, the public event is then triggered, which will be handled by the Navigation object’s listener method.
Here is an example taken from the Odometer constructor that shows setting up the ISR and firing the external event. The full source code can be found in the appendix at the end of the document.
this.gpioPin = new mraa.Gpio(pinNumber);
this.gpioPin.dir(mraa.DIR_IN);
this.gpioPin.isr(mraa.EDGE_BOTH, function _onPIRAlert() {
    if (self.Listening)
        self.SensorEvent.emit('WheelRevolutionEvent');
});
CourseLeg.js
The course currently being navigated is a composition of CourseLegs that is referenced in the Navigation object. Each individual CourseLeg maintains the pertinent information for that leg, such as how far to go in centimeters, the revolutions required to travel that distance, and which direction to turn at the end. The conversion from distance to revolutions is done in the CourseLeg constructor using the track length and sprocket diameter stored on the robot, as sketched below.
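The math is a one-liner. Using the kit's 45 mm sprocket, one revolution moves about 14.1 cm of track, so a 100 cm leg needs roughly seven revolutions. A standalone sketch of the computation as the CourseLeg constructor performs it:

// Distance covered per revolution = circumference = pi * diameter
// e.g. 3.14 * 4.5 cm = ~14.1 cm of track per revolution
var sprocketDiameterMM = 45;     // constants.SPROCKET_DIAMETERMM
var distanceForwardCM = 100;     // hypothetical leg length
var revolutionsRequired = distanceForwardCM / (3.14 * (sprocketDiameterMM / 10));
console.info("Revolutions required: " + revolutionsRequired.toFixed(2)); // ~7.08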
Navigation.js
The navigation class is the largest in the design and is responsible for robot control and management of the CurrentCourse, which is represented by a collection of CourseLeg objects. The CurrentCourse[] collection is populated inside of the SetCourse() method, which is expected to be called prior to starting any course.
On initialization of the Navigation object, an odometer instance is created, which internally triggers a public event, "WheelRevolutionEvent", on each wheel revolution. A callback handler captures these events, ensuring that the current wheel rotation count is incremented when applicable; a minimum cycle-time filter ignores duplicate hits while the magnet is approaching or departing the reed switch. The handler also manages leg and course termination logic by calling a method or triggering a separate event.
Navigation Odometer revolution handler:
this.odometer.SensorEvent.on('WheelRevolutionEvent', function _OnRobotWheelRevolution(){
    if (self.CurrentLeg < CurrentCourse.length) {
        if ((Date.now() - lastClickTime) > constants.ODO_CYCLE_TIME_MIN) {
            lastClickTime = Date.now();
            var current = ++CurrentCourse[self.CurrentLeg].CurrentBotRevolutions;
            var required = CurrentCourse[self.CurrentLeg].RevolutionsRequired;
            console.info("CLICK[" + current + "] of [" + required + "]");
            //Trigger the end of Leg Event
            if (current >= required) {
                if (current > required) {
                    console.info("Received a click event after required has been met.");
                }
                self.RobotStop();
                self._OnEndOfLegEvent();
            }
        }
    }
    else {
        if (constants.DEBUG) console.info("---- COURSE ENDED ----");
        self.StopCourse();
    }
});
Once a course has started, a callback timer periodically checks on the navigational status of the robot to determine if it is time to turn, if it is currently turning, or ready to start the next leg.
The following methods are called to start and stop the polling navigation timer.
this._StartTimerNavigationPoll = function _StartTimerNavigationPoll() {
    timerPolling = setInterval(this._onNavigationCheck, constants.POLLINGNAV_MS);
};

this._StopTimerNavigationPoll = function _StopTimerNavigationPoll() {
    console.info("[NAV]: STOPPING polling timer");
    clearInterval(timerPolling);
    _navigationStopped = true;
};
The complete source code for this class can be found in the Appendix at the end of this document.
Camera.js
A USB Video Class (UVC) camera is used to take still images for upload to a remote server. The Camera object exposes a simple interface for taking pictures on demand. When an image capture is requested, a child process is created to execute an external application, fswebcam, which takes the snapshot; the image is then uploaded to a remote server using SCP.
To position the camera, call Robot.Look(anglePan, angleTilt). The Robot class holds instances of the servomotors that point the camera at a given X, Y orientation, and public methods are available to move the camera to any position using Look(angleX, angleY). For instance, to center the camera, call LookCenter() on the robot, which centers both servos at 90 degrees.
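Putting the two together, capturing a surveillance snapshot at a waypoint might look like this sketch, assuming thisBot is the Robot instance created in main.js (the Camera constructor takes a resolution string and a greyscale flag, per Camera.js in the appendix):

var Camera = require("./Camera.js");

var camera = new Camera("800x600", true);   // 800x600 capture, greyscale on

thisBot.LookCenter();                       // pan and tilt servos to their 90-degree centers
camera.TakePicture("Waypoint");             // capture, then SCP the image to the edge device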
Start by installing the fswebcam package:
opkg install fswebcam
Once installed and verified, add the camera device details to the constants.js file, and set the CAMERA_USE flag to true. Setting this flag is required for any images to be captured.
Image capture configurations:
FTP_REMOTE_IP : "192.168.1.166",
FTP_REMOTE_BASE_FOLDER : "/home/root/inbound",
FTP_REMOTE_USER : "username",
FTP_REMOTE_PASSWORD : "passwd",
CAMERA_USE : true
The source code for this class can be found in the appendix at the end of this document.
Robot Automation Components
Adding automation capabilities requires numerous sensors to monitor the surrounding environment. We will need the ability to track distance, detect object presence and distance, and capture and submit image data for real-time analysis to a nearby edge device. All of the components required and discussed in this article are provided in the kit, with the exception of the odometer, which we will construct. This additional component requires a small (<10 mm) button neodymium magnet, a reed switch, and two resistors. We walk through construction of the odometer in the section below.
The component pin mapping used in these examples is shown below.
Component | Peripheral | Intel® Edison Pin | Translated Pin | MRAA Pin
Passive Infrared Sensor | Digital Input | GPIO43 | D11 | 38
Tilt Servo | PWM | PWM1 | D5 | 14
Pan Servo | PWM | PWM0 | D3 | 20
Ultrasonic Sensor | UART TX / UART RX | GP131 / GP130 | D1/TX / D0/RX | UART0
Left LED | Digital Output | GPIO48 | D7 | 33
Right LED | Digital Output | GPIO41 | D10 | 51
Buzzer | PWM | PWM2 | D6 | 0
Brushed DC Motors | I2C SCL / I2C SDA | SCL1 / SDA1 | I2C1_SCL / I2C1_SDA | I2C0
Odometer | Digital Input | GP129 | D4 | 25
POWER
The numerous sensors, combined with parallel motor movement, Wi-Fi transmission, and a USB-powered camera, require more power than the default 9 volts. Initial tests on the Devastator* platform show that 9 volts is not sufficient to consistently power the DC motors in parallel with the Wi-Fi enabled camera. In addition, unpredictable behavior was observed below 9 volts, including servo alignment and motor controller direction issues. We therefore increase the input Vcc to around 14 volts by adding four more AA batteries. Having rechargeable batteries on hand is very helpful during testing.
SENSORS
To help our robot navigate autonomously, we will need to employ a few sensors, which are outlined in the following table:
Sensor | Description | In Kit
Passive Infrared Sensor | Object detection | YES
Ultrasonic Range Finder | Detects distance to an object | YES
Odometer | Counts revolutions, tracks distance traveled | NO
First, to know where or how far to go, we have to know how far we have come. Distance traveled is measured using a component called an odometer, which we will construct. Second, knowing whether something is within our field of motion is critical to avoid hitting it; the passive infrared sensor helps here by detecting objects that emit infrared radiation (such as humans or small animals). Finally, the ability to detect how far away a physical object is becomes a crucial part of autonomous navigation. Distance detection, using an ultrasonic range finder, allows our robot to detect an in-path obstacle as well as determine how close we are to a wall as we travel alongside it, such as in a maze.
Object Detection - Passive Infrared
The passive infrared (PIR) module is connected to digital GPIO pin D11 and toggles its output HIGH when an object is detected within range.
Object Distance – Ultrasonic Range Finder
Knowing the distance to an object is as important a feature of autonomous navigation as knowing whether something is there at all. For this we employ a device called an ultrasonic range finder, which bounces ultrasonic sound waves off surrounding objects and, by measuring the elapsed time of the return signal, provides the distance to the object.
For this example, we wire the range finder to communicate using UART, and as discussed in the software section, respond to events in the DistanceDetector class.
Pin connections from RIGHT TO LEFT: +5, GND, Rx, Tx
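The module is driven with a simple four-byte framed protocol: a trigger command whose final byte is an additive checksum, and a four-byte reply carrying the distance as a 16-bit value. The following condensed sketch of the exchange is adapted from DistanceDetector.js in the appendix; the 0x22 command byte is specific to this module, and the 9600 baud rate is an assumed default:

var mraa = require("mraa");

var uart = new mraa.Uart(0);        // UART0, per the pin mapping table above
uart.setBaudRate(9600);             // assumed module default
uart.setMode(8, 0, 1);              // 8 data bits, no parity, 1 stop bit
uart.setFlowcontrol(false, false);

// Trigger frame: command byte, two data bytes, additive checksum
var command = new Buffer([0x22, 0x00, 0x00, 0x22]);
uart.write(command);

// Reply frame: [header, distance high byte, distance low byte, checksum]
var rxBuf = uart.read(4);
if (rxBuf[3] === ((rxBuf[0] + rxBuf[1] + rxBuf[2]) & 0xFF)) {
    var distanceCM = (rxBuf[1] << 8) | rxBuf[2];
    console.info("Distance: " + distanceCM + " cm");
}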
Odometer – Track distance travelled
There are multiple approaches to tracking distance traveled, including GPS, magnetometers, IMUs, and optical versus conductive revolution counting. Each method has its pros and cons. For instance, GPS typically only provides accuracy down to about three meters, while maneuvering a small robot requires much finer accuracy. Similarly, using a magnetometer or an IMU to determine relative distance can be inaccurate due to disruptions in the magnetic field near the device, perhaps caused by the spinning DC motors.
A simple approach to tracking distance is to construct a revolution counter using a magnetic switch. The parts required for construction of this odometer can easily be found online:
PARTS LIST
Reed Switch X 1
200 Ω resistor X 2
10mm Neodymium magnet X 1
Crafters glue, Electrical tape, and Jumper wires
A reed switch is a small component that houses two pieces of ferromagnetic metal a few millimeters apart. When a magnet comes near the switch, the two pieces are drawn together, closing the circuit and allowing electricity to flow. For more information on reed switches, see the following wiki:
https://en.wikipedia.org/wiki/Reed_switch
ASSEMBLY
Onto the outer sprocket wall that faces closest to the chassis, attach a button neodymium magnet. Position the magnet onto a stable portion of the sprocket with super glue, and then place the sprocket back onto the tank.
Onto the outside chassis near where the sprocket turns, attach a reed switch in line with a 200Ω resistor to an open digital pin and ground. Attach the other end of the reed switch to an open power pin on the GPIO rail with an additional resistor. The prototype of the circuit can be seen below. In this example, you can see the matching colored wires at the bottom, ready for attachment to the rail: Green is Signal, Red is input voltage, and Black is Ground.
[Figure 1 Layout of Odometer prototype]
[Figure 2 Wired prototype]
[Figure 3 Odometer connected]
Secure with tape, and test one more time before attaching the odometer to the chassis.
The completed installation should look like the image in Figure 4 below. You can see the magnet attached to the sprocket, just to the right of the reed switch that is affixed to the outside of the chassis with a small piece of tape. Pull the LED through the front of the robot and secure with some electrical tape.
[Figure 4 Reed Switch and Magnet installed]
[Figure 5 Front of robot with mounted LED indicator]
With the hardware completed, we can focus on the software implementation, which can be verified using an Arduino script. In this example, I used an Arduino 101 (branded Genuino outside the U.S.) attached to a breadboard. The migration of this sample code** to the Intel® Edison is discussed in the previous section detailing the Odometer class.
/*Sample Arduino code** */
int revolutionCount = 0;
unsigned long startCycle = 0;          //unsigned long to match millis() (the original listing used int)
unsigned long CYCLE_TIME_MIN = 200;

void setup() {
    Serial.begin(9600);
    attachInterrupt(digitalPinToInterrupt(2), isr, FALLING);
}

void isr(){
    if (startCycle == 0){
        startCycle = millis();
        Serial.print("[TURN START] ");
        Serial.println(++revolutionCount);
    }
    else if ((millis() - startCycle) > CYCLE_TIME_MIN){
        startCycle = 0;
        Serial.print("-----[TURN END] ");
        Serial.println(revolutionCount);
    }
}

void loop() {
}
Attach a piece of tape to the sprocket, and count the turns for different distances and speed combinations. Compare the visual observations with what is reported at the end of each cycle as shown in Figure 6.
[Figure 6 Arduino* sample output revolution counting]
Figure 7 below shows the data collected using the magnetic reed switch odometer. Each rise in signal voltage triggers an increment of the revolution counter in software.
[Figure 7 Arduino* Plotter output]
To complete calibration of the odometer, we simply need to know the diameter of our wheel sprocket in millimeters. Knowing the diameter, we can compute the circumference, which is 2πr, or πd. One full revolution of the sprocket moves one circumference of tread, which equals the distance traveled. We can also capture the tread length and compute a ratio as an additional reference computation.
We find that the track length is 48 cm and the diameter of the drive sprocket is 45 mm, giving a circumference of roughly π × 45 ≈ 141 mm, or about 14.1 cm of travel per revolution. We enter these values into our constants.js configuration file (TRACK_LENGTHCM and SPROCKET_DIAMETERMM) and can later run tests to see how close our estimates are.
VISION
A basic requirement for surveillance is to capture an image snapshot. The kit we are using comes with a UVC compliant webcam, so our hardware is ready to go. If your camera is not working correctly, and you are unable to see a /dev/video* device, you might want to take a look at the following article, “Using the Intel Edison Module to Control Robots”.
Also included with the kit are two servomotors, one for panning left and right along the X axis and another for tilting up and down along the Y axis. Before using the servomotors, be sure all of the wires have enough slack to cover the full range of motion and that they do not protrude onto the moving tracks. The servos should be connected to the appropriate PWM GPIO pins. The software for controlling the camera is covered in the section detailing Camera.js.
Autonomous navigation
For our robot to move by itself, it is important that it understands how to follow a static path. To begin, we must first establish the ability to accurately move in a known direction for a known distance, and also to take turns accurately. During setup of a course, the wheel circumference is referenced to set the number of revolutions per leg. During navigation, this course leg limit is referenced to determine if the end of leg has been reached, which can then trigger a call to StartTurn() or EndCourse().
The Navigation class manages the movement of the robot through a course by controlling directional speed and movement based on a fixed course map. The CurrentLeg is tracked as a property on the Navigation object; it is incremented after each turn completes and compared against the number of legs in the course to ensure no overrun occurs.
Navigation.js Pseudocode
FOR EACH Leg in Course
GO FORWARD(speed)
EVERY 500ms – TIMER POLL
IF NOT Turning AND CourseLeg.EndOfLeg
IF Is Time to Turn
TURN, start TurnTimerCallback
ELSE IF Stopped Turning
START NEXT LEG
ELSE IF EndOfCourse
Stop Polling Timer
onTurnTimerCompleted
StopRobot
StoppedTurning = true
Also, at any point during the journey, a camera image can be uploaded:
this.camera.TakePicture("StartTurn");
this.camera.TakePicture("EndTurn");
Going Further
The Navigation.js class handles where the robot needs to go by iterating the collection of CourseLegs; however, it does not know how accurate its forward distances and turning times are, nor whether it has drifted off course. Even with knowledge of these factors, different surfaces as well as available battery power add variables that must be considered. The Navigation and CourseLeg classes could easily be extended to accept adjustments, such as motor speed offsets, through configuration. Another nice touch would be to include drift correction in response to known alignment issues, or to use IMU outputs from an Intel® Curie™ module as an additional point of reference for directional control. Given any new terrain, you could then run a few tests and feed the observed offsets back into the calculations for each leg of the journey.
Image processing at the Edge
An Intel® IoT Gateway can be used as our edge device, providing data storage for the surveillance images and an execution environment for image-analysis services. On the client side, the Camera.js class is responsible for capturing images using a call to fswebcam and transferring them using Secure Copy (SCP), so our server simply needs SCP configured correctly. Most Linux* builds come with SCP installed; if not, it can easily be added by installing the openssh-client package.
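Under the hood, the transfer that Camera.js performs reduces to a single scp invocation wrapped in sshpass so that the password can be supplied non-interactively. Using the placeholder credentials from constants.js (the image filename shown here is hypothetical), the command looks roughly like:

sshpass -p 'passwd' scp /home/root/Waypoint_20170301.jpg username@192.168.1.166:/home/root/inbound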
Once the surveillance images are captured with fswebcam, they can be scanned for object presence. By taking a series of images from a fixed location, the images can be analyzed for differences using an application such as ImageMagick*, which provides a convenient command-line interface for comparing two images.
For more information, see ImageMagick* at http://www.imagemagick.org/.
On Ubuntu:
sudo apt-get install imagemagick
Now create a script to parse a set of images in a folder and tag them when differences are detected. With the compare command, the Absolute Error (AE) metric reports when the pixels in two files differ in any way. Applying a fuzz percentage eliminates unwanted noise for a more accurate result. The -metric parameter tells compare to output a number representing the pixel difference, rather than an image; compare still expects an output image argument, and passing null: discards it:
compare -metric AE -fuzz 5% image1 image2 null:
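As a sketch of such a script, the following Node.js snippet takes the same child_process approach as Camera.js: it walks an inbound folder, compares each image against its predecessor, and tags pairs whose pixel difference exceeds a threshold. The folder path and threshold are illustrative assumptions:

var fs = require('fs');
var cpLib = require('child_process');

var FOLDER = "/home/root/inbound";  // assumed inbound image folder on the gateway
var THRESHOLD = 500;                // assumed pixel-difference trigger level

// Sort so consecutive captures are compared in order
var images = fs.readdirSync(FOLDER).filter(function (f) {
    return f.indexOf(".jpg") !== -1;
}).sort();

for (var i = 1; i < images.length; i++) {
    (function (prev, curr) {
        var cmd = "compare -metric AE -fuzz 5% " + FOLDER + "/" + prev + " "
                + FOLDER + "/" + curr + " null:";
        cpLib.exec(cmd, function (err, stdout, stderr) {
            // compare writes the AE pixel count to stderr
            var pixelsChanged = parseInt(stderr, 10);
            if (pixelsChanged > THRESHOLD) {
                console.info("[TAGGED] " + curr + ": " + pixelsChanged + " pixels differ");
            }
        });
    })(images[i - 1], images[i]);
}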
Source Code
constants.js**
/* constants.js */
// Matt Chandler
// Intel, Corp.
// March, 2017

//NOTE that all GPIO PINS are MRAA Mappings to Edison pins
// https://iotdk.intel.com/docs/master/mraa/edison.html
// which are mapped using the Romeo schematic's GPIOs to edison pins.

/* jslint node:true */
/* jshint unused:true */

module.exports = {
    SIMULATION : true,          //Do not talk to motors, run sensors using callback timers
    DEBUG : true,               //Show more debug messages

    POLLINGNAV_MS : 1000,       //Navigation callback timer, should be lowest number=quickest
    POLLINGHEARTBEAT_MS : 2000, //overall status poll/health check

    I2CBusNumber : Number(0),
    I2CAddress : Number(4),
    I2CMotorDirectionLeft : 0xB1,
    I2CMotorDirectionRight : 0xB2,
    I2CMotorSpeedLeft : 0xC1,
    I2CMotorSpeedRight : 0xC2,

    MAX_SPEED : 0xF9,

    RIGHT : 0,
    LEFT : 180,
    FORWARD : 90,
    BACKWARD : 270,

    MOTOR_SPEED_TURN : 0x90,

    LIGHT_LEFT : 0,
    LIGHT_RIGHT : 1,

    TRACK_LENGTHCM : 48,
    SPROCKET_DIAMETERMM : 45,

    MOTOR_MSG_CLOCKWISE : 0x0,
    MOTOR_MSG_COUNTERCLOCKWISE : 0x1,
    MOTOR_SPEED_CRAWL : 0x10,
    MOTOR_SPEED_WALK : 0x20,
    MOTOR_SPEED_JOG : 0x30,
    MOTOR_SPEED_RUN : 0x80,

    GPIO_LIGHT_LEFT : 33,
    GPIO_LIGHT_RIGHT : 51,
    GPIO_BUZZER : 0,
    GPIO_DIST_UART : 0,
    GPIO_PIR_SENSOR : 38,
    GPIO_ODO_CONTACT_SIGNAL : 50,
    GPIO_ODO_CONTACT_MONITOR : 25,
    GPIO_PAN : 20,
    GPIO_TILT : 14,

    SERVO_PERIOD_MS_PAN : 10000,
    SERVO_PERIOD_MS_TILT : 10000,
    SERVO_PULSE_WIDTH_PAN_0 : 1000,
    SERVO_PULSE_WIDTH_PAN_90 : 1500,
    SERVO_PULSE_WIDTH_PAN_180 : 2000,

    ODO_CYCLE_TIME_MIN : 500,

    TURNTIME_90_MS : 1000,      //Assumed value: ms to complete a 90-degree turn.
                                //Referenced by Navigation.js but missing from the original listing.

    FTP_REMOTE_IP : "192.168.1.166",
    FTP_REMOTE_BASE_FOLDER : "/home/root/inbound",
    FTP_REMOTE_USER : "username",
    FTP_REMOTE_PASSWORD : "passwd",
    CAMERA_USE : true,

    LEG_MAXTIME_MS : 10000,

    MOTOR_RIGHT_OFFSET : 0,
    MOTOR_LEFT_OFFSET : 50,
    DRIFT_ANGLE_PER_METER : 5
};
main.js**
/* main.js */
// Matt Chandler
// Intel, Corp.
// March, 2017

/* jslint node:true */
/* jshint unused:false */

var m = require("mraa");
var Robot = require("./Robot.js");
var courseLeg = require("./CourseLeg.js");
var Navigation = require("./Navigation.js");
var constants = require("./constants.js");
var ObjectDetector = require("./ObjectDetector.js");
var DistanceDetector = require("./DistanceDetector.js");

_setupListeners();
_buildCourse();

//START PROGRAM
var thisBot = new Robot(0, 4, 0x55, 0xaa, "SurveyOne",
                        constants.TRACK_LENGTHCM, constants.SPROCKET_DIAMETERMM);
var thisNavigationControl = new Navigation(thisBot);
var objectDetector = new ObjectDetector(constants.GPIO_PIR_SENSOR, false);

_enableSensors();
_initialize(thisBot);

//--------------------------------------------
// LISTENERS/Callbacks
function _onHeartBeat() {
    if (constants.DEBUG===true) {
        ShowStatus();
    }
}

function ShowStatus(){
    var statusString = "-----[HEARTBEAT]: " + thisBot.Name;
    if (thisBot.IsCurrentlyMoving)
        statusString += " MOVING on LEG[" + thisNavigationControl.CurrentLeg + "]";
    else
        statusString += " IDLE";
    statusString += " [PIR]: ";
    if (objectDetector.Listening===true)
        statusString += "on";
    else
        statusString += "off";
    statusString += " , ObjectDetected=" + objectDetector.ObjectDetected;
    statusString += "-----";
    console.info(statusString); //Display the assembled status (missing from the original listing)
}

objectDetector.SensorEvent.on('ObjectDetected', function _onPIRSensorTriggered() {
    //PIR sensor triggered
    if (constants.DEBUG) console.info("[OBJECT DETECTION]. Something detected.");
});

function _reset() {
    this.navigation.robotStop();
}

function _setupListeners() {
    console.info("Setting up listeners. [Heartbeat]");
    setInterval(_onHeartBeat, constants.POLLINGHEARTBEAT_MS);
}

function _buildCourse() {
    console.info("Building course");
}

function _initialize(bot) {
    console.info("Initializing robot: ");
    console.info(thisBot.StatusString);
    thisBot.TurnLightsOff();
    thisNavigationControl.StartWarmupSequence();
}

function _enableSensors(){
    objectDetector.StartListening();
    if (constants.DEBUG) console.info("PIR Sensor enabled: ");
}

function _ToggleLights(lightState){
    thisBot.ToggleLight(constants.LIGHT_LEFT, lightState);
    thisBot.ToggleLight(constants.LIGHT_RIGHT, lightState);
}
Robot.js**
/* Robot.js */
// Matt Chandler
// Intel, Corp.
// March, 2017

/* jslint node:true */
/* jshint unused:false*/

var constants = require("./constants.js");
var Light = require("./Light.js");

try {
    require.resolve("mraa");
}
catch (e) {
    console.error("Critical: mraa node module is missing, try 'npm install -g mraa' to fix.", e);
    process.exit(-1);
}
var mraa = require("mraa");

module.exports = Robot;

function Robot(I2CBusNumber, I2CAddress, I2CMessageHeader1, I2CMessageHeader2, name,
               tracklengthCM, sprocketDiameterMM, panPinNumber, tiltPinNumber,
               panPeriodus, tiltPeriodus, pwPan0, pwPan90, pwPan180,
               pwTilt0, pwTilt90, pwTilt180, panDelayAfterPosition, tiltDelayAfterPosition) {

    var self = this; //captured for use inside the setTimeout callbacks below

    this.Name = name;
    this.I2CBusNumber = I2CBusNumber;
    this.I2CAddress = I2CAddress;
    this.SprocketDiametermm = sprocketDiameterMM;
    this.TrackLengthcm = tracklengthCM;
    this.I2CMessageHeader1 = I2CMessageHeader1;
    this.I2CMessageHeader2 = I2CMessageHeader2;
    this.IsCurrentlyMoving = false;
    this.IsCurrentlyTurning = false;
    this.CurrentDirection = constants.FORWARD;
    this.CurrentSpeed = 0x00;

    // Setup the Servo motors
    this.pinNumberPan = panPinNumber;
    this.pinNumberTilt = tiltPinNumber;
    this.pwPan0 = pwPan0;
    this.pwPan90 = pwPan90;
    this.pwPan180 = pwPan180;
    this.pwTilt0 = pwTilt0;
    this.pwTilt90 = pwTilt90;
    this.pwTilt180 = pwTilt180;
    this.panDelayAfterPosition = panDelayAfterPosition;
    this.tiltDelayAfterPosition = tiltDelayAfterPosition;

    var tiltServo = new mraa.Pwm(parseInt(tiltPinNumber));
    var panServo = new mraa.Pwm(parseInt(panPinNumber));
    tiltServo.period_us(parseInt(tiltPeriodus)); //100Hz -> 10ms Period
    panServo.period_us(parseInt(panPeriodus));   //100Hz -> 10ms Period

    //Setup the lights
    this.LightLeft = new Light(constants.LIGHT_LEFT, false, constants.GPIO_LIGHT_LEFT);
    this.LightRight = new Light(constants.LIGHT_RIGHT, false, constants.GPIO_LIGHT_RIGHT);

    this.StatusString = "Name: " + name
        + ", Bus:" + this.I2CBusNumber
        + ", Address: " + this.I2CAddress
        + ", Sprocket Diameter: " + this.SprocketDiametermm
        + ", Messager Headers: [" + this.I2CMessageHeader1 + ":" + this.I2CMessageHeader2 + "], "
        + (this.IsCurrentlyMoving ? "Moving" : "Not Moving") + ", "
        + (this.IsCurrentlyTurning ? "Turning" : "Stopped")
        + ", SPEED = " + this.CurrentSpeed
        + ", Direction = " + this.CurrentDirection;

    this.ToggleLight = function ToggleLight(lightNumber, lightOn){
        var txt = "";
        if (lightNumber===constants.LIGHT_LEFT){
            this.LightLeft.ToggleState(lightOn);
            txt += "LEFT ";
        }
        else if (lightNumber===constants.LIGHT_RIGHT){
            txt += "RIGHT ";
            this.LightRight.ToggleState(lightOn);
        }
        txt += " light turned ";
        if (lightOn===true)
            txt += "on";
        else
            txt += "off";
        console.info(txt);
    };

    this.TurnLightsOn = function TurnLightsOn(){
        this.ToggleLight(constants.LIGHT_LEFT, true);
        this.ToggleLight(constants.LIGHT_RIGHT, true);
    };

    this.TurnLightsOff = function TurnLightsOff(){
        this.ToggleLight(constants.LIGHT_LEFT, false);
        this.ToggleLight(constants.LIGHT_RIGHT, false);
    };

    this._togglePanServo = function _togglePanServo(state){
        panServo.enable(state);
    };

    this._toggleTiltServo = function _toggleTiltServo(state){
        tiltServo.enable(state);
    };

    this.mapValue = function _mapValue(val, origMin, origMax, destMin, destMax){
        //http://stackoverflow.com/questions/345187/math-mapping-numbers
        var ratio = (destMax - destMin) / (origMax - origMin);
        return ratio * (val - origMin) + destMin;
    };

    this.Pan = function Pan(angle){
        var pwPan = this.mapValue(angle, 0, 180, this.pwPan0, this.pwPan180);
        this._togglePanServo(false);
        panServo.pulsewidth_us(pwPan);
        this._togglePanServo(true);
        //Pass a function to setTimeout; the original listing invoked the toggle immediately
        setTimeout(function(){ self._togglePanServo(false); }, this.panDelayAfterPosition);
    };

    this.Tilt = function Tilt(angle){
        var pwTilt = this.mapValue(angle, 0, 180, this.pwTilt0, this.pwTilt180);
        this._toggleTiltServo(false);
        tiltServo.pulsewidth_us(pwTilt);
        this._toggleTiltServo(true);
        //The original listing toggled the pan servo here; the tilt servo is intended
        setTimeout(function(){ self._toggleTiltServo(false); }, this.tiltDelayAfterPosition);
    };

    this.Look = function Look(anglePan, angleTilt){
        this.Pan(anglePan);
        this.Tilt(angleTilt);
    };

    this.PanCenter = function PanCenter(){
        this._togglePanServo(false);
        panServo.pulsewidth_us(this.pwPan90);
        this._togglePanServo(true);
        setTimeout(function(){ self._togglePanServo(false); }, this.panDelayAfterPosition);
    };

    this.TiltCenter = function TiltCenter(){
        this._toggleTiltServo(false);
        tiltServo.pulsewidth_us(this.pwTilt90);
        this._toggleTiltServo(true);
        setTimeout(function(){ self._toggleTiltServo(false); }, this.tiltDelayAfterPosition);
    };

    this.LookCenter = function LookCenter(){
        this.PanCenter();
        this.TiltCenter();
    };
}

Robot.prototype.toString = function toString() {
    return this.StatusString;
};
CourseLeg.js**
/* CourseLeg.js */
// Matt Chandler
// Intel, Corp.
// March, 2017

/* jslint node:true */
/* jshint unused:false*/

function CourseLeg(leg, distFwcm, directionPost, blockAtEnd, bot, legSpeed) {
    this.LegNumber = leg;
    this.DistanceForwardcm = distFwcm;
    this.DirectionToTurn = directionPost;
    this.BlockAtEndOfStep = blockAtEnd;
    this.CurrentBotRevolutions = 0;  //Number of revolutions counted for this leg
    this.RevolutionsRequired = 0.00; //Calculated value
    this.EndOfLegReached = false;
    this.Speed = legSpeed;

    //Distance travelled by one revolution of sprocket
    // = Circumference = 3.14 * Diameter = (3.14*45)mm = ~141mm = ~14.1cm
    this.RevolutionsRequired = distFwcm/(3.14 * (bot.SprocketDiametermm/10));
}

exports.CourseLeg = CourseLeg;
Light.js**
/* Light.js */
// Matt Chandler
// Intel, Corp.
// March, 2017

/* jslint node:true */
/* jshint unused:false*/

var constants = require("./constants.js");

try {
    require.resolve("mraa");
}
catch (e) {
    console.error("Critical: mraa node module is missing, try 'npm install -g mraa' to fix.", e);
    process.exit(-1);
}
var mraa = require("mraa");

//Light Object
module.exports = Light;

function Light(locn, defaultState, pinNumber) {
    this.LightLocation = locn;
    this.LightState = defaultState;
    this.gpioPin = new mraa.Gpio(pinNumber);
    this.gpioPin.dir(mraa.DIR_OUT); //set the gpio direction to output

    this.ToggleState = function ToggleState(lightOn){
        this.gpioPin.write( (lightOn===true) ? 1 : 0 ); //set the digital pin high (1) or low (0)
        this.LightState = lightOn;
        //Parentheses added: the original logged "ON" unconditionally due to operator precedence
        console.info("Toggled State to: " + ((lightOn===true) ? "ON" : "OFF"));
    };
}
Odometer.js**
/* Odometer.js */
// Matt Chandler
// Intel, Corp.
// March, 2017

/* jslint node:true */
/* jshint unused:false*/

var constants = require("./constants.js");

try {
    require.resolve("mraa");
}
catch (e) {
    console.error("Critical: mraa node module is missing, try 'npm install -g mraa' to fix.", e);
    process.exit(-1);
}
var mraa = require("mraa");
var EventEmitter = require('events');

module.exports = Odometer;

function Odometer(pinNumber) {
    var self = this;
    this.Listening = false;
    this.SensorEvent = new EventEmitter();

    this.gpioPin = new mraa.Gpio(pinNumber);
    this.gpioPin.dir(mraa.DIR_IN);
    this.gpioPin.isr(mraa.EDGE_BOTH, function _onPIRAlert() {
        if (self.Listening)
            self.SensorEvent.emit('WheelRevolutionEvent');
    });

    this.StartListening = function StartListening() {
        this.Listening = true;
    };
    this.StopListening = function StopListening() {
        this.Listening = false;
    };
    this.IsContacted = function IsContacted(){
        return (this.gpioPin.read()>0) ? true : false;
    };
    this.GetPinValue = function GetPinValue(){
        return this.gpioPin.read();
    };
}
Navigation.js**
/*Navigation.js*/
// Matt Chandler
// Intel, Corp.
// March, 2017

/* jslint node:true */
/* jshint unused:false*/

var constants = require("./constants.js");

try {
    require.resolve("mraa");
}
catch (e) {
    console.error("Critical: mraa node module is missing, try 'npm install -g mraa' to fix.", e);
    process.exit(-1);
}
var mraa = require("mraa");

var courseLeg = require("./CourseLeg.js");
var Robot = require("./Robot.js");
var Odometer = require("./Odometer.js");
var Camera = require("./Camera.js");

var CurrentCourse = []; //Array of CourseLegs
var lastClickTime = Date.now();

//mraa I2C vars used for motor control
var i2C_MSGBFR = new Buffer(5);
i2C_MSGBFR[0] = 0x55;
i2C_MSGBFR[1] = 0xaa;

var _navigationRunning = false;
var _navigationStopped = false;

if (false) //constants.DEBUG
    console.log(mraa); // prints mraa object to XDK IoT debug output panel

module.exports = Navigation;

function Navigation(bot) {
    var self = this;
    var timerPolling;
    var thisBot = bot;
    var maxLegTimeMS = 5000;
    var timeNavigationStart = Date.now();

    this.CurrentLeg = 0;
    this.odometer = new Odometer(constants.GPIO_ODO_CONTACT_MONITOR);
    this.odometer.StartListening();
    this.camera = new Camera("800x600", true);

    //Navigation Status Tracking
    var turnCompleted = false;
    var turnStarted = false;

    this.SetCourse = function SetCourse(courselegs) {
        CurrentCourse = courselegs;
        console.info("Course set.");
    };

    this.AddLeg = function AddLeg(courseleg) {
        console.info("Leg added: Leg:( " + courseleg.Leg
            + ", Distcm:(" + courseleg.DistanceForwardcm
            + "), TurnAtEnd:(" + courseleg.DirectionToTurn + ")");
        CurrentCourse.push(courseleg);
    };

    this.ShowCourse = function ShowCourse() {
        console.info("[COURSE]: " + CurrentCourse.length + " legs.");
        //iterate the current course
        for (var idx=0; idx<CurrentCourse.length; idx++) {
            var txt = "[COURSE LEG]: (" + idx + "),"
                + "Distcm:(" + CurrentCourse[idx].DistanceForwardcm + "),"
                + "Revolutions Required: (" + CurrentCourse[idx].RevolutionsRequired + "),"
                + "TurnAtEnd:(" + CurrentCourse[idx].DirectionToTurn + "),"
                + "BlockAtEnd:(" + CurrentCourse[idx].BlockAtEndOfStep + ")";
            console.info(txt);
        }
    };

    //-------------------------------------------
    // MOTOR CONTROL
    this.StartTurning = function StartTurning(angle, turnSpeed) {
        // Turning is always relative to the bot's direction, with standard cartesian coord layout:
        //   0 is right 90'
        //   90 is dead ahead
        //   180 is left 90'
        //   270 is turnaround/backwards
        /* Turning is always performed at a configurable rate, the same for each motor.
           By testing the amount of time it takes to complete one 90' turn, a ratio can be
           configured which can be applied to any angle:
              90 = 1 second
              45 = .5
              A  = A/90 * (Time for 90)
        */
        this.camera.TakePicture("StartTurn");
        var timeToCompleteTurn = (angle/90) * constants.TURNTIME_90_MS;
        console.info("[NAV]: TURNING " + angle + "' . Should be done in about "
            + timeToCompleteTurn/1000 + " seconds");
        turnCompleted = false;
        turnStarted = true;
        if (angle<90){
            self.RobotRight(turnSpeed);
        }
        else if (angle<260){
            self.RobotLeft(turnSpeed);
        }
        else if (angle<360){
            self.RobotRight(turnSpeed);
        }
        //Set timer callback when turn has completed
        //(fixed 4000 ms, as in the original listing; timeToCompleteTurn could be used instead)
        setTimeout(self._onTurnTimeoutCompleted, 4000);
    };

    this.Sleep = function Sleep(ms){
        //Busy-wait for ms milliseconds (the original listing compared against a fixed 2000)
        var done = false;
        var startTime = Date.now();
        while(!done){
            if ( (Date.now() - startTime) > ms)
                done = true;
        }
    };

    this._onTurnTimeoutCompleted = function _onTurnTimeoutCompleted(){
        self.StopTurning();
    };

    this.StopTurning = function StopTurning(){
        console.info("[NAV]" + self.CurrentLeg + ": TURN COMPLETED");
        turnCompleted = true;
        thisBot.IsCurrentlyTurning = false;
        self.RobotStop();
    };

    this.RobotGo = function RobotGo(direction, speed){
        var rightMotorDirection;
        var leftMotorDirection;
        timeNavigationStart = Date.now();

        if (speed > constants.MAX_SPEED) speed = constants.MAX_SPEED;
        if (speed > 0xFF) speed = 0xFF;

        thisBot.IsCurrentlyMoving = true;
        thisBot.CurrentDirection = direction;
        thisBot.CurrentSpeed = speed;

        switch(direction) {
            case constants.FORWARD:
                rightMotorDirection = constants.MOTOR_MSG_COUNTERCLOCKWISE;
                leftMotorDirection = constants.MOTOR_MSG_COUNTERCLOCKWISE;
                break;
            case constants.BACKWARD:
                rightMotorDirection = constants.MOTOR_MSG_CLOCKWISE;
                leftMotorDirection = constants.MOTOR_MSG_CLOCKWISE;
                break;
            case constants.LEFT:
                rightMotorDirection = constants.MOTOR_MSG_CLOCKWISE;
                leftMotorDirection = constants.MOTOR_MSG_COUNTERCLOCKWISE;
                thisBot.IsCurrentlyTurning = true;
                break;
            case constants.RIGHT:
                rightMotorDirection = constants.MOTOR_MSG_COUNTERCLOCKWISE;
                leftMotorDirection = constants.MOTOR_MSG_CLOCKWISE;
                thisBot.IsCurrentlyTurning = true;
                break;
        }

        //Set motor directions; byte 4 is an additive checksum of the frame
        i2C_MSGBFR[2] = constants.I2CMotorDirectionLeft;
        i2C_MSGBFR[3] = leftMotorDirection;
        i2C_MSGBFR[4] = (i2C_MSGBFR[0] + i2C_MSGBFR[1] + i2C_MSGBFR[2] + i2C_MSGBFR[3]) & 0xFF;
        this.I2CBus.write(i2C_MSGBFR);

        i2C_MSGBFR[2] = constants.I2CMotorDirectionRight;
        i2C_MSGBFR[3] = rightMotorDirection;
        i2C_MSGBFR[4] = (i2C_MSGBFR[0] + i2C_MSGBFR[1] + i2C_MSGBFR[2] + i2C_MSGBFR[3]) & 0xFF;
        this.I2CBus.write(i2C_MSGBFR);

        //Set Motor Speeds
        i2C_MSGBFR[2] = constants.I2CMotorSpeedLeft;
        i2C_MSGBFR[3] = speed;
        i2C_MSGBFR[4] = (i2C_MSGBFR[0] + i2C_MSGBFR[1] + i2C_MSGBFR[2] + i2C_MSGBFR[3]) & 0xFF;
        this.I2CBus.write(i2C_MSGBFR);

        //Right Motor Speed
        i2C_MSGBFR[2] = constants.I2CMotorSpeedRight;
        i2C_MSGBFR[3] = speed;
        i2C_MSGBFR[4] = (i2C_MSGBFR[0] + i2C_MSGBFR[1] + i2C_MSGBFR[2] + i2C_MSGBFR[3]) & 0xFF;
        this.I2CBus.write(i2C_MSGBFR);
    };

    this.RobotStop = function RobotStop() {
        this.RobotGo(constants.FORWARD, 0x00);
        thisBot.IsCurrentlyMoving = false;
    };

    this.RobotForward = function RobotForward(speed) {
        console.info("Going Forward");
        this.RobotGo(constants.FORWARD, speed);
    };

    this.RobotBackward = function RobotBackward(speed) {
        this.RobotGo(constants.BACKWARD, speed);
    };

    this.RobotLeft = function RobotLeft() {
        this.RobotGo(constants.LEFT, constants.MOTOR_SPEED_TURN);
    };

    this.RobotRight = function RobotRight() {
        this.RobotGo(constants.RIGHT, constants.MOTOR_SPEED_TURN);
    };

    this.StartCourse = function StartCourse() {
        console.info("[NAV]: STARTING COURSE with " + CurrentCourse.length + " legs in course.");
        _navigationStopped = false;
        _navigationRunning = true;
        thisBot.TurnLightsOn();
        this._StartTimerNavigationPoll();
        this.RobotForward(constants.MOTOR_SPEED_CRAWL); //Always start in a forward motion
    };

    this.StopCourse = function StopCourse() {
        console.info("[NAV]: STOPPING COURSE. ");
        console.info("[" + thisBot.Name + "]: STOPPING.");
        this.RobotStop();
        this._StopTimerNavigationPoll();
        CurrentCourse = []; //reset to an empty array (the original listing assigned {})
        _navigationRunning = false;
        thisBot.TurnLightsOff();
    };

    this.IsReadyToTurn = function IsReadyToTurn() {
        if (constants.DEBUG)
            console.info("[DEBUG]: Ready to TURN?? " + CurrentCourse[self.CurrentLeg].EndOfLegReached);
        return CurrentCourse[self.CurrentLeg].EndOfLegReached;
    };

    // ---------------------
    // CALLBACKS/EventHandlers
    // ---------------------
    this._onNavigationCheck = function _onNavigationCheck() {
        if (_navigationRunning) {
            if ( (Date.now() - timeNavigationStart) > constants.LEG_MAXTIME_MS) {
                console.warn("EMERGENCY STOP - LEG HAS REACHED MAX CONFIGURED TIME.");
                _navigationRunning = false;
            }
            //NOT Turning, but Ready to Turn
            if (thisBot.IsCurrentlyTurning===false && turnStarted===false) {
                if (self.IsReadyToTurn()===true){
                    self.RobotStop();
                    self.StartTurning(CurrentCourse[self.CurrentLeg].DirectionToTurn,
                                      constants.MOTOR_SPEED_TURN);
                }
            }
            else if (thisBot.IsCurrentlyTurning && turnCompleted===false){
                // TURNING but not complete
                console.info("[NAV] " + self.CurrentLeg + " : TURNING TURNING ...");
            }
            else if (turnCompleted===true){
                //>= guards against stepping past the final leg (the original compared with >)
                if (self.CurrentLeg + 1 >= CurrentCourse.length) {
                    if (constants.DEBUG) console.info("----- END OF COURSE ------");
                    self.StopCourse();
                }
                else {
                    self.CurrentLeg++;
                    if (constants.DEBUG) console.info("[NAV]: STARTING NEXT LEG : " + self.CurrentLeg);
                    CurrentCourse[self.CurrentLeg].CurrentBotRevolutions = 0;
                    self.RobotForward(constants.MOTOR_SPEED_CRAWL);
                }
            }
            else {
                if (constants.DEBUG) {
                    var currentDirection = "UNKNOWN";
                    switch(thisBot.CurrentDirection) {
                        case constants.FORWARD:  currentDirection = "FORWARD";  break;
                        case constants.BACKWARD: currentDirection = "BACKWARD"; break;
                        case constants.LEFT:     currentDirection = "LEFT";     break;
                        case constants.RIGHT:    currentDirection = "RIGHT";    break;
                        default: break;
                    }
                    var revsToGo = CurrentCourse[self.CurrentLeg].RevolutionsRequired
                                 - CurrentCourse[self.CurrentLeg].CurrentBotRevolutions;
                    console.info("[LEG]:" + self.CurrentLeg + ". " + currentDirection + ", "
                        + revsToGo + " revolutions to go...");
                }
            }
        }
        else if (!_navigationStopped) {
            console.warn("Navigation was not stopped by CourseEnd action. Stopping Course and Timer Navigation Poll.");
            self.StopCourse();
            self._StopTimerNavigationPoll();
        }
        else {
            if (constants.DEBUG)
                console.info("[NAV]: " + thisBot.Name + " Not Running, and Stopped..");
        }
    };

    this._OnEndOfLegEvent = function _OnEndOfLegEvent(){
        if (constants.DEBUG)
            console.info("[NAV]: ----- END OF LEG " + self.CurrentLeg + " REACHED ---");
        CurrentCourse[self.CurrentLeg].EndOfLegReached = true;
    };

    this.odometer.SensorEvent.on('WheelRevolutionEvent', function _OnRobotWheelRevolution(){
        if (self.CurrentLeg < CurrentCourse.length) {
            if ((Date.now() - lastClickTime) > constants.ODO_CYCLE_TIME_MIN) {
                lastClickTime = Date.now();
                var current = ++CurrentCourse[self.CurrentLeg].CurrentBotRevolutions;
                var required = CurrentCourse[self.CurrentLeg].RevolutionsRequired;
                console.info("CLICK[" + current + "] of [" + required + "]");
                //Trigger the end of Leg Event
                if (current >= required) {
                    if (current > required) {
                        console.info("Received a click event after required has been met.");
                    }
                    self.RobotStop();
                    self._OnEndOfLegEvent();
                }
            }
        }
        else {
            if (constants.DEBUG) console.info("---- COURSE ENDED ----");
            self.StopCourse();
        }
    });

    // -----------
    // TIMER STARTER/STOPPERS
    // -----------
    this._StartTimerNavigationPoll = function _StartTimerNavigationPoll() {
        timerPolling = setInterval(this._onNavigationCheck, constants.POLLINGNAV_MS);
    };

    this._StopTimerNavigationPoll = function _StopTimerNavigationPoll() {
        console.info("[NAV]: STOPPING polling timer");
        clearInterval(timerPolling);
        _navigationStopped = true;
    };

    this.StartWarmupSequence = function StartWarmupSequence() {
        var warmupCourse = [];
        warmupCourse.push(new courseLeg.CourseLeg(0, 100, constants.RIGHT, false, thisBot, constants.MOTOR_SPEED_CRAWL));
        warmupCourse.push(new courseLeg.CourseLeg(1, 100, constants.RIGHT, false, thisBot, constants.MOTOR_SPEED_CRAWL));
        warmupCourse.push(new courseLeg.CourseLeg(2, 100, constants.RIGHT, false, thisBot, constants.MOTOR_SPEED_CRAWL));
        warmupCourse.push(new courseLeg.CourseLeg(3, 100, constants.RIGHT, false, thisBot, constants.MOTOR_SPEED_CRAWL));
        console.info("Starting warmup sequence...");
        this.camera.TakePicture("Start");
        this.RobotStop();
        this.SetCourse(warmupCourse);
        this.ShowCourse();
        this.StartCourse();
    };

    _navigationRunning = false;
    this.I2CBus = new mraa.I2c(thisBot.I2CBusNumber);
    this.I2CBus.address(thisBot.I2CAddress);
    console.info("Navigation object created. No course defined yet.");
}
ObjectDetector.js**
/* ObjectDetector.js */
// Matt Chandler
// Intel, Corp.
// March, 2017

/* jslint node:true */
/* jshint unused:false*/

var constants = require("./constants.js");

try {
    require.resolve("mraa");
}
catch (e) {
    console.error("Critical: mraa node module is missing, try 'npm install -g mraa' to fix.", e);
    process.exit(-1);
}
var mraa = require("mraa");
var EventEmitter = require('events');

module.exports = ObjectDetector;

function ObjectDetector(pinNumber, startListening) {
    var self = this;
    this.ObjectDetected = false;
    this.Listening = startListening;
    this.SensorEvent = new EventEmitter();

    this.gpioPin = new mraa.Gpio(pinNumber);
    this.gpioPin.dir(mraa.DIR_IN);
    this.gpioPin.isr(mraa.EDGE_RISING, function _onPIRAlert() {
        if (self.Listening){ //self, not this: the ISR callback runs with its own context
            self.SensorEvent.emit('ObjectDetected');
            self.ObjectDetected = true;
        }
    });

    this.ResetTrigger = function ResetTrigger() {
        this.ObjectDetected = false;
    };
    this.StartListening = function StartListening() {
        this.Listening = true;
    };
    this.StopListening = function StopListening() {
        this.Listening = false;
    };
    this.CheckForObject = function CheckForObject(){
        var pirValue = this.gpioPin.read();
        this.ObjectDetected = (pirValue>0) ? true : false;
        console.info("checking PIR:" + pirValue);
    };
}
DistanceDetector.js**
/* DistanceDetector.js */
// Matt Chandler
// Intel, Corp.
// March, 2017

/* jslint node:true */
/* jshint unused:false*/

var constants = require("./constants.js");

try {
    require.resolve("mraa");
}
catch (e) {
    console.error("Critical: mraa node module is missing, try 'npm install -g mraa' to fix.", e);
    process.exit(-1);
}
var mraa = require("mraa");

module.exports = DistanceDetector;

function DistanceDetector(uartNum, baudRate){
    this.uart = new mraa.Uart(uartNum); //Default
    this.uart.setBaudRate(baudRate);
    this.uart.setMode(8, 0, 1);
    this.uart.setFlowcontrol(false, false);
    sleep(200);

    var command = new Buffer(4);
    command[0] = 0x22; //trigger a distance measurement
    command[1] = 0x00;
    command[2] = 0x00;
    command[3] = 0x22; //additive checksum of the preceding bytes

    this.GetDistance = function GetDistance(){
        var rxBuf;
        var distanceCM;
        this.uart.write(command);
        sleep(200);
        rxBuf = this.uart.read(4);
        sleep(200);
        if (rxBuf[3] == (rxBuf[0]+rxBuf[1]+rxBuf[2])) {
            distanceCM = (rxBuf[1]<<8) | rxBuf[2];
            return distanceCM;
        }
    };

    this.IsObjectClose = function IsObjectClose(thresholdCM) {
        var distanceCM = this.GetDistance(); //GetDistance (the original listing called getDistance)
        return (distanceCM < thresholdCM) ? true : false;
    };
}

function sleep(ms){
    //Busy-wait for ms milliseconds (the original listing compared against a fixed 2000)
    var done = false;
    var startTime = Date.now();
    while(!done){
        if ( (Date.now() - startTime) > ms)
            done = true;
    }
}
Camera.js**
/* Camera.js */
// Matt Chandler
// Intel, Corp.
// March, 2017

/* jslint node:true */
/* jshint unused:false*/

var constants = require("./constants.js");

try {
    require.resolve("mraa");
}
catch (e) {
    console.error("Critical: mraa node module is missing, try 'npm install -g mraa' to fix.", e);
    process.exit(-1);
}
var mraa = require("mraa");
var cpLib = require('child_process');

module.exports = Camera;

function Camera(resolution, greyscale){
    this.CameraOn = false;
    this.Resolution = resolution;
    this.GreyScale = greyscale;
    var cp = null;

    this.TakePicture = function TakePicture(imagePrefix){
        this.CameraOn = true;
        var currentDate = new Date();
        var m = currentDate.getMonth();
        m++;
        var month = (m<10) ? "0" + m : m;
        m = currentDate.getHours();
        var hours = (m<10) ? "0" + m : m;

        var cmd = "fswebcam -r 800x600 --jpeg 100 -S 13 ";
        if (this.GreyScale===true)
            cmd += "--greyscale ";

        var fileName = "/home/root/" + imagePrefix + "_"
            + currentDate.getFullYear() + month + currentDate.getDate()
            + hours + currentDate.getMinutes() + currentDate.getSeconds() + ".jpg";
        cmd += fileName;

        //After capture, push the image to the remote server with sshpass/scp
        cmd += " --exec "
            + "\"sshpass -p '" + constants.FTP_REMOTE_PASSWORD + "' scp "
            + fileName + " " //space added; the original listing concatenated filename and user
            + constants.FTP_REMOTE_USER + "@" + constants.FTP_REMOTE_IP
            + ":" + constants.FTP_REMOTE_BASE_FOLDER + "\"";

        if (constants.DEBUG) console.info(cmd);

        cp = cpLib.exec(cmd);
        cp.on('close', function (code) {
            console.log('child process exited with code ' + code);
        });
    };
}
Summary
In this article we investigated some of the possibilities for using the Intel® Edison platform for autonomous navigation and remote surveillance. We outlined example Node.js code that provides a starting framework for an autonomous application using the Romeo* Intel® Edison based robotics platform. Given a course map, we showed how to navigate to a given location and upload an image to a remote server. At the edge, we showed how image analysis can easily be scripted for object detection.
About the Author
Matt Chandler is a software engineer at Intel working on scale-enabling projects for the Internet of Things.
References
DFRobot* Product Description
https://www.dfrobot.com/wiki/index.php/Romeo_for_Edison_Controller_SKU:_DFR0350
Notices
No license (express or implied, by estoppel or otherwise) to any intellectual property rights is granted by this document.
Intel disclaims all express and implied warranties, including without limitation, the implied warranties of merchantability, fitness for a particular purpose, and non-infringement, as well as any warranty arising from course of performance, course of dealing, or usage in trade.
This document contains information on products, services and/or processes in development. All information provided here is subject to change without notice. Contact your Intel representative to obtain the latest forecast, schedule, specifications and roadmaps.
The products and services described may contain defects or errors known as errata which may cause deviations from published specifications. Current characterized errata are available on request.
Copies of documents which have an order number and are referenced in this document may be obtained by calling 1-800-548-4725 or by visiting www.intel.com/design/literature.htm.
Intel, the Intel logo, and Intel RealSense are trademarks of Intel Corporation in the U.S. and/or other countries.
*Other names and brands may be claimed as the property of others.
**This sample source code is released under the Intel Sample Source Code License Agreement.
© 2017 Intel Corporation.