Make your BeagleBone Track a Ball


Supplies:


-A BeagleBone running Debian and a USB cable.

-A computer with Google Chrome or Mozilla Firefox. If you plan on viewing what your BeagleBone can see, you will need VNC or X11. You can also plug your BeagleBone into a monitor.

-A Linux-supported USB webcam. I used a Logitech C270.

Connecting to your BeagleBone


You can use Cloud9 or SSH for this tutorial. If you haven't installed the drivers for the BeagleBone yet, plug in your BeagleBone, which will come up as a USB device, then open start.htm and install the drivers. Now you can open Cloud9 by navigating to 192.168.7.2 in Chrome or Firefox. At the bottom of Cloud9 you will notice a terminal. Here is a video tutorial on how to get started with the BeagleBone Black.


If you prefer SSH, you can use PuTTY to connect to the BeagleBone on Windows. On UNIX-based systems like OS X and Linux, you can connect by opening a terminal and entering:

your-computer:~ ssh root@192.168.7.2

Download the Code:

Download the code with git:

bone# git clone https://github.com/AlekMabry/RobotHead.git

Now enter the directory with the code:

bone# cd RobotHead/Blob\ Detection

Configure the Program:

To track an object, the program looks for all pixels whose colors fall within a range of HSV (hue, saturation, value) color values.

Different programs use different scales for HSV. For example, GIMP uses:

H: 0-360, S: 0-100, V: 0-100

while OpenCV uses:

H: 0-180, S: 0-255, V: 0-255

So keep OpenCV's scale in mind when deciding on a color range to track.
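
To make the scale difference concrete, here is a minimal C++ sketch of the range check (not the code from the repository): it converts a GIMP-style range to OpenCV's scale and uses cv::inRange() to mask the matching pixels. The green range and the webcam index are assumptions chosen just for illustration.

#include <opencv2/opencv.hpp>
#include <iostream>

int main() {
    // GIMP-style range for a green object: H 90-150 (out of 360), S and V 40-100 (out of 100).
    // Converted to OpenCV's scale: H is halved, S and V are rescaled to 0-255.
    cv::Scalar lower(90 / 2, 40 * 255 / 100, 40 * 255 / 100);
    cv::Scalar upper(150 / 2, 255, 255);

    cv::VideoCapture cap(0);                  // first USB webcam
    if (!cap.isOpened()) return 1;

    cv::Mat frame, hsv, mask;
    cap >> frame;                             // grab one frame
    cv::cvtColor(frame, hsv, cv::COLOR_BGR2HSV);
    cv::inRange(hsv, lower, upper, mask);     // mask is white wherever the color is in range

    std::cout << "Matching pixels: " << cv::countNonZero(mask) << std::endl;
    return 0;
}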


You can edit the color range in hsv.txt by running the command:

bone# nano hsv.txt

By changing the number in debug.txt, you can make the program give different outputs.

If the number is set to:

     0 - outputs the coordinates of the detected object.

     1 - outputs the coordinates and video.

     2 - outputs the coordinates, video, and the current HSV value settings.

You can edit the contents of debug.txt with:

bone# nano debug.txt
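
For reference, reading this kind of flag takes only a few lines of C++. This is a sketch of the general idea, not the actual parsing code in tracker.cpp, which may handle the file differently.

#include <fstream>
#include <iostream>

int main() {
    std::ifstream file("debug.txt");
    int level = 0;
    if (!(file >> level)) level = 0;   // fall back to coordinates-only output

    // 0 = coordinates, 1 = coordinates + video, 2 = coordinates + video + HSV settings
    std::cout << "Debug level: " << level << std::endl;
    return 0;
}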

Compile and Execute the Tracking Code:

The tracking code is a C++ file. In order to run it, you need to first compile it:

bone# g++ `pkg-config --static --libs opencv` -O3 -o tracker tracker.cpp

Now it is time to execute it! If you are using SSH or Cloud9, make sure you have modified debug.txt to output only coordinates. If you are using a monitor, VNC, or X11 over SSH, then you can use video output.

To execute the program, enter:

bone# ./tracker

You may get some errors saying:

VIDIOC_QUERYMENU: Invalid argument

This error message is normal. Just wait a second and the program will start up.

Using the Servo Control Program:

If you like, you can make your robot's head follow the ball by piping the output of the tracker program into ServoControl.js.

Start by connecting your servos as shown below:

ServoControl.js works by using the coordinates from the tracker. It determines how far off-center the object is, then rotates the camera until the object is centered on screen. To push the coordinates output by the tracker program into ServoControl.js, we are going to use a pipe. A pipe is a one-way flow of data from one program to another. To form this pipe and execute both programs, type:

bone# ./tracker | node ServoControl.js
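
ServoControl.js itself is JavaScript, but the idea behind the pipe can be sketched in C++: a small program that reads coordinate pairs from standard input and works out how far the object sits from the center of the frame. The "x y" line format and the 640x480 frame size are assumptions made for illustration, not details taken from the repository.

#include <iostream>

int main() {
    const int centerX = 320, centerY = 240;   // assumes a 640x480 frame
    int x, y;
    // Each line piped in from the tracker is assumed to look like "x y".
    while (std::cin >> x >> y) {
        int errorX = x - centerX;             // positive: object is right of center
        int errorY = y - centerY;             // positive: object is below center
        std::cout << "offset: " << errorX << ", " << errorY << std::endl;
        // A servo controller would nudge the pan/tilt servos in proportion
        // to these errors until both are close to zero.
    }
    return 0;
}

If you compiled this sketch as, say, offset, you could test the same pipe idea with ./tracker | ./offset.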

You may have noticed ServoControl_Eyebrow.js in the GitHub repository you downloaded. This program adds one extra servo, which is used as an eyebrow. The eyebrow goes up when the object is detected. To use this code, attach another servo to pin P8_45.

Copyright © 2017 Einsteinium Studios