Using the roboRIO with the BeagleBone

Supplies:

 

-A BeagleBone running Debian, with OpenCV, Python 2, and NumPy installed, and a USB cable.

-A computer with Google Chrome or Mozilla Firefox. If you plan on viewing what your BeagleBone can see, you will need VNC or X11. Also be sure to have the newest FRC control software installed and a Java workspace with WPILib.

-A Linux-supported USB webcam. I used a Logitech C270.

-A roboRIO with Java installed.

-A router (to plug your BeagleBone and roboRIO into).

-Power supply for the roboRIO.

 

Connecting the roboRIO to the BeagleBone

 

Connect both the roboRIO and BeagleBone to the router and plug the USB webcam into the BeagleBone before turning it on. In order to program and control the roboRIO, you have to be connected to the router too, so join its Wi-Fi network or tether in with an Ethernet cable.

 

Connecting to your BeagleBone

 

You can use Cloud9 or SSH with this tutorial. If you haven't installed the drivers for the BeagleBone, plug it in and it will come up as a USB device. Then open start.htm and install the drivers. Now you can open Cloud9 by opening Chrome or Firefox and navigating to 192.168.7.2. At the bottom of Cloud9, you will notice a terminal. Here is a video tutorial about how to get started with the BeagleBone Black.

 

I would recommend using SSH. I have a full tutorial for using SSH (with X) for all OSes here, but on UNIX-based systems like OS X and Linux, you can connect by opening a terminal and entering:

your-computer:~ ssh root@192.168.7.2

If you would like to be able to view openCV windows on your BeagleBone, SSH in with X:

your-computer:~ ssh -X root@192.168.7.2

If you need to connect to your BeagleBone over a network, run this from your computer:

your-computer:~ ssh root@beaglebone.local

UDP on the roboRIO (Java)

This UDP code works by sending a request to the BeagleBone (in this case, a number corresponding to an OpenCV algorithm to run) and then receiving the output from that OpenCV program. The BeagleBone will always have the hostname beaglebone.local, and the roboRIO will use roboRIO-####-FRC.local, where #### is your team number.

In the imports:

import java.io.*;

import java.net.*;

In the Robot class:

//Beaglebone Communications

DatagramSocket serverSocket;

byte[] receiveData;

byte[] sendData;

In robotInit():

//Setup the BeagleBone Communications

try {

    serverSocket = new DatagramSocket(3641); //Choose a port for your BeagleBone to send packets to!

} catch (SocketException e) {

    e.printStackTrace();

}

receiveData = new byte[256];

sendData = new byte[256];

How to send data:

//Send Data

InetAddress IPAddress = InetAddress.getByName("beaglebone.local");

int port = 3641;

String requestValue = "0";

sendData = requestValue.getBytes();

DatagramPacket sendPacket = new DatagramPacket(sendData, sendData.length, IPAddress, port);

try {

    serverSocket.send(sendPacket);

} catch (IOException e) {

    e.printStackTrace();

}

How to receive data:

//Receive Data

DatagramPacket receivePacket = new DatagramPacket(receiveData, receiveData.length);

try {

    serverSocket.receive(receivePacket);

} catch (IOException e) {

    e.printStackTrace();

}

String incoming = new String(receivePacket.getData());

String[] parts = incoming.split(" ");

String part1_raw = parts[0];

double part1_double = Double.parseDouble(part1_raw);

System.out.println("RECEIVED: " + part1_raw);

This code also splits the incoming packet on spaces, which is useful for sending several different variable values at a time (just add spaces!). I also find that UDP packets sometimes arrive with the text followed by extra 'empty' characters of data. Some programs, like PyCharm and IntelliJ, show these as blocks, while others, like the console on the Driver Station, won't show them and the code will just return errors. Here is an example of what I mean:

So later in the code, a check like incoming.equals("test") would never trigger, because incoming would actually be "test◽◽◽◽◽◽◽◽" instead of "test". This can be solved by adding a space to the end of your message and splitting the text from the 'empty' characters. (Note that Java strings should be compared with .equals(), not ==.)
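Another simple option is String.trim(), which strips both whitespace and those trailing 'empty' (NUL) characters from each end of a string. Here is a minimal standalone sketch; the byte buffer just simulates what a received packet looks like, and is not part of the robot code above:

```java
// Simulate a received UDP payload: the real text plus buffer padding
byte[] receiveData = new byte[256];          // same fixed-size buffer as above
byte[] message = "test ".getBytes();         // sender appended a space
System.arraycopy(message, 0, receiveData, 0, message.length);

String incoming = new String(receiveData);   // "test " plus 251 'empty' characters
String cleaned = incoming.trim();            // trim() drops the space and the padding

// Use .equals(), not ==, when comparing Java strings
System.out.println(cleaned.equals("test")); // prints "true"
```

This works because trim() removes every character with a code point of U+0020 (space) or below from both ends, and the 'empty' characters are NUL (U+0000).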

UDP on the BeagleBone (Python)

This code waits for a 'request' packet from the roboRIO, then runs the OpenCV function associated with that request and returns the output over UDP.

To make and edit files you can use the nano editor:

bone# nano VisionCore.py

And to exit the nano editor, press Ctrl+X, then press "y" to save the file.

VisionCore.py:

import numpy as np

import cv2

import sys

import time

import socket

 

import LineFollower  #This would be the file with the openCV function you would like to run

 

video_capture = cv2.VideoCapture(-1)

video_capture.set(3, 160)

video_capture.set(4, 120)

 

#OpenCV takes some time to turn on the camera when taking the first frame, so we do this before the UDP code starts so that your robot doesn't spend precious autonomous time starting the video feed

ret, frame = video_capture.read()

 

#Set this to 1 if you would like to see the video feed

showVideo = 0

 

UDP_IP = "0.0.0.0"

UDP_PORT = 3641

 

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

sock.bind((UDP_IP, UDP_PORT))

 

while(True):

    data, addr = sock.recvfrom(256)

    #data = data.split(" ") (use this if you expect to get multiple arguments at a time)

    print "Incoming: " + data

 

    sendData = "" #Default so sendData is always defined, even for unhandled requests

    if data == "0":

        ret, frame = video_capture.read()

        sendData = LineFollower.lineOffset(ret, frame)

    if data == "1":

        ret, frame = video_capture.read()

        #sendData = moreOpenCVCode.function(ret, frame)

    if data == "2":

        ret, frame = video_capture.read()

        #sendData = moreOpenCVCode.function(ret, frame)

 

    sock.sendto(str(sendData)+" ", ("roboRIO-3641-FRC.local", 3641)) #Use roboRIO-####-FRC.local with your own team number

 

    data = ""

 

    #Show Window

    if showVideo == 1:

        cv2.imshow("frame",frame)

    if cv2.waitKey(1) & 0xFF == ord('q'):

        break

And here is an example of an OpenCV function the program uses. If you would like to learn how the code for this file works, you can read a tutorial here.

LineFollower.py:

import numpy as np

import cv2

 

def lineOffset(ret, frame):

 

    # Crop the image

    crop_img = frame[60:120, 0:160]

 

    # Gaussian blur

    blur = cv2.GaussianBlur(crop_img,(5,5),0)

 

    #Convert to Hue, Saturation, and Value colorspace

    hsv = cv2.cvtColor(blur, cv2.COLOR_BGR2HSV)

 

    # define range of red color in HSV

    lower_red = np.array([0, 40, 0])

    upper_red = np.array([10, 255, 255])

 

    # Threshold the HSV image to get only red colors

    thresh = cv2.inRange(hsv, lower_red, upper_red)

 

    # Find the contours of the frame

    contours, hierarchy = cv2.findContours(thresh.copy(), cv2.RETR_LIST, cv2.CHAIN_APPROX_NONE)

 

    # Find the biggest contour (if detected)

    if len(contours) > 0:

        c = max(contours, key=cv2.contourArea)

        M = cv2.moments(c)

 

        if M['m00'] != 0:

            cx = int(M['m10']/M['m00'])

            cy = int(M['m01']/M['m00'])

        else:

            cx = 0

            cy = 0

 

        cv2.line(crop_img, (cx, 0), (cx, 720), (255, 0, 0), 1)

        cv2.line(crop_img, (0, cy), (1280, cy), (255, 0, 0), 1)

 

        cv2.drawContours(crop_img, contours, -1, (0, 255, 0), 1)

 

        return cx - 80 #Positive means the line is right of center, negative means left

    else:

        return 999

This line-follower code simply finds the contour of the line on the ground and tells you how far to the left or right it is. So, for example, if the roboRIO wanted to know how much to turn, it would send "0" over UDP and receive a number from -80 to +80 telling it how much to turn.
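On the roboRIO side, that offset could be turned into a steering command with a simple proportional scheme. This is only a hypothetical sketch, not part of the tutorial's code: kP is a made-up gain you would tune on your robot, and the handling of the 999 'no line' value is an assumption:

```java
// Hypothetical: convert a line offset (-80..+80) into a turn command.
// kP is an assumed proportional gain; 999 is the 'no line found'
// value that lineOffset() returns above.
double kP = 0.01;
double offset = 40.0;          // example value parsed from the UDP reply

double turn;
if (offset == 999) {
    turn = 0.0;                // no line detected: don't turn
} else {
    turn = kP * offset;        // positive = line right of center, turn right
}
// turn could then be passed to your drivetrain, e.g. arcadeDrive(speed, turn)
```

With kP = 0.01, an offset of 40 gives a turn of 0.4; tune kP so the robot corrects smoothly without oscillating.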

To run the code simply run:

bone# python VisionCore.py

And to stop the code, you can either press Ctrl+C in the terminal or, if the video window is showing, press "q".

Happy coding! If you have any questions, you can contact me here.

Copyright © 2017 Einsteinium Studios