
Running GRIP on a Raspberry Pi 2


DISCLAIMER

This guide is community-led and not officially supported by the @WPIRoboticsProjects/GRIP core team at the moment.
If you have problems, please direct them to issue #366 or chat with us in the Gitter Chat. Mention @EdWard680 to get his attention and he may be able to help you out.
Please chat with one of us before opening a new issue about running on a Pi.

Motivation

Running GRIP on a coprocessor on board the robot is a great way to fully utilize GRIP. Running intensive vision processing algorithms on the RoboRIO alongside the robot code risks starving both programs of CPU resources, adding latency to both. Running GRIP on the driver station avoids that problem and is simple to set up, but it requires streaming the video you want to process over the radio, which introduces latency and bandwidth concerns. Running on a coprocessor gives you the best of both worlds, with the added bonus that the robot can be driven from any computer with a driver station installed, rather than one that also has your GRIP install and configuration.

The two most popular choices are likely the Jetson TK1 and the Raspberry Pi. One thing to consider with the Jetson is that JavaCV, the image processing library used by GRIP, doesn't use the GPU for processing, so the Jetson's powerful GPU won't be taken advantage of.

Downloads

The current GRIP release (1.1.1) does not run on arm-hardfp (hard floating point; the RoboRIO uses soft floating point). However, users multiplemonomials and ThadHouse have built GRIP and its dependencies for hardfp ARM devices.

GRIP Core

Download core-1.1.1-all.jar

  • On Windows, move it to C:\Users\<Your Username>\AppData\Local\GRIP\app\
  • On OS X, move it to the GRIP.app folder
  • On Linux, move it to /opt/GRIP/app

Make sure you replace the existing core-<version>-all.jar file and rename the downloaded jar to match. This is the file that GRIP deploys to your device.
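
On Linux, for example, the replacement might look like the following (this assumes the jar was downloaded to ~/Downloads and that the existing file is also named core-1.1.1-all.jar):

      # Overwrite the stock jar with the hardfp build
      sudo cp ~/Downloads/core-1.1.1-all.jar /opt/GRIP/app/core-1.1.1-all.jar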


External Dependencies

Download libntcore.so and libstdc++.so.6 to your downloads folder. You will have to manually upload these to your Raspberry Pi later.

Setting up the Pi

The following steps have been tested on a Raspberry Pi 2 Model B running Raspbian Jessie Lite. Results for other models and operating systems may vary.

Connecting to the Pi

To start, you must image your Raspberry Pi and determine its network address. The address used from here on will be <pi-address>. Depending on how the Pi was set up, this may be either its static IPv4 address (likely 10.<TEAM>.<NUMBER>.*) or its hostname (raspberrypi.local by default). The username used from here on will be the Raspberry Pi's default, pi, but you can substitute any user on the Pi. To be able to deploy to and use the Pi, you must be able to open an SSH session from your computer. (Tutorial)
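
For example, from a terminal on Linux or OS X (use PuTTY on Windows), opening a session looks like:

      # Log in as the default user; you will be prompted for the password (raspberry by default)
      ssh pi@<pi-address>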


Setting up your environment

  • Open up an ssh session with your Pi.
  • Enter $ mkdir vision and then $ mkdir vision/grip. These are the directories you will be deploying GRIP to and storing related vision configuration files in.
  • Ensure Java 8 is installed: $ sudo apt-get update && sudo apt-get install oracle-java8-jdk
  • Close the ssh session. (A sketch of the whole session follows this list.)
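
Taken together, the setup session might look like the following sketch (mkdir -p creates both directories in one command, and java -version is just a sanity check on the install):

      ssh pi@<pi-address>
      mkdir -p vision/grip
      sudo apt-get update && sudo apt-get install oracle-java8-jdk
      java -version    # should report a 1.8 runtime
      exit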


Uploading libntcore and libstdc++

The next step is to upload the shared libraries GRIP needs to the Pi.

  • Ensure you have scp (Linux/OS X) or pscp (Windows) installed on your computer.
  • Open a Terminal/Command Prompt.
  • Enter $ scp <path to downloads folder>/libntcore.so pi@<pi-address>:/home/pi/vision/grip/libntcore.so. If you're on Windows, enter the same command with pscp instead of scp.
  • Enter $ scp <path to downloads folder>/libstdc++.so.6 pi@<pi-address>:/home/pi/vision/grip/libstdc++.so.6. If you're on Windows, enter the same command with pscp instead of scp. (A quick check that the upload worked follows this list.)
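
To confirm that both libraries made it to the Pi, you can list the directory remotely (an optional sanity check):

      ssh pi@<pi-address> ls -l /home/pi/vision/grip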

Deploying GRIP

Now it's time to deploy your GRIP project.

  • From the GRIP UI, click deploy or open the project settings
  • Fill out your Raspberry Pi's information. It doesn't really matter what you put in for the JRE directory, because GRIP cannot be launched remotely from the UI anyway; it has to be started with a special command (shown below).
  • From the deploy window, click deploy. Once it has finished deploying, just hit stop, as GRIP will fail to run properly due to library issues.


Running GRIP

Now to run GRIP on the Pi:

  • Open an ssh session with the Pi.
  • Enter $ env LD_LIBRARY_PATH=/home/pi/vision/grip:$LD_LIBRARY_PATH java -jar /home/pi/vision/grip/grip.jar /home/pi/vision/grip/project.grip. This should run your GRIP pipeline.

Currently, USB Camera Sources will fail and cause GRIP to crash; see below for a useful workaround. IP Camera Sources work fine.

Using the Raspberry Pi camera or a USB camera

As mentioned, USB Camera Sources don't work with GRIP on the Pi at this time. One way to work around this is to host an mjpg stream of the USB camera on the Pi and then connect to localhost as an IP Camera source in GRIP (an example address is shown after the list below). This has a few advantages:

  • Being able to preview/test the GRIP pipeline in the GRIP UI with the exact same stream as it will grab when deployed
  • Being able to change camera sources and host them at the same address so GRIP's config doesn't have to be changed
  • Being able to view the stream in SmartDashboard or raw in a browser without going through GRIP
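
Assuming mjpg-streamer ends up hosting the stream on port 1180 (as configured in the commands below), the IP Camera source address used by GRIP running on the Pi would look something like:

      http://localhost:1180/?action=stream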

Installing mjpg-streamer

This fork of mjpg-streamer does everything we need. It can stream USB input or Raspberry Pi camera input (or both) on any port, with many different configuration options. The Pi will need to be connected to the internet in order to install these programs.

  • Install git: $ sudo apt-get update && sudo apt-get install git
  • Install cmake: $ sudo apt-get update && sudo apt-get install cmake
  • Install libjpeg8-dev: $ sudo apt-get update && sudo apt-get install libjpeg8-dev
  • Navigate to your vision directory: $ cd vision
  • Clone mjpg-streamer: $ git clone https://github.com/jacksonliam/mjpg-streamer.git
  • $ cd mjpg-streamer/mjpg-streamer-experimental
  • Build mjpg-streamer: $ make clean all
  • Install mjpg-streamer: $ sudo make install (a quick check that it installed follows this list)
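
If the install succeeded, the mjpg_streamer binary should now be on your PATH. A quick way to check (this assumes the Makefile's default /usr/local install prefix):

      # Expected to print something like /usr/local/bin/mjpg_streamer
      which mjpg_streamer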


Using mjpg-streamer with a USB Camera

mjpg-streamer comes with a lot of examples on how to use it. To use it to stream a USB Camera,

  • Plug in the camera
  • Verify which device it is by running ls /dev/ before and after plugging it in; the camera is the device that appears, most likely /dev/video0 (or a higher number if it isn't the only camera device).
  • Run mjpg-streamer: mjpg_streamer -o "output_http.so -w ./www -p 1180" -i "input_uvc.so -d /dev/<cam-device> -f 15 -r 640x480 -y -n". This should start streaming your USB camera at http://<pi-address>:1180/?action=stream, which can be viewed by entering that URL into a browser. You should also be able to view the stream with the Simple Camera widget in SmartDashboard, or use it as an IP Camera source in GRIP. Input settings (the settings after -i) can be modified to stream at different frame rates and resolutions; a lower-bandwidth example follows this list. Output settings (the settings after -o) can be modified to stream over different ports (see section 4.10 of the game manual for more information on which ports to use).
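
For example, a lower-bandwidth variant of the same command (the 320x240 resolution and 10 fps values here are just an illustration) would be:

      mjpg_streamer -o "output_http.so -w ./www -p 1180" -i "input_uvc.so -d /dev/<cam-device> -f 10 -r 320x240 -y -n"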


Using mjpg-streamer with a Raspberry Pi Camera

  • Hook up the camera and make sure the camera interface is enabled (for example via $ sudo raspi-config).
  • Run mjpg-streamer: env LD_LIBRARY_PATH=/usr/local/lib:$LD_LIBRARY_PATH mjpg_streamer -o "output_http.so -w ./www -p 1180" -i "input_raspicam.so -fps 15 -r 640x480". Note that the options for the Raspberry Pi camera input are slightly different from those for the USB camera input. To see all the options for a particular input or output plugin, run (for example) mjpg_streamer -i "input_raspicam.so --help".

Writing Helper Scripts

With all of these programs being started by long, fiddly commands on the Pi, it is convenient to wrap them in small scripts that can be run (or invoked remotely) with a single command.

Starting GRIP

  • Open /home/pi/vision/start_grip.sh with your favorite text editor (which you will likely have to install). If you don't have a favorite, just run nano /home/pi/vision/start_grip.sh

  • Paste in the following

      #!/bin/bash
      # Start GRIP processing in the background
      env LD_LIBRARY_PATH=/home/pi/vision/grip:$LD_LIBRARY_PATH java -jar /home/pi/vision/grip/grip.jar /home/pi/vision/grip/project.grip &
    
  • Save and Exit the text editor

  • Make the script executable: chmod +x /home/pi/vision/start_grip.sh

  • The script can be run with /home/pi/vision/start_grip.sh, or remotely from a computer or the RoboRIO using ssh with ssh pi@<pi-address> "/home/pi/vision/start_grip.sh"

Starting mjpg-streamer

  • Open /home/pi/vision/start_mjpg_streamer.sh

  • Paste in the following

      #!/bin/bash
      # Start mjpg-streamer in the background (add -d /dev/<cam-device> to the input options if your camera is not /dev/video0)
      env LD_LIBRARY_PATH=/usr/local/lib:$LD_LIBRARY_PATH mjpg_streamer -o "output_http.so -w ./www -p 1180" -i "input_uvc.so -f 15 -r 640x480 -y -n" &
    

Edit this file, or make different versions of it, to configure mjpg-streamer exactly the way you would like.

  • Save and Exit the text editor
  • Make the script executable: chmod +x /home/pi/vision/start_mjpg_streamer.sh
  • The script can be run with /home/pi/vision/start_mjpg_streamer.sh, or remotely from a computer or the RoboRIO using ssh with ssh pi@<pi-address> "/home/pi/vision/start_mjpg_streamer.sh"

Starting both

  • Open /home/pi/vision/start_vision.sh

  • Paste in the following

      #!/bin/bash
      # Start the mjpg stream
      /home/pi/vision/start_mjpg_streamer.sh
      sleep 5

      # Start GRIP after the stream has had time to start up
      /home/pi/vision/start_grip.sh
    
  • Make the script executable: chmod +x /home/pi/vision/start_vision.sh

  • The script can be run with /home/pi/vision/start_vision.sh, or remotely from a computer or the RoboRIO using ssh with ssh pi@<pi-address> "/home/pi/vision/start_vision.sh"

Stopping

GRIP can be stopped with killall java, and mjpg-streamer can be stopped with killall mjpg_streamer. These commands can be placed in a script file like the ones above to make stopping vision processing remotely easier and more encapsulated; a sketch follows.
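
For example, a stop script along the same lines (the file name is just a suggestion) might look like:

      #!/bin/bash
      # /home/pi/vision/stop_vision.sh - stop GRIP and mjpg-streamer
      killall java
      killall mjpg_streamer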