Sunday, 27 January 2013

PI Vision 1.0

My ultimate goal here is to make a robot, and the main reason I wanted a Raspberry Pi (or several) is that they're powerful and flexible enough to read and process a web cam feed. This first short post covers my experience getting the Pi running a web cam.

I tried various tutorials recommending various programs that detect motion or stream video over IP. However, you often end up limited to a very low frame rate. I wanted to strip out any excess and start with just one objective: plug the Pi into the TV, a web cam into the Pi, and get the video feed rendering on screen. After some research, it turns out ffmpeg is a good way to go for this.

What you'll need

For this test I used:

  • A Raspberry Pi
  • A powered USB hub (I used a Trust 10-port model)
  • A Sony PlayStation Eye camera
  • A TV to plug the Pi into

Installing FFMpeg

First step: boot up your Raspberry Pi, make sure it's hooked up to the internet, and get ffmpeg installed. Parts of the internet seem to think there are issues with the available version and suggest compiling it yourself, but installing it as normal has worked fine for me so far:

pi@raspberrypi ~ $ sudo apt-get install ffmpeg
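If you want to sanity-check the install before going further, this quick check (my own addition, not from any official guide) confirms the binaries landed on the path:

```shell
# Confirm ffmpeg and ffplay actually installed; the exact version
# string printed will vary by Raspbian release.
if command -v ffmpeg >/dev/null 2>&1 && command -v ffplay >/dev/null 2>&1; then
    ffmpeg -version | head -n 1
else
    echo "ffmpeg/ffplay not found - check the apt-get output for errors"
fi
```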

Hook Up The Web Cam

The crucial point here is that the camera must be plugged in via a powered USB hub, as the Pi can't power much more than a keyboard and mouse. I somewhat randomly chose the Trust 10-port hub listed above, and it seems to work, though the camera is a little unreliable - it virtually never shows up on boot, and I have to unplug and replug it a few times to get it detected. The first thing to do (after plugging everything in) is to list the USB devices with the command 'lsusb', which should give you something like this:

pi@raspberrypi / $ lsusb
Bus 001 Device 001: ID 1d6b:0002 Linux Foundation 2.0 root hub
Bus 001 Device 002: ID 0424:9512 Standard Microsystems Corp.
Bus 001 Device 003: ID 0424:ec00 Standard Microsystems Corp.
Bus 001 Device 004: ID 05e3:0608 Genesys Logic, Inc. USB-2.0 4-Port HUB
Bus 001 Device 005: ID 1415:4000 Nam Tai E&E Products Ltd. or OmniVision Technologies, Inc.
Bus 001 Device 006: ID 05e3:0608 Genesys Logic, Inc. USB-2.0 4-Port HUB
Bus 001 Device 007: ID 05e3:0608 Genesys Logic, Inc. USB-2.0 4-Port HUB
Bus 001 Device 008: ID 1415:2000 Nam Tai E&E Products Ltd. or OmniVision Technologies, Inc. Sony Playstation Eye

As you can see, the PS Eye has shown up in the list. While that seems to always work for me, what doesn't always happen is the camera actually being picked up as a video input device. To check whether this has worked, switch to the devices folder and list its contents:

pi@raspberrypi / $ cd /dev
pi@raspberrypi /dev $ ls

If you're not used to Linux, the 'dev' folder is a special one that contains a 'file' for each device on the system. If a camera is present, you should see 'video0'. If you're like me, the odds are it won't be there. Unplug the camera, plug it back in, then type 'ls' again to see if it appears. So far this has worked for me every time, but at some point I need to work out why it's happening and whether it can be remedied.
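Rather than eyeballing 'ls' after every replug, a small loop can poll for the device node and tell you when it's safe to continue. This is just a sketch of mine - the function name and timeout are made up, not from any tutorial:

```shell
# Poll for a device node, giving up after a number of one-second tries.
wait_for_device() {
    dev="$1"
    tries="${2:-10}"
    while [ "$tries" -gt 0 ]; do
        [ -e "$dev" ] && return 0
        sleep 1
        tries=$((tries - 1))
    done
    return 1
}

if wait_for_device /dev/video0 5; then
    echo "camera detected"
else
    echo "no video0 yet - try replugging the camera"
fi
```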

Anyhoo, after a bit of plugging in, that 'ls' command should show a 'video0' entry among the device files. Once 'video0' appears you're all set!

Getting it running

Up until now I've been working over SSH via PuTTY, as this means I can work from my PC with a nice screen etc., but it's time to view the web cam stream, and for that we'll need to plug the Pi into a TV. Plug it in, reboot, and make sure the video0 device is present using the steps above. Now we can use ffplay, the player application that comes with ffmpeg, to show the feed. Using a tweaked version of the example from the ffmpeg documentation, I came up with this command:

pi@raspberrypi /dev $ ffplay -f video4linux2 -framerate 15 -video_size 320x240 /dev/video0

In English, this means:

  • Run the ffplay test application that comes with ffmpeg
  • Use 'video4linux2' mode (video4linux2 is the video capture system Linux uses)
  • Request a frame rate of 15fps
  • Request a video size of 320x240
  • Use the device '/dev/video0'
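As a side note, the same v4l2 options work with plain ffmpeg if you'd rather record a clip than play it live. This is my own variation (the output filename and five-second duration are arbitrary), guarded so it does nothing when the camera isn't present:

```shell
# Record five seconds from the web cam to a file, using the same
# format/framerate/size input options as the ffplay command above.
if [ -e /dev/video0 ]; then
    ffmpeg -f video4linux2 -framerate 15 -video_size 320x240 \
           -i /dev/video0 -t 5 webcam_test.avi
else
    echo "no camera at /dev/video0 - skipping capture"
fi
```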
Here's a video to show it working!

Conclusion / Future plans

After some experimenting I've found that you can't really push the above test past 320x240. At 640x480 (the PS Eye's native resolution) you can't get past 2 or 3 fps, and with a higher-resolution camera this will only get worse. While it performs much better than some more complex examples, I'm after high performance, so the next stage will be to do this in code, using ffmpeg to pull out a stream and the Pi's GPU to render it. Hopefully then I can work out where the slowdown is and whether it's feasible to use the GPU to speed things up. Maybe I'll have to skip ffmpeg altogether and go straight to video4linux, but I'd rather not!
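Before diving into code, one cheap way to see the camera's raw capture rate (a sketch of mine, not from the ffmpeg docs) is to decode to a null output, so neither display nor encoding is in the loop; ffmpeg reports the achieved fps on its status line as it runs:

```shell
# Measure raw capture throughput: grab ten seconds at 640x480 and
# throw the frames away, so the reported fps reflects capture/decode only.
if [ -e /dev/video0 ]; then
    ffmpeg -f video4linux2 -framerate 15 -video_size 640x480 \
           -i /dev/video0 -t 10 -f null -
else
    echo "no camera at /dev/video0"
fi
```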

On top of the decoding side, I also have the USB unplugging issue. If anyone has any thoughts on this I'd be interested. I've seen there's code out there to reset the USB device, which I'll try. Worst case, I'll use the GPIO pins and a transistor to cut the power to the camera from code!
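One avenue I may try first: on Linux you can ask the kernel to re-enumerate a USB port by writing its port id to the USB driver's unbind/bind files in sysfs. The CAMERA_PORT variable below is a placeholder for illustration - find the real id for your camera by listing /sys/bus/usb/drivers/usb/:

```shell
# Power-cycle a USB device in software by unbinding and rebinding it.
# CAMERA_PORT is a placeholder, e.g. CAMERA_PORT=1-1.2 - list the
# candidate port ids with: ls /sys/bus/usb/drivers/usb/
PORT="${CAMERA_PORT:-}"
if [ -z "$PORT" ]; then
    echo "set CAMERA_PORT to your camera's port id first"
elif [ -e "/sys/bus/usb/drivers/usb/$PORT" ]; then
    echo "$PORT" | sudo tee /sys/bus/usb/drivers/usb/unbind
    sleep 2
    echo "$PORT" | sudo tee /sys/bus/usb/drivers/usb/bind
else
    echo "port $PORT not found under /sys/bus/usb/drivers/usb/"
fi
```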


  1. Have you looked at either Opencv (c/c++) or Simplecv (python)? You may find these useful as they are designed to assist with robot vision processing.

  2. Thanks for your tutorial. Actually the capture is very slow on my system. I tried lowering the framerate and the resolution but it is no better.

  3. Can I use my Raspberry Pi camera?

  4. The USB unplugging is an issue with the Pi itself. You may want to check your power supply for the Pi. I am using the rev B and had the same issue until I realized it stopped when I used a 2.1A output micro-USB supply to power the Pi and let the USB hub power itself.

  5. Thanks, works fine

  6. Thank you, this is exactly what I needed to get started. I just got a Pi for a school project and was dreading getting the video working, as I thought I would have to pore over way too much technical stuff I do not yet understand.