After lots of work, the prototype has been successfully operated in all desired configurations.
This post is a wrap-up of all that went into completing this milestone.
Premise
I wanted to see if I could make a remote-controlled, movable, real-time video streamer out of a Pi, a Webcam, and a Servo.
Bonus points if it works over the Internet with no onerous setup.
Bonus points if I don't have to program anything to do with video encode/decode/transcode.
The gist of how I wanted to do it:
- Get a Raspberry Pi
- Put a Webcam on top of a Servo
- Have the Pi operate the Webcam (USB)
- Have the Pi operate the Servo (GPIO)
- Run some Pi software which can read from the Webcam and stream via WebRTC
- Write some Pi software to operate the Servo
- Write some software to expose both the WebRTC and Servo control on the internet
- Write some software to make use of the two interfaces and combine them into a single interface
Defined so broadly, it's hard to see how this wouldn't work. Challenges came from the details at each stage.
I will cover each point below in varying degrees of detail.
Get a Raspberry Pi
I went with a Raspberry Pi 2. This turned out to be a good choice, since the WebRTC software I ended up using only works on the Raspberry Pi 2.
Also got a WiFi dongle.
Put a Webcam on top of a Servo
Ha, just use some twist-ties and rubber bands.
Have the Pi operate the Webcam (USB)
The statement is vague, but the basic idea is to let the Linux distro handle the details of interacting with the USB Webcam.
USB isn't a prerequisite per se, but it's common and it's what I had on hand.
I happened to be using a Logitech QuickCam Pro 9000, which worked well for this purpose.
I'm not super familiar with Linux Kernel support for video devices, but I've come to understand there is a class of devices which adhere to the UVC ("USB Video Class") standard.
Anyway, plug that into the Pi and you'll see that it's recognized straight away. Here are the /var/log/messages entries.
Dec 3 19:12:49 raspberrypi kernel: [348152.387472] usb 1-1.4: new high-speed USB device number 5 using dwc_otg
Dec 3 19:12:49 raspberrypi kernel: [348152.614881] usb 1-1.4: New USB device found, idVendor=046d, idProduct=0990
Dec 3 19:12:49 raspberrypi kernel: [348152.614907] usb 1-1.4: New USB device strings: Mfr=0, Product=0, SerialNumber=2
Dec 3 19:12:49 raspberrypi kernel: [348152.614925] usb 1-1.4: SerialNumber: 4BA96858
Dec 3 19:12:49 raspberrypi kernel: [348152.853207] media: Linux media interface: v0.10
Dec 3 19:12:49 raspberrypi kernel: [348152.881669] Linux video capture interface: v2.00
Dec 3 19:12:50 raspberrypi kernel: [348153.320477] usb 1-1.4: Warning! Unlikely big volume range (=3072), cval->res is probably wrong.
Dec 3 19:12:50 raspberrypi kernel: [348153.320512] usb 1-1.4: [5] FU [Mic Capture Volume] ch = 1, val = 4608/7680/1
Dec 3 19:12:50 raspberrypi kernel: [348153.321508] usbcore: registered new interface driver snd-usb-audio
Dec 3 19:12:50 raspberrypi kernel: [348153.327973] usbcore: registered new interface driver uvcvideo
Dec 3 19:12:50 raspberrypi kernel: [348153.328001] USB Video Class driver (1.1.1)
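If you want a quick sanity check that the webcam is usable from userspace before involving UV4L, something like the following works. This is just a sketch using OpenCV (which is not part of the actual streaming pipeline) and assumes the opencv-python package is installed on the Pi:

# Quick sanity check that the uvcvideo-backed webcam is usable.
# Not part of the streaming pipeline -- UV4L handles the real video work.
import glob

import cv2

print("V4L2 devices:", glob.glob("/dev/video*"))

cap = cv2.VideoCapture(0)        # /dev/video0, the QuickCam in my case
ok, frame = cap.read()           # grab a single frame
cap.release()

if ok:
    print("Got a frame:", frame.shape)   # (height, width, channels)
else:
    print("Could not read from the webcam")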
Have the Pi operate the Servo (GPIO)
There is so much written online about how to make a servo move that it's not useful to rewrite it here.
I did make a post about some issues I ran into early on, since I'm an amateur at this kind of thing. Post here: (link).
Basically, watch out for power and feedback issues.
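For reference, the basic shape of driving a hobby servo with software PWM from Python looks roughly like the sketch below. The pin number and pulse widths are assumptions (check your servo's datasheet), and software PWM has timing problems I get into in the issues section at the end:

# Minimal software-PWM servo sketch using RPi.GPIO.
# Hobby servos typically expect a 50Hz signal with a 1-2ms pulse,
# which at 50Hz works out to a 5-10% duty cycle.
import time

import RPi.GPIO as GPIO

SERVO_PIN = 18                   # BCM numbering; assumed, any free GPIO works

GPIO.setmode(GPIO.BCM)
GPIO.setup(SERVO_PIN, GPIO.OUT)

pwm = GPIO.PWM(SERVO_PIN, 50)    # 50Hz
pwm.start(7.5)                   # ~1.5ms pulse, roughly center

def move_to_percent(pct):
    """Map 0-100% of the range-of-motion onto a 1-2ms pulse."""
    pwm.ChangeDutyCycle(5.0 + (pct / 100.0) * 5.0)

try:
    for pct in (0, 50, 100):
        move_to_percent(pct)
        time.sleep(1)
finally:
    pwm.stop()
    GPIO.cleanup()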
Run some Pi software which can read from the Webcam and stream via WebRTC
This was actually one of the first things I researched when starting on this project. I posted some thoughts on it at the time here (link).
All told, UV4L (Userspace Video 4 Linux) is what I went with, and some early successes in my work on this demonstrated it had a high likelihood of working in all desired configurations.
I cared a lot about this part since working with video is likely very difficult, and definitely something I don't know how to do. Nor do I want to do it.
Most of the really difficult issues are solved if something already exists that can do this task.
Specifically, UV4L solves the difficulties in:
- Reading video from the webcam and doing anything whatsoever with it.
- Encoding the video stream to something appropriate for the end-consumer.
- NAT traversal.
- Transmitting video while adapting to congestion, latency, buffering, etc.
The first point is specific to UV4L's implementation, which thankfully does support UVC webcams.
The last three points are attributes of whatever can 'do' WebRTC, which UV4L can.
Notably, the UV4L process supports Signaling via websockets and JSON-formatted message passing.
However, because WebRTC purposely leaves the Signaling mechanism undefined, I was forced to sniff out the specific message structures defined by UV4L.
In short, I hid the details of UV4L behind a translating proxy I wrote, so that I can swap UV4L out for something else later. There are other reasons too, related to the RDVP Server, which I talk about later.
Also, I wasn't 100% happy with the interface presented by UV4L anyway, so bye bye to that as well.
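To make the proxy idea concrete, here is a rough sketch of the shape of that bridge (it shows up as the WSBridge in the architecture section below). The URLs and the two translate functions are placeholders -- I'm deliberately not reproducing the UV4L message structures -- and it assumes the Python websockets package:

# Sketch of the translating proxy: connect outward to the local UV4L
# signalling socket and to the RDVP Server, then pump JSON messages
# between them, translating as they pass through.
import asyncio
import json

import websockets   # pip install websockets

UV4L_URL = "ws://127.0.0.1:8080/stream/webrtc"    # assumed local UV4L endpoint
RDVP_URL = "ws://rdvp.example.com:9000/"          # assumed RDVP Server endpoint

def to_uv4l(msg):
    """Translate an RDVP-side message into whatever UV4L expects (stub)."""
    return msg

def to_rdvp(msg):
    """Translate a UV4L message into the proxy's own interface (stub)."""
    return msg

async def bridge():
    async with websockets.connect(UV4L_URL) as uv4l, \
               websockets.connect(RDVP_URL) as rdvp:

        async def pump(src, dst, translate):
            async for raw in src:
                await dst.send(json.dumps(translate(json.loads(raw))))

        await asyncio.gather(pump(rdvp, uv4l, to_uv4l),
                             pump(uv4l, rdvp, to_rdvp))

asyncio.run(bridge())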
You can find lots of details about UV4L, starting with the announcement of WebRTC support on their site (link).
Note that their site says only Raspberry Pi 2 is supported for now. Perhaps this will change.
Write some Pi software to operate the Servo
I basically wrote a piece of software which accepts commands to move a servo. And then it does it.
The software has no idea there is a webcam strapped to the top of the servo.
I had a post about this here: (link).
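The command-handling part is simple enough to sketch: parse a message, move the servo, done. The message format shown here is illustrative, not the exact one I used:

# Sketch of the servo controller's command handling. It knows nothing
# about the webcam sitting on top of the servo; it just moves on command.
# The message shape ({"cmd": "move", "pct": 40}) is illustrative only.
import json

def handle_command(raw, move_to_percent):
    msg = json.loads(raw)
    if msg.get("cmd") == "move":
        pct = max(0, min(100, int(msg["pct"])))   # clamp to the range-of-motion
        move_to_percent(pct)                      # e.g. the PWM helper above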
Write some software to expose both the WebRTC and Servo control on the internet
In a prior post (link), I laid out the high-level architecture of the overall setup I was aiming for.
It mostly discussed the function of what I called the Rendez-vous Point (RDVP) Server.
The RDVP Server is a central place where different services and clients can register themselves for the purpose of having a connection set up between them.
This satisfies most of the "control from the internet" requirement. Since both the controller and the controllee "connect out" to a place on the internet, NAT issues are basically eliminated.
In the diagram below:
- The left-hand-side is the Raspberry Pi.
- The upper-right-hand-side is the RDVP Server, which needs to be somewhere accessible to both the Pi and the Controller. If you want internet control, it had better be on the internet.
- The lower-right-hand-side is the Client (controller). It's a Browser in this diagram, but could be anything really.
The "translating proxy" I discussed in the UV4L section is labeled as the WSBridge, which also acts as a male-to-male proxy.
That means both of its connections are outbound: it reaches out to the RDVP Server to make itself known, and once connected, it relays messages back to UV4L for the purposes of setting up a WebRTC session.
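Stripped of details, the relay behavior of the RDVP Server looks something like the sketch below. The registration and addressing fields ("register", "to") are illustrative rather than my exact protocol, and it assumes the Python websockets package:

# Core RDVP idea: every endpoint (WSBridge, servo controller, browser)
# connects OUT to this server and registers under a name; the server then
# relays messages between named endpoints, so no inbound NAT holes needed.
import asyncio
import json

import websockets   # pip install websockets

endpoints = {}      # name -> websocket connection

async def handle(ws, path=None):
    name = None
    try:
        async for raw in ws:
            msg = json.loads(raw)
            if msg.get("type") == "register":
                name = msg["name"]
                endpoints[name] = ws
            elif msg.get("to") in endpoints:
                await endpoints[msg["to"]].send(raw)   # relay verbatim
    finally:
        if name:
            endpoints.pop(name, None)

async def main():
    async with websockets.serve(handle, "0.0.0.0", 9000):
        await asyncio.Future()   # run forever

asyncio.run(main())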
Write some software to make use of the two interfaces and combine them into a single interface
As noted in a prior post (link), I chose WebSockets as the mechanism for communication between server-side processes.
I did this with the forethought that I'd ultimately have a Browser involved in the action, and browsers speak WebSockets well, so why not be uniform.
A security dimension of WebSockets on Browsers is that at the time of writing, Browsers only want to open WebSockets to the host which served up the page you're on.
So, that means the RDVP Server needs to also serve up a web site which contains the code to connect back to the RDVP Server to control the Servo and UV4L.
Not so hard. In fact, the WebSockets library I used was Tornado for Python, which has lots of code already written for serving up webpages.
So, I wrote some simple HTML/Javascript, changed the RDVP Server to serve it up, and pointed a Browser at it.
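A minimal Tornado sketch of that arrangement looks like this; the handler names, paths, and the on_message body are illustrative, not my exact implementation:

# The RDVP Server's web-facing side: serve the control page over HTTP and
# accept WebSocket connections from the browser on the same host.
import tornado.ioloop
import tornado.web
import tornado.websocket

class PageHandler(tornado.web.RequestHandler):
    def get(self):
        # The HTML/Javascript with the connect button and the 0-100 buttons.
        self.render("control.html")

class RDVPSocketHandler(tornado.websocket.WebSocketHandler):
    def open(self):
        print("client connected")

    def on_message(self, message):
        # The real server parses the RDVP message here and relays it to the
        # registered endpoint (WSBridge or servo controller).
        print("got:", message)

def make_app():
    return tornado.web.Application(
        [(r"/", PageHandler), (r"/ws", RDVPSocketHandler)],
        template_path=".",
    )

if __name__ == "__main__":
    make_app().listen(8888)
    tornado.ioloop.IOLoop.current().start()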
The moment of success
Pressing the 'connect' button leads to Javascript opening two WebSockets to the RDVP Server.
Speaking the RDVP protocol, they ask for their messages to be relayed to the endpoints on the Pi servicing UV4L and the Servo controller.
From there, the webpage sets up a WebRTC session with nearly no interaction from the user other than to accept that the camera is about to be used. Once the remote video stream is acquired, it is dropped into a video tag on the page and the rest is handled by the Browser.
Additionally, the buttons 0, 10, 20, ..., 100 become active. Click them, and a message is sent to the Servo controller indicating that it should move to that percent of its range-of-motion.
This is a screenshot from the laptop I was working at.
A few notes on network conditions relating to WebRTC and NATs
Key to getting video to stream properly is NAT traversal. So I wanted to be sure that wherever the Pi was, I'd be able to run the Browser somewhere else.
The NAT traversal is handled by WebRTC libraries I never have to touch, but can configure. I instruct those libraries to use Google's STUN servers to discover both endpoints' public addresses for the WebRTC handshake.
TURN (relaying the traffic through an intermediary server) is never used; the connection is always peer-to-peer by constraint.
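For reference, the ICE configuration amounts to the following. The browser-side Javascript hands an equivalent object to RTCPeerConnection; it's shown here as a plain Python dict:

# ICE configuration: use Google's public STUN server to discover each
# peer's public address, and configure no TURN relay at all, so the
# connection is peer-to-peer or it fails.
ICE_CONFIG = {
    "iceServers": [
        {"urls": ["stun:stun.l.google.com:19302"]},
        # no TURN entry on purpose
    ]
}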
With that said, I note that the moment-of-success was:
- Run an RDVP Server on the Internet
- Run the Pi in my apt on the LAN WiFi
- Run my laptop tethered against my cell phone (so outside the network of both other systems)
- Create a peer-to-peer (non-TURN) WebRTC connection between the Pi and my external laptop.
I wanted to try several configurations for where the Pi and Browser were in relation to one another's networks.
Always keeping the RDVP Server on the internet, I have subsequently placed the Pi and Browser:
- On the same LAN (in my apt)
- Apt LAN and Internet (Browser tethered to cell phone, maybe no NATing going on)
- Apt LAN to different LAN over the internet (so two computers in different LANs having to do NAT traversal)
Issues:
- Framerate is slow. I think it may be a combination of:
  - Congested 2.4GHz WiFi.
  - UV4L CPU utilization.
  - No attempt yet made to configure the Webcam to operate differently.
- The servo jerks around a lot, I think because the PWM is generated in software on the GPIO pins and isn't hitting its timing targets. It seems much worse than usual; I suspect UV4L's CPU usage is affecting it.
  - I'd like to solve this with RPIO, but that's not supported on the Raspberry Pi 2 (yet). Perhaps another lib will work for the time being.
- The webcam on top of the servo is not particularly stable and keeps tipping over!