So I wanted to use a Raspberry Pi as another camera for my robot arm (which I described I'm building here). Since I have a Pi camera lying around, I thought the easiest way would be to use it and have the Pi act as a relay/server that sends the feed to my computer over my local network. Besides, it's a small enough camera with good enough resolution to be used as a POV camera for the robot arm (at least I think so, and I'm going to test it).
I will be using a Raspberry Pi Zero 2 W. In case you have one of these small models as well, you might need to buy an adapter, because the Zero's camera port is smaller than on the regular-sized Pi. I bought mine here.
To begin with, you can either go directly over SSH or use the Pi's GUI over VNC if you prefer; I have written a tutorial on that here.
In order to use the camera, you usually only need to (with the power off) plug it into the camera connector, and that's it. To be sure everything is current, you can do the general package repo update:
sudo apt-get update
sudo apt-get upgrade
And then, to test the camera with a 5-second preview feed, you can use:
libcamera-hello
In case you are on SSH only and have no GUI, you can save a still image instead:
libcamera-jpeg -o test.jpg
Not the best quality, but it's a pretty cheap camera (I have the Camera Module v2.1); you can get better-quality Raspberry Pi cameras.
Ok! It should be good to go. Now you can run a little server on your Pi that sends the feed to your computer's IP. First check your PC's IP with ipconfig (Windows) or ifconfig (Linux), and then you can configure the following script, which uses an easy tool I've found for video streaming, called ImageZMQ:
from picamera2 import Picamera2
import imagezmq
import socket

# Initialize the camera
picam2 = Picamera2()
picam2.preview_configuration.main.size = (1640, 922)  # Wide FOV
picam2.preview_configuration.main.format = "RGB888"  # Otherwise, the feed will come with Red and Blue inverted!
picam2.configure("preview")
picam2.start()

# Initialize ImageZMQ sender
sender = imagezmq.ImageSender(connect_to='tcp://YOUR_PC_IP:5555')
rpi_name = socket.gethostname()

while True:
    frame = picam2.capture_array()
    sender.send_image(rpi_name, frame)
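If you're not sure which address to put in connect_to, here's a small stdlib-only sketch you can run on the PC to find its LAN IP (the get_lan_ip name is mine, not part of ImageZMQ):

```python
import socket

def get_lan_ip() -> str:
    """Return this machine's LAN IP by opening a UDP socket.

    No packets are actually sent; connect() on a UDP socket just
    selects the outgoing network interface.
    """
    s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    try:
        s.connect(("8.8.8.8", 80))  # any routable address works
        return s.getsockname()[0]
    except OSError:
        return "127.0.0.1"  # no route available; fall back to loopback
    finally:
        s.close()

print(get_lan_ip())
```

This usually agrees with what ipconfig/ifconfig reports for your active interface.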
You should first install these system packages:
sudo apt install -y libcap-dev build-essential
sudo apt install -y python3-picamera2
sudo apt install -y python3-libcamera
On recent Raspberry Pi OS releases you usually can't install pip packages globally, because the system Python is marked as externally managed and pip will refuse, so you should create a virtual environment. Do it after installing the system packages above, and use the --system-site-packages flag so the venv can see the apt-installed picamera2:
python3 -m venv --system-site-packages venv
source venv/bin/activate
And then install these packages with pip:

pip install numpy opencv-python imagezmq
And then on your computer, where you will be "consuming" the camera feed, you can do this just to visualize the feed:
import cv2
import imagezmq

# Initialize ImageZMQ receiver
receiver = imagezmq.ImageHub(open_port='tcp://0.0.0.0:5555')

while True:
    rpi_name, frame = receiver.recv_image()
    cv2.imshow('Feed from Raspberry Pi', frame)
    receiver.send_reply(b'OK')
    if cv2.waitKey(1) & 0xFF == ord('q'):
        break

cv2.destroyAllWindows()
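Before judging the lag, it helps to put a number on the frame rate you're actually getting. A minimal stdlib-only helper you could drop into the receiver loop (FPSMeter is a hypothetical name of mine, not part of imagezmq):

```python
import time
from collections import deque

class FPSMeter:
    """Rolling frames-per-second estimate over the last `window` frames."""

    def __init__(self, window: int = 30):
        self._stamps = deque(maxlen=window)

    def tick(self) -> float:
        """Record one frame arrival and return the current FPS estimate."""
        self._stamps.append(time.monotonic())
        if len(self._stamps) < 2:
            return 0.0  # not enough samples yet
        span = self._stamps[-1] - self._stamps[0]
        return (len(self._stamps) - 1) / span if span > 0 else 0.0
```

In the loop above you would call meter.tick() right after recv_image() and, for example, overlay the value on the frame with cv2.putText before imshow.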
Et voilà!
The latency is good enough for applications like OctoPrint, but it is definitely not good enough to train a robot; it's too laggy. So I will probably look into doing this over a cable connection, and for that there is a great tutorial from Raspberry Pi itself.