From meff@3:770/3 to All on Tuesday, June 07, 2022 06:23:31
Not sure if anyone has played around with this. I have a Raspberry Pi
camera hooked up to a RasPi running Raspbian. I'm capturing video from
the camera with the included "libcamera-vid" tool, which lets you
specify output formats. I have libcamera-vid write to a pipe and
ffmpeg read from that pipe before doing some things with the output
in real time.
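Roughly, the pipeline looks like this (a sketch rather than my exact
invocation; the resolution flags match my 640x480 setup, and out.mp4
just stands in for the real-time processing):

  libcamera-vid -t 0 --width 640 --height 480 --codec h264 \
    --inline -o - | ffmpeg -f h264 -i - -c copy out.mp4

The --inline flag repeats the H.264 stream headers at each intra
frame, which matters when the reader picks up the stream mid-flight.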
In some tests I've run on a different machine, I found that ingesting
real-time camera data with ffmpeg (where I don't need to invoke
libcamera) and piping it to ffplay is lossier than streaming the
bytes over loopback RTP/UDP and having ffplay ingest the UDP stream.
I'm not sure if this is because of some issue with the pipe, or if
the encode/decode I was running on that machine was causing CPU
contention and dropping frames.
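For concreteness, the two test setups were along these lines (a
sketch; I'm assuming a V4L2 capture device on Linux, and /dev/video0
and the port are placeholders):

  # pipe version
  ffmpeg -f v4l2 -i /dev/video0 -c:v libx264 -preset ultrafast \
    -tune zerolatency -f h264 - | ffplay -f h264 -

  # loopback version (plain MPEG-TS over UDP shown here; real RTP
  # would also need an SDP file on the ffplay side)
  ffmpeg -f v4l2 -i /dev/video0 -c:v libx264 -preset ultrafast \
    -tune zerolatency -f mpegts udp://127.0.0.1:5000
  ffplay udp://127.0.0.1:5000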
In that spirit, I was wondering whether I'm losing performance by
using a pipe to take libcamera-vid's output and push it into ffmpeg.
The unrelated tests on the other machine were streaming at 1080p,
while the Pi is dealing with 640x480 video, so the h264 encoder has
to work a lot less hard. The stream also isn't viewed on the Pi, so
nothing has to be written to a graphics buffer. I don't even know if
there would be an issue on the Pi, but I'm curious whether I'm losing
out on performance here.
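One way I might isolate the pipe itself is to decode to a null sink
and watch ffmpeg's fps/speed counters, e.g.:

  libcamera-vid -t 0 --width 640 --height 480 --codec h264 \
    --inline -o - | ffmpeg -f h264 -i - -f null -

If the reported speed stays at or above 1x, the pipe plus decode is
keeping up and the bottleneck would be elsewhere.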