Hello HN, I'd love some advice on live video processing for a hobby project I'm working on.<p>I'm currently working on placing a live camera on a beach. My current setup leaves a lot to be desired, but my main problem has been with video-processing-related stuff, which I have had no contact with as a developer until now.<p>It consists of a camera-equipped microcontroller connected to WiFi that is constantly sending JPEGs to a server through HTTP POST requests. On the server side, these JPEGs get saved as files in a folder, which in turn are converted to an RTMP live video feed - using ffmpeg - which gets sent to a streaming platform (Twitch or YouTube).<p>There are problems with my current approach, mainly that ffmpeg "finishes" converting my JPEGs mid-"transmission" (I use quotes here because all that means is that my microcontroller is still sending JPEGs to the server).<p>I searched extensively, but I've had trouble formulating the queries given how specific my use case seems to be, and have had little success. That leads me to believe that I'm committing mistakes in my approach to this project - which is no surprise, given my lack of experience with video, especially streaming.<p>So I'd like some pointers or suggestions from you - they could be aimed at any part of the process, though what made me write this in the first place was the video streaming problems I'm facing. It could be general resources on encoders, decoders, transcoders, video in general... or even just a comment on what you'd change to make streaming work.<p>Perhaps to be more direct: how can "generic" - as in normal practice - livestreaming be achieved without the help of something like OBS?<p>Thank you.
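For reference, my current approach is roughly this (a sketch of what I'm doing, not my exact script - the directory, frame rate, and stream key are placeholders). The "finishing" problem happens because ffmpeg's image2 demuxer treats the folder as a fixed-length sequence; one workaround I've been considering is piping the JPEGs into ffmpeg's stdin with the image2pipe demuxer, so the input never ends while the microcontroller is still posting frames:

```shell
#!/bin/sh
# Endlessly feed the newest JPEGs into ffmpeg via stdin so the input
# stream never "finishes". Paths, frame rate, and STREAM_KEY are
# placeholders - adjust to your setup.
while true; do
  for f in /var/frames/*.jpg; do
    cat "$f"
    sleep 0.1    # ~10 fps pacing; match your camera's upload rate
  done
done | ffmpeg -f image2pipe -c:v mjpeg -framerate 10 -i - \
  -c:v libx264 -preset veryfast -pix_fmt yuv420p -g 20 \
  -f flv rtmp://live.twitch.tv/app/STREAM_KEY
```

The loop re-reads the folder forever, so ffmpeg sees a continuous MJPEG stream instead of a finite image sequence; `-g 20` forces a keyframe every two seconds, which RTMP platforms generally expect.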
I've had a small amount of experience with this sort of thing, though not with the full-blown streaming apps like OBS. For example, I've written an Android app that sends video frames from a phone's camera to a remote client. I believe I set it up so that the camera side acts as the server and the client side just issues GET requests, but it's been a while since I looked at the code so I'm not entirely sure. It worked OK for what I wanted, but I always hoped to improve it, which is something I have yet to get around to.<p>As for your setup, I'm wondering why you chose to use a microcontroller on the camera side instead of something more powerful like a Raspberry Pi. I would think the latter would give you more options, and I doubt the cost difference would be very significant to the overall project.
May I ask why you are collecting images this way? There are super cheap cameras that output RTSP streams. Are you dealing with a low-bandwidth connection?<p>GStreamer is pretty neat for this sort of stuff. It's more flexible than FFmpeg for this sort of use case.<p>I also don't understand the "ffmpeg finishes .. mid-transmission" part .. are you getting chopped images in your video stream?
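To give you an idea of what that could look like (untested against your setup - the file pattern, frame rate, and stream key are placeholders), a GStreamer pipeline can loop over a directory of numbered JPEGs indefinitely and push the result to an RTMP endpoint:

```shell
# multifilesrc with loop=true re-reads the numbered JPEGs forever,
# so the pipeline never runs out of input. Filenames must match the
# %05d pattern; everything after x264enc is standard RTMP plumbing.
gst-launch-1.0 multifilesrc location=/var/frames/frame-%05d.jpg loop=true \
  ! jpegdec ! videoconvert ! videorate \
  ! video/x-raw,framerate=10/1 \
  ! x264enc tune=zerolatency bitrate=1500 key-int-max=20 \
  ! h264parse ! flvmux streamable=true \
  ! rtmpsink location="rtmp://live.twitch.tv/app/STREAM_KEY"
```

The nice part compared to the ffmpeg image2 approach is that `loop=true` sidesteps the "input finished" problem entirely, and you can swap elements in and out (different encoders, sinks, overlays) without restructuring the whole pipeline.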