I'm trying to create a live stream using data that we receive in real time from a device. Specifically, the data received are AAC and H.264 samples.
Unfortunately, our camera doesn't have an RTSP server built into it like many video cameras do. We would like to add support for live streaming, and so far the only solution we've come up with is to manually containerize these AAC/H.264 packets into fragmented .mp4 files (.m4s segments) and serve them up ourselves.
It was tedious and took a very long time to make this work, and we've only been able to get HLS working (as LL-HLS) on specific OSes; iOS is stubborn and won't accept how we've formatted our LL-HLS stream.
Before we started down that path, I tried using FFmpeg with Oven Media Engine, and I was never able to get anywhere.
Specifically, every time I tried to pipe input into an ffmpeg command, whether manually or through fluent-ffmpeg, the process seemed to halt without producing any output. I assumed this was because the input pipe needs to be closed before any processing can actually take place. I tested this assumption with a similar command that, instead of piping in the AAC/H.264 data, extracted the AAC/H.264 from an .mp4 file (that .mp4 file contains the exact same data I was piping in the first method).
So, to follow up on that last statement: do input pipes need to be closed before FFmpeg actually does anything? If not, what am I doing wrong or missing, so that I can pass this data in real time to Oven Media Engine?
This was the ffmpeg portion of my code (the `PassThrough` objects are what I wrote the packets from the camera into):
```js
this.#audioPt = new PassThrough();
this.#videoPt = new PassThrough();

ffmpeg(this.#audioPt)
  .fromFormat("aac")
  .audioCodec("copy")
  .noVideo()
  .output(fullSrtUrl) // piped to Oven Media Engine multiplex audio channel with SRT
  .on('start', cmd => { console.log(`[${cmd}]`); })
  .on('error', err => { console.log(`error occurred`, err); })
  .on('end', () => { console.log(`FFmpeg ended`); })
  .run();

ffmpeg(this.#videoPt)
  .fromFormat("h264")
  .frames(5)
  .videoCodec("copy")
  .noAudio()
  .output(fullSrtUrl) // piped to Oven Media Engine multiplex video channel with SRT
  .on('start', cmd => { console.log(`[${cmd}]`); })
  .on('error', err => { console.log(`error occurred`, err); })
  .on('end', () => { console.log(`FFmpeg ended`); });
```
Any answer would be appreciated. I also tested this with the most recent version of fluent-ffmpeg, but I can't tell you which ffmpeg version I was on, as I'm not at that machine right now.