    NewUserHa
    @NewUserHa
    like, I want to do ffmpeg -i udp_input0 -i udp_input1 -filter_complex hstack=inputs=2 output. How do I make any missing udp_input display as a static black image stream, when PyAV will block/stop if any stream goes missing, just like FFmpeg does?
    Guilherme Richter
    @IamRichter
    Hey, I was trying to do something with opencv, but some people suggested I do it with ffmpeg. Is there an easy way to get a single frame from a stream like you would with ffmpeg -i RTSP_URI -vframes 1 frame.jpg? I was looking at the most basic av example, and without the for loop I just get a collection.
    Guilherme Richter
    @IamRichter
    Actually, what I wanted was to limit the number of frames per second in an RTSP stream, but this is not as easy as I'd hoped (as far as I've looked), so I was hoping to just get the "last frame" using ffmpeg, repeat it a couple of times per minute and call it a day.
    NewUserHa
    @NewUserHa
    @IamRichter I think the basic usage section in the PyAV docs already shows you how to achieve that
    NewUserHa
    @NewUserHa
    what is the encode argument in PyAV's Stream.encode(frame=None) for x264 settings like the fast/slow preset?
    Samuel Lindgren
    @samiamlabs

    Hi all!

    I'm trying to use the hardware encoder h264_nvmpi on a Jetson Xavier to stream video with WebRTC (aiortc).

    The code that creates the encoder context looks like this:

    def create_encoder_context(
        codec_name: str, width: int, height: int, bitrate: int
    ) -> Tuple[av.CodecContext, bool]:
        codec = av.CodecContext.create(codec_name, "w")
        codec.width = width
        codec.height = height
        codec.bit_rate = bitrate
        codec.pix_fmt = "yuv420p"
        codec.framerate = fractions.Fraction(MAX_FRAME_RATE, 1)
        codec.time_base = fractions.Fraction(1, MAX_FRAME_RATE)
        codec.options = {
            "profile": "baseline",
            "level": "31",
            "tune": "zerolatency",  # does nothing when using h264_omx
        }
        codec.open()
        return codec, codec_name == "h264_nvmpi"
    
    self.codec, self.codec_buffering = create_encoder_context(
        "h264_nvmpi", frame.width, frame.height, bitrate=self.target_bitrate
    )

    The encoding itself seems to work fine but the video I can see in the browser is very delayed and it seems to get worse the more time passes.

    I get this output from the codec:

    INFO:aioice.ice:Connection(2) Check CandidatePair(('172.19.0.2', 53024) -> ('172.19.0.1', 56520)) State.IN_PROGRESS -> State.SUCCEEDED
    INFO:aioice.ice:Connection(2) ICE completed
    Connection state is connected
    INFO:aiortc.codecs.h264:=======> Creating h264 encoder!
    Opening in BLOCKING MODE
    Opening in BLOCKING MODE 
    NvMMLiteOpen : Block : BlockType = 4 
    ===== NVMEDIA: NVENC =====
    NvMMLiteBlockCreate : Block : BlockType = 4 
    875967048
    842091865
    H264: Profile = 66, Level = 31 
    NVMEDIA_ENC: bBlitMode is set to TRUE 
    Opening in BLOCKING MODE
    Opening in BLOCKING MODE 
    NvMMLiteOpen : Block : BlockType = 4 
    ===== NVMEDIA: NVENC =====
    NvMMLiteBlockCreate : Block : BlockType = 4 
    875967048
    842091865
    H264: Profile = 66, Level = 31 
    NVMEDIA_ENC: bBlitMode is set to TRUE

    I suspect I have some flag or option wrong but I'm pretty inexperienced with FFmpeg and don't really know where to start...

    If anyone has any tips or ideas they would be greatly appreciated :)

    Nitan Alexandru Marcel
    @nitanmarcel
    Hello.
    Oh, enter doesn't send a new line
    Anyway, I'm looking for a way to stream audio from an audio URL provided by the YouTube-Dl library. I have to convert it to PCM 16-bit, 48k, then send the data to another library for downloading. So far I've achieved this using an asyncio subprocess exec, and I just found your library and want to translate my code to it. Are there any examples of doing this or something similar out there, or can someone point me in the correct direction?
    Nitan Alexandru Marcel
    @nitanmarcel
    What I think I'm actually looking for is to take the original bytes from the URL using something like aiohttp, then send the data through the library and receive the converted bytes.
    Is this possible without revolving around subprocesses?
    NewUserHa
    @NewUserHa
    ffmpeg cli is good in your case
    quantotto
    @quantotto
    @nitanmarcel pyav allows re-encoding / transcoding the stream in memory (if io.BytesIO is used for the output buffer). It will not create any subprocesses and will use the libav libraries. If you already use ffmpeg, you can definitely translate the command line into a program that uses pyav. It allows supplying options by passing an options dictionary to the av.open method
    Nitan Alexandru Marcel
    @nitanmarcel
    @quantotto thanks, I'll take a look into it
    NewUserHa
    @NewUserHa
    import subprocess

    import av
    import av.filter

    container_out = av.open('udp://224.0.0.1:999?overrun_nonfatal=1', format='mpegts', mode='w')
    video_stream = container_out.add_stream('libx264', 30)
    video_stream.low_delay = True
    video_stream.width = 1280
    video_stream.height = 720
    video_stream.options = {'preset': 'veryfast', 'tune': 'film,zerolatency'}
    
    graph = av.filter.Graph()
    
    src0 = graph.add('testsrc', 's=1280x720:r=30')
    f01 = graph.add('drawtext', "text='%{n}@%{localtime}@%{pts}':y=190:fontsize=20")
    f02 = graph.add('format','pix_fmts=yuv420p')
    f03 = graph.add('scale', '640:360:flags=lanczos')
    
    src1 = graph.add('testsrc', 's=1280x720:r=30')
    f11 = graph.add('drawtext', "text='%{n}@%{localtime}@%{pts}':y=190:fontsize=20")
    f12 = graph.add('format','pix_fmts=yuv420p')
    f13 = graph.add('scale', '640:360:flags=lanczos')
    
    xstack = graph.add('xstack', 'inputs=2:layout=0_0|w0_0')
    
    src0.link_to(f01)
    f01.link_to(f02)
    f02.link_to(f03)
    f03.link_to(xstack, 0, 0)
    
    src1.link_to(f11)
    f11.link_to(f12)
    f12.link_to(f13)
    f13.link_to(xstack, 0, 1)
    
    xstack.link_to(graph.add('buffersink'))
    graph.configure()
    
    subprocess.Popen(
        """ffplay -f lavfi -i testsrc=r=30 -vf "drawtext=text='%{n}@%{localtime}@%{pts}':y=190:fontsize=20""", 
        shell=True)
    subprocess.Popen(
        """ffplay -fflags nobuffer udp://224.0.0.1:999?overrun_nonfatal=1""",
        shell=True)
    while 1:
        for packet in video_stream.encode(xstack.pull()):
            container_out.mux(packet)
    Why does the streaming latency of this code keep increasing forever? Any ideas?
    the latency = the %{localtime} difference between ffplay -i testsrc and ffplay udp://
    is it because Python is slow?
    arsserpentarium
    @arsserpentarium
    Hello. Does anybody have an example of how to record video with sound?
    quantotto
    @quantotto
    @NewUserHa it is probably less about Python and more about encoding speed. If your encoding pipeline is indeed slow, chances are it is CPU bound. Check if you have one of the cores maxed out at 100% while running your code. I saw similar issues on weaker machines, like the Raspberry Pi.
    NewUserHa
    @NewUserHa
    @quantotto But no, overall CPU usage is ~40%, and all logical cores are <70%.
    @quantotto would you like to try that code?
    quantotto
    @quantotto
    @NewUserHa do I just replace 224.0.0.1 with localhost?
    quantotto
    @quantotto
    @NewUserHa I am not sure how graphs work exactly in AV, but it is probably some logic issue rather than performance. If you change rate to 2 from 30 everything becomes even slower. So, it seems that rate affects how often new frames are pushed and the localtime remains the same even longer than with rate 30. I'd check this rate logic.
    Zeyu Dong
    @dong-zeyu

    Hello everyone, I have a problem trying to remux a raw h264 stream to mp4. This is what I'm currently doing.

    avin = av.open("test.h264", "r", "h264")
    avout = av.open("test.mp4", "w", "mp4")
    s = avout.add_stream(template=avin.streams[0], rate=30)
    for pkt in avin.demux():
        pkt.stream = s
        avout.mux(pkt)
    avin.close()
    avout.close()

    However, when I read the output file using ffprobe test.mp4, the stream info shows

    Stream #0:0(und): Video: h264 (High) (avc1 / 0x31637661), yuv420p, 720x480, 278198 kb/s, 10868.85 fps, 12800 tbr, 12800 tbn, 25600 tbc (default)

    The fps is too large to play.
    It seems that a raw h264 stream does not contain fps and pts information, but I know the actual fps is 30. How do I tell the demuxer this?

    Nitan Alexandru Marcel
    @nitanmarcel
    I've got a small problem with av.open. I'm passing to it an audio stream from fmstream.org and the container reaches eof early and the stream stops. Am I doing something wrong or?
    quantotto
    @quantotto
    @dong-zeyu yes, you have to calculate pts / dts yourself. Something like below worked for me:
    import av
    avin = av.open("test.264", "r", "h264")
    avout = av.open("test.mp4", "w", format="mp4")
    avout.add_stream(template=avin.streams[0])
    time_base = int(1 / avin.streams[0].time_base)
    rate = avin.streams[0].base_rate
    ts_inc = int(time_base / rate)
    ts = 0
    for pkt in avin.demux():
        pkt.pts = ts
        pkt.dts = ts
        avout.mux(pkt)
        ts += ts_inc
    avin.close()
    avout.close()
    @nitanmarcel could you share what exactly you are trying to open? Which URL etc. Basically, a code snippet that doesn't work
    Nitan Alexandru Marcel
    @nitanmarcel
    And about the snippet: I removed it, but I hadn't set an output file; instead I tried to read it frame by frame
    The library I want to use pyav is called tgcalls and here's an example on how it should be used https://github.com/MarshalX/tgcalls/blob/main/examples/restream_using_raw_data.py
    NewUserHa
    @NewUserHa
    @quantotto maybe the pts problem could also be solved by the fps filter or the encoder's fps option
    @nitanmarcel the ffmpeg cli may work in that case
    NewUserHa
    @NewUserHa
    @quantotto no need to replace 224.0.0.1, which is a multicast address and should work fine in most situations for udp
    if you slow down the testsrc rate, then the while loop is also slowed down. so it could also help with this issue if it's really because Python is slow.
    maybe I should consider other languages, I guess?
    Nitan Alexandru Marcel
    @nitanmarcel
    @NewUserHa i know as I'm already using the cli, I'm just looking for alternatives ^^
    NewUserHa
    @NewUserHa
    pyav has issues, like the one I encountered..
    but transferring audio with PyAV shouldn't have any issue
    but if you want to read frame by frame, I guess you should watch out for overflow of the internal buffers of the ffmpeg libs (which PyAV depends on)
    Nitan Alexandru Marcel
    @nitanmarcel
    Oh, actually I hadn't thought of exploring projects that use pyav, but most probably most of them are about converting files rather than reading from live streams
    NewUserHa
    @NewUserHa
    no. pyav has many streaming use cases, as you can see on the github issue tracker
    quantotto
    @quantotto
    Whatever works from ffmpeg cli will work through pyav as well. Just need to make sure correct options are supplied and, in some cases, things like pts / dts need to be set manually.
    NewUserHa
    @NewUserHa
    maybe pulling from testsrc is using Python's main thread??
    quantotto
    @quantotto
    @NewUserHa I don't think this is a case of slow Python (though I might be wrong). Most of the processing happens in the libav libraries (in C/C++). It is something related to the logic of the graph.
    NewUserHa
    @NewUserHa
    the video frame pulled from libav is in Python object form, which is slow
    and changing the src rate from 30 to 2 solving the issue may also indicate the reason?
    quantotto
    @quantotto
    @NewUserHa did the rate change solve the issue? I saw the localtime lagging even more, but it could be that I didn't run it properly. I didn't have too much time to invest in it
    NewUserHa
    @NewUserHa
    I heard you say changing the src rate from 30 to 2 solved the issue..
    will try changing it to 60 fps later.
    quantotto
    @quantotto
    @NewUserHa what i said: "If you change rate to 2 from 30 everything becomes even slower."
    NewUserHa
    @NewUserHa
    the localtime difference is different from the pts difference, and also different from the frame number.
    but judging from the frame number, the video does lag, though not as much as localtime shows. and the pts changing more slowly than the others is also weird