Newest 'ffmpeg' Questions - Stack Overflow

http://stackoverflow.com/questions/tagged/ffmpeg

Articles published on the site

  • ffmpeg dshow and ddagrab desync

12 July, by Light Guardian

So I have this command for screen capture:

    ffmpeg.exe -hide_banner ^ 
    -f dshow -i audio="virtual-audio-capturer" ^ 
    -f dshow -i audio="@device_cm_{33D9A762-90C8-11D0-BD43-00A0C911CE86}\wave_{8049B44D-EB44-4706-83C2-7480DCD60587}" ^ 
    -filter_complex "[0:a][1:a]amix=inputs=2,volume=1.5[a];ddagrab=0:framerate=30[v];" ^ 
    -map "[v]" -map "[a]" ^ 
    -c:v h264_nvenc -rc vbr -cq 36 -qmin 29 ^ 
    -c:a aac ^ 
    "vid.mkv"
    

The first audio input is the system sound; the second is my mic.

The main problem is that the sound lags behind the video by about 0.5 seconds, and the first audio lags behind the second by about 0.2 seconds. In simple terms: first I see that I started the video, then I hear that I pressed the spacebar, and only then do I hear the sound of the video itself.

My ffmpeg version is 7.1.1-full_build-www.gyan.dev.

Log when recording:

    Input #0, dshow, from 'audio=virtual-audio-capturer':   
    Duration: N/A, start: 7907.655000, bitrate: 1536 kb/s   
    Stream #0:0: Audio: pcm_s16le, 48000 Hz, stereo, s16, 1536 kb/s 
    Input #1, dshow, from 'audio=@device_cm_{33D9A762-90C8-11D0-BD43-00A0C911CE86}\wave_{8049B44D-EB44-4706-83C2-7480DCD60587}':
    Duration: N/A, start: 7907.915000, bitrate: 1411 kb/s
Stream #1:0: Audio: pcm_s16le, 44100 Hz, stereo, s16, 1411 kb/s
Stream mapping:
  Stream #0:0 (pcm_s16le) -> amix
  Stream #1:0 (pcm_s16le) -> amix
  ddagrab:default -> Stream #0:0 (h264_nvenc)
  volume:default -> Stream #0:1 (aac)
    

To fix this, I tried: changing buffers; -async 1; -itsoffset -0.2 on the DirectShow input (this helped a little, but didn't completely fix the issue, because lowering the value further causes ffmpeg to start dropping frames); -fflags nobuffer; setpts=PTS+0.45*S in the filter_complex; and so on.

The problem might be that the audio inputs have different sample rates: 48000 Hz for the first and 44100 Hz for the second. But I'm not sure about this, as forcing them to the same value with -ar 48000 or -ar 44100 made no difference.
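No accepted answer is quoted in the feed; as a hedged sketch only, one direction commonly suggested for dshow audio desync is to resample both audio inputs to a single rate and enable aresample's async compensation, which inserts or drops samples so each input stays aligned with its capture timestamps before amix. MIC_DEVICE_ID below is a placeholder for the real DirectShow device string.

```python
# Hedged rework of the asker's command (a sketch, not a confirmed fix):
# both dshow audio inputs go through aresample with async timestamp
# compensation and a common 48 kHz rate before they are mixed.
audio_fix = "aresample=48000:async=1:first_pts=0"

filter_complex = (
    f"[0:a]{audio_fix}[a0];"
    f"[1:a]{audio_fix}[a1];"
    "[a0][a1]amix=inputs=2,volume=1.5[a];"
    "ddagrab=0:framerate=30[v]"
)

args = [
    "ffmpeg", "-hide_banner",
    "-f", "dshow", "-i", "audio=virtual-audio-capturer",
    "-f", "dshow", "-i", "audio=MIC_DEVICE_ID",  # placeholder for the real device string
    "-filter_complex", filter_complex,
    "-map", "[v]", "-map", "[a]",
    "-c:v", "h264_nvenc", "-rc", "vbr", "-cq", "36", "-qmin", "29",
    "-c:a", "aac",
    "vid.mkv",
]
```

If the drift stops but a constant offset between the two devices remains, it can still be nudged with -itsoffset on one dshow input, as the asker already tried.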

  • FFMPEG Python : Encountered scale(1920, 1080) with multiple outgoing edges with same upstream label None ; a `split` filter is probably required

12 July, by Lysander Cox

    Here is the code in question:

for comment in thread['comments']:
    commentClips += fragmentConcat(comment, filePrefix)

    staticClip = ffmpeg.input('assets/static.mp4')
    commentClips.append(staticClip
                        .filter('setsar', 1, 1)
                        .filter('scale', 1920, 1080))
    commentClips.append(staticClip.audio)
    

This code produces the following error:

    ValueError: Encountered scale(1920, 1080) <6adb028f8ef5> with multiple outgoing edges with same upstream label None; a `split` filter is probably required
    

I have tried using only the video part of the input for the first call (e.g. staticClip['v'].filter...), and I have tried using the split call as suggested (e.g. ffmpeg.input(...).split()). Nothing has worked. What is the issue, and how can I remedy it? Thanks.
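The error means one filter node's output is being consumed by more than one downstream edge, which a raw ffmpeg filtergraph forbids: every labeled output may be read exactly once, and reuse requires an explicit split. A minimal sketch of the graph shape involved, with hypothetical labels and consumers rather than the asker's actual graph:

```python
# A labeled filtergraph output may feed exactly one consumer.
scaled = "[0:v]setsar=1/1,scale=1920:1080"

# split=2 turns the single scaled stream into two independently
# consumable copies, [va] and [vb].
graph = scaled + ",split=2[va][vb]"

# Each copy now feeds exactly one downstream filter/output.
consumers = graph + ";[va]hflip[out1];[vb]vflip[out2]"
```

In ffmpeg-python, if I recall its API correctly, the same shape is written as s = staticClip.filter('setsar', 1, 1).filter('scale', 1920, 1080).split(), using s[0] and s[1] once each; appending the same node twice (for example, across loop iterations that reuse one input object) reintroduces the error.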

  • AVURLAsset tracks is empty, but video/audio is playable [closed]

12 July, by Paltr

I have a video that lives here:

    http://195.16.112.71/adaptive/66aebabb-2632-44fc-abf1-df29bca6b941.video/66aebabb-2632-44fc-abf1-df29bca6b941.m3u8
    

ffmpeg says that this video has 5 tracks, and that is correct.

But if I use AVURLAsset with that link, it tells me that there aren't any tracks:

    NSArray* const tracks = asset.tracks; // it's empty
    

I modified Apple's StitchedStreamPlayer sample to reproduce this problem; it is here: https://yadi.sk/d/hV3jfbx1Z9sfC

Simply click 'Load Movie', then the 'Play' button: the movie plays perfectly. But if you check the tracks variable in the prepareToPlayAsset function, you will find it is empty. The question is: why is it empty if the video really has 5 tracks, and how can the video play at all if, as AVURLAsset claims, no tracks exist?

    Thanks for your help in advance!

How can I use the physical computer's GPU on VMware Workstation?

12 July, by ching

I want to use the GPU to accelerate video transcoding with FFmpeg inside a VMware Workstation virtual machine.

When I run lspci | grep VGA, the output is:

    00:0f.0 VGA compatible controller: VMware SVGA II Adapter
    

It seems that the VM can only use VMware's own virtual graphics adapter, right?

    How can I use my host computer's GPU, NVIDIA Quadro K2000, from a program running in a guest VM?

    Or is there another solution that can solve the problem mentioned in the title?

  • No accelerated colorspace conversion found from yuv420p to argb

11 July, by Zac Chan

I am a novice at ffmpeg and have recently taken over a code base built by a previous engineer. The FFmpeg code runs on an app engine that edits videos when they are uploaded.

This code generates a title animation that is later used as an overlay.

    exports.generateTitleAnimation = function(metadata, destPath, options = {}) {
    const peeqLogoPath = "/app/assets/peeq-logo.png";
    const whiteBarMovPath = "/app/assets/whiteBar.mov";
    const titleFontPath = "/app/assets/Sofia-Pro-Black.otf";
    const dateStrFontPath = "/app/assets/Sofia-Pro-Medium.otf";
    const outputDuration = 5.52;
    const src01 = "color=c=white:s=1920x1080:duration=" + outputDuration;
    const src02 = "color=c=white@0.0:s=1920x1080:r=120:duration=" + outputDuration;
    
    var dateStrXOffset = "(92";
    var filterComplexStr = "[1]";
    
    if (metadata.title) {
        const title = metadata.title.toUpperCase();
        filterComplexStr += "drawtext=fontfile=" + titleFontPath + ":text='" + title + "':x='floor(92*(min((t-1.75)^29,0)+max((t-3.75)^29,0)+1))':y=622+30+2:fontsize=70:fontcolor=black:ft_load_flags=render,";
    }
    if (metadata.subTitle) {
        const subTitle = metadata.subTitle.toUpperCase();
        filterComplexStr += "drawtext=fontfile=" + titleFontPath + ":text='" + subTitle + "':x='floor(92*(min((t-2.0)^29,0.0)+max((t-3.8)^29,0.0)+1.0))':y=622+184-20-60+9:fontsize=46:fontcolor=black:ft_load_flags=render,";
    
        dateStrXOffset += "+30*" + (subTitle.length + 1);
    }
    if (metadata.dateStr) {
        filterComplexStr += "drawtext=fontfile=" + dateStrFontPath + ":text='" + metadata.dateStr + "':x='floor(" + dateStrXOffset + ")*(min((t-2.0)^29,0.0)+max((t-3.8)^29,0.0)+1.0))':y=622+184-20-60+9:fontsize=46:fontcolor=black:ft_load_flags=render,";
    }
    console.log("generateTitleAnimation generating")
    filterComplexStr += "split=10[t01][t02][t03][t04][t05][t06][t07][t08][t09][t10];[t02]setpts=PTS+0.0166/TB[d02];[t03]setpts=PTS+0.033/TB[d03];[t04]setpts=PTS+0.05/TB[d04];[t05]setpts=PTS+0.0666/TB[d05];[t06]setpts=PTS+0.083/TB[d06];[t07]setpts=PTS+0.1/TB[d07];[t08]setpts=PTS+0.1166/TB[d08];[t09]setpts=PTS+0.133/TB[d09];[t10]setpts=PTS+0.15/TB[d10];[d10][d09]blend=average,[d08]blend=darken,[d07]blend=average,[d06]blend=darken,[d05]blend=average,[d04]blend=darken,[d03]blend= average,[d02]blend=darken,[t01]blend=average,colorkey=white:0.2:0.0,perspective=y1=W*0.176327:y3=H+W*0.176327[text01];[2][3]overlay=x=(W-w)*0.5:y=(H-h)*0.5:enable='between(t,0,3.0)'[logo01];[logo01][text01]overlay[outv]";
    
    var args = ["-y", "-f", "lavfi", "-i", src01, "-f", "lavfi", "-i", src02, "-i", whiteBarMovPath, "-i", peeqLogoPath, "-filter_complex", filterComplexStr, "-vcodec", "qtrle", "-crf:v", "28", "-codec:a", "aac", "-ac", "2", "-ar", "44100", "-ab", "128k", "-map", "[outv]", destPath];
    
    //console.log("args", args);
    return childProcess.spawn('ffmpeg', args).then((ffResult) => {
        return destPath;
    }, (err) => {
        //console.error(new Error("generateTitleAnimation:" + err));
        console.error(err);
        return Promise.reject(err);
    });};
    

    destPath is a .mov file

A few days ago, the backend started throwing this error:

    stderr: 'ffmpeg version 3.4.2-1~16.04.york0.2 Copyright (c) 2000-2018
     the FFmpeg developers\n built with gcc 5.4.0 (Ubuntu 5.4.0-
    6ubuntu1~16.04.9) 20160609\n configuration: --prefix=/usr --extra-
    version=\'1~16.04.york0.2\' --toolchain=hardened --
    libdir=/usr/lib/x86_64-linux-gnu --incdir=/usr/include/x86_64-linux-gnu --
    enable-gpl --disable-stripping --enable-avresample --enable-avisynth --
    enable-gnutls --enable-ladspa --enable-libass --enable-libbluray --
    enable-libbs2b --enable-libcaca --enable-libcdio --enable-libflite --
    enable-libfontconfig --enable-libfreetype --enable-libfribidi --enable-
    libgme --enable-libgsm --enable-libmp3lame --enable-libmysofa --enable-
    libopenjpeg --enable-libopenmpt --enable-libopus --enable-libpulse --
    enable-librubberband --enable-librsvg --enable-libshine --enable-
    libsnappy --enable-libsoxr --enable-libspeex --enable-libssh --enable-
    libtheora --enable-libtwolame --enable-libvorbis --enable-libvpx --
    enable-libwavpack --enable-libwebp --enable-libx265 --enable-libxml2 -
    -enable-libxvid --enable-libzmq --enable-libzvbi --enable-omx --enable-
    openal --enable-opengl --enable-sdl2 --enable-libdc1394 --enable-libdrm --
    enable-libiec61883 --enable-chromaprint --enable-frei0r --enable-
    libopencv --enable-libx264 --enable-shared\n libavutil 55. 78.100 / 55.
     78.100\n libavcodec 57.107.100 / 57.107.100\n libavformat 57. 83.100 /
     57. 83.100\n libavdevice 57. 10.100 / 57. 10.100\n libavfilter 
    6.107.100 / 6.107.100\n libavresample 3. 7. 0 / 3. 7. 0\n libswscale 4.
     8.100 / 4. 8.100\n libswresample 2. 9.100 / 2. 9.100\n libpostproc 54.
     7.100 / 54. 7.100\nInput #0, lavfi, from 
    \'color=c=white:s=1920x1080:duration=5.52\':\n Duration: N/A, start: 
    0.000000, bitrate: N/A\n Stream #0:0: Video: rawvideo (I420 / 
    0x30323449), yuv420p, 1920x1080 [SAR 1:1 DAR 16:9], 25 tbr, 25 tbn, 25
     tbc\nInput #1, lavfi, from 
    \'color=c=white@0.0:s=1920x1080:r=120:duration=5.52\':\n Duration: N/A,
     start: 0.000000, bitrate: N/A\n Stream #1:0: Video: rawvideo (I420 /
     0x30323449), yuv420p, 1920x1080 [SAR 1:1 DAR 16:9], 120 fps, 120 tbr,
     120 tbn, 120 tbc\nInput #2, mov,mp4,m4a,3gp,3g2,mj2, from 
    \'/app/assets/whiteBar.mov\':\n Metadata:\n major_brand : qt \n 
    minor_version : 537199360\n compatible_brands: qt \n creation_time : 
    2018-04-27T15:55:18.000000Z\n Duration: 00:00:05.52, start: 0.000000, 
    bitrate: 54847 kb/s\n Stream #2:0(eng): Video: qtrle (rle / 
    0x20656C72), bgra(progressive), 1920x1080, 53326 kb/s, SAR 1:1 DAR 16:9, 60 
    fps, 60 tbr, 60 tbn, 60 tbc (default)\n Metadata:\n creation_time : 
    2018-04-27T15:55:18.000000Z\n handler_name : Apple Alias Data Handler\n
     encoder : Animation\n timecode : 00:00:00:00\n Stream #2:1(eng): Data:
     none (tmcd / 0x64636D74), 0 kb/s (default)\n Metadata:\n creation_time
     : 2018-04-27T15:55:18.000000Z\n handler_name : Apple Alias Data
     Handler\n timecode : 00:00:00:00\nInput #3, png_pipe, from 
    \'/app/assets/peeq-logo.png\':\n Duration: N/A, bitrate: N/A\n Stream 
    #3:0: Video: png, rgba(pc), 452x207 [SAR 2834:2834 DAR 452:207], 25 
    tbr, 25 tbn, 25 tbc\nCodec AVOption crf (Select the quality for 
    constant quality mode) specified for output file #0 (/tmp/972967.mov) 
    has not been used for any stream. The most likely reason is either 
    wrong type (e.g. a video option with no video streams) or that it is a 
    private option of some encoder which was not actually used for any 
    stream.\nCodec AVOption b (set bitrate (in bits/s)) specified for 
    output file #0 (/tmp/972967.mov) has not been used for any stream. The 
    most likely reason is either wrong type (e.g. a video option with no 
    video streams) or that it is a private option of some encoder which was 
    not actually used for any stream.\nStream mapping:\n Stream #1:0 
    (rawvideo) -> drawtext\n Stream #2:0 (qtrle) -> overlay:main\n Stream 
    #3:0 (png) -> overlay:overlay\n overlay -> Stream #0:0 (qtrle)\nPress 
    [q] to stop, [?] for help\n[swscaler @ 0x56080b828180] No accelerated 
    colorspace conversion found from yuv420p to argb.\n[swscaler @ 
    0x56080b8b5f40] No accelerated colorspace conversion found from yuva420p to argb.\n',
    

However, this error only occurs on the app engine. Running npm test on my Mac generates the title perfectly.
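A note on reading that log: the swscaler lines are warnings, not errors; swscale is merely falling back to its generic (unaccelerated) conversion path for yuv420p to argb, which affects speed, not output. The more telling lines are the ignored AVOptions: qtrle has no crf control, and since only [outv] is mapped there is no audio stream for the aac options to apply to. Below is a sketch of the same invocation with those dead options removed; FILTER_GRAPH and DEST.mov are my placeholders for the long filtergraph string and destPath, not values from the source.

```python
# Hedged cleanup of the spawn arguments: drop -crf:v (unsupported by qtrle)
# and the audio options (no audio stream is mapped), which the log flags
# as "not been used for any stream".
args = [
    "ffmpeg", "-y",
    "-f", "lavfi", "-i", "color=c=white:s=1920x1080:duration=5.52",
    "-f", "lavfi", "-i", "color=c=white@0.0:s=1920x1080:r=120:duration=5.52",
    "-i", "/app/assets/whiteBar.mov",
    "-i", "/app/assets/peeq-logo.png",
    "-filter_complex", "FILTER_GRAPH",  # placeholder for filterComplexStr
    "-vcodec", "qtrle",
    "-map", "[outv]",
    "DEST.mov",                          # placeholder for destPath
]
```

This silences the AVOption warnings but does not by itself explain the Mac vs. app engine difference, which most plausibly comes down to the two environments running different ffmpeg builds.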