Newest 'ffmpeg' Questions - Stack Overflow

http://stackoverflow.com/questions/tagged/ffmpeg

Articles published on the site

  • flutter_ffmpeg is discontinued with no alternatives?

    13 March, by Rageh Azzazy

    Based on this article by Taner Sener (https://tanersener.medium.com/saying-goodbye-to-ffmpegkit-33ae939767e1)

    and on the discontinuation of the flutter_ffmpeg and ffmpeg_kit_flutter packages:

    most packages for executing video-editing commands were built on top of them, so they won't work after April 1, 2025, as mentioned in the package documentation and in Taner's article.

    Examples of packages that depend on flutter_ffmpeg or ffmpeg_kit_flutter include:

    • video_trimmer
    • zero_video_trimmer
    • flutter_video_trimmer
    • video_trim
    • bemeli_compress
    • video_trimmer_pro
    • ... others

    Editing video with video_editor, video_editor_2 or video_editor_pits has therefore become a problem.

    And I believe that downloading the binaries and doing the whole thing locally is not just tedious but illegal as well.

    So I broke down exactly what I need in order to edit videos in my Flutter app,

    and I found these packages, though I have not tested them yet:

    Trimming: flutter_native_video_trimmer

    Compression: video_compress or video_compress_plus

    Muting: video_compress handles this

    Encoding: not sure; I just need simple MP4 video files

    Cropping: I can't find a solution for video cropping

    I don't need to rotate, reverse, or do anything fancy with the videos, just those five functions.

    So now the only remaining pieces for this approach are a simple video-cropping function and an encoder that work on Flutter for iOS & Android.

    So my question is not asking for an external library recommendation, but do you have any ideas on how to crop a video natively in Flutter?

  • No accelerated colorspace conversion found from yuv420p to argb

    13 March, by Zac Chan

    I am a novice at FFmpeg and have recently taken over a code base built by a previous engineer. The FFmpeg code runs on an app engine that edits the videos when they are uploaded.

    This code generates a title animation that is later used as an overlay.

    exports.generateTitleAnimation = function(metadata, destPath, options = {}) {
        const peeqLogoPath = "/app/assets/peeq-logo.png";
        const whiteBarMovPath = "/app/assets/whiteBar.mov";
        const titleFontPath = "/app/assets/Sofia-Pro-Black.otf";
        const dateStrFontPath = "/app/assets/Sofia-Pro-Medium.otf";
        const outputDuration = 5.52;
        // Two lavfi colour sources: an opaque white canvas and a fully
        // transparent white canvas (120 fps) that the text is drawn onto.
        const src01 = "color=c=white:s=1920x1080:duration=" + outputDuration;
        const src02 = "color=c=white@0.0:s=1920x1080:r=120:duration=" + outputDuration;

        var dateStrXOffset = "(92";
        var filterComplexStr = "[1]";

        // Optional drawtext passes for the title, subtitle and date string.
        if (metadata.title) {
            const title = metadata.title.toUpperCase();
            filterComplexStr += "drawtext=fontfile=" + titleFontPath + ":text='" + title + "':x='floor(92*(min((t-1.75)^29,0)+max((t-3.75)^29,0)+1))':y=622+30+2:fontsize=70:fontcolor=black:ft_load_flags=render,";
        }
        if (metadata.subTitle) {
            const subTitle = metadata.subTitle.toUpperCase();
            filterComplexStr += "drawtext=fontfile=" + titleFontPath + ":text='" + subTitle + "':x='floor(92*(min((t-2.0)^29,0.0)+max((t-3.8)^29,0.0)+1.0))':y=622+184-20-60+9:fontsize=46:fontcolor=black:ft_load_flags=render,";

            dateStrXOffset += "+30*" + (subTitle.length + 1);
        }
        if (metadata.dateStr) {
            filterComplexStr += "drawtext=fontfile=" + dateStrFontPath + ":text='" + metadata.dateStr + "':x='floor(" + dateStrXOffset + ")*(min((t-2.0)^29,0.0)+max((t-3.8)^29,0.0)+1.0))':y=622+184-20-60+9:fontsize=46:fontcolor=black:ft_load_flags=render,";
        }
        console.log("generateTitleAnimation generating");

        // Fake motion blur: split the text into 10 slightly time-shifted copies,
        // blend them back together, key out the white, apply a perspective skew,
        // then overlay the logo on the white-bar movie and the keyed text on top.
        filterComplexStr += "split=10[t01][t02][t03][t04][t05][t06][t07][t08][t09][t10];[t02]setpts=PTS+0.0166/TB[d02];[t03]setpts=PTS+0.033/TB[d03];[t04]setpts=PTS+0.05/TB[d04];[t05]setpts=PTS+0.0666/TB[d05];[t06]setpts=PTS+0.083/TB[d06];[t07]setpts=PTS+0.1/TB[d07];[t08]setpts=PTS+0.1166/TB[d08];[t09]setpts=PTS+0.133/TB[d09];[t10]setpts=PTS+0.15/TB[d10];[d10][d09]blend=average,[d08]blend=darken,[d07]blend=average,[d06]blend=darken,[d05]blend=average,[d04]blend=darken,[d03]blend= average,[d02]blend=darken,[t01]blend=average,colorkey=white:0.2:0.0,perspective=y1=W*0.176327:y3=H+W*0.176327[text01];[2][3]overlay=x=(W-w)*0.5:y=(H-h)*0.5:enable='between(t,0,3.0)'[logo01];[logo01][text01]overlay[outv]";

        // Note: the log below reports that -crf:v and the audio options are not
        // used for any stream, since only the qtrle video stream is mapped.
        var args = ["-y", "-f", "lavfi", "-i", src01, "-f", "lavfi", "-i", src02, "-i", whiteBarMovPath, "-i", peeqLogoPath, "-filter_complex", filterComplexStr, "-vcodec", "qtrle", "-crf:v", "28", "-codec:a", "aac", "-ac", "2", "-ar", "44100", "-ab", "128k", "-map", "[outv]", destPath];

        //console.log("args", args);
        // childProcess is assumed to be a promise-returning spawn wrapper
        // (e.g. child-process-promise), since .then() is called on the result.
        return childProcess.spawn('ffmpeg', args).then((ffResult) => {
            return destPath;
        }, (err) => {
            //console.error(new Error("generateTitleAnimation:" + err));
            console.error(err);
            return Promise.reject(err);
        });
    };
    

    destPath is a .mov file

    Until a few days ago this worked fine, but then the backend started throwing this error:

    stderr: 'ffmpeg version 3.4.2-1~16.04.york0.2 Copyright (c) 2000-2018
     the FFmpeg developers\n built with gcc 5.4.0 (Ubuntu 5.4.0-
    6ubuntu1~16.04.9) 20160609\n configuration: --prefix=/usr --extra-
    version=\'1~16.04.york0.2\' --toolchain=hardened --
    libdir=/usr/lib/x86_64-linux-gnu --incdir=/usr/include/x86_64-linux-gnu --
    enable-gpl --disable-stripping --enable-avresample --enable-avisynth --
    enable-gnutls --enable-ladspa --enable-libass --enable-libbluray --
    enable-libbs2b --enable-libcaca --enable-libcdio --enable-libflite --
    enable-libfontconfig --enable-libfreetype --enable-libfribidi --enable-
    libgme --enable-libgsm --enable-libmp3lame --enable-libmysofa --enable-
    libopenjpeg --enable-libopenmpt --enable-libopus --enable-libpulse --
    enable-librubberband --enable-librsvg --enable-libshine --enable-
    libsnappy --enable-libsoxr --enable-libspeex --enable-libssh --enable-
    libtheora --enable-libtwolame --enable-libvorbis --enable-libvpx --
    enable-libwavpack --enable-libwebp --enable-libx265 --enable-libxml2 -
    -enable-libxvid --enable-libzmq --enable-libzvbi --enable-omx --enable-
    openal --enable-opengl --enable-sdl2 --enable-libdc1394 --enable-libdrm --
    enable-libiec61883 --enable-chromaprint --enable-frei0r --enable-
    libopencv --enable-libx264 --enable-shared\n libavutil 55. 78.100 / 55.
     78.100\n libavcodec 57.107.100 / 57.107.100\n libavformat 57. 83.100 /
     57. 83.100\n libavdevice 57. 10.100 / 57. 10.100\n libavfilter 
    6.107.100 / 6.107.100\n libavresample 3. 7. 0 / 3. 7. 0\n libswscale 4.
     8.100 / 4. 8.100\n libswresample 2. 9.100 / 2. 9.100\n libpostproc 54.
     7.100 / 54. 7.100\nInput #0, lavfi, from 
    \'color=c=white:s=1920x1080:duration=5.52\':\n Duration: N/A, start: 
    0.000000, bitrate: N/A\n Stream #0:0: Video: rawvideo (I420 / 
    0x30323449), yuv420p, 1920x1080 [SAR 1:1 DAR 16:9], 25 tbr, 25 tbn, 25
     tbc\nInput #1, lavfi, from 
    \'color=c=white@0.0:s=1920x1080:r=120:duration=5.52\':\n Duration: N/A,
     start: 0.000000, bitrate: N/A\n Stream #1:0: Video: rawvideo (I420 /
     0x30323449), yuv420p, 1920x1080 [SAR 1:1 DAR 16:9], 120 fps, 120 tbr,
     120 tbn, 120 tbc\nInput #2, mov,mp4,m4a,3gp,3g2,mj2, from 
    \'/app/assets/whiteBar.mov\':\n Metadata:\n major_brand : qt \n 
    minor_version : 537199360\n compatible_brands: qt \n creation_time : 
    2018-04-27T15:55:18.000000Z\n Duration: 00:00:05.52, start: 0.000000, 
    bitrate: 54847 kb/s\n Stream #2:0(eng): Video: qtrle (rle / 
    0x20656C72), bgra(progressive), 1920x1080, 53326 kb/s, SAR 1:1 DAR 16:9, 60 
    fps, 60 tbr, 60 tbn, 60 tbc (default)\n Metadata:\n creation_time : 
    2018-04-27T15:55:18.000000Z\n handler_name : Apple Alias Data Handler\n
     encoder : Animation\n timecode : 00:00:00:00\n Stream #2:1(eng): Data:
     none (tmcd / 0x64636D74), 0 kb/s (default)\n Metadata:\n creation_time
     : 2018-04-27T15:55:18.000000Z\n handler_name : Apple Alias Data
     Handler\n timecode : 00:00:00:00\nInput #3, png_pipe, from 
    \'/app/assets/peeq-logo.png\':\n Duration: N/A, bitrate: N/A\n Stream 
    #3:0: Video: png, rgba(pc), 452x207 [SAR 2834:2834 DAR 452:207], 25 
    tbr, 25 tbn, 25 tbc\nCodec AVOption crf (Select the quality for 
    constant quality mode) specified for output file #0 (/tmp/972967.mov) 
    has not been used for any stream. The most likely reason is either 
    wrong type (e.g. a video option with no video streams) or that it is a 
    private option of some encoder which was not actually used for any 
    stream.\nCodec AVOption b (set bitrate (in bits/s)) specified for 
    output file #0 (/tmp/972967.mov) has not been used for any stream. The 
    most likely reason is either wrong type (e.g. a video option with no 
    video streams) or that it is a private option of some encoder which was 
    not actually used for any stream.\nStream mapping:\n Stream #1:0 
    (rawvideo) -> drawtext\n Stream #2:0 (qtrle) -> overlay:main\n Stream 
    #3:0 (png) -> overlay:overlay\n overlay -> Stream #0:0 (qtrle)\nPress 
    [q] to stop, [?] for help\n[swscaler @ 0x56080b828180] No accelerated 
    colorspace conversion found from yuv420p to argb.\n[swscaler @ 
    0x56080b8b5f40] No accelerated colorspace conversion found from yuva420p to argb.\n',
    

    However, this error only occurs on the app engine. Running npm test on my Mac generates the title perfectly.
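
    For what it's worth, the two swscaler lines are notices rather than hard errors, so it may help to confirm whether ffmpeg actually exits with a non-zero code on the app engine. A minimal sketch of such a check, using Node's built-in child_process directly (the probeFfmpeg name and the idea of running the same args standalone are illustrative assumptions, not part of the original code):

    const { spawn } = require("child_process");

    // Run the given ffmpeg args and report the exit code; the "No accelerated
    // colorspace conversion found" messages come from swscaler and do not by
    // themselves abort an encode.
    function probeFfmpeg(args) {
      return new Promise((resolve, reject) => {
        const proc = spawn("ffmpeg", args);
        let stderr = "";
        proc.stderr.on("data", (chunk) => { stderr += chunk; });
        proc.on("close", (code) => {
          if (code === 0) {
            resolve(stderr); // encode finished; the notices were harmless
          } else {
            reject(new Error("ffmpeg exited with code " + code + ":\n" + stderr));
          }
        });
      });
    }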

  • Extracting audio from video using fluent-ffmpeg

    12 March, by Idi Favour

    I'm trying to extract the audio from a video, but an error occurs: Error converting file: Error: ffmpeg exited with code 234: Error opening output file ./src/videos/output-audio.mp3. Error opening output files: Invalid argument

    I use this same directory for the video compression that runs before this step, and it works there.

    // Assumes fluent-ffmpeg, fs, an initialised OpenAI client, and the input
    // url are already in scope in the surrounding module.
    ffmpeg()
      .input(url)
      .audioChannels(0) // note: 0 is an unusual channel count for an MP3 output
      .format("mp3")
      .output("./src/videos/output-audio.mp3")
      .on("error", (err) => console.error(`Error converting file: ${err}`))
      .on("end", async () => {
        console.log("audio transcripts");

        const stream = fs.createReadStream("./src/videos/output-audio.mp3");
        const transcription = await openai.audio.transcriptions.create({
          file: stream,
          model: "whisper-1",
          response_format: "verbose_json",
          timestamp_granularities: ["word"],
        });
        transcripts = transcription.text; // transcripts is declared in the surrounding scope
        console.log(transcription.text);
      })
      .run();
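
    As a side note, here is a minimal sketch of an alternative, assuming the goal is an audio-only MP3 and that url and the rest of the surrounding code stay as above; the absolute-path handling and the .noVideo()/.audioCodec() calls are suggestions, not the original code:

    const path = require("path");
    const fs = require("fs");
    const ffmpeg = require("fluent-ffmpeg");

    // Resolve an absolute output path and make sure the directory exists, so a
    // relative "./src/videos" path cannot fail to open when the process runs
    // from a different working directory.
    const outputPath = path.resolve(process.cwd(), "src", "videos", "output-audio.mp3");
    fs.mkdirSync(path.dirname(outputPath), { recursive: true });

    ffmpeg()
      .input(url)
      .noVideo()                 // drop the video stream instead of requesting 0 audio channels
      .audioCodec("libmp3lame")  // explicit MP3 encoder
      .format("mp3")
      .output(outputPath)
      .on("error", (err) => console.error(`Error converting file: ${err}`))
      .on("end", () => console.log("audio extracted to " + outputPath))
      .run();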
    
  • How to create a circular video and overlay it on an image background while preserving transparency for MP4 output?

    12 March, by Mykyta Manuilenko

    I have the following FFmpeg command that overlays a transparent webm video on a background image and preserves the transparency:

    ffmpeg -t 5 -framerate 25 -loop 1 -i "bg_image.png" -c:v libvpx-vp9 -i "transparent_video.webm" -y -filter_complex "[0:v]format=yuv420p,pad=ceil(iw/2)*2:ceil(ih/2)*2[bg_image];[1:v]setpts=PTS-STARTPTS,setsar=1,pad=ceil(iw/2)*2:ceil(ih/2)*2[el_out_1];[bg_image][el_out_1]overlay=82:250,format=yuv420p,pad=ceil(iw/2)*2:ceil(ih/2)*2[raw_video];[raw_video]format=yuv420p,pad=ceil(iw/2)*2:ceil(ih/2)*2" -vcodec libx264 -f mp4 -t 5 -an -crf 23 -preset medium -copyts "output.mp4"
    

    It works fine; the transparency is preserved. But now I want to make the video a circle while keeping the transparency, so I tweaked the FFmpeg command a bit:

    ffmpeg -t 5 -framerate 25 -loop 1 -i "bg_image.png" -c:v libvpx-vp9 -i "transparent_video.webm" -y -filter_complex "[0:v]format=yuv420p,pad=ceil(iw/2)*2:ceil(ih/2)*2[bg_image];[1:v]setpts=PTS-STARTPTS,crop=ceil(365.091854095459/2)*2:ceil(365.091854095459/2)*2:180.34759712219238:0,format=rgba,geq=lum='p(X,Y)':a='if(lt(sqrt(pow(X-182.5459270477295,2)+pow(Y-182.5459270477295,2)),182.5459270477295-0.5),255,if(gt(sqrt(pow(X-182.5459270477295,2)+pow(Y-182.5459270477295,2)),182.5459270477295+0.5),0,255*(182.5459270477295+0.5 - sqrt(pow(X-182.5459270477295,2)+pow(Y-182.5459270477295,2)))))',scale=ceil(549.2381964584453/2)*2:ceil(549.2381964584451/2)*2,setsar=1,pad=ceil(iw/2)*2:ceil(ih/2)*2[el_out_1];[bg_image][el_out_1]overlay=84:252,format=yuv420p,pad=ceil(iw/2)*2:ceil(ih/2)*2[raw_video];[raw_video]format=yuv420p,pad=ceil(iw/2)*2:ceil(ih/2)*2" -vcodec libx264 -f mp4 -t 5 -an -crf 23 -preset medium -copyts "output.mp4"
    

    The transparent webm input does become a circle, but the transparency is lost: instead of transparency I see a black background inside the circle.

    Is there any way to create a circular video and overlay it on an image background while preserving transparency?

    Note that I can't output to webm, or to mov with ProRes 4444, which natively support an alpha channel; the output must be mp4.
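
    One thing worth noting: in the second command, the geq alpha expression forces alpha to 255 everywhere inside the circle, which overrides the webm's own alpha, so formerly transparent pixels inside the circle become opaque and show up as the black fill described above. A sketch of an alternative geq step, assuming the stream has already been cropped to a square so that W/2 is the circle radius, multiplies the source alpha by a circle mask instead of replacing it (this fragment would stand in for the existing format=rgba,geq=... portion of the chain; the rest of the filtergraph stays as it is):

    format=rgba,geq=r='r(X,Y)':g='g(X,Y)':b='b(X,Y)':a='alpha(X,Y)*lt(hypot(X-W/2,Y-H/2),W/2)'

    Pixels outside the circle get alpha 0, and pixels inside keep whatever alpha the webm already had, so the overlay onto the background image should preserve the original transparency.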

  • How to split an audio file into equal-length segments with ffmpeg?

    11 March, by GPWR

    I want to split an audio file into several equal-length segments using FFmpeg. I want to specify the general segment duration (no overlap), and I want FFmpeg to render as many segments as it takes to go over the whole audio file (in other words, the number of segments to be rendered is unspecified). Also, since I am not very experienced with FFmpeg (I only use it to make simple file conversions with few arguments), I would like a description of the code you should use to do this, rather than just a piece of code that I won't necessarily understand, if possible. Thank you in advance.

    P.S. Here's the context for why I'm trying to do this: I would like to sample a song into single-bar loops automatically, instead of having to chop them manually using a DAW. All I want to do is align the first beat of the song to the beat grid in my DAW, and then export that audio file and use it to generate one-bar loops in FFmpeg.

    In the future, I will try to do something like a batch command in which one can specify the tempo and key signature, and it will generate the loops using FFmpeg automatically (as long as the loop is aligned to the beat grid, as I've mentioned earlier). 😀
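
    For reference, ffmpeg's segment muxer can produce fixed-length chunks in one pass. A minimal sketch, assuming a 4-second segment length and placeholder file names:

    ffmpeg -i input.mp3 -f segment -segment_time 4 -c copy output_%03d.mp3

    Here -i input.mp3 is the source file, -f segment selects the segment muxer, -segment_time 4 sets the target length of each chunk in seconds, -c copy copies the audio without re-encoding (so cuts land on the nearest audio-frame boundary rather than at exact sample positions), and output_%03d.mp3 numbers the pieces 000, 001, 002, and so on. For one-bar loops in 4/4, the segment time would be 4 * 60 / BPM seconds, e.g. 2 seconds at 120 BPM.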