Other articles (90)

  • List of compatible distributions

    26 April 2011, by

    The table below lists the Linux distributions compatible with MediaSPIP's automatic installation script.

    Distribution   Version name           Version number
    Debian         Squeeze                6.x.x
    Debian         Wheezy                 7.x.x
    Debian         Jessie                 8.x.x
    Ubuntu         The Precise Pangolin   12.04 LTS
    Ubuntu         The Trusty Tahr        14.04

    If you would like to help us improve this list, you can give us access to a machine whose distribution is not listed above, or send us the (...)

  • Organising by category

    17 May 2013, by

    In MediaSPIP, a section has two names: category and section (rubrique).
    The various documents stored in MediaSPIP can be filed under different categories. You can create a category by clicking on "publier une catégorie" in the publish menu at the top right (after logging in). A category can itself be placed inside another category, so you can build a whole tree of categories.
    The next time you publish a document, the newly created category will be offered (...)

  • Farm deployment support

    12 April 2011, by

    MediaSPIP can be installed as a farm, with a single "core" hosted on a dedicated server and used by a multitude of different sites.
    This makes it possible, for example: to share setup costs between several projects or individuals; to deploy a large number of unique sites quickly; and to avoid dumping every creation into a digital catch-all, as happens on the big general-purpose platforms scattered across the (...)

On other sites (7749)

  • ffmpeg text relocations issues on

    26 January 2021, by Amazing Thing

    I have been working on a little app that uses the FFmpeg Android library to convert some video files. However, the app crashes on any device running API level >= 23. I read that this can be fixed either by downgrading the SDK to 22 or by using --disable-asm (which makes it very slow). I want to try the second option, but I could not find good documentation on how to do this on Android. Here is my cmd:

    // Concatenated ffmpeg arguments; note the space before "-i watermark.jpg"
    // so the input file name and the next flag don't run together.
    String cmd = "--disable-asm -i " + videoName + " -i watermark.jpg -filter_complex " + overlay + " -vcodec libx264 -crf 28 -preset ultrafast -c:a copy " + "repostvideo.mp4";

    Unfortunately this is not working. So my question is: how or where would I put --disable-asm in my cmd in order to make it work?

    Thanks.

    Edit 1: Logcat errors

    CANNOT LINK EXECUTABLE "/data/user/0/xxxx" : "/data/data/xxxx" has text relocations.

    Version:

    implementation 'com.writingminds:FFmpegAndroid:0.3.2'

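    For context, --disable-asm is not a runtime option of the ffmpeg binary at all; it is a flag for FFmpeg's ./configure script, so it only takes effect when the native library itself is (re)built. Since com.writingminds:FFmpegAndroid ships prebuilt binaries, the flag would have to go into a custom build, roughly like this (a sketch only; the source path and the cross-compilation setup are assumptions, not a complete Android build recipe):

    ```shell
    # Sketch: --disable-asm belongs to FFmpeg's build configuration,
    # not to the command string passed at runtime.
    cd ffmpeg                 # hypothetical path to an FFmpeg source checkout
    ./configure \
      --target-os=android \
      --enable-cross-compile \
      --enable-pic \
      --disable-asm
    make -j4
    ```

    The runtime command then stays a plain `-i ... -filter_complex ...` invocation, with no --disable-asm in it.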
  • ffmpeg image sequence from text file not rendering correctly

    4 July 2018, by Rich

    I’ve been working with ffmpeg and have got it to work perfectly quite a few times, but recently I’ve been running into an issue where not all the images from my text file are used.

    My text file looks like this:

    ffconcat version 1.0
    file 'gallery-house-77-west-55th-street-01.jpg'
    duration 4.44
    file 'img03.jpg'
    duration 4.44
    file 'inside.png'
    duration 4.44
    file 'placeholder.png'

    (the number of files, file names and extensions will always vary)
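    Incidentally, a list in this shape is easy to generate rather than hand-write. A minimal shell sketch (file names are just the ones from the example; a commonly documented quirk of the concat demuxer is that the duration of the last entry is only honoured if that file is listed a second time):

    ```shell
    # Write an ffconcat list file with a fixed per-image duration.
    duration=4.44
    : > paths.txt
    printf 'ffconcat version 1.0\n' >> paths.txt
    for f in gallery-house-77-west-55th-street-01.jpg img03.jpg inside.png placeholder.png; do
      printf "file '%s'\nduration %s\n" "$f" "$duration" >> paths.txt
    done
    # Repeat the last file so its duration is honoured by the demuxer.
    printf "file 'placeholder.png'\n" >> paths.txt
    ```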

    I’ve tried several combinations, but as of now my ffmpeg command looks like this:

    ffmpeg -i audio.mp3 -safe 0 -f concat -i paths.txt -c:a copy -c:v libx264 \
    -vf "scale=1280:720:force_original_aspect_ratio=decrease,pad=1280:720:(ow-iw)/2:(oh-ih)/2,fps=25,format=yuv420p" -vsync vfr -movflags +faststart -y output.mp4 2>&1

    When I try to run it, the video seems to convert but shows various weird behaviors: sometimes only the middle file (image) in the list is shown, or only the first two are shown/rendered.

    The audio is playing correctly though.

    I’ve tried setting the framerate for the input and I also tried setting it for the output but it leads to more strange behavior.

    This is what is echoed when running my above ffmpeg command:

    libavutil      56. 15.100 / 56. 15.100
     libavcodec     58. 19.100 / 58. 19.100
     libavformat    58. 13.100 / 58. 13.100
     libavdevice    58.  4.100 / 58.  4.100
     libavfilter     7. 18.100 /  7. 18.100
     libswscale      5.  2.100 /  5.  2.100
     libswresample   3.  2.100 /  3.  2.100
     libpostproc    55.  2.100 / 55.  2.100
    [mp3 @ 0xb46780] Estimating duration from bitrate, this may be inaccurate
    Input #0, mp3, from 'audio.mp3':
     Metadata:
       encoder         : Lavf57.71.100
     Duration: 00:00:19.98, start: 0.000000, bitrate: 48 kb/s
       Stream #0:0: Audio: mp3, 22050 Hz, mono, fltp, 48 kb/s
    Input #1, concat, from 'paths.txt':
     Duration: N/A, start: 0.000000, bitrate: N/A
       Stream #1:0: Video: mjpeg, yuvj444p(pc, bt470bg/unknown/unknown), 754x424 [SAR 72:72 DAR 377:212], 25 tbr, 25 tbn, 25 tbc
    Stream mapping:
     Stream #1:0 -> #0:0 (mjpeg (native) -> h264 (libx264))
     Stream #0:0 -> #0:1 (copy)
    Press [q] to stop, [?] for help
    [swscaler @ 0xba25c0] deprecated pixel format used, make sure you did set range correctly
    [libx264 @ 0xb710c0] using SAR=3393/3392
    [libx264 @ 0xb710c0] using cpu capabilities: none!
    [libx264 @ 0xb710c0] profile High, level 3.1
    [libx264 @ 0xb710c0] 264 - core 155 r2901 7d0ff22 - H.264/MPEG-4 AVC codec - Copyleft 2003-2018 - http://www.videolan.org/x264.html - options: cabac=1 ref=3 deblock=1:0:0 analyse=0x3:0x113 me=hex subme=7 psy=1 psy_rd=1.00:0.00 mixed_ref=1 me_range=16 chroma_me=1 trellis=1 8x8dct=1 cqm=0 deadzone=21,11 fast_pskip=1 chroma_qp_offset=-2 threads=22 lookahead_threads=3 sliced_threads=0 nr=0 decimate=1 interlaced=0 bluray_compat=0 constrained_intra=0 bframes=3 b_pyramid=2 b_adapt=1 b_bias=0 direct=1 weightb=1 open_gop=0 weightp=2 keyint=250 keyint_min=25 scenecut=40 intra_refresh=0 rc_lookahead=40 rc=crf mbtree=1 crf=23.0 qcomp=0.60 qpmin=0 qpmax=69 qpstep=4 ip_ratio=1.40 aq=1:1.00
    Output #0, mp4, to 'output.mp4':
     Metadata:
       encoder         : Lavf58.13.100
       Stream #0:0: Video: h264 (libx264) (avc1 / 0x31637661), yuv420p, 1280x720 [SAR 3393:3392 DAR 377:212], q=-1--1, 25 fps, 12800 tbn, 25 tbc
       Metadata:
         encoder         : Lavc58.19.100 libx264
       Side data:
         cpb: bitrate max/min/avg: 0/0/0 buffer size: 0 vbv_delay: -1
       Stream #0:1: Audio: mp3 (mp4a / 0x6134706D), 22050 Hz, mono, fltp, 48 kb/s
    [swscaler @ 0xb93140] deprecated pixel format used, make sure you did set range correctly
    [swscaler @ 0xb93140] Warning: data is not aligned! This can lead to a speed loss
    [mjpeg @ 0xb6e200] mjpeg: unsupported coding type (cd)
    [mjpeg @ 0xb6e200] mjpeg: unsupported coding type (c8)
    [mjpeg @ 0xb6e200] Found EOI before any SOF, ignoring
    [mjpeg @ 0xb6e200] mjpeg: unsupported coding type (c7)
    Error while decoding stream #1:0: Invalid data found when processing input
    [mjpeg @ 0xb6e200] invalid id 94
    Error while decoding stream #1:0: Invalid data found when processing input
    [mp4 @ 0xb6f8c0] Starting second pass: moving the moov atom to the beginning of the file
    frame=    1 fps=0.0 q=28.0 Lsize=     179kB time=00:00:19.95 bitrate=  73.6kbits/s speed=44.1x    
    video:58kB audio:117kB subtitle:0kB other streams:0kB global headers:0kB muxing overhead: 2.446521%
    [libx264 @ 0xb710c0] frame I:1     Avg QP:28.86  size: 58683
    [libx264 @ 0xb710c0] mb I  I16..4: 21.2% 59.8% 19.0%
    [libx264 @ 0xb710c0] 8x8 transform intra:59.8%
    [libx264 @ 0xb710c0] coded y,uvDC,uvAC intra: 50.5% 65.2% 31.6%
    [libx264 @ 0xb710c0] i16 v,h,dc,p: 10% 60%  3% 28%
    [libx264 @ 0xb710c0] i8 v,h,dc,ddl,ddr,vr,hd,vl,hu: 20% 27% 26%  4%  4%  4%  6%  4%  7%
    [libx264 @ 0xb710c0] i4 v,h,dc,ddl,ddr,vr,hd,vl,hu: 20% 33% 10%  5%  6%  5%  8%  5%  7%
    [libx264 @ 0xb710c0] i8c dc,h,v,p: 47% 34% 11%  8%
    [libx264 @ 0xb710c0] kb/s:11736.60

    Any ideas what's causing the image sequence to fail?

    EDIT

    After some more debugging I think it may be related to the fact that the files don't all have the same extension. How can I make it so that it doesn't have to be that way, or is that already a default setting and the issue is elsewhere?
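    If mixed extensions/codecs do turn out to be the problem, one common workaround is to normalize every image to the same size and pixel format before concatenation, so the concat demuxer sees a homogeneous stream. A sketch that only writes the per-image conversion commands to a script for inspection (file names are the ones from the list above; run the script once the commands look right):

    ```shell
    # Generate one ffmpeg normalization command per source image:
    # uniform 1280x720, padded, yuv420p PNG copies named norm_*.png.
    : > normalize.sh
    for f in gallery-house-77-west-55th-street-01.jpg img03.jpg inside.png placeholder.png; do
      out="norm_${f%.*}.png"
      printf 'ffmpeg -y -i "%s" -vf "scale=1280:720:force_original_aspect_ratio=decrease,pad=1280:720:(ow-iw)/2:(oh-ih)/2,format=yuv420p" "%s"\n' "$f" "$out" >> normalize.sh
    done
    # Inspect, then run with: sh normalize.sh
    # and point paths.txt at the norm_*.png files.
    ```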

  • How to save a video with a text widget on top of it that changes every couple of seconds in Flutter?

    23 September 2023, by abdallah mostafa

    I've been working on an auto-subtitle tool for videos, but I don't know how to save the final video.
    Should I record the video, or take screenshots of all the frames and combine them into a video?

    I've used FFmpegKit, but it's hard to control the position of the text:

    Future<void> saveSubtitle(
        {required double leftPosition,
        required double topPosition,
        required double opacityOfBackground,
        required String backgroundColor,
        required String subtitleColor}) async {
      emit(ExportSubtitleLoading());

      String fontDirectoryPath =
          await _exportSubtitle.writeFontToFile('assets/fonts/arial.ttf');
      if (backgroundColor == 'Transparent') {
        opacityOfBackground = 0.0;
        backgroundColor = 'black';
      }
      String subtitleFilter = "";
      for (var subtitle in subtitles!.fotmatedSubtitle!) {
        double startTime = _exportSubtitle.timeToSeconds(subtitle.interval![0]);
        double endTime = _exportSubtitle.timeToSeconds(subtitle.interval![1]);
        String text = subtitle.displayText!.replaceComma;
        int fontSize = controller!.value.aspectRatio > 0.5625 ? 24 * 3 : 24;
        if (countWords(text) > 9 && countWords(text) <= 15) {
          // Add line breaks ("\n") to the text
          text = _exportSubtitle.addLineBreaks(
            text,
          );
        } else {
          text = _exportSubtitle.addLineBreaks(text, true);
        }
        final centeredNumber = text.split('\n');
        // centeredNumber[2].split(' ').logger;
        // return;
        for (var i = 0; i < centeredNumber.length; i++) {
          if (i == 0) {
            if (centeredNumber.length > 1 &&
                centeredNumber[i].split(' ').join().length >
                    centeredNumber[i + 1].split(' ').join().length) {
              subtitleFilter +=
                  " drawtext=text='${centeredNumber[i]}':enable='between(t,$startTime,$endTime)':x=$leftPosition-30:y=$topPosition:fontsize=$fontSize:fontcolor=$subtitleColor:fontfile=$fontDirectoryPath:box=1:boxcolor=$backgroundColor@$opacityOfBackground,";
            } else {
              subtitleFilter +=
                  " drawtext=text='${centeredNumber[i]}':enable='between(t,$startTime,$endTime)':x=$leftPosition+20:y=$topPosition:fontsize=$fontSize:fontcolor=$subtitleColor:fontfile=$fontDirectoryPath:box=1:boxcolor=$backgroundColor@$opacityOfBackground,";
            }
          } else if (i == 1) {
            subtitleFilter +=
                " drawtext=text='${centeredNumber[i]}':enable='between(t,$startTime,$endTime)':x=$leftPosition:y=$topPosition+25:fontsize=$fontSize:fontcolor=$subtitleColor:fontfile=$fontDirectoryPath:box=1:boxcolor=$backgroundColor@$opacityOfBackground,";
          } else {
            if (centeredNumber.length > 1 &&
                centeredNumber[i - 1].split(' ').join().length >
                    centeredNumber[i].split(' ').join().length) {
              subtitleFilter +=
                  " drawtext=text='${centeredNumber[i]}':enable='between(t,$startTime,$endTime)':x=$leftPosition+text_w/16:y=$topPosition+50:fontsize=$fontSize:fontcolor=$subtitleColor:fontfile=$fontDirectoryPath:box=1:boxcolor=$backgroundColor@$opacityOfBackground,";
            } else {
              subtitleFilter +=
                  " drawtext=text='${centeredNumber[i]}':enable='between(t,$startTime,$endTime)':x=$leftPosition-text_w/16:y=$topPosition+50:fontsize=$fontSize:fontcolor=$subtitleColor:fontfile=$fontDirectoryPath:box=1:boxcolor=$backgroundColor@$opacityOfBackground,";
            }
          }
        }

        // subtitleFilter +=
        //     " drawtext=text='$text':enable='between(t,$startTime,$endTime)':x=$leftPosition:y=$topPosition:fontsize=24:fontcolor=$subtitleColor:fontfile=$fontDirectoryPath:box=1:boxcolor=$backgroundColor@$opacityOfBackground,";
      }

      final finalFilter = "\"$subtitleFilter\"";
      // final finalFilter =
      //     "\"$subtitleFilter split[s1][s2];[s1]crop=w=576:h=1024,scale=576:1024[p];[s2][p]overlay=x=(main_w-overlay_w)/2:y=(main_h-overlay_h)/2[v]\"";
      final dir = await getTemporaryDirectory();
      String outputPath = '${dir.path}/ex_vid.mp4';
      final arguments = [
        '-y',
        '-i',
        inputFile,
        '-vf',
        finalFilter,
        '-c:v',
        'libx264',
        '-c:a',
        'copy',
        outputPath
      ];
      arguments.join(' ').logger;
      // return;
      // String command =
      //     "-y -i $inputFile -vf \" drawtext=text='You know those cat are memes that everybody uses\nin their videos and the TV movie clips that people use.':enable='between(t,0,4.000)':x=(w-text_w)/2:y=(h-text_h)/2:fontsize=24:fontcolor=white:fontfile=$fontDirectoryPath:box=1:boxcolor=$backgroundColor@0.5, drawtext=text='Well, who are the four best free\nwebsites to find a move?':enable='between(t,4.000,6.240)':x=(w-text_w)/2:y=(h-text_h)/2+30:fontsize=24:fontcolor=white:fontfile=$fontDirectoryPath:box=1:boxcolor=$backgroundColor@0.5, split[s1][s2];[s1]crop=w=576:h=1024,scale=576:1024[p];[s2][p]overlay=x=(main_w-overlay_w)/2:y=(main_h-overlay_h)/2[v]\" -c:v libx264 -c:a copy $outputPath";

      String command =
          "-y -i $inputFile -vf \" drawtext=text='You know those cat are memes that everybody uses\nin their videos and the TV movie clips that people use.':enable='between(t,0,4.000)':x=(w-text_w)/2:y=(h-text_h)/2:fontsize=24:fontcolor=white:fontfile=$fontDirectoryPath:box=1:boxcolor=$backgroundColor@0.5, drawtext=text='Well, who are the four best free\nwebsites to find a move?':enable='between(t,4.000,6.240)':x=$leftPosition:y=$topPosition:fontsize=24:fontcolor=white:fontfile=$fontDirectoryPath:box=1:boxcolor=$backgroundColor@0.5 \" -c:v libx264 -c:a copy $outputPath";

      '================='.logger;
      // FFmpegKitConfig.enableUTF8Charset();
      command.logger;
      await FFmpegKit.execute(arguments.join(' ')).then((session) async {
        final returnCode = await session.getReturnCode();

        if (ReturnCode.isSuccess(returnCode)) {
          ('The Converstion is Success').logger;
          final path = await _exportSubtitle.exportFile(File(outputPath));
          emit(ExportSubtitleSuccess(path));
        } else if (ReturnCode.isCancel(returnCode)) {
          // CANCEL
          ('The Converstion is Cancelled').logger;
        } else {
          emit(ExportSubtitleerror());
          ('The Converstion Have an Error').logger;
        }
      });
    }

    This function, saveSubtitle, is responsible for applying subtitles to a video using FFmpeg. Here's a breakdown of what it does:

    It starts by emitting an event to indicate that the subtitle export process is loading.

    It obtains the file path of a font (arial.ttf) from assets and stores it in fontDirectoryPath.

    It checks if the background color for subtitles is set to "Transparent." If so, it sets the opacityOfBackground to 0.0 and changes the backgroundColor to black.

    It initializes an empty subtitleFilter string, which will store FFmpeg filter commands for each subtitle.

    It iterates through the subtitles and calculates the start and end time, text, and font size for each subtitle.

    For each subtitle, it calculates the position (x and y coordinates) based on the leftPosition and topPosition. It also sets the font color, font file path, and background color with opacity for the subtitle.

    It appends the FFmpeg drawtext filter command for each subtitle to the subtitleFilter string.

    After processing all subtitles, it wraps the subtitleFilter string in double quotes and prepares to use it as an argument for the FFmpeg command.

    It specifies the output path for the video with subtitles.

    It constructs the FFmpeg command using various arguments, including the input video file, the subtitle filter, video and audio codecs, and the output path.

    It executes the FFmpeg command using FFmpegKit and waits for the conversion process to complete.

    Once the conversion is finished, it checks the return code to determine if it was successful. If successful, it emits a success event with the path to the exported video with subtitles. If canceled or if an error occurred, it emits corresponding events to indicate the status.

    In summary, this function is used to add subtitles to a video by overlaying text on specific positions and with specified styles. It utilizes FFmpeg for video processing and emits events to notify the application about the export status.
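    The drawtext accumulation at the heart of the loop can be illustrated with a stripped-down, stand-alone sketch (cue texts, times and coordinates here are invented; the real function derives them per subtitle):

    ```shell
    # Accumulate one drawtext clause per subtitle cue, each gated by
    # enable='between(t,start,end)', then strip the trailing comma.
    filter=""
    add_cue() { # args: text start end x y
      filter="$filter drawtext=text='$1':enable='between(t,$2,$3)':x=$4:y=$5:fontsize=24:fontcolor=white:box=1:boxcolor=black@0.5,"
    }
    add_cue "first subtitle line"  0    4.0  100 200
    add_cue "second subtitle line" 4.0  6.24 100 225
    filter="${filter%,}"   # ffmpeg rejects a dangling comma in -vf
    ```

    The resulting string is what would be handed to ffmpeg as the -vf value, matching the shape of the subtitleFilter string built in the Dart code.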
