
Media (1)
-
La conservation du net art au musée. Les stratégies à l'œuvre
26 May 2011
Updated: July 2013
Language: French
Type: Text
Other articles (90)
-
List of compatible distributions
26 April 2011
The table below lists the Linux distributions compatible with MediaSPIP's automatic installation script.

Distribution   Version name           Version number
Debian         Squeeze                6.x.x
Debian         Wheezy                 7.x.x
Debian         Jessie                 8.x.x
Ubuntu         The Precise Pangolin   12.04 LTS
Ubuntu         The Trusty Tahr        14.04

If you would like to help us improve this list, you can give us access to a machine running a distribution not listed above, or send us the (...)
-
Organize by category
17 May 2013
In MediaSPIP, a section has two names: category ("catégorie") and section ("rubrique").
The documents stored in MediaSPIP can be filed under different categories. You can create a category by clicking "publish a category" in the "publish" menu at the top right (after logging in). A category can itself be filed under another category, so you can build a tree of categories.
When you next publish a document, the newly created category will be offered (...)
-
Possible farm deployment
12 April 2011
MediaSPIP can be installed as a farm, with a single "core" hosted on a dedicated server and used by many different sites.
This makes it possible, for example: to share the setup costs among several projects or individuals; to deploy a large number of unique sites quickly; and to avoid dumping every creation into a catch-all like the big general-purpose platforms scattered across the (...)
On other sites (7749)
-
ffmpeg text relocations issues on Android
26 January 2021, by Amazing Thing
I have been working on a small app that uses the FFMPEG Android library to convert some video files. However, the app crashes on any device running API level 23 or higher. I read that this can be fixed either by downgrading the SDK to 22 or by using --disable-asm (which makes it very slow). I want to try the second option, but I could not find good documentation on how to do this on Android. Here is my command:


// note: spaces are needed between the concatenated arguments
String cmd = "--disable-asm -i " + videoName + " -i watermark.jpg -filter_complex " + overlay + " -vcodec libx264 -crf 28 -preset ultrafast -c:a copy " + "repostvideo.mp4";




Unfortunately, this does not work. So my question is: how or where should I put --disable-asm in my command to make it work?


Thanks.



Edit 1: Logcat errors





CANNOT LINK EXECUTABLE "/data/user/0/xxxx" : "/data/data/xxxx" has text relocations.





Version:





implementation 'com.writingminds:FFmpegAndroid:0.3.2'
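For context, --disable-asm is not a runtime option that can go in the command string; it is a flag for FFmpeg's ./configure script, so it only takes effect when the native library is rebuilt from source. A minimal build sketch under that assumption (the cross-compile and toolchain flags are placeholders that depend on your NDK setup):

```shell
# Run inside the FFmpeg source tree.
# --disable-asm drops hand-written assembly (slower, but avoids the
# text relocations that API level >= 23 rejects); --enable-pic emits
# position-independent code. The cross-compile flags below are
# placeholders; a real Android build also needs --cross-prefix and
# sysroot options matching your NDK.
./configure --disable-asm --enable-pic --enable-cross-compile --target-os=android
make -j4
```

A prebuilt binary such as the one bundled in com.writingminds:FFmpegAndroid cannot be changed this way; the library itself would have to ship a binary built with these flags.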




-
ffmpeg image sequence from text file not rendering correctly
4 July 2018, by Rich
I've been working with ffmpeg and have gotten it to work perfectly quite a few times, but recently I've been running into an issue where not all the images from my text file are used.
My text file looks like:
ffconcat version 1.0
file 'gallery-house-77-west-55th-street-01.jpg'
duration 4.44
file 'img03.jpg'
duration 4.44
file 'inside.png'
duration 4.44
file 'placeholder.png'
(The number of files, file names, and extensions will always vary.)
I've tried several combinations, but as of now my ffmpeg command looks like:
ffmpeg -i audio.mp3 -safe 0 -f concat -i paths.txt -c:a copy -c:v libx264 \
-vf "scale=1280:720:force_original_aspect_ratio=decrease,pad=1280:720:(ow-iw)/2:(oh-ih)/2,fps=25,format=yuv420p" -vsync vfr -movflags +faststart -y output.mp4 2>&1
When I run it, the video seems to convert, but with various odd behaviors: sometimes only the middle image in the list is shown, or only the first two are rendered.
The audio is playing correctly though.
I’ve tried setting the framerate for the input and I also tried setting it for the output but it leads to more strange behavior.
This is the output when running the above ffmpeg command:
libavutil 56. 15.100 / 56. 15.100
libavcodec 58. 19.100 / 58. 19.100
libavformat 58. 13.100 / 58. 13.100
libavdevice 58. 4.100 / 58. 4.100
libavfilter 7. 18.100 / 7. 18.100
libswscale 5. 2.100 / 5. 2.100
libswresample 3. 2.100 / 3. 2.100
libpostproc 55. 2.100 / 55. 2.100
[mp3 @ 0xb46780] Estimating duration from bitrate, this may be inaccurate
Input #0, mp3, from 'audio.mp3':
Metadata:
encoder : Lavf57.71.100
Duration: 00:00:19.98, start: 0.000000, bitrate: 48 kb/s
Stream #0:0: Audio: mp3, 22050 Hz, mono, fltp, 48 kb/s
Input #1, concat, from 'paths.txt':
Duration: N/A, start: 0.000000, bitrate: N/A
Stream #1:0: Video: mjpeg, yuvj444p(pc, bt470bg/unknown/unknown), 754x424 [SAR 72:72 DAR 377:212], 25 tbr, 25 tbn, 25 tbc
Stream mapping:
Stream #1:0 -> #0:0 (mjpeg (native) -> h264 (libx264))
Stream #0:0 -> #0:1 (copy)
Press [q] to stop, [?] for help
[swscaler @ 0xba25c0] deprecated pixel format used, make sure you did set range correctly
[libx264 @ 0xb710c0] using SAR=3393/3392
[libx264 @ 0xb710c0] using cpu capabilities: none!
[libx264 @ 0xb710c0] profile High, level 3.1
[libx264 @ 0xb710c0] 264 - core 155 r2901 7d0ff22 - H.264/MPEG-4 AVC codec - Copyleft 2003-2018 - http://www.videolan.org/x264.html - options: cabac=1 ref=3 deblock=1:0:0 analyse=0x3:0x113 me=hex subme=7 psy=1 psy_rd=1.00:0.00 mixed_ref=1 me_range=16 chroma_me=1 trellis=1 8x8dct=1 cqm=0 deadzone=21,11 fast_pskip=1 chroma_qp_offset=-2 threads=22 lookahead_threads=3 sliced_threads=0 nr=0 decimate=1 interlaced=0 bluray_compat=0 constrained_intra=0 bframes=3 b_pyramid=2 b_adapt=1 b_bias=0 direct=1 weightb=1 open_gop=0 weightp=2 keyint=250 keyint_min=25 scenecut=40 intra_refresh=0 rc_lookahead=40 rc=crf mbtree=1 crf=23.0 qcomp=0.60 qpmin=0 qpmax=69 qpstep=4 ip_ratio=1.40 aq=1:1.00
Output #0, mp4, to 'output.mp4':
Metadata:
encoder : Lavf58.13.100
Stream #0:0: Video: h264 (libx264) (avc1 / 0x31637661), yuv420p, 1280x720 [SAR 3393:3392 DAR 377:212], q=-1--1, 25 fps, 12800 tbn, 25 tbc
Metadata:
encoder : Lavc58.19.100 libx264
Side data:
cpb: bitrate max/min/avg: 0/0/0 buffer size: 0 vbv_delay: -1
Stream #0:1: Audio: mp3 (mp4a / 0x6134706D), 22050 Hz, mono, fltp, 48 kb/s
[swscaler @ 0xb93140] deprecated pixel format used, make sure you did set range correctly
[swscaler @ 0xb93140] Warning: data is not aligned! This can lead to a speed loss
[mjpeg @ 0xb6e200] mjpeg: unsupported coding type (cd)
[mjpeg @ 0xb6e200] mjpeg: unsupported coding type (c8)
[mjpeg @ 0xb6e200] Found EOI before any SOF, ignoring
[mjpeg @ 0xb6e200] mjpeg: unsupported coding type (c7)
Error while decoding stream #1:0: Invalid data found when processing input
[mjpeg @ 0xb6e200] invalid id 94
Error while decoding stream #1:0: Invalid data found when processing input
[mp4 @ 0xb6f8c0] Starting second pass: moving the moov atom to the beginning of the file
frame= 1 fps=0.0 q=28.0 Lsize= 179kB time=00:00:19.95 bitrate= 73.6kbits/s speed=44.1x
video:58kB audio:117kB subtitle:0kB other streams:0kB global headers:0kB muxing overhead: 2.446521%
[libx264 @ 0xb710c0] frame I:1 Avg QP:28.86 size: 58683
[libx264 @ 0xb710c0] mb I I16..4: 21.2% 59.8% 19.0%
[libx264 @ 0xb710c0] 8x8 transform intra:59.8%
[libx264 @ 0xb710c0] coded y,uvDC,uvAC intra: 50.5% 65.2% 31.6%
[libx264 @ 0xb710c0] i16 v,h,dc,p: 10% 60% 3% 28%
[libx264 @ 0xb710c0] i8 v,h,dc,ddl,ddr,vr,hd,vl,hu: 20% 27% 26% 4% 4% 4% 6% 4% 7%
[libx264 @ 0xb710c0] i4 v,h,dc,ddl,ddr,vr,hd,vl,hu: 20% 33% 10% 5% 6% 5% 8% 5% 7%
[libx264 @ 0xb710c0] i8c dc,h,v,p: 47% 34% 11% 8%
[libx264 @ 0xb710c0] kb/s:11736.60
Any ideas what's causing the image sequence to fail?
EDIT
After some more debugging, I think it may be related to the fact that the files don't all have the same extension. How can I make it work without that requirement, or is that already handled by default and the issue lies elsewhere?
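A likely cause: the concat demuxer treats the whole list as one stream and selects a single decoder from the first entry (mjpeg here), so the PNG entries fail to decode, which matches the "unsupported coding type" and "Found EOI before any SOF" errors in the log. One workaround, sketched under that assumption, is to re-encode the PNGs as JPEGs and point the list file at the copies (the file names come from the question; the sed rewrite is illustrative):

```shell
# Recreate the question's list file (truncated entries omitted):
cat > paths.txt <<'EOF'
ffconcat version 1.0
file 'gallery-house-77-west-55th-street-01.jpg'
duration 4.44
file 'inside.png'
duration 4.44
file 'placeholder.png'
EOF

# Re-encode each PNG as a JPEG so every list entry uses one codec
# (needs the actual image files, so shown as a comment):
#   for f in inside.png placeholder.png; do ffmpeg -y -i "$f" "${f%.png}.jpg"; done

# Rewrite the list file to point at the .jpg copies:
sed "s/\.png'/.jpg'/g" paths.txt > paths_fixed.txt
```

The re-encoded images should also share the same pixel dimensions, or the scale/pad filter from the question is needed to normalize them.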
-
How to save a video with a text widget on top of it that changes every couple of seconds in Flutter?
23 September 2023, by abdallah mostafa
I've been working on an auto-subtitle tool for videos, but I could not figure out how to save the final video.
Should I record the video, or take screenshots of all the frames and combine them into a video?


I've used FFmpegKit, but it's very hard to get the position of the text right.


Future<void> saveSubtitle(
 {required double leftPosition,
 required double topPosition,
 required double opacityOfBackground,
 required String backgroundColor,
 required String subtitleColor}) async {
 emit(ExportSubtitleLoading());

 String fontDirectoryPath =
 await _exportSubtitle.writeFontToFile('assets/fonts/arial.ttf');
 if (backgroundColor == 'Transparent') {
 opacityOfBackground = 0.0;
 backgroundColor = 'black';
 }
 String subtitleFilter = "";
 for (var subtitle in subtitles!.fotmatedSubtitle!) {
 double startTime = _exportSubtitle.timeToSeconds(subtitle.interval![0]);
 double endTime = _exportSubtitle.timeToSeconds(subtitle.interval![1]);
 String text = subtitle.displayText!.replaceComma;
 int fontSize = controller!.value.aspectRatio > 0.5625 ? 24 * 3 : 24;
 if (countWords(text) > 9 && countWords(text) <= 15) {
 // Add line breaks ("\n") to the text
 text = _exportSubtitle.addLineBreaks(
 text,
 );
 } else {
 text = _exportSubtitle.addLineBreaks(text, true);
 }
 final centeredNumber = text.split('\n');
 // centeredNumber[2].split(' ').logger;
 // return;
 for (var i = 0; i < centeredNumber.length; i++) {
 if (i == 0) {
 if (centeredNumber.length > 1 &&
 centeredNumber[i].split(' ').join().length >
 centeredNumber[i + 1].split(' ').join().length) {
 subtitleFilter +=
 " drawtext=text='${centeredNumber[i]}':enable='between(t,$startTime,$endTime)':x=$leftPosition-30:y=$topPosition:fontsize=$fontSize:fontcolor=$subtitleColor:fontfile=$fontDirectoryPath:box=1:boxcolor=$backgroundColor@$opacityOfBackground,";
 } else {
 subtitleFilter +=
 " drawtext=text='${centeredNumber[i]}':enable='between(t,$startTime,$endTime)':x=$leftPosition+20:y=$topPosition:fontsize=$fontSize:fontcolor=$subtitleColor:fontfile=$fontDirectoryPath:box=1:boxcolor=$backgroundColor@$opacityOfBackground,";
 }
 } else if (i == 1) {
 subtitleFilter +=
 " drawtext=text='${centeredNumber[i]}':enable='between(t,$startTime,$endTime)':x=$leftPosition:y=$topPosition+25:fontsize=$fontSize:fontcolor=$subtitleColor:fontfile=$fontDirectoryPath:box=1:boxcolor=$backgroundColor@$opacityOfBackground,";
 } else {
 if (centeredNumber.length > 1 &&
 centeredNumber[i - 1].split(' ').join().length >
 centeredNumber[i].split(' ').join().length) {
 subtitleFilter +=
 " drawtext=text='${centeredNumber[i]}':enable='between(t,$startTime,$endTime)':x=$leftPosition+text_w/16:y=$topPosition+50:fontsize=$fontSize:fontcolor=$subtitleColor:fontfile=$fontDirectoryPath:box=1:boxcolor=$backgroundColor@$opacityOfBackground,";
 } else {
 subtitleFilter +=
 " drawtext=text='${centeredNumber[i]}':enable='between(t,$startTime,$endTime)':x=$leftPosition-text_w/16:y=$topPosition+50:fontsize=$fontSize:fontcolor=$subtitleColor:fontfile=$fontDirectoryPath:box=1:boxcolor=$backgroundColor@$opacityOfBackground,";
 }
 }
 }

 // subtitleFilter +=
 // " drawtext=text='$text':enable='between(t,$startTime,$endTime)':x=$leftPosition:y=$topPosition:fontsize=24:fontcolor=$subtitleColor:fontfile=$fontDirectoryPath:box=1:boxcolor=$backgroundColor@$opacityOfBackground,";
 }

 final finalFilter = "\"$subtitleFilter\"";
 // final finalFilter =
 // "\"$subtitleFilter split[s1][s2];[s1]crop=w=576:h=1024,scale=576:1024[p];[s2][p]overlay=x=(main_w-overlay_w)/2:y=(main_h-overlay_h)/2[v]\"";
 final dir = await getTemporaryDirectory();
 String outputPath = '${dir.path}/ex_vid.mp4';
 final arguments = [
 '-y',
 '-i',
 inputFile,
 '-vf',
 finalFilter,
 '-c:v',
 'libx264',
 '-c:a',
 'copy',
 outputPath
 ];
 arguments.join(' ').logger;
 // return;
 // String command =
 // "-y -i $inputFile -vf \" drawtext=text='You know those cat are memes that everybody uses\nin their videos and the TV movie clips that people use.':enable='between(t,0,4.000)':x=(w-text_w)/2:y=(h-text_h)/2:fontsize=24:fontcolor=white:fontfile=$fontDirectoryPath:box=1:boxcolor=$backgroundColor@0.5, drawtext=text='Well, who are the four best free\nwebsites to find a move?':enable='between(t,4.000,6.240)':x=(w-text_w)/2:y=(h-text_h)/2+30:fontsize=24:fontcolor=white:fontfile=$fontDirectoryPath:box=1:boxcolor=$backgroundColor@0.5, split[s1][s2];[s1]crop=w=576:h=1024,scale=576:1024[p];[s2][p]overlay=x=(main_w-overlay_w)/2:y=(main_h-overlay_h)/2[v]\" -c:v libx264 -c:a copy $outputPath";

 String command =
 "-y -i $inputFile -vf \" drawtext=text='You know those cat are memes that everybody uses\nin their videos and the TV movie clips that people use.':enable='between(t,0,4.000)':x=(w-text_w)/2:y=(h-text_h)/2:fontsize=24:fontcolor=white:fontfile=$fontDirectoryPath:box=1:boxcolor=$backgroundColor@0.5, drawtext=text='Well, who are the four best free\nwebsites to find a move?':enable='between(t,4.000,6.240)':x=$leftPosition:y=$topPosition:fontsize=24:fontcolor=white:fontfile=$fontDirectoryPath:box=1:boxcolor=$backgroundColor@0.5 \" -c:v libx264 -c:a copy $outputPath";

 '================='.logger;
 // FFmpegKitConfig.enableUTF8Charset();
 command.logger;
 await FFmpegKit.execute(arguments.join(' ')).then((session) async {
 final returnCode = await session.getReturnCode();

 if (ReturnCode.isSuccess(returnCode)) {
 ('The conversion succeeded').logger;
 final path = await _exportSubtitle.exportFile(File(outputPath));
 emit(ExportSubtitleSuccess(path));
 } else if (ReturnCode.isCancel(returnCode)) {
 // CANCEL
 ('The conversion was cancelled').logger;
 } else {
 emit(ExportSubtitleerror());
 ('The conversion had an error').logger;
 }
 });
 }


This function, saveSubtitle, is responsible for applying subtitles to a video using FFmpeg. Here's a breakdown of what it does:


It starts by emitting an event to indicate that the subtitle export process is loading.


It obtains the file path of a font (arial.ttf) from assets and stores it in fontDirectoryPath.


It checks if the background color for subtitles is set to "Transparent." If so, it sets the opacityOfBackground to 0.0 and changes the backgroundColor to black.


It initializes an empty subtitleFilter string, which will store FFmpeg filter commands for each subtitle.


It iterates through the subtitles and calculates the start and end time, text, and font size for each subtitle.


For each subtitle, it calculates the position (x and y coordinates) based on the leftPosition and topPosition. It also sets the font color, font file path, and background color with opacity for the subtitle.


It appends the FFmpeg drawtext filter command for each subtitle to the subtitleFilter string.


After processing all subtitles, it wraps the subtitleFilter string in double quotes and prepares to use it as an argument for the FFmpeg command.


It specifies the output path for the video with subtitles.


It constructs the FFmpeg command using various arguments, including the input video file, the subtitle filter, video and audio codecs, and the output path.


It executes the FFmpeg command using FFmpegKit and waits for the conversion process to complete.


Once the conversion is finished, it checks the return code to determine if it was successful. If successful, it emits a success event with the path to the exported video with subtitles. If canceled or if an error occurred, it emits corresponding events to indicate the status.


In summary, this function is used to add subtitles to a video by overlaying text on specific positions and with specified styles. It utilizes FFmpeg for video processing and emits events to notify the application about the export status.
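The per-subtitle filter the loop assembles boils down to ffmpeg's drawtext filter gated by enable='between(t,start,end)'. A stripped-down sketch of one such filter string (the timing, position, and file names here are placeholders, not the app's actual values):

```shell
# Placeholder values standing in for one subtitle's timing and position:
start=0; end=4.0; x=100; y=200

# One drawtext entry, gated so it is only drawn between $start and $end
# seconds; box=1 draws the semi-transparent background the Dart code
# configures via boxcolor=<color>@<opacity>:
filter="drawtext=text='Hello':enable='between(t,$start,$end)':x=$x:y=$y:fontsize=24:fontcolor=white:box=1:boxcolor=black@0.5"
echo "$filter"

# The full command the Dart code runs is then roughly (needs a real
# input video and font file, so shown as a comment):
#   ffmpeg -y -i input.mp4 -vf "$filter" -c:v libx264 -c:a copy out.mp4
```

Multiple subtitles chain as comma-separated drawtext entries in one -vf string, which is exactly what the subtitleFilter accumulator above builds.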