
Media (91)
-
MediaSPIP Simple: the future default graphic theme?
26 September 2013, by
Updated: October 2013
Language: French
Type: Video
-
with chosen
13 September 2013, by
Updated: September 2013
Language: French
Type: Image
-
without chosen
13 September 2013, by
Updated: September 2013
Language: French
Type: Image
-
chosen configuration
13 September 2013, by
Updated: September 2013
Language: French
Type: Image
-
SPIP - plugins - embed code - Example
2 September 2013, by
Updated: September 2013
Language: French
Type: Image
-
GetID3 - File information block
9 April 2013, by
Updated: May 2013
Language: French
Type: Image
Other articles (38)
-
MediaSPIP Core: Configuration
9 November 2010, by
MediaSPIP Core provides three different configuration pages by default (these pages rely on the CFG configuration plugin): a page for the general configuration of the template; a page for the configuration of the site's home page; a page for the configuration of the sectors.
It also provides an additional page, which only appears when certain plugins are enabled, for controlling their display and specific features (...)
-
Encoding and conversion to web-playable formats
10 April 2011
MediaSPIP converts and re-encodes uploaded documents so that they can be played on the web and used automatically, without any intervention from the content creator.
Videos are automatically encoded into the formats supported by HTML5: MP4, Ogv and WebM. The "MP4" version is also used by the Flash fallback player needed for older browsers.
Audio documents are likewise re-encoded into the two formats usable with HTML5: MP3 and Ogg. The "MP3" version (...)
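As an illustration of the kind of conversion involved (a sketch only; MediaSPIP drives its encoders internally, and its exact options are not documented here, so the file names and settings below are placeholders), the three HTML5 video targets can be produced with ffmpeg along these lines:
 ffmpeg -i source.avi -c:v libx264 -c:a aac video.mp4                  # MP4, also feeds the Flash fallback
 ffmpeg -i source.avi -c:v libtheora -q:v 6 -c:a libvorbis video.ogv   # Ogv (Theora/Vorbis)
 ffmpeg -i source.avi -c:v libvpx -b:v 1M -c:a libvorbis video.webm    # WebM (VP8/Vorbis)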
-
Support for all types of media
10 April 2011
Unlike many other modern document-sharing software and platforms, MediaSPIP aims to handle as many different document formats as possible, whether they are: images (png, gif, jpg, bmp and others...); audio (MP3, Ogg, Wav and others...); video (Avi, MP4, Ogv, mpg, mov, wmv and others...); or textual content, code and other formats (OpenOffice, Microsoft Office (spreadsheets, presentations), web (html, css), LaTeX, Google Earth) (...)
On other sites (4972)
-
Create a video preview image, like YouTube or WhatsApp, from a video file (.mp4) using Java code
27 November 2017, by Ravi Dua
Issue 1. When I use the FFmpeg Java API, the program doesn't run or print anything after grabber.start(). No preview is generated.
Code sample:
public static boolean generatePreviewImage(String filePath, String previewFileName) throws IOException {
    boolean isPreviewGenerated = false;
    System.out.println("Request received to generate thumbnail for video.");
    System.out.println("VideoFilePath : " + filePath);
    System.out.println("ResultFileName : " + previewFileName);
    try {
        FFmpegFrameGrabber fGrabber = new FFmpegFrameGrabber(filePath);
        System.out.println("FrameGrabber found " + fGrabber);
        fGrabber.start();
        System.out.println("Frame started..");
        ImageIO.write(fGrabber.grab().getBufferedImage(), "jpg", new File(previewFileName));
        System.out.println("Image written successfully as " + previewFileName);
        isPreviewGenerated = true;
        fGrabber.stop();
        System.out.println("FrameGrabber stopped.. " + fGrabber);
    } catch (Exception e) {
        System.out.println("Exception while creating video thumbnail : " + previewFileName + " - exception - " + e);
        e.printStackTrace();
    }
    System.out.println("Image written successfully? " + previewFileName);
    return isPreviewGenerated;
}

Result:
Request received to generate thumbnail for video.
VideoFilePath : /root/appdir/VIDEO20171124143855.mp4
ResultFileName : /root/appdir/vdthumb_0.jpg
FrameGrabber found org.bytedeco.javacv.FFmpegFrameGrabber@3529360e
Nothing further happens or gets printed after the above statement.
Additional information:
I have also installed FFmpeg on the Linux VPS and am able to generate a preview from the command line:
root@vps19984[ /usr/appdir]#ffmpeg -i /root/appdir/.VIDEO20171123165555.mp4 -r 1 -f image2 image-%2d.png
(The above ffmpeg command generates the preview successfully on the Linux box, but I want to generate it from the Java program.)
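For reference, a minimal sketch of the same thumbnail extraction written against the newer JavaCV API (assuming org.bytedeco.javacv is on the classpath; this is an illustrative variant, not the poster's code). grabImage() returns only video frames, and Java2DFrameConverter handles the Frame-to-BufferedImage conversion:
 import java.awt.image.BufferedImage;
 import java.io.File;
 import javax.imageio.ImageIO;
 import org.bytedeco.javacv.FFmpegFrameGrabber;
 import org.bytedeco.javacv.Frame;
 import org.bytedeco.javacv.Java2DFrameConverter;

 public static boolean writeThumbnail(String videoPath, String thumbPath) {
     FFmpegFrameGrabber grabber = new FFmpegFrameGrabber(videoPath);
     Java2DFrameConverter converter = new Java2DFrameConverter();
     try {
         grabber.start();
         Frame frame = grabber.grabImage();            // video frames only, skips audio packets
         BufferedImage image = converter.convert(frame);
         return image != null && ImageIO.write(image, "jpg", new File(thumbPath));
     } catch (Exception e) {
         e.printStackTrace();
         return false;
     } finally {
         try { grabber.stop(); } catch (Exception ignored) { }
     }
 }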
Issue 2. When I use the JCodec API, the program generates a black image rather than an image from the video file.
Code sample:
public static boolean generatePreviewImage(String filePath, String previewFileName) throws IOException, JCodecException {
    logger.info("Request received to generate thumbnail for video. VideoFilePath : " + filePath + ", resultFileName " + previewFileName);
    boolean isPreviewGenerated = false;
    Picture framePic = FrameGrab.getNativeFrame(new File(filePath), 20);
    logger.info("Frame grabbed successfully..");
    Transform transform = ColorUtil.getTransform(framePic.getColor(), ColorSpace.RGB);
    Picture rgb = Picture.create(framePic.getWidth(), framePic.getHeight(), ColorSpace.RGB);
    transform.transform(framePic, rgb);
    logger.info("Frame transformed successfully to RGB..");
    BufferedImage dst = new BufferedImage(rgb.getCroppedWidth(), rgb.getCroppedHeight(), BufferedImage.TYPE_INT_RGB);
    ImageIO.write(dst, "jpg", new File(previewFileName));
    isPreviewGenerated = true;
    logger.info("Is preview generated.." + isPreviewGenerated);
    return isPreviewGenerated;
}

Result:
Request received to generate thumbnail for video. VideoFilePath : /usr/appdir/VIDEO20171123165555.mp4, resultFileName /usr/appdir/vdthumb_0.jpg
Frame grabbed successfully..
Frame transformed successfully to RGB..
Is preview generated..true
Issue: A black jpg image of 5 KB gets generated by JCodec.
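The black output matches what the code does: dst is a freshly allocated BufferedImage that is written out without ever receiving the pixels of the transformed rgb picture. A hedged sketch of the missing step, assuming org.jcodec.scale.AWTUtil is available in the JCodec version in use:
 Picture framePic = FrameGrab.getNativeFrame(new File(filePath), 20);
 Transform transform = ColorUtil.getTransform(framePic.getColor(), ColorSpace.RGB);
 Picture rgb = Picture.create(framePic.getWidth(), framePic.getHeight(), ColorSpace.RGB);
 transform.transform(framePic, rgb);
 // Copy the RGB picture into an AWT image instead of writing an empty buffer
 BufferedImage dst = AWTUtil.toBufferedImage(rgb);
 ImageIO.write(dst, "jpg", new File(previewFileName));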
-
offloading to ffmpeg via named pipes in c#/dotnet core
1 April 2022, by bep
I tried to break this down to the basic elements, so I hope this is clear. I want to take in a network stream; it may be one-way, or it may be a protocol that requires two-way communication, such as RTMP during its handshake.


I want to pass that stream straight through to a spawned FFmpeg process. I then want to capture the output of FFmpeg; in this example I just want to pipe it out to a file. The file is not my end goal, but for simplicity, if I can get that far I think I'll be OK.




I want the code to be as plain as possible and offload the core processing to FFmpeg. If I ask FFmpeg to output a WebRTC stream, a file, whatever, I just want to capture that. FFmpeg shouldn't be used directly, only indirectly via IncomingConnectionHandler.

The only other component is OBS, which I am using to create the incoming RTMP stream.


As things stand now, running this results in the following error, which I'm a little unclear on. I don't feel like I'm causing concurrent reads at any point.


System.InvalidOperationException: Concurrent reads are not allowed
 at Medallion.Shell.Throw`1.If(Boolean condition, String message)
 at Medallion.Shell.Streams.Pipe.ReadAsync(Byte[] buffer, Int32 offset, Int32 count, TimeSpan timeout, CancellationToken cancellationToken)
 at Medallion.Shell.Streams.Pipe.PipeOutputStream.ReadAsync(Byte[] buffer, Int32 offset, Int32 count, CancellationToken cancellationToken)
 at System.IO.Stream.ReadAsync(Memory`1 buffer, CancellationToken cancellationToken)
 at System.IO.StreamReader.ReadBufferAsync(CancellationToken cancellationToken)
 at System.IO.StreamReader.ReadLineAsyncInternal()
 at Medallion.Shell.Streams.MergedLinesEnumerable.GetEnumeratorInternal()+MoveNext()
 at System.String.Join(String separator, IEnumerable`1 values)
 at VideoIngest.IncomingRtmpConnectionHandler.OnConnectedAsync(ConnectionContext connection) in Program.cs:line 55
 at Microsoft.AspNetCore.Server.Kestrel.Core.Internal.Infrastructure.KestrelConnection`1.ExecuteAsync()



Code :


namespace VideoIngest
{
 public class IncomingRtmpConnectionHandler : ConnectionHandler
 {
 private readonly ILogger<IncomingRtmpConnectionHandler> logger;

 public IncomingRtmpConnectionHandler(ILogger<IncomingRtmpConnectionHandler> logger)
 {
 this.logger = logger;
 }

 public override async Task OnConnectedAsync(ConnectionContext connection)
 {
 logger?.LogInformation("connection started");

 var outputFileName = @"C:\Temp\bunny.mp4";

 var rtmpPassthroughPipeName = Guid.NewGuid().ToString();
 var cmdPath = @"C:\Opt\ffmpeg\bin\ffmpeg.exe";
 var cmdArgs = $"-i pipe:{rtmpPassthroughPipeName} -preset slow -c copy -f mp4 -y pipe:1";

 var cancellationToken = connection.ConnectionClosed;
 var rtmpStream = connection.Transport;

 using (var outputStream = new FileStream(outputFileName, FileMode.Create))
 using (var cmd = Command.Run(cmdPath, options: o => { o.StartInfo(i => i.Arguments = cmdArgs); o.CancellationToken(cancellationToken); }))
 {
 // create a pipe to pass the RTMP data straight to FFMPEG. This code should be dumb to proto etc being used
 var ffmpegPassthroughStream = new NamedPipeServerStream(rtmpPassthroughPipeName, PipeDirection.InOut, 10, PipeTransmissionMode.Byte, System.IO.Pipes.PipeOptions.Asynchronous);

 // take the network stream and pass data to/from ffmpeg process
 var fromFfmpegTask = ffmpegPassthroughStream.CopyToAsync(rtmpStream.Output.AsStream(), cancellationToken);
 var toFfmpegTask = rtmpStream.Input.AsStream().CopyToAsync(ffmpegPassthroughStream, cancellationToken);

 // take the ffmpeg process output (not stdout) into target file
 var outputTask = cmd.StandardOutput.PipeToAsync(outputStream);

 while (!outputTask.IsCompleted && !outputTask.IsCanceled)
 {
 var errs = cmd.GetOutputAndErrorLines();
 logger.LogInformation(string.Join(Environment.NewLine, errs));

 await Task.Delay(1000);
 }

 CommandResult result = cmd.Result;

 if (result != null && result.Success)
 {
 logger.LogInformation("Created file");
 }
 else
 {
 logger.LogError(result.StandardError);
 }
 }

 logger?.LogInformation("connection closed");
 }
 }

 public class Startup
 {
 public void ConfigureServices(IServiceCollection services) { }

 public void Configure(IApplicationBuilder app, IWebHostEnvironment env)
 {
 app.Run(async (context) =>
 {
 var log = context.RequestServices.GetRequiredService>();
 await context.Response.WriteAsync("Hello World!");
 });
 }
 }

 public class Program
 {
 public static void Main(string[] args)
 {
 CreateHostBuilder(args).Build().Run();
 }

 public static IWebHostBuilder CreateHostBuilder(string[] args) =>
 WebHost
 .CreateDefaultBuilder(args)
 .ConfigureServices(services =>
 {
 services.AddLogging(options =>
 {
 options.AddDebug().AddConsole().SetMinimumLevel(LogLevel.Information);
 });
 })
 .UseKestrel(options =>
 {
 options.ListenAnyIP(15666, builder =>
 {
 builder.UseConnectionHandler<IncomingRtmpConnectionHandler>();
 });

 options.ListenLocalhost(5000);

 // HTTPS 5001
 options.ListenLocalhost(5001, builder =>
 {
 builder.UseHttps();
 });
 })
 .UseStartup<Startup>();
 }
 

}


Questions :


1. Is this a valid approach? Do you see any fundamental issues?
2. Is the pipe naming correct? Is the convention just pipe:someName?
3. Any ideas on what specifically may be causing the "Concurrent reads are not allowed" error?
4. If #3 is solved, does the rest of this seem valid?
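On question 3, the stack trace points at Medallion.Shell's MergedLinesEnumerable: cmd.GetOutputAndErrorLines() reads from standard output, which cmd.StandardOutput.PipeToAsync(outputStream) is already consuming, so the two readers collide. A hedged sketch of one way around it inside OnConnectedAsync (illustrative only, not verified against this setup): keep stdout for the piped MP4 data and log from standard error alone, replacing the polling loop:

 // stdout carries the muxed MP4; give it a single reader.
 var outputTask = cmd.StandardOutput.PipeToAsync(outputStream);

 // ffmpeg writes its progress and diagnostics to stderr, so log from there only.
 var logTask = Task.Run(async () =>
 {
     string line;
     while ((line = await cmd.StandardError.ReadLineAsync()) != null)
     {
         logger.LogInformation(line);
     }
 });

 await Task.WhenAll(outputTask, logTask);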










-
FFMPEG - adding a nullsrc causes my script to report "1000 duplicate frames"
14 June 2020, by rossmcm
I'm trying to add coloured rectangle highlights to a video, appearing at different locations and times. The highlights are on a 6x6 grid of 320x180 rectangles.



Originally I didn't have the nullsrc=size=1920x1080, thinking that it would start with an empty image, but it seems that this causes it to make assumptions about where the input is coming from. So I added the nullsrc=size=1920x1080 to start with a transparent 1920x1080 image, but this command reports a warning that 1000 duplicate frames have been produced, and it keeps going past the end of the input video with no sign of stopping.


ffmpeg -y \ 
 -i "Input.mp4" \ 
 -filter_complex \ 
 "nullsrc=size=1920x1080, drawbox=x=(3-1)*320:y=(3-1)*180:w=320:h=180:t=7:c=cyan, fade=in:st=10:d=1:alpha=1, fade=out:st=40:d=1:alpha=1[tmp1]; \
 nullsrc=size=1920x1080, drawbox=x=(4-1)*320:y=(4-1)*180:w=320:h=180:t=7:c=blue, fade=in:st=20:d=1:alpha=1, fade=out:st=50:d=1:alpha=1[tmp2]; \
 nullsrc=size=1920x1080, drawbox=x=(5-1)*320:y=(5-1)*180:w=320:h=180:t=7:c=green, fade=in:st=30:d=1:alpha=1, fade=out:st=60:d=1:alpha=1[tmp3]; \
 nullsrc=size=1920x1080, drawbox=x=(6-1)*320:y=(6-1)*180:w=320:h=180:t=7:c=yellow, fade=in:st=40:d=1:alpha=1, fade=out:st=70:d=1:alpha=1[tmp4]; \
 [tmp1][tmp2] overlay=0:0[ovr1]; \ 
 [tmp3][tmp4] overlay=0:0[ovr2]; \ 
 [ovr1][ovr2] overlay=0:0[boxes]; \ 
 [0:v][boxes] overlay=0:0" \ 
 "Output.mp4"




The input video is around 01:45 long. Log of run :



ffmpeg -y -loglevel verbose -i "Input.mp4" -filter_complex " nullsrc=size=1920x1080, drawbox=x=(3-1)*320:y=(3-1)*180:w=320:h=180:t=7:c=cyan, fade=in:st=10:d=1:alpha=1, fade=out:st=40:d=1:alpha=1[tmp1]; nullsrc=size=1920x1080, drawbox=x =(4-1)*320:y=(4-1)*180:w=320:h=180:t=7:c=blue, fade=in:st=20:d=1:alpha=1, fade=out:st=50:d=1:alpha=1[tmp2]; nullsrc=size=1920x1080, drawbox=x=(5-1)*320:y=(5-1)*180:w=320:h=180:t=7:c=green, fa
de=in:st=30:d=1:alpha=1, fade=out:st=60:d=1:alpha=1[tmp3]; nullsrc=size=1920x1080, drawbox=x=(6-1)*320:y=(6-1)*180:w=320:h=180:t=7:c=yellow, fade=in:st=40:d=1:alpha=1, fade=out:st=70:d=1:alpha=1[tmp4]; [tmp1][tmp2] overlay=0:0[ovr1]; [tmp3][tmp4] overlay=0:0[ovr2]; [ovr1][ovr2] overlay=0:0[boxes]; [0:v][boxes] overlay=0:0" "Output.mp4"
ffmpeg version 3.4 Copyright (c) 2000-2017 the FFmpeg developers
 built with gcc 7.2.0 (GCC)
 configuration: --enable-gpl --enable-version3 --enable-sdl2 --enable-bzlib --enable-fontconfig --enable-gnutls --enable-iconv --enable-libass --enable-libbluray --enable-libfreetype --enable-libmp3lame --enable-libopenjpeg --enable-libopus --enable-libshine --enable-libsnappy --enable-libsoxr --enable-libtheora --enable-libtwolame --enable-libvpx --enable-libwavpack --enable-libwebp --enable-libx264 --enable-libx265 --enable-libxml2 --enable-libzimg --enable-lzma --enable-zlib --enable-gmp --enable-libvid
stab --enable-libvorbis --enable-cuda --enable-cuvid --enable-d3d11va --enable-nvenc --enable-dxva2 --enable-avisynth --enable-libmfx
 libavutil 55. 78.100 / 55. 78.100
 libavcodec 57.107.100 / 57.107.100
 libavformat 57. 83.100 / 57. 83.100
 libavdevice 57. 10.100 / 57. 10.100
 libavfilter 6.107.100 / 6.107.100
 libswscale 4. 8.100 / 4. 8.100
 libswresample 2. 9.100 / 2. 9.100
 libpostproc 54. 7.100 / 54. 7.100
[h264 @ 000001df2b623ea0] Reinit context to 1920x1088, pix_fmt: yuv420p
Input #0, mov,mp4,m4a,3gp,3g2,mj2, from 'Input.mp4':
 Metadata:
 major_brand : isom
 minor_version : 512
 compatible_brands: isomiso2avc1mp41
 encoder : Lavf58.29.100
 Duration: 00:01:48.67, start: 0.000000, bitrate: 1825 kb/s
 Stream #0:0(und): Video: h264 (High), 1 reference frame (avc1 / 0x31637661), yuv420p(left), 1920x1080 (1920x1088) [SAR 1:1 DAR 16:9], 1693 kb/s, 30 fps, 30 tbr, 15360 tbn, 60 tbc (default)
 Metadata:
 handler_name : VideoHandler
 Stream #0:1(und): Audio: mp3 (mp4a / 0x6134706D), 44100 Hz, stereo, s16p, 127 kb/s (default)
 Metadata:
 handler_name : SoundHandler
[Parsed_nullsrc_0 @ 000001df2b61af00] size:1920x1080 rate:25/1 duration:-1.000000 sar:1/1
[Parsed_fade_2 @ 000001df2b6a6860] type:in start_time:10.000000 duration:1.000000 alpha:1
[Parsed_fade_3 @ 000001df2b9ddec0] type:out start_time:40.000000 duration:1.000000 alpha:1
[Parsed_nullsrc_4 @ 000001df2bc00560] size:1920x1080 rate:25/1 duration:-1.000000 sar:1/1
[Parsed_fade_6 @ 000001df29fe6780] type:in start_time:20.000000 duration:1.000000 alpha:1
[Parsed_fade_7 @ 000001df29fe6840] type:out start_time:50.000000 duration:1.000000 alpha:1
[Parsed_nullsrc_8 @ 000001df2b642dc0] size:1920x1080 rate:25/1 duration:-1.000000 sar:1/1
[Parsed_fade_10 @ 000001df2b6442a0] type:in start_time:30.000000 duration:1.000000 alpha:1
[Parsed_fade_11 @ 000001df2b6444e0] type:out start_time:60.000000 duration:1.000000 alpha:1
[Parsed_nullsrc_12 @ 000001df2b62d000] size:1920x1080 rate:25/1 duration:-1.000000 sar:1/1
[Parsed_fade_14 @ 000001df2b62d580] type:in start_time:40.000000 duration:1.000000 alpha:1
[Parsed_fade_15 @ 000001df2b62ddc0] type:out start_time:70.000000 duration:1.000000 alpha:1
Stream mapping:
 Stream #0:0 (h264) -> overlay:main (graph 0)
 overlay (graph 0) -> Stream #0:0 (libx264)
 Stream #0:1 -> #0:1 (mp3 (native) -> aac (native))
Press [q] to stop, [?] for help
[h264 @ 000001df2b8e5040] Reinit context to 1920x1088, pix_fmt: yuv420p
[graph_1_in_0_1 @ 000001df2b62e100] tb:1/44100 samplefmt:s16p samplerate:44100 chlayout:0x3
[format_out_0_1 @ 000001df2b62e5e0] auto-inserting filter 'auto_resampler_0' between the filter 'Parsed_anull_0' and the filter 'format_out_0_1'
[auto_resampler_0 @ 000001df2b62dd00] ch:2 chl:stereo fmt:s16p r:44100Hz -> ch:2 chl:stereo fmt:fltp r:44100Hz
[Parsed_nullsrc_0 @ 000001df2b62e440] size:1920x1080 rate:25/1 duration:-1.000000 sar:1/1
[Parsed_fade_2 @ 000001df2b62df60] type:in start_time:10.000000 duration:1.000000 alpha:1
[Parsed_fade_3 @ 000001df2b62e6c0] type:out start_time:40.000000 duration:1.000000 alpha:1
[Parsed_nullsrc_4 @ 000001df2b62d9c0] size:1920x1080 rate:25/1 duration:-1.000000 sar:1/1
[Parsed_fade_6 @ 000001df2b62e520] type:in start_time:20.000000 duration:1.000000 alpha:1
[Parsed_fade_7 @ 000001df2b62d8e0] type:out start_time:50.000000 duration:1.000000 alpha:1
[Parsed_nullsrc_8 @ 000001df2b62e1e0] size:1920x1080 rate:25/1 duration:-1.000000 sar:1/1
[Parsed_fade_10 @ 000001df2b62e2a0] type:in start_time:30.000000 duration:1.000000 alpha:1
[Parsed_fade_11 @ 000001df2b62e380] type:out start_time:60.000000 duration:1.000000 alpha:1
[Parsed_nullsrc_12 @ 000001df2b62dc20] size:1920x1080 rate:25/1 duration:-1.000000 sar:1/1
[Parsed_fade_14 @ 000001df2b816f60] type:in start_time:40.000000 duration:1.000000 alpha:1
[Parsed_fade_15 @ 000001df2b817ac0] type:out start_time:70.000000 duration:1.000000 alpha:1
[graph 0 input from stream 0:0 @ 000001df2b817ba0] w:1920 h:1080 pixfmt:yuv420p tb:1/15360 fr:30/1 sar:1/1 sws_param:flags=2
[Parsed_drawbox_1 @ 000001df2b62da80] x:640 y:360 w:320 h:180 color:0xA9A610FF
[Parsed_drawbox_5 @ 000001df2b62e780] x:960 y:540 w:320 h:180 color:0x29F06EFF
[Parsed_overlay_16 @ 000001df2b817860] main w:1920 h:1080 fmt:yuva420p overlay w:1920 h:1080 fmt:yuva420p
[Parsed_overlay_16 @ 000001df2b817860] [framesync @ 000001df2b815348] Selected 1/25 time base
[Parsed_overlay_16 @ 000001df2b817860] [framesync @ 000001df2b815348] Sync level 2
[Parsed_drawbox_9 @ 000001df2b62db60] x:1280 y:720 w:320 h:180 color:0x515B51FF
[Parsed_drawbox_13 @ 000001df2b818be0] x:1600 y:900 w:320 h:180 color:0xD21092FF
[Parsed_overlay_17 @ 000001df2b816d00] main w:1920 h:1080 fmt:yuva420p overlay w:1920 h:1080 fmt:yuva420p
[Parsed_overlay_17 @ 000001df2b816d00] [framesync @ 000001df2b815e48] Selected 1/25 time base
[Parsed_overlay_17 @ 000001df2b816d00] [framesync @ 000001df2b815e48] Sync level 2
[Parsed_overlay_18 @ 000001df2b8183c0] main w:1920 h:1080 fmt:yuva420p overlay w:1920 h:1080 fmt:yuva420p
[Parsed_overlay_18 @ 000001df2b8183c0] [framesync @ 000001df2b815668] Selected 1/25 time base
[Parsed_overlay_18 @ 000001df2b8183c0] [framesync @ 000001df2b815668] Sync level 2
[Parsed_overlay_19 @ 000001df2b817d40] main w:1920 h:1080 fmt:yuv420p overlay w:1920 h:1080 fmt:yuva420p
[Parsed_overlay_19 @ 000001df2b817d40] [framesync @ 000001df2b816488] Selected 1/76800 time base
[Parsed_overlay_19 @ 000001df2b817d40] [framesync @ 000001df2b816488] Sync level 2
[libx264 @ 000001df2b62cd40] using SAR=1/1
[libx264 @ 000001df2b62cd40] using cpu capabilities: MMX2 SSE2Fast SSSE3 SSE4.2 AVX FMA3 BMI2 AVX2
[libx264 @ 000001df2b62cd40] profile High, level 4.0
[libx264 @ 000001df2b62cd40] 264 - core 152 r2851 ba24899 - H.264/MPEG-4 AVC codec - Copyleft 2003-2017 - http://www.videolan.org/x264.html - options: cabac=1 ref=3 deblock=1:0:0 analyse=0x3:0x113 me=hex subme=7 psy=1 psy_rd=1.00:0.00 mixed_ref=1 me_range=16 chroma_me=1 trellis=1 8x8dct=1 cqm=0 deadzone=21,11 fast_pskip=1 chroma_qp_offset=-2 threads=12 lookahead_threads=2 sliced_threads=0 nr=0 decimate=1 interlaced=0 bluray_compat=0 constrained_intra=0 bframes=3 b_pyramid=2 b_adapt=1 b_bias=0 direct=1 weigh
tb=1 open_gop=0 weightp=2 keyint=250 keyint_min=25 scenecut=40 intra_refresh=0 rc_lookahead=40 rc=crf mbtree=1 crf=23.0 qcomp=0.60 qpmin=0 qpmax=69 qpstep=4 ip_ratio=1.40 aq=1:1.00
Output #0, mp4, to 'Output.mp4':
 Metadata:
 major_brand : isom
 minor_version : 512
 compatible_brands: isomiso2avc1mp41
 encoder : Lavf57.83.100
 Stream #0:0: Video: h264 (libx264), 1 reference frame (avc1 / 0x31637661), yuv420p(left), 1920x1080 [SAR 1:1 DAR 16:9], q=-1--1, 30 fps, 15360 tbn, 30 tbc (default)
 Metadata:
 encoder : Lavc57.107.100 libx264
 Side data:
 cpb: bitrate max/min/avg: 0/0/0 buffer size: 0 vbv_delay: -1
 Stream #0:1(und): Audio: aac (LC) (mp4a / 0x6134706D), 44100 Hz, stereo, fltp, delay 1024, 128 kb/s (default)
 Metadata:
 handler_name : SoundHandler
 encoder : Lavc57.107.100 aac
[Parsed_overlay_19 @ 000001df2b817d40] [framesync @ 000001df2b816488] Sync level 1speed=0.78x
Past duration 0.800774 too large
*** 1 dup!5 fps= 24 q=29.0 size= 21760kB time=00:01:46.86 bitrate=1668.0kbits/s speed=0.78x
 Last message repeated 1 times
*** 1 dup!8 fps= 24 q=29.0 size= 21760kB time=00:01:47.30 bitrate=1661.3kbits/s dup=2 drop=0 speed=0.78x
 Last message repeated 2 times
*** 1 dup!3 fps= 24 q=29.0 size= 21760kB time=00:01:47.80 bitrate=1653.6kbits/s dup=5 drop=0 speed=0.781x
 Last message repeated 1 times

...

 Last message repeated 2 times
*** 1 dup!3 fps= 25 q=29.0 size= 27392kB time=00:05:05.80 bitrate= 733.8kbits/s dup=995 drop=0 speed=0.832x
 Last message repeated 1 times
*** 1 dup!6 fps= 25 q=29.0 size= 27392kB time=00:05:06.23 bitrate= 732.8kbits/s dup=997 drop=0 speed=0.832x
 Last message repeated 1 times
*** 1 dup!0 fps= 25 q=29.0 size= 27392kB time=00:05:06.70 bitrate= 731.6kbits/s dup=999 drop=0 speed=0.832x
 Last message repeated 1 times
More than 1000 frames duplicated
*** 1 dup!3 fps= 25 q=29.0 size= 27392kB time=00:05:07.13 bitrate= 730.6kbits/s dup=1001 drop=0 speed=0.832x
 Last message repeated 2 times
*** 1 dup!8 fps= 25 q=29.0 size= 27392kB time=00:05:07.63 bitrate= 729.4kbits/s dup=1004 drop=0 speed=0.832x
 Last message repeated 1 times
... 
*** 1 dup!9 fps= 25 q=29.0 size= 27904kB time=00:05:18.66 bitrate= 717.3kbits/s dup=1059 drop=0 speed=0.834x
 Last message repeated 1 times
*** 1 dup!3 fps= 25 q=29.0 size= 27904kB time=00:05:19.13 bitrate= 716.3kbits/s dup=1061 drop=0 speed=0.834x
frame= 9635 fps= 25 q=-1.0 Lsize= 28364kB time=00:05:21.06 bitrate= 723.7kbits/s dup=1062 drop=0 speed=0.837x
video:26539kB audio:1637kB subtitle:0kB other streams:0kB global headers:0kB muxing overhead: 0.669889%
Input file #0 (Input.mp4):
 Input stream #0:0 (video): 3260 packets read (23006797 bytes); 3260 frames decoded;
 Input stream #0:1 (audio): 4025 packets read (1682285 bytes); 4025 frames decoded (4636800 samples);
 Total: 7285 packets (24689082 bytes) demuxed
Output file #0 (Output.mp4):
 Output stream #0:0 (video): 9635 frames encoded; 9635 packets muxed (27175566 bytes);
 Output stream #0:1 (audio): 4529 frames encoded (4636800 samples); 4530 packets muxed (1676214 bytes);
 Total: 14165 packets (28851780 bytes) muxed
[libx264 @ 000001df2b62cd40] frame I:39 Avg QP:16.49 size:213954
[libx264 @ 000001df2b62cd40] frame P:2446 Avg QP:17.77 size: 6277
[libx264 @ 000001df2b62cd40] frame B:7150 Avg QP:30.38 size: 486
[libx264 @ 000001df2b62cd40] consecutive B-frames: 0.8% 0.8% 0.1% 98.3%
[libx264 @ 000001df2b62cd40] mb I I16..4: 13.2% 43.0% 43.8%
[libx264 @ 000001df2b62cd40] mb P I16..4: 0.2% 0.2% 0.1% P16..4: 7.1% 3.3% 1.7% 0.0% 0.0% skip:87.5%
[libx264 @ 000001df2b62cd40] mb B I16..4: 0.0% 0.0% 0.0% B16..8: 4.3% 0.1% 0.0% direct: 0.0% skip:95.5% L0:41.9% L1:56.1% BI: 2.0%
[libx264 @ 000001df2b62cd40] 8x8 transform intra:44.1% inter:65.8%
[libx264 @ 000001df2b62cd40] coded y,uvDC,uvAC intra: 71.3% 83.5% 52.5% inter: 1.1% 1.5% 0.0%
[libx264 @ 000001df2b62cd40] i16 v,h,dc,p: 31% 28% 4% 38%
[libx264 @ 000001df2b62cd40] i8 v,h,dc,ddl,ddr,vr,hd,vl,hu: 28% 20% 10% 6% 6% 7% 7% 8% 8%
[libx264 @ 000001df2b62cd40] i4 v,h,dc,ddl,ddr,vr,hd,vl,hu: 33% 28% 7% 4% 6% 6% 6% 5% 5%
[libx264 @ 000001df2b62cd40] i8c dc,h,v,p: 34% 27% 28% 11%
[libx264 @ 000001df2b62cd40] Weighted P-Frames: Y:0.0% UV:0.0%
[libx264 @ 000001df2b62cd40] ref P L0: 71.1% 14.9% 10.6% 3.4%
[libx264 @ 000001df2b62cd40] ref B L0: 93.3% 5.9% 0.8%
[libx264 @ 000001df2b62cd40] ref B L1: 97.8% 2.2%
[libx264 @ 000001df2b62cd40] kb/s:676.90
[aac @ 000001df2b6a5620] Qavg: 1165.766
Exiting normally, received signal 2.
Terminate batch job (Y/N)? y
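For what it's worth, the log itself hints at the cause: each Parsed_nullsrc reports duration:-1.000000, i.e. an endless source, so the generated overlay branches never reach end-of-stream and the graph keeps duplicating frames well past the 01:48 input (the output runs to 05:21). Two hedged ways to bound it (illustrative sketches, not a verified answer): give each nullsrc an explicit duration roughly matching the input, e.g.

 nullsrc=size=1920x1080:duration=109, drawbox=x=(3-1)*320:y=(3-1)*180:w=320:h=180:t=7:c=cyan, fade=in:st=10:d=1:alpha=1, fade=out:st=40:d=1:alpha=1[tmp1]; \

or tell the final overlay to stop with its shortest input:

 [0:v][boxes] overlay=0:0:shortest=1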