
Other articles (39)
-
MediaSPIP v0.2
21 June 2013
MediaSPIP 0.2 is the first stable version of MediaSPIP.
Its official release date is 21 June 2013, as announced here.
The zip file provided here contains only the MediaSPIP sources in the standalone version.
As with the previous version, all of the software dependencies must be installed manually on the server.
If you want to use this archive for an installation in farm mode, you will also need to make other changes (...)
-
Making files available
14 April 2011
By default, when it is first set up, MediaSPIP does not let visitors download files, whether they are originals or the result of transformation or encoding. It only allows them to be viewed.
However, it is possible and easy to give visitors access to these documents, in various forms.
All of this happens in the skeleton configuration page. You need to go to the channel’s administration area and choose in the navigation (...)
-
MediaSPIP version 0.1 Beta
16 April 2011
MediaSPIP 0.1 beta is the first version of MediaSPIP deemed "usable".
The zip file provided here contains only the MediaSPIP sources in the standalone version.
For a working installation, all of the software dependencies must be installed manually on the server.
If you want to use this archive for an installation in farm mode, you will also need to make other changes (...)
On other sites (6687)
-
Investigating Steam for Linux
1 March 2013, by Multimedia Mike — Game Hacking
Valve recently released the final, public version of their Steam client for Linux, and the Linux world rejoiced. At least, it probably did. The announcement was 2 weeks ago on Valentine’s Day and I had other things on my mind, so I missed any fanfare. When framed in this manner, the announcement timing becomes suspect: it’s as though Linux enthusiasts would have plenty of time that day or something.
Taming the Frontier
Speculation about a Linux Steam client had been kicking around for nearly as long as Steam has existed. However, sometime last year, the rumors became more substantive.
I naturally wondered how to port something like Steam to Linux. I have some experience with trying to make a necessarily binary-only program that runs on Linux, and I’m fairly well-versed in the assorted technical challenges that one might face when attempting such a feat. Because of this, whenever I hear rumors that a company might be entertaining the notion of porting a major piece of proprietary software to Linux, my instinctive reflex is, "What?! Why, you fools?! Save yourselves!"
At least, that’s how it used to be. The proposal of developing a proprietary binary for Linux has been rendered considerably less insane by a few developments, for example:
- The rise of Ubuntu Linux as a quasi de facto standard for desktop Linux computing
- The increasing homogeneity in personal desktop computing technology
What I would like to know is how the Steam client runs on Linux. Does it rely on any libraries being present on the system? Or does it bring its own? The latter is a trick that proprietary programs can use: transport all of the shared libraries that the main program binary depends upon, install them someplace out of the way on the filesystem (probably in /opt), and then make the main program a shell script that sets a preload path so the known-quantity bundled libraries are used instead of the copies already on the system.
Downloading and Installing the Client
For this exercise, I installed x86_64 desktop Ubuntu 12.04 Linux on a l33t gaming rig that was totally top of the line about 5 years ago, and that someone didn’t want anymore and handed down to me recently. So it should be ideal for this project.
At first, I was blown away: the Linux client is in a .deb package that is less than 2 MB large. I unpacked the steam.deb file and found a bunch of support libraries, mostly X11 and standard C/C++ runtimes. Just as I suspected. Still, I can’t believe how small the thing is. However, my amazement quickly abated when I actually ran Steam and saw this:
So it turns out steam.deb is just the installer program, which immediately proceeds to download an additional 160+ MB of data. So there’s actually a lot more material to possibly sift through.
Another component of the installation is to basically run a big ‘apt-get install’ command to make sure a bunch of required packages are installed:
After all these installation steps, the client was ready to run. However, whenever I tried to do so, I got this dialog, which would cause Steam to close when dismissed.
Not a huge deal; newer NVIDIA drivers are fairly straightforward to install on Ubuntu Linux. After a few minutes of downloading, installing, and restarting X, Steam ran with minimal complaint (it still had some issue regarding the video drivers but didn’t seem to consider it a deal-breaker).
Using Steam on Linux
So here’s Steam running on Linux:
If you have experience with using Steam on Windows or Mac, you might observe that it looks exactly the same. I don’t have a very expansive library of games (I only started using Steam because purchasing a few computer components a few years ago entitled me to some free Steam downloads of some of the games on the list in the screenshot). I didn’t really expect any of the games to have Linux versions yet, but it turns out that the indie darling FTL: Faster Than Light has been ported to Linux. FTL was a much-heralded Kickstarter success story and sounded like something I wanted to support. I purchased this from Steam shortly after its release last year and was able to download the Linux version at no additional cost with a single click.
It runs natively on Linux (note the Ubuntu desktop window decorations):
You might notice from the main Steam client that, despite purchasing FTL about half a year ago and starting it up at least a half dozen times, I haven’t really invested a whole lot of time into it. I only managed to get about 2 minutes further this time:
What can I say? This game just bores me to tears. It’s frustrating because I know this is one of the cool games that all real gamers are supposed to like, but I practically catch myself nodding off every time I try to run through the tutorial. It’s strange to think that I’ve invested far more time into games that offer considerably less stimulation, but that’s probably because back then I had far more free time than gaming options.
But that’s neither here nor there. We’ll file this under "games that aren’t for me." I’m glad that people like FTL and that a little indie underdog has met with such success. And I’m pleased that Steam on Linux works. It’s native, and the games are also native, which is all quite laudable (there was speculation that everything would just be running on top of a Wine layer).
Deeper Analysis
So I set out wondering how Valve was able to create a proprietary program that would satisfy a large enough cross-section of Linux users (i.e., on different platforms and distros). Answer: well, they didn’t, per the stated requirements. The installation is only tuned to work on Ubuntu 12.04. However, it works on both 32- and 64-bit platforms, the only 2 desktop CPU platforms that matter these days (unless ARM somehow makes inroads on the desktop). The Steam client is quite clearly an x86_32 binary: look at the terminal screenshot above and observe that it’s downloading all the :i386 support libraries.
The file /usr/bin/steam isn’t a binary but a launcher shell script (something you’ll also see if you investigate /usr/bin/firefox on a Linux system). Here’s an interesting tidbit:
function detect_platform()
{
    # Maybe be smarter someday
    # Right now this is the only platform we have a bootstrap for, so hard-code it.
    echo ubuntu12_32
}
I wager that it’s possible to get Steam running on other distributions; it probably just takes a little more effort (assuming that Steam doesn’t put too much effort into thwarting such attempts).
As for the FTL game, it comes with binaries and libraries for both x86_32 and x86_64. So, good work to the dev team for creating and testing both versions. FTL also distributes versions of the libraries it expects to work with.
I suspect that the Steam client overall is largely a WWW rendering engine underneath the covers. That would help explain how Valve is able to achieve such a consistent look and feel, not only across OS platforms, but also through a web browser. When I browse the Steam store through Google Chrome, it looks and feels exactly like the native desktop client. When I first thought of how someone could port Steam to Linux, I immediately wondered about how they would do the UI.
A little Googling for “steam uses webkit” (just a hunch) confirms my hypothesis.
-
FFmpeg filtergraph memory leak
5 July 2017, by Leif Andersen
I have an FFmpeg program that:
- Demuxes and decodes a video file.
- Passes it through a filtergraph
- Encodes and muxes the new video.
The filtergraph itself is rather complex, and can be run directly from the command line as follows:
ffmpeg -i demo.mp4 -filter_complex \
"[audio3]atrim=end=30:start=10[audio2];\
[video5]trim=end=30:start=10[video4];[audio2]anull[audio6];\
[video4]scale=width=1920:height=1080[video7];[audio6]anull[audio8];\
[video7]fps=fps=30[video9];[audio8]anull[audio10];\
[video9]format=pix_fmts=yuv420p[video11];\
[audio10]asetpts=expr=PTS-STARTPTS[audio12];\
[video11]setpts=expr=PTS-STARTPTS[video13];\
[audio15]concat=v=0:a=1:n=1[audio14];\
[video17]concat=v=1:a=0:n=1[video16];\
[audio12]afifo[audio15];[video13]fifo[video17];\
[audio14]afifo[audio18];[video16]fifo[video19];\
[audio18]anull[audio20];\
[video19]pad=width=1920:height=1080[video21];\
[audio20]anull[audio22];[video21]fps=fps=25[video23];\
[audio22]aformat=sample_fmts=fltp:sample_rates=44100:channel_layouts=stereo[fa];\
[video23]format=pix_fmts=yuv420p[fv];[0:a]afifo[audio3];\
[0:v]fifo[video5]" \
-map "[fv]" -map "[fa]" out.mp4
I realize this is a massive filtergraph with a lot of no-op filters; it was autogenerated rather than hand written. Here is a cleaner version of the graph. (It’s a graphviz file; you can run it in the command line or here.)
Anyway, when I run the program that uses this filtergraph my memory usage spikes. I end up using about 7 GB of RAM for a 30 second clip. However, when I run the program using the ffmpeg command above, it peaks out at about 600 MB of RAM. This causes me to believe that the problem is not the ungodly size of the filtergraph, but a problem with how my program is using it.
The program sets up the filtergraph (using avfilter_graph_parse_ptr, giving it the filtergraph string shown above), encoder, muxer, decoder, and demuxer, then spawns two threads: one that sends frames into the filtergraph, and one that receives them. The thread that sends them looks something like:
void decode() {
    while(... more_frames ...) {
        AVFrame *frame = av_frame_alloc();
        ... fill next frame of stream ...
        av_buffersrc_write_frame(ctx, frame);
        av_frame_free(&frame);
    }
}
(I have elided the avcodec_send_packet/avcodec_receive_frame functions, as they don’t seem to be leaking memory. I have also elided the process of flushing the buffersrc, as that won’t happen until the end and the memory spikes long before that.)
And the encoder thread looks similar:
void encode() {
    while(... nodes_in_graph ...) {
        AVFrame *frame = av_frame_alloc();
        av_buffersink_get_frame(ctx, frame);
        ... ensure frame actually was filled ...
        ... send frame to encoder ...
        av_frame_free(&frame);
    }
}
As with the decoder, I have elided the avcodec_send_frame/avcodec_receive_packet combo, as they don’t seem to be leaking memory. I have also elided the details of ensuring that the frame actually was filled; the code loops until the frame eventually does get filled.
Every frame I allocate, I fairly quickly deallocate. I additionally handle all of the error cases that ffmpeg can give (elided in the example).
I have also tried having only one frame for the encoder and one for the decoder (calling av_frame_unref in each iteration of the loop).
Am I forgetting to free something, or am I just using the calls to libavfilter incorrectly such that it has to buffer all of the data? I don’t think the leak is caused by the filtergraph itself, because running it from the command line doesn’t seem to cause the same memory explosion.
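For reference, my mental model of the intended usage (a sketch pieced together from the ffmpeg examples, under my own assumptions, not code from my actual program) is to drain the buffersink after every frame pushed into the buffersrc, so frames can’t pile up inside the graph:

#include <libavfilter/buffersrc.h>
#include <libavfilter/buffersink.h>
#include <libavutil/frame.h>

// Push one frame, then pull everything the graph is ready to emit.
static int filter_one(AVFilterContext *src_ctx, AVFilterContext *sink_ctx,
                      AVFrame *in, AVFrame *out)
{
    // av_buffersrc_write_frame leaves ownership of 'in' with the caller.
    int ret = av_buffersrc_write_frame(src_ctx, in);
    if (ret < 0)
        return ret;
    for (;;) {
        ret = av_buffersink_get_frame(sink_ctx, out);
        if (ret == AVERROR(EAGAIN) || ret == AVERROR_EOF)
            return 0;    // graph wants more input, or is finished
        if (ret < 0)
            return ret;  // real error
        /* ... hand 'out' to the encoder here ... */
        av_frame_unref(out);  // drop our reference to the filtered frame
    }
}

My two-thread version should be equivalent to this as long as the receiving thread keeps up, which may be exactly what I’m getting wrong.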
FWIW, the actual code is here, although it’s written in Racket. I do have a minimal example that also seems to duplicate this behavior (modified from the doc/examples/filtering_video.c file in the ffmpeg source):
#include <unistd.h>
#include <libavcodec/avcodec.h>
#include <libavformat/avformat.h>
#include <libavfilter/avfiltergraph.h>
#include <libavfilter/buffersink.h>
#include <libavfilter/buffersrc.h>
#include <libavutil/opt.h>
const char *filter_descr = "trim=start=10:end=30,scale=78:24,transpose=cclock";
static AVFormatContext *fmt_ctx;
static AVCodecContext *dec_ctx;
AVFilterContext *buffersink_ctx;
AVFilterContext *buffersrc_ctx;
AVFilterGraph *filter_graph;
static int video_stream_index = -1;
static int64_t last_pts = AV_NOPTS_VALUE;
static int open_input_file(const char *filename)
{
int ret;
AVCodec *dec;
if ((ret = avformat_open_input(&fmt_ctx, filename, NULL, NULL)) < 0) {
av_log(NULL, AV_LOG_ERROR, "Cannot open input file\n");
return ret;
}
if ((ret = avformat_find_stream_info(fmt_ctx, NULL)) < 0) {
av_log(NULL, AV_LOG_ERROR, "Cannot find stream information\n");
return ret;
}
/* select the video stream */
ret = av_find_best_stream(fmt_ctx, AVMEDIA_TYPE_VIDEO, -1, -1, &dec, 0);
if (ret < 0) {
av_log(NULL, AV_LOG_ERROR, "Cannot find a video stream in the input file\n");
return ret;
}
video_stream_index = ret;
/* create decoding context */
dec_ctx = avcodec_alloc_context3(dec);
if (!dec_ctx)
return AVERROR(ENOMEM);
avcodec_parameters_to_context(dec_ctx, fmt_ctx->streams[video_stream_index]->codecpar);
av_opt_set_int(dec_ctx, "refcounted_frames", 1, 0);
/* init the video decoder */
if ((ret = avcodec_open2(dec_ctx, dec, NULL)) < 0) {
av_log(NULL, AV_LOG_ERROR, "Cannot open video decoder\n");
return ret;
}
return 0;
}
static int init_filters(const char *filters_descr)
{
char args[512];
int ret = 0;
AVFilter *buffersrc = avfilter_get_by_name("buffer");
AVFilter *buffersink = avfilter_get_by_name("buffersink");
AVFilterInOut *outputs = avfilter_inout_alloc();
AVFilterInOut *inputs = avfilter_inout_alloc();
AVRational time_base = fmt_ctx->streams[video_stream_index]->time_base;
enum AVPixelFormat pix_fmts[] = { AV_PIX_FMT_GRAY8, AV_PIX_FMT_NONE };
filter_graph = avfilter_graph_alloc();
if (!outputs || !inputs || !filter_graph) {
ret = AVERROR(ENOMEM);
goto end;
}
/* buffer video source: the decoded frames from the decoder will be inserted here. */
snprintf(args, sizeof(args),
"video_size=%dx%d:pix_fmt=%d:time_base=%d/%d:pixel_aspect=%d/%d",
dec_ctx->width, dec_ctx->height, dec_ctx->pix_fmt,
time_base.num, time_base.den,
dec_ctx->sample_aspect_ratio.num, dec_ctx->sample_aspect_ratio.den);
ret = avfilter_graph_create_filter(&buffersrc_ctx, buffersrc, "in",
args, NULL, filter_graph);
if (ret < 0) {
av_log(NULL, AV_LOG_ERROR, "Cannot create buffer source\n");
goto end;
}
/* buffer video sink: to terminate the filter chain. */
ret = avfilter_graph_create_filter(&buffersink_ctx, buffersink, "out",
NULL, NULL, filter_graph);
if (ret < 0) {
av_log(NULL, AV_LOG_ERROR, "Cannot create buffer sink\n");
goto end;
}
ret = av_opt_set_int_list(buffersink_ctx, "pix_fmts", pix_fmts,
AV_PIX_FMT_NONE, AV_OPT_SEARCH_CHILDREN);
if (ret < 0) {
av_log(NULL, AV_LOG_ERROR, "Cannot set output pixel format\n");
goto end;
}
outputs->name = av_strdup("in");
outputs->filter_ctx = buffersrc_ctx;
outputs->pad_idx = 0;
outputs->next = NULL;
inputs->name = av_strdup("out");
inputs->filter_ctx = buffersink_ctx;
inputs->pad_idx = 0;
inputs->next = NULL;
if ((ret = avfilter_graph_parse_ptr(filter_graph, filters_descr,
&inputs, &outputs, NULL)) < 0)
goto end;
if ((ret = avfilter_graph_config(filter_graph, NULL)) < 0)
goto end;
end:
avfilter_inout_free(&inputs);
avfilter_inout_free(&outputs);
return ret;
}
int main(int argc, char **argv)
{
int ret;
AVPacket packet;
AVFrame *frame = av_frame_alloc();
AVFrame *filt_frame = av_frame_alloc();
if (!frame || !filt_frame) {
perror("Could not allocate frame");
exit(1);
}
if (argc != 2) {
fprintf(stderr, "Usage: %s file\n", argv[0]);
exit(1);
}
av_register_all();
avfilter_register_all();
if ((ret = open_input_file(argv[1])) < 0)
goto end;
if ((ret = init_filters(filter_descr)) < 0)
goto end;
/* read all packets */
while (1) {
if ((ret = av_read_frame(fmt_ctx, &packet)) < 0)
break;
if (packet.stream_index == video_stream_index) {
ret = avcodec_send_packet(dec_ctx, &packet);
if (ret < 0) {
av_log(NULL, AV_LOG_ERROR, "Error while sending a packet to the decoder\n");
break;
}
while (ret >= 0) {
ret = avcodec_receive_frame(dec_ctx, frame);
if (ret == AVERROR(EAGAIN) || ret == AVERROR_EOF) {
break;
} else if (ret < 0) {
av_log(NULL, AV_LOG_ERROR, "Error while receiving a frame from the decoder\n");
goto end;
}
if (ret >= 0) {
frame->pts = av_frame_get_best_effort_timestamp(frame);
/* push the decoded frame into the filtergraph */
if (av_buffersrc_add_frame_flags(buffersrc_ctx, frame, AV_BUFFERSRC_FLAG_KEEP_REF) < 0) {
av_log(NULL, AV_LOG_ERROR, "Error while feeding the filtergraph\n");
break;
}
/* pull filtered frames from the filtergraph */
while (1) {
ret = av_buffersink_get_frame(buffersink_ctx, filt_frame);
if (ret == AVERROR(EAGAIN) || ret == AVERROR_EOF)
break;
if (ret < 0)
goto end;
av_frame_unref(filt_frame);
}
av_frame_unref(frame);
}
}
}
av_packet_unref(&packet);
}
end:
avfilter_graph_free(&filter_graph);
avcodec_free_context(&dec_ctx);
avformat_close_input(&fmt_ctx);
av_frame_free(&frame);
av_frame_free(&filt_frame);
return ret;
}
-
Libav (ffmpeg) copying decoded video timestamps to encoder
31 October 2016, by Jason C
I am writing an application that decodes a single video stream from an input file (any codec, any container), does a bunch of image processing, and encodes the results to an output file (single video stream, Quicktime RLE, MOV). I am using ffmpeg’s libav 3.1.5 (a Windows build for now, but the application will be cross-platform).
There is a 1:1 correspondence between input and output frames, and I want the frame timing in the output to be identical to the input. I am having a really, really hard time accomplishing this. So my general question is: how do I reliably (as in, in all cases of inputs) set the output frame timing identical to the input?
It took me a very long time to slog through the API and get to the point I am at now. I put together a minimal test program to work with:
#include <cstdio>
extern "C" {
#include <libavcodec/avcodec.h>
#include <libavformat/avformat.h>
#include <libavutil/avutil.h>
#include <libavutil/imgutils.h>
#include <libswscale/swscale.h>
}
using namespace std;
struct DecoderStuff {
AVFormatContext *formatx;
int nstream;
AVCodec *codec;
AVStream *stream;
AVCodecContext *codecx;
AVFrame *rawframe;
AVFrame *rgbframe;
SwsContext *swsx;
};
struct EncoderStuff {
AVFormatContext *formatx;
AVCodec *codec;
AVStream *stream;
AVCodecContext *codecx;
};
template <typename T>
static void dump_timebase (const char *what, const T *o) {
if (o)
printf("%s timebase: %d/%d\n", what, o->time_base.num, o->time_base.den);
else
printf("%s timebase: null object\n", what);
}
// reads next frame into d.rawframe and d.rgbframe. returns false on error/eof.
static bool read_frame (DecoderStuff &d) {
AVPacket packet;
int err = 0, haveframe = 0;
// read
while (!haveframe && err >= 0 && ((err = av_read_frame(d.formatx, &packet)) >= 0)) {
if (packet.stream_index == d.nstream) {
err = avcodec_decode_video2(d.codecx, d.rawframe, &haveframe, &packet);
}
av_packet_unref(&packet);
}
// error output
if (!haveframe && err != AVERROR_EOF) {
char buf[500];
av_strerror(err, buf, sizeof(buf) - 1);
buf[499] = 0;
printf("read_frame: %s\n", buf);
}
// convert to rgb
if (haveframe) {
sws_scale(d.swsx, d.rawframe->data, d.rawframe->linesize, 0, d.rawframe->height,
d.rgbframe->data, d.rgbframe->linesize);
}
return haveframe;
}
// writes an output frame, returns false on error.
static bool write_frame (EncoderStuff &e, AVFrame *inframe) {
// see note in so post about outframe here
AVFrame *outframe = av_frame_alloc();
outframe->format = inframe->format;
outframe->width = inframe->width;
outframe->height = inframe->height;
av_image_alloc(outframe->data, outframe->linesize, outframe->width, outframe->height,
AV_PIX_FMT_RGB24, 1);
//av_frame_copy(outframe, inframe);
static int count = 0;
for (int n = 0; n < outframe->width * outframe->height; ++ n) {
outframe->data[0][n*3+0] = ((n+count) % 100) ? 0 : 255;
outframe->data[0][n*3+1] = ((n+count) % 100) ? 0 : 255;
outframe->data[0][n*3+2] = ((n+count) % 100) ? 0 : 255;
}
++ count;
AVPacket packet;
av_init_packet(&packet);
packet.size = 0;
packet.data = NULL;
int err, havepacket = 0;
if ((err = avcodec_encode_video2(e.codecx, &packet, outframe, &havepacket)) >= 0 && havepacket) {
packet.stream_index = e.stream->index;
err = av_interleaved_write_frame(e.formatx, &packet);
}
if (err < 0) {
char buf[500];
av_strerror(err, buf, sizeof(buf) - 1);
buf[499] = 0;
printf("write_frame: %s\n", buf);
}
av_packet_unref(&packet);
av_freep(&outframe->data[0]);
av_frame_free(&outframe);
return err >= 0;
}
int main (int argc, char *argv[]) {
const char *infile = "wildlife.wmv";
const char *outfile = "test.mov";
DecoderStuff d = {};
EncoderStuff e = {};
av_register_all();
// decoder
avformat_open_input(&d.formatx, infile, NULL, NULL);
avformat_find_stream_info(d.formatx, NULL);
d.nstream = av_find_best_stream(d.formatx, AVMEDIA_TYPE_VIDEO, -1, -1, &d.codec, 0);
d.stream = d.formatx->streams[d.nstream];
d.codecx = avcodec_alloc_context3(d.codec);
avcodec_parameters_to_context(d.codecx, d.stream->codecpar);
avcodec_open2(d.codecx, NULL, NULL);
d.rawframe = av_frame_alloc();
d.rgbframe = av_frame_alloc();
d.rgbframe->format = AV_PIX_FMT_RGB24;
d.rgbframe->width = d.codecx->width;
d.rgbframe->height = d.codecx->height;
av_frame_get_buffer(d.rgbframe, 1);
d.swsx = sws_getContext(d.codecx->width, d.codecx->height, d.codecx->pix_fmt,
d.codecx->width, d.codecx->height, AV_PIX_FMT_RGB24,
SWS_POINT, NULL, NULL, NULL);
//av_dump_format(d.formatx, 0, infile, 0);
dump_timebase("in stream", d.stream);
dump_timebase("in stream:codec", d.stream->codec); // note: deprecated
dump_timebase("in codec", d.codecx);
// encoder
avformat_alloc_output_context2(&e.formatx, NULL, NULL, outfile);
e.codec = avcodec_find_encoder(AV_CODEC_ID_QTRLE);
e.stream = avformat_new_stream(e.formatx, e.codec);
e.codecx = avcodec_alloc_context3(e.codec);
e.codecx->bit_rate = 4000000; // arbitrary for qtrle
e.codecx->width = d.codecx->width;
e.codecx->height = d.codecx->height;
e.codecx->gop_size = 30; // 99% sure this is arbitrary for qtrle
e.codecx->pix_fmt = AV_PIX_FMT_RGB24;
e.codecx->time_base = d.stream->time_base; // ???
e.codecx->flags |= (e.formatx->flags & AVFMT_GLOBALHEADER) ? AV_CODEC_FLAG_GLOBAL_HEADER : 0;
avcodec_open2(e.codecx, NULL, NULL);
avcodec_parameters_from_context(e.stream->codecpar, e.codecx);
//av_dump_format(e.formatx, 0, outfile, 1);
dump_timebase("out stream", e.stream);
dump_timebase("out stream:codec", e.stream->codec); // note: deprecated
dump_timebase("out codec", e.codecx);
// open file and write header
avio_open(&e.formatx->pb, outfile, AVIO_FLAG_WRITE);
avformat_write_header(e.formatx, NULL);
// frames
while (read_frame(d) && write_frame(e, d.rgbframe))
;
// write trailer and close file
av_write_trailer(e.formatx);
avio_closep(&e.formatx->pb);
}
A few notes about that:
- Since all of my attempts at frame timing so far have failed, I’ve removed almost all timing-related stuff from this code to start with a clean slate.
- Almost all error checking and cleanup is omitted for brevity.
- The reason I allocate a new output frame with a new buffer in write_frame, rather than using inframe directly, is because this is more representative of what my real application is doing. My real app also uses RGB24 internally, hence the conversions here.
- The reason I generate a weird pattern in outframe, rather than using e.g. av_frame_copy, is because I just wanted a test pattern that compresses well with Quicktime RLE (my test input ends up generating a 1.7 GB output file otherwise).
- The input video I am using, "wildlife.wmv", can be found here. I’ve hard-coded the filenames.
- I am aware that avcodec_decode_video2 and avcodec_encode_video2 are deprecated, but I don’t care. They work fine, I’ve already struggled too much getting my head around the latest version of the API, ffmpeg changes its API with nearly every release, and I really don’t feel like dealing with avcodec_send_* and avcodec_receive_* right now.
- I think I’m supposed to finish off by passing a NULL frame to avcodec_encode_video2 to flush some buffers or something, but I’m a bit confused about that. Unless somebody feels like explaining it, let’s ignore it for now; it’s a separate question (though I sketch my current guess just after this list). The docs are as vague about this point as they are about everything else.
- My test input file’s frame rate is 29.97.
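For what it’s worth, my current understanding of that flushing step, sketched under my own assumptions and reusing the e struct from the test program (I haven’t verified this is exactly right, which is why I’d rather treat it as a separate question), is to keep sending NULL frames until the encoder stops producing packets:

// At EOF: drain any packets the encoder is still holding.
int havepacket = 1;
while (havepacket) {
    AVPacket packet;
    av_init_packet(&packet);
    packet.data = NULL;
    packet.size = 0;
    if (avcodec_encode_video2(e.codecx, &packet, NULL, &havepacket) < 0)
        break;  // encoder error
    if (havepacket) {
        packet.stream_index = e.stream->index;
        av_interleaved_write_frame(e.formatx, &packet);  // takes ownership of packet
    }
}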
Now, as for my current attempts. The following timing-related fields are present in the above code, with details and my confusion noted. There are a lot of them, because the API is mind-bogglingly convoluted (after the list I sketch the one conversion I think I do understand, av_rescale_q):
- main: d.stream->time_base: Input video stream time base. For my test input file this is 1/1000.
- main: d.stream->codec->time_base: Not sure what this is (I never could make sense of why AVStream has an AVCodecContext field when you always use your own new context anyway), and also the codec field is deprecated. For my test input file this is 1/1000.
- main: d.codecx->time_base: Input codec context time base. For my test input file this is 0/1. Am I supposed to set it?
- main: e.stream->time_base: Time base of the output stream I create. What do I set this to?
- main: e.stream->codec->time_base: Time base of the deprecated and mysterious codec field of the output stream I create. Do I set this to anything?
- main: e.codecx->time_base: Time base of the encoder context I create. What do I set this to?
- read_frame: packet.dts: Decoding timestamp of the packet read.
- read_frame: packet.pts: Presentation timestamp of the packet read.
- read_frame: packet.duration: Duration of the packet read.
- read_frame: d.rawframe->pts: Presentation timestamp of the raw decoded frame. This is always 0. Why isn’t it set by the decoder...?
- read_frame: d.rgbframe->pts / write_frame: inframe->pts: Presentation timestamp of the decoded frame converted to RGB. Not set to anything currently.
- read_frame: d.rawframe->pkt_*: Fields copied from the packet, discovered after reading this post. They are set correctly, but I don’t know if they are useful.
- write_frame: outframe->pts: Presentation timestamp of the frame being encoded. Should I set this to something?
- write_frame: outframe->pkt_*: Timing fields copied from a packet. Should I set these? They seem to be ignored by the encoder.
- write_frame: packet.dts: Decoding timestamp of the packet being encoded. What do I set it to?
- write_frame: packet.pts: Presentation timestamp of the packet being encoded. What do I set it to?
- write_frame: packet.duration: Duration of the packet being encoded. What do I set it to?
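To make sure at least the arithmetic is clear in my head: a timestamp only means something relative to a time base, and my understanding is that av_rescale_q (from libavutil) is the sanctioned way to convert between them. A tiny sanity-check sketch, with hypothetical values not taken from my file:

// A pts of 1500 in a 1/1000 time base (milliseconds) is 1.5 seconds.
// Rescaled into a 1/30000 time base it should become 45000.
AVRational ms   = av_make_q(1, 1000);   // e.g. an input stream time base
AVRational tick = av_make_q(1, 30000);  // e.g. some encoder time base
int64_t pts_ms  = 1500;                 // 1.5 seconds
int64_t pts_out = av_rescale_q(pts_ms, ms, tick);  // == 45000

What I can’t figure out is which of the fields above this conversion is supposed to connect.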
I have tried the following, with the described results. Note that inframe is d.rgbframe:
Attempt 1:
- Init e.stream->time_base = d.stream->time_base
- Init e.codecx->time_base = d.codecx->time_base
- Set d.rgbframe->pts = packet.dts in read_frame
- Set outframe->pts = inframe->pts in write_frame
- Result: Warning that the encoder time base is not set (since d.codecx->time_base was 0/1), then a seg fault.
Attempt 2:
- Init e.stream->time_base = d.stream->time_base
- Init e.codecx->time_base = d.stream->time_base
- Set d.rgbframe->pts = packet.dts in read_frame
- Set outframe->pts = inframe->pts in write_frame
- Result: No warnings, but VLC reports the frame rate as 480.048 (no idea where this number came from) and the file plays too fast. Also the encoder set all the timing fields in packet to 0, which was not what I expected. (Edit: Turns out this is because av_interleaved_write_frame, unlike av_write_frame, takes ownership of the packet and swaps it with a blank one, and I was printing the values after that call. So they are not ignored.)
Attempt 3:
- Init e.stream->time_base = d.stream->time_base
- Init e.codecx->time_base = d.stream->time_base
- Set d.rgbframe->pts = packet.dts in read_frame
- Set any of pts/dts/duration in packet in write_frame to anything.
- Result: Warnings about packet timestamps not being set. The encoder seems to reset all packet timing fields to 0, so none of this has any effect.
Attempt 4:
- Init e.stream->time_base = d.stream->time_base
- Init e.codecx->time_base = d.stream->time_base
- I found the fields pkt_pts, pkt_dts, and pkt_duration in AVFrame after reading this post, so I tried copying those all the way through to outframe.
- Result: Really had my hopes up, but it ended up with the same results as attempt 3 (packet timestamp not set warning, incorrect results).
I tried various other hand-wavey permutations of the above and nothing worked. What I want to do is create an output file that plays back with the same timing and frame rate as the input (29.97 constant frame rate in this case).
So how do I do this? Of the zillions of timing-related fields here, what do I do to make the output be the same as the input? And how do I do it in such a way that handles arbitrary video input formats that may store their timestamps and time bases in different places? I need this to always work.
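In case it helps frame an answer, here is the overall pattern I currently suspect is conventional, pieced together from various examples and reusing the variables from my test program above; it is a guess, and whether it is actually correct is part of what I’m asking:

// 1) In read_frame, after decoding, take the decoder's best-effort timestamp
//    (expressed in d.stream->time_base):
frame->pts = av_frame_get_best_effort_timestamp(frame);
// 2) In write_frame, before encoding, convert into the encoder's time base:
outframe->pts = av_rescale_q(inframe->pts, d.stream->time_base, e.codecx->time_base);
// 3) After encoding, convert the packet's timestamps from the encoder's
//    time base into the output stream's time base before muxing:
av_packet_rescale_ts(&packet, e.codecx->time_base, e.stream->time_base);
packet.stream_index = e.stream->index;
av_interleaved_write_frame(e.formatx, &packet);

If that is the right shape, I still don’t know what e.codecx->time_base itself should be initialized to.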
For reference, here is a table of all the packet and frame timestamps read from the video stream of my test input file, to give a sense of what my test file looks like. None of the input packet pts values are set, same with frame pts, and for some reason the duration of the first 108 frames is 0. VLC plays the file fine and reports the frame rate as 29.9700089:
- Table is here since it was too large for this post.