
Other articles (29)

  • Requesting the creation of a channel

    12 March 2010, by

    Depending on how the platform is configured, the user may have two different ways of requesting the creation of a channel. The first is at the time of registration; the second, after registration, by filling in a request form.
    Both methods ask for the same information and work in much the same way: the prospective user fills in a series of form fields that first of all give the administrators information about (...)

  • Managing the farm

    2 March 2010, by

    The farm as a whole is managed by "super admins".
    Certain settings can be adjusted to accommodate the needs of the different channels.
    To begin with, it relies on the "Gestion de mutualisation" plugin

  • MediaSPIP Core: Configuration

    9 November 2010, by

    By default, MediaSPIP Core provides three different configuration pages (these pages rely on the CFG configuration plugin): a page for the general configuration of the template set; a page for the configuration of the site's home page; a page for the configuration of the sectors.
    It also provides an additional page, shown only when certain plugins are enabled, for controlling their specific display and features (...)

On other sites (4594)

  • FFMPEG: cannot play MPEG4 video encoded from images. Duration and bitrate undefined

    17 June 2013, by KaiK

    I've been trying to put an H.264 video stream created from images into an MPEG-4 container. I've been able to generate the video stream from the images successfully, but when muxing it into the container I must be doing something wrong, because no player can play the result, except ffplay, which plays the video to the end and then leaves the last image frozen indefinitely.

    ffplay cannot identify the duration or the bitrate, so I suppose it might be an issue related to dts and pts, but I've searched for a way to solve it without success.

    Here's the ffplay output:

    ~$ ffplay testContainer.mp4
    ffplay version git-2012-01-31-c673671 Copyright (c) 2003-2012 the FFmpeg developers
     built on Feb  7 2012 20:32:12 with gcc 4.4.3
     configuration: --enable-gpl --enable-version3 --enable-nonfree --enable-postproc --enable-        libfaac --enable-libopencore-amrnb --enable-libopencore-amrwb --enable-libtheora --enable-libvorbis --enable-libx264 --enable-libxvid --enable-x11grab --enable-libvpx --enable-libmp3lame --enable-debug=3
     libavutil      51. 36.100 / 51. 36.100
     libavcodec     54.  0.102 / 54.  0.102
     libavformat    54.  0.100 / 54.  0.100
     libavdevice    53.  4.100 / 53.  4.100
     libavfilter     2. 60.100 /  2. 60.100
     libswscale      2.  1.100 /  2.  1.100
     libswresample   0.  6.100 /  0.  6.100
     libpostproc    52.  0.100 / 52.  0.100
    [h264 @ 0xa4849c0] max_analyze_duration 5000000 reached at 5000000
    [h264 @ 0xa4849c0] Estimating duration from bitrate, this may be inaccurate
    Input #0, h264, from 'testContainer.mp4':
     Duration: N/A, bitrate: N/A
       Stream #0:0: Video: h264 (High), yuv420p, 512x512, 25 fps, 25 tbr, 1200k tbn, 50 tbc
          2.74 A-V:  0.000 fd=   0 aq=    0KB vq=  160KB sq=    0B f=0/0   0/0

    Structure

    My code is C++ styled, so I have a class that handles all the encoding, and a main that initializes it, passes some images in a loop, and finally signals the end of the process, as follows:

    int main (int argc, const char * argv[])
    {

    MyVideoEncoder* videoEncoder = new MyVideoEncoder(512, 512, 512, 512, "output/testContainer.mp4", 25, 20);
    if(!videoEncoder->initWithCodec(MyVideoEncoder::H264))
    {
       std::cout << "something really bad happened. Exit!!" << std::endl;
       exit(-1);
    }

    /* encode 1 second of video */
    for(int i=0;i<228;i++) {

       std::stringstream filepath;
       filepath << "input2/image" << i << ".jpg";

       videoEncoder->encodeFrameFromJPG(const_cast<char*>(filepath.str().c_str()));

    }

    videoEncoder->endEncoding();

    }

    Hints

    I've seen a lot of examples of decoding a video and encoding it into another, but no working example of muxing a video from scratch, so I'm not sure how to proceed with the pts and dts packet values. That's the reason why I suspect the issue must be in the following method:

    bool MyVideoEncoder::encodeImageAsFrame(){
       bool res = false;


       pTempFrame->pts = frameCount * frameRate * 90; //90Hz by the standard for PTS-values
       frameCount++;

       /* encode the image */
       out_size = avcodec_encode_video(pVideoStream->codec, outbuf, outbuf_size, pTempFrame);


       if (out_size > 0) {
           AVPacket pkt;
           av_init_packet(&pkt);
           pkt.pts = pkt.dts = 0;

           if (pVideoStream->codec->coded_frame->pts != AV_NOPTS_VALUE) {
               pkt.pts = av_rescale_q(pVideoStream->codec->coded_frame->pts,
                       pVideoStream->codec->time_base, pVideoStream->time_base);
               pkt.dts = pTempFrame->pts;

           }
           if (pVideoStream->codec->coded_frame->key_frame) {
               pkt.flags |= AV_PKT_FLAG_KEY;
           }
           pkt.stream_index = pVideoStream->index;
           pkt.data = outbuf;
           pkt.size = out_size;

           res = (av_interleaved_write_frame(pFormatContext, &pkt) == 0);
       }


       return res;
    }
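
    For comparison only, here is a minimal sketch (not part of the original question) of one internally consistent way to set the timestamps with this same legacy API: the frame pts is counted in ticks of the codec time base (1/frameRate, as set in AddVideoStream below), and the packet pts and dts are both rescaled from that codec time base to the stream time base. It reuses the member names from the snippets in this post; the function name encodeImageAsFrameSketch is hypothetical, and it assumes the encoder emits no B-frames (which the baseline profile implies), so dts can simply equal pts.

    bool MyVideoEncoder::encodeImageAsFrameSketch() {
       // One tick of the codec time base (1/frameRate) per submitted frame.
       pTempFrame->pts = frameCount;
       frameCount++;

       out_size = avcodec_encode_video(pVideoStream->codec, outbuf, outbuf_size, pTempFrame);
       if (out_size <= 0)
           return false;

       AVPacket pkt;
       av_init_packet(&pkt);

       if (pVideoStream->codec->coded_frame->pts != AV_NOPTS_VALUE) {
           // Rescale the encoder's pts (codec time base) to the stream time base.
           pkt.pts = av_rescale_q(pVideoStream->codec->coded_frame->pts,
                   pVideoStream->codec->time_base, pVideoStream->time_base);
           // With no B-frames, decode order equals presentation order.
           pkt.dts = pkt.pts;
       }
       if (pVideoStream->codec->coded_frame->key_frame)
           pkt.flags |= AV_PKT_FLAG_KEY;

       pkt.stream_index = pVideoStream->index;
       pkt.data = outbuf;
       pkt.size = out_size;

       return av_interleaved_write_frame(pFormatContext, &pkt) == 0;
    }

    The only point of the sketch is that the frame pts, the packet pts and the packet dts are all derived from the same time base before rescaling, which is what lets the muxer work out a duration.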

    Any help or insight would be appreciated. Thanks in advance!

    P.S. The rest of the code, where the configuration is done, is the following:

    // MyVideoEncoder.cpp

    #include "MyVideoEncoder.h"
    #include "Image.hpp"
    #include <cstring>
    #include <sstream>
    #include <iostream> // header name lost in the page's HTML rendering; <iostream> assumed here, since std::cout is used below

    #define MAX_AUDIO_PACKET_SIZE (128 * 1024)



    MyVideoEncoder::MyVideoEncoder(int inwidth, int inheight,
           int outwidth, int outheight, char* fileOutput, int framerate,
           int compFactor) {
       inWidth = inwidth;
       inHeight = inheight;
       outWidth = outwidth;
       outHeight = outheight;
       pathToMovie = fileOutput;
       frameRate = framerate;
       compressionFactor = compFactor;
       frameCount = 0;

    }

    MyVideoEncoder::~MyVideoEncoder() {

    }

    bool MyVideoEncoder::initWithCodec(
           MyVideoEncoder::encoderType type) {
       if (!initializeEncoder(type))
           return false;

       if (!configureFrames())
           return false;

       return true;

    }

    bool MyVideoEncoder::encodeFrameFromJPG(char* filepath) {

       setJPEGImage(filepath);
       return encodeImageAsFrame();
    }



    bool MyVideoEncoder::encodeDelayedFrames(){
       bool res = false;

       while(out_size > 0)
       {
           pTempFrame->pts = frameCount * frameRate * 90; //90Hz by the standard for PTS-values
           frameCount++;

           out_size = avcodec_encode_video(pVideoStream->codec, outbuf, outbuf_size, NULL);

           if (out_size > 0)
           {
               AVPacket pkt;
           av_init_packet(&pkt);
               pkt.pts = pkt.dts = 0;

               if (pVideoStream->codec->coded_frame->pts != AV_NOPTS_VALUE) {
                   pkt.pts = av_rescale_q(pVideoStream->codec->coded_frame->pts,
                           pVideoStream->codec->time_base, pVideoStream->time_base);
                   pkt.dts = pTempFrame->pts;
               }
               if (pVideoStream->codec->coded_frame->key_frame) {
                   pkt.flags |= AV_PKT_FLAG_KEY;
               }
               pkt.stream_index = pVideoStream->index;
               pkt.data = outbuf;
               pkt.size = out_size;


               res = (av_interleaved_write_frame(pFormatContext, &pkt) == 0);
           }

       }

       return res;
    }






    void MyVideoEncoder::endEncoding() {
       encodeDelayedFrames();
       closeEncoder();
    }

    bool MyVideoEncoder::setJPEGImage(char* imgFilename) {
       Image* rgbImage = new Image();
       rgbImage->read_jpeg_image(imgFilename);

       bool ret = setImageFromRGBArray(rgbImage->get_data());

       delete rgbImage;

       return ret;
    }

    bool MyVideoEncoder::setImageFromRGBArray(unsigned char* data) {

       memcpy(pFrameRGB->data[0], data, 3 * inWidth * inHeight);

       int ret = sws_scale(img_convert_ctx, pFrameRGB->data, pFrameRGB->linesize,
               0, inHeight, pTempFrame->data, pTempFrame->linesize);

       pFrameRGB->pts++;
       if (ret)
           return true;
       else
           return false;
    }

    bool MyVideoEncoder::initializeEncoder(encoderType type) {

       av_register_all();

       pTempFrame = avcodec_alloc_frame();
       pTempFrame->pts = 0;
       pOutFormat = NULL;
       pFormatContext = NULL;
       pVideoStream = NULL;
       pAudioStream = NULL;
       bool res = false;

       // Create format
       switch (type) {
           case MyVideoEncoder::H264:
               pOutFormat = av_guess_format("h264", NULL, NULL);
               break;
           case MyVideoEncoder::MPEG1:
               pOutFormat = av_guess_format("mpeg", NULL, NULL);
               break;
           default:
               pOutFormat = av_guess_format(NULL, pathToMovie.c_str(), NULL);
               break;
       }

       if (!pOutFormat) {
           pOutFormat = av_guess_format(NULL, pathToMovie.c_str(), NULL);
           if (!pOutFormat) {
           std::cout << "output format not found" << std::endl;
               return false;
           }
       }


       // allocate context
       pFormatContext = avformat_alloc_context();
       if(!pFormatContext)
       {
           std::cout << "cannot alloc format context" << std::endl;
           return false;
       }

       pFormatContext->oformat = pOutFormat;

       memcpy(pFormatContext->filename, pathToMovie.c_str(), min( (const int) pathToMovie.length(), (const int)sizeof(pFormatContext->filename)));


       //Add video and audio streams
       pVideoStream = AddVideoStream(pFormatContext,
               pOutFormat->video_codec);

       // Set the output parameters
       av_dump_format(pFormatContext, 0, pathToMovie.c_str(), 1);

       // Open Video stream
       if (pVideoStream) {
           res = openVideo(pFormatContext, pVideoStream);
       }


       if (res && !(pOutFormat->flags & AVFMT_NOFILE)) {
           if (avio_open(&pFormatContext->pb, pathToMovie.c_str(), AVIO_FLAG_WRITE) < 0) {
               res = false;
               std::cout << "Cannot open output file" << std::endl;
           }
       }

       if (res) {
           avformat_write_header(pFormatContext,NULL);
       }
       else{
           freeMemory();
           std::cout << "Cannot init encoder" << std::endl;
       }


       return res;

    }



    AVStream *MyVideoEncoder::AddVideoStream(AVFormatContext *pContext, CodecID codec_id)
    {
     AVCodecContext *pCodecCxt = NULL;
     AVStream *st    = NULL;

     st = avformat_new_stream(pContext, NULL);
     if (!st)
     {
         std::cout << "Cannot add new video stream" << std::endl;
         return NULL;
     }
     st->id = 0;

     pCodecCxt = st->codec;
     pCodecCxt->codec_id = (CodecID)codec_id;
     pCodecCxt->codec_type = AVMEDIA_TYPE_VIDEO;
     pCodecCxt->frame_number = 0;


     // Put sample parameters.
     pCodecCxt->bit_rate = outWidth * outHeight * 3 * frameRate/ compressionFactor;

     pCodecCxt->width  = outWidth;
     pCodecCxt->height = outHeight;

     /* frames per second */
     pCodecCxt->time_base= (AVRational){1,frameRate};

     /* pixel format must be YUV */
     pCodecCxt->pix_fmt = PIX_FMT_YUV420P;


     if (pCodecCxt->codec_id == CODEC_ID_H264)
     {
         av_opt_set(pCodecCxt->priv_data, "preset", "slow", 0);
         av_opt_set(pCodecCxt->priv_data, "vprofile", "baseline", 0);
         pCodecCxt->max_b_frames = 16;
     }
     if (pCodecCxt->codec_id == CODEC_ID_MPEG1VIDEO)
     {
         pCodecCxt->mb_decision = 1;
     }

     if(pContext->oformat->flags & AVFMT_GLOBALHEADER)
     {
         pCodecCxt->flags |= CODEC_FLAG_GLOBAL_HEADER;
     }

     pCodecCxt->coder_type = 1;  // coder = 1
     pCodecCxt->flags|=CODEC_FLAG_LOOP_FILTER;   // flags=+loop
     pCodecCxt->me_range = 16;   // me_range=16
     pCodecCxt->gop_size = 50;  // g=250
     pCodecCxt->keyint_min = 25; // keyint_min=25


     return st;
    }


    bool MyVideoEncoder::openVideo(AVFormatContext *oc, AVStream *pStream)
    {
       AVCodec *pCodec;
       AVCodecContext *pContext;

       pContext = pStream->codec;

       // Find the video encoder.
       pCodec = avcodec_find_encoder(pContext->codec_id);
       if (!pCodec)
       {
           std::cout << "Cannot found video codec" << std::endl;
           return false;
       }

       // Open the codec.
       if (avcodec_open2(pContext, pCodec, NULL) < 0)
       {
           std::cout << "Cannot open video codec" << std::endl;
           return false;
       }


       return true;
    }



    bool MyVideoEncoder::configureFrames() {

       /* alloc image and output buffer */
       outbuf_size = outWidth*outHeight*3;
       outbuf = (uint8_t*) malloc(outbuf_size);

       av_image_alloc(pTempFrame->data, pTempFrame->linesize, pVideoStream->codec->width,
               pVideoStream->codec->height, pVideoStream->codec->pix_fmt, 1);

       //Alloc RGB temp frame
       pFrameRGB = avcodec_alloc_frame();
       if (pFrameRGB == NULL)
           return false;
       avpicture_alloc((AVPicture *) pFrameRGB, PIX_FMT_RGB24, inWidth, inHeight);

       pFrameRGB->pts = 0;

       //Set SWS context to convert from RGB images to YUV images
       if (img_convert_ctx == NULL) {
           img_convert_ctx = sws_getContext(inWidth, inHeight, PIX_FMT_RGB24,
                   outWidth, outHeight, pVideoStream->codec->pix_fmt, /*SWS_BICUBIC*/
                   SWS_FAST_BILINEAR, NULL, NULL, NULL);
           if (img_convert_ctx == NULL) {
               fprintf(stderr, "Cannot initialize the conversion context!\n");
               return false;
           }
       }

       return true;

    }

    void MyVideoEncoder::closeEncoder() {
       av_write_frame(pFormatContext, NULL);
       av_write_trailer(pFormatContext);
       freeMemory();
    }


    void MyVideoEncoder::freeMemory()
    {
     bool res = true;

     if (pFormatContext)
     {
       // close video stream
       if (pVideoStream)
       {
         closeVideo(pFormatContext, pVideoStream);
       }

       // Free the streams.
       for(size_t i = 0; i < pFormatContext->nb_streams; i++)
       {
         av_freep(&pFormatContext->streams[i]->codec);
         av_freep(&pFormatContext->streams[i]);
       }

       if (!(pFormatContext->flags & AVFMT_NOFILE) && pFormatContext->pb)
       {
         avio_close(pFormatContext->pb);
       }

       // Free the stream.
       av_free(pFormatContext);
       pFormatContext = NULL;
     }
    }

    void MyVideoEncoder::closeVideo(AVFormatContext *pContext, AVStream *pStream)
    {
     avcodec_close(pStream->codec);
     if (pTempFrame)
     {
       if (pTempFrame->data)
       {
         av_free(pTempFrame->data[0]);
         pTempFrame->data[0] = NULL;
       }
       av_free(pTempFrame);
       pTempFrame = NULL;
     }

     if (pFrameRGB)
     {
       if (pFrameRGB->data)
       {
         av_free(pFrameRGB->data[0]);
         pFrameRGB->data[0] = NULL;
       }
       av_free(pFrameRGB);
       pFrameRGB = NULL;
     }

    }
  • Playing videos on iPhone after video conversion using ffmpeg

    12 July 2013, by Shashi

    It is a backend API (developed in Ruby on Rails) used by iPhone and Android mobiles to upload videos. The API converts uploaded videos to MP4 format. I am using the paperclip-ffmpeg gem for video conversion on the backend. Here is what I use:

    has_attached_file :video ,
                 :styles=>{
                   :medium=>{ :geometry=>"480x360",:format=>'mp4',:streaming => true,
                       :convert_options => { :output =>
                                           {
                                             :acodec => 'aac',
                                             :ac => 2 ,
                                             :strict => 'experimental',
                                             'b:a' => '160k',

                                             :vcodec => 'libx264',
                                             :preset => 'slow',
                                             'profile:v' => 'baseline',
                                             :level => '30',
                                             :maxrate => '10000000',
                                             :bufsize => '10000000',
                                             'b:v' => '750k',
                                             :f => 'mp4',
                                             :threads => '0'
                                           }
                                       } },
                     :thumb => { :geometry => "160x120", :format => 'jpg', :time => 2 },
                     :thumb_large => { :geometry => "520x390", :format => 'jpg', :time => 2}
                   }, :processors => [:ffmpeg, :qtfaststart],
           :path => path/to/store/video,
           :url => url/of/video

    The problems I am facing are:

    When a user uploads a video from an Android mobile, it plays fine on Android but does not play on iPhone. One more strange thing happens with videos uploaded from iPhone: they play fine right after upload, but after some time (2 or 3 hours) they stop playing.

    How can I get rid of both these issues? Any suggestion would be appreciated. Thanks.

    @Michael, here is the detailed log for the conversion of a video uploaded from an Android mobile:

    ffmpeg version git-2013-05-28-ced0307 Copyright (c) 2000-2013 the FFmpeg developers
     built on May 28 2013 07:45:18 with gcc 4.6 (Ubuntu/Linaro 4.6.3-1ubuntu5)
     configuration: --extra-libs=-ldl --enable-gpl --enable-libass --enable-libfdk-aac --enable-libmp3lame --enable-libopencore-amrnb --enable-libopencore-amrwb --enable-libspeex --enable-libtheora --enable-libvorbis --enable-libopus --enable-libvpx --enable-x11grab --enable-libx264 --enable-nonfree --enable-version3
     libavutil      52. 34.100 / 52. 34.100
     libavcodec     55. 12.100 / 55. 12.100
     libavformat    55.  7.100 / 55.  7.100
     libavdevice    55.  1.101 / 55.  1.101
     libavfilter     3. 72.100 /  3. 72.100
     libswscale      2.  3.100 /  2.  3.100
     libswresample   0. 17.102 /  0. 17.102
     libpostproc    52.  3.100 / 52.  3.100
     Input #0, mov,mp4,m4a,3gp,3g2,mj2, from '/tmp/RackMultipart20130620-28711-187049m20130620-28711-2x7b7l':
     Metadata:
       major_brand     : isom
       minor_version   : 0
       compatible_brands: isom3gp4
       creation_time   : 2013-06-19 15:39:02
     Duration: 00:00:25.88, start: 0.000000, bitrate: 16700 kb/s
       Stream #0:0(eng): Video: h264 (High) (avc1 / 0x31637661), yuv420p, 1920x1080, 16829 kb/s, 29.75 fps, 30 tbr, 90k tbn, 180k tbc
       Metadata:
         rotate          : 90
         creation_time   : 2013-06-19 15:39:02
         handler_name    : VideoHandle
       Stream #0:1(eng): Audio: aac (mp4a / 0x6134706D), 48000 Hz, stereo, fltp, 122 kb/s
       Metadata:
         creation_time   : 2013-06-19 15:39:02
         handler_name    : SoundHandle
    [libx264 @ 0x2613c00] using cpu capabilities: MMX2 SSE2Fast SSEMisalign LZCNT
    [libx264 @ 0x2613c00] profile Constrained Baseline, level 3.0
    [libx264 @ 0x2613c00] 264 - core 133 r2 a3ac64b - H.264/MPEG-4 AVC codec - Copyleft 2003-2013 - http://www.videolan.org/x264.html - options: cabac=0 ref=5 deblock=1:0:0 analyse=0x1:0x111 me=umh subme=8 psy=1 psy_rd=1.00:0.00 mixed_ref=1 me_range=16 chroma_me=1 trellis=1 8x8dct=0 cqm=0 deadzone=21,11 fast_pskip=1 chroma_qp_offset=-2 threads=3 lookahead_threads=1 sliced_threads=0 nr=0 decimate=1 interlaced=0 bluray_compat=0 constrained_intra=0 bframes=0 weightp=0 keyint=250 keyint_min=25 scenecut=40 intra_refresh=0 rc_lookahead=50 rc=abr mbtree=1 bitrate=750 ratetol=1.0 qcomp=0.60 qpmin=0 qpmax=69 qpstep=4 vbv_maxrate=10000 vbv_bufsize=10000 nal_hrd=none ip_ratio=1.40 aq=1:1.00
     Output #0, mp4, to '/tmp/RackMultipart20130620-28711-187049m20130620-28711-2x7b7l20130620-28711-uibd1p.mp4':
     Metadata:
       major_brand     : isom
       minor_version   : 0
       compatible_brands: isom3gp4
       encoder         : Lavf55.7.100
       Stream #0:0(eng): Video: h264 (libx264) ([33][0][0][0] / 0x0021), yuv420p, 640x360, q=-1--1, 750 kb/s, 15360 tbn, 30 tbc
       Metadata:
         rotate          : 90
         creation_time   : 2013-06-19 15:39:02
         handler_name    : VideoHandle
       Stream #0:1(eng): Audio: aac ([64][0][0][0] / 0x0040), 48000 Hz, stereo, fltp, 160 kb/s
       Metadata:
         creation_time   : 2013-06-19 15:39:02
         handler_name    : SoundHandle
    Stream mapping:
     Stream #0:0 -> #0:0 (h264 -> libx264)
     Stream #0:1 -> #0:1 (aac -> aac)
    Press [q] to stop, [?] for help
    frame=  765 fps= 19 q=-1.0 Lsize=    2621kB time=00:00:25.88 bitrate= 829.4kbits/s dup=7 drop=0    
    video:2093kB audio:504kB subtitle:0 global headers:0kB muxing overhead 0.878228%
    [libx264 @ 0x2613c00] frame I:4     Avg QP:25.48  size: 11731
    [libx264 @ 0x2613c00] frame P:761   Avg QP:26.71  size:  2754
    [libx264 @ 0x2613c00] mb I  I16..4: 33.0%  0.0% 67.0%
    [libx264 @ 0x2613c00] mb P  I16..4:  2.1%  0.0%  0.9%  P16..4: 47.2%  9.5%  3.4%  0.0%  0.0%    skip:36.9%
    [libx264 @ 0x2613c00] final ratefactor: 23.58
    [libx264 @ 0x2613c00] coded y,uvDC,uvAC intra: 31.9% 54.5% 9.5% inter: 11.9% 24.4% 0.5%
    [libx264 @ 0x2613c00] i16 v,h,dc,p: 25% 33%  7% 34%
    [libx264 @ 0x2613c00] i4 v,h,dc,ddl,ddr,vr,hd,vl,hu: 17% 14%  9% 14% 10% 10% 11%  8%  7%
    [libx264 @ 0x2613c00] i8c dc,h,v,p: 50% 25% 17%  9%
    [libx264 @ 0x2613c00] ref P L0: 67.6% 15.2% 10.1%  3.9%  3.2%
    [libx264 @ 0x2613c00] kb/s:672.30
    ffmpeg version git-2013-05-28-ced0307 Copyright (c) 2000-2013 the FFmpeg developers
     built on May 28 2013 07:45:18 with gcc 4.6 (Ubuntu/Linaro 4.6.3-1ubuntu5)
     configuration: --extra-libs=-ldl --enable-gpl --enable-libass --enable-libfdk-aac --enable-libmp3lame --enable-libopencore-amrnb --enable-libopencore-amrwb --enable-libspeex --enable-libtheora --enable-libvorbis --enable-libopus --enable-libvpx --enable-x11grab --enable-libx264 --enable-nonfree --enable-version3
     libavutil      52. 34.100 / 52. 34.100
     libavcodec     55. 12.100 / 55. 12.100
     libavformat    55.  7.100 / 55.  7.100
     libavdevice    55.  1.101 / 55.  1.101
      libavfilter     3. 72.100 /  3. 72.100
     libswscale      2.  3.100 /  2.  3.100
     libswresample   0. 17.102 /  0. 17.102
     libpostproc    52.  3.100 / 52.  3.100
     Input #0, mov,mp4,m4a,3gp,3g2,mj2, from '/tmp/RackMultipart20130620-28711-187049m20130620-28711-2x7b7l20130620-28711-uibd1p20130620-28711-1tgemjl20130620-28711-15qsi06.mp4':
     Metadata:
       major_brand     : isom
       minor_version   : 512
       compatible_brands: isomiso2avc1mp41
       encoder         : Lavf55.7.100
     Duration: 00:00:25.91, start: 0.021333, bitrate: 828 kb/s
       Stream #0:0(eng): Video: h264 (Constrained Baseline) (avc1 / 0x31637661), yuv420p, 640x360, 672 kb/s, 30 fps, 30 tbr, 15360 tbn, 60 tbc
       Metadata:
         rotate          : 90
         handler_name    : VideoHandler
       Stream #0:1(eng): Audio: aac (mp4a / 0x6134706D), 48000 Hz, stereo, fltp, 159 kb/s
       Metadata:
         handler_name    : SoundHandler
    [libx264 @ 0x326d820] using cpu capabilities: MMX2 SSE2Fast SSEMisalign LZCNT
    [libx264 @ 0x326d820] profile Constrained Baseline, level 3.0
    [libx264 @ 0x326d820] 264 - core 133 r2 a3ac64b - H.264/MPEG-4 AVC codec - Copyleft 2003-2013 - http://www.videolan.org/x264.html - options: cabac=0 ref=5 deblock=1:0:0 analyse=0x1:0x111 me=umh subme=8 psy=1 psy_rd=1.00:0.00 mixed_ref=1 me_range=16 chroma_me=1 trellis=1 8x8dct=0 cqm=0 deadzone=21,11 fast_pskip=1 chroma_qp_offset=-2 threads=3 lookahead_threads=1 sliced_threads=0 nr=0 decimate=1 interlaced=0 bluray_compat=0 constrained_intra=0 bframes=0 weightp=0 keyint=250 keyint_min=25 scenecut=40 intra_refresh=0 rc_lookahead=50 rc=abr mbtree=1 bitrate=750 ratetol=1.0 qcomp=0.60 qpmin=0 qpmax=69 qpstep=4 vbv_maxrate=10000 vbv_bufsize=10000 nal_hrd=none ip_ratio=1.40 aq=1:1.00
     Output #0, mp4, to '/tmp/RackMultipart20130620-28711-187049m20130620-28711-2x7b7l20130620-28711-uibd1p20130620-28711-1tgemjl20130620-28711-15qsi0620130620-28711-130rd1i.mp4':
     Metadata:
       major_brand     : isom
       minor_version   : 512
       compatible_brands: isomiso2avc1mp41
       encoder         : Lavf55.7.100
       Stream #0:0(eng): Video: h264 (libx264) ([33][0][0][0] / 0x0021), yuv420p, 480x270, q=-1--1, 750 kb/s, 15360 tbn, 30 tbc
       Metadata:
         rotate          : 90
         handler_name    : VideoHandler
       Stream #0:1(eng): Audio: aac ([64][0][0][0] / 0x0040), 48000 Hz, stereo, fltp, 160 kb/s
       Metadata:
         handler_name    : SoundHandler
    Stream mapping:
     Stream #0:0 -> #0:0 (h264 -> libx264)
     Stream #0:1 -> #0:1 (aac -> aac)
    Press [q] to stop, [?] for help
    frame=  766 fps= 38 q=-1.0 Lsize=    2666kB time=00:00:25.90 bitrate= 843.2kbits/s dup=1 drop=0    
    video:2138kB audio:505kB subtitle:0 global headers:0kB muxing overhead 0.863633%
    [libx264 @ 0x326d820] frame I:4     Avg QP:20.82  size: 11283
    [libx264 @ 0x326d820] frame P:762   Avg QP:22.03  size:  2813
    [libx264 @ 0x326d820] mb I  I16..4: 19.9%  0.0% 80.1%
    [libx264 @ 0x326d820] mb P  I16..4:  1.7%  0.0%  1.1%  P16..4: 49.3% 20.6%  7.0%  0.0%  0.0%    skip:20.3%
    [libx264 @ 0x326d820] final ratefactor: 19.24
    [libx264 @ 0x326d820] coded y,uvDC,uvAC intra: 43.4% 76.7% 29.0% inter: 23.0% 44.6% 1.7%
    [libx264 @ 0x326d820] i16 v,h,dc,p: 23% 29%  9% 39%
    [libx264 @ 0x326d820] i4 v,h,dc,ddl,ddr,vr,hd,vl,hu: 12% 21%  9% 11% 10% 10% 11%  8%  8%
    [libx264 @ 0x326d820] i8c dc,h,v,p: 44% 26% 18% 12%
    [libx264 @ 0x326d820] ref P L0: 74.0% 13.1%  7.7%  3.0%  2.1%
    [libx264 @ 0x326d820] kb/s:685.81
    ffmpeg version git-2013-05-28-ced0307 Copyright (c) 2000-2013 the FFmpeg developers
     built on May 28 2013 07:45:18 with gcc 4.6 (Ubuntu/Linaro 4.6.3-1ubuntu5)
     configuration: --extra-libs=-ldl --enable-gpl --enable-libass --enable-libfdk-aac --enable-libmp3lame --enable-libopencore-amrnb --enable-libopencore-amrwb --enable-libspeex --enable-libtheora --enable-libvorbis --enable-libopus --enable-libvpx --enable-x11grab --enable-libx264 --enable-nonfree --enable-version3
     libavutil      52. 34.100 / 52. 34.100
     libavcodec     55. 12.100 / 55. 12.100
     libavformat    55.  7.100 / 55.  7.100
     libavdevice    55.  1.101 / 55.  1.101
     libavfilter     3. 72.100 /  3. 72.100
     libswscale      2.  3.100 /  2.  3.100
     libswresample   0. 17.102 /  0. 17.102
     libpostproc    52.  3.100 / 52.  3.100
     Input #0, mov,mp4,m4a,3gp,3g2,mj2, from '/tmp/RackMultipart20130620-28711-187049m20130620-28711-2x7b7l20130620-28711-uibd1p20130620-28711-1tgemjl20130620-28711-15qsi06.mp4':
     Metadata:
       major_brand     : isom
       minor_version   : 512
       compatible_brands: isomiso2avc1mp41
       encoder         : Lavf55.7.100
     Duration: 00:00:25.91, start: 0.021333, bitrate: 828 kb/s
       Stream #0:0(eng): Video: h264 (Constrained Baseline) (avc1 / 0x31637661), yuv420p, 640x360, 672 kb/s, 30 fps, 30 tbr, 15360 tbn, 60 tbc
       Metadata:
         rotate          : 90
         handler_name    : VideoHandler
       Stream #0:1(eng): Audio: aac (mp4a / 0x6134706D), 48000 Hz, stereo, fltp, 159 kb/s
       Metadata:
         handler_name    : SoundHandler
     Output #0, image2, to '/tmp/RackMultipart20130620-28711-187049m20130620-28711-2x7b7l20130620-28711-uibd1p20130620-28711-1tgemjl20130620-28711-15qsi0620130620-28711-1bdslpu.jpg':
     Metadata:
       major_brand     : isom
       minor_version   : 512
       compatible_brands: isomiso2avc1mp41
       encoder         : Lavf55.7.100
       Stream #0:0(eng): Video: mjpeg, yuvj420p, 160x90, q=2-31, 200 kb/s, 90k tbn, 30 tbc
       Metadata:
         rotate          : 90
         handler_name    : VideoHandler
    Stream mapping:
     Stream #0:0 -> #0:0 (h264 -> mjpeg)
    Press [q] to stop, [?] for help
    frame=    1 fps=0.0 q=0.0 Lsize=N/A time=00:00:00.03 bitrate=N/A dup=0 drop=57    
    video:4kB audio:0kB subtitle:0 global headers:0kB muxing overhead -100.479303%
    ffmpeg version git-2013-05-28-ced0307 Copyright (c) 2000-2013 the FFmpeg developers
     built on May 28 2013 07:45:18 with gcc 4.6 (Ubuntu/Linaro 4.6.3-1ubuntu5)
     configuration: --extra-libs=-ldl --enable-gpl --enable-libass --enable-libfdk-aac --enable-libmp3lame --enable-libopencore-amrnb --enable-libopencore-amrwb --enable-libspeex --enable-libtheora --enable-libvorbis --enable-libopus --enable-libvpx --enable-x11grab --enable-libx264 --enable-nonfree --enable-version3
     libavutil      52. 34.100 / 52. 34.100
     libavcodec     55. 12.100 / 55. 12.100
     libavformat    55.  7.100 / 55.  7.100
     libavdevice    55.  1.101 / 55.  1.101
     libavfilter     3. 72.100 /  3. 72.100
     libswscale      2.  3.100 /  2.  3.100
     libswresample   0. 17.102 /  0. 17.102
     libpostproc    52.  3.100 / 52.  3.100
     Input #0, mov,mp4,m4a,3gp,3g2,mj2, from '/tmp/RackMultipart20130620-28711-187049m20130620-28711-2x7b7l20130620-28711-uibd1p20130620-28711-1tgemjl20130620-28711-15qsi06.mp4':
     Metadata:
       major_brand     : isom
       minor_version   : 512
       compatible_brands: isomiso2avc1mp41
       encoder         : Lavf55.7.100
     Duration: 00:00:25.91, start: 0.021333, bitrate: 828 kb/s
       Stream #0:0(eng): Video: h264 (Constrained Baseline) (avc1 / 0x31637661), yuv420p, 640x360, 672 kb/s, 30 fps, 30 tbr, 15360 tbn, 60 tbc
       Metadata:
         rotate          : 90
         handler_name    : VideoHandler
       Stream #0:1(eng): Audio: aac (mp4a / 0x6134706D), 48000 Hz, stereo, fltp, 159 kb/s
       Metadata:
         handler_name    : SoundHandler
     Output #0, image2, to '/tmp/RackMultipart20130620-28711-187049m20130620-28711-2x7b7l20130620-28711-uibd1p20130620-28711-1tgemjl20130620-28711-15qsi0620130620-28711-1vp4al6.jpg':
     Metadata:
       major_brand     : isom
       minor_version   : 512
       compatible_brands: isomiso2avc1mp41
       encoder         : Lavf55.7.100
       Stream #0:0(eng): Video: mjpeg, yuvj420p, 520x292, q=2-31, 200 kb/s, 90k tbn, 30 tbc
       Metadata:
         rotate          : 90
         handler_name    : VideoHandler
    Stream mapping:
     Stream #0:0 -> #0:0 (h264 -> mjpeg)
    Press [q] to stop, [?] for help
    frame=    1 fps=0.0 q=0.0 Lsize=N/A time=00:00:00.03 bitrate=N/A dup=0 drop=57    
    video:19kB audio:0kB subtitle:0 global headers:0kB muxing overhead -100.113677%


    Started POST "/api/v1/videos/create" for 69.42.0.13 at 2013-06-20 09:41:50 -0500
     Command :: file -b --mime '/tmp/RackMultipart20130620-28711-187049m'
     [paperclip] [ffmpeg] ffmpeg -i "/tmp/RackMultipart20130620-28711-187049m20130620-28711-2x7b7l" 2>&1
     [paperclip] [ffmpeg] Command Success
     [paperclip] [ffmpeg] Making...
     [paperclip] [ffmpeg] Building Destination File: 'RackMultipart20130620-28711-187049m20130620-28711-2x7b7l' + 'mp4'
    [paperclip] [ffmpeg] Destination File Built
    [paperclip] [ffmpeg] Adding Geometry
    [paperclip] [ffmpeg] Extracting Target Dimensions
    [paperclip] [ffmpeg] Target Size is Available
    [paperclip] [ffmpeg] Keeping Aspect Ratio
    [paperclip] [ffmpeg] Resize
    [paperclip] [ffmpeg] Convert Options: 640x360
    [paperclip] [ffmpeg] Adding Format
    [paperclip] [ffmpeg] Adding Source
    [paperclip] [ffmpeg] Building Parameters
    [paperclip] [ffmpeg] -i :source -acodec aac -ac 2 -strict experimental -b:a 160k -vcodec libx264 -preset slow -profile:v baseline -level 30 -maxrate 10000000 -bufsize 10000000 -b:v 750k -f mp4 -threads 0 -s 640x360 -y :dest
     Command :: ffmpeg -i '/tmp/RackMultipart20130620-28711-187049m20130620-28711-2x7b7l' -acodec aac -ac 2 -strict experimental -b:a 160k -vcodec libx264 -preset slow -profile:v baseline -level 30 -maxrate 10000000 -bufsize 10000000 -b:v 750k -f mp4 -threads 0 -s 640x360 -y '/tmp/RackMultipart20130620-28711-187049m20130620-28711-2x7b7l20130620-28711-uibd1p.mp4'
     [paperclip] [qt-faststart] :source :dest
     Command :: qt-faststart '/tmp/RackMultipart20130620-28711-187049m20130620-28711-2x7b7l20130620-28711-uibd1p.mp4' '/tmp/RackMultipart20130620-28711-187049m20130620-28711-2x7b7l20130620-28711-uibd1p20130620-28711-1tgemjl.mp4'
     [paperclip] [ffmpeg] ffmpeg -i "/tmp/RackMultipart20130620-28711-187049m20130620-28711-2x7b7l20130620-28711-uibd1p20130620-28711-1tgemjl20130620-28711-15qsi06.mp4" 2>&1
    [paperclip] [ffmpeg] Command Success
    [paperclip] [ffmpeg] Making...
     [paperclip] [ffmpeg] Building Destination File: 'RackMultipart20130620-28711-187049m20130620-28711-2x7b7l20130620-28711-uibd1p20130620-28711-1tgemjl20130620-28711-15qsi06' + 'mp4'
    [paperclip] [ffmpeg] Destination File Built
    [paperclip] [ffmpeg] Adding Geometry
    [paperclip] [ffmpeg] Extracting Target Dimensions
    [paperclip] [ffmpeg] Target Size is Available
    [paperclip] [ffmpeg] Keeping Aspect Ratio
    [paperclip] [ffmpeg] Resize
    [paperclip] [ffmpeg] Convert Options: 480x270
    [paperclip] [ffmpeg] Adding Format
    [paperclip] [ffmpeg] Adding Source
    [paperclip] [ffmpeg] Building Parameters
    [paperclip] [ffmpeg] -i :source -acodec aac -ac 2 -strict experimental -b:a 160k -vcodec libx264 -preset slow -profile:v baseline -level 30 -maxrate 10000000 -bufsize 10000000 -b:v 750k -f mp4 -threads 0 -s 480x270 -y :dest
     Command :: ffmpeg -i '/tmp/RackMultipart20130620-28711-187049m20130620-28711-2x7b7l20130620-28711-uibd1p20130620-28711-1tgemjl20130620-28711-15qsi06.mp4' -acodec aac -ac 2 -strict experimental -b:a 160k -vcodec libx264 -preset slow -profile:v baseline -level 30 -maxrate 10000000 -bufsize 10000000 -b:v 750k -f mp4 -threads 0 -s 480x270 -y '/tmp/RackMultipart20130620-28711-187049m20130620-28711-2x7b7l20130620-28711-uibd1p20130620-28711-1tgemjl20130620-28711-15qsi0620130620-28711-130rd1i.mp4'
     [paperclip] [qt-faststart] :source :dest
     Command :: qt-faststart '/tmp/RackMultipart20130620-28711-187049m20130620-28711-2x7b7l20130620-28711-uibd1p20130620-28711-1tgemjl20130620-28711-15qsi0620130620-28711-130rd1i.mp4' '/tmp/RackMultipart20130620-28711-187049m20130620-28711-2x7b7l20130620-28711-uibd1p20130620-28711-1tgemjl20130620-28711-15qsi0620130620-28711-130rd1i20130620-28711-snde6v.mp4'
     [paperclip] [ffmpeg] ffmpeg -i "/tmp/RackMultipart20130620-28711-187049m20130620-28711-2x7b7l20130620-28711-uibd1p20130620-28711-1tgemjl20130620-28711-15qsi06.mp4" 2>&1
    [paperclip] [ffmpeg] Command Success
    [paperclip] [ffmpeg] Making...
     [paperclip] [ffmpeg] Building Destination File: 'RackMultipart20130620-28711-187049m20130620-28711-2x7b7l20130620-28711-uibd1p20130620-28711-1tgemjl20130620-28711-15qsi06' + 'jpg'
    [paperclip] [ffmpeg] Destination File Built
    [paperclip] [ffmpeg] Adding Geometry
    [paperclip] [ffmpeg] Extracting Target Dimensions
    [paperclip] [ffmpeg] Target Size is Available
    [paperclip] [ffmpeg] Keeping Aspect Ratio
    [paperclip] [ffmpeg] Resize
    [paperclip] [ffmpeg] Convert Options: 160x90
    [paperclip] [ffmpeg] Adding Format
    [paperclip] [ffmpeg] Adding Source
    [paperclip] [ffmpeg] Building Parameters
    [paperclip] [ffmpeg] -ss 2 -i :source -s 160x90 -vframes 1 -f image2 -y :dest
     Command :: ffmpeg -ss 2 -i '/tmp/RackMultipart20130620-28711-187049m20130620-28711-2x7b7l20130620-28711-uibd1p20130620-28711-1tgemjl20130620-28711-15qsi06.mp4' -s 160x90 -vframes 1 -f image2 -y '/tmp/RackMultipart20130620-28711-187049m20130620-28711-2x7b7l20130620-28711-uibd1p20130620-28711-1tgemjl20130620-28711-15qsi0620130620-28711-1bdslpu.jpg'
     [paperclip] [ffmpeg] ffmpeg -i "/tmp/RackMultipart20130620-28711-187049m20130620-28711-2x7b7l20130620-28711-uibd1p20130620-28711-1tgemjl20130620-28711-15qsi06.mp4" 2>&1
    [paperclip] [ffmpeg] Command Success
    [paperclip] [ffmpeg] Making...
     [paperclip] [ffmpeg] Building Destination File: 'RackMultipart20130620-28711-187049m20130620-28711-2x7b7l20130620-28711-uibd1p20130620-28711-1tgemjl20130620-28711-15qsi06' + 'jpg'
    [paperclip] [ffmpeg] Destination File Built
    [paperclip] [ffmpeg] Adding Geometry
    [paperclip] [ffmpeg] Extracting Target Dimensions
    [paperclip] [ffmpeg] Target Size is Available
    [paperclip] [ffmpeg] Keeping Aspect Ratio
    [paperclip] [ffmpeg] Resize
    [paperclip] [ffmpeg] Convert Options: 520x292
    [paperclip] [ffmpeg] Adding Format
    [paperclip] [ffmpeg] Adding Source
    [paperclip] [ffmpeg] Building Parameters
    [paperclip] [ffmpeg] -ss 2 -i :source -s 520x292 -vframes 1 -f image2 -y :dest
     Command :: ffmpeg -ss 2 -i '/tmp/RackMultipart20130620-28711-187049m20130620-28711-2x7b7l20130620-28711-uibd1p20130620-28711-1tgemjl20130620-28711-15qsi06.mp4' -s 520x292 -vframes 1 -f image2 -y '/tmp/RackMultipart20130620-28711-187049m20130620-28711-2x7b7l20130620-28711-uibd1p20130620-28711-1tgemjl20130620-28711-15qsi0620130620-28711-1vp4al6.jpg'
    [paperclip] Saving attachments.