
Other articles (27)

  • Publishing on MédiaSpip

    13 June 2013

    Can I post content from an iPad tablet?
    Yes, if your installed MédiaSpip is at version 0.2 or higher. If necessary, contact the administrator of your MédiaSpip to find out.

  • Final creation of the channel

    12 March 2010

    Once your request has been approved, you can proceed with the actual creation of the channel. Each channel is a fully fledged site placed under your responsibility. The platform administrators have no access to it.
    Upon approval, you receive an email inviting you to create your channel.
    To do so, simply go to its address, in our example "http://votre_sous_domaine.mediaspip.net".
    At that point you are asked for a password; you simply have to (...)

  • The farm's regular Cron tasks

    1 December 2010

    Managing the farm relies on the regular execution of several repetitive tasks known as Cron tasks.
    The super Cron (gestion_mutu_super_cron)
    This task, scheduled every minute, simply calls the Cron of every instance of the mutualisation on a regular basis. Coupled with a system Cron on the central site of the mutualisation, this makes it possible to generate regular visits to the various sites and prevent the tasks of rarely visited sites from being too (...)
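
    As a rough illustration (not taken from the article), the system Cron on the central site can be a plain crontab entry that simply requests the site every minute, since SPIP runs its scheduled tasks on page visits, which is exactly the "regular visits" the article describes. The URL below is a placeholder to adapt to the actual installation.

        # Hypothetical crontab entry on the central server of the farm;
        # the visited URL is a placeholder, not the real address.
        * * * * * curl --silent --output /dev/null "http://mediaspip.example.org/"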

On other sites (2405)

  • split video (avi/h264) on keyframe

    30 November 2012, by m.sr

    Hello.

    I have a big video file. ffmpeg, tcprobe and other tools say it is an h264 stream in an AVI container.

    Now I'd like to cut out small chunks from the video.

    1. Problem: The index of the video seems corrupted/destroyed. I kind of fixed this via mplayer -forceidx -saveidx <indexfile> <bigvideofile>. The problem here is that I'm now stuck with mplayer/mencoder, which can use this index file via -loadidx <indexfile>. I have tried correcting the index as described in man aviindex (mplayer -frames 0 -saveidx mpidx broken.avi ; aviindex -i mpidx -o tcindex ; avimerge -x tcindex -i broken.avi -o fixed.avi), but this didn't fix my video, meaning that most tools I've tested couldn't seek in the video file.

    2. Problem: I cut out parts of the video with the following command: mencoder -loadidx in.idx -ss 8578 -endpos 20 -oac faac -ovc x264 -sws 9 -lavfopts format=mp4 -x264encopts <lotsofopts> -of lavf -vf scale=800:-10,harddup in.avi -o out.mp4. The problem here is that some videos are corrupted at the beginning. I think this is because I do not necessarily cut at a keyframe.

    Questions:

    1. What is the best way to fix the index of an AVI "inline" so that every tool can again work with it as expected?

    2. How can I split at keyframes? Is there an mencoder option for this?

    3. Do keyframes come at a fixed frequency? How can I find out that frequency? (With a bit of math it should then be possible to calculate the next keyframe and cut there.)

    4. Is there perhaps some completely different way to split this movie? Doing it by hand is not an option, I have to cut out 1000+ chunks ...

    Thanks a lot !
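
    A hedged sketch of one possible approach, not taken from the post: with ffmpeg, putting -ss before -i and stream-copying (-c copy) avoids re-encoding, so each chunk necessarily starts on a keyframe, and ffprobe can list keyframe timestamps for planning the cut points. File names and times below are only illustrative.

        # Cut a ~20 s chunk starting near 8578 s without re-encoding;
        # with stream copy the chunk begins at the nearest preceding keyframe.
        ffmpeg -ss 8578 -i fixed.avi -t 20 -c copy chunk_0001.avi

        # List keyframe flags and timestamps (entry names differ slightly between ffprobe versions).
        ffprobe -select_streams v -show_entries frame=key_frame,pkt_pts_time fixed.avi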

  • Transcoding a video from GoToMeeting WMV to MP4 using FFMPEG on Windows locks up dos window [migrated]

    22 August 2011, by Ryan

    I've been tasked with converting some pre-recorded GoToMeeting videos from WMV to MP4 format. I've got the GoToMeeting transcoder app available from Citrix (g2mtranscoder.exe) that takes the source WMV and makes it so that ffmpeg can work with it. The problem I have is when I then try to take that WMV and transcode it to MP4.

    I'm using the latest pre-compiled static version of ffmpeg for Windows from here: http://ffmpeg.zeranoe.com/builds/

    Here is the command line I'm using to begin the conversion:

    "C:\ffmpeg\ffmpeg.exe" -i "999_1366_768_g2m_transcoded.wmv" -f "mp4" -b "1000k" -r "30" -ab "128k" -ar "22050" -s "1366x768" -strict "experimental" -y "999.mp4"

    Here is FFmpeg's output. The last line is where the DOS window seems to stop executing/lock up (when I say lock up, I mean just the DOS window is locked up, not the entire machine):

    ffmpeg version N-31932-g41bf67d, Copyright (c) 2000-2011 the FFmpeg developers built on Aug 16 2011 18:54:12 with gcc 4.6.1
    configuration: --enable-gpl --enable-version3 --enable-memalign-hack --enable-runtime-cpudetect --enable-avisynth --enable-bzlib --enable-frei0r --enable-libopencore-amrnb --enable-libopencore-amrwb --enable-libfreetype --enable-libgsm --enable-libmp3lame --enable-libopenjpeg --enable-librtmp --enable-libschroedinger --enable-libspeex --enable-libtheora --enable-libvorbis --enable-libvpx --enable-libx264 --enable-libxavs --enable-libxvid --enable-zlib
     libavutil    51. 12. 0 / 51. 12. 0
     libavcodec   53. 10. 0 / 53. 10. 0
     libavformat  53.  7. 0 / 53.  7. 0
     libavdevice  53.  3. 0 / 53.  3. 0
     libavfilter   2. 31. 1 /  2. 31. 1
     libswscale    2.  0. 0 /  2.  0. 0
     libpostproc  51.  2. 0 / 51.  2. 0
    [asf @ 01BBB600] max_analyze_duration 5000000 reached at 5194000

    Seems stream 1 codec frame rate differs from container frame rate: 1000.00 (1000/1) -> 59.92 (719/12)
    Input #0, asf, from 'D:\Inetpub\g2m\Transcoding\999_1366_768_g2m_transcoded.wmv':
     Metadata:
       WMFSDKVersion   : 10.00.00.4007
       WMFSDKNeeded    : 0.0.0.0000
       IsVBR           : 1
       VBR Peak        : 4403
       Buffer Average  : 1470
       WM/ToolVersion  : 4.8 Build 723
       WM/ToolName     : GoToMeeting
       BitRateFrom the writer: 1553
       Audio samples   : 6221
       Video samples   : 3455
       recording time  : Mon, 22 Aug 2011 11:48:42 Eastern Daylight Time
     Duration: 00:38:30.71, start: 0.000000, bitrate: 160 kb/s
       Stream #0.0: Audio: wmav2, 44100 Hz, 1 channels, s16, 48 kb/s
       Stream #0.1: Video: wmv3 (Main), yuv420p, 1366x768, 2154 kb/s, 59.92 tbr, 1k tbn, 1k tbc
    [buffer @ 01BBC120] w:1366 h:768 pixfmt:yuv420p tb:1/1000000 sar:0/1 sws_param:
    Output #0, mp4, to '\\nas4\FMS_APPLICATIONS\SVC\sk12\webinars\999.mp4':
     Metadata:
       WMFSDKVersion   : 10.00.00.4007
       WMFSDKNeeded    : 0.0.0.0000
       IsVBR           : 1
       VBR Peak        : 4403
       Buffer Average  : 1470
       WM/ToolVersion  : 4.8 Build 723
       WM/ToolName     : GoToMeeting
       BitRateFrom the writer: 1553
       Audio samples   : 6221
       Video samples   : 3455
       recording time  : Mon, 22 Aug 2011 11:48:42 Eastern Daylight Time
       encoder         : Lavf53.7.0
       Stream #0.0: Video: mpeg4, yuv420p, 1366x768, q=2-31, 1000 kb/s, 30 tbn, 30 tbc
       Stream #0.1: Audio: aac, 22050 Hz, 1 channels, s16, 128 kb/s
    Stream mapping:
     Stream #0.1 -> #0.0
     Stream #0.0 -> #0.1
    Press [q] to stop, [?] for help
    frame=  149 fps= 63 q=23.5 size=    1103kB time=00:00:04.96 bitrate=1818.6kbits/s dup=144 drop=0

    Does anyone have any idea why it's locking up the DOS window and essentially failing to convert the file to MP4? I had a friend test it on a Linux box and he was able to convert the video just fine.

    Thanks
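
    Purely as a hypothetical variation, not taken from the post: since the log shows the output going to a network share (\\nas4\...), writing to a local path first and capturing the full console output to a file can help narrow down where the command stalls. The paths below are placeholders.

        REM Hypothetical test run (cmd.exe): local output path, stderr captured to a log file.
        "C:\ffmpeg\ffmpeg.exe" -i "999_1366_768_g2m_transcoded.wmv" -f mp4 -b 1000k -r 30 -ab 128k -ar 22050 -s 1366x768 -strict experimental -y "C:\temp\999.mp4" 2> "C:\temp\ffmpeg_log.txt"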

  • Video created using H263 codec and ffmpeg does not play on android device [closed]

    21 March 2013, by susheel tickoo

    I have created a video using FFmpeg and the H263 codec, but when I play the video on an Android device the player is unable to play it. I have used both the .mp4 and .3gp extensions.

     void  generate(JNIEnv *pEnv, jobject pObj,jobjectArray stringArray,int famerate,int width,int height,jstring videoFilename)
       {
           AVCodec *codec;
           AVCodecContext *c= NULL;
           //int framesnum=5;
           int i,looper, out_size, size, x, y,encodecbuffsize,j;
           __android_log_write(ANDROID_LOG_INFO, "record","************into generate************");
           int imagecount= (*pEnv)->GetArrayLength(pEnv, stringArray);
           __android_log_write(ANDROID_LOG_INFO, "record","************got magecount************");
           int retval=-10;

           FILE *f;
           AVFrame *picture,*encoded_avframe;
           uint8_t  *encodedbuffer;
           jbyte *raw_record;
           char logdatadata[100];




           int returnvalue = -1,numBytes =-1;
           const char *gVideoFileName = (char *)(*pEnv)->GetStringUTFChars(pEnv, videoFilename, NULL);
           __android_log_write(ANDROID_LOG_INFO, "record","************got video file name************");

           /* find the mpeg1 video encoder */
           codec = avcodec_find_encoder(CODEC_ID_H264);
           if (!codec) {
               __android_log_write(ANDROID_LOG_INFO, "record","codec not found");
               exit(1);
           }
           c= avcodec_alloc_context();
           /*c->bit_rate = 400000;

           c->width = width;
           c->height = height;

           c->time_base= (AVRational){1,famerate};
           c->gop_size = 12; // emit one intra frame every ten frames
           c->max_b_frames=0;
           c->pix_fmt = PIX_FMT_YUV420P;
           c->codec_type = AVMEDIA_TYPE_VIDEO;
           c->codec_id = CODEC_ID_H263;*/

            c->bit_rate = 400000;
               // resolution must be a multiple of two
               c->width = 176;
               c->height = 144;
                   c->pix_fmt = PIX_FMT_YUV420P;


               c->qcompress = 0.0;
               c->qblur = 0.0;
               c->gop_size = 20;  //or 1
               c->sub_id = 1;
               c->workaround_bugs = FF_BUG_AUTODETECT;

               //pFFmpeg->c->time_base = (AVRational){1,25};
               c->time_base.num = 1;
               c->time_base.den = famerate;
               c->max_b_frames = 0; //pas de B frame en H263

              // c->opaque = opaque;
               c->dct_algo = FF_DCT_AUTO;
               c->idct_algo = FF_IDCT_AUTO;
               //lc->rtp_mode = 0;
               c->rtp_payload_size = 1000;
               c->rtp_callback = 0; // ffmpeg_rtp_callback;


               c->flags |= CODEC_FLAG_QSCALE;
               c->mb_decision = FF_MB_DECISION_RD;
               c->thread_count = 1;
           #define DEFAULT_RATE    (16 * 8 * 1024)
               c->rc_min_rate = DEFAULT_RATE;
               c->rc_max_rate = DEFAULT_RATE;
               c->rc_buffer_size = DEFAULT_RATE * 64;
               c->bit_rate = DEFAULT_RATE;                    


           sprintf(logdatadata, "------width from c ---- = %d",width);
           __android_log_write(ANDROID_LOG_INFO, "record",logdatadata);
           sprintf(logdatadata, "------height from c ---- = %d",height);
           __android_log_write(ANDROID_LOG_INFO, "record",logdatadata);

           __android_log_write(ANDROID_LOG_INFO, "record","************Found codec and now opening it************");
           /* open it */
           retval = avcodec_open(c, codec);
            if ( retval < 0)
           {
               sprintf(logdatadata, "------avcodec_open ---- retval = %d",retval);
               __android_log_write(ANDROID_LOG_INFO, "record",logdatadata);
               __android_log_write(ANDROID_LOG_INFO, "record","could not open codec");
               exit(1);
           }
           __android_log_write(ANDROID_LOG_INFO, "record","statement 5");
           f = fopen(gVideoFileName, "ab");

           if (!f) {
               __android_log_write(ANDROID_LOG_INFO, "record","could not open video file");
               exit(1);
           }

           __android_log_write(ANDROID_LOG_INFO, "record", "***************Allocating encodedbuffer*********\n");
           encodecbuffsize = avpicture_get_size(PIX_FMT_RGB24, c->width, c->height);

           sprintf(logdatadata, "encodecbuffsize = %d",encodecbuffsize);
           __android_log_write(ANDROID_LOG_INFO, "record",logdatadata);
           encodedbuffer = malloc(encodecbuffsize);

           jclass cls = (*pEnv)->FindClass(pEnv, "com/canvasm/mediclinic/VideoGenerator");
           jmethodID mid = (*pEnv)->GetMethodID(pEnv, cls, "videoProgress", "(Ljava/lang/String;)Ljava/lang/String;");
           jmethodID mid_delete = (*pEnv)->GetMethodID(pEnv, cls, "deleteTempFile", "(Ljava/lang/String;)Ljava/lang/String;");

           if (mid == 0)
               return;

           __android_log_write(ANDROID_LOG_INFO, "native","got method id");


            for(i=0;i<=imagecount;i++) {

               jboolean isCp;
               int progress = 0;
               float temp;
               jstring string;
               if(i==imagecount)
                   string = (jstring) (*pEnv)->GetObjectArrayElement(pEnv, stringArray, imagecount-1);
               else
                   string = (jstring) (*pEnv)->GetObjectArrayElement(pEnv, stringArray, i);

                const char *rawString = (*pEnv)->GetStringUTFChars(pEnv, string, &isCp);

               __android_log_write(ANDROID_LOG_INFO, "record",rawString);
               picture = OpenImage(rawString,width,height);
               //WriteJPEG(c,picture,i);
               //   encode video
               memset(encodedbuffer,0,encodecbuffsize);

               //do{

                for(looper=0;looper<5;looper++)
               {
                   memset(encodedbuffer,0,encodecbuffsize);
                   out_size = avcodec_encode_video(c, encodedbuffer, encodecbuffsize, picture);
                   sprintf(logdatadata, "avcodec_encode_video ----- out_size = %d \n",out_size );
                   __android_log_write(ANDROID_LOG_INFO, "record",logdatadata);
                   if(out_size>0)
                       break;
               }
               __android_log_write(ANDROID_LOG_INFO, "record","*************Start looping for same image*******");
               returnvalue = fwrite(encodedbuffer, 1, out_size, f);
               sprintf(logdatadata, "fwrite ----- returnvalue = %d \n",returnvalue );
               __android_log_write(ANDROID_LOG_INFO, "record",logdatadata);

               __android_log_write(ANDROID_LOG_INFO, "record","*************End looping for same image*******");

               // publishing progress
               progress = ((i*100)/(imagecount+1))+15;//+1 is for last frame duplicated entry
                if(progress<20 )
                   progress =20;
               if(progress>=95 )
                   progress =95;

               sprintf(logdatadata, "%d",progress );
               jstring jstrBuf = (*pEnv)->NewStringUTF(pEnv, logdatadata);
               (*pEnv)->CallObjectMethod(pEnv, pObj, mid,jstrBuf);

               if(i>0)
                   (*pEnv)->CallObjectMethod(pEnv, pObj, mid_delete,string);

           }


           /* get the delayed frames */
           for(; out_size; i++) {
               fflush(stdout);
               out_size = avcodec_encode_video(c, encodedbuffer, encodecbuffsize, NULL);
               fwrite(encodedbuffer, 20, out_size, f);
           }

           /* add sequence end code to have a real mpeg file */
           encodedbuffer[0] = 0x00;
           encodedbuffer[1] = 0x00;
           encodedbuffer[2] = 0x01;
           encodedbuffer[3] = 0xb7;
           fwrite(encodedbuffer, 1, 4, f);
           fclose(f);
           free(encodedbuffer);
           avcodec_close(c);
           av_free(c);
           __android_log_write(ANDROID_LOG_INFO, "record","Video created ");

           // last updation of 100%
           sprintf(logdatadata, "%d",100 );
           jstring jstrBuf = (*pEnv)->NewStringUTF(pEnv, logdatadata);
           (*pEnv)->CallObjectMethod(pEnv, pObj, mid,jstrBuf);
       }



       AVFrame* OpenImage(const char* imageFileName,int w,int h)
       {
           AVFrame *pFrame;
           AVCodec *pCodec ;
           AVFormatContext *pFormatCtx;
           AVCodecContext *pCodecCtx;
           uint8_t *buffer;
           int frameFinished,framesNumber = 0,retval = -1,numBytes=0;
           AVPacket packet;
           char logdatadata[100];
           //__android_log_write(ANDROID_LOG_INFO, "OpenImage",imageFileName);
            if(av_open_input_file(&pFormatCtx, imageFileName, NULL, 0, NULL)!=0)
            //if(avformat_open_input(&pFormatCtx,imageFileName,NULL,NULL)!=0)
            {
                __android_log_write(ANDROID_LOG_INFO, "record",
                        "Can't open image file ");
               return NULL;
           }

           pCodecCtx = pFormatCtx->streams[0]->codec;
           pCodecCtx->width = w;
           pCodecCtx->height = h;
           pCodecCtx->pix_fmt = PIX_FMT_YUV420P;

           // Find the decoder for the video stream
           pCodec = avcodec_find_decoder(pCodecCtx->codec_id);
           if (!pCodec)
           {
               __android_log_write(ANDROID_LOG_INFO, "record",
                       "Can&#39;t open image file ");
               return NULL;
           }

           pFrame = avcodec_alloc_frame();

           numBytes = avpicture_get_size(PIX_FMT_YUV420P, pCodecCtx->width, pCodecCtx->height);
           buffer = (uint8_t *) av_malloc(numBytes * sizeof(uint8_t));
           sprintf(logdatadata, "numBytes  = %d",numBytes);
           __android_log_write(ANDROID_LOG_INFO, "record",logdatadata);

           retval = avpicture_fill((AVPicture *) pFrame, buffer, PIX_FMT_YUV420P, pCodecCtx->width, pCodecCtx->height);

           // Open codec
            if(avcodec_open(pCodecCtx, pCodec)<0)
           {
               __android_log_write(ANDROID_LOG_INFO, "record","Could not open codec");
               return NULL;
           }

           if (!pFrame)
           {
               __android_log_write(ANDROID_LOG_INFO, "record","Can&#39;t allocate memory for AVFrame\n");
               return NULL;
           }
           int readval = -5;
            while (readval = av_read_frame(pFormatCtx, &packet) >= 0)
           {
               if(packet.stream_index != 0)
                   continue;

                int ret = avcodec_decode_video2(pCodecCtx, pFrame, &frameFinished, &packet);
               sprintf(logdatadata, "avcodec_decode_video2 ret = %d",ret);
               __android_log_write(ANDROID_LOG_INFO, "record",logdatadata);

               if (ret > 0)
               {
                   __android_log_write(ANDROID_LOG_INFO, "record","Frame is decoded\n");
                   pFrame->quality = 4;
                    av_free_packet(&packet);
                   av_close_input_file(pFormatCtx);
                   return pFrame;
               }
               else
               {
                   __android_log_write(ANDROID_LOG_INFO, "record","error while decoding frame \n");
               }
           }
           sprintf(logdatadata, "readval = %d",readval);
           __android_log_write(ANDROID_LOG_INFO, "record",logdatadata);
       }

    The generate method takes a list of strings (paths to images) and converts them to a video, and the OpenImage method is responsible for converting a single image to an AVFrame.
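
    A hedged aside, not part of the question: as written, generate() appends the raw encoded frames straight to a file, so the result is an elementary stream rather than an MP4/3GP container. Assuming the encoder really produced H.264 (the code asks for CODEC_ID_H264 even though the surrounding text says H263), one quick way to test that theory is to wrap the raw output in a container with the ffmpeg command line; the file names below are placeholders.

        # Hypothetical check: remux the raw bitstream into an MP4 container without re-encoding.
        ffmpeg -f h264 -i raw_output.h264 -c copy wrapped.mp4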