
Media (91)

Other articles (5)

  • Creating farms of unique websites

    13 April 2011, by

    MediaSPIP platforms can be installed as a farm, with a single "core" hosted on a dedicated server and used by multiple websites.
    This allows (among other things): implementation costs to be shared between several different projects / individuals, rapid deployment of multiple unique sites, and the creation of groups of like-minded sites, making it possible to browse media in a more controlled and selective environment than the major "open" (...)

  • Automated installation script of MediaSPIP

    25 April 2011, by

    To overcome the difficulties caused mainly by the installation of server-side software dependencies, an "all-in-one" installation script written in bash was created to facilitate this step on a server running a compatible Linux distribution.
    To use it, you must have SSH access to your server and a root account, which will be used to install the dependencies. Contact your provider if you do not have these.
    The documentation for using this installation script is available here.
    The code of this (...)

  • Other interesting software

    13 April 2011, by

    We don't claim to be the only ones doing what we do, and certainly not the best; we simply try to do it well and to keep getting better.
    The following list presents software that is more or less similar to MediaSPIP, or that MediaSPIP more or less tries to resemble.
    We don't know these projects and haven't tried them, but you can take a peek.
    Videopress
    Website: http://videopress.com/
    License: GNU/GPL v2
    Source code: (...)

On other sites (2427)

  • Android: mp4 file plays when downloaded but when choosing "Video" player gets "Cannot play video"

    14 January 2014, by gview

    I've converted the video to an mp4 with ffmpeg using the h264 codec and AAC, and used the baseline profile.

    Videos are 540x360 at 250 kbps

    I then ran qt-faststart on the file to move the atoms into the right order.

    I've stuck the file up on a wiki we use and created a link to it.

    My test phone is a Samsung Galaxy S3.

    When I browse to the page that has links to the mp4's on it and click on them, I get a popup window with 2 options: Internet and Video.

    If I download the videos using the "Internet" option, I can play them on the phone without issue.

    I've done other encodings with the main profile as well, and these also play fine. I thought a powerful phone like the S3 would be able to handle the more advanced compression schemes available in H.264; however, I've also browsed the Android docs regarding supported video formats, and they seem to state that only the "baseline" profile is supported.

    Regardless, what doesn't work is trying to use the "Video" option which I assume tries to stream the video.

    For the wiki in question, clicking on the link reveals that the content-type and content-length headers are being set:

    Content-Length  6175996
    Content-Type    video/mp4;charset=UTF-8

    Clicking on the link with a browser invokes a player (QuickTime in most cases) that can play the mp4's.

    Is there more to making the file HTTP-streamable than simply making a link to it? Why won't my Android 4 play these files?

    UPDATE:
    I decided to make a quick HTML5 page using the video tag, and the videos do play on both my Galaxy S3 and the latest iOS.

  • Why can't my smartphone play a video?

    6 July 2015, by John Smith

    I create a video from frames, with no sound, with ffmpeg:

    ffmpeg -f image2 -r 1 -i "img%d.png" -vcodec libx264 -pix_fmt yuv420p -movflags faststart x.mp4

    On the desktop it plays fine, but on my smartphone Firefox says:

    no video with supported format and MIME type found.

    The source images are 1024x768. What could I do? The HTML5 is:

    <video controls="controls" autoplay="autoplay">
    <source type="video/mp4" src="/x.mp4"></source>
    </video>

  • flv created using ffmpeg library plays too fast

    25 April 2015, by Muhammad Ali

    I am muxing an H264 Annex-B stream and an ADTS AAC stream coming from an IP camera into an FLV. I have gone through all the necessary things (that I knew of), e.g. stripping the ADTS header from the AAC and converting the H264 Annex-B stream to length-prefixed AVCC.
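
    As a rough illustration of that Annex-B to AVCC step (a minimal sketch, not the poster's code; the function name and buffer handling are assumptions), the conversion essentially replaces the 4-byte start code in front of each NAL unit with a 4-byte big-endian length field:

    #include <stdint.h>
    #include <string.h>

    /* Sketch: turn one Annex-B NAL unit (prefixed by the start code 00 00 00 01)
     * into the length-prefixed form FLV/MP4 expect.
     * 'nal' points at the start code; 'nal_len' includes those 4 bytes. */
    static size_t annexb_nal_to_avcc(const uint8_t *nal, size_t nal_len, uint8_t *out)
    {
        uint32_t payload_len = (uint32_t)(nal_len - 4);   /* bytes after the start code */

        out[0] = (uint8_t)(payload_len >> 24);            /* big-endian length prefix */
        out[1] = (uint8_t)(payload_len >> 16);
        out[2] = (uint8_t)(payload_len >> 8);
        out[3] = (uint8_t)(payload_len);

        memcpy(out + 4, nal + 4, payload_len);            /* copy the NAL payload */
        return payload_len + 4;                           /* total AVCC size */
    }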

    I am able to create the FLV file and it plays, but it plays too fast. The params of my output format video codec are:

     Time base = 1/60000   <-- I don't know why
     Bitrate = 591949 (591 kbps)
     GOP size = 12
     FPS = 30 fps (that's the rate the encoder sends me data at)

    Params for output format audio codec are:

    Timebase = 1/44100
    Bitrate = 45382 (45Kbps)
    Sample rate = 48000

    I am using NO_PTS for both audio and video.
    The resulting video has double the bit rate (2x (audio bitrate + video bitrate)) and half the duration.
    If I play the resulting file in ffplay, the video plays back fast and ends quickly, but the audio plays at its original speed, so even after the video has ended the audio keeps playing to its full duration.

    If I set pts and dts to an increasing index (separate indices for audio and video), the video plays super fast, the bit rate shoots to an insane value and the video duration gets very short, but the audio plays fine and on time.
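
    For comparison, here is a minimal sketch of explicit timestamping instead of AV_NOPTS_VALUE, assuming the steady 30 fps video and 1024-sample AAC frames at 48 kHz described above (the helper names are hypothetical, not the poster's code):

    #include <libavformat/avformat.h>
    #include <libavutil/mathematics.h>

    /* Sketch: express pts/dts in a natural source time base (1/30 for video,
     * 1/48000 for audio) and rescale them into the output stream's time base
     * before av_interleaved_write_frame(), so the muxer knows the real rate. */
    static int64_t video_frame_index = 0;   /* video frames written so far */
    static int64_t audio_frame_index = 0;   /* AAC frames written so far   */

    static void stamp_video_packet(AVPacket *pkt, AVStream *out_stream)
    {
        AVRational src_tb = (AVRational){1, 30};              /* 30 fps assumption */
        pkt->pts = pkt->dts =
            av_rescale_q(video_frame_index++, src_tb, out_stream->time_base);
        pkt->duration = av_rescale_q(1, src_tb, out_stream->time_base);
    }

    static void stamp_audio_packet(AVPacket *pkt, AVStream *out_stream)
    {
        AVRational src_tb = (AVRational){1, 48000};           /* sample-rate time base */
        pkt->pts = pkt->dts =
            av_rescale_q(audio_frame_index++ * 1024,          /* 1024 samples per frame */
                         src_tb, out_stream->time_base);
        pkt->duration = av_rescale_q(1024, src_tb, out_stream->time_base);
    }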

    EDIT:

    Duration: 00:00:09.96, start: 0.000000, bitrate: 1230 kb/s
    Stream #0:0: Video: h264 (Main), yuvj420p(pc, bt709), 1280x720 [SAR 1:1 DAR 16:9], 591 kb/s, 30.33 fps, 59.94 tbr, 1k tbn, 59.94 tbc
    Stream #0:1: Audio: aac, 48000 Hz, mono, fltp, 45 kb/s

    Why is tbr 59.94? How was that calculated? Maybe that is the problem?

    Code for muxing:

    if(packet.header.dataType == TRANSFER_PACKET_TYPE_H264)
             {

               if((packet.data[0] == 0x00) && (packet.data[1] == 0x00) && (packet.data[2]==0x00) && (packet.data[3]==0x01))
               {

                 unsigned char tempCurrFrameLength[4];
                 unsigned int nal_unit_length;
                 unsigned char nal_unit_type;
                 unsigned int cursor = 0;
                 int size = packet.header.dataLen;

                  do {

                       av_init_packet(&pkt);
                       int currFrameLength = 0;
                       if((packet.header.frameType == TRANSFER_FRAME_IDR_VIDEO) || (packet.header.frameType == TRANSFER_FRAME_I_VIDEO))
                       {
                         //pkt.flags        |= AV_PKT_FLAG_KEY;
                       }
                       pkt.stream_index  = packet.header.streamId;//0;//ost->st->index;     //stream index 0 for vid : 1 for aud
                       outStreamIndex = outputVideoStreamIndex;
                       /*vDuration += (packet.header.dataPTS - lastvPts);
                       lastvPts = packet.header.dataPTS;
                       pkt.pts = pkt.dts= packet.header.dataPTS;*/
                       pkt.pts = pkt.dts = AV_NOPTS_VALUE;

                       if(framebuff != NULL)
                       {
                         //printf("Framebuff has mem alloc : freeing 1\n\n");
                         free(framebuff);
                         framebuff = NULL;
                         //printf("free successfully \n\n");
                       }


                       nal_unit_length = GetOneNalUnit(&nal_unit_type, packet.data + cursor/*pData+cursor*/, size-cursor);
                       if(nal_unit_length > 0 && nal_unit_type > 0)
                       {

                       }
                       else
                       {
                         printf("Fatal error : nal unit lenth wrong \n\n");
                         exit(0);
                       }
                       write_header_done = 1;
                       //#define _USE_SPS_PPS    //comment this line to write everything on to the stream. SPS+PPSframeframe
                       #ifdef _USE_SPS_PPS
                       if (nal_unit_type == 0x07 /*NAL_SPS*/)
                       { // write sps


                         printf("Got SPS \n");
                         if (_sps == NULL)
                         {
                           _sps_size = nal_unit_length -4;
                           _sps = new U8[_sps_size];
                           memcpy(_sps, packet.data+cursor+4, _sps_size); //exclude start code 0x00000001
                         }

                       }
                       else if (nal_unit_type == 0x08/*NAL_PPS*/)
                       { // write pps

                         printf("Got PPS \n");
                         if (_pps == NULL)
                         {
                           _pps_size = nal_unit_length -4;
                           _pps = new U8[_pps_size];
                           memcpy(_pps, packet.data+cursor+4, _pps_size); //exclude start code 0x00000001

                           //out_stream->codec->extradata
                           //ofmt_ctx->streams[outputVideoStreamIndex]->codec->extradata
                           free(ofmt_ctx->streams[outputVideoStreamIndex]->codec->extradata);
                           ofmt_ctx->streams[outputVideoStreamIndex]->codec->extradata = (uint8_t*)av_mallocz(_sps_size + _pps_size);

                           memcpy(ofmt_ctx->streams[outputVideoStreamIndex]->codec->extradata,_sps,_sps_size);
                           memcpy(ofmt_ctx->streams[outputVideoStreamIndex]->codec->extradata + _sps_size,_pps,_pps_size);
                           ret = avformat_write_header(ofmt_ctx, NULL);
                           if (ret < 0) {
                                 //fprintf(stderr, "Error occurred when opening output file\n");
                                 printf("Error occured when opening output \n");
                                 exit(0);
                            }
                            write_header_done = 1;
                           printf("Done writing header \n");

                         }

                       }
                       //else
                       #endif  /*end _USE_SPS_PPS */
                       { //IDR Frame

                           videoPts++;
                           if( (nal_unit_type == 0x06) || (nal_unit_type == 0x09) || (nal_unit_type == 0x07) || (nal_unit_type == 0x08))
                           {
                             av_free_packet(&pkt);
                             cursor += nal_unit_length;
                             continue;
                           }

                         if( (nal_unit_type == 0x05) || (nal_unit_type == 0x05))
                         {
                           //videoPts++;
                         }
                         if ((nal_unit_type != 0x07) && (nal_unit_type != 0x08))
                         {

                           vDuration += (packet.header.dataPTS - lastvPts);
                           lastvPts = packet.header.dataPTS;
                           //pkt.pts = pkt.dts= packet.header.dataPTS;
                           pkt.pts = pkt.dts= AV_NOPTS_VALUE;//videoPts;
                         }
                         else
                         {
                           //probably sps pps ... no need to transmit. free the packet
                           //av_free_packet(&amp;pkt);
                           pkt.pts = pkt.dts = AV_NOPTS_VALUE;
                         }


                         currFrameLength  = nal_unit_length - 4;//packet.header.dataLen -4;
                         tempCurrFrameLength[3] = currFrameLength;
                         tempCurrFrameLength[2] = currFrameLength>>8;
                         tempCurrFrameLength[1] = currFrameLength>>16;
                         tempCurrFrameLength[0] = currFrameLength>>24;



                         if(nal_unit_type == 0x05)
                         {  
                           pkt.flags        |= AV_PKT_FLAG_KEY;
                         }


                         framebuff = (unsigned char *)malloc(sizeof(unsigned char)* /*packet.header.dataLen*/nal_unit_length );
                         if(framebuff == NULL)
                         {
                           printf("Failed to allocate memory for frame \n\n ");
                           exit(0);
                         }


                         memcpy(framebuff, tempCurrFrameLength,0x04);
                         //memcpy(&framebuff[4], &packet.data[4] , currFrameLength);
                         //put_buffer(pData + cursor + 4, nal_unit_length - 4);// save ES data
                         memcpy(framebuff+4,packet.data + cursor + 4, currFrameLength );
                         pkt.data = framebuff;
                         pkt.size = nal_unit_length;//packet.header.dataLen ;

                         //printf("\nPrinting Frame| Size: %d | NALU Length: %d | NALU: %02x \n",pkt.size,nal_unit_length ,nal_unit_type);

                         /* GET READY TO TRANSMIT THE packet */
                         //pkt.duration = vDuration;
                         in_stream  = ifmt_ctx->streams[pkt.stream_index];
                         out_stream = ofmt_ctx->streams[outStreamIndex];
                         cn = out_stream->codec;
                         //av_packet_rescale_ts(&pkt, cn->time_base, out_stream->time_base);
                         //pkt.pts = av_rescale_q_rnd(pkt.pts, in_stream->time_base, out_stream->time_base, AV_ROUND_NEAR_INF|AV_ROUND_PASS_MINMAX);
                         //pkt.dts = av_rescale_q_rnd(pkt.dts, in_stream->time_base, out_stream->time_base, AV_ROUND_NEAR_INF|AV_ROUND_PASS_MINMAX);
                         //pkt.duration = av_rescale_q(pkt.duration, in_stream->time_base, out_stream->time_base);
                         pkt.pos = -1;
                         pkt.stream_index = outStreamIndex;

                          if (!write_header_done)
                          {
                           }
                           else
                           {
                         //doxygen suggests I use av_write_frame if I am taking care of interleaving
                         ret = av_interleaved_write_frame(ofmt_ctx, &pkt);
                         //ret = av_write_frame(ofmt_ctx, &pkt);
                         if (ret < 0)
                          {
                              fprintf(stderr, "Error muxing Video packet\n");
                              continue;
                          }
                          }

                         /*for(int ii = 0; ii < pkt.size ; ii++)
                           printf("%02x ",framebuff[ii]);*/

                          av_free_packet(&pkt);
                          if(framebuff != NULL)
                           {
                             //printf("Framebuff has mem alloc : freeing 2\n\n");
                             free(framebuff);
                             framebuff = NULL;
                             //printf("Freeing successfully \n\n");
                           }

                         /* TRANSMIT DONE */

                       }
                     cursor += nal_unit_length;
                     }while(cursor < size);


               }
               else
               {
                 printf("This is not annex B bitstream \n\n");
                 for(int ii = 0; ii < packet.header.dataLen ; ii++)
                   printf("%02x ",packet.data[ii]);
                 printf("\n\n");
                 exit(0);
               }

               //video frame has been parsed completely.
               continue;

             }
           else if(packet.header.dataType == TRANSFER_PACKET_TYPE_AAC)
             {

               av_init_packet(&pkt);


               pkt.flags = 1;          
               pkt.pts = audioPts*1024;
               pkt.dts = audioPts*1024;
               //pkt.duration = 1024;    

               pkt.stream_index  = packet.header.streamId + 1;//1;//ost->st->index;     //stream index 0 for vid : 1 for aud
               outStreamIndex = outputAudioStreamIndex;
               //aDuration += (packet.header.dataPTS - lastaPts);
               //lastaPts = packet.header.dataPTS;

               //NOTE: audio sync requires this value
               pkt.pts = pkt.dts= AV_NOPTS_VALUE ;
               //pkt.pts = pkt.dts=audioPts++;
               pkt.data = (uint8_t *)packet.data;//raw_data;
               pkt.size          = packet.header.dataLen;
             }


           //packet.header.streamId
           //now assigning pkt.data in respective if statements above
           //pkt.data          = (uint8_t *)packet.data;//raw_data;
           //pkt.size          = packet.header.dataLen;


           //pkt.duration = 24000;      //24000 assumed based on observation

           //duration calculation
           /*if(packet.header.dataType == TRANSFER_PACKET_TYPE_H264)
             {
               pkt.duration = vDuration;

             }
           else*/ if(packet.header.dataType == TRANSFER_PACKET_TYPE_AAC)
             {
               //pkt.duration = aDuration;
             }

           in_stream  = ifmt_ctx->streams[pkt.stream_index];
           out_stream = ofmt_ctx->streams[outStreamIndex];

           cn = out_stream->codec;

           if(packet.header.dataType == TRANSFER_PACKET_TYPE_AAC)
            ret = av_bitstream_filter_filter(aacbsfc, in_stream->codec, NULL, &pkt.data, &pkt.size, packet.data/*pkt.data*/, packet.header.dataLen, pkt.flags & AV_PKT_FLAG_KEY);

             if(ret < 0)
             {
               printf("Failed to execute aac bitstream filter \n\n");
               exit(0);
             }

           //if(packet.header.dataType == TRANSFER_PACKET_TYPE_H264)
           // av_bitstream_filter_filter(h264bsfc, in_stream->codec, NULL, &pkt.data, &pkt.size, packet.data/*pkt.data*/, pkt.size, 0);

             pkt.flags = 1;              


              //NOTE: commenting out the lines below synced the audio and video streams

             //av_packet_rescale_ts(&pkt, cn->time_base, out_stream->time_base);

             //pkt.pts = av_rescale_q_rnd(pkt.pts, in_stream->time_base, out_stream->time_base, AV_ROUND_NEAR_INF|AV_ROUND_PASS_MINMAX);

             //pkt.dts = av_rescale_q_rnd(pkt.dts, in_stream->time_base, out_stream->time_base, AV_ROUND_NEAR_INF|AV_ROUND_PASS_MINMAX);


             //pkt.duration = av_rescale_q(pkt.duration, in_stream->time_base, out_stream->time_base);



           //enabled on Tuesday
           pkt.pos = -1;

           pkt.stream_index = outStreamIndex;

           //doxygen suggests I use av_write_frame if I am taking care of interleaving
           ret = av_interleaved_write_frame(ofmt_ctx, &pkt);
           //ret = av_write_frame(ofmt_ctx, &pkt);
           if (ret < 0)
            {
                fprintf(stderr, "Error muxing packet\n");
                continue;
            }

          av_free_packet(&pkt);
          if(framebuff != NULL)
           {
             //printf("Framebuff has mem alloc : freeing 2\n\n");
             free(framebuff);
             framebuff = NULL;
             //printf("Freeing successfully \n\n");
           }
         }