Other articles (29)

  • Updating from version 0.1 to 0.2

    24 June 2013

    An explanation of the notable changes made in moving from MediaSPIP version 0.1 to version 0.2. What's new?
    Regarding software dependencies: the latest FFMpeg releases are used (>= v1.2.1); the dependencies for Smush are installed; MediaInfo and FFprobe are installed to retrieve metadata; ffmpeg2theora is no longer used; flvtool2 is no longer installed, in favour of flvtool++; ffmpeg-php, which is no longer maintained, is no longer installed (...)

  • MediaSPIP v0.2

    21 June 2013

    MediaSPIP 0.2 is the first stable release of MediaSPIP.
    Its official release date is 21 June 2013, and it is announced here.
    The zip file available here contains only the MediaSPIP sources in standalone form.
    As with the previous release, all of the software dependencies must be installed manually on the server.
    If you want to use this archive for a farm-mode installation, you will also need to make further modifications (...)

  • Making files available

    14 April 2011

    By default, when it is first set up, MediaSPIP does not let visitors download files, whether they are originals or the result of transformation or encoding. It only allows them to be viewed.
    However, it is possible, and easy, to give visitors access to these documents in various forms.
    All of this is handled in the skeleton's configuration page. Go to the channel's administration area and choose, in the navigation, (...)

On other sites (4116)

  • FFMPEG: directly decode packets after encoding

    18 April 2012, by user1282931

    Using the FFMPEG API, I am trying to encode an x264 video to an MP4 file with zero frame latency and also, in real time, show the currently encoded frame on screen (with encoding artifacts). The encoding to the file works, but so far I cannot get the frames decoded right after writing them to the file. What I try is to feed the packet data returned from avcodec_encode_video() straight into avcodec_decode_video2(), but that function returns -1 and the cmd output shows:

    [h264 @ 00000000025F0710] non-existing PPS 0 referenced
    [h264 @ 00000000025F0710] decode_slice_header error
    [h264 @ 00000000025F0710] no frame

    Here is the code I use for encoding:

    AVPacket FFMpegEncoder2::write_video_frame(AVFrame* pic, int &numBytes)
    {
       int out_size, ret;

       AVPacket pkt;


       /* encode the image */
       out_size = avcodec_encode_video(m_cctx, m_outbuf,
                                           m_outbufSize, pic);
       /* If size is zero, it means the image was buffered. */
       assert(out_size > 0); // 0 frame delay: the encoder must not buffer

       av_init_packet(&pkt);

       if (m_cctx->coded_frame->pts != AV_NOPTS_VALUE)
             pkt.pts = av_rescale_q(m_cctx->coded_frame->pts,m_cctx->time_base, m_video_st->time_base);
       if (m_cctx->coded_frame->key_frame)
             pkt.flags |= AV_PKT_FLAG_KEY;
       pkt.stream_index = m_video_st->index;
       pkt.data         = m_outbuf;
       pkt.size         = out_size;

       /* Write the compressed frame to the media file. */
       ret = av_interleaved_write_frame(m_fctx, &pkt);

       if (ret != 0) {
           fprintf(stderr, "Error while writing video frame\n");
           exit(1);
       }
       numBytes = out_size;
       return pkt;
    }

    I then take the returned packet and feed it into the decoder:

    const AVFrame* FFMpegDecoder2::decode(AVPacket* packet){
       AVPacket pkt;
       av_init_packet(&pkt);
       pkt.size = packet->size;
       pkt.data = packet->data;

       int len=0;
       int got_picture=0;


       while (pkt.size > 0) {
               len = avcodec_decode_video2(m_cctx, m_frame, &got_picture, &pkt);
               if (len < 0) {
                   fprintf(stderr, "Error while decoding frame %d\n", m_f);
                   exit(1);
               }
               if (got_picture) {
                   assert(pkt.size==len);
                   m_f++;
               }
               pkt.size -= len;
               pkt.data += len;
           }
       assert(got_picture);
       return m_frame;
    }

    But as stated, avcodec_decode_video2() returns -1.

    What am I doing wrong? Do I need to feed some header data into the decoder first somehow?

    Edit:

    If I set

    m_formatCtx->oformat->flags &= ~AVFMT_GLOBALHEADER;
    m_codecctx->flags &= ~CODEC_FLAG_GLOBAL_HEADER;

    then I can decode the returned packet without error, but the written MP4 file is black.

    Edit: this is how I set up the decoder:

    FFMpegDecoder2::FFMpegDecoder2(CodecID id)
       : m_codec(NULL)
       , m_cctx(NULL)
    {


       /* Initialize libavcodec, and register all codecs and formats. */
       avcodec_register_all();

       m_codec = avcodec_find_decoder(id);
       if (!m_codec) {
           fprintf(stderr, "codec not found\n");
           exit(1);
       }

       m_cctx = avcodec_alloc_context3(m_codec);
       m_cctx->codec = m_codec;
       m_cctx->pix_fmt = PIX_FMT_YUV420P;

       avcodec_open2(m_cctx, m_codec, NULL);

       //alloc frame
       m_frame = avcodec_alloc_frame();
    }

    This is what the memory window shows for the first packet (not copied in full; the first packet is 7859 bytes):

    0x0000000002E66670  00 00 01 06 05 ff ff 55 dc 45 e9 bd e6 d9 48 b7 96 2c d8 20 d9 23 ee ef 78 32 36 34 20 2d 20 63 6f 72 65 20 31 32 30 20 72 32 31 34 36 20 62  .....ÿÿUÜEé.æÙH·–,Ø Ù#îïx264 - core 120 r2146 b
    0x0000000002E6669F  63 64 34 31 64 62 20 2d 20 48 2e 32 36 34 2f 4d 50 45 47 2d 34 20 41 56 43 20 63 6f 64 65 63 20 2d 20 43 6f 70 79 6c 65 66 74 20 32 30 30 33  cd41db - H.264/MPEG-4 AVC codec - Copyleft 2003
    0x0000000002E666CE  2d 32 30 31 31 20 2d 20 68 74 74 70 3a 2f 2f 77 77 77 2e 76 69 64 65 6f 6c 61 6e 2e 6f 72 67 2f 78 32 36 34 2e 68 74 6d 6c 20 2d 20 6f 70 74  -2011 - http://www.videolan.org/x264.html - opt
    0x0000000002E666FD  69 6f 6e 73 3a 20 63 61 62 61 63 3d 30 20 72 65 66 3d 33 20 64 65 62 6c 6f 63 6b 3d 31 3a 30 3a 30 20 61 6e 61 6c 79 73 65 3d 30 78 33 3a 30  ions: cabac=0 ref=3 deblock=1:0:0 analyse=0x3:0
    0x0000000002E6672C  78 31 31 33 20 6d 65 3d 68 65 78 20 73 75 62 6d 65 3d 34 20 70 73 79 3d 31 20 70 73 79 5f 72 64 3d 31 2e 30 30 3a 30 2e 30 30 20 6d 69 78 65  x113 me=hex subme=4 psy=1 psy_rd=1.00:0.00 mixe
    0x0000000002E6675B  64 5f 72 65 66 3d 31 20 6d 65 5f 72 61 6e 67 65 3d 31 36 20 63 68 72 6f 6d 61 5f 6d 65 3d 31 20 74 72 65 6c 6c 69 73 3d 30 20 38 78 38 64 63  d_ref=1 me_range=16 chroma_me=1 trellis=0 8x8dc
    0x0000000002E6678A  74 3d 31 20 63 71 6d 3d 30 20 64 65 61 64 7a 6f 6e 65 3d 32 31 2c 31 31 20 66 61 73 74 5f 70 73 6b 69 70 3d 31 20 63 68 72 6f 6d 61 5f 71 70  t=1 cqm=0 deadzone=21,11 fast_pskip=1 chroma_qp
    0x0000000002E667B9  5f 6f 66 66 73 65 74 3d 30 20 74 68 72 65 61 64 73 3d 31 20 73 6c 69 63 65 64 5f 74 68 72 65 61 64 73 3d 30 20 6e 72 3d 30 20 64 65 63 69 6d  _offset=0 threads=1 sliced_threads=0 nr=0 decim
    0x0000000002E667E8  61 74 65 3d 31 20 69 6e 74 65 72 6c 61 63 65 64 3d 30 20 62 6c 75 72 61 79 5f 63 6f 6d 70 61 74 3d 30 20 63 6f 6e 73 74 72 61 69 6e 65 64 5f  ate=1 interlaced=0 bluray_compat=0 constrained_
    0x0000000002E66817  69 6e 74 72 61 3d 30 20 62 66 72 61 6d 65 73 3d 30 20 77 65 69 67 68 74 70 3d 32 20 6b 65 79 69 6e 74 3d 32 35 20 6b 65 79 69 6e 74 5f 6d 69  intra=0 bframes=0 weightp=2 keyint=25 keyint_mi
    0x0000000002E66846  6e 3d 32 20 73 63 65 6e 65 63 75 74 3d 34 30 20 69 6e 74 72 61 5f 72 65 66 72 65 73 68 3d 30 20 72 63 3d 61 62 72 20 6d 62 74 72 65 65 3d 30  n=2 scenecut=40 intra_refresh=0 rc=abr mbtree=0
    0x0000000002E66875  20 62 69 74 72 61 74 65 3d 34 30 30 20 72 61 74 65 74 6f 6c 3d 31 2e 30 20 71 63 6f 6d 70 3d 30 2e 36 30 20 71 70 6d 69 6e 3d 30 20 71 70 6d   bitrate=400 ratetol=1.0 qcomp=0.60 qpmin=0 qpm
    0x0000000002E668A4  61 78 3d 36 39 20 71 70 73 74 65 70 3d 34 20 69 70 5f 72 61 74 69 6f 3d 31 2e 34 30 20 61 71 3d 31 3a 31 2e 30 30 00 80 00 00 00 01 65 88 84  ax=69 qpstep=4 ip_ratio=1.40 aq=1:1.00.€....eˆ.
    0x0000000002E668D3  11 ef ff f8 22 0f 8a 00 02 09 7e 38 00 08 45 c7 00 01 1d c9 39 3d 87 ff e0 ac 13 03 6d 05 f1 00 10 00 10 12 88 04 00 04 02 60 70 4e 2d cc 38  .ïÿø".Š...~8..EÇ...É9=.ÿà¬..m.ñ.....ˆ....`pN-Ì8
    0x0000000002E66902  27 16 e6 07 21 1a e6 1c 84 6b 9f f0 f0 27 15 f2 7b 87 ff c1 58 2a 8a 00 04 b8 80 00 58 00 04 02 62 01 03 c1 c1 04 63 07 04 11 88 90 b1 89 0b  '.æ.!.æ..kŸðð'.ò{.ÿÁX*Š..¸€.X...b..ÁÁ.c...ˆ.±..
    0x0000000002E66931  1f 2c 11 02 b1 40 00 87 8f a4 f7 0f ff 82 b0 55 06 93 41 c4 10 51 00 00 40 14 00 04 00 a3 b7 35 b7 30 38 26 1e e6 1c 13 0f 73 f2 c1 10 2b 14  .,..±@...¤÷.ÿ.°U.“AÄ.Q..@....£·5·08&.æ...sòÁ.+.
    0x0000000002E66960  1f 1f 1c 32 7f 94 11 82 a1 40 01 f1 00 00 40 14 01 22 00 01 e0 1e 22 0a e3 83 1c 19 3d f8 7f e0 b0 16 03 01 22 0f 88 00 02 00 00 16 20 01 17  ...2.”..¡@.ñ..@.."..à.".ãƒ..=ø.à°...".ˆ..... ..
    0x0000000002E6698F  03 84 c2 5c 87 09 84 b9 06 4a e4 a4 ae 08 82 d8 e0 00 20 0f 1d 93 df c3 fe 0b 01 54 50 07 88 a8 80 00 64 09 88 58 88 58 83 84 1d 88 38 41 d8  ..Â\.....J䤮..Øà. ..“ßÃþ..TP.ˆ¨€.d.ˆXˆXƒ..ˆ8AØ
    0x0000000002E669BE  f2 c1 10 2b 14 00 08 f8 e0 00 62 38 64 ff 08 70 13 0a c1 d2 e9 b5 5d ba 10 80 09 a2 01 2e 07 04 c2 dc 87 04 c2 dc 81 c8 66 b9 0e 43 35 cb 0f  òÁ.+...øà.b8dÿ.p..ÁÒéµ]º.€.¢....ÂÜ..ÂÜ.Èf..C5Ë.
    0x0000000002E669ED  ff c1 10 27 2c 00 7e 8e 00 05 64 e4 f6 1f ff 82 28 a0 00 21 99 e3 80 00 99 ac 70 00 11 39 93 93 d8 7f fe 0a c1 40 34 9a 0b e3 40 00 84 40 01  ÿÁ.',.~Ž..däö.ÿ.( .!™ã€.™¬p..9““Ø.þ.Á@4š.ã@..@.
    0x0000000002E66A1C  00 01 02 88 fd cd 7d cc 0e 08 a4 dc c3 82 29 37 3f e0 88 14 8b f1 c3 1c 03 27 f0 c3 60 a0 50 62 86 da 36 1f 10 00 0a 80 00 80 14 40 00 20 00  ...ˆýÍ}Ì..¤ÜÃ.)7?àˆ..ñÃ..'ðÃ` Pb.Ú6....€.€.@. .

    And this is the encoder's output (up to just after encoding frame 0):

    [libx264 @ 00000000005ADAA0] using cpu capabilities: MMX2 SSE2Fast SSSE3 FastShu
    ffle SSE4.2
    [libx264 @ 00000000005ADAA0] profile High, level 3.0
    [libx264 @ 00000000005ADAA0] 264 - core 120 r2146 bcd41db - H.264/MPEG-4 AVC cod
    ec - Copyleft 2003-2011 - http://www.videolan.org/x264.html - options: cabac=0 r
    ef=3 deblock=1:0:0 analyse=0x3:0x113 me=hex subme=4 psy=1 psy_rd=1.00:0.00 mixed
    _ref=1 me_range=16 chroma_me=1 trellis=0 8x8dct=1 cqm=0 deadzone=21,11 fast_pski
    p=1 chroma_qp_offset=0 threads=1 sliced_threads=0 nr=0 decimate=1 interlaced=0 b
    luray_compat=0 constrained_intra=0 bframes=0 weightp=2 keyint=25 keyint_min=2 sc
    enecut=40 intra_refresh=0 rc=abr mbtree=0 bitrate=100 ratetol=1.0 qcomp=0.60 qpm
    in=0 qpmax=69 qpstep=4 ip_ratio=1.40 aq=1:1.00
    Output #0, mp4, to 'out2.mp4':
       Stream #0:0: Video: h264, yuv420p, 640x480, q=-1--1, 100 kb/s, 90k tbn, 25 t
    bc
    [mp4 @ 0000000000467570] Encoder did not produce proper pts, making some up.

  • Method For Crawling Google

    28 May 2011, by Multimedia Mike (Big Data)

    I wanted to crawl Google in order to harvest a large corpus of certain types of data as yielded by a certain search term (we’ll call it “term” for this exercise). Google doesn’t appear to offer any API to automatically harvest their search results (why would they?). So I sat down and thought about how to do it. This is the solution I came up with.



    FAQ
    Q: Is this legal / ethical / compliant with Google’s terms of service?
    A: Does it look like I care? Moving right along…

    Manual Crawling Process
    For this exercise, I essentially automated the task that would be performed by a human. It goes something like this:

    1. Search for “term”
    2. On the first page of results, download each of the 10 results returned
    3. Click on the next page of results
    4. Go to step 2, until Google doesn’t return any more pages of search results

    Google returns up to 1000 results for a given search term. Fetching them 10 at a time is less than efficient. Fortunately, the search URL can easily be tweaked to return up to 100 results per page.
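    As an illustration of the kind of tweak meant here, a minimal sketch in Python (the num and start query parameters are assumptions based on how Google's search URLs have historically worked; the post does not spell them out):

    from urllib.parse import urlencode

    def search_url(term, start=0, per_page=100, site=None):
        # Optionally restrict the search with the "site:" operator.
        query = term if site is None else '%s site:%s' % (term, site)
        params = {'q': query, 'num': per_page, 'start': start}
        return 'https://www.google.com/search?' + urlencode(params)

    # e.g. the second page of 100 results for "term" on Canadian domains:
    print(search_url('term', start=100, site='.ca'))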

    Expanding Reach
    Problem: 1000 results for the “term” search isn’t that many. I need a way to expand the search. I’m not aiming for relevancy; I’m just searching for random examples of some data that occurs around the internet.

    My solution for this is to refine the search using the “site” wildcard. For example, you can ask Google to search for “term” at all Canadian domains using “site:.ca”. So, the manual process now involves harvesting up to 1000 results for every single internet top level domain (TLD). But many TLDs can be more granular than that. For example, there are 50 sub-domains under .us, one for each state (e.g., .ca.us, .ny.us). Those all need to be searched independently. Same for all the sub-domains under TLDs which don’t allow domains under the main TLD, such as .uk (search under .co.uk, .ac.uk, etc.).

    Another extension is to combine “term” searches with other terms that are likely to have a rich correlation with “term”. For example, if “term” is relevant to various scientific fields, search for “term” in conjunction with various scientific disciplines.

    Algorithmically
    My solution is to create an SQLite database that contains a table of search seeds. Each seed is essentially a “site:” string combined with a starting index.

    Each TLD and sub-TLD is inserted as a searchseed record with a starting index of 0.
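    A minimal sketch of what such a seed table could look like (the table and column names here are my own guesses, not taken from the post):

    import sqlite3

    conn = sqlite3.connect('crawl.db')
    conn.execute('''CREATE TABLE IF NOT EXISTS searchseed (
                        id      INTEGER PRIMARY KEY,
                        site    TEXT,                -- value for the "site:" operator, e.g. ".ca"
                        start   INTEGER,             -- result offset to request next
                        crawled INTEGER DEFAULT 0)''')

    # Seed every TLD / sub-TLD with a starting index of 0 (illustrative subset only).
    tlds = ['.ca', '.us', '.ca.us', '.ny.us', '.co.uk', '.ac.uk']
    conn.executemany('INSERT INTO searchseed (site, start) VALUES (?, 0)',
                     [(t,) for t in tlds])
    conn.commit()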

    A script performs the following crawling algorithm (sketched in code after the list):

    • Fetch the next record from the searchseed table which has not been crawled
    • Fetch search result page from Google
    • Scrape URLs from page and insert each into URL table
    • Mark the searchseed record as having been crawled
    • If the results page indicates there are more results for this search, insert a new searchseed for the same seed but with a starting index 100 higher
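    Put together, that loop might look roughly like the following sketch; fetch_results_page and extract_result_urls are placeholders for the actual HTTP fetch and scraping, and a url(address, status) table is assumed for the harvested links:

    import sqlite3

    def crawl(conn, fetch_results_page, extract_result_urls):
        while True:
            row = conn.execute('SELECT id, site, start FROM searchseed '
                               'WHERE crawled = 0 LIMIT 1').fetchone()
            if row is None:
                break                                   # nothing left to crawl
            seed_id, site, start = row

            html = fetch_results_page(site, start)      # one page of search results
            urls, has_more = extract_result_urls(html)  # scrape the result links

            conn.executemany("INSERT INTO url (address, status) VALUES (?, 'new')",
                             [(u,) for u in urls])
            conn.execute('UPDATE searchseed SET crawled = 1 WHERE id = ?', (seed_id,))
            if has_more:
                # Queue the next page of this search, 100 results further on.
                conn.execute('INSERT INTO searchseed (site, start) VALUES (?, ?)',
                             (site, start + 100))
            conn.commit()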

    Digging Into Sites
    Sometimes, Google notes that certain sites are particularly rich sources of “term” and offers to let you search that site for “term”. This basically links to another search for “term site:somesite”. That site gets its own search seed and the program might harvest up to 1000 URLs from that site alone.

    Harvesting the Data
    Armed with a database of URLs, employ the following algorithm:

    • Fetch a random URL from the database which has yet to be downloaded
    • Try to download it
    • For goodness sake, have a mechanism in place to detect whether the download process has stalled and automatically kill it after a certain period of time
    • Store the data and update the database, noting where the information was stored and that it is already downloaded

    This step is easy to parallelize by simply executing multiple copies of the script. It is useful to update the URL table to indicate that one process is already trying to download a URL so multiple processes don’t duplicate work.
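    A rough sketch of one such download worker, assuming a url table with id, address, status and path columns (my naming); the urlopen timeout stands in for whatever stall-detection mechanism is actually used:

    import random
    import sqlite3
    import urllib.request

    def harvest_one(conn, timeout=60):
        rows = conn.execute("SELECT id, address FROM url WHERE status = 'new'").fetchall()
        if not rows:
            return False
        url_id, address = random.choice(rows)   # randomize to spread load across sites

        # Mark the row as in progress first so parallel workers skip it.
        conn.execute("UPDATE url SET status = 'downloading' WHERE id = ?", (url_id,))
        conn.commit()

        path = 'download_%d.bin' % url_id
        try:
            with urllib.request.urlopen(address, timeout=timeout) as resp, \
                 open(path, 'wb') as out:
                out.write(resp.read())
            conn.execute("UPDATE url SET status = 'done', path = ? WHERE id = ?",
                         (path, url_id))
        except Exception:
            conn.execute("UPDATE url SET status = 'failed' WHERE id = ?", (url_id,))
        conn.commit()
        return True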

    Acting Human
    A few factors here:

    • Google allegedly doesn’t like automated programs crawling its search results. Thus, at the very least, don’t let your script advertise itself as an automated program. At a basic level, this means forging the User-Agent: HTTP header. By default, Python’s urllib2 will identify itself as a programming language. Change this to a well-known browser string (see the sketch after this list).
    • Be patient; don’t fire off these search requests as quickly as possible. My crawling algorithm inserts a random delay of a few seconds in between each request. This can still yield hundreds of useful URLs per minute.
    • On harvesting the data: Even though you can parallelize this and download data as quickly as your connection can handle, it’s a good idea to randomize the URLs. If you hypothetically had 4 download processes running at once and they got to a point in the URL table which had many URLs from a single site, the server might be configured to reject too many simultaneous requests from a single client.
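    For the first two points, something along these lines (a sketch; the post mentions Python’s urllib2, of which urllib.request is the Python 3 descendant, and the User-Agent value is just an example browser string):

    import random
    import time
    import urllib.request

    # Identify as an ordinary browser instead of Python's default User-Agent.
    BROWSER_UA = 'Mozilla/5.0 (Windows NT 6.1; rv:12.0) Gecko/20100101 Firefox/12.0'

    def polite_fetch(url):
        # Random pause of a few seconds between requests, as suggested above.
        time.sleep(random.uniform(2, 6))
        req = urllib.request.Request(url, headers={'User-Agent': BROWSER_UA})
        with urllib.request.urlopen(req) as resp:
            return resp.read()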

    Conclusion
    Anyway, that’s just the way I would (and did) do it. What did I do with all the data? That’s a subject for a different post.


  • c++, FFMPEG, how to use/set private options?

    15 April 2012, by Mat

    I'm trying to get a C++ video encoder to run. I'm coding in Visual Studio 2010 and use a precompiled version of the library (Zeranoe's FFmpeg Build) from April 2012. Now the API no longer seems to correspond to most references that I find online.

    Particularly, I wonder about private options. Through Google I find things like this:

    http://ffmpeg.org/pipermail/ffmpeg-cvslog/2011-August/039836.html

    But I don't understand how to access and set those private options that replace the deprecated global ones.

    Any help on this?