
Other articles (36)

  • The statuses of mutualisation instances

    13 March 2010

    For reasons of general compatibility between the mutualisation management plugin and SPIP's original functions, instance statuses are the same as for any other object (articles, etc.); only their names in the interface differ slightly.
    The possible statuses are: prepa (requested), which corresponds to an instance requested by a user; if the site had already been created in the past, it is switched to disabled mode. publie (validated), which corresponds to an instance validated by a (...)

  • The MediaSPIP configuration area

    29 November 2010

    The MediaSPIP configuration area is reserved for administrators. An "administrer" menu link is usually displayed at the top of the page [1].
    It lets you configure your site in detail.
    Navigation within this configuration area is divided into three parts: the general site configuration, which notably lets you modify: the main information about the site (...)

  • APPENDIX: The plugins used specifically for the farm

    5 March 2010

    The central/master site of the farm needs several additional plugins, compared with the channel sites, in order to work properly: the Gestion de la mutualisation plugin; the inscription3 plugin, to handle registrations and requests to create a mutualisation instance as soon as users sign up; the verifier plugin, which provides a field-verification API (used by inscription3); the champs extras v2 plugin, required by inscription3 (...)

On other sites (4625)

  • Studying A Game Wave Disc

    23 November 2010, by Multimedia Mike — Game Hacking

    I picked up a used copy of a game called Gemz — a rather flagrant Bejeweled clone — for a game console called the Game Wave Family Entertainment System. Heard of it? Neither had I. But the game media is optical, so I had to get it and study it.



    When mounted in Linux (as UDF), the disc is reported to contain 2.8 GB of data, so it has to be a DVD. 810 MB of that is dedicated to the movies/ directory. Multimedia format? Just plain, boring MPEG files (very YouTube-friendly — here's the opening animation). Deeper digging reveals some more subdirectories called movies/ that, combined, occupy the lion's share of the disc space. Additionally, there are several single-frame .m2v files in a directory called iframes/ which are used to encode things like load screens.



    There are more interesting data files including .zbm files for images and fonts, and .zwf files for audio. I suspect that these stand for zipped bitmap and zipped wave file, respectively. They can’t be directly unzipped with ’gunzip’. Some of the numbers at the start of some files lead me to believe they can be easily decompressed with standard zlib facilities.
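
    A quick way to test that zlib hunch is a tiny C program that feeds one of the .zbm or .zwf files to zlib's inflate and reports whether it decodes cleanly. This is a hypothetical checker, not part of the original write-up; it assumes the compressed data starts at the very beginning of the file (if the leading numbers turn out to be a small size header, the read would need to start past it). Build with "cc zcheck.c -o zcheck -lz".

    /* Hypothetical checker for the "zipped bitmap / zipped wave" guess:
     * try to inflate a .zbm or .zwf file as a single raw zlib stream. */
    #include <stdio.h>
    #include <stdlib.h>
    #include <zlib.h>

    int main(int argc, char **argv)
    {
        if (argc != 2) {
            fprintf(stderr, "usage: %s file.zbm\n", argv[0]);
            return 1;
        }

        FILE *f = fopen(argv[1], "rb");
        if (!f) { perror("fopen"); return 1; }
        fseek(f, 0, SEEK_END);
        long insize = ftell(f);
        fseek(f, 0, SEEK_SET);

        unsigned char *in = malloc(insize);
        if (!in || fread(in, 1, insize, f) != (size_t)insize) {
            fprintf(stderr, "read error\n");
            return 1;
        }
        fclose(f);

        /* assumption: the zlib stream starts at offset 0 */
        z_stream zs = { 0 };
        if (inflateInit(&zs) != Z_OK) return 1;
        zs.next_in  = in;
        zs.avail_in = (uInt)insize;

        unsigned char out[64 * 1024];
        unsigned long total = 0;
        int ret;
        do {
            zs.next_out  = out;
            zs.avail_out = sizeof(out);
            ret = inflate(&zs, Z_NO_FLUSH);
            if (ret != Z_OK && ret != Z_STREAM_END) {
                fprintf(stderr, "not a plain zlib stream (inflate returned %d)\n", ret);
                inflateEnd(&zs);
                return 1;
            }
            total += sizeof(out) - zs.avail_out;  /* a real tool would write this out */
        } while (ret != Z_STREAM_END);

        printf("%s: %ld bytes -> %lu bytes decompressed\n", argv[1], insize, total);
        inflateEnd(&zs);
        free(in);
        return 0;
    }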

    Based on the binary files on the Gemz disc, I couldn’t find any data on what CPU this system might use. A little Googling led me to this page at the Video Game Console Library which pegs the brain as a Mediamatics 6811. Some searching for that leads me to a long-discontinued line of hardware from National Semiconductor.

    The Console Library page also mentions that the games were developed using the Lua programming language. Indeed, there are many Lua-related strings in the game’s binaries (’zlib’ also makes an appearance).

  • FFMPEG: Adjusting PTS and DTS of encoded packets with real timestamp information

    19 August 2020, by jackey balwani

    I am trying to record a video of desktop activity using muxing in the FFmpeg library.
I referred to the following link for my implementation.
https://ffmpeg.org/doxygen/trunk/muxing_8c_source.html

    I observe that the encoded video plays too fast. Below is the code that encodes a frame and then writes it to the output file.

    static int write_frame(AVFormatContext *fmt_ctx, AVCodecContext *c,
                           AVStream *st, AVFrame *frame)
    {
        int ret;

        // send the frame to the encoder
        ret = avcodec_send_frame(c, frame);
        if (ret < 0) {
            fprintf(stderr, "Error sending a frame to the encoder: %s\n", av_err2str(ret));
            exit(1);
        }

        while (ret >= 0) {
            AVPacket pkt = { 0 };

            ret = avcodec_receive_packet(c, &pkt);
            if (ret == AVERROR(EAGAIN) || ret == AVERROR_EOF)
                break;
            else if (ret < 0) {
                fprintf(stderr, "Error encoding a frame: %s\n", av_err2str(ret));
                exit(1);
            }

            /* rescale output packet timestamp values from codec to stream timebase (from 25 to 90000 in this case) */
            av_packet_rescale_ts(&pkt, c->time_base, st->time_base);
            pkt.stream_index = st->index;

            /* Write the compressed frame to the media file. */
            log_packet(fmt_ctx, &pkt);
            ret = av_interleaved_write_frame(fmt_ctx, &pkt);
            av_packet_unref(&pkt);
            if (ret < 0) {
                fprintf(stderr, "Error while writing output packet: %s\n", av_err2str(ret));
                exit(1);
            }
        }

        return ret == AVERROR_EOF ? 1 : 0;
    }

    -> I am rescaling the output packet timestamp values from the codec to the stream timebase (25 -> 90,000) before writing.
-> Since the encoded video plays very fast, I added a temporary fix: multiplying the stream timebase by some factor just before av_packet_rescale_ts (in my case 5, so 5*st->time_base.den becomes 450,000) so that the output video duration increases. This is not an accurate solution, as it does not reflect the real time shift between frames.

    /**  rescale output packet timestamp values from codec to stream timebase  **/
    st->time_base.den = 5*st->time_base.den;
    av_packet_rescale_ts(pkt, *time_base, st->time_base);
    pkt->stream_index = st->index;
    /**  Write the compressed frame to the media file.  **/
    return av_interleaved_write_frame(fmt_ctx, pkt);

    -> I want the output video duration to be similar to the input. I have heard of things like giving real timestamps to the PTS and DTS of frames and packets, and encoding at a variable or unknown frame rate.
Can we use these? If yes, how should they be implemented?
Any suggestions or pseudo code for the approach would really help.
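
    One way to get that real time shift, sketched here as an assumption rather than code from the question: record a wall-clock timestamp for every captured frame with av_gettime(), convert the elapsed microseconds into the encoder's time_base with av_rescale_q(), and store the result in frame->pts before avcodec_send_frame(). The existing av_packet_rescale_ts(&pkt, c->time_base, st->time_base) call then maps those ticks onto the 90 kHz stream clock without touching st->time_base. This assumes the encoder context is opened with a fine-grained time_base such as {1, 1000000}; with {1, 25} the timestamps would still snap to 40 ms steps.

    #include <libavcodec/avcodec.h>
    #include <libavutil/time.h>         /* av_gettime() */
    #include <libavutil/mathematics.h>  /* av_rescale_q() */

    /* Hypothetical helper: stamp a captured frame with the real elapsed
     * capture time instead of a fixed 1/frame_rate step. */
    static int64_t start_us = -1;       /* wall-clock time of the first frame */

    static void stamp_frame(AVCodecContext *c, AVFrame *frame)
    {
        int64_t now_us = av_gettime();  /* current time in microseconds */

        if (start_us < 0)
            start_us = now_us;

        /* elapsed microseconds -> ticks of the encoder's time_base */
        frame->pts = av_rescale_q(now_us - start_us,
                                  (AVRational){ 1, 1000000 },
                                  c->time_base);
    }

    Each grabbed frame would go through stamp_frame() just before write_frame(), and the 5*st->time_base.den workaround would be dropped.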

    Thanks

  • C flags which give priority to quality for decoded images in ffmpeg

    18 January 2016, by Tank2005

    I'm trying to decode an H.264 stream with ffmpeg (latest version) on the Android NDK.

    I succeeded in getting a decoded frame, but the acquired image is very dirty even when the low-latency flag is disabled.

    If I want to give priority to quality over decoding speed, which flags should I specify?

    bool initCodec(bool low_latency)
    {
       av_register_all();

       codec = avcodec_find_decoder(AV_CODEC_ID_H264);
       if(!codec) return false;

       context = avcodec_alloc_context3(codec);
       if(!context) return false;

       if(codec->capabilities & CODEC_CAP_TRUNCATED) context->flags |= CODEC_FLAG_TRUNCATED;
       if(low_latency == true) context->flags |= CODEC_FLAG_LOW_DELAY;

       frame = av_frame_alloc();

       int res = avcodec_open2(context, codec, NULL);
       if (res < 0) {
           qDebug() << "Couldn't open codec :" << res;
           return false;
       }

       av_init_packet(&avpkt);
       return true;
    }

    void sendBytes(unsigned char *buf, int buflen)
    {
       avpkt.size = buflen;
       avpkt.data = buf;

       int got_frame, len;
       while (avpkt.size > 0) {
           len = avcodec_decode_video2(context, frame, &got_frame, &avpkt);
           if (len < 0) {
               qDebug() << "Error while decoding : " << len;
               break;
           }
           if (got_frame) {
               onGotFrame(frame);
           }
           avpkt.size -= len;
           avpkt.data += len;
       }
    }
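
    Flags alone may not be the whole story here. One guess (an assumption, not something stated in the question) is that the buffers handed to sendBytes() are not split on frame boundaries, which tends to produce exactly this kind of smeared image. The sketch below runs the same loop through FFmpeg's H.264 parser so that the decoder only ever sees complete frames; it reuses the question's context, frame, avpkt and onGotFrame(), and with it the CODEC_FLAG_TRUNCATED path in initCodec() would no longer be needed.

    #include <libavcodec/avcodec.h>

    static AVCodecParserContext *parser = NULL;   /* create once, e.g. in initCodec() */

    void sendBytes(unsigned char *buf, int buflen)
    {
        if (!parser)
            parser = av_parser_init(AV_CODEC_ID_H264);

        while (buflen > 0) {
            uint8_t *out_data = NULL;
            int out_size = 0;

            /* let the parser accumulate input until one complete frame is available */
            int used = av_parser_parse2(parser, context,
                                        &out_data, &out_size,
                                        buf, buflen,
                                        AV_NOPTS_VALUE, AV_NOPTS_VALUE, 0);
            buf += used;
            buflen -= used;

            if (out_size > 0) {
                avpkt.data = out_data;
                avpkt.size = out_size;

                int got_frame = 0;
                int len = avcodec_decode_video2(context, frame, &got_frame, &avpkt);
                if (len < 0)
                    break;
                if (got_frame)
                    onGotFrame(frame);
            }
        }
    }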

    Decoded image sample

    Ex: I heard that such problems can be caused by how the library is compiled, so I am including my compile options here (I built it on openSUSE Linux).

    #!/bin/bash
    NDK=/home/ndk
    SYSROOT=$NDK/platforms/android-9/arch-arm/
    TOOLCHAIN=$NDK/toolchains/arm-linux-androideabi-4.9/prebuilt/linux-x86_64
    function build_one
    {
    ./configure \
    --prefix=$PREFIX \
    --enable-shared \
    --disable-static \
    --disable-avdevice \
    --disable-doc \
    --disable-symver \
    --disable-encoders \
    --disable-decoders \
    --enable-decoder=h264 \
    --enable-decoder=aac \
    --disable-protocols \
    --disable-demuxers \
    --disable-muxers \
    --disable-filters \
    --disable-network \
    --disable-parsers \
    --cross-prefix=$TOOLCHAIN/bin/arm-linux-androideabi- \
    --target-os=linux \
    --arch=arm \
    --enable-asm --enable-yasm \
    --enable-cross-compile \
    --sysroot=$SYSROOT \
    --extra-cflags="-Os -marm -march=armv7-a -mfloat-abi=softfp -mfpu=neon" \
    --extra-ldflags="-Wl,--fix-cortex-a8" \
    $ADDITIONAL_CONFIGURE_FLAG
    make clean
    make
    make install
    }
    CPU=armv7-a
    PREFIX=$(pwd)/android/$CPU
    build_one