Media (1)

Keyword: - Tags - / publicité

Other articles (78)

  • Customize by adding your logo, banner or background image

    5 September 2013

    Some themes support three customization elements: adding a logo; adding a banner; adding a background image.

  • Writing a news item

    21 June 2013

    Present the changes on your MédiaSPIP, or the news about your projects, through the news section.
    In MédiaSPIP's default theme, spipeo, news items are displayed at the bottom of the main page, below the editorials.
    You can customize the form used to create a news item.
    News item creation form: for a document of the news type, the fields offered by default are: publication date (customize the publication date) (...)

  • Publishing on MédiaSpip

    13 June 2013

    Can I post content from an iPad tablet?
    Yes, if your installed MédiaSpip is at version 0.2 or higher. If needed, contact the administrator of your MédiaSpip to find out.

On other sites (7311)

  • Set presentation timestamps on sample buffers before giving them to AVSampleBufferDisplayLayer

    6 October 2016, by Moustik

    I am trying to decode and render an H264 video network stream using AVSampleBufferDisplayLayer on iOS 10.

    I am getting the frame packets using ffmpeg, converting the NALUs to AVCC format, and creating sample buffers. Finally I pass the buffers to the AVSampleBufferDisplayLayer for rendering. The stream displays well when kCMSampleAttachmentKey_DisplayImmediately is set to kCFBooleanTrue.
    However, when I try to use a controlTimebase to define the presentation timestamps, the display gets stuck.

    Any idea or help with the handling of presentation timestamps?

    AVPacket packet;
    av_read_frame(_formatCtx, &packet);

    // ...
    // Parse NALUs and create blockbuffer
    // ...

    AVStream *st = _formatCtx->streams[_videoStream];

    CMSampleTimingInfo* timing;
    timing = malloc(sizeof(CMSampleTimingInfo));
    timing->presentationTimeStamp = CMTimeMake(packet.pts, st->time_base.den);
    timing->duration = CMTimeMake(packet.duration, st->time_base.den);
    timing->decodeTimeStamp = CMTimeMake(packet.dts, st->time_base.den);

    const size_t sampleSize = blockLength;
    _status = CMSampleBufferCreate(kCFAllocatorDefault,
                                  blockBuffer, true, NULL, NULL,
                                  _formatDescriptionRef, 1, 1, timing, 1,
                                  &sampleSize, &sampleBuffer);

    [self.renderer enqueueSampleBuffer:sampleBuffer];
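
    A minimal sketch of the timebase wiring usually needed alongside the code above, assuming self.renderer is the AVSampleBufferDisplayLayer and firstPts is the first packet.pts of the stream (firstPts is an illustrative name, not from the original post). The layer only presents timed buffers once its controlTimebase has been given a start time and a non-zero rate, and the CMTime values must match the stream clock, i.e. value = pts * time_base.num with timescale = time_base.den (the snippet above divides by time_base.den alone, which only matches when time_base.num is 1):

    // Hedged sketch: create a control timebase, start it at the stream's first
    // PTS, and let it run at normal speed so the enqueued buffers are presented
    // on time.  firstPts stands for the first packet.pts of the stream.
    CMTimebaseRef controlTimebase = NULL;
    CMTimebaseCreateWithMasterClock(kCFAllocatorDefault,
                                    CMClockGetHostTimeClock(),
                                    &controlTimebase);

    self.renderer.controlTimebase = controlTimebase;

    // Same units as the sample buffers above:
    // value = pts * time_base.num, timescale = time_base.den.
    CMTimebaseSetTime(self.renderer.controlTimebase,
                      CMTimeMake(firstPts * st->time_base.num, st->time_base.den));

    // Rate 0.0 keeps the layer paused; 1.0 makes it present buffers in real time.
    CMTimebaseSetRate(self.renderer.controlTimebase, 1.0);

    // The layer retains the timebase, so this reference can be released.
    CFRelease(controlTimebase);
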
  • hls.js - how to subscribe to any event

    24 October 2016, by momomo

    The first time I play a playlist, I keep getting bufferStalledError.

    I can detect this error; however, if I resolve it, a jump or skip occurs in the video playback and the error stops occurring.

    However, if I ignore it, the video plays on without a noticeable interruption. The error keeps being raised, but without any noticeable issues.

    At times, though, the same error results in an overflow and hls.js is unable to recover automatically; after such a failure no further errors are reported beyond the last bufferStalledError.

    I then have to restart the video by destroying and re-attaching, resuming playback in a way that hls.js cannot do automatically or through recoverMediaError().

    The only problem is that I am unable to subscribe to an event that tells me the video is playing, or stuck. After a bufferStalledError, hls.js will usually recover automatically without a glitch, but sometimes it fails to do so. In both cases no more errors are reported.

    Is there perhaps another event, one that is not an error report, that says the video is playing?

    Is there an hls.on(Hls.Events.ALL, ...) event?

    What about setting up a TimelineController? It's not documented.

    Reference:
    https://github.com/dailymotion/hls.js/blob/master/API.md
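
    A minimal sketch, not from the documentation linked above: hls.js does not expose an Hls.Events.ALL constant, but every event name is available on the Hls.Events map, so one listener can be attached to each of them; the media element's own playing/waiting events are the simplest way to know whether playback is actually running or stuck. The video and playlistUrl variables are illustrative.

    // Attach hls.js to an existing <video> element (illustrative setup).
    var hls = new Hls();
    hls.loadSource(playlistUrl);
    hls.attachMedia(video);

    // Subscribe to every hls.js event by iterating the Hls.Events map.
    Object.keys(Hls.Events).forEach(function (name) {
      hls.on(Hls.Events[name], function (event, data) {
        console.log('hls event:', event, data);
      });
    });

    // Whether the video is playing or stuck is reported by the media element itself.
    video.addEventListener('playing', function () { console.log('playback running'); });
    video.addEventListener('waiting', function () { console.log('playback stalled'); });
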

  • Getting no result from GetStringUTFChars JNI

    9 April 2013, by talhamalik22

    I am using the Android NDK for my Android application. I am stuck on the starting lines and it does not compile further. Following is my code. It does not compile past "str = (*env)->GetStringUTFChars(env, filename, NULL);". Please check my Java and C code.

    The Java code:

    package com.example.myffmpegtest;

    import android.app.Activity;
    import android.os.Bundle;
    import android.os.Environment;
    import android.util.Log;

    public class MyffmpegActivity extends Activity {

        // Native entry point implemented in libmylib.so (see the C code below)
        private static native int logFileInfo(String filename);

        static {
            Log.i("HEHA", "HOHA");
            System.loadLibrary("mylib");
        }

        @Override
        protected void onCreate(Bundle savedInstanceState) {
            super.onCreate(savedInstanceState);
            setContentView(R.layout.activity_myffmpeg);
            String path = Environment.getExternalStorageDirectory().getPath();
            path = path + "/test.mp4";
            Log.i("Name Returned is ", ":" + path);
            int x = logFileInfo(path);
        }
    }

    The C code:

    jint Java_com_example_myffmpegtest_MyffmpegActivity_logFileInfo(JNIEnv *env, jobject this, jstring filename)
    {
        av_register_all();
        AVFormatContext *pFormatCtx;
        const jbyte *str;

        str = (*env)->GetStringUTFChars(env, filename, NULL);
        if (av_open_input_file(&pFormatCtx, str, NULL, 0, NULL) != 0)
        {
            LOGE("Can't open file '%s'\n", str);
            return 1;
        }
        else
        {
            LOGI("File was opened\n");
            LOGI("File '%s', Codec %s",
                 pFormatCtx->filename,
                 pFormatCtx->iformat->name);
        }
        return 0;
    }
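
    For reference, a hedged sketch (not the poster's code) of how this native function is commonly written: GetStringUTFChars returns a const char * that must be released with ReleaseStringUTFChars, the format context pointer has to be NULL before it is passed to avformat_open_input() (the call that replaced av_open_input_file in later FFmpeg versions), and the LOGI/LOGE macros here are assumed to wrap __android_log_print.

    #include <jni.h>
    #include <android/log.h>
    #include <libavformat/avformat.h>

    #define LOG_TAG "mylib"
    #define LOGI(...) __android_log_print(ANDROID_LOG_INFO,  LOG_TAG, __VA_ARGS__)
    #define LOGE(...) __android_log_print(ANDROID_LOG_ERROR, LOG_TAG, __VA_ARGS__)

    /* The Java method is declared static, so the second argument is the jclass. */
    jint Java_com_example_myffmpegtest_MyffmpegActivity_logFileInfo(JNIEnv *env, jclass clazz, jstring filename)
    {
        AVFormatContext *pFormatCtx = NULL;   /* must be NULL for avformat_open_input() */
        const char *path;

        path = (*env)->GetStringUTFChars(env, filename, NULL);
        if (path == NULL)
            return 1;                         /* OutOfMemoryError was already thrown */

        av_register_all();                    /* still required on FFmpeg releases of that era */

        if (avformat_open_input(&pFormatCtx, path, NULL, NULL) != 0) {
            LOGE("Can't open file '%s'\n", path);
            (*env)->ReleaseStringUTFChars(env, filename, path);
            return 1;
        }

        LOGI("File was opened\n");
        LOGI("File '%s', Format %s", path, pFormatCtx->iformat->name);

        avformat_close_input(&pFormatCtx);
        (*env)->ReleaseStringUTFChars(env, filename, path);
        return 0;
    }
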