
Other articles (33)

  • General document management

    13 May 2011, by

    MediaSPIP never modifies the original document that is uploaded.
    For each uploaded document it performs two successive operations: it creates an additional version that can easily be viewed online, while leaving the original available for download in case it cannot be read in a web browser; and it retrieves the original document's metadata to describe the file textually.
    The tables below explain what MediaSPIP can do (...)

  • Sites built with MediaSPIP

    2 May 2011, by

    This page presents some of the sites running MediaSPIP.
    You can of course add yours using the form at the bottom of the page.

  • HTML5 audio and video support

    13 April 2011, by

    MediaSPIP uses HTML5 video and audio tags to play multimedia files, taking advantage of the latest W3C innovations supported by modern browsers.
    The player used by MediaSPIP was created specifically for it and can easily be adapted to fit a specific theme.
    For older browsers, the Flowplayer Flash fallback is used.
    MediaSPIP allows for media playback on major mobile platforms with the above (...)

On other sites (5134)

  • Is it possible to play h264-mpegts format in iOS?

    11 October 2012, by jAckOdE

    The server transcodes the video to h264/aac and sends the video data to the client simultaneously. To play and save the video data at the same time, I use mpegts as the container format, but the problem is that the iOS MediaPlayer cannot play an mpegts file.

    Googling suggests I could use an iOS build of ffmpeg for the task, but that seems like overkill. Is there any other way to play mpegts on the iPhone using just the iOS SDK?
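
    As far as the stock SDK of that era goes, MPEG-TS is normally only playable in its segmented HTTP Live Streaming form: if the server can also publish the same H.264/AAC stream as an .m3u8 playlist with .ts segments, MPMoviePlayerController plays it natively. Below is a minimal sketch of the playback side only; the stream URL and the moviePlayer property are assumptions, not taken from the question.

    #import <MediaPlayer/MediaPlayer.h>

    // Sketch: plays an HLS playlist (MPEG-TS segments) with the stock player.
    // The stream URL and the `moviePlayer` property are hypothetical.
    - (void)playLiveStream
    {
        NSURL *streamURL = [NSURL URLWithString:@"http://example.com/live/stream.m3u8"];
        self.moviePlayer = [[MPMoviePlayerController alloc] initWithContentURL:streamURL];
        self.moviePlayer.movieSourceType = MPMovieSourceTypeStreaming;
        self.moviePlayer.view.frame = self.view.bounds;
        [self.view addSubview:self.moviePlayer.view];
        [self.moviePlayer prepareToPlay];
        [self.moviePlayer play];
    }

    Saving the data at the same time is the part the stock player does not help with; that would still have to be handled by whatever code receives the stream.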

  • Getting accurate time from FFmpeg with Objective-C (Audio Queue Services)

    2 April 2012, by Winston

    My iPhone app plays an audio file using FFmpeg.

    I'm getting the elapsed time (to show to the user) from the playing audio (in minutes and seconds, after converting from the microseconds given by FFmpeg) like so:

    AudioTimeStamp currentTimeStamp;
    AudioQueueGetCurrentTime (audioQueue, NULL, &currentTimeStamp, NULL);

    getFFMPEGtime = currentTimeStamp.mSampleTime/self.basicAudioDescription.mSampleRate;

    self.currentAudioTime = [NSString stringWithFormat: @"%02d:%02d",
                               (int) getFFMPEGtime / (int)60000000,
                               (int) ((getFFMPEGtime % 60000000)/1000000)];

    Everything works fine, but when I scrub back or forward to play another portion of the song, the elapsed time goes back to zero, no matter what the current position is. The timer always zeroes out.

    I know I'm supposed to do some math to keep track of the old time and the new time, maybe constructing another clock, perhaps implementing another callback function, etc. I'm not sure which way I should go.

    My questions are:

    1) What's the best approach to keeping track of the elapsed time when going back/forward in a song, so that the clock doesn't always go back to zero?

    2) Should I look more deeply into the FFmpeg functions, or should I stick with Objective-C and Cocoa Touch to solve this problem?

    Please, I need some advice/ideas from experienced programmers. I'm stuck. Thanks in advance!
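
    One common pattern (a general sketch, not the asker's code): AudioQueueGetCurrentTime reports samples played since the queue was last started, so it resets whenever the queue is stopped and restarted for a seek. Keeping a separate seek offset and adding the queue time on top of it keeps the displayed clock consistent; audioQueue, basicAudioDescription and seekOffsetSeconds are assumed to be properties of the player class.

    #import <AudioToolbox/AudioToolbox.h>

    - (Float64)currentPlaybackTimeInSeconds
    {
        AudioTimeStamp timeStamp;
        OSStatus status = AudioQueueGetCurrentTime(self.audioQueue, NULL, &timeStamp, NULL);
        if (status != noErr) {
            // Queue not running (e.g. right after a seek): fall back to the seek position.
            return self.seekOffsetSeconds;
        }
        // mSampleTime counts samples since the queue (re)started, so it resets on
        // every seek; the stored offset keeps the displayed total increasing.
        return self.seekOffsetSeconds
             + timeStamp.mSampleTime / self.basicAudioDescription.mSampleRate;
    }

    - (void)seekToSeconds:(Float64)seconds
    {
        // Remember where we jumped to before stopping/flushing the audio queue,
        // seeking on the FFmpeg side (av_seek_frame) and starting the queue again.
        self.seekOffsetSeconds = seconds;
    }

    - (NSString *)formattedCurrentTime
    {
        int total = (int)[self currentPlaybackTimeInSeconds];
        return [NSString stringWithFormat:@"%02d:%02d", total / 60, total % 60];
    }

    With this kind of bookkeeping the fix stays on the Objective-C/Cocoa Touch side; FFmpeg is only involved when it is told where to seek.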

  • Displaying YUV420 data using an OpenGL ES shader is too slow

    28 November 2012, by user1278982

    I have a child thread, A, that decodes video using ffmpeg on an iPhone 3GS, and another thread, B, that displays the YUV data. In thread B, I use glTexSubImage2D to upload the Y, U and V textures and then convert the YUV data to RGB in a shader, but the frame rate in the decode thread is only 15 fps. Why?

    Update:
    The frame size is 720 × 576.
    I also found something interesting: if I don't start the thread that displays the YUV data, the frame rate calculated in the decode thread is 22 fps, otherwise it is 15 fps. So I think my displaying method must be inefficient. The code is below.

    I have a callback in the decode thread:

    typedef struct _DVDVideoPicture
    {
      char *plane[4];
      int iLineSize[4];
    } DVDVideoPicture;

    void YUVCallBack(void *pYUVData, void *pContext)
    {
      VideoView *view = (VideoView *)pContext;
      [view.glView copyYUVData:(DVDVideoPicture *)pYUVData];
      [view calculateFrameRate];
    }

    The copyYUVData method extracts the Y, U and V planes separately. The following is the displaying thread method.
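
    The excerpt ends before the displaying-thread code. Purely for orientation, a per-plane upload of the kind described at the top of the question tends to look roughly like the sketch below; the texture ids, the hard-coded 720 × 576 plane sizes and the assumption that each plane is tightly packed (iLineSize equal to the width) are hypothetical, not taken from the question.

    #import <OpenGLES/ES2/gl.h>

    // Sketch only, not the asker's method. Assumes _yTexture, _uTexture and
    // _vTexture were created earlier with glTexImage2D as GL_LUMINANCE textures
    // of the matching sizes.
    - (void)copyYUVData:(DVDVideoPicture *)picture
    {
        // Y at full resolution, U and V at half resolution in each dimension (4:2:0).
        [self uploadPlane:_yTexture unit:0 data:picture->plane[0] width:720 height:576];
        [self uploadPlane:_uTexture unit:1 data:picture->plane[1] width:360 height:288];
        [self uploadPlane:_vTexture unit:2 data:picture->plane[2] width:360 height:288];
    }

    - (void)uploadPlane:(GLuint)texture unit:(int)unit data:(const char *)plane
                  width:(int)width height:(int)height
    {
        glActiveTexture(GL_TEXTURE0 + unit);
        glBindTexture(GL_TEXTURE_2D, texture);
        // Three full glTexSubImage2D uploads per 720x576 frame is a lot of bus
        // traffic for an iPhone 3GS, which is consistent with the slowdown above.
        glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, width, height,
                        GL_LUMINANCE, GL_UNSIGNED_BYTE, plane);
    }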