Keyword: - Tags -/book

Other articles (32)

  • XMP PHP

    13 May 2011

    According to Wikipedia, XMP stands for:
    Extensible Metadata Platform, or XMP, is an XML-based metadata format used in PDF, photography and graphics applications. It was launched by Adobe Systems in April 2001 and integrated into version 5.0 of Adobe Acrobat.
    Being based on XML, it handles a set of dynamic tags for use within the Semantic Web.
    XMP makes it possible to record information about a file as an XML document: title, author, history (...)
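    As a rough illustration (the field values are made up for the example), an XMP packet embedded in a file is simply an RDF/XML document along these lines:

        <?xpacket begin="" id="W5M0MpCehiHzreSzNTczkc9d"?>
        <x:xmpmeta xmlns:x="adobe:ns:meta/">
          <rdf:RDF xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#"
                   xmlns:dc="http://purl.org/dc/elements/1.1/">
            <rdf:Description rdf:about="">
              <!-- Title and author of the file the packet describes -->
              <dc:title>
                <rdf:Alt><rdf:li xml:lang="x-default">Example title</rdf:li></rdf:Alt>
              </dc:title>
              <dc:creator>
                <rdf:Seq><rdf:li>Example author</rdf:li></rdf:Seq>
              </dc:creator>
            </rdf:Description>
          </rdf:RDF>
        </x:xmpmeta>
        <?xpacket end="w"?>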

  • HTML5 audio and video support

    10 April 2011

    MediaSPIP uses the HTML5 video and audio tags to play multimedia documents, taking advantage of the latest W3C innovations supported by modern browsers.
    For older browsers, the Flowplayer Flash player is used instead.
    The HTML5 player used was created specifically for MediaSPIP: its appearance is fully customizable to match a chosen theme.
    These technologies make it possible to deliver video and sound both to conventional computers (...)
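    As a rough sketch of the fallback pattern described (file names, player paths and the Flowplayer configuration are hypothetical, not MediaSPIP's actual markup), the HTML5 element wraps the Flash player, which is only reached when the browser ignores the video tag:

        <!-- Modern browsers play the HTML5 sources; older browsers fall through
             to the nested Flash object (Flowplayer). -->
        <video controls width="640" height="360" poster="poster.jpg">
          <source src="clip.mp4" type="video/mp4">
          <source src="clip.ogv" type="video/ogg">
          <object type="application/x-shockwave-flash" data="flowplayer.swf"
                  width="640" height="360">
            <param name="movie" value="flowplayer.swf">
            <param name="flashvars" value='config={"clip":{"url":"clip.mp4"}}'>
          </object>
        </video>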

  • Other interesting software

    13 April 2011

    We don’t claim to be the only ones doing what we do... and we certainly don’t claim to be the best either... We simply try to do it well and to keep getting better.
    The following list covers software that is more or less similar to MediaSPIP, or that MediaSPIP more or less tries to do the same things as.
    We don’t know these projects and we haven’t tried them, but you can take a peek.
    Videopress
    Website: http://videopress.com/
    License: GNU/GPL v2
    Source code: (...)

On other sites (6041)

  • Android - How to pass frames back from FFmpeg to Android

    23 October 2013, by yarin

    This is an architecture question, and I am really interested in the answer.

    I am building an app with the following goals:

    1. Record video with effects in real time (using FFmpeg).

    2. Display the customized video to the user in real time while recording.

    So, after a month of work... I decided that goal number 2 is worth thinking about :)
    I have a working skeleton app that records video with effects in real time,
    but I still have to preview these customized frames back to the user.

    My options (and this is my question):

    1. Each frame passed from onPreviewFrame(byte[] video_frame_data, Camera camera) to FFmpeg through JNI for encoding is sent back to Android through the same JNI boundary after the effects are applied (i.e. onPreviewFrame -> JNI to FFmpeg -> immediately apply the effect -> send the customized frame back to the Android side for display -> encode the customized frame). A rough sketch of this option follows the question.

    Advantages: it looks like the easiest approach to use.

    Disadvantages: crossing JNI twice, and passing the frame back, could cost time (I really don't know whether that is a big price to pay, since it is only a byte array or int array per frame sent back to the Android side).

    2. I have heard about OpenGL in the NDK, but I think the surface itself is created on the Android side, so is it really going to be better?
    I would prefer to keep using the surface I am using now in Java.

    3. Create a video player on top of FFmpeg to preview each customized frame in real time.

    Thanks for your help. I hope the first solution is viable and does not consume too much time in terms of real-time processing.
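    For reference, here is a minimal sketch of option 1, assuming a hypothetical native method processFrame() that applies the effect (for example through FFmpeg filters) and returns the modified frame; the class, library and helper names are made up for illustration:

        import android.hardware.Camera;

        // Sketch of option 1: hand each preview frame to native code over JNI,
        // receive the processed frame back as the return value, then display
        // and encode it on the Java side.
        public class EffectPreview implements Camera.PreviewCallback {

            static {
                System.loadLibrary("ffmpeg_effects"); // hypothetical native library
            }

            // Native side (C/FFmpeg): apply the effect and return the modified frame.
            private native byte[] processFrame(byte[] frame, int width, int height);

            private final int width = 640;   // must match the chosen preview size
            private final int height = 480;

            @Override
            public void onPreviewFrame(byte[] frameData, Camera camera) {
                // First JNI crossing: send the raw preview frame to native code;
                // the customized frame comes back as this call's return value.
                byte[] processed = processFrame(frameData, width, height);

                displayFrame(processed); // draw the customized frame for the user
                encodeFrame(processed);  // hand the same frame to the encoder
            }

            private void displayFrame(byte[] frame) { /* render to a SurfaceView or TextureView */ }
            private void encodeFrame(byte[] frame)  { /* pass to the native/Java encoder */ }
        }

    Whether the extra copy per frame matters depends on the preview resolution and frame rate; timing a few frames with log timestamps is probably the quickest way to find out.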

  • fate: Switch ra4-288 test from framecrc() to pcm()

    24 September 2014, by Katerina Barone-Adesi
    fate: Switch ra4-288 test from framecrc() to pcm()
    

    The decoder is float-based and the test needs to allow for some fuzz.

    Signed-off-by: Diego Biurrun <diego@biurrun.de>

    • [DBH] tests/fate/real.mak
    • [DBH] tests/ref/fate/ra4-288
  • How to display a stream of mdat/moof boxes in VLC? [closed]

    8 July 2024, by roacs

    I am trying to display a real-time video stream in VLC. The snag is that the real-time video that is being received is a stream of just the mdat and moof boxes of a fragmented MP4 file that is being recorded elsewhere. The initialization information (ftyp/moov) is not and will never be available in the real-time stream. There is also no audio.

    I have access to initialization information (ftyp/moov) of a previously completed file and can use that to aid in the processing/streaming of the real-time mdat/moof boxes.

    I am currently extracting the contents of the mdat box, splitting them up, packaging them into 188-byte MPEG-TS packets, and multicasting them for VLC to pick up. As a shot in the dark, every 50 mdats I also package the SPS and PPS NALUs from the initialization information of the completed file and multicast those in one MPEG-TS packet. (A rough sketch of this packetization appears after the output listing below.)

    Input looks like this:

    • ...
    • mdat 1
    • moof 1
    • mdat 2
    • moof 2
    • ...
    • mdat N
    • moof N
    • ...

    And my output looks like this:

    • ...
    • MPEG-TS 1 containing first 184 bytes of mdat 1
    • MPEG-TS 2 containing next 184 bytes of mdat 1
    • ...
    • MPEG-TS N containing last 184 bytes of mdat 1
    • MPEG-TS N+1 containing first 184 bytes of mdat 2
    • MPEG-TS N+2 containing next 184 bytes of mdat 2
    • ...
    • MPEG-TS N+M containing last 184 bytes of mdat 2
    • ...
    • MPEG-TS containing SPS and PPS NALU
    • ...

    VLC gets the data but no video playback.
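    For reference, here is a rough Java sketch of the 188-byte packetization described above. The PID is a made-up value, and a real mux would also need PAT/PMT tables, PES headers and adaptation-field stuffing, all of which are omitted here; this is a simplified illustration of the approach, not a working muxer:

        import java.util.ArrayList;
        import java.util.List;

        // Splits one mdat payload into 188-byte MPEG-TS packets:
        // a 4-byte header followed by up to 184 payload bytes.
        public class TsPacketizer {

            private static final int TS_PACKET_SIZE  = 188;
            private static final int TS_PAYLOAD_SIZE = 184;
            private static final int PID = 0x100; // hypothetical PID for this stream

            private int continuityCounter = 0;

            public List<byte[]> packetize(byte[] mdatPayload) {
                List<byte[]> packets = new ArrayList<>();
                for (int offset = 0; offset < mdatPayload.length; offset += TS_PAYLOAD_SIZE) {
                    byte[] packet = new byte[TS_PACKET_SIZE];
                    boolean first = (offset == 0);

                    packet[0] = 0x47;                                        // sync byte
                    packet[1] = (byte) (((first ? 1 : 0) << 6)               // payload_unit_start_indicator
                                       | ((PID >> 8) & 0x1F));               // PID, high 5 bits
                    packet[2] = (byte) (PID & 0xFF);                         // PID, low 8 bits
                    packet[3] = (byte) (0x10 | (continuityCounter & 0x0F));  // payload only + counter
                    continuityCounter = (continuityCounter + 1) & 0x0F;

                    int len = Math.min(TS_PAYLOAD_SIZE, mdatPayload.length - offset);
                    System.arraycopy(mdatPayload, offset, packet, 4, len);
                    // A short final chunk is left zero-padded here; a real muxer
                    // would insert an adaptation field to stuff the packet instead.
                    packets.add(packet);
                }
                return packets;
            }
        }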

    How do I process this input in order to get it to play in VLC?
