
Other articles (18)

  • MediaSPIP Player: potential problems

    22 February 2011, by

    The player does not work on Internet Explorer
    On Internet Explorer (at least 8 and 7), the plugin uses the Flash player flowplayer to play video and audio. If the player does not seem to work, the cause may be the configuration of Apache's mod_deflate module.
    If the configuration of this Apache module contains a line that looks like the following, try removing or commenting it out to see whether the player then works correctly: (...)

  • Custom menus

    14 November 2010, by

    MediaSPIP uses the Menus plugin to manage several configurable navigation menus.
    This lets channel administrators fine-tune the configuration of these menus.
    Menus created when the site is initialised
    By default, three menus are created automatically when the site is initialised: the main menu; identifier: barrenav; this menu is generally inserted at the top of the page, after the header block, and its identifier makes it compatible with Zpip-based templates; (...)

  • XMP PHP

    13 May 2011, by

    According to Wikipedia, XMP stands for:
    Extensible Metadata Platform, or XMP, is an XML-based metadata format used in PDF, photography and graphics applications. It was launched by Adobe Systems in April 2001, integrated into version 5.0 of Adobe Acrobat.
    Being based on XML, it handles a set of dynamic tags for use in the Semantic Web.
    XMP makes it possible to record information about a file as an XML document: title, author, history (...)

On other sites (4826)

  • Real time livestreaming - RPI FFmpeg and H5 Player

    29 April 2022, by Victor

    I work at a telehealth company, and we are using connected medical devices to provide the doctor with real-time information from this equipment; the equipment is operated by a trained health professional.

    


    Those devices work with video and audio. Right now we are using them with peerjs (so a peer-to-peer connection), but we are trying to move away from that and have an RPI whose only job is to stream the data (audio and video).

    


    Because the equipment is meant to be used under a doctor's instructions, the doctor needs to receive the data in real time.

    


    But we also need the trained health professional to see what they are doing (so we need a local feed from the equipment).

    


    How do we capture audio and video

    


    We are using ffmpeg with a Go client that is in charge of managing the ffmpeg processes and streaming them to an SRS server.
    This works, but we are seeing a 2-3 second delay when streaming the data (RTMP from ffmpeg and FLV on the front end).

    


    ffmpeg settings:

    


    ("ffmpeg", "-f", "v4l2", `-i`, "*/video0", "-f", "flv", "-vcodec", "libx264", "-x264opts", "keyint=15", "-preset", "ultrafast", "-tune", "zerolatency", "-fflags", "nobuffer", "-b:a", "160k", "-threads", "0", "-g", "0", "rtmp://srs-url")


    


    My questions

    


      

    • Is there a way for this setup to achieve low latency (< 1 sec), both for the nurse and for the doctor?
    • Is the way I want to achieve this sound? Is there a better way?

    Flow schema

    Data exchange and use case flow (diagram).

    Note: The nurse and doctor use HTTP-FLV to play the live stream, for low latency.


  • FFMPEG video editing application. Need time and date stamp burned into video

    11 May 2022, by Jacob

    I am developing an application for video editing. The main component of this application is to produce a single video file from several video files captured from a camcorder, with the time and date stamp displayed on the final rendered video, much like the output of a security camera. Using FFMPEG, I have figured out how to burn the date and time into the video with a .SRT file as well as with drawtext, like the following:


    ffmpeg -y -i video.mp4 -vf "drawtext=fontfile=roboto.ttf:fontsize=12:fontcolor=yellow:text='%{pts\:localtime\:1575526882\:%A, %d, %B %Y %I\\\:%M\\\:%S %p}'" -preset ultrafast -f mp4 output_new.mp4
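
    The .SRT route mentioned above is not shown in the post; a minimal sketch of it, assuming a pre-generated timestamps.srt file containing the date/time text and an ffmpeg build with libass (the file names here are placeholders), would be along the lines of:

    # burn the subtitle cues from timestamps.srt into the video frames
    ffmpeg -y -i video.mp4 -vf "subtitles=timestamps.srt" -preset ultrafast output_srt.mp4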


    I would rather use the drawtext method so the user does not have to wait for the .SRT files to be generated. I am new to FFMPEG and I find the documentation very confusing, so I am hoping someone out there has experience with it.


    Everything seems to work when I pass in the date-created metadata from the video file, and drawtext just does its thing. The problem is that my application allows editing of the video. For lack of a better solution, I do this by letting the user select, from the UI, the beginning and ending frames they do not want; the code then simply deletes those frames from the directory where they were split and saved. I then use FFMPEG to iterate through the directory and combine the remaining frames into a video file, as sketched below.
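
    For context, recombining an image sequence from a directory is typically done with ffmpeg's image-sequence input. A minimal sketch, assuming 30 fps frames named frame_000001.png, frame_000002.png, and so on (the naming scheme and frame rate are assumptions, not taken from the post):

    # re-encode the remaining frames, in numeric order, into a single video
    ffmpeg -framerate 30 -i frames/frame_%06d.png -c:v libx264 -pix_fmt yuv420p combined.mp4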


    This approach starts the time and date from the date-created metadata; however, cutting frames out of the video makes the date/time stamp inaccurate, because of the missing frames.


    Is there any way to tell FFMPEG to burn in the date and time retrieved from each individual frame? I appreciate any advice you may have.


  • How to calculate the start time of an mp4 video?

    27 May 2022, by Neil Galiaskarov

    I am studying the mp4 video structure. I have an issue with reading the start time value for the following mp4 video.


    I have read this answer, "mp4 video starts at different time on Quicktime/AVplayer vs Chrome/Firefox", and it says that the Edit atom can modify the start time.


    Using ffprobe I have the following output:


        "start_time": "0.033333",&#xA;    "duration_ts": 327,&#xA;    "duration": "10.900000",&#xA;    "bit_rate": "9420949",&#xA;

    &#xA;

    Using mp4dumper I have the following atom structure, which confirms that the file is missing the Edit atom:


    ftyp (24 @ 0)
    free (8 @ 24)
    moov (7034 @ 32)
      mvhd (108 @ 40)
      trak (2883 @ 148)
          tkhd (92 @ 156)
          mdia (2783 @ 248)
              mdhd (32 @ 256)
              hdlr (52 @ 288)
              minf (2691 @ 340)
                  smhd (16 @ 348)
                  dinf (36 @ 364)
                      dref (28 @ 372)
                          url  (12 @ 388)
                  stbl (2631 @ 400)
                      stsd (91 @ 408)
                          mp4a (75 @ 424)
                      stts (24 @ 499)
                      stsc (304 @ 523)
                      stsz (2056 @ 827)
                      stco (148 @ 2883)
      trak (4035 @ 3031)
          tkhd (92 @ 3039)
          mdia (3935 @ 3131)
              mdhd (32 @ 3139)
              hdlr (52 @ 3171)
              minf (3843 @ 3223)
                  vmhd (20 @ 3231)
                  dinf (36 @ 3251)
                      dref (28 @ 3259)
                          url  (12 @ 3275)
                  stbl (3779 @ 3287)
                      stsd (163 @ 3295)
                          avc1 (147 @ 3311)
                      stts (24 @ 3458)
                      ctts (1960 @ 3482)
                      stsc (40 @ 5442)
                      stsz (1328 @ 5482)
                      stco (148 @ 6810)
                      stss (108 @ 6958)
    mdat (13096745 @ 7066)

    How does ffprobe calculate the 0.033333 start_time value?
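
    One plausible reading, given that there is no edit list but the video track does have a ctts box: ffprobe reports a stream's start_time as the presentation timestamp of its first frame, and the composition offsets used for B-frame reordering can shift that first PTS by one frame relative to its DTS; one frame at 30 fps is 1/30 ≈ 0.033333 s. A way to check this on the file (input.mp4 is a placeholder for the actual file name):

    # print PTS and DTS of the first few video frames
    ffprobe -v error -select_streams v:0 -read_intervals "%+#5" \
            -show_entries frame=pts_time,pkt_dts_time input.mp4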
