Other articles (67)

  • Accepted formats

    28 January 2010

    The following commands provide information about the formats and codecs supported by the local ffmpeg installation:
    ffmpeg -codecs
    ffmpeg -formats
    Accepted input video formats
    This list is not exhaustive; it highlights the main formats in use: h264: H.264 / AVC / MPEG-4 AVC / MPEG-4 part 10; m4v: raw MPEG-4 video format; flv: Flash Video (FLV) / Sorenson Spark / Sorenson H.263; Theora; wmv:
    Possible output video formats
    As a first step, we (...)
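
    As a rough illustration of the commands above, here is a minimal Node.js sketch (an assumption for illustration, not part of the original article) that runs ffmpeg -codecs against the local installation and checks whether a given codec appears in the list:

    const { execFile } = require('child_process');

    // Ask the local ffmpeg build which codecs it supports, then look for
    // a given codec name (e.g. "h264") in the output.
    function hasCodec(codec, callback) {
      execFile('ffmpeg', ['-codecs'], (err, stdout) => {
        if (err) return callback(err);
        callback(null, stdout.split('\n').some(line => line.includes(' ' + codec + ' ')));
      });
    }

    hasCodec('h264', (err, supported) => {
      if (err) throw err;
      console.log(supported ? 'h264 is supported' : 'h264 is not supported');
    });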

  • Encoding and processing into web-friendly formats

    13 April 2011

    MediaSPIP automatically converts uploaded files to internet-compatible formats.
    Video files are encoded in OGV and WebM (supported by HTML5) and in MP4 (supported by Flash).
    Audio files are encoded in Ogg (supported by HTML5) and in MP3 (supported by Flash).
    Where possible, text is analyzed in order to retrieve the data needed for indexing by search engines, and is then exported as a series of image files.
    All uploaded files are stored online in their original format, so you can (...)
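
    As a sketch of the kind of conversion described above (not MediaSPIP’s actual implementation; the file names, codecs and bitrates are placeholder assumptions), a Node.js call to ffmpeg producing an HTML5-friendly WebM rendition could look like this:

    const { execFile } = require('child_process');

    // Transcode an uploaded video to WebM for HTML5 playback, leaving the
    // original file untouched. Paths and bitrates are illustrative only.
    execFile('ffmpeg', [
      '-i', 'uploads/original.mov',
      '-c:v', 'libvpx', '-b:v', '1M',
      '-c:a', 'libvorbis', '-b:a', '128k',
      'cache/rendition.webm'
    ], (err) => {
      if (err) throw err;
      console.log('WebM rendition ready');
    });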

  • Retrieving information from the master site when installing an instance

    26 November 2010

    Purpose
    On the main site, a shared-hosting ("mutualisation") instance is defined by several things: the data in the spip_mutus table; its logo; its main author (the id_admin in the spip_mutus table, corresponding to an id_auteur in the spip_auteurs table), who will be the only one able to definitively create the mutualisation instance;
    It can therefore make perfect sense to retrieve some of this information in order to complete the installation of an instance, for example to retrieve the (...)

On other sites (6115)

  • How to generate valid live DASH for YouTube?

    24 September 2019, by Matt Hensley

    I am attempting to implement YouTube live video ingestion via DASH as documented at:
    https://developers.google.com/youtube/v3/live/guides/encoding-with-dash

    To start, I am exercising the YouTube API manually and running ffmpeg to verify required video parameters before implementing in my app.

    Created a new livestream with liveStreams.insert and these values for the cdn field:

    "cdn": {
       "frameRate": "variable",
       "ingestionType": "dash",
       "resolution": "variable"
    }

    Created a broadcast via liveBroadcasts.insert, then used liveBroadcasts.bind to bind the stream to the broadcast.
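
    For reference, a rough Node.js sketch of those three API calls, assuming the official googleapis client and an already-authorized OAuth2 client (the auth variable and the title strings are placeholders, not values from the original setup):

    const { google } = require('googleapis');

    async function createDashStream(auth) {
      const yt = google.youtube({ version: 'v3', auth });

      // 1. Create the live stream with a DASH ingestion endpoint.
      const stream = await yt.liveStreams.insert({
        part: ['snippet', 'cdn'],
        requestBody: {
          snippet: { title: 'DASH test stream' },
          cdn: { frameRate: 'variable', ingestionType: 'dash', resolution: 'variable' }
        }
      });

      // 2. Create the broadcast.
      const broadcast = await yt.liveBroadcasts.insert({
        part: ['snippet', 'status', 'contentDetails'],
        requestBody: {
          snippet: { title: 'DASH test broadcast', scheduledStartTime: new Date().toISOString() },
          status: { privacyStatus: 'unlisted' }
        }
      });

      // 3. Bind the stream to the broadcast.
      await yt.liveBroadcasts.bind({
        id: broadcast.data.id,
        part: ['id', 'contentDetails'],
        streamId: stream.data.id
      });

      // The ingestionAddress and streamName come back on the stream resource.
      return stream.data.cdn.ingestionInfo;
    }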

    Then I grabbed the ingestionInfo from the stream and ran this ffmpeg command, copying in the ingestionAddress with the streamName:

    ffmpeg -stream_loop -1 -re -i mov_bbb.mp4 \
       -loglevel warning \
       -r 30 \
       -g 60 \
       -keyint_min 60 \
       -force_key_frames "expr:eq(mod(n,60),0)" \
       -quality realtime \
       -map v:0 \
       -c:v libx264 \
       -b:v:0 800k \
       -map a:0 \
       -c:a aac \
       -b:a 128k \
       -strict -2 \
       -f dash \
       -streaming 1 \
       -seg_duration 2 \
       -use_timeline 0 \
       -use_template 1 \
       -window_size 5 \
       -extra_window_size 10 \
       -index_correction 1 \
       -adaptation_sets "id=0,streams=v id=1,streams=a" \
       -dash_segment_type mp4 \
       -method PUT \
       -http_persistent 1 \
       -init_seg_name "dash_upload?cid=${streamName}&copy=0&file=init$RepresentationID$.mp4" \
       -media_seg_name "dash_upload?cid=${streamName}&copy=0&file=media$RepresentationID$$Number$.mp4" \
       'https://a.upload.youtube.com/dash_upload?cid=${streamName}&copy=0&file=dash.mpd'

    It appears all the playlist updates and video segments upload fine to YouTube; ffmpeg does not report any errors. However, the liveStream status always shows noData, and the YouTube Live Control Room doesn’t show the stream as receiving data.

    The DASH output, when written to files, plays back fine in this test player. The playlist output doesn’t exactly match the samples, but it does have the required tags per the "MPD Contents" section of the documentation.

    Are my ffmpeg arguments incorrect, or does YouTube have additional playlist format requirements that are not documented?

  • NodeJS: create a storyboard like YouTube videos?

    24 September 2019, by Ben Beri

    I need to create something like this:

    [image: grid of video frame thumbnails generated as a preview sprite sheet]

    This image was generated using this package: https://www.npmjs.com/package/ffmpeg-generate-video-preview

    However, it’s not really suitable, because my storyboard has to be limited to a 10x10 grid of rows/columns, and after every full image I have to create the next 10x10 sheet. I can’t really make it generate rows/columns automatically, because it doesn’t know the maximum number of columns to generate based on the frames.

    How can I do such a thing, perhaps using ffmpeg?
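
    One possible approach (a sketch only; the input file name, thumbnail width and one-frame-per-second sampling are assumptions) is ffmpeg’s tile filter, which fills a fixed 10x10 grid and starts a new output image automatically once 100 thumbnails have been packed:

    const { execFile } = require('child_process');

    // Sample one frame per second, scale each frame down, and pack the
    // thumbnails into 10x10 sprite sheets. ffmpeg writes sheet_001.png,
    // sheet_002.png, ... and begins a new sheet every 100 frames.
    execFile('ffmpeg', [
      '-i', 'input.mp4',
      '-vf', 'fps=1,scale=160:-1,tile=10x10',
      '-vsync', 'vfr',
      'sheet_%03d.png'
    ], (err) => {
      if (err) throw err;
      console.log('Sprite sheets generated');
    });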

  • ffmpeg stream rejected by YouTube because it’s too slow

    15 September 2019, by DeadlyBacon

    I have an app that sends WebM video to a socket on my server; the socket handler then runs ffmpeg to transcode the video to FLV and send it to a YouTube RTMP ingest endpoint.

    The thing is, YouTube rejects the stream, saying that the broadcast status is incorrect, no matter what I do.

    In my naiveté I initially tried to stream 1080p and 720p; that failed, so I went down in resolution, assuming that might help with the bitrate issue. It did not.

    The error that YouTube gives me is in Spanish, but it basically says that the stream is too slow and I should lower the resolution or bitrate (I’m already at 240p and it’s not working).

    Edit: Here’s the error, translated:

    Main broadcast:
    YouTube is not receiving enough video to guarantee a smooth broadcast. Buffering will occur.

    Main broadcast: We are not receiving video data at a fast enough speed. Your audience may experience buffering. Make sure that your connection is fast enough, or consider using a lower bitrate.

    Here is my call to ffmpeg thus far (the language is Node.js, if that matters):

    const child_process = require('child_process');

    const ffmpeg = child_process.spawn('ffmpeg', [

       // Generated silent audio track (remove if audio is sent from the browser).
       '-f', 'lavfi', '-i', 'anullsrc',

       //    '-re', // I was told I shouldn't use this parameter. I don't know, honestly.

       // FFmpeg will read the input video from STDIN.
       '-i', '-',

       // Because we're using a generated audio source which never ends,
       // stop at the end of the other input. Remove this line if you
       // send audio from the browser.
       '-shortest',

       '-vcodec', 'libx264',
       '-acodec', 'aac',

       '-r', '24',

       '-force_key_frames', 'expr:gte(t,n_forced/2)',

       '-preset', 'ultrafast',
       '-pix_fmt', 'yuv420p',
       '-s', '426x240',
       '-crf', '23',
       '-bf', '2',
       '-q:a', '1',
       '-ac', '2',
       '-ar', '48000',
       '-use_editlist', '0',
       '-movflags', '+faststart',
       '-g', '48',

       '-minrate', '1000k',
       '-maxrate', '2000k',
       '-bufsize', '2000k',

       '-deadline', 'realtime',
       '-cpu-used', '-16',

       '-tune', 'zerolatency',

       '-threads', '4',

       // FLV is the container format used in conjunction with RTMP.
       '-f', 'flv',

       // The output RTMP URL (rtmp://a.rtmp.youtube.com/live2/<stream key>).
       // For debugging, this could be set to a filename like 'test.flv' and the
       // resulting file played with VLC.
       rtmpUrl
    ]);

    Edit: input is fed to ffmpeg’s stdin whenever the socket receives data.
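
    For context, a minimal sketch of that wiring (the socket and its event names are assumptions; the point is only that incoming chunks are piped into the ffmpeg process spawned above):

    // Forward each WebM chunk received on the socket into ffmpeg's stdin,
    // and close ffmpeg's input cleanly when the client disconnects.
    socket.on('data', (chunk) => {
      ffmpeg.stdin.write(chunk);
    });

    socket.on('close', () => {
      ffmpeg.stdin.end();
    });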

    In my completely uneducated opinion, the "-deadline realtime" and "-tune zerolatency" parts seemed to help somewhat, but not enough to get me streaming.

    Here is some of the output of ffmpeg:

    FFmpeg STDERR: frame=   35 fps=3.1 q=23.0 size=     104kB time=00:00:08.44 bitrate= 100.4kbits/s dup=0 drop=3 speed=0.76x    
    FFmpeg STDERR: frame=   35 fps=3.0 q=23.0 size=     104kB time=00:00:08.44 bitrate= 100.4kbits/s dup=0 drop=3 speed=0.726x    
    FFmpeg STDERR: frame=   35 fps=2.9 q=23.0 size=     104kB time=00:00:08.44 bitrate= 100.4kbits/s dup=0 drop=3 speed=0.696x    
    FFmpeg STDERR: frame=   36 fps=2.8 q=23.0 size=     109kB time=00:00:09.45 bitrate=  94.6kbits/s dup=0 drop=3 speed=0.747x    
    FFmpeg STDERR: frame=   36 fps=2.7 q=23.0 size=     109kB time=00:00:09.45 bitrate=  94.6kbits/s dup=0 drop=3 speed=0.719x    
    FFmpeg STDERR: frame=   38 fps=2.8 q=23.0 size=     121kB time=00:00:11.45 bitrate=  86.7kbits/s dup=0 drop=3 speed=0.839x  
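
    (The speed values below 1x in that log are the telling part: the encoder is producing output slower than real time. A hypothetical watcher for this, reusing the ffmpeg child process spawned above, could flag it:)

    // Watch ffmpeg's progress lines on stderr and warn whenever encoding
    // falls below real time (speed < 1x).
    ffmpeg.stderr.on('data', (chunk) => {
      const match = /speed=\s*([\d.]+)x/.exec(chunk.toString());
      if (match && parseFloat(match[1]) < 1.0) {
        console.warn('Encoder running at ' + match[1] + 'x - slower than real time');
      }
    });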

    This is extremely frustrating.

    Please, guys, any guidance is welcome. What is it that I should be maximizing or minimizing, and how do I do that? I honestly do not care about everything else going to hell (for example, the video freezing or being severely pixelated at times); I just care about being able to stream.

    Thank you in advance for anything that might help me!

    UPDATE: I created a new broadcast with variable bitrate and broadcasted to it; it was at "status: optimal" for a few seconds, then the fps and speed went down and the stream went back to "status: incorrect".