Advanced search

Media (1)

Keyword: - Tags -/ticket

Other articles (13)

  • Encoding and processing into web-friendly formats

    13 April 2011, by

    MediaSPIP automatically converts uploaded files to internet-compatible formats.
    Video files are encoded as MP4, OGV and WebM (supported by HTML5) and as MP4 (supported by Flash).
    Audio files are encoded as MP3 and Ogg (supported by HTML5) and as MP3 (supported by Flash).
    Where possible, text is analyzed to retrieve the data needed by search engines, and is then exported as a series of image files.
    All uploaded files are stored online in their original format, so you can (...)

  • List of compatible distributions

    26 April 2011, by

    The table below lists the Linux distributions compatible with the automated installation script of MediaSPIP:

    Distribution    Version name            Version number
    Debian          Squeeze                 6.x.x
    Debian          Wheezy                  7.x.x
    Debian          Jessie                  8.x.x
    Ubuntu          The Precise Pangolin    12.04 LTS
    Ubuntu          The Trusty Tahr         14.04
    If you want to help us improve this list, you can provide us access to a machine whose distribution is not mentioned above or send the necessary fixes to add (...)

  • Submit bugs and patches

    13 April 2011

    Unfortunately, software is never perfect.
    If you think you have found a bug, report it using our ticket system. Please help us fix it by providing the following information: the browser you are using, including its exact version; as precise an explanation of the problem as possible; if possible, the steps that led to the problem; and a link to the site/page in question.
    If you think you have solved the bug, fill in a ticket and attach a corrective patch to it.
    You may also (...)

On other sites (3114)

  • Using FFMPEG filters directly in Web Audio API AudioWorklet

    13 July 2021, by Cynic_Android

    I am trying to leverage the vast number of audio filters that FFmpeg has and see if I can use them directly in a custom AudioWorklet, so I don't have to reinvent the wheel for each and every filter. One option I came across was to convert the AVFilter library to WASM and write a wrapper class to call the library functions.
https://dev.to/alfg/ffmpeg-webassembly-2cbl

    


    But I am looking for a solution where the data can be piped to the filter and the output passed instantaneously to the other audio worklet nodes so the effect can be heard without a delay.
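    As a rough illustration of what that WASM glue could look like inside an AudioWorkletProcessor (all export names used below, input_ptr, output_ptr and process_block, are invented for this sketch and are not a real ffmpeg.wasm API):

    // Hypothetical glue between an AudioWorklet and a WASM-compiled filter.
    // The export names (input_ptr, output_ptr, process_block) are invented
    // for illustration; a real build would define its own interface.
    class FFmpegFilterProcessor extends AudioWorkletProcessor {
      constructor() {
        super();
        this.wasm = null;
        // An AudioWorklet cannot fetch() on its own, so the main thread
        // posts the compiled WASM bytes through the MessagePort.
        this.port.onmessage = async (e) => {
          const { instance } = await WebAssembly.instantiate(e.data.wasmBytes, {});
          this.wasm = instance;
        };
      }

      process(inputs, outputs) {
        const input = inputs[0];
        const output = outputs[0];
        if (!this.wasm || input.length === 0) {
          // Pass audio through untouched until the filter is ready.
          for (let ch = 0; ch < input.length; ch++) output[ch].set(input[ch]);
          return true;
        }
        // Copy the 128-frame render quantum into WASM memory, run the
        // filter, then copy the filtered samples back out.
        for (let ch = 0; ch < input.length; ch++) {
          new Float32Array(this.wasm.exports.memory.buffer,
                           this.wasm.exports.input_ptr(ch), 128).set(input[ch]);
        }
        this.wasm.exports.process_block(128);
        for (let ch = 0; ch < output.length; ch++) {
          output[ch].set(new Float32Array(this.wasm.exports.memory.buffer,
                                          this.wasm.exports.output_ptr(ch), 128));
        }
        return true;
      }
    }
    registerProcessor('ffmpeg-filter-processor', FFmpegFilterProcessor);

    Because process() runs synchronously once per 128-frame render quantum, whatever the filter writes back is handed to the next AudioNode in the graph without extra buffering.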

    


    Any kind of help would be highly appreciated.

    


  • FFMPEG/DASH-LL creates audio and video chunks at different rates; player is confused (404 errors)

    26 May 2021, by Danny

    I'm trying to create a MPEG-DASH "live" stream from a static file to test various low latency modes. The DASH muxer in FFmpeg creates two AdaptationSets, one for video chunks and one for audio chunks.

    


    However, the audio and video chunk files are not created at the same rate (should they be?). Here, stream0 refers to the video chunks and stream1 to the audio chunks. After a few seconds of running, the webroot directory contains:

    


    chunk-stream0-00001.m4s  chunk-stream1-00001.m4s  
chunk-stream0-00002.m4s  chunk-stream1-00002.m4s  
chunk-stream0-00003.m4s  chunk-stream1-00003.m4s  
chunk-stream0-00004.m4s  chunk-stream1-00004.m4s  
                         chunk-stream1-00005.m4s  
                         chunk-stream1-00006.m4s  
                         chunk-stream1-00007.m4s  
                         chunk-stream1-00008.m4s  
                         chunk-stream1-00009.m4s  
master.mpd  
init-stream0.m4s  
init-stream1.m4s  


    


    The stream doesn't load (or play) on either dash.js or shaka-player, and there are lots of 404 (Not Found) errors for the video chunks. The player requests chunks from both stream0 and stream1 in sequence, i.e. stream0-001 + stream1-001, then stream0-002 + stream1-002, and so on.

    


    But since stream0 only goes from 001 to 004, there are lots of 404 errors as it tries to load stream0-005 through 009.
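    To make the numbering mismatch concrete, here is a simplified sketch (not dash.js internals, just the arithmetic implied by the SegmentTemplate values in the generated MPD further down: timescale=1000000, duration=2000000, startNumber=1). Both AdaptationSets advertise the same 2-second duration, so the player requests the same number for stream0 and stream1:

    // Simplified: live-edge segment number derived from SegmentTemplate values.
    function liveEdgeSegmentNumber(nowMs, availabilityStartMs) {
      const timescale = 1000000;     // SegmentTemplate@timescale
      const duration = 2000000;      // SegmentTemplate@duration -> 2 s segments
      const startNumber = 1;         // SegmentTemplate@startNumber
      const secondsOnAir = (nowMs - availabilityStartMs) / 1000;
      return startNumber + Math.floor(secondsOnAir / (duration / timescale));
    }

    // 20 s after availabilityStartTime the player asks for segment 11 of
    // BOTH streams, although only ~4 video segments exist if the encoder
    // only produces a keyframe every 5 s.
    console.log(liveEdgeSegmentNumber(20 * 1000, 0));  // -> 11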

    


    The gap gets wider after letting FFmpeg run for a while, e.g. stream0 ranges from 62 to 75 while stream1 ranges from 174 to 187. Reloading the player page at this point fails with dash.all.debug.js:15615 [2055][FragmentController] No video bytes to push or stream is inactive. and shows 404 errors for stream0 chunk 188 (which doesn't exist yet!)

    



    


    The FFmpeg command was adapted from DASH streaming from the top down:

    


    ffmpeg -re -i /mnt/swdevel/TestStreams/H264/ThreeHourMovie.mp4 \
-c:v libx264 -x264-params keyint=120:scenecut=0 -b:v 1M -c:a copy \
-f dash -dash_segment_type mp4 \
 -seg_duration 2 \
 -target_latency 3 \
 -frag_type duration \
 -frag_duration 0.2 \
 -window_size 10 \
 -extra_window_size 3 \
 -streaming 1 \
 -ldash 1 \
 -use_template 1 \
 -use_timeline 0 \
 -write_prft 1 \
 -fflags +nobuffer+flush_packets \
 -format_options "movflags=+cmaf" \
 -utc_timing_url "/pelican/testPlayers/time.php" \
 master.mpd


    


    And the dash.js player code is very simple:

    


    const srcUrl = "../ottWebRoot/playerTest/master.mpd"; 

var player = dashjs.MediaPlayer().create();

let autoPlay = false;
player.initialize(document.querySelector("#videoTagId"), srcUrl, autoPlay);

player.updateSettings(
{
    streaming :
    {
        lowLatencyEnabled : true,
        liveDelay : 2,
        jumpGaps : true,
        jumpLargeGaps : true,
        smallGapLimit : 1.5,
    }
});


    


    To provide the UTCTiming element in the manifest, the small time.php URL returns a UTC time from the web server:

    


    <?php
    print gmdate("Y-m-d\TH:i:s\Z");
    ?>


    


    (It also shows 404 errors for the latest stream1/audio chunk, but that's likely a different problem.)

    


    I'm not sure what to try next. Any and all suggestions greatly appreciated.

    


    EDIT I

    


    The suggestion by @Anonymous Coward to change the key interval improved things a lot. The chunks for stream0 and stream1 are in lock-step and have identical sequence numbers.
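    For reference, the arithmetic behind that fix (assuming the 24/1 frame rate reported in the generated MPD below; the exact keyint value used after the edit is not shown here):

    -seg_duration 2 asks for 2 s segments, i.e. 2 s x 24 fps = 48 frames per segment.
    keyint=120 produces a keyframe only every 120 / 24 = 5 s, so video segments cannot be cut every 2 s and stream0 falls behind stream1.
    keyint=48 would produce a keyframe every 2 s, aligned with the requested segment duration.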

    


    However, there are still many 404 errors, both on initial page load (without pressing play) and during playback.

    


    I ran watch -n 1 ls -lt and compared it side by side with the errors in the browser console. It's hard to compare, but it looks like the browser is trying to fetch files "on the play edge" which haven't yet been created by FFmpeg. See the pic below.


    How do I instruct the browser to wait just a bit more before fetching the edge chunks?


    [screenshot: directory listing alongside the browser console errors]

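    One knob to experiment with is the liveDelay value already used in the updateSettings call above; the sketch below simply raises it so the playhead sits further behind the live edge (the value 4 is picked purely for illustration, and it is not confirmed here that this removes all edge-chunk requests):

    player.updateSettings(
    {
        streaming :
        {
            lowLatencyEnabled : true,
            // A larger liveDelay keeps playback further behind the live edge,
            // so the requested chunk numbers stay below the newest file on disk.
            liveDelay : 4,
            jumpGaps : true,
            jumpLargeGaps : true,
            smallGapLimit : 1.5,
        }
    });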

    EDIT II


    Using shaka-player instead of dash.js plays properly without 404 errors. It is configured as:


    player.configure(
    {
        streaming:
        {
            lowLatencyMode: true,
            inaccurateManifestTolerance: 0,
            rebufferingGoal: 0.1,
        }
    });


    Client

    • MacOS 10.12
    • dash.js latest 3.2.2
    • Chrome 79, Safari 12, FireFox v?

    Server

    • Apache 2.4.37
    • PHP 7.2.4 (for time function only)
    • CentOS 8

    (For reference, here is the mpd file generated by FFmpeg)


    <?xml version="1.0" encoding="utf-8"?>
    <MPD xmlns="urn:mpeg:dash:schema:mpd:2011" profiles="urn:mpeg:dash:profile:isoff-live:2011" type="dynamic" minimumUpdatePeriod="PT500S" availabilityStartTime="2021-05-24T14:50:00.263Z" publishTime="2021-05-24T15:22:45.335Z" timeShiftBufferDepth="PT50.0S" maxSegmentDuration="PT2.0S" minBufferTime="PT5.0S">
        <ProgramInformation>
        </ProgramInformation>
        <ServiceDescription>
            <Latency target="3000" referenceId="0"></Latency>
        </ServiceDescription>
        <Period start="PT0.0S">
            <AdaptationSet contentType="video" startWithSAP="1" segmentAlignment="true" bitstreamSwitching="true" frameRate="24/1" maxWidth="1280" maxHeight="682" par="15:8" lang="und">
                <Resync dT="200000" type="0"></Resync>
                <Representation mimeType="video/mp4" codecs="avc1.64081f" bandwidth="1000000" width="1280" height="682" sar="1023:1024">
                    <ProducerReferenceTime inband="true" type="captured" wallClockTime="2021-05-24T14:50:00.263Z" presentationTime="0">
                        <UTCTiming schemeIdUri="urn:mpeg:dash:utc:http-xsdate:2014" value="/pelican/testPlayers/time.php"></UTCTiming>
                    </ProducerReferenceTime>
                    <Resync dT="5000000" type="1"></Resync>
                    <SegmentTemplate timescale="1000000" duration="2000000" availabilityTimeOffset="1.800" availabilityTimeComplete="false" initialization="init-stream$RepresentationID$.m4s" media="chunk-stream$RepresentationID$-$Number%05d$.m4s" startNumber="1">
                    </SegmentTemplate>
                </Representation>
            </AdaptationSet>
            <AdaptationSet contentType="audio" startWithSAP="1" segmentAlignment="true" bitstreamSwitching="true" lang="und">
                <Resync dT="200000" type="0"></Resync>
                <Representation mimeType="audio/mp4" codecs="mp4a.40.2" bandwidth="116317" audioSamplingRate="48000">
                    <AudioChannelConfiguration schemeIdUri="urn:mpeg:dash:23003:3:audio_channel_configuration:2011" value="2"></AudioChannelConfiguration>
                    <ProducerReferenceTime inband="true" type="captured" wallClockTime="2021-05-24T14:50:00.306Z" presentationTime="0">
                        <UTCTiming schemeIdUri="urn:mpeg:dash:utc:http-xsdate:2014" value="/pelican/testPlayers/time.php"></UTCTiming>
                    </ProducerReferenceTime>
                    <Resync dT="21333" type="1"></Resync>
                    <SegmentTemplate timescale="1000000" duration="2000000" availabilityTimeOffset="1.800" availabilityTimeComplete="false" initialization="init-stream$RepresentationID$.m4s" media="chunk-stream$RepresentationID$-$Number%05d$.m4s" startNumber="1">
                    </SegmentTemplate>
                </Representation>
            </AdaptationSet>
        </Period>
        <UTCTiming schemeIdUri="urn:mpeg:dash:utc:http-xsdate:2014" value="/pelican/testPlayers/time.php"></UTCTiming>
    </MPD>


  • lavf/srtdec : rewrite parsing logic

    22 December 2015, by Clément Bœsch
    lavf/srtdec : rewrite parsing logic
    

    Fixes Ticket #5032

    The samples in Ticket #5032 use \r\r\n as line breaks. Since we
    already handle \r, \n, or \r\n as line breaks, \r\r\n is
    considered a double line break. This is an issue because
    ff_subtitles_read_text_chunk() will, as a result, stop extracting a chunk
    after just one line.

    So instead of parsing the SRT by "chunks" (which means splitting on every
    double line break), the new parser detects timing lines and splits the
    events on that basis. While this sounds safe and simple, it needs to
    take into account the event number preceding the timing line while
    handling situations such as:

    - event number starting at 0, or actually any number instead of 1
    - event numbers not being ordered at all
    - event number being followed by text garbage (this really happened,
    see Ticket #4898)
    - event payload containing one or multiple numbers (a protagonist saying
    a count-down, a date or whatever) which could be confused with a
    chapter number
    - event number being empty (see Ticket #2167)
    - all kinds of weird line breaks appearing randomly, like wild Pokémon
    - untrustworthy line breaks (Ticket #5032)

    The sample madness.srt tries to sum up most of this in one file,
    ticket5032-rrn.srt is the file containing \r\r\n line breaks, and
    empty-events-2167.srt contains empty events.
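
    As a simplified illustration of the approach described above (a JavaScript sketch, not the actual C implementation in libavformat/srtdec.c):

    // Split events on timing lines rather than on blank lines, so odd line
    // breaks (\r\r\n etc.) cannot truncate an event.
    const TIMING_RE = /^\s*\d+:\d{2}:\d{2}[,.]\d{1,3}\s*-->\s*\d+:\d{2}:\d{2}[,.]\d{1,3}/;

    function splitSrtEvents(text) {
      // Treat every flavour of line break as a plain \n first.
      const lines = text.split(/\r\r\n|\r\n|\r|\n/);
      const events = [];
      let current = null;
      for (const line of lines) {
        if (TIMING_RE.test(line)) {
          // A timing line starts a new event. The line just before it is
          // usually the event counter (any number, out of order, or even
          // empty), so drop it from the previous event's payload.
          if (current && /^\s*\d*\s*$/.test(current.text[current.text.length - 1] || '')) {
            current.text.pop();
          }
          current = { timing: line.trim(), text: [] };
          events.push(current);
        } else if (current) {
          current.text.push(line);
        }
      }
      return events;
    }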

    • [DH] libavformat/srtdec.c
    • [DH] tests/fate/subtitles.mak
    • [DH] tests/ref/fate/sub-srt-empty-events
    • [DH] tests/ref/fate/sub-srt-madness-timeshift
    • [DH] tests/ref/fate/sub-srt-rrn-remux