Other articles (32)

  • Keeping control of your media in your hands

    13 April 2011

    The vocabulary used on this site, and around MediaSPIP in general, aims to avoid references to Web 2.0 and to the companies that profit from media sharing.
    While using MediaSPIP, you are invited to avoid using words like "Brand", "Cloud" and "Market".
    MediaSPIP is designed to facilitate the sharing of creative media online, while allowing authors to retain complete control of their work.
    MediaSPIP aims to be accessible to as many people as possible and development is based on expanding the (...)

  • Contribute to a better visual interface

    13 April 2011

    MediaSPIP is based on a system of themes and templates. Templates define the placement of information on the page, and can be adapted to a wide range of uses. Themes define the overall graphic appearance of the site.
    Anyone can submit a new graphic theme or template and make it available to the MediaSPIP community.

  • Submit bugs and patches

    13 April 2011

    Unfortunately, no software is ever perfect.
    If you think you have found a bug, report it using our ticket system. Please help us fix it by providing the following information: the browser you are using, including the exact version; as precise an explanation of the problem as possible; if possible, the steps that led to the problem; and a link to the site/page in question.
    If you think you have solved the bug, fill in a ticket and attach a corrective patch to it.
    You may also (...)

On other sites (4529)

  • Put vast or vpaid overlay on live stream rtmp url output

    8 February 2020, by yolov3

    How can I merge a VAST/VPAID overlay onto the output of a live RTMP stream using ffmpeg?
    That is, how do I get an RTMP or m3u8 output stream link with the VAST/VPAID overlay merged in at stream time, so that it cannot be changed afterwards?
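
    For what it is worth, ffmpeg cannot execute a VAST/VPAID ad unit itself (those are scripts run by the video player); what it can do is burn a pre-rendered creative into the picture while restreaming. A minimal sketch, with placeholder URLs and an assumed ad_overlay.png file, using the overlay filter:

    ffmpeg -i rtmp://source.example/live/stream -i ad_overlay.png -filter_complex "[0:v][1:v]overlay=W-w-10:10" -c:v libx264 -preset veryfast -c:a copy -f flv rtmp://destination.example/live/stream_with_ad

    Because the overlay is encoded into the outgoing video, viewers of the resulting RTMP or m3u8 link cannot remove it; the trade-off is that none of the VPAID interactivity or tracking survives.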

  • Node.js Live Streaming: Avoid buffering

    27 October 2012, by Shirish Kamath

    I've written a small nodeJS server that outputs system audio captured by ffmpeg on Windows (using DirectShow) to the browser as a streaming MP3 file. The audio needs to be as live as possible, with minimum/no buffering, and a "skipping" effect in the audio is perfectly acceptable.

    When I play the audio in Chrome using the HTML5 audio tag, there's a delay of about 8-10 secs over a low-latency LAN connection. I suspected this to be a client-side buffer, and used a Flash MP3 player on the client-side, which brought down the delay to 2-3 secs.

    Now, the buffering seems to be taking place on the server side. The documentation for Node.js's response.write mentions that the data is written to kernel buffers. How do I go about avoiding any buffering altogether, or at least getting around it, so that the client always gets the latest audio data? Are there strategies for handling 'drain' events so as to always push live data?

    On the request object, I've used setNoDelay(true) to avoid the use of Nagle's algorithm. Following is a snippet of how data is written when the spawned ffmpeg process emits data.

    var clients = []; //List of client connections currently being served
    ffmpeg.stdout.on('data', function(data) {
        for (var i = 0; i < clients.length; i++) {
            clients[i].res.write(data);
        }
    });
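
    One strategy (a minimal sketch, not the original poster's code, with hypothetical names such as backpressured and an assumed DirectShow capture device): treat a false return value from res.write() as the signal that this client's socket buffers are full, stop writing to that client until its 'drain' event fires, and let slow clients simply miss chunks instead of letting latency accumulate on the server.

    var http = require('http');
    var spawn = require('child_process').spawn;

    var clients = []; // one entry per connected listener

    // Hypothetical capture arguments; the real DirectShow device name will differ.
    var ffmpeg = spawn('ffmpeg', [
        '-f', 'dshow', '-i', 'audio=virtual-audio-capturer',
        '-acodec', 'libmp3lame', '-f', 'mp3', 'pipe:1'
    ]);

    http.createServer(function(req, res) {
        res.writeHead(200, { 'Content-Type': 'audio/mpeg' });
        req.socket.setNoDelay(true); // disable Nagle's algorithm on this connection

        var client = { res: res, backpressured: false };
        clients.push(client);

        // 'drain' fires once the previously full write buffers have emptied again.
        res.on('drain', function() {
            client.backpressured = false;
        });
        req.on('close', function() {
            clients.splice(clients.indexOf(client), 1);
        });
    }).listen(8080);

    ffmpeg.stdout.on('data', function(data) {
        clients.forEach(function(client) {
            if (client.backpressured) return; // skip clients that are still draining
            // write() returns false when the chunk had to be queued in buffers.
            client.backpressured = !client.res.write(data);
        });
    });

    Dropping MP3 chunks this way produces exactly the kind of audible "skipping" the question says is acceptable; in exchange, a client never falls more than one socket buffer behind the live edge.
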
  • Live Stream using .m3u8 and .ts files with iPhone as server

    26 February 2015, by Bhumit

    I am trying to live stream from the iPhone camera. I have done some research and found that I can use an .m3u8 playlist for streaming live video, which should reference .ts (MPEG-2 transport stream) files.

    Now, the file I have on my iPhone is an .mp4 file and it does not work with .m3u8, so I figured I will have to convert the .mp4 to .ts, but I have not succeeded in doing so.

    I found that it is possible to convert video with the ffmpeg library, as mentioned in this article here. I have successfully imported the ffmpeg library but am not able to figure out how to use it to convert a video, as I am using it for the first time.

    One other thing: the Apple documentation says

    There are a number of hardware and software encoders that can create
    MPEG-2 transport streams carrying MPEG-4 video and AAC audio in real
    time.

    What is being said here? Is there any other way I can use .mp4 files for live streaming from iOS without converting them?

    Let me know if I am not clear; I can provide more information. Any suggestion is appreciated. I would like to know whether I am on the right path here.

    EDIT

    I am adding more info to my question. Basically, what I am asking is: we can convert an .mp4 video to .ts segments using the following command

    ffmpeg -i file.mp4 -acodec libfaac -vcodec libx264 -an -map 0 -f segment -segment_time 10 -segment_list test.m3u8 -segment_format mpegts -vbsf h264_mp4toannexb -flags -global_header stream%05d.ts

    How can I use the ffmpeg library to do what this command does, on iOS?