Other articles (99)

  • MediaSPIP 0.1 Beta version

    25 April 2011

    MediaSPIP 0.1 beta is the first version of MediaSPIP proclaimed as "usable".
    The zip file provided here only contains the sources of MediaSPIP in its standalone version.
    To get a working installation, you must manually install all software dependencies on the server.
    If you want to use this archive for an installation in "farm mode", you will also need to proceed to other manual (...)

  • Websites made with MediaSPIP

    2 May 2011

    This page lists some websites based on MediaSPIP.

  • Creating farms of unique websites

    13 April 2011

    MediaSPIP platforms can be installed as a farm, with a single "core" hosted on a dedicated server and used by multiple websites.
    This allows (among other things): implementation costs to be shared between several different projects/individuals; rapid deployment of multiple unique sites; creation of groups of like-minded sites, making it possible to browse media in a more controlled and selective environment than the major "open" (...)

On other sites (8749)

  • FFmpeg drawtext filter - is it possible to use variables with live data for x,y coordinates?

    3 May 2019, by DavidK

    I’d like to use variables for FFmpeg’s drawtext filter’s x,y coordinates so I can feed them with real-time data. The solution below with sendcmd works, but I have to add relative timecodes at the beginning so FFmpeg can link coordinates to time positions. Can it be done without timecodes, using only the actual coordinates, and can I tell FFmpeg to update these coordinates every 100 ms?

    It would look like this in my case:

    cmd.txt

    drawtext reinit 'x=960:y=540'; (coordinates change when there’s a new position from the live source and FFmpeg updates these via sendcmd regularly).

    Thanks!
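
    For reference, a minimal sketch of the timecoded sendcmd form that the question says currently works, showing both the command file and the filtergraph (the coordinate values, the 100 ms spacing and the font path are illustrative assumptions, not taken from the question):

    cmd.txt:

    0.0 drawtext reinit 'x=960:y=540';
    0.1 drawtext reinit 'x=965:y=538';
    0.2 drawtext reinit 'x=971:y=536';

    Invocation:

    $ ffmpeg -i input.mp4 \
        -vf "sendcmd=f=cmd.txt,drawtext=fontfile=/path/to/font.ttf:text='position':x=0:y=0" \
        out.mp4

    If FFmpeg is built with libzmq, the zmq filter can receive the same reinit commands over a ZeroMQ socket at arbitrary times, which may avoid pre-computed timecodes altogether.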

  • How to use ffmpeg for live streaming fragmented mp4?

    2 May 2018, by Cross_

    Following a variety of stackoverflow suggestions I am able to create a fragmented MP4 file, then slice it into the header part (FTYP & MOOV) and various segment files (each containing MOOF & MDAT). Using Media Source Extensions I download and add the individual segments - that’s all working great.

    Now I would like to create a live streaming webcam with that same approach. I had hoped that I could just send the MOOV box to each new client plus the currently streaming segment. This however is rejected as invalid data in the browser. I have to start with the first segment and they have to be appended in order. That’s not helpful for a live streaming scenario where you don’t want to see the whole stream from the start. Is there any way to alter the files so that the segments are truly independent and you can start playback from the middle ?

    For reference, this is how I am setting up the stream on the OS X server:

    $ ffmpeg -y -hide_banner -f avfoundation -r 30 -s 1280x720 \
        -pixel_format yuyv422 -i default -an \
        -c:v libx264 -profile:v main -level 3.2 -preset medium -tune zerolatency \
        -flags +cgop+low_delay \
        -movflags empty_moov+omit_tfhd_offset+frag_keyframe+default_base_moof+isml \
        -pix_fmt yuv420p | split_into_segments.py

    Playback is done with a slightly modified version of this sample code:
    https://github.com/bitmovin/mse-demo/blob/master/index.html
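
    For comparison, a sketch of the same capture with an explicit piped output and fixed keyframe spacing, so that every fragment starts on a keyframe and can in principle be decoded on its own (the -f mp4 pipe:1 output, the x264 keyint settings and the 60-frame spacing are assumptions, not part of the original command):

    $ ffmpeg -y -hide_banner -f avfoundation -r 30 -s 1280x720 \
        -pixel_format yuyv422 -i default -an \
        -c:v libx264 -profile:v main -level 3.2 -preset medium -tune zerolatency \
        -flags +cgop+low_delay -x264opts keyint=60:min-keyint=60:no-scenecut \
        -movflags empty_moov+omit_tfhd_offset+frag_keyframe+default_base_moof+isml \
        -pix_fmt yuv420p -f mp4 pipe:1 | split_into_segments.py

    Even then, Media Source Extensions still expect the init segment (FTYP + MOOV) to be appended first; appending a mid-stream MOOF/MDAT segment afterwards generally also means adjusting SourceBuffer.timestampOffset (or using the 'sequence' append mode) so the browser does not wait for the earlier, never-appended timestamps.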

  • live-streaming media with

    8 February 2012, by Thanh Hoang

    My live-streaming system streams media in the following steps (a rough command-line equivalent is sketched after the list):

    1. The server loads a source (from a file, a webcam, ...)
    2. The server decodes this source using ffmpeg to get the audio and video streams in raw format
    3. The server encodes the audio and video to AAC and libx264, and then stores them in [AVPacket](http://ffmpeg.org/doxygen/0.6/avpacket_8c.html) structures
    4. The server then transmits these packets to the client
    5. The client receives these packets, then calculates pts and dts and displays them using libsdl (following this [tutorial](http://dranger.com/ffmpeg/tutorial01.html)).
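
    A rough CLI sketch of steps 1 to 4, assuming a file source and a hypothetical client address (the question itself uses the libav* C API rather than the command-line tool):

    $ ffmpeg -re -i source.mp4 \
        -c:v libx264 -preset veryfast -tune zerolatency \
        -c:a aac \
        -f mpegts udp://203.0.113.10:9000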

    Now I would like to change the client side: when the client receives the audio and video packets, it should restream them over HTTP, RTP or something similar to a player such as Windows Media Player or Flash Player. Can anyone help me? Thanks a lot. :)
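
    One possible direction, sketched with the command-line tool rather than the libav* API the question uses, is to remux the received packets without re-encoding them (the input and output addresses are assumptions):

    $ ffmpeg -i udp://0.0.0.0:9000 -c copy -f rtp_mpegts rtp://127.0.0.1:5004

    rtp_mpegts carries both streams as MPEG-TS over RTP, so a player can usually open the rtp:// URL directly; the plain rtp muxer only takes one stream per output and the player then needs an SDP description.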