Advanced search

Media (91)

Other articles (20)

  • Media quality after processing

    21 June 2013, by

    Correctly configuring the software that processes media matters for striking a balance between the parties involved (the host's bandwidth, media quality for the author and the visitor, accessibility for the visitor). How should you set the quality of your media?
    The higher the media quality, the more bandwidth is used, and a visitor on a low-bandwidth connection will have to wait longer. Conversely, the lower the quality, the more degraded the media becomes (...)

  • Websites made with MediaSPIP

    2 May 2011, by

    This page lists some websites based on MediaSPIP.

  • Creating farms of unique websites

    13 April 2011, by

    MediaSPIP platforms can be installed as a farm, with a single "core" hosted on a dedicated server and used by multiple websites.
    This allows (among other things): implementation costs to be shared between several different projects / individuals; rapid deployment of multiple unique sites; creation of groups of like-minded sites, making it possible to browse media in a more controlled and selective environment than the major "open" (...)

On other sites (6542)

  • Convert WebM/H.264 to MP4/H.264 efficiently with ffmpeg.js

    31 July 2017, by SB2055

    As a result of the answer here: Recording cross-platform (H.264?) videos using WebRTC MediaRecorder

    How can one go about using ffmpeg.js to efficiently unwrap a WebM H.264 video and re-wrap it into an MP4 container?

    I'm looking through the docs: https://github.com/Kagami/ffmpeg.js?files=1

    However I don't see (or perhaps I'm looking for the wrong terminology) any examples for the above. This operation will be performed in the browser (Chrome) prior to uploading as a Blob. I could use a web worker, though hopefully rewrapping is a trivial enough operation not to require it.
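A hedged sketch of the rewrap using the native ffmpeg CLI (ffmpeg.js accepts the same argument vector through its `arguments` option and exchanges files in memory via MEMFS, per the project README). The container swap is a pure stream copy, so no decoding or re-encoding happens; `in.webm` is a fabricated stand-in for the MediaRecorder blob:

```shell
# Fabricate a 1-second H.264 stream in a Matroska-style container as a
# stand-in for the MediaRecorder blob (assumption: a blob tagged
# "video/webm;codecs=h264" is really Matroska/WebM carrying H.264).
ffmpeg -y -v error -f lavfi -i testsrc=d=1:s=64x64 -c:v libx264 -f matroska in.webm

# The rewrap itself: "-c copy" moves the streams into an MP4 container
# without touching the encoded data, so it is cheap even in the browser.
ffmpeg -y -v error -i in.webm -c copy out.mp4
```

With ffmpeg.js the second invocation becomes `ffmpeg({MEMFS: [{name: "in.webm", data: blobBytes}], arguments: ["-i", "in.webm", "-c", "copy", "out.mp4"]})`, and the MP4 bytes come back in the result's MEMFS array.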

  • Trying to compile x264 and ffmpeg for iPhone - "missing required architecture arm in file"

    4 August 2012, by jtrim

    I'm trying to compile x264 for use in an iPhone application. I see there are instructions on how to compile ffmpeg for use on the platform here: http://lists.mplayerhq.hu/pipermail/ffmpeg-devel/2009-October/076618.html , but I can't seem to find anything this complete for compiling x264 on the iPhone. I've found this source tree: http://gitorious.org/x264-arm that seems to have support for the ARM platform.

    Here is my config line:

    ./configure --cross-prefix=/usr/bin/ --host=arm-apple-darwin10 --extra-cflags="-B /Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS3.2.sdk/usr/lib/ -I /Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS3.2.sdk/usr/lib/"
    

    ...and inside configure I'm using the gas-preprocessor script (first link above) as my assembler:

    gas-preprocessor.pl gcc
    

    When I start compiling, it chunks away for a little while, then it spits out these warnings and a huge list of undefined symbols:

    ld: warning: option -s is obsolete and being ignored
    ld: warning: -force_cpusubtype_ALL will become unsupported for ARM architectures
    ld: warning: in /usr/lib/crt1.o, missing required architecture arm in file
    ld: warning: in /usr/X11R6/lib/libX11.dylib, missing required architecture arm in file
    ld: warning: in /usr/lib/libm.dylib, missing required architecture arm in file
    ld: warning: in /usr/lib/libpthread.dylib, missing required architecture arm in file
    ld: warning: in /usr/lib/libgcc_s.1.dylib, missing required architecture arm in file
    ld: warning: in /usr/lib/libSystem.dylib, missing required architecture arm in file
    Undefined symbols:
    

    My guess would be that the problem has to do with the "missing required architecture arm in file" warning... any ideas?
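Those warnings mean the linker is resolving crt1.o and the system dylibs from the host's x86 /usr/lib instead of the iPhone SDK, which is consistent with the configure line above pointing -B and -I at the SDK's usr/lib but never setting a sysroot. A non-runnable build-configuration sketch for that era's toolchain (the SDK path, compiler prefix, and armv7 arch are assumptions; adjust to the installed Xcode):

```shell
# Sketch only: Xcode 3-era paths, assumed; not runnable on a modern system.
SDK=/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS3.2.sdk

./configure \
  --host=arm-apple-darwin10 \
  --cross-prefix=/Developer/Platforms/iPhoneOS.platform/Developer/usr/bin/ \
  --extra-cflags="-arch armv7 -isysroot $SDK" \
  --extra-ldflags="-arch armv7 -isysroot $SDK"
```

The key change is passing -isysroot to both the compiler and the linker so every header and library search stays inside the SDK, rather than steering individual searches with -B/-I.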

  • Create a mkv file with colored background and containing a given audio and subtitle stream

    25 May 2023, by rdrg109

    Table of contents

    • The context
    • Minimal working example
    • What I've tried
        • Create a mkv file with colored background and an audio stream
        • Create a mkv file with colored background, an audio stream and a subtitles stream
    • The question

    The context

    I have a *.flac file and a *.srt file. I want to merge those files into an MKV file, but at the same time, I want to add a video stream. I want the video stream to show a green background the entire time.

    Minimal working example

    For our experimentation, let's create two sample files: one *.flac file and one *.srt file.

    


    The following command creates a *.flac file that lasts 60 seconds and contains a sine wave.

    $ ffmpeg -y -f lavfi -i "sine=f=1000:d=60" input.flac

    The following command creates a *.srt file. Note that the last subtitle runs until the sixth second; this is intended.

    


    $ cat << EOF > input.srt
1
00:00:00,000 --> 00:00:03,000
This is the first subtitle in a
SRT file.

2
00:00:03,000 --> 00:00:06,000
This is the second subtitle in a
SRT file.
EOF
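As a quick sanity check, the end time of the last cue can be pulled out of input.srt with standard text tools (a sketch; it recreates the same file so it stands alone):

```shell
# Recreate the same two-cue file as above, then print the end timestamp of
# the last cue; it should read 00:00:06,000, which is the figure -shortest
# will latch onto later.
cat << EOF > input.srt
1
00:00:00,000 --> 00:00:03,000
This is the first subtitle in a
SRT file.

2
00:00:03,000 --> 00:00:06,000
This is the second subtitle in a
SRT file.
EOF

# "grep --" keeps the arrow pattern from being parsed as an option;
# the end time is the third whitespace-separated field of a timing line.
last_end=$(grep -- '-->' input.srt | tail -n 1 | awk '{print $3}')
echo "$last_end"   # prints 00:00:06,000
```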


    What I've tried

    Create a mkv file with colored background and an audio stream

    I know how to create a MKV file containing a given audio stream and a colored background as the video stream.

    


    The following command creates a MKV file containing input.flac as the audio stream and a green background as the video stream. The MKV file has the same duration as input.flac.

    


    $ ffmpeg \
  -y \
  -f lavfi \
  -i color=c=green:s=2x2 \
  -i input.flac \
  -c:v libx264 \
  -c:a copy \
  -shortest \
  output.mkv


    


    The following command shows the duration of the streams in the resulting file.

    


    $ ffprobe -v error -print_format json -show_entries stream=codec_type:stream_tags=duration output.mkv | jq -r ''


    


    {
  "programs": [],
  "streams": [
    {
      "codec_type": "video",
      "tags": {
        "DURATION": "00:00:58.200000000"
      }
    },
    {
      "codec_type": "audio",
      "tags": {
        "DURATION": "00:01:00.000000000"
      }
    }
  ]
}
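The DURATION tags above are HH:MM:SS with a nine-digit fraction; to compare stream lengths numerically, such a tag can be converted to seconds with a small helper (a sketch in plain shell and awk; the function name is made up):

```shell
# to_seconds: convert a Matroska DURATION tag like 00:01:00.000000000
# into a decimal number of seconds.
to_seconds() {
  # Split on ":" into hours, minutes, seconds-with-fraction.
  IFS=: read -r h m s <<< "$1"
  # awk handles the fractional seconds arithmetic.
  awk -v h="$h" -v m="$m" -v s="$s" 'BEGIN { printf "%.3f\n", h*3600 + m*60 + s }'
}

to_seconds "00:01:00.000000000"   # audio stream above; prints 60.000
to_seconds "00:00:58.200000000"   # video stream above; prints 58.200
```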


    Create a mkv file with colored background, an audio stream and a subtitles stream

    To add a subtitles stream, I just need to specify the *.srt file. However, when I do this, the duration of the video is set to the time of the last subtitle in the *.srt file. This is expected because I have used -shortest. I would get the result I'm looking for if it were possible to specify which stream -shortest gives top priority to, but I haven't found that information on the Internet.

    


    $ ffmpeg \
  -y \
  -f lavfi \
  -i color=c=green:s=2x2 \
  -i input.flac \
  -i input.srt \
  -c:v libx264 \
  -c:a copy \
  -shortest \
  output.mkv


    


    The following command shows the duration of the streams in the resulting file. Note that the maximum duration of the resulting file is 6 seconds, while in the resulting file from the previous section it was 1 minute.

    


    $ ffprobe -v error -print_format json -show_entries stream=codec_type:stream_tags=duration output.mkv | jq -r ''


    


    {
  "programs": [],
  "streams": [
    {
      "codec_type": "video",
      "tags": {
        "DURATION": "00:00:01.160000000"
      }
    },
    {
      "codec_type": "audio",
      "tags": {
        "DURATION": "00:00:03.134000000"
      }
    },
    {
      "codec_type": "subtitle",
      "tags": {
        "DURATION": "00:00:06.000000000"
      }
    }
  ]
}


    


    


    The question

    Given a *.flac file and a *.srt file, how can I merge them into a *.mkv file so that it has the *.flac file as the audio stream, the *.srt file as the subtitle stream, and a green background as the video stream?
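One way around the -shortest truncation, sketched below under the assumption that the audio length is known (60 s for the sample input.flac), is to bound the lavfi color source itself with its duration (d=) option and map all three streams explicitly, so -shortest is no longer needed and the early-ending subtitle stream cannot shorten the output:

```shell
# Recreate the sample inputs from the question.
ffmpeg -y -v error -f lavfi -i "sine=f=1000:d=60" input.flac
cat << EOF > input.srt
1
00:00:00,000 --> 00:00:03,000
This is the first subtitle in a
SRT file.

2
00:00:03,000 --> 00:00:06,000
This is the second subtitle in a
SRT file.
EOF

# d=60 bounds the color source to the audio's length, so -shortest is
# dropped; SRT cues are stream-copied into Matroska as subrip.
ffmpeg \
  -y -v error \
  -f lavfi \
  -i "color=c=green:s=2x2:d=60" \
  -i input.flac \
  -i input.srt \
  -map 0:v -map 1:a -map 2:s \
  -c:v libx264 \
  -c:a copy \
  -c:s copy \
  output.mkv
```

For an audio file of unknown length, the duration could first be probed (e.g. with ffprobe -show_entries format=duration) and substituted for the 60.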