Media (91)

Other articles (60)

  • List of compatible distributions

    26 April 2011, by

    The table below lists the Linux distributions compatible with the automated installation script of MediaSPIP.

    Distribution name | Version name         | Version number
    Debian            | Squeeze              | 6.x.x
    Debian            | Wheezy               | 7.x.x
    Debian            | Jessie               | 8.x.x
    Ubuntu            | The Precise Pangolin | 12.04 LTS
    Ubuntu            | The Trusty Tahr      | 14.04

    If you want to help us improve this list, you can give us access to a machine whose distribution is not mentioned above, or send us the fixes needed to add it (...)

  • User profiles

    12 April 2011, by

    Each user has a profile page that lets them edit their personal information. In the default top-of-page menu, a menu item is created automatically when MediaSPIP is initialized; it is visible only when the visitor is logged in on the site.
    The user can edit their profile from their author page; a "Modifier votre profil" (edit your profile) link in the navigation is (...)

  • Configuring language support

    15 November 2010, by

    Accessing the configuration and adding supported languages
    To configure support for new languages, you need to go to the "Administrer" (administration) section of the site.
    From there, in the navigation menu, you can reach a "Gestion des langues" (language management) section that lets you enable support for new languages.
    Each newly added language can still be disabled as long as no object has been created in that language; once one exists, it becomes greyed out in the configuration and (...)

On other sites (4584)

  • Synchronizing RTSP signals with FFMPEG

    3 December 2019, by Taits

    I have managed to merge two RTSP signals into one with FFmpeg, but in the resulting stream the first input is delayed by 2 to 5 seconds with respect to the second. It is always the first input that lags behind the second.

    The two RTSP signals come from the same camera model, same configuration, same room ...

    However, if I put the same RTSP signal (either of the two) as input 1 and input 2, the same thing happens. Despite being the same signal, the first input is delayed compared to the second one.

    How could I get them synchronized?

    This is the command that I execute:

    ffmpeg -rtsp_transport tcp -thread_queue_size 512 -rtbufsize 50M -r 15 -i rtsp://XXXX \
           -rtsp_transport tcp -thread_queue_size 512 -rtbufsize 50M -r 15 -c:a aac -i rtsp://YYYY \
           -filter_complex "[0:v]pad=iw*2:ih,setpts=PTS-STARTPTS[bg]; [1:v]setpts=PTS-STARTPTS[fg]; [bg][fg]overlay=w[out]" \
           -map "[out]" -f hls -hls_time 2 -hls_list_size 5 -use_localtime 1 -use_localtime_mkdir 1 \
           -hls_segment_filename 'LIVE/file-%s.ts' \
           -map a -ar 16000 -ac 1 -ab 64000 -c:a aac \
           -y output.m3u8

    Here is the process information:

    ffmpeg version 3.4.2 Copyright (c) 2000-2018 the FFmpeg developers
    built with Apple LLVM version 9.0.0 (clang-900.0.39.2)
    configuration: --prefix=/usr/local/Cellar/ffmpeg/3.4.2 --enable-shared --enable-pthreads --enable-version3 --enable-hardcoded-tables --enable-avresample --cc=clang --host-cflags= --host-ldflags= --disable-jack --enable-gpl --enable-libmp3lame --enable-libx264 --enable-libxvid --enable-opencl --enable-videotoolbox --disable-lzma
     libavutil      55. 78.100 / 55. 78.100
     libavcodec     57.107.100 / 57.107.100
     libavformat    57. 83.100 / 57. 83.100
     libavdevice    57. 10.100 / 57. 10.100
     libavfilter     6.107.100 /  6.107.100
     libavresample   3.  7.  0 /  3.  7.  0
     libswscale      4.  8.100 /  4.  8.100
     libswresample   2.  9.100 /  2.  9.100
     libpostproc    54.  7.100 / 54.  7.100
    [rtsp @ 0x7fe284000e00] Missing PPS in sprop-parameter-sets, ignoring
    Input #0, rtsp, from 'rtsp://XXXX':
     Metadata:
       title           : Media Presentation
     Duration: N/A, start: 0.000000, bitrate: N/A
       Stream #0:0: Video: h264 (Main), yuv420p(progressive), 1280x720, 15 fps, 15 tbr, 90k tbn, 30 tbc
       Stream #0:1: Audio: aac (LC), 16000 Hz, mono, fltp
    [rtsp @ 0x7fe28484de00] Missing PPS in sprop-parameter-sets, ignoring
    Input #1, rtsp, from 'rtsp://XXXX':
     Metadata:
       title           : Media Presentation
     Duration: N/A, start: 0.000000, bitrate: N/A
       Stream #1:0: Video: h264 (Main), yuv420p(progressive), 1280x720, 15 fps, 15 tbr, 90k tbn, 30 tbc
       Stream #1:1: Audio: aac (LC), 16000 Hz, mono, fltp
    Stream mapping:
     Stream #0:0 (h264) -> pad (graph 0)
     Stream #1:0 (h264) -> setpts (graph 0)
     overlay (graph 0) -> Stream #0:0 (libx264)
     Stream #0:1 -> #0:1 (aac (native) -> aac (native))
    Press [q] to stop, [?] for help
    [libx264 @ 0x7fe286802000] using cpu capabilities: MMX2 SSE2Fast SSSE3 SSE4.2 AVX FMA3 BMI2 AVX2
    [libx264 @ 0x7fe286802000] profile High, level 4.0
    [libx264 @ 0x7fe286802000] 264 - core 152 r2854 e9a5903 - H.264/MPEG-4 AVC codec - Copyleft 2003-2017 - http://www.videolan.org/x264.html - options: cabac=1 ref=3 deblock=1:0:0 analyse=0x3:0x113 me=hex subme=7 psy=1 psy_rd=1.00:0.00 mixed_ref=1 me_range=16 chroma_me=1 trellis=1 8x8dct=1 cqm=0 deadzone=21,11 fast_pskip=1 chroma_qp_offset=-2 threads=12 lookahead_threads=2 sliced_threads=0 nr=0 decimate=1 interlaced=0 bluray_compat=0 constrained_intra=0 bframes=3 b_pyramid=2 b_adapt=1 b_bias=0 direct=1 weightb=1 open_gop=1 weightp=2 keyint=250 keyint_min=15 scenecut=40 intra_refresh=0 rc_lookahead=40 rc=crf mbtree=1 crf=23.0 qcomp=0.60 qpmin=0 qpmax=69 qpstep=4 ip_ratio=1.40 aq=1:1.00
    [hls @ 0x7fe286800000] Opening 'LIVE/file-1524728763.ts' for writing
    Output #0, hls, to 'output.m3u8':
     Metadata:
       title           : Media Presentation
       encoder         : Lavf57.83.100
       Stream #0:0: Video: h264 (libx264), yuv420p, 2560x720, q=-1--1, 15 fps, 90k tbn, 15 tbc (default)
       Metadata:
         encoder         : Lavc57.107.100 libx264
       Side data:
         cpb: bitrate max/min/avg: 0/0/0 buffer size: 0 vbv_delay: -1
       Stream #0:1: Audio: aac (LC), 16000 Hz, mono, fltp, 64 kb/s
       Metadata:
         encoder         : Lavc57.107.100 aac
    [hls @ 0x7fe286800000] Opening 'LIVE/file-1524728782.ts' for writinged=1.09x    
    [hls @ 0x7fe286800000] Opening 'output.m3u8.tmp' for writing
    [hls @ 0x7fe286800000] Opening 'output.m3u8.tmp' for writing=N/A speed=1.07x    
    frame=  396 fps= 15 q=-1.0 Lsize=N/A time=00:00:27.00 bitrate=N/A speed=1.05x    
    video:2946kB audio:147kB subtitle:0kB other streams:0kB global headers:0kB muxing overhead: unknown
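
    A hedged aside, not part of the original post: one way to attack the offset is to measure it and compensate for it in the filtergraph. The sketch below keeps the pad/overlay layout from the question but shifts the second input later before overlaying; the 3.0-second value, the `-map 0:a` audio choice, and the simplified option set are illustrative assumptions rather than a confirmed fix (rtsp://XXXX and rtsp://YYYY are the placeholders from the post).

    # Hedged sketch: delay input 1 by a measured offset (3.0 s is only a guess here)
    # so both halves of the mosaic show the same moment in time.
    ffmpeg -rtsp_transport tcp -thread_queue_size 512 -rtbufsize 50M -i rtsp://XXXX \
           -rtsp_transport tcp -thread_queue_size 512 -rtbufsize 50M -i rtsp://YYYY \
           -filter_complex "[0:v]pad=iw*2:ih,setpts=PTS-STARTPTS[bg]; [1:v]setpts=PTS-STARTPTS+3.0/TB[fg]; [bg][fg]overlay=w[out]" \
           -map "[out]" -map 0:a -c:v libx264 -c:a aac -ar 16000 -ac 1 -b:a 64k \
           -f hls -hls_time 2 -hls_list_size 5 -y output.m3u8

    Another direction, also an assumption rather than the poster's method: adding `-use_wallclock_as_timestamps 1` before each `-i` stamps both inputs with the local clock, giving the two streams a common time reference before any filtering.
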
  • FFmpeg stuck on frame 0 during time-lapse conversion with `-ss` start time

    2 January 2021, by rgov

    I have a long video that I'd like to speed up into a time-lapse. Following the FFmpeg wiki, I came up with this incantation:

    ffmpeg -i IMG_0238.MOV -r 60 -ss 00:04:41 -filter:v "setpts=PTS/60.0" -an output.mp4

    It should clip off the first few minutes, then create a new 60fps video file at 60x the speed.

    Although it spins all my CPU cores up to 100%, the status bar simply shows frame=    0 fps=0.0 q=0.0 size=       0kB time=00:00:00.00 bitrate=N/A speed=   0x and never progresses.

    The problem seems related to the -ss 00:04:41 flag, which specifies the start time; I want to drop the first 4 minutes 41 seconds of source video. When I omit this option, the video conversion succeeds.

    Full log follows.

    ffmpeg version 3.4.1 Copyright (c) 2000-2017 the FFmpeg developers
  built with Apple LLVM version 9.1.0 (clang-902.0.30)
  configuration: --prefix=/usr/local/Cellar/ffmpeg/3.4.1 --enable-shared --enable-pthreads --enable-version3 --enable-hardcoded-tables --enable-avresample --cc=clang --host-cflags= --host-ldflags= --enable-gpl --enable-libmp3lame --enable-libx264 --enable-libx265 --enable-libxvid --enable-opencl --enable-videotoolbox --disable-lzma
  libavutil      55. 78.100 / 55. 78.100
  libavcodec     57.107.100 / 57.107.100
  libavformat    57. 83.100 / 57. 83.100
  libavdevice    57. 10.100 / 57. 10.100
  libavfilter     6.107.100 /  6.107.100
  libavresample   3.  7.  0 /  3.  7.  0
  libswscale      4.  8.100 /  4.  8.100
  libswresample   2.  9.100 /  2.  9.100
  libpostproc    54.  7.100 / 54.  7.100
[mov,mp4,m4a,3gp,3g2,mj2 @ 0x7fb59b800000] stream 2, missing mandatory atoms, broken header
[mov,mp4,m4a,3gp,3g2,mj2 @ 0x7fb59b800000] stream 3, missing mandatory atoms, broken header
Input #0, mov,mp4,m4a,3gp,3g2,mj2, from 'IMG_0238.MOV':
  Metadata:
    major_brand     : qt  
    minor_version   : 0
    compatible_brands: qt  
    creation_time   : 2017-04-20T16:25:14.000000Z
    com.apple.quicktime.location.ISO6709: +36.4610+025.3727+123.613/
    com.apple.quicktime.make: Apple
    com.apple.quicktime.model: iPhone 7
    com.apple.quicktime.software: 10.3.1
    com.apple.quicktime.creationdate: 2017-04-20T19:25:13+0300
  Duration: 01:22:45.53, start: -0.000023, bitrate: 44171 kb/s
    Stream #0:0(und): Video: h264 (High) (avc1 / 0x31637661), yuv420p(tv, bt709), 3840x2160, 44077 kb/s, 30 fps, 30 tbr, 600 tbn, 1200 tbc (default)
    Metadata:
      rotate          : 180
      creation_time   : 2017-04-20T16:25:14.000000Z
      handler_name    : Core Media Data Handler
      encoder         : H.264
    Side data:
      displaymatrix: rotation of -180.00 degrees
    Stream #0:1(und): Audio: aac (LC) (mp4a / 0x6134706D), 44100 Hz, mono, fltp, 84 kb/s (default)
    Metadata:
      creation_time   : 2017-04-20T16:25:14.000000Z
      handler_name    : Core Media Data Handler
    Stream #0:2(und): Data: none (mebx / 0x7862656D) (default)
    Metadata:
      creation_time   : 2017-04-20T16:25:14.000000Z
      handler_name    : Core Media Data Handler
    Stream #0:3(und): Data: none (mebx / 0x7862656D) (default)
    Metadata:
      creation_time   : 2017-04-20T16:25:14.000000Z
      handler_name    : Core Media Data Handler
File 'output.mp4' already exists. Overwrite ? [y/N] y
Stream mapping:
  Stream #0:0 -> #0:0 (h264 (native) -> h264 (libx264))
Press [q] to stop, [?] for help
[libx264 @ 0x7fb59b82b800] using cpu capabilities: MMX2 SSE2Fast SSSE3 SSE4.2 AVX FMA3 AVX2 LZCNT BMI2
[libx264 @ 0x7fb59b82b800] profile High, level 5.2
[libx264 @ 0x7fb59b82b800] 264 - core 148 r2795 aaa9aa8 - H.264/MPEG-4 AVC codec - Copyleft 2003-2017 - http://www.videolan.org/x264.html - options: cabac=1 ref=3 deblock=1:0:0 analyse=0x3:0x113 me=hex subme=7 psy=1 psy_rd=1.00:0.00 mixed_ref=1 me_range=16 chroma_me=1 trellis=1 8x8dct=1 cqm=0 deadzone=21,11 fast_pskip=1 chroma_qp_offset=-2 threads=6 lookahead_threads=1 sliced_threads=0 nr=0 decimate=1 interlaced=0 bluray_compat=0 constrained_intra=0 bframes=3 b_pyramid=2 b_adapt=1 b_bias=0 direct=1 weightb=1 open_gop=0 weightp=2 keyint=250 keyint_min=25 scenecut=40 intra_refresh=0 rc_lookahead=40 rc=crf mbtree=1 crf=23.0 qcomp=0.60 qpmin=0 qpmax=69 qpstep=4 ip_ratio=1.40 aq=1:1.00
Output #0, mp4, to 'output.mp4':
  Metadata:
    major_brand     : qt  
    minor_version   : 0
    compatible_brands: qt  
    com.apple.quicktime.creationdate: 2017-04-20T19:25:13+0300
    com.apple.quicktime.location.ISO6709: +36.4610+025.3727+123.613/
    com.apple.quicktime.make: Apple
    com.apple.quicktime.model: iPhone 7
    com.apple.quicktime.software: 10.3.1
    encoder         : Lavf57.83.100
    Stream #0:0(und): Video: h264 (libx264) (avc1 / 0x31637661), yuv420p, 3840x2160, q=-1--1, 60 fps, 15360 tbn, 60 tbc (default)
    Metadata:
      encoder         : Lavc57.107.100 libx264
      creation_time   : 2017-04-20T16:25:14.000000Z
      handler_name    : Core Media Data Handler
    Side data:
      cpb: bitrate max/min/avg: 0/0/0 buffer size: 0 vbv_delay: -1
      displaymatrix: rotation of -0.00 degrees
frame=    0 fps=0.0 q=0.0 size=       0kB time=00:00:00.00 bitrate=N/A speed=   0x
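
    A hedged aside, not part of the original question: when -ss appears after -i it is applied on the output side, so frames are discarded until the output timestamps reach the seek point; after setpts=PTS/60.0 has compressed time by 60x, the whole 1:22:45 source spans only about 83 seconds of output time, so 00:04:41 may never be reached, which would match the counter staying at frame 0. Moving -ss in front of -i turns it into an input seek instead:

    # Hedged sketch: seek 4:41 into the source before decoding, then speed it up 60x.
    ffmpeg -ss 00:04:41 -i IMG_0238.MOV -r 60 -filter:v "setpts=PTS/60.0" -an output.mp4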

  • carrierwave-ffmpeg .weba audio convert to .mp3 or .mp4

    14 May 2018, by nmadoug

    I use react-mic on the front end of my React/Rails application to capture audio from the user. react-mic formats the audio blob as .weba. I'm trying to convert the .weba to .mp4 or .mp3 using carrierwave-ffmpeg. I can isolate the issue to my Rails uploader, which is as follows:

    require 'fog/aws'

    class UserResponseUploader < CarrierWave::Uploader::Base
     include CarrierWave::FFmpeg

     if Rails.env.test?
       storage :file
     else
       storage :fog
     end

     version :mp3 do
       puts `which ffmpeg`
       # binding.pry
       process encode: [:mp3, audio_codec: "a libmp3lame -qscale:a 2"]
       # process :encode_audio=> [:mp4, audio_codec: "aac",:custom => "-strict experimental -qscale:a 2"]

       def full_filename(for_file)
         super.chomp(File.extname(super)) + '.mp3'
       end
     end

     # Directory where uploaded files will be stored on file system or AWS
     def store_dir
       "uploads/#{model.class.to_s.underscore}/#{mounted_as}/#{model.id}"
     end
    end

    ffmpeg is installed on my local PC, and I can run the following command in the terminal to produce an audio file:

    ffmpeg -i audio.weba -codec:a libmp3lame -qscale:a 2 willwork.mp3
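
    For reference, a hedged terminal-level counterpart of the mp4/AAC path that the commented-out uploader line aims at; the output filename is a placeholder, not from the original post:

    # Hedged sketch: AAC-in-MP4 equivalent of the working mp3 command above.
    ffmpeg -i audio.weba -codec:a aac -b:a 128k alsowork.mp4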

    My problem is that when I try to run everything in my code, I get the following error message:

    NameError (undefined local variable or method `e' for #):

    I don't know exactly what it's complaining about. I have a feeling the encoding line is causing the problem and that I don't have it set up properly.

    Any ideas?