Advanced search

Media (0)

No media matching your criteria is available on the site.

Other articles (64)

  • The plugin: Podcasts.

    14 July 2010, by

    The problem of podcasting is, once again, one that highlights the standardisation of data transport on the Internet.
    Two interesting formats exist: the one developed by Apple, heavily geared towards iTunes, whose SPEC is here; and the "Media RSS Module" format, which is more "open" and is notably backed by Yahoo and the Miro software.
    File types supported in the feeds
    Apple's format only allows the following formats in its feeds: .mp3 audio/mpeg .m4a audio/x-m4a .mp4 (...)

  • User profiles

    12 April 2011, by

    Each user has a profile page that lets them edit their personal information. In the default top-of-page menu, a menu item is created automatically when MediaSPIP is initialised, visible only when the visitor is logged in to the site.
    The user can edit their profile from their author page; a "Modifier votre profil" ("Edit your profile") link in the navigation is (...)

  • Configuring language support

    15 November 2010, by

    Accessing the configuration and adding supported languages
    To configure support for new languages, you need to go to the "Administrer" (administration) section of the site.
    From there, in the navigation menu, you can reach a "Gestion des langues" (language management) section that lets you enable support for new languages.
    Each newly added language can still be deactivated as long as no object has been created in that language; once one has, it becomes greyed out in the configuration and (...)

On other sites (6993)

  • The duration of the combined video with FFmpeg becomes 0

    24 August 2019, by sido

    I am developing an Android app with Unity.
    When I merge video and audio with FFmpeg, the duration of the output video file is 0, even though the file plays back normally.

    I tried to set the duration when merging, but it didn't work.

    The command being executed is as follows:

    -y -i /storage/emulated/0/Android/data/jp.ne.company.app/files/Record/20190824122116_video.mp4 -i /storage/emulated/0/Android/data/jp.ne.company.app/files/Record/20190824122116_audio.mp3 -t 3 -c copy -map 0:v -map 1:a -shortest /storage/emulated/0/Android/data/jp.ne.company.app/files/Record/20190824122116_merged.mp4

    I expected the duration of the merged file to be the value set in MovieTime, but the duration of the file actually produced was 0.

    The log is as follows:

    Started
    ffmpeg version n3.0.1 Copyright (c) 2000-2016 the FFmpeg developers
     built with gcc 4.8 (GCC)
     configuration: --target-os=linux --cross-prefix=/home/vagrant/SourceCode/ffmpeg-android/toolchain-android/bin/arm-linux-androideabi- --arch=arm --cpu=cortex-a8 --enable-runtime-cpudetect --sysroot=/home/vagrant/SourceCode/ffmpeg-android/toolchain-android/sysroot --enable-pic --enable-libx264 --enable-libass --enable-libfreetype --enable-libfribidi --enable-libmp3lame --enable-fontconfig --enable-pthreads --disable-debug --disable-ffserver --enable-version3 --enable-hardcoded-tables --disable-ffplay --disable-ffprobe --enable-gpl --enable-yasm --disable-doc --disable-shared --enable-static --pkg-config=/home/vagrant/SourceCode/ffmpeg-android/ffmpeg-pkg-config --prefix=/home/vagrant/SourceCode/ffmpeg-android/build/armeabi-v7a --extra-cflags='-I/home/vagrant/SourceCode/ffmpeg-android/toolchain-android/include -U_FORTIFY_SOURCE -D_FORTIFY_SOURCE=2 -fno-strict-overflow -fstack-protector-all' --extra-ldflags='-L/home/vagrant/SourceCode/ffmpeg-android/toolchain-android/lib -Wl,-z,relro -Wl,-z,now -pie' --extra-libs='-lpng -lexpat -lm' --extra-cxxflags=
     libavutil      55. 17.103 / 55. 17.103
     libavcodec     57. 24.102 / 57. 24.102
     libavformat    57. 25.100 / 57. 25.100
     libavdevice    57.  0.101 / 57.  0.101
     libavfilter     6. 31.100 /  6. 31.100
     libswscale      4.  0.100 /  4.  0.100
     libswresample   2.  0.101 /  2.  0.101
     libpostproc    54.  0.100 / 54.  0.100
    Input #0, mov,mp4,m4a,3gp,3g2,mj2, from '/storage/emulated/0/Android/data/jp.ne.company.app/files/Record/20190824122116_video.mp4':
     Metadata:
       major_brand     : mp42
       minor_version   : 0
       compatible_brands: isommp42
       creation_time   : 2019-08-24 03:21:20
       com.android.version: 7.0
     Duration: 00:00:03.65, start: 0.000000, bitrate: 2821 kb/s
       Stream #0:0(eng): Video: h264 (Baseline) (avc1 / 0x31637661), yuv420p(tv, smpte170m/smpte170m/bt709), 360x640, 2814 kb/s, SAR 1:1 DAR 9:16, 59.42 fps, 59.94 tbr, 90k tbn, 180k tbc (default)
       Metadata:
         creation_time   : 2019-08-24 03:21:20
         handler_name    : VideoHandle
    [mp3 @ 0xed92f600] Skipping 0 bytes of junk at 880.
    Input #1, mp3, from '/storage/emulated/0/Android/data/jp.ne.company.app/files/Record/20190824122116_audio.mp3':
     Metadata:
       encoder         : Lavf57.25.100
     Duration: 00:00:03.55, start: 0.025057, bitrate: 257 kb/s
       Stream #1:0: Audio: mp3, 44100 Hz, stereo, s16p, 256 kb/s
       Metadata:
         encoder         : Lavc57.24
    Output #0, mp4, to '/storage/emulated/0/Android/data/jp.ne.company.app/files/Record/20190824122116_merged.mp4':
     Metadata:
       major_brand     : mp42
       minor_version   : 0
       compatible_brands: isommp42
       com.android.version: 7.0
       encoder         : Lavf57.25.100
       Stream #0:0(eng): Video: h264 ([33][0][0][0] / 0x0021), yuv420p, 360x640 [SAR 1:1 DAR 9:16], q=2-31, 2814 kb/s, 59.42 fps, 59.94 tbr, 90k tbn, 90k tbc (default)
       Metadata:
         creation_time   : 2019-08-24 03:21:20
         handler_name    : VideoHandle
       Stream #0:1: Audio: mp3 (i[0][0][0] / 0x0069), 44100 Hz, stereo, 256 kb/s
       Metadata:
         encoder         : Lavc57.24
    Stream mapping:
     Stream #0:0 -> #0:0 (copy)
     Stream #1:0 -> #0:1 (copy)
    Press [q] to stop, [?] for help
    frame=  178 fps=0.0 q=-1.0 Lsize=    1098kB time=00:00:03.00 bitrate=2992.8kbits/s speed= 122x    
    video:997kB audio:95kB subtitle:0kB other streams:0kB global headers:0kB muxing overhead: 0.534772%
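
    A quick way to confirm whether the container itself really carries a zero duration (rather than the player misreporting it) is to probe the merged file. A minimal sketch, assuming ffprobe is available alongside this ffmpeg build (the path is taken from the command above):

    ffprobe -v error -show_entries format=duration -of default=noprint_wrappers=1:nokey=1 /storage/emulated/0/Android/data/jp.ne.company.app/files/Record/20190824122116_merged.mp4

    If that prints a sensible duration, the problem is on the playback/metadata-reading side; if it prints 0 or N/A, a hedged next step is to drop one of the overlapping -t 3 / -shortest options and retry the copy, or to move off the rather old bundled ffmpeg n3.0.1.
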
  • ffmpeg 4.1.4 wav to mp3 error of "deprecated pixel format used" [on hold]

    26 August 2019, by yang jia
    [swscaler @ 0xc946d000] deprecated pixel format used, make sure you did set range correctly

    [mp3 @ 0xc9621600] Frame rate very high for a muxer not efficiently supporting it.

       Please consider specifying a lower framerate, a different muxer or -vsync 2

    [auto_resampler_0 @ 0xd8e73be0] [SWR @ 0xc8058000] Output channel layout '6 channels (FLC+BC+SL+SR+TC+TFL)' is not supported

    [auto_resampler_0 @ 0xd8e73be0] Failed to configure output pad on auto_resampler_0

    Error reinitializing filters!

    Failed to inject frame into filter network: Invalid argument
    Error while processing the decoded data for stream #0:0
    --------- beginning of crash
    2019-02-26 16:04:58.368 27737-27737/com.blplayer.jbl.blplayer A/libc: Fatal signal 7 (SIGBUS), code 1, fault addr 0xc5ca in tid 27737 (er.jbl.blplayer)

    cmd

    ffmpeg -i test.wav  -acodec libmp3lame -ar 8000 -ac 2 -y wav2mp3.mp3

    This is the log:

    [screenshot of the log, not reproduced]

    Then on Mac it succeeds, e.g.:
    ffmpeg -i input.wav -vn -ar 44100 -ac 2 -b:a 192k output.mp3

    Guessed Channel Layout for Input Stream #0.0 : stereo
    Input #0, wav, from 'input.wav':
      Metadata:
        encoder         : Lavf58.20.100
      Duration: 00:02:08.29, bitrate: 256 kb/s
        Stream #0:0: Audio: pcm_s16le ([1][0][0][0] / 0x0001), 8000 Hz, stereo, s16, 256 kb/s
    Stream mapping:
      Stream #0:0 -> #0:0 (pcm_s16le (native) -> mp3 (libmp3lame))
    Output #0, mp3, to 'output.mp3':
      Metadata:
        TSSE            : Lavf58.31.104
        Stream #0:0: Audio: mp3 (libmp3lame), 44100 Hz, stereo, s16p, 192 kb/s
        Metadata:
          encoder         : Lavc58.55.101 libmp3lame

    but on Android it gets an error.
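
    One hedged difference between the failing Android command and the working Mac one is the -vn flag: the swscaler and "Frame rate very high" messages suggest ffmpeg is also trying to handle a video-like stream from the input, and the unsupported 6-channel layout hints that the source is not being read as plain stereo. A minimal sketch that mirrors the Mac invocation while keeping the original file names and the 8000 Hz target (whether this matches the real input is an assumption):

    ffmpeg -i test.wav -vn -ac 2 -ar 8000 -acodec libmp3lame -y wav2mp3.mp3

    If the crash persists with -vn, the next thing to compare is the exact wav file used on each platform, since the Mac log shows a clean stereo pcm_s16le input.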

  • Using Unicast RTSP URIs via ffmpeg

    6 August 2019, by Chris Marshall

    I’m fairly new to ffmpeg, so I’d certainly appreciate being given an "M" to "RTFM." The ffmpeg docs are...not-so-easy...to navigate, but I’m trying.

    The goal is to develop a compiled server that incorporates ffmpeg, but first, I need to get it working via CLI.

    I have a standard AXIS Surveillance camera (AXIS M5525-E), set up as an ONVIF device (but that isn’t really relevant to this issue).

    When I query it, I get the following URI as its video streaming URI:

    rtsp://192.168.4.12/onvif-media/media.amp?profile=profile_1_jpeg&streamtype=unicast

    The IP is local to a sandboxed network.

    I add the authentication parameters to it, like so:

    rtsp://<login>:<password>@192.168.4.12/onvif-media/media.amp?profile=profile_1_jpeg&streamtype=unicast

    (Yeah, I know that’s not secure, but this is just for testing and feasibility study. The whole damn sandbox is an insecure mess).

    Now, if I use VLC to open the URI, it works great (of course). Looking at it with a packet analyzer, I see the following negotiation between the device and my computer (at .2 - clipped for brevity):

    Id = 11
    Source = 192.168.4.12
    Destination = 192.168.4.2
    Captured Length = 82
    Packet Length = 82
    Protocol = TCP
    Date Received = 2019-08-06 12:18:37 +0000
    Time Delta = 1.342024087905884
    Information = 554 -> 53755 ([ECN, ACK, SYN], Seq=696764098, Ack=3139240483, Win=28960)
                       °
                       °
                       °
    Id = 48
    Source = 192.168.4.12
    Destination = 192.168.4.2
    Captured Length = 366
    Packet Length = 366
    Protocol = TCP
    Date Received = 2019-08-06 12:18:38 +0000
    Time Delta = 2.09382700920105
    Information = 554 -> 53755 ([ACK, PUSH], Seq=696765606, Ack=3139242268, Win=1073)

    Followed immediately by UDP stream packets.

    If, however, I feed the same URI to ffmpeg:

    ffmpeg -i rtsp://<login>:<password>@192.168.4.12/onvif-media/media.amp?profile=profile_1_jpeg&streamtype=unicast -c:v libx264 -crf 21 -preset veryfast -g 30 -sc_threshold 0 -f hls -hls_time 4 /Volumes/Development/webroot/fftest/stream.m3u8

    I get nothing. No negotiation at all between the device and my computer.

    After that, if I then remove the &streamtype=unicast argument, I get a negotiation, and a stream:

    Id = 10
    Source = 192.168.4.12
    Destination = 192.168.4.2
    Captured Length = 82
    Packet Length = 82
    Protocol = TCP
    Date Received = 2019-08-06 10:37:48 +0000
    Time Delta = 3.047425985336304
    Information = 554 -> 49606 ([ECN, ACK, SYN], Seq=457514925, Ack=2138974173, Win=28960)
                       °
                       °
                       °
    Id = 31
    Source = 192.168.4.12
    Destination = 192.168.4.2
    Captured Length = 345
    Packet Length = 345
    Protocol = TCP
    Date Received = 2019-08-06 10:37:49 +0000
    Time Delta = 3.840152025222778
    Information = 554 -> 49606 ([ACK, PUSH], Seq=457516393, Ack=2138975704, Win=1039)

    I will, of course, be continuing to work out why this is [not] happening, and will post any solutions that I find, but, like I said, I’m fairly new to this, so it’s entirely possible that I’m missing some basic stuff, and would appreciate any guidance.

    Thanks!
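
    One thing worth ruling out, not mentioned in the post: in most shells an unquoted & ends the command and runs it in the background, so everything from &streamtype=unicast onward (including every output option) may never reach ffmpeg at all, whereas VLC receives the URI as a single string. A minimal sketch with the URI single-quoted and, as an extra assumption, the transport pinned to TCP; the credentials, host and output path are the ones from the question:

    ffmpeg -rtsp_transport tcp -i 'rtsp://<login>:<password>@192.168.4.12/onvif-media/media.amp?profile=profile_1_jpeg&streamtype=unicast' -c:v libx264 -crf 21 -preset veryfast -g 30 -sc_threshold 0 -f hls -hls_time 4 /Volumes/Development/webroot/fftest/stream.m3u8

    If the quoted form still produces no negotiation, the -rtsp_transport tcp option at least removes UDP and firewall behaviour as variables while debugging the streamtype parameter.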