
Other articles (37)

  • Submit improvements and additional plugins

    10 April 2011

    If you have developed a new extension that adds one or more useful features to MediaSPIP, let us know and its integration into the official distribution will be considered.
    You can use the development mailing list to announce it or to ask for help with writing the plugin. Since MediaSPIP is based on SPIP, you can also use SPIP's SPIP-zone mailing list to (...)

  • Emballe Médias: putting documents online simply

    29 October 2010

    The emballe médias plugin was developed mainly for the mediaSPIP distribution, but it is also used in other related projects, such as géodiversité.
    Required and compatible plugins
    To work, this plugin requires other plugins to be installed: CFG, Saisies, SPIP Bonux, Diogène, swfupload, jqueryui
    Other plugins can be used alongside it to extend its capabilities: Ancres douces, Légendes, photo_infos, spipmotion (...)

  • Supported formats

    28 January 2010

    The following commands give information about the formats and codecs supported by the local ffmpeg installation:
    ffmpeg -codecs
    ffmpeg -formats
    Accepted input video formats
    This list is not exhaustive; it highlights the main formats in use: h264: H.264 / AVC / MPEG-4 AVC / MPEG-4 part 10; m4v: raw MPEG-4 video format; flv: Flash Video (FLV) / Sorenson Spark / Sorenson H.263; Theora; wmv:
    Possible output video formats
    To begin with, we (...)
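
    A quick way to check whether a particular codec or container is available, before configuring the encoding, is to filter the output of the two commands above. A minimal sketch (the codec and format names below are only examples, not a list required by MediaSPIP):

    # example codec names; substitute the codec you care about
    ffmpeg -codecs 2>/dev/null | grep -Ei 'h264|theora'
    # likewise for container formats
    ffmpeg -formats 2>/dev/null | grep -i matroska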

On other sites (2410)

  • Prevent FFmpeg from closing when a named pipe has been read completely

    21 October 2019, by Werther Berselli

    I'm writing a C++ application for a university project that takes a video file (e.g. Matroska) and, using FFmpeg (by embedding commands inside std::system() calls), applies these steps:

    • Extract chunks of frames (e.g. 10 seconds per chunk) and chunks of .aac audio

    • Apply some filters to frames

    • Encode each video chunk and audio chunk with x264 and send them to a listening client over RTSP. This step uses two named pipes, one for video and one for audio.

    • Receive the stream in another process and play it with ffplay (on localhost or LAN).

    I need to divide my stream into chunks because I eventually have to satisfy real-time constraints, so I can't apply the filters to my entire input video file first and only then start streaming to the client.
    My main problem is that once FFmpeg has emptied the two pipes, it closes; but further chunks of video and audio still have to be piped and streamed. I need FFmpeg to keep listening on the pipes, waiting for new data.

    In bash I achieved this with the following commands.

    Start listening for an RTSP stream in one prompt:

    ffplay -rtsp_flags listen rtsp://127.0.0.1@127.0.0.1:8090

    Create a named pipe for video and one for audio:

    mkfifo video_pipe
    mkfifo audio_pipe

    Use this command to prevent FFmpeg from closing when the video pipe is emptied:

    exec 7<>video_pipe

    (it is sufficient to apply it to the video pipe; the audio pipe then causes no problems either)

    Run the FFmpeg command:

    ffmpeg -probesize 2147483647 -re -s 1280x720 -pix_fmt rgb24 -i pipe:0 -vsync 0 -i audio_pipe -r 25 -vcodec libx264 -crf 23 -preset ultrafast -f rtsp -rtsp_transport tcp rtsp://127.0.0.1@127.0.0.1:8090 < video_pipe

    And then feed the pipes in another prompt:

    cat audiochunk.aac > audio_pipe & cat frame*.bmp > video_pipe
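
    As an aside, here is a minimal standalone sketch of why the exec 7<>video_pipe step matters: a FIFO only reports end-of-file to its reader once every writer has closed it, so holding one read-write descriptor open keeps the reader alive between short-lived writers (demo_pipe is just an illustrative name):

    mkfifo demo_pipe
    exec 7<>demo_pipe            # hold a read-write descriptor: the FIFO always has a writer
    cat demo_pipe &              # the reader now survives across several writers
    echo "chunk 1" > demo_pipe
    echo "chunk 2" > demo_pipe   # without fd 7, cat would exit after the first writer closes
    exec 7>&-                    # closing the descriptor finally lets cat see EOF and exit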

    These commands work well using three prompts. I then tried them in C++, embedding the commands in std::system() calls (on different threads); everything works, but once the ffmpeg command empties the video pipe (after finishing the first chunk), it closes.
    The exec command seems to be ineffective here.
    How can I prevent FFmpeg from closing?

    I have spent two days struggling with this problem and have looked at every solution I could find online. I hope I was clear despite a terrible headache; thanks in advance for your suggestions.

    UPDATE:
    My C++ code is something like this; I put it in a single function on a single thread, but that doesn't change its behaviour.
    I'm on Ubuntu 18.04.2.

    void CameraThread::ffmpegJob()
    {
       std::string strvideo_length, command, timing;
       long video_length, begin_chunk, end_chunk;
       int begin_h, begin_m, begin_s, end_h, end_m, end_s;

       command = "ffprobe -v error -show_entries format=duration -of default=noprint_wrappers=1:nokey=1 " + Configurations::file_name;
       strvideo_length = execCmd(command.c_str());
       strvideo_length.pop_back(); // remove \n character
       video_length = strToPositiveDigit(strvideo_length);

       if(video_length == -1)
       {
           std::cout << "Invalid input file";
           return;
       }

       std::system("bash -c \"rm mst-temp/mst_video_pipe\"");
       std::system("bash -c \"rm mst-temp/mst_audio_pipe\"");
       std::system("bash -c \"mkfifo mst-temp/mst_video_pipe\"");
       std::system("bash -c \"mkfifo mst-temp/mst_audio_pipe\"");
       // Keep video pipe opened
       std::system("bash -c \"exec 7<>mst-temp/mst_video_pipe\"");

       std::string rtsp_url = "rtsp://" + Configurations::my_own_used_ip + "@" + Configurations::client_ip +
               ":" + std::to_string(Configurations::port + 1);

       command = "ffmpeg -probesize 2147483647 -re -s 1280x720 -pix_fmt rgb24 -i pipe:0 "
                 "-i mst-temp/mst_audio_pipe -r 25 -vcodec libx264 -crf 23 -preset ultrafast -f rtsp "
                 "-rtsp_transport tcp " + rtsp_url + " < mst-temp/mst_video_pipe &"; // Using & to continue without block on command
       std::system(command.c_str());

       begin_chunk = -1 * VIDEO_CHUNK;
       end_chunk = 0;

       // Extract the complete audio track
       command = "bash -c \"ffmpeg -i " + Configurations::file_name + " -vn mst-temp/audio/complete.aac -y\"";
       std::system(command.c_str());

       while(true)
       {
           // Define the actual video chunk (in seconds) to use, if EOF is reached, exit
           begin_chunk += (end_chunk - begin_chunk);
           if(begin_chunk == video_length)
               break;
           if(end_chunk + VIDEO_CHUNK <= video_length)
               end_chunk += VIDEO_CHUNK;
           else
               end_chunk += (video_length - end_chunk);

           // Set begin and end H, M, S variables
           begin_h = static_cast<int>(begin_chunk / 3600);
           begin_chunk -= (begin_h * 3600);
           begin_m = static_cast<int>(begin_chunk / 60);
           begin_chunk -= (begin_m * 60);
           begin_s = static_cast<int>(begin_chunk);
           end_h = static_cast<int>(end_chunk / 3600);
           end_chunk -= (end_h * 3600);
           end_m = static_cast<int>(end_chunk / 60);
           end_chunk -= (end_m * 60);
           end_s = static_cast<int>(end_chunk);

           // Extract bmp frames and audio from video chunk
           // Extract frames
           timing = " -ss " + std::to_string(begin_h) + ":" + std::to_string(begin_m) +
                   ":" + std::to_string(begin_s) + " -to " + std::to_string(end_h) +
                   ":" + std::to_string(end_m) + ":" + std::to_string(end_s);
           command = "bash -c \"ffmpeg -i " + Configurations::file_name + timing +
                   " -compression_algo raw -pix_fmt rgb24 mst-temp/frames/output%03d.bmp\"";
           std::system(command.c_str());
           // Extract audio
           command = "bash -c \"ffmpeg -i mst-temp/audio/complete.aac" + timing +
                   " -vn mst-temp/audio/audiochunk.aac -y\"";
           std::system(command.c_str());

           // Apply elaborations on audio and frames.........................

           // Write modified audio and frames to streaming pipes
           command = "bash -c \"cat mst-temp/audio/audiochunk.aac > mst-temp/mst_audio_pipe & "
                     "cat mst-temp/frames/output*.bmp > mst-temp/mst_video_pipe\"";
           std::system(command.c_str());
       }
    }
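
    A likely explanation, offered as a hedged note rather than a verified fix: each std::system() call runs in its own short-lived shell, so the descriptor opened by "exec 7<>mst-temp/mst_video_pipe" is closed again as soon as that shell exits, and ffmpeg sees end-of-file once the first chunk has been consumed. One way around this, sketched below under the assumption that the whole string is passed to a single std::system() call, is to perform the exec redirection and the ffmpeg launch inside the same shell invocation, so that ffmpeg inherits the open descriptor and the FIFO always has at least one writer:

    # single shell: fd 7 stays open and is inherited by ffmpeg
    # (the RTSP URL is the one from the bash example above, standing in for rtsp_url)
    bash -c 'exec 7<>mst-temp/mst_video_pipe
             exec ffmpeg -probesize 2147483647 -re -s 1280x720 -pix_fmt rgb24 -i pipe:0 \
                  -i mst-temp/mst_audio_pipe -r 25 -vcodec libx264 -crf 23 -preset ultrafast \
                  -f rtsp -rtsp_transport tcp rtsp://127.0.0.1@127.0.0.1:8090 \
                  < mst-temp/mst_video_pipe' &
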
  • Applying fades between images with an ffmpeg command

    29 October 2015, by Jon Stevens

    I am trying to create a video slideshow that fades out/in between each image, solely from an ffmpeg CLI command. After researching this for hours, the only way I found to do this was to use the -filter_complex argument: pass in all the images and specify a complex filter that defines multiple fades out and back in, timed to happen between frames. The command I have so far:

    ffmpeg -y -framerate 1/5 \
    -loop 1 -i img-1.jpg \
    -loop 1 -i img-2.jpg \
    -loop 1 -i img-3.jpg \
    -filter_complex \
    "[1:v]fade=out:4:d=1,fade=in:5:d=1[fad1]; \
    [2:v]fade=out:9:d=1,fade=in:10:d=1[fad2]; \
    [3:v]fade=out:14:d=1,fade=in:15:d=1[fad3];" \
    -c:v libx264 -r 25 -pix_fmt yuv420p test.mp4

    Here's the output from executing this command:

    ffmpeg version 2.6.4 Copyright (c) 2000-2015 the FFmpeg developers
     built with gcc 5.1.1 (GCC) 20150618 (Red Hat 5.1.1-4)
     configuration: --prefix=/usr --bindir=/usr/bin --datadir=/usr/share/ffmpeg --incdir=/usr/include/ffmpeg --libdir=/usr/lib64 --mandir=/usr/share/man --arch=x86_64 --optflags='-O2 -g -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -fexceptions -fstack-protector-strong --param=ssp-buffer-size=4 -grecord-gcc-switches -m64 -mtune=generic' --enable-bzlib --disable-crystalhd --enable-frei0r --enable-gnutls --enable-ladspa --enable-libass --enable-libcdio --enable-libdc1394 --disable-indev=jack --enable-libfreetype --enable-libgsm --enable-libmp3lame --enable-openal --enable-libopencv --enable-libopenjpeg --enable-libopus --enable-libpulse --enable-libschroedinger --enable-libsoxr --enable-libspeex --enable-libtheora --enable-libvorbis --enable-libv4l2 --enable-libvpx --enable-libx264 --enable-libx265 --enable-libxvid --enable-x11grab --enable-avfilter --enable-avresample --enable-postproc --enable-pthreads --disable-static --enable-shared --enable-gpl --disable-debug --disable-stripping --shlibdir=/usr/lib64 --enable-runtime-cpudetect
     libavutil      54. 20.100 / 54. 20.100
     libavcodec     56. 26.100 / 56. 26.100
     libavformat    56. 25.101 / 56. 25.101
     libavdevice    56.  4.100 / 56.  4.100
     libavfilter     5. 11.102 /  5. 11.102
     libavresample   2.  1.  0 /  2.  1.  0
     libswscale      3.  1.101 /  3.  1.101
     libswresample   1.  1.100 /  1.  1.100
     libpostproc    53.  3.100 / 53.  3.100
    Input #0, image2, from 'img-1.jpg':
     Duration: 00:00:05.00, start: 0.000000, bitrate: 141 kb/s
       Stream #0:0: Video: mjpeg, yuvj420p(pc, bt470bg/unknown/unknown), 1280x720 [SAR 72:72 DAR 16:9], 0.20 fps, 0.20 tbr, 0.20 tbn, 0.20 tbc
    Input #1, image2, from 'img-2.jpg':
     Duration: 00:00:00.04, start: 0.000000, bitrate: 17789 kb/s
       Stream #1:0: Video: mjpeg, yuvj420p(pc, bt470bg/unknown/unknown), 1280x720 [SAR 67:67 DAR 16:9], 25 fps, 25 tbr, 25 tbn, 25 tbc
    Input #2, image2, from 'img-3.jpg':
     Duration: 00:00:00.04, start: 0.000000, bitrate: 17764 kb/s
       Stream #2:0: Video: mjpeg, yuvj420p(pc, bt470bg/unknown/unknown), 1280x720 [SAR 62:62 DAR 16:9], 25 fps, 25 tbr, 25 tbn, 25 tbc
    [AVFilterGraph @ 0xbc2a00] No such filter: ''
    Error configuring filters.

    All I am trying to do is create a video slideshow with fades/transitions between images. Any help is greatly appreciated!
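
    For what it is worth, the "No such filter: ''" message appears to come from the trailing semicolon at the end of the -filter_complex string, which makes ffmpeg try to parse an empty filter after it. Beyond that, the labelled outputs ([fad1], [fad2], [fad3]) are never consumed, and [3:v] refers to a fourth input that does not exist, since inputs are numbered from 0. A hedged sketch of one way to chain faded images with the concat filter (the 5-second image durations and 1-second fades are example values, not taken from the question):

    # -t 5, st=4 and d=1 are example timings; adjust to taste
    ffmpeg -y \
      -loop 1 -t 5 -i img-1.jpg \
      -loop 1 -t 5 -i img-2.jpg \
      -loop 1 -t 5 -i img-3.jpg \
      -filter_complex \
      "[0:v]fade=t=out:st=4:d=1[v0]; \
       [1:v]fade=t=in:st=0:d=1,fade=t=out:st=4:d=1[v1]; \
       [2:v]fade=t=in:st=0:d=1[v2]; \
       [v0][v1][v2]concat=n=3:v=1:a=0[v]" \
      -map "[v]" -c:v libx264 -r 25 -pix_fmt yuv420p test.mp4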

  • Can't link against ffmpeg static build

    24 March 2017, by David Barishev

    I have built the ffmpeg libraries statically for x86 Android using a custom configuration. Now I want to add them to my Android project.

    I'm using ffmpeg 3.2.git and Android Studio 2.3.

    I created a folder named distribution containing my binaries and the minimum headers needed to avoid missing-header errors (except for libavutil, where I just included them all), located at the root of my project.
    Here is the file tree:

    distribution
    ├── libavcodec
    │   ├── lib
    │   │   └── x86
    │   │       └── libavcodec.a
    │   ├── avcodec.h
    │   └── version.h
    ├── libavfilter
    │   ├── lib
    │   │   └── x86
    │   │       └── libavfilter.a
    │   ├── avfiltergraph.h
    │   └── avfilter.h
    ├── libavformat
    │   ├── lib
    │   │   └── x86
    │   │       └── libavformat.a
    │   ├── avformat.h
    │   ├── avio.h
    │   └── version.h
    ├── libavutil
    │   ├── lib
    │   │   └── x86
    │   │       └── libavutil.a
    │   ├── attributes.h
    │   ├── avconfig.h
    │   ├── avutil.h
    │   ├── buffer.h
    │   ├── buffer_internal.h
    │   ├── camellia.h
    │   ├── cast5.h
    │   ├── channel_layout.h
    │   ├── colorspace.h
    │   ├── color_utils.h
    │   ├── common.h
    │   ├── cpu.h
    │   ├── cpu_internal.h
    │   ├── crc.h
    │   ├── des.h
    │   ├── dict.h
    │   ├── display.h
    │   ├── downmix_info.h
    │   ├── dynarray.h
    │   ├── error.h
    │   ├── eval.h
    │   ├── ffmath.h
    │   ├── ffversion.h
    │   ├── fifo.h
    │   ├── file.h
    │   ├── fixed_dsp.h
    │   ├── float_dsp.h
    │   ├── frame.h
    │   ├── hash.h
    │   ├── hmac.h
    │   ├── hwcontext_cuda.h
    │   ├── hwcontext_cuda_internal.h
    │   ├── hwcontext_dxva2.h
    │   ├── hwcontext.h
    │   ├── hwcontext_internal.h
    │   ├── hwcontext_qsv.h
    │   ├── hwcontext_vaapi.h
    │   ├── hwcontext_vdpau.h
    │   ├── imgutils.h
    │   ├── integer.h
    │   ├── internal.h
    │   ├── intfloat.h
    │   ├── intmath.h
    │   ├── intreadwrite.h
    │   ├── lfg.h
    │   ├── libm.h
    │   ├── lls.h
    │   ├── log.h
    │   ├── lzo.h
    │   ├── macros.h
    │   ├── mastering_display_metadata.h
    │   ├── mathematics.h
    │   ├── md5.h
    │   ├── mem.h
    │   ├── mem_internal.h
    │   ├── motion_vector.h
    │   ├── murmur3.h
    │   ├── opencl.h
    │   ├── opencl_internal.h
    │   ├── opt.h
    │   ├── parseutils.h
    │   ├── pca.h
    │   ├── pixdesc.h
    │   ├── pixelutils.h
    │   ├── pixfmt.h
    │   ├── qsort.h
    │   ├── random_seed.h
    │   ├── rational.h
    │   ├── rc4.h
    │   ├── replaygain.h
    │   ├── reverse.h
    │   ├── ripemd.h
    │   ├── samplefmt.h
    │   ├── sha512.h
    │   ├── sha.h
    │   ├── softfloat.h
    │   ├── softfloat_ieee754.h
    │   ├── softfloat_tables.h
    │   ├── spherical.h
    │   ├── stereo3d.h
    │   ├── tablegen.h
    │   ├── tea.h
    │   ├── thread.h
    │   ├── threadmessage.h
    │   ├── timecode.h
    │   ├── time.h
    │   ├── time_internal.h
    │   ├── timer.h
    │   ├── timestamp.h
    │   ├── tree.h
    │   ├── twofish.h
    │   └── version.h
    └── libswresample
       ├── lib
       │   └── x86
       │       └── libswresample.a
       ├── swresample.h
       └── swresample_internal.h

    I edited my CMake file to include the libraries:

    add_library(
            native-lib
            SHARED
            src/main/cpp/native-lib.cpp )

    set(distribution_DIR ${CMAKE_SOURCE_DIR}/../distribution)

    add_library(lib_avcodec STATIC IMPORTED)
    set_target_properties(lib_avcodec PROPERTIES IMPORTED_LOCATION
       ${distribution_DIR}/libavcodec/lib/${ANDROID_ABI}/libavcodec.a)

    add_library(lib_avfilter STATIC IMPORTED)
    set_target_properties(lib_avfilter PROPERTIES IMPORTED_LOCATION
       ${distribution_DIR}/libavfilter/lib/${ANDROID_ABI}/libavfilter.a)

    add_library(lib_avformat STATIC IMPORTED)
    set_target_properties(lib_avformat PROPERTIES IMPORTED_LOCATION
       ${distribution_DIR}/libavformat/lib/${ANDROID_ABI}/libavformat.a)

    add_library(lib_avutil STATIC IMPORTED)
    set_target_properties(lib_avutil PROPERTIES IMPORTED_LOCATION
       ${distribution_DIR}/libavutil/lib/${ANDROID_ABI}/libavutil.a)

    add_library(lib_swresample STATIC IMPORTED)
    set_target_properties(lib_swresample PROPERTIES IMPORTED_LOCATION
       ${distribution_DIR}/libswresample/lib/${ANDROID_ABI}/libswresample.a)


    include_directories(
                             ${distribution_DIR}
                             )


    target_link_libraries(
                          native-lib

                          lib_avcodec
                          lib_avfilter
                          lib_avformat
                          lib_avutil
                          lib_swresample
                           )

    I also restricted the build to x86 only in my app's build.gradle:

    ndk {
               // Specifies the ABI configurations of your native
               // libraries Gradle should build and package with your APK.
               abiFilters 'x86'
       }

    The project's Gradle sync completed successfully.
    I wrote the following code in my cpp file:

    #include <jni.h>

    extern "C"{
       #include "libavformat/avformat.h"
    }

    JNIEXPORT void JNICALL
    Java_com_example_david_testffmpegcpp_MainActivity_stringFromJNI(
           JNIEnv *env,
           jobject /* this */) {

       av_register_all();
       avformat_network_init();
    }

    This is just to check whether the library works, but I can't seem to link against the libraries correctly. The linker complains about undefined symbols.
    What did I do wrong?

    Error:

    [2/2] Linking CXX shared library

    ..\..\..\..\build\intermediates\cmake\debug\obj\x86\libnative-lib.so
    FAILED: cmd.exe /C "cd . && D:\AndroidSDK\ndk-bundle\toolchains\llvm\prebuilt\windows-x86_64\bin\clang++.exe  --target=i686-none-linux-android --gcc-toolchain=D:/AndroidSDK/ndk-bundle/toolchains/x86-4.9/prebuilt/windows-x86_64 --sysroot=D:/AndroidSDK/ndk-bundle/platforms/android-16/arch-x86 -fPIC -g -DANDROID -ffunction-sections -funwind-tables -fstack-protector-strong -no-canonical-prefixes -mstackrealign -Wa,--noexecstack -Wformat -Werror=format-security  -g -DANDROID -ffunction-sections -funwind-tables -fstack-protector-strong -no-canonical-prefixes -mstackrealign -Wa,--noexecstack -Wformat -Werror=format-security   -O0 -fno-limit-debug-info -O0 -fno-limit-debug-info  -Wl,--build-id -Wl,--warn-shared-textrel -Wl,--fatal-warnings -Wl,--no-undefined -Wl,-z,noexecstack -Qunused-arguments -Wl,-z,relro -Wl,-z,now -Wl,--build-id -Wl,--warn-shared-textrel -Wl,--fatal-warnings -Wl,--no-undefined -Wl,-z,noexecstack -Qunused-arguments -Wl,-z,relro -Wl,-z,now -shared -Wl,-soname,libnative-lib.so -o ..\..\..\..\build\intermediates\cmake\debug\obj\x86\libnative-lib.so CMakeFiles/native-lib.dir/src/main/cpp/native-lib.cpp.o  -llog ../../../../../distribution/libavcodec/lib/x86/libavcodec.a ../../../../../distribution/libavfilter/lib/x86/libavfilter.a ../../../../../distribution/libavformat/lib/x86/libavformat.a ../../../../../distribution/libavutil/lib/x86/libavutil.a ../../../../../distribution/libswresample/lib/x86/libswresample.a -lm "D:/AndroidSDK/ndk-bundle/sources/cxx-stl/gnu-libstdc++/4.9/libs/x86/libgnustl_static.a" && cd ."
    src/libavformat/allformats.c:51: error: undefined reference to 'avcodec_register_all'
     src/libavformat/id3v2.c:1009: error: undefined reference to 'uncompress'
     src/libavformat/id3v2.c:1153: error: undefined reference to 'av_init_packet'
     src/libavformat/matroskadec.c:1393: error: undefined reference to 'inflateInit_'
     src/libavformat/matroskadec.c:1408: error: undefined reference to 'inflate'
     src/libavformat/matroskadec.c:1401: error: undefined reference to 'inflateEnd'
     src/libavformat/matroskadec.c:1411: error: undefined reference to 'inflateEnd'
     src/libavformat/matroskadec.c:3113: error: undefined reference to 'av_new_packet'
     src/libavformat/matroskadec.c:3134: error: undefined reference to 'av_packet_new_side_data'
     src/libavformat/matroskadec.c:2855: error: undefined reference to 'av_new_packet'
     src/libavformat/matroskadec.c:3147: error: undefined reference to 'av_packet_new_side_data'
     src/libavformat/matroskadec.c:3022: error: undefined reference to 'av_new_packet'
     src/libavformat/matroskadec.c:3031: error: undefined reference to 'av_packet_new_side_data'
     src/libavformat/matroskadec.c:3042: error: undefined reference to 'av_packet_new_side_data'
     src/libavformat/matroskadec.c:3151: error: undefined reference to 'av_packet_unref'
     src/libavformat/matroskadec.c:2680: error: undefined reference to 'av_packet_unref'
     src/libavformat/matroskadec.c:963: error: undefined reference to 'av_fast_padded_malloc'
     src/libavformat/matroskadec.c:2163: error: undefined reference to 'av_get_codec_tag_string'
     src/libavformat/matroskadec.c:2680: error: undefined reference to 'av_packet_unref'
     src/libavformat/matroskadec.c:2579: error: undefined reference to 'av_init_packet'
     src/libavformat/matroskadec.c:2580: error: undefined reference to 'av_new_packet'
     src/libavformat/matroskadec.c:1715: error: undefined reference to 'avpriv_mpeg4audio_sample_rates'
     src/libavformat/matroskadec.c:1715: error: undefined reference to 'avpriv_mpeg4audio_sample_rates'
     src/libavformat/matroskadec.c:1715: error: undefined reference to 'avpriv_mpeg4audio_sample_rates'
     src/libavformat/matroskadec.c:1866: error: undefined reference to 'avcodec_chroma_pos_to_enum'
     src/libavformat/matroskadec.c:2680: error: undefined reference to 'av_packet_unref'
     src/libavformat/matroskaenc.c:2482: error: undefined reference to 'avcodec_get_type'
     src/libavformat/matroskaenc.c:646: error: undefined reference to 'avpriv_split_xiph_headers'
     src/libavformat/matroskaenc.c:1066: error: undefined reference to 'av_get_bits_per_sample'
     src/libavformat/matroskaenc.c:723: error: undefined reference to 'avpriv_mpeg4audio_get_config'
     src/libavformat/matroskaenc.c:875: error: undefined reference to 'avcodec_enum_to_chroma_pos'
     src/libavformat/matroskaenc.c:813: error: undefined reference to 'avcodec_get_name'
     src/libavformat/matroskaenc.c:806: error: undefined reference to 'avcodec_get_name'
     src/libavformat/matroskaenc.c:824: error: undefined reference to 'avcodec_get_name'
     .....
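
    Two things stand out in this output, offered as a hedged reading rather than a verified fix. The unresolved zlib symbols (uncompress, inflate, inflateInit_, inflateEnd) suggest that -lz is missing from the link line. The unresolved libavcodec symbols referenced from libavformat point to archive order: a single-pass static link only pulls in objects that satisfy references already seen, so each archive has to appear before the archives it depends on (avfilter, then avformat, then avcodec, then swresample and avutil). In CMake terms that means reordering target_link_libraries and adding z; the tail of the link command would then look roughly like this, with every other flag and path unchanged and elided:

    # same clang++ invocation as above; only the archive order is changed and -lz added
    clang++ ... CMakeFiles/native-lib.dir/src/main/cpp/native-lib.cpp.o -llog \
      ../../../../../distribution/libavfilter/lib/x86/libavfilter.a \
      ../../../../../distribution/libavformat/lib/x86/libavformat.a \
      ../../../../../distribution/libavcodec/lib/x86/libavcodec.a \
      ../../../../../distribution/libswresample/lib/x86/libswresample.a \
      ../../../../../distribution/libavutil/lib/x86/libavutil.a \
      -lz -lm "D:/AndroidSDK/ndk-bundle/sources/cxx-stl/gnu-libstdc++/4.9/libs/x86/libgnustl_static.a"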