
Media (91)

Other articles (35)

  • Websites made with MediaSPIP

    2 May 2011

    This page lists some websites based on MediaSPIP.

  • Creating farms of unique websites

    13 April 2011

    MediaSPIP platforms can be installed as a farm, with a single "core" hosted on a dedicated server and used by multiple websites.
    This allows (among other things): implementation costs to be shared between several different projects / individuals; rapid deployment of multiple unique sites; creation of groups of like-minded sites, making it possible to browse media in a more controlled and selective environment than the major "open" (...)

  • Frequent problems

    10 March 2010

    PHP and safe_mode enabled
    One of the main sources of problems stems from the PHP configuration, in particular from safe_mode being enabled.
    The solution would be either to disable safe_mode or to place the script in a directory accessible by Apache for the site.
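
    For reference, a minimal php.ini sketch of the first option (assuming a PHP version older than 5.4; the safe_mode directive was removed in PHP 5.4):

        ; php.ini (PHP < 5.4 only)
        safe_mode = Off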

On other sites (6248)

  • webm files created with ffmpeg are too long

    6 August 2022, by Azrael

    I have a folder of exactly 300 images in png format (labelled 1.png, 2.png, ..., 300.png), which I'm trying to convert to a video. I would like the video to be in the webm format, but there seems to be an issue:

    using the following command:

    ffmpeg -start_number 1 -i ./frames/%d.png -frames:v 300 -r 30 out.webm

    does generate an out.webm file, and, according to ffprobe -select_streams v -count_frames -show_entries stream=nb_read_frames,r_frame_rate out.webm (which is presumably quite an inefficient way to get that information, but that's beside the point), it does contain 300 frames and has a frame rate of exactly 30/1. However, instead of the expected 10 seconds (300 frames played at 30 fps), the video lasts slightly longer (about 12 seconds).

    This discrepancy seems to scale with video length: converting 900 frames the same way, at the same frame rate, yields a 36-second video instead of a 30-second one.
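
    As a cross-check, the duration the container itself reports can be read straight from ffprobe's format section rather than by counting frames; a minimal sketch against the same out.webm:

    ffprobe -v error -show_entries format=duration -of default=noprint_wrappers=1:nokey=1 out.webm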

    For testing, I also tried generating an mp4 file instead of a webm one, with the following command (the same as above, but with out.mp4 instead of out.webm), and that worked exactly as expected: out.mp4 was a 10-second-long video.

    ffmpeg -start_number 1 -i ./frames/%d.png -frames:v 300 -r 30 out.mp4

    How do I fix this? Is my ffmpeg command off, or is this a bug within the tool?
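
    One avenue worth checking (an assumption about the cause, not a confirmed diagnosis): for an image-sequence input, ffmpeg's image2 demuxer reads the frames at a default rate of 25 fps unless told otherwise, and 300 frames at 25 fps is exactly 12 seconds (900 frames is 36 seconds), which matches the durations observed above. Declaring the input rate with -framerate, instead of only forcing the output rate with -r, would look roughly like this:

    ffmpeg -framerate 30 -start_number 1 -i ./frames/%d.png -frames:v 300 out.webm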

  • Decoding AAC to PCM with ffmpeg results in noise

    18 October 2022, by userDtrm

    I have a .mp4 file generated with ffmpeg as follows.

    ffmpeg -y -i video_extended.mp4 -itsoffset 00:00:04.00 -i output5-1.wav -map 0:0 -map 1:0 -c:v copy -c:a aac -ac 6 -ar 48000 -b:a 128k -async 1 mixed.mp4

    Playing the mixed.mp4 file with ffplay is fine and there is no impact on the sound quality. Below is the output I get from ffplay when using the command ffplay -i mixed.mp4

Input #0, mov,mp4,m4a,3gp,3g2,mj2, from 'mixed_h264_aac_512k_async_qp0_all_I.mp4':
  Metadata:
    major_brand     : isom
    minor_version   : 512
    compatible_brands: isomiso2avc1mp41
    encoder         : Lavf58.76.100
  Duration: 00:00:16.02, start: 0.000000, bitrate: 49136 kb/s
  Stream #0:0[0x1](und): Video: h264 (High 4:4:4 Predictive) (avc1 / 0x31637661), yuv422p10le(progressive), 1920x1080, 65409 kb/s, 59.94 fps, 59.94 tbr, 11988 tbn (default)
    Metadata:
      handler_name    : VideoHandler
      vendor_id       : [0][0][0][0]
  Stream #0:1[0x2](und): Audio: aac (LC) (mp4a / 0x6134706D), 48000 Hz, 5.1, fltp, 71 kb/s (default)
    Metadata:
      handler_name    : SoundHandler
      vendor_id       : [0][0][0][0]
Switch subtitle stream from #-1 to #-1 vq= 1606KB sq=    0B f=0/0

    Then, I decode the mixed.mp4 file back to raw PCM using the following command.

    ffmpeg -i mixed.mp4 -vn -acodec pcm_s16le -f s16le -ar 48000 -ac 6 raw_audio.pcm

    However, this raw_audio.pcm contains a lot of noise, and ffplay shows the following output:

    [s16le @ 0x7f7490000c80] Estimating duration from bitrate, this may be inaccurate
Input #0, s16le, from 'separated_audio_s16.pcm':
  Duration: 00:00:16.02, bitrate: 4607 kb/s
  Stream #0:0: Audio: pcm_s16le, 48000 Hz, 6 channels, s16, 4608 kb/s
[pcm_s16le @ 0x7f749002b940] Multiple frames in a packet.
[pcm_s16le @ 0x7f749002b940] Invalid PCM packet, data has size 8 but at least a size of 12 was expected
    Last message repeated 32 times
[pcm_s16le @ 0x7f749002b940] Invalid PCM packet, data has size 8 but at least a size of 12 was expected
    Last message repeated 11 times
Switch subtitle stream from #-1 to #-1 vq=    0KB sq=    0B f=0/0   
[pcm_s16le @ 0x7f749002b940] Invalid PCM packet, data has size 8 but at least a size of 12 was expected
    Last message repeated 11 times
[pcm_s16le @ 0x7f749002b940] Invalid PCM packet, data has size 8 but at least a size of 12 was expected
    Last message repeated 11 times
[pcm_s16le @ 0x7f749002b940] Invalid PCM packet, data has size 8 but at least a size of 12 was expected

    Can someone please explain the issue here? Note that the ffplay command that works correctly for mixed.mp4 shows fltp as the audio format, whereas when playing the raw_audio.pcm file it is seen as s16.

    Is this a resampling issue in ffmpeg, and how can I rectify it?
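
    One way to narrow this down (a suggestion, not a confirmed fix): raw PCM carries no header, so whatever reads the .pcm file must be told the exact sample format, rate, channel count and channel order, and any mismatch there comes out as loud noise. Decoding to WAV instead keeps that information in the file header; a minimal sketch (the output name decoded_audio.wav is just illustrative):

    ffmpeg -i mixed.mp4 -vn -c:a pcm_s16le -ar 48000 -ac 6 decoded_audio.wav

    If that WAV plays cleanly, the AAC decode itself is fine and the problem lies in how the raw .pcm is being interpreted on playback.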

    I’m using ffmpeg and ffplay version 5.0.1 on a Fedora 36 system.

    Thank you.

  • Using and building the FFmpeg library in Android Studio (CMake)

    8 September 2017, by y3k00000

    As the title says, I’m trying to use the ffmpeg source code as a library in Android Studio on Ubuntu Linux, but I’m running into trouble at the compilation stage. I think I must be missing some compile option, but I have no idea what it is. Can somebody lend a hand?

    My code & settings are below:

    A simple auto-generated .cpp

    #include <jni.h>
    #include <string>
    #include "libavcodec/aacenc.h"

    extern "C"
    JNIEXPORT jstring JNICALL
    Java_y3k_testffmpegnative_MainActivity_stringFromJNI(
            JNIEnv *env,
            jobject /* this */) {
        std::string hello = "Hello from C++";
        return env->NewStringUTF(hello.c_str());
    }

    AACEncContext * aacEncContext; // Just trying by adding this.

    I’ve tried these Gradle settings and they didn’t help:

    android {
       ....
       externalNativeBuild {
           cmake {
               arguments "-DANDROID_TOOLCHAIN=clang"
               cFlags "-std=c99"
               cppFlags "-frtti","-fexceptions"
           }
       }
    }
    ....

    CMakeLists.txt

    (Android Studio generated)
    ....
    include_directories( /home/y3k/ffmpeg )

    Error messages while compiling:

    /home/y3k/ffmpeg/libavutil/float_dsp.h
    Error:(164, 50) error: expected ')'
    Information:(164, 30) to match this '('
    Error:(164, 50) error: expected ')'
    Information:(164, 30) note: to match this '('
    /home/y3k/ffmpeg/libavutil/fixed_dsp.h
    Error:(153, 44) error: expected ')'
    Information:(153, 30) to match this '('
    Error:(153, 44) error: expected ')'
    Information:(153, 30) note: to match this '('
    /home/y3k/ffmpeg/libavcodec/mpeg4audio.h
    Error:(44, 8) error: unknown type name 'av_export'
    Error:(44, 18) error: expected unqualified-id
    Error:(44, 8) error: unknown type name 'av_export'
    Error:(44, 18) error: expected unqualified-id
    /home/y3k/ffmpeg/libavcodec/aac.h
    Error:(294, 21) error: expected member name or ';' after declaration specifiers
    Error:(294, 21) error: expected member name or ';' after declaration specifiers

    FFmpeg config.h, generated by this script:

    NDK=/home/y3k/Android/Sdk/ndk-bundle
    SYSROOT=$NDK/platforms/android-23/arch-arm/
    TOOLCHAIN=$NDK/toolchains/arm-linux-androideabi-4.9/prebuilt/linux-x86_64
    function build_one
    {
    ./configure \
    --prefix=$PREFIX \
    --disable-shared \
    --enable-static \
    --disable-doc \
    --disable-ffmpeg \
    --disable-ffplay \
    --disable-ffprobe \
    --disable-ffserver \
    --disable-avdevice \
    --disable-doc \
    --disable-symver \
    --cross-prefix=$TOOLCHAIN/bin/arm-linux-androideabi- \
    --target-os=linux \
    --arch=arm \
    --enable-cross-compile \
    --sysroot=$SYSROOT \
    --extra-cflags="-Os -fpic $ADDI_CFLAGS" \
    --extra-ldflags="$ADDI_LDFLAGS" \
    $ADDITIONAL_CONFIGURE_FLAG
    make clean
    make
    make install
    }
    CPU=arm
    PREFIX=$(pwd)/android/$CPU
    ADDI_CFLAGS="-marm"
    build_one

    By tracing the errors, I found these lines:

    In float_dsp.h and fixed_dsp.h:

    void (*butterflies_float)(float *av_restrict v1, float *av_restrict v2, int len);

    av_restrict is defined in config.h as:

    #define av_restrict restrict

    In aac.h:

    ....
    struct AACContext {
       AVClass        *class;
    ....

    In mpeg4audio.h:

    extern av_export const int avpriv_mpeg4audio_sample_rates[16];

    So I’m guessing it’s because the compiler misrecognizes the C code as C++. I tried adding these:

    arguments "-DANDROID_TOOLCHAIN=clang"
    cFlags "-std=c99"

    to my build.gradle, but it didn’t help. I have no idea where to go from here. :(

    I’m using the latest stable version of Android Studio on an Ubuntu desktop. Please don’t hesitate to ask if any extra information is required.

    I’ve also tried turning my native-lib.cpp into a .c file (with the code contents changed accordingly), but that didn’t work either.

    I’d appreciate any help.
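
    For what it's worth, a pattern that commonly bites in this situation (offered as a sketch, not a confirmed fix for this exact setup): FFmpeg's headers are plain C and ship without extern "C" guards, so a .cpp file has to wrap them itself, and internal source-tree headers such as libavcodec/aacenc.h or libavcodec/aac.h drag in config.h definitions (for example av_restrict expanding to restrict, which is not a C++ keyword) that the C++ compiler rejects, which matches the errors quoted above. Assuming the built libraries and their installed public headers are on the include/link path, the JNI file might look like this:

    #include <jni.h>
    #include <string>

    // FFmpeg is a C library; wrap its public headers so the C++ compiler
    // does not apply C++ parsing rules or name mangling to them.
    extern "C" {
    #include "libavcodec/avcodec.h"
    #include "libavutil/avutil.h"
    }

    extern "C"
    JNIEXPORT jstring JNICALL
    Java_y3k_testffmpegnative_MainActivity_stringFromJNI(
            JNIEnv *env,
            jobject /* this */) {
        // Return the linked libavcodec version, just to prove that the
        // headers resolve and the library links.
        std::string info = "libavcodec " + std::to_string(avcodec_version());
        return env->NewStringUTF(info.c_str());
    }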