
Other articles (13)

  • The plugin: Podcasts.

    14 July 2010

    The problem of podcasting is, once again, a problem that reveals the state of standardization of data transport on the Internet.
    Two interesting formats exist: the one developed by Apple, strongly tied to iTunes, whose SPEC is here; and the "Media RSS Module" format, which is more "open" and is notably backed by Yahoo and the Miro software.
    File types supported in the feeds
    Apple's format only allows the following file types in its feeds: .mp3 audio/mpeg .m4a audio/x-m4a .mp4 (...)

  • Encoding and processing into web-friendly formats

    13 April 2011

    MediaSPIP automatically converts uploaded files to internet-compatible formats.
    Video files are encoded to OGV and WebM (supported by HTML5) and to MP4 (supported by Flash).
    Audio files are encoded to Ogg (supported by HTML5) and to MP3 (supported by Flash).
    Where possible, text is analyzed to retrieve the data needed for search-engine indexing, and is then exported as a series of image files.
    All uploaded files are stored online in their original format, so you can (...)
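
    By way of illustration only, conversions of this kind can be done with generic ffmpeg commands along the following lines (a sketch with placeholder file names, not MediaSPIP's actual encoding settings):

    ffmpeg -i upload.avi -c:v libvpx -c:a libvorbis upload.webm      # WebM for HTML5
    ffmpeg -i upload.avi -c:v libtheora -c:a libvorbis upload.ogv    # OGV for HTML5
    ffmpeg -i upload.avi -c:v libx264 -c:a aac upload.mp4            # MP4 for Flash
    ffmpeg -i upload.wav -c:a libmp3lame upload.mp3                  # MP3 for Flash
    ffmpeg -i upload.wav -c:a libvorbis upload.ogg                   # Ogg for HTML5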

  • Configurable image and logo sizes

    9 February 2011

    In many places on the site, logos and images are resized to fit the slots defined by the themes. Since all of these sizes can change from one theme to another, they can be defined directly in the theme, sparing the user from having to configure them manually after changing the appearance of the site.
    These image sizes are also available in the specific configuration of MediaSPIP Core. The maximum size of the site logo in pixels, allowing (...)

On other sites (6297)

  • ffmpeg - operation not permitted error while conversion

    20 February 2012, by Jomoos

    I am developing an Android app. My requirement is to implement an RTSP streaming server on Android. It has to live-stream video and audio captured with MediaRecorder. Another requirement is that I have to use live555 as the streaming server. What I get from MediaRecorder is in MP4 or 3GP format, and live555 is not able to stream either of them. It can, however, stream audio if I record it in 'RAW_AMR' format only. Since live555 supports the 'mpg' format for streaming, I decided to put something in the middle that can convert 'mp4' or '3gp' to 'mpg', and I chose ffmpeg.

    I have ported live555 and ffmpeg to Android. ffmpeg is able to convert the file recorded by MediaRecorder once recording has finished. The problem is that ffmpeg is not able to do it concurrently; that is, it cannot convert the file while it is still being recorded. It shows an "Operation not permitted" error. I tried the same on my Linux machine, using VLC to record instead of MediaRecorder on Android. The result is the same: ffmpeg can convert once the recording is finished, but not while recording is still in progress.

    Here is the ffmpeg command I issued on my Linux box:

    ffmpeg -v 9 -loglevel 99 -i test.mp4 test.mpg

    Here test.mp4 is the file that VLC is recording to in MP4 format, and test.mpg is my destination file. The following is the output from ffmpeg on the terminal.

    ffmpeg version 0.8.9, Copyright (c) 2000-2011 the FFmpeg developers
     built on Feb  1 2012 18:29:27 with gcc 4.6.2 20111027 (Red Hat 4.6.2-1)
     configuration: --prefix=/usr --bindir=/usr/bin --datadir=/usr/share/ffmpeg --incdir=/usr/include/ffmpeg --libdir=/usr/lib64 --mandir=/usr/share/man --arch=x86_64 --extra-cflags='-O2 -g -pipe -Wall -Wp,-D_FORTIFY_SOURCE=2 -fexceptions -fstack-protector --param=ssp-buffer-size=4 -m64 -mtune=generic' --enable-bzlib --enable-libcelt --enable-libdc1394 --enable-libdirac --enable-libfreetype --enable-libgsm --enable-libmp3lame --enable-libopenjpeg --enable-librtmp --enable-libschroedinger --enable-libspeex --enable-libtheora --enable-libvorbis --enable-libvpx --enable-libx264 --enable-libxvid --enable-x11grab --enable-avfilter --enable-postproc --enable-pthreads --disable-static --enable-shared --enable-gpl --disable-debug --disable-stripping --shlibdir=/usr/lib64 --enable-runtime-cpudetect
     libavutil    51.  9. 1 / 51.  9. 1
     libavcodec   53.  8. 0 / 53.  8. 0
     libavformat  53.  5. 0 / 53.  5. 0
     libavdevice  53.  1. 1 / 53.  1. 1
     libavfilter   2. 23. 0 /  2. 23. 0
     libswscale    2.  0. 0 /  2.  0. 0
     libpostproc  51.  2. 0 / 51.  2. 0
    [mov,mp4,m4a,3gp,3g2,mj2 @ 0x1672600] Format mov,mp4,m4a,3gp,3g2,mj2 probed with size=2048 and score=100
    [mov,mp4,m4a,3gp,3g2,mj2 @ 0x1672600] ISO: File Type Major Brand: isom
    [mov,mp4,m4a,3gp,3g2,mj2 @ 0x1672600] moov atom not found
    test.mp4: Operation not permitted

    Would anyone please tell me what is causing the problem? Or is the scenario above even possible with ffmpeg? That is, is ffmpeg able to do the conversion at the same time as the recording? If it is not possible with ffmpeg, would you please suggest any alternative solutions?

    NOTE: I am adding a C tag because, if it is possible with some tweaking of ffmpeg in C, I am ready to do that (I want the solution that badly). But please provide some pointers in the right direction.
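
    The telling line in the log above is "moov atom not found": an MP4/3GP file only gets its index (the moov atom) written when the recorder finalizes it, so a file that is still being recorded cannot be parsed, which is why the conversion only succeeds after recording has finished. One possible workaround, sketched below, is to have the recorder write to a container that can be read while it is still growing, such as MPEG-TS (VLC can record to TS; the stock Android MediaRecorder of that era could not, so a different capture path would be needed on the device):

    # test.ts is a hypothetical MPEG-TS recording that is still being written;
    # unlike MP4, MPEG-TS carries no end-of-file index, so it can usually be
    # read while the recorder keeps appending to it.
    ffmpeg -v 9 -loglevel 99 -i test.ts test.mpg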

  • equivalent of QImage::bits() in C/C++

    17 February 2012, by JonnyCplusplus

    I am having an issue using ffmpeg. ffmpeg expects to read a uint8_t buffer, and QImage::bits() is the only way I have found to generate valid image data to feed to ffmpeg and get a valid video. The function ffmpeg(buf) will essentially generate the video I need. The issue is that the buffers in the two examples below contain very different data.

    My goal is to get the same result I get with QImage::bits() without using Qt at all. How can I duplicate that using C/C++?

    Works:

    QImage getInfo("image");
    uint8_t *buf = getInfo.bits();   // decoded pixel data
    ffmpeg(buf); // this will work great!

    This does not work:

    FILE *f;
    long lSize = 0;
    uint8_t *buf;

    f = fopen("image", "rb");

    fseek(f, 0, SEEK_END);
    lSize = ftell(f);
    rewind(f);

    // This reads the *encoded* file bytes (e.g. PNG/JPEG data), not decoded pixels.
    buf = (uint8_t*)malloc(lSize);
    fread(buf, 1, lSize, f);
    fclose(f);
    ffmpeg(buf);  // junk video!
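
    A Qt-free alternative is to decode the image with an image-decoding library rather than reading the file bytes: fopen/fread returns the compressed PNG/JPEG data, whereas QImage::bits() returns decoded pixels. The sketch below uses the single-header stb_image library as one example; the ffmpeg(...) helper is the asker's own function and is only stubbed here, and stb_image's packed RGBA layout may still differ from what that helper expects, since QImage::bits() is typically BGRA with padded rows:

    // Decode "image" into raw pixels with stb_image instead of Qt.
    #define STB_IMAGE_IMPLEMENTATION
    #include "stb_image.h"
    #include <cstdint>
    #include <cstdio>

    // Stand-in for the asker's ffmpeg(...) helper, assumed to take decoded pixels.
    static void ffmpeg(uint8_t * /*buf*/) {}

    int main()
    {
        int w = 0, h = 0, n = 0;
        // Request 4 channels so every pixel is 4 bytes, like QImage's 32-bit formats.
        uint8_t *pixels = stbi_load("image", &w, &h, &n, 4);
        if (!pixels) {
            std::fprintf(stderr, "decode failed: %s\n", stbi_failure_reason());
            return 1;
        }
        std::printf("decoded %dx%d image\n", w, h);
        ffmpeg(pixels);        // decoded pixel data, not encoded file bytes
        stbi_image_free(pixels);
        return 0;
    }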

  • FFMPEG 2 Videos transcoded and side by side in 1 frame?

    3 March 2016, by dcoffey3296

    I have 2 videos: HEADSHOT.MOV and SCREEN.MOV. They are both large files, and I am looking both to shrink them (size, bitrate, etc.) and to place the two side by side in the same, very wide, video frame. The end result would be that when you play output_video.mp4, you get a very wide frame with both videos in sync and playing at the same rate.

    Here is the syntactically incorrect version of what I am trying to do:

    ffmpeg -i HEADSHOT.MOV -t 00:02:00 -acodec libfaac -ab 64k -vcodec libx264 -r 30 -pass 1 -s 374x210 -vf "movie=SCREEN.MOV [small]; [in][small] -an -r 30 -pass 1 -s 374x210 overlay=10:10 -t 00:02:00 [out]" -threads 0 output_movie.mp4

    In the above example, I also tried to set a test movie duration of 2 minutes, which raises another question: what is the best way to handle 2 movies of varying length (if they are close)?

    The resources I have found helpful so far are :

    Multiple video sources combined into one and

    http://ffmpeg.org/ffmpeg.html#overlay-1

    Any help/advice is greatly appreciated. I am having trouble with the FFMPEG syntax! Thank you!
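
    For reference, one way to express this with -filter_complex is sketched below; it assumes a build of ffmpeg recent enough to have the hstack filter, and the scale sizes, codecs and the choice of keeping the first input's audio are placeholders to adjust:

    ffmpeg -i HEADSHOT.MOV -i SCREEN.MOV \
      -filter_complex "[0:v]scale=374:210[left];[1:v]scale=374:210[right];[left][right]hstack=inputs=2[v]" \
      -map "[v]" -map 0:a -c:v libx264 -c:a aac -t 00:02:00 output_movie.mp4

    Trimming both inputs to a common -t, as above, is one simple way to handle clips of slightly different length; newer builds also offer a shortest option on hstack that ends the output when the shorter input runs out.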