
Other articles (44)

  • Sites built with MediaSPIP

    2 May 2011, by

    This page presents some of the sites running MediaSPIP.
    You can of course add your own using the form at the bottom of the page.

  • HTML5 audio and video support

    10 April 2011

    MediaSPIP uses the HTML5 video and audio tags to play multimedia documents, taking advantage of the latest W3C innovations supported by modern browsers.
    For older browsers, the Flowplayer Flash player is used.
    The HTML5 player used was created specifically for MediaSPIP: its appearance is fully customizable to match a chosen theme.
    These technologies make it possible to deliver video and sound both on conventional computers (...)

  • HTML5 audio and video support

    13 April 2011, by

    MediaSPIP uses HTML5 video and audio tags to play multimedia files, taking advantage of the latest W3C innovations supported by modern browsers.
    The MediaSPIP player used has been created specifically for MediaSPIP and can be easily adapted to fit in with a specific theme.
    For older browsers the Flowplayer flash fallback is used.
    MediaSPIP allows for media playback on major mobile platforms with the above (...)

On other sites (4213)

  • Broadcast mjpeg stream via websocket using ffmpeg and Python Tornado

    25 February 2016, by Asampaiz

    Well, I have been struggling with this for weeks now, searching and reading hundreds of pages, and I have nearly surrendered.

    I need your help, and this is the story: I want to stream my Logitech C930e webcam (connected to a Raspberry Pi 2) to a web browser. I have tried many different ways, such as using ffserver to pass the stream from ffmpeg to the web browser, but they all share the same basic flaw: they all need re-encoding. ffserver will always re-encode the stream passed to it by ffmpeg, whether or not it is already in the right format. My webcam has built-in MJPEG video encoding up to 1080p, which is the reason I chose this webcam: I don’t want to spend all of the Raspberry Pi 2’s resources just on encoding the stream.

    That approach ended up eating all of my Raspberry Pi 2’s resources:

    Logitech C930e ---mjpeg 720p (compressed) instead of rawvideo---> ffmpeg (copy, no re-encoding) ---http---> ffserver (re-encodes to mjpeg; this is the problem) ---http---> Firefox

    My new approach

    Logitech C930e ---mjpeg 720p (compressed) instead of rawvideo---> ffmpeg (copy, no re-encoding) ---pipe---> Python3 (using Tornado as the web framework) ---websocket---> Firefox

    The problem with the new approach

    The problem is that I cannot make sure the stream format passed by ffmpeg via the pipe to Python is ready/compatible to be streamed to the browser via websocket. I mean, I have already done all the steps above, but the result is an unreadable image in the browser (like a TV that has lost its signal).

    1. I need help figuring out how to feed Python the right MJPEG stream format with ffmpeg.
    2. I need help on the client side (JavaScript): how do I display the binary message (the MJPEG stream) sent via websocket?

    This is my current script:

    Executing ffmpeg in Python (pipe) - Server Side

    --- cut ---
           multiprocessing.Process.__init__(self)
           self.resultQ = resultQ
           self.taskQ = taskQ
           self.FFMPEG_BIN = "/home/pi/bin/ffmpeg"
           self.video_w = 1280
           self.video_h = 720
           self.video_res = '1280x720'
           self.webcam = '/dev/video0'
           self.frame_rate = '10'
           self.command = ''
           self.pipe = ''
           self.stdout = ''
           self.stderr = ''

       #Start ffmpeg; these parameters need to be adjusted.
       #Video formats already tried: rawvideo, singlejpeg, mjpeg,
       #mpjpeg, image2pipe.
       #I need help here (to make sure the format is right for the pipe).
       def camera_stream_start(self):
               self.command = [ self.FFMPEG_BIN,
                   '-loglevel', 'debug',
                   '-y',
                   '-f', 'v4l2',
                   '-input_format', 'mjpeg',
                   '-s', self.video_res,
                   '-r', self.frame_rate,
                   '-i', self.webcam,
                   '-c:v', 'copy',
                   '-an',
                   '-f', 'rawvideo',
                   #'-pix_fmts', 'rgb24',
                   '-']
               self.pipe = sp.Popen(self.command, stdin=sp.PIPE, stdout = sp.PIPE, shell=False)
               #return self.pipe

       #stop ffmpeg
       def camera_stream_stop(self):
           self.pipe.stdout.flush()
           self.pipe.terminate()
           self.pipe = ''
           #return self.pipe

       def run(self):
           #start stream
           self.camera_stream_start()
           logging.info("** Camera process started")
           while True:
                #get the stream from the pipe;
                #this part also needs to be adjusted.
                #I need help here:
                #processing the stream that was read so it can be
                #sent to the browser via websocket
               stream = self.pipe.stdout.read(self.video_w*self.video_h*3)

                #reply format for the main process;
                #in the main process, the data will be sent over a binary websocket
                #to the client (self.write_message(data, binary=True))
               rpl = {
                   'task' : 'feed',
                   'is_binary': True,
                   'data' : stream
               }
               self.pipe.stdout.flush()
               self.resultQ.put(rpl)
               #add some wait
               time.sleep(0.01)
           self.camera_stream_stop()
           logging.info("** Camera process ended")

    ffmpeg output

    --- Cut ---    
    Successfully opened the file.
    Output #0, rawvideo, to 'pipe:':
     Metadata:
       encoder         : Lavf57.26.100
       Stream #0:0, 0, 1/10: Video: mjpeg, 1 reference frame, yuvj422p(center), 1280x720 (0x0), 1/10, q=2-31, -1 kb/s, 10 fps, 10 tbr, 10 tbn, 10 tbc
    Stream mapping:
     Stream #0:0 -> #0:0 (copy)
    Press [q] to stop, [?] for help
    --- Cut ---    
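
    Note what the log above shows: an mjpeg stream wrapped in a rawvideo container, so the fixed-size read of video_w*video_h*3 bytes in the script can never line up with frame boundaries; each copied JPEG is compressed and varies in size. One hedged way to attack point 1 (a sketch, under the assumption that the output options are changed to '-c:v copy -f mjpeg'): the pipe then carries back-to-back JPEGs, which can be split on the JPEG markers so that each websocket message is exactly one frame.

     # Sketch only (assumes ffmpeg is run with '-c:v copy -f mjpeg -', so
     # the pipe carries concatenated JPEGs). FFD9 cannot occur inside
     # entropy-coded data thanks to byte stuffing, though frames with
     # embedded preview thumbnails would need more care.
     def jpeg_frames(stdout, chunk_size=4096):
         """Yield one complete JPEG frame at a time from an MJPEG pipe."""
         buf = b''
         while True:
             chunk = stdout.read(chunk_size)
             if not chunk:
                 return                                  # pipe closed
             buf += chunk
             while True:
                 start = buf.find(b'\xff\xd8')           # SOI marker
                 end = buf.find(b'\xff\xd9', start + 2)  # EOI marker
                 if start < 0 or end < 0:
                     break                               # frame incomplete
                 yield buf[start:end + 2]                # one whole JPEG
                 buf = buf[end + 2:]

    With something like this, each yielded frame would replace the fixed-size self.pipe.stdout.read(...) call in run() above.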

    JavaScript websocket - on the client side

    --- Cut ---
    socket = new WebSocket(url, protocols || []);
    socket.binaryType = "arraybuffer";

    socket.onmessage = function (message) {
       //log.debug(message.data instanceof ArrayBuffer);
       //This is for the stream that sent via websocket
       if(message.data instanceof ArrayBuffer)
       {
           //I need help here
           //How can i process the binary stream
           //so its can be shown in the browser (img)
           var bytearray = new Uint8Array(message.data);
           var imageheight = 720;
           var imagewidth = 1280;

           var tempcanvas = document.createElement('canvas');
           tempcanvas.height = imageheight;
           tempcanvas.width = imagewidth;
           var tempcontext = tempcanvas.getContext('2d');

           var imgdata = tempcontext.getImageData(0,0,imagewidth,imageheight);

           var imgdatalen = imgdata.data.length;

           for(var i=8; i<imgdatalen; i++) { imgdata.data[i] = bytearray[i]; }
           tempcontext.putImageData(imgdata, 0, 0);
       }
       //this is for an ordinary string sent via websocket
       else{
           pushData = JSON.parse(message.data);
           console.log(pushData);
       }

    --- Cut ---

    Any help, feedback or anything else is very much appreciated. If something is not clear, please advise me.
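
    On point 2, it may help to verify what actually arrives over the socket before debugging the canvas code. A hedged Python check (the URL is an assumption, and the third-party websockets package is required): dump a few binary messages to disk; if the files open as JPEGs, the remaining work is purely presentation on the client, for instance showing each message as an image blob instead of copying raw pixels.

     # Sketch only: quick client to inspect the websocket payloads.
     # Requires the third-party 'websockets' package; URL is an assumption.
     import asyncio
     import websockets

     async def dump_frames(url='ws://raspberrypi:8888/ws', count=10):
         async with websockets.connect(url) as ws:
             n = 0
             while n < count:
                 msg = await ws.recv()
                 if isinstance(msg, bytes):
                     complete = (msg.startswith(b'\xff\xd8')
                                 and msg.endswith(b'\xff\xd9'))
                     print('frame %d, %d bytes, complete JPEG: %s'
                           % (n, len(msg), complete))
                     with open('frame%04d.jpg' % n, 'wb') as f:
                         f.write(msg)
                     n += 1

     asyncio.get_event_loop().run_until_complete(dump_frames())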

  • FFServer streaming H.264 from Logitech C920 without re-encoding

    29 November 2016, by Zoltan Fedor

    I’m trying to broadcast a native H.264 webcam feed from a Logitech C920 webcam in realtime from an Odroid device (a robot), via ffserver running on a separate server (CentOS 7.1), to users’ browsers without re-encoding the H.264 video feed.

    Having a realtime video feed in the browser is a challenge on its own, so for now I’m just trying to get the Logitech C920 webcam on the Odroid to stream its native H.264 realtime video feed as mp4 via ffserver to users, without the need to re-encode the video in the process.
    Obviously I want to avoid re-encoding, as that would take too much CPU time and would kill the realtime video feed. Later I might need to change the container to .flv or rtp so it can be played from the browser in a realtime fashion. I’m using the Logitech C920 webcam because it can do H.264 encoding in hardware. (This has been tested by saving a file directly; it works, except for the well-known ’jerkiness’ issue related to a Linux kernel bug: http://sourceforge.net/p/linux-uvc/mailman/message/33164469/ , but that is a different story.)

    The problem is that however I set up ffmpeg and ffserver, as soon as ffserver is in the picture the feed gets re-encoded - even from h264 (native) to h264 (libx264) - taking up 100% of the CPU on the Odroid device and introducing a huge delay in the video feed.

    Below are my ffmpeg and ffserver settings.

    ffmpeg on the Odroid device streaming the H.264 feed to ffserver:

    $ ffmpeg -s 1920x1080 -f v4l2 -vcodec h264 -i /dev/video0 -copyinkf -vcodec copy http://xxxyyyy.com:8090/feed1.ffm
    ffmpeg version N-72744-g653bf3c Copyright (c) 2000-2015 the FFmpeg developers
     built with gcc 4.8 (Ubuntu/Linaro 4.8.2-19ubuntu1)
     configuration: --prefix=/home/odroid/ffmpeg_build --pkg-config-flags=--static --extra-cflags=-I/home/odroid/ffmpeg_build/include --extra-ldflags=-L/home/odroid/ffmpeg_build/lib --bindir=/home/odroid/bin --enable-gpl --enable-libass --enable-libfdk-aac --enable-libfreetype --enable-libmp3lame --enable-libopus --enable-libtheora --enable-libvorbis --enable-libvpx --enable-libx264 --enable-libx265 --enable-nonfree
     libavutil      54. 27.100 / 54. 27.100
     libavcodec     56. 41.100 / 56. 41.100
     libavformat    56. 36.100 / 56. 36.100
     libavdevice    56.  4.100 / 56.  4.100
     libavfilter     5. 16.101 /  5. 16.101
     libswscale      3.  1.101 /  3.  1.101
     libswresample   1.  2.100 /  1.  2.100
     libpostproc    53.  3.100 / 53.  3.100
    Input #0, video4linux2,v4l2, from '/dev/video0':
     Duration: N/A, start: 6581.606726, bitrate: N/A
       Stream #0:0: Video: h264 (Constrained Baseline), yuvj420p(pc), 1920x1080 [SAR 1:1 DAR 16:9], -5 kb/s, 30 fps, 30 tbr, 1000k tbn, 60 tbc
    [swscaler @ 0x11bf0b0] deprecated pixel format used, make sure you did set range correctly
    No pixel format specified, yuvj420p for H.264 encoding chosen.
    Use -pix_fmt yuv420p for compatibility with outdated media players.
    [libx264 @ 0x12590e0] using SAR=64/45
    [libx264 @ 0x12590e0] using cpu capabilities: ARMv6 NEON
    [libx264 @ 0x12590e0] profile High, level 1b
    Output #0, ffm, to 'http://robo-car.int.thomsonreuters.com:8090/feed1.ffm':
     Metadata:
       creation_time   : now
       encoder         : Lavf56.36.100
       Stream #0:0: Video: h264 (libx264), yuvj420p(pc), 160x128 [SAR 64:45 DAR 16:9], q=-1--1, 64 kb/s, 30 fps, 1000k tbn, 5 tbc
       Metadata:
         encoder         : Lavc56.41.100 libx264
    Stream mapping:
     Stream #0:0 -> #0:0 (h264 (native) -> h264 (libx264))
    Press [q] to stop, [?] for help
    ^Cav_interleaved_write_frame(): Immediate exit requested00 bitrate=N/A dup=0 drop=97    
       Last message repeated 2140 times
    frame= 3723 fps=301 q=-1.0 Lsize=     396kB time=00:12:14.20 bitrate=   4.4kbits/s dup=3699 drop=103    
    video:321kB audio:0kB subtitle:0kB other streams:0kB global headers:0kB muxing overhead: 23.500496%

    And the /etc/ffserver.conf on the server running ffserver:

    HTTPPort 8090                      # Port to bind the server to
    HTTPBindAddress 0.0.0.0
    MaxHTTPConnections 2000
    MaxClients 1000
    MaxBandwidth 10000             # Maximum bandwidth per client
                                  # set this high enough to exceed stream bitrate
    CustomLog -

     <Feed feed1.ffm>            # This is the input feed where FFmpeg will send
      File ./feed1.ffm            # video stream.
      FileMaxSize 1G              # Maximum file size for buffering video
     </Feed>

    <stream>
     Feed feed1.ffm
     Format mp4
     NoAudio
    </stream>

    As you can see above in the ffmpeg section, there is re-encoding happening on the Odroid device, maxing out the CPUs:

    Stream mapping:
     Stream #0:0 -> #0:0 (h264 (native) -> h264 (libx264))

    I have already tried setting the VideoCodec value in the ffserver config directly to libx264, tried the -re flag in ffmpeg, tried different syntaxes for ffmpeg, etc. Nothing helps; the re-encoding is always there, and so I can’t get ffmpeg and ffserver to simply broadcast the video stream as-is.

    Both ffmpeg binaries (on the Odroid and on the server) were compiled yesterday (2015-06-09) from source, so they are the latest (and the same) version.

    Any idea?

    EDIT:
    IN SUMMARY, the issue is: I cannot find a way to get ffserver to broadcast the h264 (native) feed coming from the Logitech C920 webcam without re-encoding.
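
    An editorial note on why every variation tried above ends in h264 (native) -> h264 (libx264): when ffmpeg writes to an .ffm feed it first asks ffserver for the feed’s output parameters, which come from the stream definitions in ffserver.conf, so the re-encode is ffserver’s doing rather than a wrong ffmpeg flag. If re-encoding is the only blocker, one hedged alternative (a sketch, not verified on this setup) is to leave ffserver out and let ffmpeg push the copied H.264 itself, e.g. as MPEG-TS over UDP:

     # Hypothetical, not verified on the Odroid; the receiver address and
     # port are placeholders.
     $ ffmpeg -s 1920x1080 -f v4l2 -vcodec h264 -i /dev/video0 \
           -c:v copy -an -f mpegts udp://xxxyyyy.com:1234

    A browser-facing delivery layer (segments, RTMP ingest, etc.) could then be added on the receiving side without touching the camera’s hardware encoding.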

  • Compiling FFmpeg on OSX - "speex not found using pkg-config"

    19 September 2016, by n4zArh

    I recently had a few problems with FFmpeg and compiling it to get the library. I managed to get through all of them; however, I recently found out I need to add a Speex decoder (and possibly encoder) to my project. I got the Speex sources, ran ./configure and make; make install (later, as I had problems, I also used Brew to install Speex). I added --enable-libspeex to my configure script, and every time I try to use it I get the ’speex not found using pkg-config’ error.

    I am sure the Speex files are in the /usr/local/include and lib directories. I also added those two directories to CFLAGS and LDFLAGS, I tried building Speex with and without --prefix (pointing to both /usr/ and /usr/local/), and I tried modifying FFmpeg’s configure file (the require_pkg_config call for Speex), but no matter what I do, the build fails every time with the same error.

    Long story short: how do I build FFmpeg with the Speex decoder on OSX? I read somewhere that libspeex-dev might be needed, but it’s available through apt-get and not Brew (unless I screwed something up).

    My build script:

    #!/bin/bash

    if [ "$NDK" = "" ]; then
       echo NDK variable not set, assuming ${HOME}/android-ndk
       export NDK=${HOME}/Library/Android/sdk/ndk-bundle
    fi

    SYSROOT=$NDK/platforms/android-16/arch-arm
    # Expand the prebuilt/* path into the correct one
    TOOLCHAIN=`echo $NDK/toolchains/arm-linux-androideabi-4.9/prebuilt/darwin-x86_64`
    export PATH=$TOOLCHAIN/bin:$PATH

    rm -rf build/ffmpeg
    mkdir -p build/ffmpeg
    cd ffmpeg

    # Don't build any neon version for now
    for version in armv5te armv7a; do

    DEST=../build/ffmpeg
    FLAGS="--target-os=linux --cross-prefix=arm-linux-androideabi- --arch=arm"
    FLAGS="$FLAGS --sysroot=$SYSROOT"
    FLAGS="$FLAGS --enable-shared --disable-symver"
    FLAGS="$FLAGS --enable-small"
    FLAGS="$FLAGS --disable-everything"
    FLAGS="$FLAGS --enable-decoder=h264 --enable-decoder=adpcm_ima_oki --enable-decoder=adpcm_ima_ws"
    FLAGS="$FLAGS --enable-encoder=adpcm_ima_qt --enable-encoder=adpcm_ima_wav --enable-encoder=adpcm_g726"
    FLAGS="$FLAGS --enable-encoder=adpcm_g722 --enable-libspeex"

    case "$version" in
       neon)
           EXTRA_CFLAGS="-march=armv7-a -mfloat-abi=softfp -mfpu=neon"
           EXTRA_LDFLAGS="-Wl,--fix-cortex-a8"
           # Runtime choosing neon vs non-neon requires
           # renamed files
           ABI="armeabi-v7a"
           ;;
       armv7a)
           EXTRA_CFLAGS="-march=armv7-a -mfloat-abi=softfp"
           EXTRA_LDFLAGS=""
           ABI="armeabi-v7a"
           ;;
       *)
           EXTRA_CFLAGS=""
           EXTRA_LDFLAGS=""
           ABI="armeabi"
           ;;
    esac
    DEST="$DEST/$ABI"
    FLAGS="$FLAGS --prefix=$DEST"
    EXTRA_CFLAGS="$EXTRA_CFLAGS -I/usr/local/include/"
    EXTRA_LDFLAGS="$EXTRA_LDFLAGS -L/usr/local/lib"
    PKT_CONFIG_PATH="/usr/lib/pkgconfig/"
    mkdir -p $DEST
    echo $FLAGS --extra-cflags="$EXTRA_CFLAGS" --extra-ldflags="$EXTRA_LDFLAGS" > $DEST/info.txt
    ./configure $FLAGS --extra-cflags="$EXTRA_CFLAGS" --extra-ldflags="$EXTRA_LDFLAGS" | tee $DEST/configuration.txt
    [ $PIPESTATUS == 0 ] || exit 1
    rm compat/strtod.o
    rm compat/strtod.d
    make clean
    make -j4 || exit 1
    make install || exit 1

    done

    Tail of config.log:

    BEGIN /tmp/ffconf.QcYgKHFW.c
        1   #include <complex.h>
        2   #include <math.h>
       3   float foo(complex float f, complex float g) { return cabs(f * I); }
       4   int main(void){ return (int) foo; }
    END /tmp/ffconf.QcYgKHFW.c
    arm-linux-androideabi-gcc --sysroot=/Users/mgriszbacher/Library/Android/sdk/ndk-bundle/platforms/android-16/arch-arm -isysroot /Users/mgriszbacher/Library/Android/sdk/ndk-bundle/platforms/android-16/arch-arm -D_ISOC99_SOURCE -D_FILE_OFFSET_BITS=64 -D_LARGEFILE_SOURCE -Dstrtod=avpriv_strtod -DPIC -I/usr/local/include/ -march=armv5te -std=c99 -fomit-frame-pointer -fPIC -marm -pthread -c -o /tmp/ffconf.vfjjuG7b.o /tmp/ffconf.QcYgKHFW.c
    /tmp/ffconf.QcYgKHFW.c:1:21: fatal error: complex.h: No such file or directory
     #include <complex.h>
                    ^
    compilation terminated.
    check_complexfunc cexp 1
    check_ld cc
    check_cc
    BEGIN /tmp/ffconf.QcYgKHFW.c
        1   #include <complex.h>
        2   #include <math.h>
       3   float foo(complex float f, complex float g) { return cexp(f * I); }
       4   int main(void){ return (int) foo; }
    END /tmp/ffconf.QcYgKHFW.c
    arm-linux-androideabi-gcc --sysroot=/Users/mgriszbacher/Library/Android/sdk/ndk-bundle/platforms/android-16/arch-arm -isysroot /Users/mgriszbacher/Library/Android/sdk/ndk-bundle/platforms/android-16/arch-arm -D_ISOC99_SOURCE -D_FILE_OFFSET_BITS=64 -D_LARGEFILE_SOURCE -Dstrtod=avpriv_strtod -DPIC -I/usr/local/include/ -march=armv5te -std=c99 -fomit-frame-pointer -fPIC -marm -pthread -c -o /tmp/ffconf.vfjjuG7b.o /tmp/ffconf.QcYgKHFW.c
    /tmp/ffconf.QcYgKHFW.c:1:21: fatal error: complex.h: No such file or directory
     #include <complex.h>
                    ^
    compilation terminated.
    check_pkg_config speex speex/speex.h speex_decoder_init -lspeex
    false --exists --print-errors speex
    ERROR: speex not found using pkg-config
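
    An editorial observation on the log tail: configure is literally running false in place of pkg-config (false --exists --print-errors speex), meaning no usable pkg-config was detected for this cross build; note also that the build script sets PKT_CONFIG_PATH, a typo for PKG_CONFIG_PATH, and never exports it. A hedged sketch of the kind of change that addresses both (FFmpeg’s configure does accept a --pkg-config= option; paths are assumptions):

     # Hypothetical fix: export the directory that actually contains
     # speex.pc and hand configure a real pkg-config binary.
     export PKG_CONFIG_PATH=/usr/local/lib/pkgconfig
     ./configure $FLAGS --pkg-config=pkg-config \
         --extra-cflags="$EXTRA_CFLAGS" --extra-ldflags="$EXTRA_LDFLAGS"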