Other articles (68)

  • Websites built with MediaSPIP

    2 May 2011, by

    This page presents some of the sites running MediaSPIP.
    You can of course add yours using the form at the bottom of the page.

  • Websites made with MediaSPIP

    2 May 2011, by

    This page lists some websites based on MediaSPIP.

  • Possibility of farm deployment

    12 April 2011, by

    MediaSPIP can be installed as a farm, with a single "core" hosted on a dedicated server and used by a multitude of different sites.
    This makes it possible, for example: to share setup costs between several projects / individuals; to quickly deploy a multitude of unique sites; to avoid having to put all creations into a digital catch-all, as is the case with the large general-public platforms scattered across the (...)

On other sites (9158)

  • android ffmpeg concatenate multiple m4a files

    19 January 2017, by margie

    I need help concatenating audio files on Android using ffmpeg.

    The audio files come from my recordings (container .m4a, encoder aac).
    When I concatenate them using ffmpeg, the resulting file's length (duration) is only that of the first file.

    i.e. FileA length = 1:00, FileB length = 0:14; expected FileResult length = 1:14, but the actual result is 1:00.

    I also tried the command I use with desktop ffmpeg, and it gives the same result.

    This is my command:

    [-i,
    concat:storage/emulated/0/Music/Recordings/REC_20170119_162023.m4a|/storage/emulated/0/Music/Recordings/REC_20170119_162042.m4a,
    -c,
    copy,
    /storage/emulated/0/Music/Recordings/REC_20170119_162047.m4a]

    and the ffmpeg output is:

    D/FFmpeg: onSuccess: ffmpeg version n3.0.1 Copyright (c) 2000-2016 the FFmpeg developers
           built with gcc 4.8 (GCC)
           configuration: --target-os=linux --cross-prefix=/home/vagrant/SourceCode/ffmpeg-android/toolchain-android/bin/arm-linux-androideabi- --arch=arm --cpu=cortex-a8 --enable-runtime-cpudetect --sysroot=/home/vagrant/SourceCode/ffmpeg-android/toolchain-android/sysroot --enable-pic --enable-libx264 --enable-libass --enable-libfreetype --enable-libfribidi --enable-libmp3lame --enable-fontconfig --enable-pthreads --disable-debug --disable-ffserver --enable-version3 --enable-hardcoded-tables --disable-ffplay --disable-ffprobe --enable-gpl --enable-yasm --disable-doc --disable-shared --enable-static --pkg-config=/home/vagrant/SourceCode/ffmpeg-android/ffmpeg-pkg-config --prefix=/home/vagrant/SourceCode/ffmpeg-android/build/armeabi-v7a --extra-cflags='-I/home/vagrant/SourceCode/ffmpeg-android/toolchain-android/include -U_FORTIFY_SOURCE -D_FORTIFY_SOURCE=2 -fno-strict-overflow -fstack-protector-all' --extra-ldflags='-L/home/vagrant/SourceCode/ffmpeg-android/toolchain-android/lib -Wl,-z,relro -Wl,-z,now -pie' --extra-libs='-lpng -lexpat -lm' --extra-cxxflags=
           libavutil      55. 17.103 / 55. 17.103
           libavcodec     57. 24.102 / 57. 24.102
           libavformat    57. 25.100 / 57. 25.100
           libavdevice    57.  0.101 / 57.  0.101
           libavfilter     6. 31.100 /  6. 31.100
           libswscale      4.  0.100 /  4.  0.100
           libswresample   2.  0.101 /  2.  0.101
           libpostproc    54.  0.100 / 54.  0.100
         [mov,mp4,m4a,3gp,3g2,mj2 @ 0xf58d7000] Found duplicated MOOV Atom. Skipped it
         Input #0, mov,mp4,m4a,3gp,3g2,mj2, from 'concat:/storage/emulated/0/Music/Recordings/REC_20170119_162023.m4a|/storage/emulated/0/Music/Recordings/REC_20170119_162042.m4a':
           Metadata:
             com.android.version: 6.0
             major_brand     : mp42
             minor_version   : 0
             creation_time   : 2017-01-19 09:20:30
             compatible_brands: isommp42
           Duration: 00:00:07.18, start: 0.000000, bitrate: 21 kb/s
             Stream #0:0(eng): Audio: aac (LC) (mp4a / 0x6134706D), 44100 Hz, mono, fltp, 10 kb/s (default)
             Metadata:
               creation_time   : 2017-01-19 09:20:30
               handler_name    : SoundHandle
         Output #0, ipod, to '/storage/emulated/0/Music/Recordings/REC_20170119_162047.m4a':
           Metadata:
             com.android.version: 6.0
             major_brand     : mp42
             minor_version   : 0
             compatible_brands: isommp42
             encoder         : Lavf57.25.100
             Stream #0:0(eng): Audio: aac (LC) (mp4a / 0x6134706D), 44100 Hz, mono, 10 kb/s (default)
             Metadata:
               creation_time   : 2017-01-19 09:20:30
               handler_name    : SoundHandle
         Stream mapping:
           Stream #0:0 -> #0:0 (copy)
         Press [q] to stop, [?] for help
         size=      11kB time=00:00:07.17 bitrate=  13.1kbits/s speed= 230x    
         video:0kB audio:10kB subtitle:0kB other streams:0kB global headers:0kB muxing overhead: 20.181986%

    I also tried this command, but it still gives the same result:

    [-y, -i, /storage/emulated/0/Music/Recordings/REC_20170119_165440.m4a, -i, /storage/emulated/0/Music/Recordings/REC_20170119_165445.m4a, -acodec, aac, /storage/emulated/0/Music/Recordings/REC_20170119_165448.m4a]
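
    For reference, here is a minimal sketch of the same join done with ffmpeg's concat demuxer, which is the documented way to join files that share the same codec parameters; the concat protocol used above works at the byte level and does not merge MP4/M4A containers, which matches the "Found duplicated MOOV Atom" message in the log. The list file name list.txt and the output file name are placeholders, not taken from the post:

    # list.txt (placeholder name): one 'file' line per recording, in playback order
    file '/storage/emulated/0/Music/Recordings/REC_20170119_162023.m4a'
    file '/storage/emulated/0/Music/Recordings/REC_20170119_162042.m4a'

    # join without re-encoding; -safe 0 allows absolute paths in the list file
    ffmpeg -f concat -safe 0 -i list.txt -c copy /storage/emulated/0/Music/Recordings/joined.m4a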
  • VLC - Could someone assist me in improving latency when streaming to a web-based app?

    19 January 2017, by zyeek

    I have been looking for ways to stream an IP camera's feed to HTML5, which currently does not support RTSP easily.

    I want to view the camera's stream with as little delay as possible, and I was hoping someone could help me achieve that. I have been experimenting to get something workable, but at the moment the stream has a 5 s delay. It is smooth, but I would like to get it within 1-2 s if possible.

    My current setup takes the IP camera's RTSP stream, converts it to WebM, and streams it to a URL, which I then plan to embed elsewhere in a web app.


    What I would like to achieve

    Use a protocol that has low latency, with audio as well. WebM was used as a test, but I can't seem to find other commands that get a proper stream going.

    I would like to use DASH, but from what I have read FFmpeg does not currently support it. I was thinking RTMP might be good enough for now, being both low latency and HTML5 compatible. I am just unable to figure out how to get FFmpeg to transcode the RTSP stream to RTMP.
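
    For illustration, a rough sketch of what an RTSP-to-RTMP transcode with ffmpeg could look like, assuming an RTMP ingest server is already available; the rtmp://localhost/live/stream URL and the x264/AAC settings are placeholders, not taken from the post:

    # re-encode the camera's RTSP stream and push it to an RTMP ingest point (placeholder URL)
    ffmpeg -rtsp_transport tcp -i rtsp://admin:password@192.168.2.165:88/videoMain \
        -c:v libx264 -preset veryfast -tune zerolatency \
        -c:a aac -ar 44100 \
        -f flv rtmp://localhost/live/stream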


    SETUP:

    I am using ffserver and ffmpeg. Overall scope: pull the IP camera's stream and put it on a web app.

    The framework I am using is Meteor JS, so I am trying to avoid plugins or complex outside setups, as I want to be able to deploy this Meteor app on mobile devices as well. In short, I want to stay within the boundaries of what HTML5 can support.

    My current ffserver setup is in ffserver.conf (this was taken from a bunch of different places):

    HTTPPort 8090                      # Port to bind the server to
    HTTPBindAddress 0.0.0.0
    MaxHTTPConnections 2000
    MaxClients 1000
    MaxBandwidth 10000             # Maximum bandwidth per client
                                  # set this high enough to exceed stream bitrate
    CustomLog -

    <feed>
        File /tmp/feed.ffm
        FileMaxSize 100K
        ACL allow 127.0.0.1
    </feed>


    <stream>
        Format webm
        Feed feed.ffm
        NoAudio
        VideoCodec libvpx
        VideoFrameRate 24
        VideoBitRate 1024
        VideoSize 480x270
        VideoBufferSize 1024
        AVOptionVideo flags +global_header
        StartSendOnKey
    </stream>

    <stream>            # Server status URL
      Format status
      # Only allow local people to get the status
      ACL allow localhost
    </stream>

    <redirect>    # Just an URL redirect for index
      # Redirect index.html to the appropriate site
      URL url/
    </redirect>

    Works normally:

    ffserver version 3.2.2 Copyright (c) 2000-2016 the FFmpeg developers
     built with Apple LLVM version 5.1 (clang-503.0.40) (based on LLVM 3.4svn)
     configuration: --prefix=/usr/local/Cellar/ffmpeg/3.2.2 --enable-shared --enable-pthreads --enable-gpl --enable-version3 --enable-hardcoded-tables --enable-avresample --cc=clang --host-cflags= --host-ldflags= --enable-ffplay --enable-frei0r --enable-libass --enable-libfdk-aac --enable-libfreetype --enable-libmp3lame --enable-libopencore-amrnb --enable-libopencore-amrwb --enable-libopus --enable-librtmp --enable-libschroedinger --enable-libspeex --enable-libtheora --enable-libvorbis --enable-libvpx --enable-libx264 --enable-libxvid --enable-opencl --disable-lzma --enable-libopenjpeg --disable-decoder=jpeg2000 --extra-cflags=-I/usr/local/Cellar/openjpeg/2.1.2/include/openjpeg-2.1 --enable-nonfree --enable-vda
     libavutil      55. 34.100 / 55. 34.100
     libavcodec     57. 64.101 / 57. 64.101
     libavformat    57. 56.100 / 57. 56.100
     libavdevice    57.  1.100 / 57.  1.100
     libavfilter     6. 65.100 /  6. 65.100
     libavresample   3.  1.  0 /  3.  1.  0
     libswscale      4.  2.100 /  4.  2.100
     libswresample   2.  3.100 /  2.  3.100
     libpostproc    54.  1.100 / 54.  1.100
    /etc/ffserver.conf:27: Setting default value for video bit rate tolerance = 256000. Use NoDefaults to disable it.
    /etc/ffserver.conf:27: Setting default value for video rate control equation = tex^qComp. Use NoDefaults to disable it.
    /etc/ffserver.conf:27: Setting default value for video max rate = 2048000. Use NoDefaults to disable it.
    Wed Jan 18 17:04:30 2017 FFserver started.

    Now I give life to the feed with ffmpeg. The command I use:

    ffmpeg -vsync 2 -i rtsp://admin:password@192.168.2.165:88/videoMain -map 0 http://localhost:8090/feed.ffm

    which gives the result:

    ffmpeg version 3.2.2 Copyright (c) 2000-2016 the FFmpeg developers
     built with Apple LLVM version 5.1 (clang-503.0.40) (based on LLVM 3.4svn)
     configuration: --prefix=/usr/local/Cellar/ffmpeg/3.2.2 --enable-shared --enable-pthreads --enable-gpl --enable-version3 --enable-hardcoded-tables --enable-avresample --cc=clang --host-cflags= --host-ldflags= --enable-ffplay --enable-frei0r --enable-libass --enable-libfdk-aac --enable-libfreetype --enable-libmp3lame --enable-libopencore-amrnb --enable-libopencore-amrwb --enable-libopus --enable-librtmp --enable-libschroedinger --enable-libspeex --enable-libtheora --enable-libvorbis --enable-libvpx --enable-libx264 --enable-libxvid --enable-opencl --disable-lzma --enable-libopenjpeg --disable-decoder=jpeg2000 --extra-cflags=-I/usr/local/Cellar/openjpeg/2.1.2/include/openjpeg-2.1 --enable-nonfree --enable-vda
     libavutil      55. 34.100 / 55. 34.100
     libavcodec     57. 64.101 / 57. 64.101
     libavformat    57. 56.100 / 57. 56.100
     libavdevice    57.  1.100 / 57.  1.100
     libavfilter     6. 65.100 /  6. 65.100
     libavresample   3.  1.  0 /  3.  1.  0
     libswscale      4.  2.100 /  4.  2.100
     libswresample   2.  3.100 /  2.  3.100
     libpostproc    54.  1.100 / 54.  1.100
    Guessed Channel Layout for Input Stream #0.1 : mono
    Input #0, rtsp, from 'rtsp://admin:password@192.168.2.165:88/videoMain':
     Metadata:
       title           : IP Camera Video
       comment         : videoMain
     Duration: N/A, start: 0.000000, bitrate: N/A
       Stream #0:0: Video: h264 (Main), yuv420p(progressive), 1280x720, 90k tbr, 90k tbn, 180k tbc
       Stream #0:1: Audio: pcm_mulaw, 8000 Hz, mono, s16, 64 kb/s
    [libvpx @ 0x7fd58184a600] v1.6.0
    Output #0, ffm, to 'http://localhost:8090/feed.ffm':
     Metadata:
       title           : IP Camera Video
       comment         : videoMain
       creation_time   : now
       encoder         : Lavf57.56.100
       Stream #0:0: Video: vp8 (libvpx), yuv420p, 480x270, q=-1--1, 1024 kb/s, 90k fps, 1000k tbn, 24 tbc
       Metadata:
         encoder         : Lavc57.64.101 libvpx
       Side data:
         cpb: bitrate max/min/avg: 0/0/0 buffer size: 8388608 vbv_delay: -1
    Stream mapping:
     Stream #0:0 -> #0:0 (h264 (native) -> vp8 (libvpx))
    Press [q] to stop, [?] for help
    [rtsp @ 0x7fd581000000] max delay reached. need to consume packet
    [rtsp @ 0x7fd581000000] RTP: missed 5 packets
    [h264 @ 0x7fd5818ae800] Increasing reorder buffer to 1
    frame=  139 fps= 18 q=0.0 Lsize=     440kB time=00:00:09.25 bitrate= 389.7kbits/s speed=1.19x    
    video:429kB audio:0kB subtitle:0kB other streams:0kB global headers:0kB muxing overhead: 2.663893%
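
    As a closing note on the latency goal above, libvpx in ffmpeg exposes options that trade quality for encoding delay (-deadline realtime, -cpu-used, -lag-in-frames 0, a short -g keyframe interval). Whether they take effect when ffserver dictates the feed's encoding parameters is an assumption worth testing; a hedged variant of the feeder command:

    # same feed as above, with libvpx options aimed at lower encoding delay (assumed to pass through)
    ffmpeg -rtsp_transport tcp -i rtsp://admin:password@192.168.2.165:88/videoMain \
        -c:v libvpx -deadline realtime -cpu-used 8 -lag-in-frames 0 -g 24 \
        -map 0:v http://localhost:8090/feed.ffm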
  • Evolution #3964: minimum styling of forms

    26 March 2018, by cam.lafit -

    Hi

    Note: this email dates from June 2017, lost in the outbox

    The screenshots are appealing :)

    On my side, I had done a bit of research on the question, looking for
    ready-made CSS kits; I had found these (worth seeing whether they can
    provide some ideas to borrow):

    https://purecss.io/forms/
    http://formanizr.firchow.net/index.php