Advanced search

Media (91)

Other articles (43)

  • Customising by adding your logo, banner or background image

    5 September 2013, by

    Some themes support three customisation elements: adding a logo; adding a banner; adding a background image.

  • Writing a news item

    21 June 2013, by

    Present the changes in your MédiaSPIP, or news about your projects on your MédiaSPIP, using the news section.
    In MédiaSPIP's default theme, spipeo, news items are displayed at the bottom of the main page, below the editorials.
    You can customise the news item creation form.
    News item creation form In the case of a document of type news item, the default fields are: Publication date (customise the publication date) (...)

  • Publishing on MédiaSpip

    13 June 2013

    Can I post content from an iPad tablet?
    Yes, if your installed MédiaSpip is at version 0.2 or higher. If needed, contact your MédiaSpip administrator to find out.

On other sites (6322)

  • Revision 32884: authors in the summaries (home page + sections)

    13 November 2009, by fil@… — Log

    authors in the summaries (home page + sections)

  • FFmpeg : stream audio playlist, normalize loudness and generate spectrogram and waveform

    23 February 2021, by watain

    I would like to use FFmpeg to stream a playlist containing multiple audio files (mainly FLAC and MP3). During playback, I would like FFmpeg to normalize the loudness of the audio signal and to generate a spectrogram and a waveform of the signal separately. The spectrogram and waveform should serve as an audio stream monitor. The final audio stream, spectrogram and waveform outputs will be sent to a browser, which plays the audio stream and continuously renders the spectrogram and waveform "images". I would also like to be able to add and remove audio files from the playlist during playback.

    As a first step, I would like to achieve the desired result with the ffmpeg command, before trying to write code that does the same programmatically.

    (Side note: I've discovered libgroove, which basically does what I want, but I would like to understand the FFmpeg internals and write my own piece of software. The target language is Go, and either the goav or the go-libav library might do the job. However, I might end up writing the code in C and creating Go bindings for it, instead of relying on one of the named libraries.)

    Here's a little overview:

    playlist (input) --> loudnorm --> split --> spectrogram --> separate output
                                        |
                                        +----> waveform -----> separate output
                                        |
                                        +----> encode -------> audio stream output

    For the loudness normalization, I intend to use the loudnorm filter, which implements the EBU R128 algorithm.
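
    As a minimal, untested sketch of the filter in isolation (file names and target values are placeholders), single-pass dynamic normalization would look like:

```shell
# Single-pass loudnorm: normalize to an integrated loudness of -16 LUFS,
# true peak -1.5 dBTP, loudness range 11 LU (illustrative values).
# loudnorm internally resamples to 192 kHz, hence the explicit -ar.
ffmpeg -i input.flac -af loudnorm=I=-16:TP=-1.5:LRA=11 -ar 48000 normalized.flac
```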

    For the spectrogram, I intend to use the showspectrum or showspectrumpic filter. Since I want the spectrogram to be "streamable", I'm not really sure how to do this. Maybe there's a way to output segments step by step? Or maybe there's a way to output some sort of representation (JSON or any other format) step by step?
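
    One possibility (a sketch, not verified for this use case): unlike showspectrumpic, showspectrum emits a continuous video stream, which can be encoded and segmented step by step, e.g. with the HLS muxer:

```shell
# Scrolling spectrogram rendered as a live video stream, segmented for
# the browser; sizes, rates and paths are illustrative.
ffmpeg -re -i input.flac \
  -lavfi "showspectrum=size=640x518:mode=combined:slide=scroll" \
  -c:v libx264 -preset veryfast -an \
  -f hls -hls_time 2 -hls_flags delete_segments spectrogram.m3u8
```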

    For the waveform, I intend to use the showwaves or showwavespic filter. The same applies here as for the spectrogram: the output should be "streamable".
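
    Assuming showwaves in place of showwavespic, the same delivery approach could work (again an untested sketch):

```shell
# Continuous waveform video, delivered the same way; parameters illustrative.
ffmpeg -re -i input.flac \
  -lavfi "showwaves=size=1280x202:mode=line:rate=25" \
  -c:v libx264 -preset veryfast -an \
  -f hls -hls_time 2 -hls_flags delete_segments waveform.m3u8
```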

    I'm having a little trouble achieving what I want with the ffmpeg command. Here's what I have so far:

    ffmpeg \
    -re -i input.flac \
    -filter_complex "
      [0:a] loudnorm [ln]; \
      [ln] asplit [a][b]; \
      [a] showspectrumpic=size=640x518:mode=combined [ss]; \
      [b] showwavespic=size=1280x202 [sw]
    " \
    -map '[ln]' -map '[ss]' -map '[sw]' \
    -f tee \
    -acodec libmp3lame -ab 128k -ac 2 -ar 44100 \
    "
      [aselect='ln'] rtp://127.0.0.1:1234 | \
      [aselect='ss'] ss.png | \
      [aselect='sw'] sw.png
    "

    Currently, I get the following error:

    Output with label 'ln' does not exist in any defined filter graph, or was already used elsewhere.

    Also, I'm not sure whether aselect is the correct functionality to use. Any hints?
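
    One hedged rework of the command above (untested): the error appears because [ln] is consumed by asplit and can no longer be mapped; splitting the normalized audio three ways and giving each branch its own output avoids both tee and aselect:

```shell
# Split the normalized audio into three branches and map each to its own
# output; labels, sizes and addresses mirror the attempt above.
ffmpeg -re -i input.flac \
  -filter_complex "[0:a] loudnorm, asplit=3 [audio][spec][wave]; \
    [spec] showspectrum=size=640x518:mode=combined:slide=scroll [ss]; \
    [wave] showwaves=size=1280x202:mode=line:rate=25 [sw]" \
  -map '[audio]' -c:a libmp3lame -b:a 128k -ac 2 -ar 44100 -f rtp rtp://127.0.0.1:1234 \
  -map '[ss]' -c:v libx264 -f hls ss.m3u8 \
  -map '[sw]' -c:v libx264 -f hls sw.m3u8
```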

  • Fragmented mp4 file is not played by MSE

    19 May 2020, by Daniel

    I created a fragmented mp4 file with ffmpeg (from h264), and removed the first 6 moof and mdat pairs.
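
    For reference, a typical ffmpeg invocation producing such a fragmented MP4 (file names are placeholders) is:

```shell
# Copy the H.264 stream into a fragmented MP4: an empty moov up front,
# then moof/mdat pairs cut at keyframes.
ffmpeg -i input.h264 -c:v copy \
  -movflags frag_keyframe+empty_moov+default_base_moof fragmented.mp4
```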

    So now it still has the correct order of boxes: ftyp, moov, moof, mdat, moof, mdat, ..., but the first moof box has a sequenceNumber of 7.

    VLC plays it fine; 'Movies & TV' can also play it, but the first few seconds are black.

    If I drag the file into the browser, it also plays fine.

    However, it is not displayed at all in the browser (Chrome) if I feed it via MSE.

    No error messages are printed, and the media-internals log shows that the video player starts playing within the first second and suspends only at the 18th second:

    Timestamp   Property    Value
00:00:00.000    origin_url  "https://localhost:8443/"
00:00:00.000    kFrameUrl   "https://localhost:8443/websocket/videodemo.html"
00:00:00.000    kFrameTitle "WebSocket and MSE demo"
00:00:00.000    url "blob:https://localhost:8443/3b4d4b1a-7c08-4136-95fe-dabc14fba95f"
00:00:00.000    info    "ChunkDemuxer"
00:00:00.000    pipeline_state  "kStarting"
00:00:01.067    kVideoTracks    [{"alpha mode":"is_opaque","codec":"h264","coded size":"1600x900","color space":"{primaries:BT709, transfer:BT709, matrix:BT709, range:LIMITED}","encryption scheme":"Unencrypted","flipped":false,"has_extra_data":false,"natural size":"1600x900","profile":"h264 main","rotation":"0°","visible rect":"0,0 1600x900"}]
00:00:01.067    debug   "Video rendering in low delay mode."
00:00:01.070    info    "Using D3D11 device for DXVA"
00:00:01.075    kIsVideoDecryptingDemuxerStream false
00:00:01.075    kVideoDecoderName   "MojoVideoDecoder"
00:00:01.075    kIsPlatformVideoDecoder true
00:00:01.075    info    "Selected MojoVideoDecoder for video decoding, config: codec: h264, profile: h264 main, alpha_mode: is_opaque, coded size: [1600,900], visible rect: [0,0,1600,900], natural size: [1600,900], has extra data: false, encryption scheme: Unencrypted, rotation: 0°, flipped: 0, color space: {primaries:BT709, transfer:BT709, matrix:BT709, range:LIMITED}"
00:00:01.075    pipeline_state  "kPlaying"
00:00:01.067    duration    "unknown"
00:00:18.926    pipeline_state  "kSuspending"
00:00:18.926    pipeline_state  "kSuspended"
00:00:18.927    event   "SUSPENDED"

    Here is the video file for reference.

    What is the problem with this file, and why is it not displayed in the browser via MSE?