
Media (91)

Other articles (14)

  • General document management

    13 May 2011, by

    MédiaSPIP never modifies the original document that is uploaded.
    For each uploaded document it performs two successive operations: creating an additional version that can easily be viewed online, while keeping the original available for download in case it cannot be read in a web browser; and retrieving the original document's metadata in order to describe the file textually.
    The tables below explain what MédiaSPIP can do (...)

  • Videos

    21 April 2011, by

    Like "audio" documents, MediaSPIP displays videos whenever possible using the HTML5 video tag.
    One drawback of this tag is that it is not recognised correctly by some browsers (Internet Explorer, not to name names) and that each browser natively supports only certain video formats.
    Its main advantage, on the other hand, is that video playback is supported natively by browsers, making it possible to do without Flash and (...)

  • Accepted formats

    28 January 2010, by

    The following commands give information about the formats and codecs supported by the local ffmpeg installation (a filtered example is sketched after this excerpt):
    ffmpeg -codecs
    ffmpeg -formats
    Accepted input video formats
    This list is not exhaustive; it highlights the main formats in use: h264: H.264 / AVC / MPEG-4 AVC / MPEG-4 part 10; m4v: raw MPEG-4 video format; flv: Flash Video (FLV) / Sorenson Spark / Sorenson H.263; Theora; wmv:
    Possible output video formats
    To begin with, we (...)
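
    As a quick illustration (my own sketch, not part of the article): the output of those two commands is long, so it is usually piped through grep to check for one specific codec or container.

    ffmpeg -codecs 2>/dev/null | grep -i h264    # is an H.264 encoder/decoder available?
    ffmpeg -formats 2>/dev/null | grep -i mp4    # which mp4-related (de)muxers are built in?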

On other sites (2190)

  • Cannot get JACK Audio/Netjack working over LAN

    23 June 2020, by James

    I'm trying to stream low-latency audio between two Raspberry Pis. Both GStreamer and FFmpeg induce 2+ second delays for me.

    



    I've played around with JACK Audio, and locally, on a single Pi, it seems promising: I can route mic input to a speaker and it is almost instantaneous.

    



    However, I have been having trouble getting it to route between devices using Netjack.

    



    # ON SERVER
jackd -P70 -p16 -t2000 -dalsa -dhw:1 -p128 -n3 -r44100 -s 

# ON CLIENT
jackd -v -R -P70 -dnetone -i1 -o1 -I0 -O0  -r44100 -p128 -n3

# ON SERVER
jack_netsource -H < ip address of client >
jack_lsp # list available connection ports

>system:capture_1
>system:playback_1
>system:playback_2
>netjack:capture_1
>netjack:capture_2
>netjack:capture_3
>netjack:playback_1
>netjack:playback_2
>netjack:playback_3

jack_connect system:capture_1 system:playback_1 # this works
jack_connect system:capture_1 netjack:playback_1 # this doesn't work :(


    



    Most of the launch options I pulled from http://wiki.linuxaudio.org/wiki/raspberrypi#using_jack. I'll be honest, I don't really know what they do.
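
    For reference, here is a rough gloss of those options based on my reading of jackd(1); treat it as an assumption rather than something verified on these exact builds:

# Rough gloss of the flags used above (my reading of jackd(1), not verified):
# -P70           realtime scheduling priority 70
# -p16           (before the backend) maximum number of JACK ports
# -t2000         client timeout in milliseconds
# -dalsa -dhw:1  use the ALSA backend with device hw:1
# -p128 -n3      128 frames per period, 3 periods per buffer
# -r44100        sample rate of 44100 Hz
# -s             soft mode: do not kick clients on xruns
# -R -v          realtime mode, verbose logging
# -dnetone       use the netone network backend; -i/-o set the number of audio
#                capture/playback channels, -I/-O the number of MIDI channels
# jack_netsource -H <host>   bridges the local JACK graph to the netone
#                backend running on <host>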

    



    The client jackd output shows messages like

    



    Jack: data not valid
Jack: data not valid
Jack: JackSocketServerChannel::Execute : fPollTable i = 1 fd = 6
Jack: JackRequest::Notification
Jack: JackEngine::ClientNotify: no callback for notification = 3
Jack: JackEngine::ClientNotify: no callback for notification = 3
netxruns... duration: 139ms
Jack: JackSocketServerChannel::Execute : fPollTable i = 1 fd = 6
Jack: JackRequest::Notification
Jack: JackEngine::ClientNotify: no callback for notification = 3
Jack: JackEngine::ClientNotify: no callback for notification = 3


    



    And the server jack_netsource output looks like

    



    current latency 114
current latency 20
current latency 27
current latency 29
current latency 48
current latency 23
current latency 33
current latency 28
current latency 41
current latency 84
current latency 44


    



    and the server jackd output looks like

    



    JackAudioDriver::ProcessGraphAsyncMaster: Process error
JackAudioDriver::ProcessGraphAsyncMaster: Process error
JackAudioDriver::ProcessGraphAsyncMaster: Process error
JackAudioDriver::ProcessGraphAsyncMaster: Process error
JackEngine::XRun: client = netjack was not finished, state = Triggered
JackAudioDriver::ProcessGraphAsyncMaster: Process error
JackAudioDriver::ProcessGraphAsyncMaster: Process error
JackEngine::XRun: client = netjack was not finished, state = Triggered
JackEngine::XRun: client = netjack was not finished, state = Triggered


    



    I believe the -dnetone flag means NetJack2 is used. NetJack1, which I've tried with the -dnet flag, results in a single "Not Connected" message from jack_netsource and:

    



    Jack: CatchHost fd = 5 err = Resource temporarily unavailable
Jack: CatchHost fd = 5 err = Resource temporarily unavailable
Jack: CatchHost fd = 5 err = Resource temporarily unavailable
Jack: CatchHost fd = 5 err = Resource temporarily unavailable
Jack: CatchHost fd = 5 err = Resource temporarily unavailable
Jack: JackSocketServerChannel::Execute : fPollTable i = 1 fd = 6


    



    from the client jackd.

    


  • How do I know ffmpeg-php is installed?

    18 July 2014, by Rob Avery IV

    I just followed the instructions from this link on how to install ffmpeg-php on my dedicated server: http://www.ndchost.com/wiki/server-administration/install-ffmpeg

    At the bottom, it says to run the command php -i | grep ffmpeg, and that if it outputs the following lines then ffmpeg-php is installed:

    ffmpegffmpeg support (ffmpeg-php) => enabled
    ffmpeg-php version => 0.6.0
    ffmpeg.allow_persistent => 0 => 0

    When I run it, it gives me this:

    ffmpeg
    ffmpeg-php version => 0.6.0-svn
    ffmpeg-php built on => Jul 18 2014 08:46:12
    ffmpeg-php gd support  => enabled
    ffmpeg libavcodec version => Lavc52.108.0
    ffmpeg libavformat version => Lavf52.93.0
    ffmpeg swscaler version => SwS0.12.0
    ffmpeg.allow_persistent => 0 => 0
    ffmpeg.show_warnings => 0 => 0
    PWD => /usr/local/src/ffmpeg-php-0.6.0
    _SERVER["PWD"] => /usr/local/src/ffmpeg-php-0.6.0
    _ENV["PWD"] => /usr/local/src/ffmpeg-php-0.6.0

    I got two of the three lines, but one of them is not character-for-character the same.

    Is ffmpeg the same as ffmpegffmpeg support (ffmpeg-php) => enabled in this context?
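
    Another quick check, assuming the CLI uses the same php.ini as above (this is my own suggestion, not from the tutorial, and it assumes the extension registers itself under the name "ffmpeg"), is to ask PHP directly whether the extension is loaded:

php -m | grep -i ffmpeg                           # lists the module name if it is loaded
php -r 'var_dump(extension_loaded("ffmpeg"));'    # should print bool(true)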

    EDIT:
    Running the command ffmpeg -version gives me this result:

    FFmpeg version SVN-r26402, Copyright (c) 2000-2011 the FFmpeg developers
     built on Jul 18 2014 08:41:45 with gcc 4.4.7 20120313 (Red Hat 4.4.7-3)
     configuration: --enable-libmp3lame --disable-mmx --enable-shared
     libavutil     50.36. 0 / 50.36. 0
     libavcore      0.16. 1 /  0.16. 1
     libavcodec    52.108. 0 / 52.108. 0
     libavformat   52.93. 0 / 52.93. 0
     libavdevice   52. 2. 3 / 52. 2. 3
     libavfilter    1.74. 0 /  1.74. 0
     libswscale     0.12. 0 /  0.12. 0
    FFmpeg SVN-r26402
    libavutil     50.36. 0 / 50.36. 0
    libavcore      0.16. 1 /  0.16. 1
    libavcodec    52.108. 0 / 52.108. 0
    libavformat   52.93. 0 / 52.93. 0
    libavdevice   52. 2. 3 / 52. 2. 3
    libavfilter    1.74. 0 /  1.74. 0
     libswscale     0.12. 0 /  0.12. 0

  • FFMPEG or FFPLAY, catch FFT signal in real time as floats

    25 April 2021, by NVRM

    Looking to extract, in real time, an FFT snapshot of the waveform data with ffplay, with a view to creating animations.

    


    This is exactly what I am looking to catch, but this demo is using JavaScript in a browser. (Source: own post)

    


    

    

    const audio = document.getElementById('music');
audio.load();
audio.play();

const ctx = new AudioContext();
const audioSrc = ctx.createMediaElementSource(audio);
const analyser = ctx.createAnalyser();

audioSrc.connect(analyser);
analyser.connect(ctx.destination);

analyser.fftSize = 256;
const bufferLength = analyser.frequencyBinCount;
const frequencyData = new Uint8Array(bufferLength);

setInterval(() => {
   analyser.getByteFrequencyData(frequencyData);
   console.log(frequencyData);
}, 1000);

    


    <audio id="music" src="http://strm112.1.fm/reggae_mobile_mp3" crossorigin="use-credentials" controls="true"></audio>


    I tried many variations around the method posted on https://trac.ffmpeg.org/wiki/Waveform.


    The problem is that the output format there is PCM (Pulse Code Modulation), and it is not real time.


    More generally, is there a simple way to retrieve this data while the sound is playing? Something like:


    ffplay -fft file.mp3 > fft.json
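
    As far as I know ffplay has no such flag, but a rough workaround (my own sketch; fft_consumer is a hypothetical program, not an ffmpeg tool) is to let ffmpeg decode to raw float PCM on stdout, paced at playback speed, and compute the FFT in a separate process reading that pipe:

# -re paces decoding at real-time speed; -f f32le emits raw 32-bit float samples
ffmpeg -re -i file.mp3 -f f32le -ac 1 -ar 44100 pipe:1 | ./fft_consumer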


    The same thing in C: Apply FFT on pcm data and convert to a spectrogram


    FFMPEG waveform filter documentation
