
Media (91)
-
MediaSPIP Simple: future default graphic theme?
26 September 2013, by
Updated: October 2013
Language: French
Type: Video
-
with chosen
13 September 2013, by
Updated: September 2013
Language: French
Type: Image
-
without chosen
13 September 2013, by
Updated: September 2013
Language: French
Type: Image
-
chosen config
13 September 2013, by
Updated: September 2013
Language: French
Type: Image
-
SPIP - plugins - embed code - Example
2 September 2013, by
Updated: September 2013
Language: French
Type: Image
-
GetID3 - File information block
9 April 2013, by
Updated: May 2013
Language: French
Type: Image
Other articles (25)
-
Selection of projects using MediaSPIP
2 May 2011, by
The examples below are representative of specific uses of MediaSPIP for particular projects.
MediaSPIP farm @ Infini
The non-profit organization Infini develops hospitality activities, an internet access point, training, and innovative projects in the field of information and communication technologies, and hosts websites. It plays a unique and prominent role in the Brest (France) area and at the national level, among the half-dozen such associations. Its members (...) -
Keeping control of your media in your hands
13 April 2011, by
The vocabulary used on this site, and around MediaSPIP in general, aims to avoid references to Web 2.0 and the companies that profit from media sharing.
While using MediaSPIP, you are invited to avoid using words like "Brand", "Cloud" and "Market".
MediaSPIP is designed to facilitate the sharing of creative media online, while allowing authors to retain complete control of their work.
MediaSPIP aims to be accessible to as many people as possible and development is based on expanding the (...) -
Selection of projects using MediaSPIP
29 April 2011, by
The examples cited below are representative of specific uses of MediaSPIP for certain projects.
Do you think you have built a "remarkable" site with MediaSPIP? Let us know here.
MediaSPIP farm @ Infini
The Infini association develops hospitality activities, an internet access point, training, innovative projects in the field of information and communication technologies, and website hosting. It plays a unique role in this regard (...)
On other sites (4428)
-
I received a "Connection refused" error while trying to stream live video through RTMP with FFmpeg
25 September 2020, by Femzy
I am working on a Node.js app that can send a camera stream to third-party platforms such as Facebook and YouTube using the RTMP protocol. It works well on my localhost, but once I deploy it to the server, it only gives me errors. The error I get is shown below.
Here is my code:


server.js




const child_process = require('child_process'); // To be used later for running FFmpeg
const express = require('express');
const http = require('http');
const WebSocketServer = require('ws').Server;

const app = express();
const server = http.createServer(app).listen(4000, () => {
 console.log('Listening...');
});

// Serve static files out of the www directory, where we will put our HTML page
app.use(express.static(__dirname + '/www'));


const wss = new WebSocketServer({
 server: server
});
wss.on('connection', (ws, req) => {
 
 
 
 const rtmpUrl = 'rtmp://a.rtmp.youtube.com/live2/MyStreamId';
 console.log('Target RTMP URL:', rtmpUrl);
 
 // Launch FFmpeg to handle all appropriate transcoding, muxing, and RTMP.
 // If 'ffmpeg' isn't in your path, specify the full path to the ffmpeg binary.
 const ffmpeg = child_process.spawn('ffmpeg', [
 // Facebook requires an audio track, so we create a silent one here.
 // Remove this line, as well as `-shortest`, if you send audio from the browser.
 //'-f', 'lavfi', '-i', 'anullsrc',
 
 // FFmpeg will read input video from STDIN
 '-i', '-',
 
 // Because we're using a generated audio source which never ends,
 // specify that we'll stop at end of other input. Remove this line if you
 // send audio from the browser.
 //'-shortest',
 
 // If we're encoding H.264 in-browser, we can set the video codec to 'copy'
 // so that we don't waste any CPU and quality with unnecessary transcoding.
 // If the browser doesn't support H.264, set the video codec to 'libx264'
 // or similar to transcode it to H.264 here on the server.
 '-vcodec', 'copy',
 
 // AAC audio is required for Facebook Live. No browser currently supports
 // encoding AAC, so we must transcode the audio to AAC here on the server.
 '-acodec', 'aac',
 
 // FLV is the container format used in conjunction with RTMP
 '-f', 'flv',
 
 // The output RTMP URL.
 // For debugging, you could set this to a filename like 'test.flv', and play
 // the resulting file with VLC. Please also read the security considerations
 // later on in this tutorial.
 rtmpUrl 
 ]);
 
 // If FFmpeg stops for any reason, close the WebSocket connection.
 ffmpeg.on('close', (code, signal) => {
 console.log('FFmpeg child process closed, code ' + code + ', signal ' + signal);
 ws.terminate();
 });
 
 // Handle STDIN pipe errors by logging to the console.
 // These errors most commonly occur when FFmpeg closes and there is still
 // data to write. If left unhandled, the server will crash.
 ffmpeg.stdin.on('error', (e) => {
 console.log('FFmpeg STDIN Error', e);
 });
 
 // FFmpeg outputs all of its messages to STDERR. Let's log them to the console.
 ffmpeg.stderr.on('data', (data) => {
 console.log('FFmpeg STDERR:', data.toString());
 });

 // When data comes in from the WebSocket, write it to FFmpeg's STDIN.
 ws.on('message', (msg) => {
 console.log('DATA', msg);
 ffmpeg.stdin.write(msg);
 });
 
 // If the client disconnects, stop FFmpeg.
 ws.on('close', (e) => {
 ffmpeg.kill('SIGINT');
 });
 
});







In the server.js file, I create a WebSocket server to receive stream data from the client side and then use FFmpeg to send that data on to YouTube via the RTMP URL.
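A way to isolate the FFmpeg-to-RTMP leg from the browser and WebSocket parts is to run FFmpeg by hand on the server with generated test input. The command below is only a sketch: it reuses the silent anullsrc source mentioned in the comments above, adds FFmpeg's built-in testsrc video source, and assumes the same placeholder stream key.

ffmpeg -f lavfi -i anullsrc -f lavfi -i testsrc=size=640x360:rate=30 \
 -vcodec libx264 -pix_fmt yuv420p -acodec aac -f flv -t 10 \
 rtmp://a.rtmp.youtube.com/live2/MyStreamId

If this command fails with the same "Connection refused" error, the problem is independent of Node and the proxy setup.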


Here is my client.js code




const ws = new WebSocket(
 'wss://my-websocket-server.com'

 );
 ws.addEventListener('open', (e) => {
 console.log('WebSocket Open', e);
 drawVideosToCanvas();
 mediaStream = getMixedVideoStream(); // 30 FPS
 mediaRecorder = new MediaRecorder(mediaStream, {
 mimeType: 'video/webm;codecs=h264',
 //videoBitsPerSecond : 3000000000
 bitsPerSecond: 6000000
 });

 mediaRecorder.addEventListener('dataavailable', (e) => {
 ws.send(e.data);
 });
 mediaRecorder.onstop = function() {
 ws.close.bind(ws);
 isRecording = false;
 actionBtn.textContent = 'Start Streaming';
 actionBtn.onclick = startRecording;
 }
 mediaRecorder.onstart = function() {
 isRecording = true;
 actionBtn.textContent = 'Stop Streaming';
 actionBtn.onclick = stopRecording;
 screenShareBtn.onclick = startSharing;
 screenShareBtn.disabled = false;
 }
 //mediaRecorder.addEventListener('stop', ws.close.bind(ws));

 mediaRecorder.start(1000); // Start recording, and dump data every second

 });







In my client.js file, I capture the user's camera and then open the WebSocket connection to send the data to the server. Everything works fine on localhost except when I deploy it to the live server.
I am wondering if there is a bad configuration on the server. The server is CentOS 7.8 and the app runs behind Apache.
Here is how I configured the virtual host for the WebSocket domain:




ServerName my-websocket.com

 RewriteEngine on
 RewriteCond %{HTTP:Upgrade} websocket [NC]
 RewriteCond %{HTTP:Connection} upgrade [NC]
 RewriteRule .* "ws://127.0.0.1:3000/$1" [P,L]

 ProxyPass "/" "http://127.0.0.1:3000/$1"
 ProxyPassReverse "/" "http://127.0.0.1:3000/$1"
 ProxyRequests off







I don't know much about server configuration, but I thought maybe the configuration has something to do with why FFmpeg cannot open a connection over the RTMP protocol from the server.


Here is the error I am getting:




FFmpeg STDERR: Input #0, lavfi, from 'anullsrc':
 Duration:
FFmpeg STDERR: N/A, start: 0.000000, bitrate: 705 kb/s
 Stream #0:0: Audio: pcm_u8, 44100 Hz, stereo, u8, 705 kb/s

DATA <Buffer 1a>
DATA <Buffer 45 df a3 42 86 81 01 f7 f2 04 f3 08 82 88 6d 61 74 72 6f 73 6b 87 04 42 85 02 18 53 80 67 ff ... 53991 more bytes>
DATA <Buffer 40 c1 81 00 f0 80 7b 83 3e 3b 07 d6 4e 1c 11 b4 7f cb 5e 68 9b d5 2a e3 06 c6 f3 94 ff 29 16 b2 60 04 ac 37 fb 1a 15 ea 39 a0 cd 02 b8 ... 56206 more bytes>
FFmpeg STDERR: Input #1, matroska,webm, from 'pipe:':
 Metadata:
 encoder :
FFmpeg STDERR: Chrome
 Duration: N/A, start: 0.000000, bitrate: N/A
 Stream #1:0(eng): Audio: opus, 48000 Hz, mono, fltp (default)
 Stream #1:1(eng): Video: h264 (Constrained Baseline), yuv420p(progressive), 1366x768, SAR 1:1 DAR 683:384, 30.30 fps, 30 tbr, 1k tbn, 60 tbc (default)

FFmpeg STDERR: [tcp @ 0xe5fac0] Connection to tcp://a.rtmp.youtube.com:1935 failed (Connection refused), trying next address
[rtmp @ 0xe0fb80] Cannot open connection tcp://a.rtmp.youtube.com:1935

FFmpeg STDERR: rtmp://a.rtmp.youtube.com/live2/mystreamid: Network is unreachable

FFmpeg child process closed, code 1, signal null







I would really appreciate some insight into what may be causing this issue or what I can do to solve it. Thanks in advance.
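Since the log ends with "Connection refused" and "Network is unreachable" for tcp://a.rtmp.youtube.com:1935, it is worth checking whether the CentOS host can open any outbound connection to that port at all. A minimal sketch, assuming nc (from the nmap-ncat package) and the default firewalld setup are present on the server:

nc -vz a.rtmp.youtube.com 1935
sudo firewall-cmd --state
sudo firewall-cmd --list-all
ip route show

If nc cannot connect either, outbound filtering at the host or hosting-provider level is the more likely cause than the Apache/WebSocket proxy configuration.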


-
Join us at MatomoCamp 2024 world tour edition
13 November 2024, by Daniel Crough — Uncategorized -
Compiling FFmpeg for macOS High Sierra 10.13
29 July 2021, by Martin
Hello, I made an Electron app that uses FFmpeg to combine audio and render video. It works fine on Windows, Linux, and modern macOS computers, but a user has reported to me that on an older version of macOS, such as High Sierra 10.13, the way that I have set up FFmpeg does not work.


I have a virtual machine with High Sierra v10.13 where I install
RenderTune-mac.dmg
from my RenderTune releases page, then I download two audio files and the image from this link. I open RenderTune and try to render a video. My command to combine the audio files into a single MP3 works fine, but when I try to combine that MP3 with the image file, the FFmpeg build I have packaged with my Electron app fails with this error:

Command was killed with SIGABRT (Aborted): /Applications/RenderTune.app/Contents/Resources/ffmpeg -loop 1 -framerate 2 -i /Users/martin/Downloads/R-3777978-1344032418-8379.jpeg.jpg -i /Users/martin/Downloads/output-871140.mp3 -y -acodec copy -b:a 320k -vcodec libx264 -b:v 8000k -maxrate 8000k -minrate 8000k -bufsize 3M -filter:v scale=w=1920:h=1954 -preset medium -tune stillimage -crf 18 -pix_fmt yuv420p -shortest /Users/martin/Downloads/concatVideo-871140.mp4
ffmpeg version git-2021-03-24-13335df Copyright (c) 2000-2021 the FFmpeg developers
 built with Apple LLVM version 10.0.1 (clang-1001.0.46.4)
 configuration: --pkgconfigdir=/Users/martinbarker/Documents/projects/rendertune-0.5.0/workspace/lib/pkgconfig --prefix=/Users/martinbarker/Documents/projects/rendertune-0.5.0/workspace --pkg-config-flags=--static --extra-cflags='-I/Users/martinbarker/Documents/projects/rendertune-0.5.0/workspace/include -mmacosx-version-min=10.10' --extra-ldflags='-L/Users/martinbarker/Documents/projects/rendertune-0.5.0/workspace/lib -mmacosx-version-min=10.10' --extra-libs='-lpthread -lm' --enable-static --disable-securetransport --disable-debug --disable-shared --disable-ffplay --disable-lzma --disable-doc --enable-version3 --enable-pthreads --enable-runtime-cpudetect --enable-avfilter --enable-filters --disable-libxcb --enable-gpl --enable-nonfree --disable-libass --enable-libfdk-aac --enable-libmp3lame --enable-libx264
 libavutil 56. 66.100 / 56. 66.100
 libavcodec 58.128.100 / 58.128.100
 libavformat 58. 69.100 / 58. 69.100
 libavdevice 58. 12.100 / 58. 12.100
 libavfilter 7.107.100 / 7.107.100
 libswscale 5. 8.100 / 5. 8.100
 libswresample 3. 8.100 / 3. 8.100
 libpostproc 55. 8.100 / 55. 8.100
Input #0, image2, from '/Users/martin/Downloads/R-3777978-1344032418-8379.jpeg.jpg':
 Duration: 00:00:00.50, start: 0.000000, bitrate: 1758 kb/s
 Stream #0:0: Video: mjpeg (Progressive), yuvj444p(pc, bt470bg/unknown/unknown), 590x600 [SAR 1:1 DAR 59:60], 2 fps, 2 tbr, 2 tbn, 2 tbc
Input #1, mp3, from '/Users/martin/Downloads/output-871140.mp3':
 Metadata:
 title : My Little Grass Shack
 album : Our Hawaii - A Collection Of Personal Favorites
 artist : Society Of Seven
 track : 11
 encoder : Lavf58.69.100
 Duration: 00:06:25.59, start: 0.025057, bitrate: 320 kb/s
 Stream #1:0: Audio: mp3, 44100 Hz, stereo, fltp, 320 kb/s
 Metadata:
 encoder : Lavc58.12
Stream mapping:
 Stream #0:0 -> #0:0 (mjpeg (native) -> h264 (libx264))
 Stream #1:0 -> #0:1 (copy)
Press [q] to stop, [?] for help
[swscaler @ 0x7fbad9167600] deprecated pixel format used, make sure you did set range correctly
[libx264 @ 0x7fbad9040400] using SAR=2681/2679
dyld: lazy symbol binding failed: Symbol not found: ____chkstk_darwin
 Referenced from: /Applications/RenderTune.app/Contents/Resources/ffmpeg
 Expected in: /usr/lib/libSystem.B.dylib

dyld: Symbol not found: ____chkstk_darwin
 Referenced from: /Applications/RenderTune.app/Contents/Resources/ffmpeg
 Expected in: /usr/lib/libSystem.B.dylib

 at makeError (/Applications/Render…eca/lib/error.js:59)
 at handlePromise (/Applications/Render…/execa/index.js:114)
 at async file:/Applicat…js/newindex.js:1323



These files render fine on Windows/Linux and recent macOS versions. In order to package FFmpeg in my Electron app on Mac computers, I had to build a custom sandboxed version with no dynamically linked libraries. I have a .sh file that automatically downloads FFmpeg and builds it with all the necessary flags for Mac computers.
https://github.com/MartinBarker/RenderTune/blob/master/buildffmpeg.sh
Inside this .sh file I compile FFmpeg using these flags:


./configure \
 --pkgconfigdir="$WORKSPACE/lib/pkgconfig" \
 --prefix=${WORKSPACE} \
 --pkg-config-flags="--static" \
 --extra-cflags="-I$WORKSPACE/include -mmacosx-version-min=${MACOS_MIN}" \
 --extra-ldflags="-L$WORKSPACE/lib -mmacosx-version-min=${MACOS_MIN}" \
 --extra-libs="-lpthread -lm" \
 --enable-static \
 --disable-securetransport \
 --disable-debug \
 --disable-shared \
 --disable-ffplay \
 --disable-lzma \
 --disable-doc \
 --enable-version3 \
 --enable-pthreads \
 --enable-runtime-cpudetect \
 --enable-avfilter \
 --enable-filters \
 --disable-libxcb \
 --enable-gpl \
 --enable-nonfree \
 --disable-libass \
 --enable-libfdk-aac \
 --enable-libmp3lame \
 --enable-libx264 



If I try to run this script in my High Sierra VM, it fails with this message:


Unknown option "-extra-libs=-lpthread"


If I remove that flag, it fails with a different message:


Unknown option "--enable-static"


I need this flag in order to release my Electron app on the Mac App Store. Can anyone help me compile a static version of FFmpeg that works on old versions like High Sierra 10.13 as well as on modern macOS systems?
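The dyld error above ("Symbol not found: ____chkstk_darwin") typically appears when a binary built against a newer macOS SDK is run on an older release whose libSystem does not export that symbol. A minimal check of the packaged binary, assuming nm and otool from the Xcode command line tools are available (the path comes from the error message above):

nm -u /Applications/RenderTune.app/Contents/Resources/ffmpeg | grep chkstk
otool -l /Applications/RenderTune.app/Contents/Resources/ffmpeg | grep -E -A3 'LC_VERSION_MIN_MACOSX|LC_BUILD_VERSION'

If the symbol is indeed referenced, a workaround that is often suggested for this class of error is rebuilding with -fno-stack-check added to --extra-cflags, so the compiler stops emitting the stack-probing calls; that is an assumption here, not something verified against this particular build script.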