
Media (91)
-
Richard Stallman and free software
19 October 2011, by
Updated: May 2013
Language: French
Type: Text
-
Stereo master soundtrack
17 October 2011, by
Updated: October 2011
Language: English
Type: Audio
-
Elephants Dream - Cover of the soundtrack
17 October 2011, by
Updated: October 2011
Language: English
Type: Image
-
#7 Ambience
16 October 2011, by
Updated: June 2015
Language: English
Type: Audio
-
#6 Teaser Music
16 October 2011, by
Updated: February 2013
Language: English
Type: Audio
-
#5 End Title
16 October 2011, by
Updated: February 2013
Language: English
Type: Audio
Other articles (29)
-
The SPIPmotion queue
28 November 2010, by
A queue stored in the database
When it is installed, SPIPmotion creates a new table in the database named spip_spipmotion_attentes.
This new table consists of the following fields: id_spipmotion_attente, the unique numeric identifier of the task to be processed; id_document, the numeric identifier of the original document to be encoded; id_objet, the unique identifier of the object to which the encoded document should be attached automatically; objet, the type of object to which (...) -
The regular Cron tasks of the farm
1 December 2010, by
Managing the farm relies on running several repetitive tasks, known as Cron tasks, at regular intervals.
The super Cron (gestion_mutu_super_cron)
This task, scheduled every minute, simply calls the Cron of all the instances of the shared hosting setup on a regular basis. Coupled with a system Cron on the central site of the shared hosting, this makes it easy to generate regular visits to the different sites and to prevent the tasks of rarely visited sites from being too (...) -
Installation in farm mode
4 February 2011, by
Farm mode makes it possible to host several MediaSPIP sites while installing their functional core only once.
This is the method we use on this very platform.
Using farm mode requires some familiarity with how SPIP works, unlike the standalone version, which does not really require any specific knowledge since the usual SPIP private area is no longer used.
To begin with, you must have installed the same files as the installation (...)
On other sites (4411)
-
stream mp4 video with node fluent-ffmpeg
28 February 2016, by rchristian
I'm trying to create a video streaming server and client with node fluent-ffmpeg, express and ejs, and I haven't been able to solve this for a while.
What I want to do is play the video starting from a certain time.
The following code works in the Safari browser on Windows, but in other browsers it either loops for a few seconds or says video format not supported.
Server code (run.js):
app.get('/video', function(req, res) {
  // Define the file path, the time to seek to, and the ffmpeg binary location
  var pathToMovie = '../videos/test.mp4';
  var seektime = 100;

  // Encode the video source
  var proc = new ffmpeg({ source: pathToMovie });
  proc.setFfmpegPath(__dirname + "/ffmpeg/ffmpeg");
  proc
    .seekInput(seektime)
    .withVideoBitrate(1024)
    .withVideoCodec('libx264')
    .withAspect('16:9')
    .withFps(24)
    .withAudioBitrate('128k')
    .withAudioCodec('libfaac')
    .toFormat('mp4')
    // Pipe the encoded output to the HTTP response
    .pipe(res, { end: true });
});

Client code (index.ejs):
<video>
  <source src="video/" type="video/mp4">
</video>
Help please. I searched everywhere for a solution but didn't find one.
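A possible culprit here, offered as an assumption rather than a confirmed diagnosis: a plain MP4 stream normally needs a seekable output to write its moov atom, so piping it straight into the HTTP response often fails in browsers that are strict about this, which could explain why Safari plays it while others report an unsupported format. Below is a minimal sketch of the same route with fragmented-MP4 output options, reusing the hypothetical path and seek time from the question:

app.get('/video', function(req, res) {
  var pathToMovie = '../videos/test.mp4'; // same sample path as in the question
  var seektime = 100;

  res.writeHead(200, { 'Content-Type': 'video/mp4' });

  new ffmpeg({ source: pathToMovie })
    .seekInput(seektime)
    .withVideoCodec('libx264')
    .withAudioCodec('aac') // 'aac' is more commonly built into ffmpeg than 'libfaac'
    .toFormat('mp4')
    // Fragmented MP4 can be written to a non-seekable pipe such as an HTTP response
    .outputOptions('-movflags frag_keyframe+empty_moov')
    .pipe(res, { end: true });
});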
-
I received connection refused error while trying to stream live video through RTMP with FFMPEG
25 September 2020, by Femzy
I am working on a Node.js app that can send a camera stream to third-party platforms, i.e. Facebook and YouTube, using the RTMP protocol. It works well on my localhost, but once I deploy it to the server, it only gives me errors. The error I get is shown below.
Here is my code.

server.js:

const child_process = require('child_process'); // To be used later for running FFmpeg
const express = require('express');
const http = require('http');
const WebSocketServer = require('ws').Server;

const app = express();
const server = http.createServer(app).listen(4000, () => {
 console.log('Listening...');
});

// Serve static files out of the www directory, where we will put our HTML page
app.use(express.static(__dirname + '/www'));


const wss = new WebSocketServer({
 server: server
});
wss.on('connection', (ws, req) => {
 
 
 
 const rtmpUrl = 'rtmp://a.rtmp.youtube.com/live2/MyStreamId';
 console.log('Target RTMP URL:', rtmpUrl);
 
 // Launch FFmpeg to handle all appropriate transcoding, muxing, and RTMP.
 // If 'ffmpeg' isn't in your path, specify the full path to the ffmpeg binary.
 const ffmpeg = child_process.spawn('ffmpeg', [
 // Facebook requires an audio track, so we create a silent one here.
 // Remove this line, as well as `-shortest`, if you send audio from the browser.
 //'-f', 'lavfi', '-i', 'anullsrc',
 
 // FFmpeg will read input video from STDIN
 '-i', '-',
 
 // Because we're using a generated audio source which never ends,
 // specify that we'll stop at end of other input. Remove this line if you
 // send audio from the browser.
 //'-shortest',
 
 // If we're encoding H.264 in-browser, we can set the video codec to 'copy'
 // so that we don't waste any CPU and quality with unnecessary transcoding.
 // If the browser doesn't support H.264, set the video codec to 'libx264'
 // or similar to transcode it to H.264 here on the server.
 '-vcodec', 'copy',
 
 // AAC audio is required for Facebook Live. No browser currently supports
 // encoding AAC, so we must transcode the audio to AAC here on the server.
 '-acodec', 'aac',
 
 // FLV is the container format used in conjunction with RTMP
 '-f', 'flv',
 
 // The output RTMP URL.
 // For debugging, you could set this to a filename like 'test.flv', and play
 // the resulting file with VLC. Please also read the security considerations
 // later on in this tutorial.
 rtmpUrl 
 ]);
 
 // If FFmpeg stops for any reason, close the WebSocket connection.
 ffmpeg.on('close', (code, signal) => {
 console.log('FFmpeg child process closed, code ' + code + ', signal ' + signal);
 ws.terminate();
 });
 
 // Handle STDIN pipe errors by logging to the console.
 // These errors most commonly occur when FFmpeg closes and there is still
 // data to write. If left unhandled, the server will crash.
 ffmpeg.stdin.on('error', (e) => {
 console.log('FFmpeg STDIN Error', e);
 });
 
 // FFmpeg outputs all of its messages to STDERR. Let's log them to the console.
 ffmpeg.stderr.on('data', (data) => {
 console.log('FFmpeg STDERR:', data.toString());
 });

 // When data comes in from the WebSocket, write it to FFmpeg's STDIN.
 ws.on('message', (msg) => {
 console.log('DATA', msg);
 ffmpeg.stdin.write(msg);
 });
 
 // If the client disconnects, stop FFmpeg.
 ws.on('close', (e) => {
 ffmpeg.kill('SIGINT');
 });
 
});

In server.js I create a WebSocket server to receive stream data from the client side and then use FFmpeg to send the stream data over to YouTube via the RTMP URL.

Here is my client.js code:

const ws = new WebSocket(
 'wss://my-websocket-server.com'

 );
 ws.addEventListener('open', (e) => {
 console.log('WebSocket Open', e);
 drawVideosToCanvas();
 mediaStream = getMixedVideoStream(); // 30 FPS
 mediaRecorder = new MediaRecorder(mediaStream, {
 mimeType: 'video/webm;codecs=h264',
 //videoBitsPerSecond : 3000000000
 bitsPerSecond: 6000000
 });

 mediaRecorder.addEventListener('dataavailable', (e) => {
 ws.send(e.data);
 });
 mediaRecorder.onstop = function() {
 ws.close(); // actually close the socket (bind alone does not call it)
 isRecording = false;
 actionBtn.textContent = 'Start Streaming';
 actionBtn.onclick = startRecording;
 }
 mediaRecorder.onstart = function() {
 isRecording = true;
 actionBtn.textContent = 'Stop Streaming';
 actionBtn.onclick = stopRecording;
 screenShareBtn.onclick = startSharing;
 screenShareBtn.disabled = false;
 }
 //mediaRecorder.addEventListener('stop', ws.close.bind(ws));

 mediaRecorder.start(1000); // Start recording, and dump data every second

 });

In my client.js file, I capture the user's camera and then open the WebSocket connection to send the data to the server. Everything works fine on localhost, except when I deploy it to the live server.
I am wondering if there is a bad configuration on the server. The server is CentOS 7.8 and the app is running on Apache.
Here is how I configured the virtual host for the WebSocket domain:

ServerName my-websocket.com

 RewriteEngine on
 RewriteCond %{HTTP:Upgrade} websocket [NC]
 RewriteCond %{HTTP:Connection} upgrade [NC]
 RewriteRule .* "ws://127.0.0.1:3000/$1" [P,L]

 ProxyPass "/" "http://127.0.0.1:3000/$1"
 ProxyPassReverse "/" "http://127.0.0.1:3000/$1"
 ProxyRequests off

I don't know much about server configuration, but I thought maybe the configuration has something to do with why FFmpeg cannot open a connection over the RTMP protocol from the server.
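A quick way to test that theory, added here as a sketch rather than as part of the original app: open a raw TCP connection from the server to the RTMP endpoint. If this also fails with ECONNREFUSED or ENETUNREACH, the problem is outbound connectivity from the host (firewall, SELinux, or hosting-provider policy) rather than the Apache proxy or FFmpeg itself.

// connectivity-check.js -- run with `node connectivity-check.js` on the server
const net = require('net');

const socket = net.connect({ host: 'a.rtmp.youtube.com', port: 1935, timeout: 5000 }, () => {
  console.log('TCP connection to a.rtmp.youtube.com:1935 succeeded');
  socket.end();
});

socket.on('timeout', () => {
  console.log('Connection attempt timed out');
  socket.destroy();
});

socket.on('error', (err) => {
  // e.g. ECONNREFUSED or ENETUNREACH, matching the FFmpeg errors below
  console.log('Connection failed:', err.code);
});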


Here is the error I am getting:

FFmpeg STDERR: Input #0, lavfi, from 'anullsrc':
 Duration:
FFmpeg STDERR: N/A, start: 0.000000, bitrate: 705 kb/s
 Stream #0:0: Audio: pcm_u8, 44100 Hz, stereo, u8, 705 kb/s

DATA <Buffer 1a>
DATA <Buffer 45 df a3 42 86 81 01 f7 f2 04 f3 08 82 88 6d 61 74 72 6f 73 6b 61 87 04 42 85 02 18 53 80 67 ff ... 53991 more bytes>
DATA <Buffer 40 c1 81 00 f0 80 7b 83 3e 3b 07 d6 4e 1c 11 b4 7f cb 5e 68 9b d5 2a e3 06 c6 f3 94 ff 29 16 b2 60 04 ac 37 fb 1a 15 ea 39 a0 cd 02 b8 ... 56206 more bytes>
FFmpeg STDERR: Input #1, matroska,webm, from 'pipe:':
 Metadata:
 encoder :
FFmpeg STDERR: Chrome
 Duration: N/A, start: 0.000000, bitrate: N/A
 Stream #1:0(eng): Audio: opus, 48000 Hz, mono, fltp (default)
 Stream #1:1(eng): Video: h264 (Constrained Baseline), yuv420p(progressive), 1366x768, SAR 1:1 DAR 683:384, 30.30 fps, 30 tbr, 1k tbn, 60 tbc (default)

FFmpeg STDERR: [tcp @ 0xe5fac0] Connection to tcp://a.rtmp.youtube.com:1935 failed (Connection refused), trying next address
[rtmp @ 0xe0fb80] Cannot open connection tcp://a.rtmp.youtube.com:1935

FFmpeg STDERR: rtmp://a.rtmp.youtube.com/live2/mystreamid: Network is unreachable

FFmpeg child process closed, code 1, signal null

I would really appreciate some insight into what may be causing this issue or what I can do to solve it. Thanks in advance.


-
FFmpeg stream video results into net::ERR_CONTENT_LENGTH_MISMATCH 206
7 September 2021, by Red
I am trying to convert an mkv video to mp4 format using FFmpeg; the output should be piped to a writeableStream, in this case I am using express.js.

This is the actual code


const pathToVideo = './path-to/video.mkv';
const stats = fs.statSync(pathToVideo);

const range = req.headers.range;

if (!range) return res.status(400).send("Requires Range header");

const CHUNK_SIZE = 10 ** 6; // 1MB
const start = Number(range?.replace(/\D/g, ""));
const end = Math.min(start + CHUNK_SIZE, stats.size - 1);

const stream = fs.createReadStream(pathToVideo, { start: start, end: end });

const contentLength = end - start + 1;
const headers = {
 "Content-Range": `bytes ${start}-${end}/${stats.size + 1}`,
 "Accept-Ranges": "bytes",
 "Content-Length": contentLength,
 "Content-Type": "video/mp4",
};

res.writeHead(206, headers); 

ffmpeg(stream)
 .format('mp4')
 .videoCodec('libx264')
 .audioCodec('libmp3lame')
 .outputOptions('-movflags frag_keyframe+empty_moov')
 .pipe(res, { end: true })



In Chrome this results in an error:




GET http://localhost/video net::ERR_CONTENT_LENGTH_MISMATCH 206 (Partial Content)




If I save the result to a file instead of piping it, the video is playable and works fine. What am I forgetting in the above code? I tried playing with the headers, without luck.
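A likely explanation, offered as an assumption rather than something stated in the question: the Content-Length and Content-Range headers describe a byte range of the original mkv file, but what is actually piped to the response is the transcoded MP4 output, whose size differs from that range, so Chrome receives a body that does not match the declared Content-Length. A minimal sketch that streams the transcoded output without declaring a length, reusing the names from the code above:

const pathToVideo = './path-to/video.mkv';

// No Content-Length and no 206 status: the size of the transcoded output is
// unknown in advance, so let Node fall back to chunked transfer encoding.
res.writeHead(200, { 'Content-Type': 'video/mp4' });

ffmpeg(fs.createReadStream(pathToVideo))
  .format('mp4')
  .videoCodec('libx264')
  .audioCodec('libmp3lame')
  // fragmented MP4 so the muxer can write to a non-seekable pipe
  .outputOptions('-movflags frag_keyframe+empty_moov')
  .pipe(res, { end: true });

One trade-off of this approach: without byte-range responses the browser cannot seek via range requests, which may or may not matter for this use case.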