
Media (91)
-
999,999
26 September 2011
Updated: September 2011
Language: English
Type: Audio
-
The Slip - Artworks
26 September 2011
Updated: September 2011
Language: English
Type: Text
-
Demon seed (wav version)
26 September 2011
Updated: April 2013
Language: English
Type: Audio
-
The four of us are dying (wav version)
26 September 2011
Updated: April 2013
Language: English
Type: Audio
-
Corona radiata (wav version)
26 September 2011
Updated: April 2013
Language: English
Type: Audio
-
Lights in the sky (wav version)
26 September 2011
Updated: April 2013
Language: English
Type: Audio
Other articles (72)
-
MediaSPIP Core: Configuration
9 November 2010
MediaSPIP Core provides three different configuration pages by default (these pages rely on the CFG configuration plugin to work): a page for the general configuration of the skeletons; a page for the configuration of the site's home page; a page for the configuration of the sections.
It also provides an additional page, shown only when certain plugins are enabled, for controlling their display and their specific features (...) -
Frequent problems
10 March 2010
PHP with safe_mode enabled
One of the main sources of problems comes from the PHP configuration, in particular from safe_mode being enabled.
The solution would be either to disable safe_mode or to place the script in a directory accessible to Apache for the site -
Sites built with MediaSPIP
2 May 2011
This page showcases some of the sites running MediaSPIP.
You can of course add your own via the form at the bottom of the page.
On other sites (7233)
-
Sequelize FFMPEG get video after upload on NodeJs
30 June 2020, by jjplack
Hello. After uploading a video to the DB using Sequelize, I would like to edit it using FFMPEG.


So, to get the video, do I just point the model attribute at FFMPEG?


Because when I use the file path, the video is not edited.


For example:


fastify.route({
  method: "POST",
  url: "/posts",
  preHandler: upload.single("video"),

  handler: async function(request, reply) {
    const { Post } = fastify.sequelize;

    const videoPath = "./public/uploads/";

    const post = await Post.create({
      video: request.file.path,
      title: request.body.title,
    });
    reply.code(201).send(post);

    try {
      const process = new ffmpeg(post.video);
      process.then(function (video) {
        video.addCommand('-ss', '00:01:00');
        video.addCommand('-vframes', '1');
        video.save(videoPath, function (error, file) {
          if (!error)
            console.log('Video file: ' + file);
        });
      }, function (err) {
        console.log('Error: ' + err);
      });
    } catch (e) {
      console.log(e.msg);
    }
  }
});



-
FFmpeg CRC mismatch
11 October 2022, by DNS
I got this error while sending my audio, encoded with Opus in an OGG container.


For testing, I'm recording 5 seconds from my microphone with PortAudio; after that I encode the recording with Opus, and finally I encapsulate the encoded buffer.


But when I send the encoded audio buffer to ffmpeg, I get a CRC mismatch error and some issues with the created file.


PortAudio works, Opus works, and libogg works, because without ffmpeg I'm able to create a file that I can listen to, with correct headers, duration, volume...


Here is the output of ffmpeg while sending the buffer:


Input #0, ogg, from 'pipe:':
 Duration: N/A, start: -0.010000, bitrate: N/A
 Stream #0:0: Audio: opus, 48000 Hz, stereo, fltp
 Metadata:
 ENCODER : Opus
Stream mapping:
 Stream #0:0 -> #0:0 (opus (native) -> aac (native))
Output #0, flv, to 'save/bonjour.flv':
 Metadata:
 encoder : Lavf59.16.100
 Stream #0:0: Audio: aac (LC) ([10][0][0][0] / 0x000A), 48000 Hz, stereo, fltp, 128 kb/s
 Metadata:
 encoder : Lavc59.18.100 aac
[ogg @ 00000181dcbabd40] CRC mismatch!te=N/A speed=N/A
 Last message repeated 32 times
[ogg @ 00000181dcbabd40] CRC mismatch!te= 67.9kbits/s speed=0.634x
 Last message repeated 26 times
[ogg @ 00000181dcbabd40] CRC mismatch!te= 59.4kbits/s speed=0.863x
 Last message repeated 43 times
size= 27kB time=00:00:05.00 bitrate= 43.9kbits/s speed= 1x
video:0kB audio:25kB subtitle:0kB other streams:0kB global headers:0kB muxing overhead: 6.062128%
[aac @ 00000181dcc15100] Qavg: 1486.398



I'm trying to create an FLV file with the AAC codec, transmuxing now to prepare for real-time streaming in the future.


Here is my loop:


int bytes_read = 0;
FILE *ffmpegcmd;

if (!(ffmpegcmd = _popen("ffmpeg -re -i - -f flv -acodec aac save/bonjour.flv", "w"))) {
    std::cerr << "popen error\n";
    return 1;
}

opus_enc_write_headers(opus);
while (size < SAMPLE_RATE * NUM_SECONDS)
{
    bytes_read = opus_enc_encode(opus, outbuf, size);
    fwrite(outbuf, bytes_read, 1, ffmpegcmd);
    fflush(ffmpegcmd);
    size += FRAMES_PER_BUFFER;
}
_pclose(ffmpegcmd);



The encode function:


int opus_enc_encode(opus_enc* opus, char *enc_buf, int size)
{
    int w = 0;
    ogg_packet op;
    int ret;
    opus_int32 offset = size % (NUM_SECONDS * SAMPLE_RATE - MAX_PACKET_SIZE);

    /* Copy headers */
    while (ogg_stream_flush(&opus->os, &opus->og) != 0) {
        memcpy(enc_buf + w, opus->og.header, opus->og.header_len);
        w += opus->og.header_len;
        memcpy(enc_buf + w, opus->og.body, opus->og.body_len);
        w += opus->og.body_len;
    }

    /* Set bitrate */
    if (opus->last_bitrate != opus->bitrate) {
        if ((opus->bitrate < 9600) || (opus->bitrate > 320000)) {
            opus->bitrate = 192000;
        }
        opus_encoder_ctl(opus->encoder, OPUS_SET_BITRATE(opus->bitrate));
        opus->last_bitrate = opus->bitrate;
    }

    /* Encode Opus */
    ret = opus_encode(opus->encoder, &data.recordedSamples[offset << 1], FRAMES_PER_BUFFER, opus->buffer, MAX_PACKET_SIZE);

    op.b_o_s = 0;
    op.e_o_s = 0;
    op.granulepos = opus->granulepos;
    op.packetno = opus->packetno++;
    op.packet = opus->buffer;
    op.bytes = ret;

    opus->granulepos += FRAMES_PER_BUFFER;

    ogg_stream_packetin(&opus->os, &op);

    while (ogg_stream_flush(&opus->os, &opus->og) != 0) {
        memcpy(enc_buf + w, opus->og.header, opus->og.header_len);
        w += opus->og.header_len;
        memcpy(enc_buf + w, opus->og.body, opus->og.body_len);
        w += opus->og.body_len;
    }

    return w;
}



data.recordedSamples is the buffer of recorded audio from PortAudio (short *).


Should I split the Ogg pages?


Thanks


-
Stream sent via FFMPEG (NodeJS) to RTMP (YouTube) not being received
10 December 2024, by Qumber
I am writing a very basic Chrome extension that captures a video stream and sends it to a Node.js server, which in turn sends it to the YouTube Live server.


Here is my implementation of the backend, which receives data via WebRTC and sends it to YouTube using FFMPEG:


const express = require('express');
const cors = require('cors');
const { RTCPeerConnection, RTCSessionDescription } = require('@roamhq/wrtc');
const { spawn } = require('child_process');

const app = express();
app.use(express.json());
app.use(cors());

app.post('/webrtc', async (req, res) => {
  const peerConnection = new RTCPeerConnection();

  // Start ffmpeg process for streaming
  const ffmpeg = spawn('ffmpeg', [
    '-f', 'flv',
    '-i', 'pipe:0',
    '-c:v', 'libx264',
    '-preset', 'veryfast',
    '-maxrate', '3000k',
    '-bufsize', '6000k',
    '-pix_fmt', 'yuv420p',
    '-g', '50',
    '-f', 'flv',
    'rtmp://a.rtmp.youtube.com/live2/MY_KEY'
  ]);

  ffmpeg.on('error', (err) => {
    console.error('FFmpeg error:', err);
  });

  ffmpeg.stderr.on('data', (data) => {
    console.error('FFmpeg stderr:', data.toString());
  });

  ffmpeg.stdout.on('data', (data) => {
    console.log('FFmpeg stdout:', data.toString());
  });

  // Handle incoming tracks
  peerConnection.ontrack = (event) => {
    console.log('Track received:', event.track.kind);
    const track = event.track;

    // Stream the incoming track to FFmpeg
    track.onunmute = () => {
      console.log('Track unmuted:', track.kind);
      const reader = track.createReadStream();
      reader.on('data', (chunk) => {
        console.log('Forwarding chunk to FFmpeg:', chunk.length);
        ffmpeg.stdin.write(chunk);
      });
      reader.on('end', () => {
        console.log('Stream ended');
        ffmpeg.stdin.end();
      });
    };

    track.onmute = () => {
      console.log('Track muted:', track.kind);
    };
  };

  // Set the remote description (offer) received from the client
  await peerConnection.setRemoteDescription(new RTCSessionDescription(req.body.sdp));

  // Create an answer and send it back to the client
  const answer = await peerConnection.createAnswer();
  await peerConnection.setLocalDescription(answer);

  res.json({ sdp: peerConnection.localDescription });
});

app.listen(3000, () => {
  console.log('WebRTC to RTMP server running on port 3000');
});
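One detail worth noting about the backend above: a standard WebRTC `MediaStreamTrack` does not expose a `createReadStream()` method, and even if raw frames were obtained (for example via node-webrtc's nonstandard frame-sink API), they would arrive as uncompressed frames, not as FLV, so the `-f flv` input format would not match. As a sketch under those assumptions, one option is to declare the stdin input as `rawvideo`; the helper below only builds the argument list, and the dimensions, frame rate, and RTMP URL are placeholders:

```javascript
// Hypothetical helper: ffmpeg arguments for a process that reads raw
// yuv420p (I420) frames on stdin and pushes H.264/FLV to an RTMP URL.
// Width/height/fps must match the frames actually written to stdin.
function buildRawVideoArgs({ width, height, fps, rtmpUrl }) {
  return [
    '-f', 'rawvideo',            // stdin carries raw frames, not FLV
    '-pix_fmt', 'yuv420p',
    '-s', `${width}x${height}`,
    '-r', String(fps),
    '-i', 'pipe:0',
    '-c:v', 'libx264',
    '-preset', 'veryfast',
    '-g', '50',
    '-f', 'flv',
    rtmpUrl,
  ];
}

// Usage sketch:
// const { spawn } = require('child_process');
// const ffmpeg = spawn('ffmpeg', buildRawVideoArgs({
//   width: 1280, height: 720, fps: 30,
//   rtmpUrl: 'rtmp://a.rtmp.youtube.com/live2/MY_KEY',
// }));
// // then write each raw frame buffer to ffmpeg.stdin
```

This is only one possible shape of the fix; the essential point is that the input format declared to ffmpeg has to describe what is actually written to its stdin.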




This is the output I get, but nothing gets sent to YouTube:




FFmpeg stderr: ffmpeg version 7.0.2 Copyright (c) 2000-2024 the FFmpeg developers
 built with Apple clang version 15.0.0 (clang-1500.3.9.4)

FFmpeg stderr: configuration: --prefix=/opt/homebrew/Cellar/ffmpeg/7.0.2_1 --enable-shared --enable-pthreads --enable-version3 --cc=clang --host-cflags= --host-ldflags='-Wl,-ld_classic' --enable-ffplay --enable-gnutls --enable-gpl --enable-libaom --enable-libaribb24 --enable-libbluray --enable-libdav1d --enable-libharfbuzz --enable-libjxl --enable-libmp3lame --enable-libopus --enable-librav1e --enable-librist --enable-librubberband --enable-libsnappy --enable-libsrt --enable-libssh --enable-libsvtav1 --enable-libtesseract --enable-libtheora --enable-libvidstab --enable-libvmaf --enable-libvorbis --enable-libvpx --enable-libwebp --enable-libx264 --enable-libx265 --enable-libxml2 --enable-libxvid --enable-lzma --enable-libfontconfig --enable-libfreetype --enable-frei0r --enable-libass --enable-libopencore-amrnb --enable-libopencore-amrwb --enable-libopenjpeg --enable-libspeex --enable-libsoxr --enable-libzmq --enable-libzimg --disable-libjack --disable-indev=jack --enable-videotoolbox --enable-audiotoolbox --enable-neon

FFmpeg stderr: libavutil 59. 8.100 / 59. 8.100
 libavcodec 61. 3.100 / 61. 3.100
 libavformat 61. 1.100 / 61. 1.100
 libavdevice 61. 1.100 / 61. 1.100

FFmpeg stderr: libavfilter 10. 1.100 / 10. 1.100
 libswscale 8. 1.100 / 8. 1.100
 libswresample 5. 1.100 / 5. 1.100
 libpostproc 58. 1.100 / 58. 1.100





I do not understand what I am doing wrong. Any help would be appreciated.



Optionally, here's the frontend code from the extension, which (to me) appears to be recording and sending the capture:


popup.js & popup.html




document.addEventListener('DOMContentLoaded', () => {
  document.getElementById('openCapturePage').addEventListener('click', () => {
    chrome.tabs.create({
      url: chrome.runtime.getURL('capture.html')
    });
  });
});






 
(popup.html was mangled by the page scraper; what remains is a script tag loading popup.js and the title "StreamSavvy".)
capture.js & capture.html




let peerConnection;

async function startStreaming() {
  try {
    const stream = await navigator.mediaDevices.getDisplayMedia({
      video: {
        cursor: "always"
      },
      audio: false
    });

    peerConnection = new RTCPeerConnection({
      iceServers: [{
        urls: 'stun:stun.l.google.com:19302'
      }]
    });

    stream.getTracks().forEach(track => peerConnection.addTrack(track, stream));

    const offer = await peerConnection.createOffer();
    await peerConnection.setLocalDescription(offer);

    const response = await fetch('http://localhost:3000/webrtc', {
      method: 'POST',
      headers: {
        'Content-Type': 'application/json'
      },
      body: JSON.stringify({
        sdp: peerConnection.localDescription
      })
    });

    const { sdp } = await response.json();
    await peerConnection.setRemoteDescription(new RTCSessionDescription(sdp));

    console.log("Streaming to server via WebRTC...");
  } catch (error) {
    console.error("Error starting streaming:", error.name, error.message);
  }
}

async function stopStreaming() {
  if (peerConnection) {
    // Stop all media tracks
    peerConnection.getSenders().forEach(sender => {
      if (sender.track) {
        sender.track.stop();
      }
    });

    // Close the peer connection
    peerConnection.close();
    peerConnection = null;
    console.log("Streaming stopped");
  }
}

document.addEventListener('DOMContentLoaded', () => {
  document.getElementById('startCapture').addEventListener('click', startStreaming);
  document.getElementById('stopCapture').addEventListener('click', stopStreaming);
});






 
(capture.html was mangled by the page scraper; what remains is a script tag loading capture.js and the title "StreamSavvy Capture".)
background.js (service worker)




chrome.runtime.onInstalled.addListener(() => {
  console.log("StreamSavvy Extension Installed");
});

chrome.runtime.onMessage.addListener((message, sender, sendResponse) => {
  if (message.type === 'startStreaming') {
    chrome.tabs.create({
      url: chrome.runtime.getURL('capture.html')
    });
    sendResponse({
      status: 'streaming'
    });
  } else if (message.type === 'stopStreaming') {
    chrome.tabs.query({
      url: chrome.runtime.getURL('capture.html')
    }, (tabs) => {
      if (tabs.length > 0) {
        chrome.tabs.sendMessage(tabs[0].id, {
          type: 'stopStreaming'
        });
        sendResponse({
          status: 'stopped'
        });
      }
    });
  }
  return true; // Keep the message channel open for sendResponse
});