
Media (1)
-
The Slip - Artworks
26 September 2011, by
Updated: September 2011
Language: English
Type: Text
Other articles (102)
-
Changing the publication date
21 June 2013, by — How do you change the publication date of a media item?
You first need to add a "Date de publication" field to the appropriate form mask:
Administrer > Configuration des masques de formulaires > Sélectionner "Un média"
In the "Champs à ajouter" section, check "Date de publication".
Click "Enregistrer" at the bottom of the page.
-
Websites made with MediaSPIP
2 May 2011, by — This page lists some websites based on MediaSPIP.
-
Farm deployment option
12 April 2011, by — MediaSPIP can be installed as a farm, with a single "core" hosted on a dedicated server and used by a multitude of different sites.
This makes it possible, for example: to share setup costs between several projects or individuals; to deploy a large number of unique sites quickly; to avoid having to dump all creations into a digital catch-all, as happens on the large general-public platforms scattered across the (...)
On other sites (6484)
-
Revision 36890: Allez hop un petit logo ...
3 April 2010, by kent1@… — Log: Allez hop un petit logo …
-
Connect a remote IP camera as a WebRTC client
5 April 2017, by idosh — I have 2 cameras:
- An internal webcam embedded in my laptop.
- A remote IP camera that is connected to my laptop over Wi-Fi (it transmits raw H.264 data over TCP, with no container). I'm getting the stream using Node.js.
My goal is to create a WebRTC network and connect the remote camera as another client.
I'm trying to figure out possible solutions:
- My naive thinking was that I would stream the remote camera's payload to the browser. But as I came to understand, the browser can't handle the stream without a container. Fair enough. But I don't understand why it does handle the video stream that arrives from my internal camera (from the navigator.getUserMedia() function). What's the difference between the two streams? Why can't I mimic the stream from the remote camera as the input?
- To bypass this problem I thought about creating a virtual camera using ManyCam (or a ManyCam-like app). To accomplish that I need to convert my TCP stream into an RTP stream (in order to feed ManyCam). Though I did see some info on the ffmpeg command line, I couldn't find anything in the Node.js API package "fluent-ffmpeg". Is it possible to do this using fluent-ffmpeg, or only with the command-line tool? Would it require another RTP server in the middle, such as this one? (A sketch follows this question.)
- The third option I read about is using Node.js as a client in WebRTC. I saw it was implemented in "simple-peer". I tried it out together with its socket.io companion package (socket.io-p2p). Unfortunately I couldn't get it to work: when I try to create a socket/peer on the server, it throws errors, as it expects options that are only available on the client side (like window, location, etc.). Am I doing something wrong? Maybe there is a more suitable framework for this?
- The fourth option is to use a streaming server in the middle, such as Kurento. From my understanding it receives RTP as an input and transmits it as a WebRTC client. It feels like the most heavyweight option, but maybe it's not so bad (I have to admit that I haven't investigated this option yet).
Any thoughts?
Thanks!
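
Regarding the second option, fluent-ffmpeg can pass arbitrary input/output options through to ffmpeg, so a TCP-to-RTP relay might look roughly like the sketch below. The camera address, port, and RTP destination are placeholders invented for illustration, not values from the question, and the RTP payloading may need tuning for whatever consumes it.

// Sketch: repackage a raw H.264 stream arriving over TCP as RTP, using fluent-ffmpeg.
// The tcp:// and rtp:// URLs below are hypothetical placeholders.
const ffmpeg = require("fluent-ffmpeg");

ffmpeg("tcp://192.168.0.50:5000")   // hypothetical IP camera endpoint
  .inputOptions(["-f h264"])        // input is a raw H.264 elementary stream, no container
  .videoCodec("copy")               // no re-encoding, just repackaging
  .outputOptions(["-f rtp"])        // use the RTP output muxer
  .output("rtp://127.0.0.1:5004")   // hypothetical RTP destination (e.g. a virtual-camera feed)
  .on("start", (cmd) => console.log("ffmpeg command:", cmd))
  .on("error", (err) => console.error("ffmpeg error:", err.message))
  .run();

For the third option, simple-peer exposes a wrtc option for plugging in a Node-side WebRTC implementation (such as the wrtc npm package), which is the usual way to avoid the browser-only globals it otherwise expects; whether socket.io-p2p forwards that option would need checking.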
-
FFMPEG - How to wait until all blobs are written before finishing the ffmpeg process when getting them from the MediaRecorder API
7 November 2020, by Caio Nakai — I'm using the MediaRecorder API to record a video of the user's screen and sending the blobs through a web socket to a Node.js server. The Node.js server uses the blobs to create a webm video file. The video is created fine, but with a delay: after the user clicks the stop recording button, the MediaRecorder stops, yet the server apparently hasn't finished processing all of the blobs, so the last few seconds of the recording are missing from the generated file. I wonder if there's a way to solve this. Any help is appreciated :)


This is the front-end code that sends the blobs to the Node.js server


const startScreenCapture = async () => {
 try {
 let screenStream;
 videoElem = document.getElementById("myscreen");
 screenStream = await navigator.mediaDevices.getDisplayMedia(
 displayMediaOptions
 );

 const recorderOptions = {
 mimeType: "video/webm;codecs=vp9",
 videoBitsPerSecond: 3 * 1024 * 1024,
 };

 screenMediaRecorder = new MediaRecorder(screenStream, recorderOptions);
 screenMediaRecorder.start(1); // timeslice in milliseconds to record into each Blob (1 ms here)
 screenMediaRecorder.ondataavailable = (event) => {
 console.debug("Got blob data:", event.data);
 console.log("Camera stream: ", event.data);
 if (event.data && event.data.size > 0) {
 socket.emit("screen_stream", event.data);
 }
 };

 videoElem.srcObject = screenStream;
 // console.log("Screen stream", screenStream);
 // socket.emit("screen_stream", screenStream);
 } catch (err) {
 console.error("Error: " + err);
 }
};

const stopCapture = (evt) => {
 let tracks = videoElem.srcObject.getTracks();

 tracks.forEach((track) => track.stop());
 videoElem.srcObject = null;
 screenMediaRecorder.stop();
 socket.emit("stop_screen");
 socket.close();
};
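
One detail worth noting in stopCapture above: the socket is closed right after screenMediaRecorder.stop(), so the recorder's final dataavailable event can fire after the socket is already gone. A possible variant, sketched with the same names as above and under the assumption that this timing is part of the problem, waits for the recorder's stop event before tearing the socket down:

const stopCapture = (evt) => {
  // onstop fires only after the final dataavailable event has been delivered,
  // so the last chunk is sent before the socket goes away.
  screenMediaRecorder.onstop = () => {
    socket.emit("stop_screen");
    socket.close();
  };

  screenMediaRecorder.stop();

  videoElem.srcObject.getTracks().forEach((track) => track.stop());
  videoElem.srcObject = null;
};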



This is the Node.js back-end that handles the blobs and generates the video file


const ffmpeg2 = child_process.spawn("ffmpeg", [
 "-i",
 "-",
 "-c:v",
 "copy",
 "-c:a",
 "copy",
 "screen.webm",
 ]);


 socket.on("screen_stream", (msg) => {
 console.log("Writing screen blob! ");
 ffmpeg2.stdin.write(msg);
 });

 socket.on("stop_screen", () => {
 console.log("Stop recording..");
 });
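
If the missing tail of the recording comes from ffmpeg never seeing end-of-input, one possible fix is to close ffmpeg's stdin when the stop event arrives and to wait for the process to exit before treating screen.webm as finished. A minimal sketch, assuming the same ffmpeg2 child process and socket as above:

  socket.on("stop_screen", () => {
    console.log("Stop recording..");
    // Closing stdin signals EOF, so ffmpeg can flush any buffered packets
    // and finalize the webm file.
    ffmpeg2.stdin.end();
  });

  // Only treat the recording as complete once ffmpeg has exited.
  ffmpeg2.on("close", (code) => {
    console.log("ffmpeg exited with code", code, "- screen.webm should be complete");
  });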