
Media (1)
-
Ogg detection bug
22 March 2013, by
Updated: April 2013
Language: French
Type: Video
Other articles (104)
-
MediaSPIP v0.2
21 June 2013, by
MediaSPIP 0.2 is the first stable version of MediaSPIP.
Its official release date is 21 June 2013 and it is announced here.
The zip file provided here contains only the MediaSPIP sources in the standalone version.
As with the previous version, all software dependencies must be installed manually on the server.
If you wish to use this archive for a farm-mode installation, you will also need to make further modifications (...)
-
The farm's regular Cron tasks
1 December 2010, by
Managing the farm relies on running several repetitive tasks, known as Cron tasks, at regular intervals.
The super Cron (gestion_mutu_super_cron)
This task, scheduled every minute, simply calls the Cron of every instance of the shared hosting on a regular basis. Combined with a system Cron on the central site of the farm, this generates regular visits to the various sites and prevents the tasks of rarely visited sites from being too (...)
-
Customising categories
21 June 2013, by
Category creation form
For those who know SPIP well, a category can be thought of as a section (rubrique).
For a document of type category, the fields offered by default are: Text
This form can be modified under:
Administration > Configuration des masques de formulaire.
For a document of type media, the fields not displayed by default are: Quick description (Descriptif rapide)
It is also in this configuration section that you can indicate the (...)
On other sites (7903)
-
ffmpeg legitimate decoding errors
20 July 2017, by Gideon Oduro
My issue is as follows: I'm sending H.264-encoded video, captured with the help of WebRTC, over a WebSocket. The idea is to perform server-side analysis and object tracking.
navigator.mediaDevices.getUserMedia(constraint).then((stream) => {
  isVideoElement(target, stream)
  mediaRecorder = recorder(stream, {mimeType: 'video/webm; codecs=H264'})
  mediaRecorder.ondataavailable = (blob) => socket.send(blob.data)
  mediaRecorder.start('2000');
})

On the server side, data is being received as a ByteBuffer:
override fun handleBinaryMessage(session: WebSocketSession, msg: BinaryMessage) {
    analysis(msg.payload)
}

I'm using the following resources (resource_1, resource_2) to try to convert my ByteBuffer to an OpenCV frame:
fun startPreview(data: ByteBuffer) {
    avcodec_register_all()

    val pack = avcodec.AVPacket()
    pack.data(BytePointer(data))
    avcodec.av_init_packet(pack)

    val videoData = BytePointer(data)

    val codec = avcodec.avcodec_find_decoder(avcodec.AV_CODEC_ID_H264)
    val videoCodecContext = avcodec.avcodec_alloc_context3(codec)
    videoCodecContext.width(1280)
    videoCodecContext.height(720)
    videoCodecContext.pix_fmt(avutil.AV_PIX_FMT_YUV420P)
    videoCodecContext.codec_type(avutil.AVMEDIA_TYPE_VIDEO)
    videoCodecContext.extradata(videoData)
    videoCodecContext.extradata_size(data.capacity())
    videoCodecContext.flags2(videoCodecContext.flags2() or avcodec.CODEC_FLAG2_CHUNKS)
    avcodec.avcodec_open2(videoCodecContext, codec, null as PointerPointer<*>?)

    val decodedFrameLength = avcodec.avcodec_receive_frame(videoCodecContext, avutil.AVFrame())

    println(decodedFrameLength)
}

I'm then receiving a decodedFrameLength of -35, indicating a decoding error, and I can't figure out how to proceed from here.
-
Save video to disk from WebRTC MediaStream in Node
27 November 2020, by SAGBO Aimé
I'm building an app where the user can connect to the server through WebRTC (I'm using the simple-peer library both server-side and client-side to set up the peer-to-peer connection).
Once the client and the server are connected, the client app streams the user's camera and microphone to the server.


Now, I want to save the streamed data to the filesystem server-side as an MP4 video file.


I have heard about ffmpeg and fluent-ffmpeg to achieve this, but I don't know how to use them (a rough sketch of one possible approach follows after the code below).


Server-side code to set up the peer connection:

const Peer = require("simple-peer");
const wrtc = require("wrtc");

const peer = new Peer({ initiator: false, wrtc: wrtc, trickle: false });

peer.on("error", (err: any) => console.log("error", err));

peer.on("signal", (data: any) => {
  if (data.type === "offer" || data.type === "answer")
    dispatchMessage(JSON.stringify(data));
  // if (data.renegotiate || data.transceiverRequest) return;
});

peer.on("connect", () => {
  console.log("CONNECTED");
  peer.send(JSON.stringify("HELLO DEER PEER FROM SERVER"));
});

peer.on("data", (data: any) => {
  console.log("data: ", data);
});

peer.on("stream", (stream: MediaStream) => {
  console.log("-------Stream received", stream);
});

peer.on("track", (track: MediaStreamTrack) => {
  console.log("-------trackEvent:", track);
});



Client-side code:

const stream = await window.navigator.mediaDevices.getUserMedia({
 video: { width: { ideal: 4096 }, height: { ideal: 2160 }},
 audio: true,
});

const p = new SimplePeer({
 initiator: isInitiator, 
 trickle: false 
});

stream.getTracks().forEach(track => p.addTrack(
 track, 
 stream 
));

// Here I set up the listeners for the peer connection
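
For what it's worth, here is a rough sketch of one way to get from the received MediaStream to an MP4 on disk, reusing the peer object from the server-side snippet above: node-webrtc exposes a nonstandard RTCVideoSink that yields raw I420 frames, which can then be piped into an ffmpeg child process. The RTCVideoSink API, the fixed 30 fps rate, the constant resolution, and the recording.mp4 output path are all assumptions, and audio is left out for brevity, so treat this as a starting point rather than a finished recorder.

const { spawn } = require("child_process");
const { nonstandard } = require("wrtc");     // assumption: wrtc's nonstandard namespace (RTCVideoSink)

peer.on("track", (track: MediaStreamTrack) => {
  if (track.kind !== "video") return;        // audio handling omitted for brevity

  const sink = new nonstandard.RTCVideoSink(track);
  let ffmpeg: any = null;

  sink.onframe = ({ frame }: any) => {
    // Start ffmpeg lazily, once the first frame tells us the resolution
    // (assumed constant for the whole recording).
    if (!ffmpeg) {
      ffmpeg = spawn("ffmpeg", [
        "-y",
        "-f", "rawvideo",                    // stdin carries raw I420 frames
        "-pix_fmt", "yuv420p",
        "-s", `${frame.width}x${frame.height}`,
        "-r", "30",                          // assumed frame rate
        "-i", "pipe:0",
        "-c:v", "libx264",
        "-pix_fmt", "yuv420p",
        "recording.mp4",                     // assumed output path
      ]);
    }
    ffmpeg.stdin.write(frame.data);          // frame.data is a Uint8Array of I420 pixels
  };

  peer.on("close", () => {
    sink.stop();
    if (ffmpeg) ffmpeg.stdin.end();          // closing stdin lets ffmpeg finalize the MP4
  });
});

fluent-ffmpeg can wrap an equivalent invocation, but spawning ffmpeg directly keeps the raw-video input options explicit.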



-
Capturing audio data (using javascript) and uploading on a server as MP3
4 September 2018, by Michel
Following a number of resources on the internet, I am trying to build a simple web page where I can record something (my voice), then make an mp3 file out of the recording and finally upload that file to a server.
At this point I can do the recording and also play it back, but I haven't gone as far as uploading; it seems I cannot even make an mp3 file locally.
Can someone tell me what I am doing wrong, or in the wrong order? Below is all the code I have at this point.
<div>
  <h2>Audio record and playback</h2>
  <p>
    <button id="startRecord"><h3>Start</h3></button>
    <button id="stopRecord" disabled="disabled"><h3>Stop</h3></button>
  </p>
  <audio id="player" controls="controls"></audio>
  <a id="audioDownload"></a>
</div>
<script>
var player = document.getElementById('player');

var handleSuccess = function(stream) {
  rec = new MediaRecorder(stream);

  rec.ondataavailable = e => {
    audioChunks.push(e.data);
    if (rec.state == "inactive") {
      let blob = new Blob(audioChunks,{type:'audio/x-mpeg-3'});
      player.src = URL.createObjectURL(blob);
      player.controls=true;
      player.autoplay=true;
      // audioDownload.href = player.src;
      // audioDownload.download = 'sound.data';
      // audioDownload.innerHTML = 'Download';
      mp3Build();
    }
  }

  player.src = stream;
};

navigator.mediaDevices.getUserMedia({audio:true/*, video: false */})
  .then(handleSuccess);

startRecord.onclick = e => {
  startRecord.disabled = true;
  stopRecord.disabled=false;
  audioChunks = [];
  rec.start();
}

stopRecord.onclick = e => {
  startRecord.disabled = false;
  stopRecord.disabled=true;
  rec.stop();
}

var ffmpeg = require('ffmpeg');

function mp3Build() {
  try {
    var process = new ffmpeg('sound.data');
    process.then(function (audio) {
      // Callback mode.
      audio.fnExtractSoundToMP3('sound.mp3', function (error, file) {
        if (!error) {
          console.log('Audio file: ' + file);
          audioDownload.href = player.src;
          audioDownload.download = 'sound.mp3';
          audioDownload.innerHTML = 'Download';
        } else {
          console.log('Error-fnExtractSoundToMP3: ' + error);
        }
      });
    }, function (err) {
      console.log('Error: ' + err);
    });
  } catch (e) {
    console.log(e.code);
    console.log(e.msg);
  }
}
</script>

When I try to investigate and see what is happening using the debugger inside the Web Console, on the line:
var process = new ffmpeg('sound.data');
I get this message:
Paused on exception
TypeError: ffmpeg is not a constructor.
And on the line:
var ffmpeg = require('ffmpeg');
I get this message:
Paused on exception
ReferenceError: require is not defined.
Besides, when I watch the expression ffmpeg, I can see:
ffmpeg: undefined
After some further investigation, and using browserify, I use the following code:
<div>
  <h2>Audio record and playback</h2>
  <p>
    <button id="startRecord"><h3>Start</h3></button>
    <button id="stopRecord" disabled="disabled"><h3>Stop</h3></button>
  </p>
  <audio id="player" controls="controls"></audio>
  <a id="audioDownload"></a>
</div>
<script src='http://stackoverflow.com/feeds/tag/bundle.js'></script>
<script>
var player = document.getElementById('player');

var handleSuccess = function(stream) {
  rec = new MediaRecorder(stream);

  rec.ondataavailable = e => {
    if (rec.state == "inactive") {
      let blob = new Blob(audioChunks,{type:'audio/x-mpeg-3'});
      //player.src = URL.createObjectURL(blob);
      //player.srcObject = URL.createObjectURL(blob);
      //player.srcObject = blob;
      player.srcObject = stream;
      player.controls=true;
      player.autoplay=true;
      // audioDownload.href = player.src;
      // audioDownload.download = 'sound.data';
      // audioDownload.innerHTML = 'Download';
      mp3Build();
    }
  }

  //player.src = stream;
  player.srcObject = stream;
};

navigator.mediaDevices.getUserMedia({audio:true/*, video: false */})
  .then(handleSuccess);

startRecord.onclick = e => {
  startRecord.disabled = true;
  stopRecord.disabled=false;
  audioChunks = [];
  rec.start();
}

stopRecord.onclick = e => {
  startRecord.disabled = false;
  stopRecord.disabled=true;
  rec.stop();
}

var ffmpeg = require('ffmpeg');

function mp3Build() {
  try {
    var process = new ffmpeg('sound.data');
    process.then(function (audio) {
      // Callback mode.
      audio.fnExtractSoundToMP3('sound.mp3', function (error, file) {
        if (!error) {
          console.log('Audio file: ' + file);
          //audioDownload.href = player.src;
          audioDownload.href = player.srcObject;
          audioDownload.download = 'sound.mp3';
          audioDownload.innerHTML = 'Download';
        } else {
          console.log('Error-fnExtractSoundToMP3: ' + error);
        }
      });
    }, function (err) {
      console.log('Error: ' + err);
    });
  } catch (e) {
    console.log(e.code);
    console.log(e.msg);
  }
}
</script>

That solved the problem of:
the expression ffmpeg being undefined.
But the playback is no longer working. I may not be doing the right thing with player.srcObject, and maybe some other things too.
When I use this line:
player.srcObject = URL.createObjectURL(blob);
I get this message:
Paused on exception
TypeError: Value being assigned to HTMLMediaElement.srcObject is not an object.
And when I use this line:
player.srcObject = blob;
I get this message:
Paused on exception
TypeError: Value being assigned to HTMLMediaElement.srcObject does not implement interface MediaStream.
Finally, if I use this:
player.srcObject = stream;
I do not get any error message but the voice recording still does not work.
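
As a point of comparison, here is a minimal sketch of the record / play back / upload flow that keeps ffmpeg out of the browser entirely: require() and the ffmpeg npm package are Node-only (the package drives a native ffmpeg binary), so the MP3 conversion has to happen on the server once the recording has been uploaded. The /upload endpoint and the "recording" field name are assumptions; the recorded blob is played back by assigning an object URL to player.src, while srcObject is reserved for live MediaStream objects.

const player = document.getElementById("player") as HTMLAudioElement;
const startRecord = document.getElementById("startRecord") as HTMLButtonElement;
const stopRecord = document.getElementById("stopRecord") as HTMLButtonElement;

let rec: MediaRecorder;
let audioChunks: Blob[] = [];

navigator.mediaDevices.getUserMedia({ audio: true }).then((stream) => {
  rec = new MediaRecorder(stream);

  rec.ondataavailable = (e) => audioChunks.push(e.data);

  rec.onstop = async () => {
    // MediaRecorder produces a compressed container (typically audio/webm or audio/ogg);
    // relabelling the blob as "audio/x-mpeg-3" does not transcode it to MP3.
    const blob = new Blob(audioChunks, { type: rec.mimeType });

    // Recorded blobs are played back through .src; .srcObject is for live streams.
    player.src = URL.createObjectURL(blob);
    player.controls = true;

    // Upload the raw recording; the server (e.g. the ffmpeg CLI) converts it to MP3.
    const form = new FormData();
    form.append("recording", blob, "recording.webm");         // field name is an assumption
    await fetch("/upload", { method: "POST", body: form });   // endpoint is an assumption
  };

  startRecord.onclick = () => {
    audioChunks = [];
    rec.start();
    startRecord.disabled = true;
    stopRecord.disabled = false;
  };

  stopRecord.onclick = () => {
    rec.stop();
    startRecord.disabled = false;
    stopRecord.disabled = true;
  };
});

If the conversion really has to run in the page, a WebAssembly build of FFmpeg such as ffmpeg.wasm is the usual route; a Node-style require of the ffmpeg package will not work even when bundled with browserify, because there is no ffmpeg binary to spawn in the browser.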