Advanced search

Media (0)


No media matching your criteria is available on this site.

Other articles (111)

  • Requesting the creation of a channel

    12 March 2010

    Depending on the platform's configuration, the user may have two different ways of requesting the creation of a channel. The first is at the moment of registration; the second, after registration, by filling in a request form.
    Both methods ask for the same information and work in much the same way: the future user must fill in a series of form fields which first of all give the administrators information about (...)

  • Diogene: creating specific masks for content editing forms

    26 October 2010

    Diogene is one of the SPIP plugins (extensions) activated by default when MediaSPIP is initialised.
    What this plugin is for
    Creating form masks
    The Diogene plugin makes it possible to create form masks specific to each sector for the three SPIP objects: articles, rubriques (sections) and sites.
    It thus allows you to define, for a given sector, one form mask per object, adding or removing fields so as to make the form (...)

  • Accepted formats

    28 January 2010

    The following commands give information about the formats and codecs supported by the local ffmpeg installation:
    ffmpeg -codecs
    ffmpeg -formats
    Accepted input video formats
    This list is not exhaustive; it highlights the main formats in use: h264: H.264 / AVC / MPEG-4 AVC / MPEG-4 part 10; m4v: raw MPEG-4 video format; flv: Flash Video (FLV) / Sorenson Spark / Sorenson H.263; Theora; wmv:
    Possible output video formats
    To begin with, we (...)
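
    As an aside (this is not from the article above), the same check can be scripted; here is a minimal Node sketch that assumes an ffmpeg binary is on the PATH and uses a few codec names purely as examples:

// Sketch only: check which of a few example codecs the local ffmpeg build lists.
// Assumes an `ffmpeg` binary is available on the PATH.
const { execFile } = require("child_process");

execFile("ffmpeg", ["-codecs"], (err, stdout) => {
  if (err) {
    console.error("ffmpeg not found or failed to run:", err);
    return;
  }
  for (const codec of ["h264", "theora", "flv", "wmv2"]) {
    const listed = stdout.split("\n").some(line => line.includes(` ${codec} `));
    console.log(`${codec}: ${listed ? "listed" : "not listed"}`);
  }
});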

On other sites (4854)

  • AWS Lambda: ffmpeg thumbnails generator: empty JPG

    3 September 2020, by Magikey

    When a video is uploaded to S3, I want to store a JPG screenshot.

    



    In an AWS Lambda function, I do:

    



    ...

  let tmpFile = createWriteStream(`/tmp/screenshot.jpg`)

  var ffmpeg = spawn(ffmpegPath, [
      "-ss","00:00:05",
      "-i", target,
      "-vf", "thumbnail,scale=200:200", 
      "-qscale:v" ,"2",
      "-frames:v", "1",
      "-f", "image2",
      "-c:v", "mjpeg",
      "pipe:1"
    ]);

  ffmpeg.stdout.pipe(tmpFile).on("error", err => {
      console.log("Error A: ",err);
    });

  ffmpeg.on('error', err => {
    console.log("Error B", err)
    reject()
  })

  ffmpeg.on('close', code => {
    tmpFile.end();
    console.log('Log A', ffmpeg);

    child_process.exec("echo `ls -l -R /tmp`",
      (error, stdout, stderr) => {
        console.log(stdout)
    });

    resolve()
  })
...


    



    But the result is an empty JPG file in S3.

    



    The logs show no errors, my "target" is OK, and the ls output on stdout shows me the empty JPG file.

    



    I have tried a lot of things, like using other versions of ffmpeg, but the result is the same.

    



    Here is the output of "console.log('Log A', ffmpeg)":

    



    ChildProcess {
 _events: [Object: null prototype] { error: [Function], close: [Function] },
 _eventsCount: 2,
 _maxListeners: undefined,
 _closesNeeded: 3,
 _closesGot: 3,
 connected: false,
 signalCode: 'SIGSEGV',
 exitCode: null,
 killed: false,
 spawnfile: '/opt/nodejs/ffmpeg',
 _handle: null,
 spawnargs: [
   '/opt/nodejs/ffmpeg',
   '-ss',
   '00:00:05',
   '-i',
   'https://xxxxxxxxx',
   '-vf',
   'thumbnail,scale=200:200',
   '-qscale:v',
   '2',
   '-frames:v',
   '1',
   '-f',
   'image2',
   '-v',
   '16',
   '-c:v',
   'mjpeg',
   'pipe:1'
 ],
 pid: 24,
 stdin: Socket {
   connecting: false,
   _hadError: false,
   _parent: null,
   _host: null,
   _readableState: ReadableState {
     objectMode: false,
     highWaterMark: 16384,
     buffer: BufferList { head: null, tail: null, length: 0 },
     length: 0,
     pipes: null,
     pipesCount: 0,
     flowing: null,
     ended: false,
     endEmitted: false,
     reading: false,
     sync: true,
     needReadable: false,
     emittedReadable: false,
     readableListening: false,
     resumeScheduled: false,
     paused: true,
     emitClose: false,
     autoDestroy: false,
     destroyed: true,
     defaultEncoding: 'utf8',
     awaitDrain: 0,
     readingMore: false,
     decoder: null,
     encoding: null
   },
   readable: false,
   _events: [Object: null prototype] { end: [Function: onReadableStreamEnd] },
   _eventsCount: 1,
   _maxListeners: undefined,
   _writableState: WritableState {
     objectMode: false,
     highWaterMark: 16384,
     finalCalled: false,
     needDrain: false,
     ending: false,
     ended: false,
     finished: false,
     destroyed: true,
     decodeStrings: false,
     defaultEncoding: 'utf8',
     length: 0,
     writing: false,
     corked: 0,
     sync: true,
     bufferProcessing: false,
     onwrite: [Function: bound onwrite],
     writecb: null,
     writelen: 0,
     bufferedRequest: null,
     lastBufferedRequest: null,
     pendingcb: 0,
     prefinished: false,
     errorEmitted: false,
     emitClose: false,
     autoDestroy: false,
     bufferedRequestCount: 0,
     corkedRequestsFree: [Object]
   },
   writable: false,
   allowHalfOpen: false,
   _sockname: null,
   _pendingData: null,
   _pendingEncoding: '',
   server: null,
   _server: null,
   [Symbol(asyncId)]: 5,
   [Symbol(kHandle)]: null,
   [Symbol(lastWriteQueueSize)]: 0,
   [Symbol(timeout)]: null,
   [Symbol(kBuffer)]: null,
   [Symbol(kBufferCb)]: null,
   [Symbol(kBufferGen)]: null,
   [Symbol(kBytesRead)]: 0,
   [Symbol(kBytesWritten)]: 0
 },
 stdout: Socket {
   connecting: false,
   _hadError: false,
   _parent: null,
   _host: null,
   _readableState: ReadableState {
     objectMode: false,
     highWaterMark: 16384,
     buffer: BufferList { head: null, tail: null, length: 0 },
     length: 0,
     pipes: null,
     pipesCount: 0,
     flowing: false,
     ended: true,
     endEmitted: true,
     reading: false,
     sync: false,
     needReadable: false,
     emittedReadable: false,
     readableListening: false,
     resumeScheduled: false,
     paused: false,
     emitClose: false,
     autoDestroy: false,
     destroyed: true,
     defaultEncoding: 'utf8',
     awaitDrain: 0,
     readingMore: false,
     decoder: null,
     encoding: null
   },
   readable: false,
   _events: [Object: null prototype] {
     end: [Function: onReadableStreamEnd],
     close: [Function]
   },
   _eventsCount: 2,
   _maxListeners: undefined,
   _writableState: WritableState {
     objectMode: false,
     highWaterMark: 16384,
     finalCalled: false,
     needDrain: false,
     ending: false,
     ended: false,
     finished: false,
     destroyed: true,
     decodeStrings: false,
     defaultEncoding: 'utf8',
     length: 0,
     writing: false,
     corked: 0,
     sync: true,
     bufferProcessing: false,
     onwrite: [Function: bound onwrite],
     writecb: null,
     writelen: 0,
     bufferedRequest: null,
     lastBufferedRequest: null,
     pendingcb: 0,
     prefinished: false,
     errorEmitted: false,
     emitClose: false,
     autoDestroy: false,
     bufferedRequestCount: 0,
     corkedRequestsFree: [Object]
   },
   writable: false,
   allowHalfOpen: false,
   _sockname: null,
   _pendingData: null,
   _pendingEncoding: '',
   server: null,
   _server: null,
   write: [Function: writeAfterFIN],
   [Symbol(asyncId)]: 6,
   [Symbol(kHandle)]: null,
   [Symbol(lastWriteQueueSize)]: 0,
   [Symbol(timeout)]: null,
   [Symbol(kBuffer)]: null,
   [Symbol(kBufferCb)]: null,
   [Symbol(kBufferGen)]: null,
   [Symbol(kBytesRead)]: 0,
   [Symbol(kBytesWritten)]: 0
 },
 stderr: Socket {
   connecting: false,
   _hadError: false,
   _parent: null,
   _host: null,
   _readableState: ReadableState {
     objectMode: false,
     highWaterMark: 16384,
     buffer: BufferList { head: null, tail: null, length: 0 },
     length: 0,
     pipes: null,
     pipesCount: 0,
     flowing: null,
     ended: true,
     endEmitted: true,
     reading: false,
     sync: false,
     needReadable: false,
     emittedReadable: false,
     readableListening: false,
     resumeScheduled: false,
     paused: true,
     emitClose: false,
     autoDestroy: false,
     destroyed: true,
     defaultEncoding: 'utf8',
     awaitDrain: 0,
     readingMore: false,
     decoder: null,
     encoding: null
   },
   readable: false,
   _events: [Object: null prototype] {
     end: [Function: onReadableStreamEnd],
     close: [Function]
   },
   _eventsCount: 2,
   _maxListeners: undefined,
   _writableState: WritableState {
     objectMode: false,
     highWaterMark: 16384,
     finalCalled: false,
     needDrain: false,
     ending: false,
     ended: false,
     finished: false,
     destroyed: true,
     decodeStrings: false,
     defaultEncoding: 'utf8',
     length: 0,
     writing: false,
     corked: 0,
     sync: true,
     bufferProcessing: false,
     onwrite: [Function: bound onwrite],
     writecb: null,
     writelen: 0,
     bufferedRequest: null,
     lastBufferedRequest: null,
     pendingcb: 0,
     prefinished: false,
     errorEmitted: false,
     emitClose: false,
     autoDestroy: false,
     bufferedRequestCount: 0,
     corkedRequestsFree: [Object]
   },
   writable: false,
   allowHalfOpen: false,
   _sockname: null,
   _pendingData: null,
   _pendingEncoding: '',
   server: null,
   _server: null,
   write: [Function: writeAfterFIN],
   [Symbol(asyncId)]: 7,
   [Symbol(kHandle)]: null,
   [Symbol(lastWriteQueueSize)]: 0,
   [Symbol(timeout)]: null,
   [Symbol(kBuffer)]: null,
   [Symbol(kBufferCb)]: null,
   [Symbol(kBufferGen)]: null,
   [Symbol(kBytesRead)]: 0,
   [Symbol(kBytesWritten)]: 0
 },
 stdio: [
   Socket {
     connecting: false,
     _hadError: false,
     _parent: null,
     _host: null,
     _readableState: [ReadableState],
     readable: false,
     _events: [Object: null prototype],
     _eventsCount: 1,
     _maxListeners: undefined,
     _writableState: [WritableState],
     writable: false,
     allowHalfOpen: false,
     _sockname: null,
     _pendingData: null,
     _pendingEncoding: '',
     server: null,
     _server: null,
     [Symbol(asyncId)]: 5,
     [Symbol(kHandle)]: null,
     [Symbol(lastWriteQueueSize)]: 0,
     [Symbol(timeout)]: null,
     [Symbol(kBuffer)]: null,
     [Symbol(kBufferCb)]: null,
     [Symbol(kBufferGen)]: null,
     [Symbol(kBytesRead)]: 0,
     [Symbol(kBytesWritten)]: 0
   },
   Socket {
     connecting: false,
     _hadError: false,
     _parent: null,
     _host: null,
     _readableState: [ReadableState],
     readable: false,
     _events: [Object: null prototype],
     _eventsCount: 2,
     _maxListeners: undefined,
     _writableState: [WritableState],
     writable: false,
     allowHalfOpen: false,
     _sockname: null,
     _pendingData: null,
     _pendingEncoding: '',
     server: null,
     _server: null,
     write: [Function: writeAfterFIN],
     [Symbol(asyncId)]: 6,
     [Symbol(kHandle)]: null,
     [Symbol(lastWriteQueueSize)]: 0,
     [Symbol(timeout)]: null,
     [Symbol(kBuffer)]: null,
     [Symbol(kBufferCb)]: null,
     [Symbol(kBufferGen)]: null,
     [Symbol(kBytesRead)]: 0,
     [Symbol(kBytesWritten)]: 0
   },
   Socket {
     connecting: false,
     _hadError: false,
     _parent: null,
     _host: null,
     _readableState: [ReadableState],
     readable: false,
     _events: [Object: null prototype],
     _eventsCount: 2,
     _maxListeners: undefined,
     _writableState: [WritableState],
     writable: false,
     allowHalfOpen: false,
     _sockname: null,
     _pendingData: null,
     _pendingEncoding: '',
     server: null,
     _server: null,
     write: [Function: writeAfterFIN],
     [Symbol(asyncId)]: 7,
     [Symbol(kHandle)]: null,
     [Symbol(lastWriteQueueSize)]: 0,
     [Symbol(timeout)]: null,
     [Symbol(kBuffer)]: null,
     [Symbol(kBufferCb)]: null,
     [Symbol(kBufferGen)]: null,
     [Symbol(kBytesRead)]: 0,
     [Symbol(kBytesWritten)]: 0
   }
 ]
}
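
    For what it is worth, the dump above shows signalCode: 'SIGSEGV' with exitCode: null, i.e. ffmpeg was killed before writing anything to pipe:1, which would explain the empty JPG. A minimal diagnostic sketch (not from the question; it takes the question's ffmpegPath and target as parameters) that captures stderr and lets ffmpeg write the file itself:

// Sketch only: surface ffmpeg's stderr and have ffmpeg write the frame to /tmp
// directly, which makes a crash like the SIGSEGV above easier to diagnose.
const { spawn } = require("child_process");

function grabThumbnail(ffmpegPath, target) {
  return new Promise((resolve, reject) => {
    const outPath = "/tmp/screenshot.jpg";
    const ffmpeg = spawn(ffmpegPath, [
      "-ss", "00:00:05",
      "-i", target,
      "-vf", "thumbnail,scale=200:200",
      "-qscale:v", "2",
      "-frames:v", "1",
      "-y", outPath            // write the file directly instead of pipe:1
    ]);

    let stderr = "";
    ffmpeg.stderr.on("data", chunk => { stderr += chunk; });

    ffmpeg.on("error", reject);
    ffmpeg.on("close", (code, signal) => {
      if (code === 0) {
        resolve(outPath);      // upload outPath to S3 from here
      } else {
        reject(new Error(`ffmpeg exited with code ${code}, signal ${signal}:\n${stderr}`));
      }
    });
  });
}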


    


  • Convert Blob audio file to mp3 type in TypeScript

    6 April 2022, by I. Albuquerque

    I'm trying to convert a Blob audio file to .mp3. The Blob is generated by MediaRecorder and comes back as webm; I have tried other types in mimeType (the MediaRecorder attribute that sets the type), but they are not supported. I then tried an ffmpeg npm library, but it asks for the path of a file, and since I am not saving the recording to disk that did not work for me either. Any suggestion or answer that helps is welcome!

    


    Here is how I get the audio:

    


    getAudio(){
  navigator.mediaDevices.getUserMedia({ audio: true})
    .then( stream => {
      console.log(stream)
      this.mediaRecord = new MediaRecorder(stream)

      this.mediaRecord.ondataavailable = (data: { data: any; }) => {
        console.log(data)
        this.chunks.push(data.data)
      }

      this.mediaRecord.onstop = () => {
        const blob = new Blob(this.chunks, { type: 'audio/mp3'})
        const reader = new window.FileReader()
        reader.readAsDataURL(blob)
        reader.onloadend = () => {
          const teste:any = this.$el.querySelector('#teste')
          teste.src = reader.result // reader.result is where the audio is stored
          this.ArquivoAudio = blob
          console.log(reader.result)
        }
      }
    }, err => {
      console.log(err)
      alert('voce deve permitir a captura de audio')
    })
},
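
    One possible direction (a sketch, not a confirmed answer): npm ffmpeg wrappers that require a file path can be replaced by ffmpeg.wasm (@ffmpeg/ffmpeg), which accepts in-memory data, so the Blob never needs to be written to disk. The sketch assumes the ~0.11 createFFmpeg/fetchFile API, which should be checked against the installed version:

// Sketch only: convert the recorded webm Blob to mp3 in the browser with
// ffmpeg.wasm. Assumes @ffmpeg/ffmpeg ~0.11 (createFFmpeg/fetchFile API).
import { createFFmpeg, fetchFile } from '@ffmpeg/ffmpeg'

const ffmpeg = createFFmpeg({ log: true })

async function webmBlobToMp3(webmBlob) {
  if (!ffmpeg.isLoaded()) {
    await ffmpeg.load()
  }
  // Write the Blob into ffmpeg's virtual filesystem; no real file path is needed.
  ffmpeg.FS('writeFile', 'input.webm', await fetchFile(webmBlob))
  await ffmpeg.run('-i', 'input.webm', 'output.mp3')
  const data = ffmpeg.FS('readFile', 'output.mp3')
  return new Blob([data.buffer], { type: 'audio/mp3' })
}

    The returned Blob could then be assigned to this.ArquivoAudio or offered for download in place of the webm recording.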


    


  • set MediaRecorder to record 1 frame every N seconds

    19 August 2022, by The Blind Hawk

    Summary

    


    I have a version of my code already working on Chrome and Edge, but I need some fixes for it to work on Safari.
    
My objective is to record around 25 minutes and download a timelapse version of the recording.
    
Final product requirements:

    


    speed: 3fps
length: ~25s

(I need to record one frame every 20 seconds for 25 mins)


    


    this.secondStream settings:

    


    this.secondStream = await navigator.mediaDevices.getUserMedia({
    audio: false,
    video: {width: 430, height: 430, facingMode: "user"}
});


    


    My code for iOS so far:

    


            startIOSVideoRecording: function() {
            console.log("setting up recorder");
            var self = this;
            this.data = [];

            if (MediaRecorder.isTypeSupported('video/mp4')) {
                // IOS does not support webm, so I will be using mp4
                var options = {mimeType: 'video/mp4', videoBitsPerSecond : 1000000};
            } else {
                console.log("ERROR: mp4 is not supported, trying to default to webm");
                var options = {mimeType: 'video/webm'};
            }
            console.log("options settings:");
            console.log(options);

            this.recorder = new MediaRecorder(this.secondStream, options);

            this.recorder.ondataavailable = function(evt) {
                if (evt.data && evt.data.size > 0) {
                    self.data.push(evt.data);
                    console.log('chunk size: ' + evt.data.size);
                }
            }

            this.recorder.onstop = function(evt) {
                console.log('recorder stopping');
                var blob = new Blob(self.data, {type: "video/mp4"});
                self.download(blob, "mp4");
                self.sendMail(videoBlob);
            }

            console.log("finished setup, starting")
            this.recorder.start(1200);

            function sleep(ms) { return new Promise(resolve => setTimeout(resolve, ms));}

            async function looper() {
                // I am trying to pick one second every 20 more or less
                await sleep(500);
                self.recorder.pause();
                await sleep(18000);
                self.recorder.resume();
                looper();
            }
            looper();
        },


    


    Issues

    


    Only one call to getUserMedia()

    


    I am already using this.secondStream elsewhere, and I need its settings to stay as they are for the other functionality.
    
On Chrome and Edge, I could just call getUserMedia() again with different settings and the issue would be solved, but on iOS calling getUserMedia() a second time kills the first stream.
    
The settings that I was planning to use (works on Chrome and Edge):

    


    navigator.mediaDevices.getUserMedia({
    audio: false,
    video: { 
        width: 360, height: 240, facingMode: "user", 
        frameRate: { min:0, ideal: 0.05, max:0.1 } 
    },
})


    


    The timelapse library I am using does not support mp4 (ffmpeg as an alternative?)

    


    Apparently I am forced to use mp4 on iOS, but that prevents me from using the library I was relying on, so I need an alternative.
    
I am thinking of using ffmpeg, but I cannot find any documentation on making it interact with the blob before the download (see the sketch below).
    
I do not want to edit the video after downloading it; I want to download the already edited version, so no terminal commands.
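
    A sketch of that idea (assuming @ffmpeg/ffmpeg ~0.11, as in the previous question, and a speed factor chosen only as an example) that speeds up the recorded Blob in the browser before it is downloaded:

// Sketch only: turn the recorded mp4 Blob into a timelapse in the browser with
// ffmpeg.wasm before download. Assumes @ffmpeg/ffmpeg ~0.11.
import { createFFmpeg, fetchFile } from '@ffmpeg/ffmpeg';

async function blobToTimelapse(mp4Blob, speedFactor = 60) {
  const ffmpeg = createFFmpeg({ log: true });
  await ffmpeg.load();
  ffmpeg.FS('writeFile', 'in.mp4', await fetchFile(mp4Blob));
  // Drop audio and rescale presentation timestamps to speed the clip up.
  await ffmpeg.run('-i', 'in.mp4', '-an', '-vf', `setpts=PTS/${speedFactor}`, 'out.mp4');
  const data = ffmpeg.FS('readFile', 'out.mp4');
  return new Blob([data.buffer], { type: 'video/mp4' });
}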

    


    MediaRecorder pause and resume are not ideal

    


    On Chrome and Edge I would keep one frame every 20 seconds by setting the frameRate to 0.05, but this does not seem to work on iOS, for two reasons.
    
The first is tied to the issue above: I cannot change the getUserMedia() settings without destroying the initial stream.
    
Even after changing the settings, it seems that a frame rate below 1 is not supported on iOS. Maybe I wrote something else wrong, but I was not able to open the downloaded file.
    
I therefore tried relying on pausing and resuming the MediaRecorder, but this brings two more issues:
    
I am currently saving 1 second every 20 seconds, not 1 frame every 20 seconds, and I cannot find any workaround.
    
Pausing and resuming take a little time, making the code unreliable: I sometimes capture 2 seconds out of 20 instead of 1, and I have no guarantee that the loop actually runs every 20 seconds (it might be 18, it might be 25).
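
    Not part of the question, but one commonly suggested workaround for the one-frame-every-N-seconds problem is to draw the camera stream into a canvas on a timer and record canvas.captureStream() instead of the camera stream, so that MediaRecorder only ever sees the frames that are explicitly pushed. A sketch, assuming canvas.captureStream(0) and CanvasCaptureMediaStreamTrack.requestFrame() behave as expected on the target Safari version (which would need verifying):

// Sketch only: push one canvas frame every `intervalMs` instead of pausing and
// resuming the recorder. Browser support for requestFrame() must be checked.
function recordTimelapse(sourceStream, intervalMs = 20000) {
  const video = document.createElement('video');
  video.srcObject = sourceStream;
  video.muted = true;
  video.play();

  const canvas = document.createElement('canvas');
  canvas.width = 430;
  canvas.height = 430;
  const ctx = canvas.getContext('2d');

  // With a frame rate of 0, frames are only emitted when requestFrame() is called.
  const canvasStream = canvas.captureStream(0);
  const track = canvasStream.getVideoTracks()[0];

  // mp4 mirrors the question's iOS constraint; webm would be used elsewhere.
  const recorder = new MediaRecorder(canvasStream, { mimeType: 'video/mp4' });
  const chunks = [];
  recorder.ondataavailable = evt => { if (evt.data.size > 0) chunks.push(evt.data); };

  const timer = setInterval(() => {
    ctx.drawImage(video, 0, 0, canvas.width, canvas.height);
    if (track.requestFrame) track.requestFrame();   // push exactly one frame
  }, intervalMs);

  recorder.start();
  return {
    stop: () => new Promise(resolve => {
      clearInterval(timer);
      recorder.onstop = () => resolve(new Blob(chunks, { type: 'video/mp4' }));
      recorder.stop();
    })
  };
}

    Note that the captured frames keep their real-time timestamps, so a separate speed-up pass (for example the setpts sketch above) would still be needed to turn the sparse recording into a ~3 fps timelapse.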