
Media (91)

Other articles (20)

  • Qualité du média après traitement

    21 June 2013, by

    Properly configuring the software that processes the media is important to strike a balance between the parties involved (the host's bandwidth, media quality for the editor and the visitor, accessibility for the visitor). How should you set the quality of your media?
    The higher the media quality, the more bandwidth is used, and a visitor with a low-speed internet connection will have to wait longer. Conversely, the lower the quality, the more degraded the media becomes, or even (...)
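
    As a purely hypothetical illustration of this trade-off (example ffmpeg settings, not MediaSPIP's actual configuration), a lower CRF or higher audio bitrate raises quality but consumes more of the host's bandwidth, while the opposite settings lighten the load for visitors on slow connections:

    # higher quality, heavier on the host's bandwidth (hypothetical values)
    ffmpeg -i source.mov -c:v libx264 -crf 18 -c:a aac -b:a 192k high_quality.mp4
    # lower quality, lighter for visitors on slow connections
    ffmpeg -i source.mov -c:v libx264 -crf 30 -c:a aac -b:a 96k low_bandwidth.mp4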

  • Websites made with MediaSPIP

    2 May 2011, by

    This page lists some websites based on MediaSPIP.

  • Creating farms of unique websites

    13 April 2011, by

    MediaSPIP platforms can be installed as a farm, with a single "core" hosted on a dedicated server and used by multiple websites.
    This allows (among other things):
    • implementation costs to be shared between several different projects / individuals
    • rapid deployment of multiple unique sites
    • creation of groups of like-minded sites, making it possible to browse media in a more controlled and selective environment than the major "open" (...)

On other sites (6542)

  • use ffmpeg to add audio/video sequences from multiple other files in another file

    9 November 2015, by robert paré

    I have several mp4 files that contain video and audio content ("base" videos)

    I have a length value (recordingLength) that the output video should last (let's say 3600 seconds).

    What I'm trying is to first create a blank file with the total length (recordingLength) of 3600 seconds. I do it like so:

    ffmpeg -f lavfi -i anullsrc -t 3600 -c:a libfdk_aac ./output.mp4

    Then, let's say I have 5 base videos, each lasting a couple of minutes; I need to add each of them at a specific time in the blank file created above. I tried playing around with -itsoffset {seconds} -t {seconds}, but I'm still really just trying to get my head around all the options.
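
    One possible direction (a rough sketch only, not a verified solution: base1.mp4, the 600-second offset, the 1280x720 canvas and the libx264/aac encoders are made-up examples, and the base clips are assumed to match the canvas resolution) is to generate the black/silent canvas inline rather than as a separate blank file, shift the clip's timestamps, then overlay its video and mix its audio onto the canvas:

    # place base1.mp4 at t=600s inside a 3600s blank canvas (sketch, not tested)
    ffmpeg -f lavfi -i "color=c=black:s=1280x720:r=30:d=3600" \
           -f lavfi -i "anullsrc=channel_layout=stereo:sample_rate=44100:d=3600" \
           -i base1.mp4 \
           -filter_complex "[2:v]setpts=PTS-STARTPTS+600/TB[v1];[0:v][v1]overlay=enable='gte(t,600)':eof_action=pass[vout];[2:a]adelay=600000|600000[a1];[1:a][a1]amix=inputs=2:duration=first[aout]" \
           -map "[vout]" -map "[aout]" -c:v libx264 -c:a aac -t 3600 output.mp4

    For several base videos the same pattern would repeat: one setpts/adelay pair per clip, chained overlay filters for the video, and a single amix whose inputs count covers all audio streams.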

  • How to stream to NestJS if the data is transmitted through a NATS broker?

    13 August 2023, by О. Войтенко

    I have an RTSP to MPEG-TS streamer (ffmpeg wrapped in Express for publishing) that publishes the stream to NATS JetStream.

    


    const options =
     [
         '-rtsp_transport', 'tcp',                    // pull the RTSP input over TCP
         '-i', streamUrl,
         '-f', 'mpegts',                              // mux the output as MPEG-TS
         '-pix_fmt', 'yuv420p',
         '-c:v', 'h264',
         '-b:v', '800k',
         '-r', '30',
         '-muxdelay', '0.4',
         '-movflags', 'frag_keyframe+empty_moov',
         '-'                                          // write to stdout
     ];


    


    try {
         const natsConnection = await nats.connect({
             servers: 'ws://127.0.0.1:9222',
             preserveBuffers: true
         })

         const stream = spawn('ffmpeg', options);

         stream.stdout.on('data', (chunk) => {
             console.log(chunk);
             natsConnection.publish('stream.1', chunk);
         });

         stream.stdout.on('error', (error) => {
             console.error('FFmpeg error:', error);
         });

         // the exit code is emitted by the child process itself, not by stdout
         stream.on('close', (code) => {
             console.log('FFmpeg process closed with code:', code);
         });
     } catch (e) {
         console.error(e);
     }


    


    The NestJS application is supposed to subscribe to this stream when a visitor hits the route.
    The controller method itself:

    


    @Get()
   async findOne(
     @Headers() headers,
     @Res() res,
   ) {
     const streamObservable = this.natsService.subscribeStreamEvents('1');

     // Set appropriate headers for video streaming
     res.setHeader('Content-Type', 'video/mp4');
     res.setHeader('Transfer-Encoding', 'chunked');

     res.status(200);

     // Write data chunks directly to the response
     const subscription = streamObservable.subscribe({
       next: (chunk) => {
         console.log('wwww', chunk);
         res.write(chunk);
       },
       error: (error) => {
         console.error('Error streaming video:', error);
       },
       complete: () => {
          res.end();
       },
     });

     res.on('close', () => {
       subscription.unsubscribe();
     });
   }


    


    And the service method:

    


    subscribeToCameraEvents(cameraId: string) {
      const subject = new Subject<any>();

      this.natsConnection.subscribe(`stream.${cameraId}`, {
        callback: (err, msg: Msg) => {
          if (err) {
            subject.error(err);
            return;
          }
          // console.log(msg.data);
          subject.next(msg.data);
        },
      });

      return subject.asObservable();
    }

    The data is logged to the console as a Buffer.

    Trying to play the stream:

    <video controls="controls" width="640" height="480" style="background: black">
      <source src="http://localhost:3301/api/streams" type="video/mp4">
      Your browser does not support the video tag.
    </video>

    The route in the source attribute is correct.

    The problem is that through VLC and ffplay (localhost:3301/api/streams) the stream is displayed, and everything works fine without any additional settings.

    But in HTML it does not play.

    What am I doing wrong?

    I will be glad for any comments and additional questions.

    I have tried canvas and jsmpeg (they do not work either), and @Res({ passthrough: true }), with which the stream does not work in VLC, ffplay or HTML (without passthrough the stream works only in VLC and ffplay).


  • avcodec/mpegvideo: flip motion vector visualization for backward motion vectors

    9 July 2014, by Michael Niedermayer
    avcodec/mpegvideo: flip motion vector visualization for backward motion vectors
    

    Also support changing arrow head/tail shape

    Signed-off-by: Michael Niedermayer <michaelni@gmx.at>

    • [DH] libavcodec/mpegvideo.c