
Other articles (88)

  • Supporting all media types

    13 April 2011, by

    Unlike most software and media-sharing platforms, MediaSPIP aims to manage as many different media types as possible. The following are just a few examples from an ever-expanding list of supported formats: images: png, gif, jpg, bmp and more; audio: MP3, Ogg, Wav and more; video: AVI, MP4, OGV, mpg, mov, wmv and more; text, code and other data: OpenOffice, Microsoft Office (Word, PowerPoint, Excel), web (html, CSS), LaTeX, Google Earth and (...)

  • MediaSPIP v0.2

    21 June 2013, by

    MediaSPIP 0.2 is the first stable version of MediaSPIP.
    Its official release date is 21 June 2013, and it is announced here.
    The zip file provided here contains only the MediaSPIP sources in the standalone version.
    As with the previous version, all software dependencies must be installed manually on the server.
    If you wish to use this archive for a farm-mode installation, you will also need to make further modifications (...)

  • The farm's regular Cron tasks

    1 December 2010, by

    Managing the farm involves running several repetitive tasks, known as Cron tasks, at regular intervals.
    The super Cron (gestion_mutu_super_cron)
    This task, scheduled every minute, simply calls the Cron of every instance of the shared hosting on a regular basis. Combined with a system Cron on the central site of the shared hosting, this makes it possible to generate regular visits to the various sites and to prevent the tasks of rarely visited sites from being too (...)

On other sites (9802)

  • WebRTC to RTMP: send video from camera to an RTMP link

    14 April 2024, by Leo-Mahendra

    I can't send the video from WebRTC: it is converted to buffered data every 10 seconds and sent to server.js, which receives it via WebSockets and converts it to FLV format using ffmpeg.

    


    I am trying to send it to an RTMP server named Restreamer to start with. Here I tried to convert the buffer data and send it to the RTMP link using ffmpeg commands; initially I managed to successfully save the file from WebRTC to MP4 format for a duration of 2-3 minutes.

    


    After that I tried to use WebRTC to send video data every 10 seconds, and on the server I tried to forward it to RTMP, but I can't. I can see that the connection between the RTMP URL and the server is established, but I can't see the video. I see the logs in the RTMP server as:

    


    2024-04-14 12:35:45 ts=2024-04-14T07:05:45Z level=INFO component="RTMP" msg="no streams available" action="INVALID" address=":1935" client="172.17.0.1:37700" path="/3d30c5a9-2059-4843-8957-da963c7bc19b.stream" who="PUBLISH"
2024-04-14 12:35:45 ts=2024-04-14T07:05:45Z level=INFO component="RTMP" msg="no streams available" action="INVALID" address=":1935" client="172.17.0.1:37716" path="/3d30c5a9-2059-4843-8957-da963c7bc19b.stream" who="PUBLISH"
2024-04-14 12:35:45 ts=2024-04-14T07:05:45Z level=INFO component="RTMP" msg="no streams available" action="INVALID" address=":1935" client="172.17.0.1:37728" path="/3d30c5a9-2059-4843-8957-da963c7bc19b.stream" who="PUBLISH"   


    


    My frontend code:

    


         const handleSendVideo = async () => {
        console.log("start");
    
        if (!ws) {
            console.error('WebSocket connection not established.');
            return;
        }
    
        try {
            const videoStream = await navigator.mediaDevices.getUserMedia({ video: true });
            const mediaRecorder = new MediaRecorder(videoStream);
    
            const requiredFrameSize = 460800;
            const frameDuration = 10 * 1000; // 10 seconds in milliseconds
    
            mediaRecorder.ondataavailable = async (event) => {
                if (ws.readyState !== WebSocket.OPEN) {
                    console.error('WebSocket connection is not open.');
                    return;
                }
    
                if (event.data.size > 0) {
                    const arrayBuffer = await event.data.arrayBuffer();
                    const uint8Array = new Uint8Array(arrayBuffer);
    
                    const width = videoStream.getVideoTracks()[0].getSettings().width;
                    const height = videoStream.getVideoTracks()[0].getSettings().height;
    
                    const numFrames = Math.ceil(uint8Array.length / requiredFrameSize);
    
                    for (let i = 0; i < numFrames; i++) {
                        const start = i * requiredFrameSize;
                        const end = Math.min((i + 1) * requiredFrameSize, uint8Array.length);
                        let frameData = uint8Array.subarray(start, end);
    
                        // Pad or trim the frameData to match the required size
                        if (frameData.length < requiredFrameSize) {
                            // Pad with zeros to reach the required size
                            const paddedData = new Uint8Array(requiredFrameSize);
                            paddedData.set(frameData, 0);
                            frameData = paddedData;
                        } else if (frameData.length > requiredFrameSize) {
                            // Trim to match the required size
                            frameData = frameData.subarray(0, requiredFrameSize);
                        }
    
                        const dataToSend = {
                            buffer: Array.from(frameData), // Convert Uint8Array to array of numbers
                            width: width,
                            height: height,
                            pixelFormat: 'yuv420p',
                            mode: 'SendRtmp'
                        };
    
                        console.log("Sending frame:", i);
                        ws.send(JSON.stringify(dataToSend));
                    }
                }
            };
    
            // Start recording and send data every 10 seconds
            mediaRecorder.start(frameDuration);
    
            console.log("MediaRecorder started.");
        } catch (error) {
            console.error('Error accessing media devices or starting recorder:', error);
        }
      };


    


    And my backend:

    


        wss.on('connection', (ws) => {
    console.log('WebSocket connection established.');

    ws.on('message', async (data) => {
        try {
            const parsedData = JSON.parse(data);

            if (parsedData.mode === 'SendRtmp' && Array.isArray(parsedData.buffer)) {
                const { buffer, pixelFormat, width, height } = parsedData;
                const bufferArray = Buffer.from(buffer);

                await sendRtmpVideo(bufferArray, pixelFormat, width, height);
            } else {
                console.log('Received unknown or invalid mode or buffer data');
            }
        } catch (error) {
            console.error('Error parsing WebSocket message:', error);
        }
    });

    ws.on('close', () => {
        console.log('WebSocket connection closed.');
    });
    });
    const sendRtmpVideo = async (frameBuffer, pixelFormat, width, height) => {
    console.log("ffmpeg data", frameBuffer);
    try {
        const ratio = `${width}x${height}`;
        const ffmpegCommand = [
            '-re',
            '-f', 'rawvideo',
            '-pix_fmt', pixelFormat,
            '-s', ratio,
            '-i', 'pipe:0',
            '-c:v', 'libx264',
            '-preset', 'fast', // Specify the preset for libx264
            '-b:v', '3000k',    // Specify the video bitrate
            '-loglevel', 'debug',
            '-f', 'flv',
            // '-flvflags', 'no_duration_filesize', 
            RTMPLINK
        ];


        const ffmpeg = spawn('ffmpeg', ffmpegCommand);

        ffmpeg.on('exit', (code, signal) => {
            if (code === 0) {
                console.log('FFmpeg process exited successfully.');
            } else {
                console.error(`FFmpeg process exited with code ${code} and signal ${signal}`);
            }
        });

        ffmpeg.on('error', (error) => {
            console.error('FFmpeg spawn error:', error);
        });

        ffmpeg.stderr.on('data', (data) => {
            console.error(`FFmpeg stderr: ${data}`);
        });

        ffmpeg.stdin.write(frameBuffer, (err) => {
            if (err) {
                console.error('Error writing to FFmpeg stdin:', err);
            } else {
                console.log('Data written to FFmpeg stdin successfully.');
            }
            ffmpeg.stdin.end(); // Close stdin after writing the buffer
        });
    } catch (error) {
        console.error('Error in sendRtmpVideo:', error);
    }
    };
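
    A note on the likely failure, as a hedged sketch rather than a verified fix: MediaRecorder emits data inside a WebM container, not raw yuv420p frames, so describing stdin as `-f rawvideo` gives ffmpeg nothing it can parse, which would match the "no streams available" log. The sketch below builds arguments for a single long-lived ffmpeg process that demuxes WebM from stdin instead; the function names and the RTMP URL are placeholders.

    ```javascript
    // Sketch, assuming MediaRecorder chunks are WebM-containered data,
    // so ffmpeg demuxes the container instead of reading raw yuv420p.
    const { spawn } = require('child_process');

    function buildWebmToRtmpArgs(rtmpUrl) {
      return [
        '-f', 'webm',      // demux the WebM container MediaRecorder produces
        '-i', 'pipe:0',    // read chunks from stdin
        '-c:v', 'libx264',
        '-preset', 'veryfast',
        '-f', 'flv',
        rtmpUrl,           // placeholder RTMP ingest link
      ];
    }

    // Spawn one long-lived process and write every incoming chunk to the
    // same stdin, rather than spawning a fresh ffmpeg per WebSocket message.
    function startPublisher(rtmpUrl) {
      const ffmpeg = spawn('ffmpeg', buildWebmToRtmpArgs(rtmpUrl));
      ffmpeg.stderr.on('data', (d) => process.stderr.write(`ffmpeg: ${d}`));
      return ffmpeg; // caller calls ffmpeg.stdin.write(chunk) per message
    }
    ```

    The key design point is keeping one process alive for the whole stream: ending stdin after every chunk, as in the code above, terminates the RTMP publish each time.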



    


  • How do I play an HLS stream when the playlist.m3u8 file is constantly being updated?

    3 January 2021, by Adnan Ahmed

    I am using MediaRecorder to record chunks of my live video in webm format from a MediaStream, converting these chunks to .ts files on the server using ffmpeg, and then updating my playlist.m3u8 file with this code:

    


    function generateM3u8Playlist(fileDataArr, playlistFp, isLive, cb) {
    var durations = fileDataArr.map(function(fd) {
        return fd.duration;
    });
    var maxT = maxOfArr(durations);

    var meta = [
        '#EXTM3U',
        '#EXT-X-VERSION:3',
        '#EXT-X-MEDIA-SEQUENCE:0',
        '#EXT-X-ALLOW-CACHE:YES',
        '#EXT-X-TARGETDURATION:' + Math.ceil(maxT),
    ];

    fileDataArr.forEach(function(fd) {
        meta.push('#EXTINF:' + fd.duration.toFixed(2) + ',');
        meta.push(fd.fileName2);
    });

    if (!isLive) {
        meta.push('#EXT-X-ENDLIST');
    }

    meta.push('');
    meta = meta.join('\n');

    fs.writeFile(playlistFp, meta, cb);
}


    


    Here fileDataArr holds information for all the chunks that have been created.

    


    After that I use this code to create an HLS server:

    


    var runStreamServer = (function(streamFolder) {
    var executed = false;
    return function(streamFolder) {
        if (!executed) {
            executed = true;
            var HLSServer = require('hls-server')
            var http = require('http')

            var server = http.createServer()
            var hls = new HLSServer(server, {
                path: '/stream', // Base URI to output HLS streams
                dir: 'C:\\Users\\Work\\Desktop\\live-stream\\webcam2hls\\videos\\' + streamFolder // Directory that input files are stored
            })
            console.log("We are going to stream from folder:" + streamFolder);
            server.listen(8000);
            console.log('Server Listening on Port 8000');
        }
    };
})();


    


    The problem is that if I stop creating new chunks and then use the HLS server link
http://localhost:8000/stream/playlist.m3u8, the video plays in VLC; but if I try to play during the recording, it keeps loading the file and never plays. I want it to play while new chunks are being created and playlist.m3u8 is being updated. The quirk in the generateM3u8Playlist function is that it adds '#EXT-X-ENDLIST' to the playlist file only after I have stopped recording.
The software is still in production, so the code is a bit messy. Thank you for any answers.
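
    One thing worth checking, sketched under the assumption that a sliding window of recent segments is served while live: the playlist hard-codes `#EXT-X-MEDIA-SEQUENCE:0`, but once older segments drop out of a live playlist, players expect that value to advance to the sequence number of the first segment still listed. The function and field names below are illustrative, not the project's actual API.

    ```javascript
    // Sketch: build a live playlist whose media sequence tracks the
    // sequence number of the first segment still in the window.
    function generateLiveM3u8(segments, firstSeqNum) {
      const maxT = Math.ceil(Math.max(...segments.map((s) => s.duration)));
      const lines = [
        '#EXTM3U',
        '#EXT-X-VERSION:3',
        '#EXT-X-TARGETDURATION:' + maxT,
        '#EXT-X-MEDIA-SEQUENCE:' + firstSeqNum, // advances as the window slides
      ];
      for (const s of segments) {
        lines.push('#EXTINF:' + s.duration.toFixed(2) + ',');
        lines.push(s.fileName);
      }
      // No #EXT-X-ENDLIST while live: its absence is what tells players
      // to keep re-fetching the playlist for new segments.
      return lines.join('\n') + '\n';
    }
    ```

    With this shape, a player that re-fetches the playlist can tell which segments are new relative to its last fetch, instead of seeing every window as starting at sequence 0.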

    


    The client side that generates blobs is as follows:

    


    var mediaConstraints = {
            video: true,
            audio:true
        };
navigator.getUserMedia(mediaConstraints, onMediaSuccess, onMediaError);
function onMediaSuccess(stream) {
            console.log('will start capturing and sending ' + (DT / 1000) + 's videos when you press start');
            var mediaRecorder = new MediaStreamRecorder(stream);

            mediaRecorder.mimeType = 'video/webm';

            mediaRecorder.ondataavailable = function(blob) {
                var count2 = zeroPad(count, 5);
                // here count2 just creates a blob number 
                console.log('sending chunk ' + name + ' #' + count2 + '...');
                send('/chunk/' + name + '/' + count2 + (stopped ? '/finish' : ''), blob);
                ++count;
            };
        }
// Here we have the send function which sends our blob to server:
        function send(url, blob) {
            var xhr = new XMLHttpRequest();
            xhr.open('POST', url, true);

            xhr.responseType = 'text/plain';
            xhr.setRequestHeader('Content-Type', 'video/webm');
            //xhr.setRequestHeader("Content-Length", blob.length);

            xhr.onload = function(e) {
                if (this.status === 200) {
                    console.log(this.response);
                }
            };
            xhr.send(blob);
        }


    


    The code that receives the XHR request is as follows:

    


    var parts = u.split('/');
        var prefix = parts[2];
        var num = parts[3];
        var isFirst = false;
        var isLast = !!parts[4];

        if ((/^0+$/).test(num)) {
            var path = require('path');
            shell.mkdir(path.join(__dirname, 'videos', prefix));
            isFirst = true;
        }

        var fp = 'videos/' + prefix + '/' + num + '.webm';
        var msg = 'got ' + fp;
        console.log(msg);
        console.log('isFirst:%s, isLast:%s', isFirst, isLast);

        var stream = fs.createWriteStream(fp, { encoding: 'binary' });
        /*stream.on('end', function() {
            respond(res, ['text/plain', msg]);
        });*/

        //req.setEncoding('binary');

        req.pipe(stream);
        req.on('end', function() {
            respond(res, ['text/plain', msg]);

            if (!LIVE) { return; }

            var duration = 20;
            var fd = {
                fileName: num + '.webm',
                filePath: fp,
                duration: duration
            };
            var fileDataArr;
            if (isFirst) {
                fileDataArr = [];
                fileDataArrs[prefix] = fileDataArr;
            } else {
                var fileDataArr = fileDataArrs[prefix];
            }
            try {
                fileDataArr.push(fd);
            } catch (err) {
                fileDataArr = [];
                console.log(err.message);
            }
            videoUtils.computeStartTimes(fileDataArr);

            videoUtils.webm2Mpegts(fd, function(err, mpegtsFp) {
                if (err) { return console.error(err); }
                console.log('created %s', mpegtsFp);

                var playlistFp = 'videos/' + prefix + '/playlist.m3u8';

                var fileDataArr2 = (isLast ? fileDataArr : lastN(fileDataArr, PREV_ITEMS_IN_LIVE));

                var action = (isFirst ? 'created' : (isLast ? 'finished' : 'updated'));

                videoUtils.generateM3u8Playlist(fileDataArr2, playlistFp, !isLast, function(err) {
                    console.log('playlist %s %s', playlistFp, (err ? err.toString() : action));
                });
            });


            runStreamServer(prefix);
        }


    


  • Socket.io client in JS and server in Go (Socket.io) doesn't send connected message and data

    24 March 2023, by OmriHalifa

    I am using ffmpeg and socket.io and I have some issues. I'm trying to send a connection request from React to a server written in Go, but I'm unable to connect to it. I tried adding the events in useEffect and it still doesn't work. What should I do? I am attaching my code in JS and in Go:
    main.go

    


    package main

import (
    "log"

    "github.com/gin-gonic/gin"

    socketio "github.com/googollee/go-socket.io"
)

func main() {
    router := gin.New()

    server := socketio.NewServer(nil)

    server.OnConnect("/", func(s socketio.Conn) error {
        s.SetContext("")
        log.Println("connected:", s.ID())
        return nil
    })

    server.OnEvent("/", "notice", func(s socketio.Conn, msg string) {
        log.Println("notice:", msg)
        s.Emit("reply", "have "+msg)
    })

    server.OnEvent("/", "transcoded-video", func(s socketio.Conn, data string) {
        log.Println("transcoded-video:", data)
    })

    server.OnEvent("/", "bye", func(s socketio.Conn) string {
        last := s.Context().(string)
        s.Emit("bye", last)
        s.Close()
        return last
    })

    server.OnError("/", func(s socketio.Conn, e error) {
        log.Println("meet error:", e)
    })

    server.OnDisconnect("/", func(s socketio.Conn, reason string) {
        log.Println("closed", reason)
    })

    go func() {
        if err := server.Serve(); err != nil {
            log.Fatalf("socketio listen error: %s\n", err)
        }
    }()
    defer server.Close()

    if err := router.Run(":8000"); err != nil {
        log.Fatal("failed run app: ", err)
    }
}



    


    App.js

    


    import './App.css';
import { useEffect } from 'react';
import { createFFmpeg, fetchFile } from '@ffmpeg/ffmpeg';
import { io } from 'socket.io-client';

function App() {
  const socket = io("http://localhost:8000", function() {
    // Send a message to the server when the client is connected
    socket.emit('clientConnected', 'Client has connected to the server!');
  })

  const ffmpegWorker = createFFmpeg({
    log: true
  })

  // Initialize FFmpeg when the component is mounted
  async function initFFmpeg() {
    await ffmpegWorker.load();
  }

  async function transcode(webcamData) {
    const name = 'record.webm';
    await ffmpegWorker.FS('writeFile', name, await fetchFile(webcamData));
    await ffmpegWorker.run('-i', name, '-preset', 'ultrafast', '-threads', '4', 'output.mp4');
    const data = ffmpegWorker.FS('readFile', 'output.mp4');

    // Set the source of the output video element to the transcoded video data
    const video = document.getElementById('output-video');
    video.src = URL.createObjectURL(new Blob([data.buffer], { type: 'video/mp4' }));

    // Remove the output.mp4 file from the FFmpeg virtual file system
    ffmpegWorker.FS('unlink', 'output.mp4');

    // Emit a "transcoded-video" event to the server with the transcoded video data
    socket.emit("transcoded-video", data.buffer)
  }

  let mediaRecorder;
  let chunks = [];

  // Request access to the user's camera and microphone and start recording
  function requestMedia() {
    const webcam = document.getElementById('webcam');
    navigator.mediaDevices.getUserMedia({ video: true, audio: true })
    .then(async (stream) => {
      webcam.srcObject = stream;
      await webcam.play();

      // Set up a MediaRecorder instance to record the video and audio
      mediaRecorder = new MediaRecorder(stream);

      // Add the recorded data to the chunks array
      mediaRecorder.ondataavailable = async (e) => {
        chunks.push(e.data);
      }

      // Transcode the recorded video data after the MediaRecorder stops
      mediaRecorder.onstop = async () => {
        await transcode(new Uint8Array(await (new Blob(chunks)).arrayBuffer()));

        // Clear the chunks array after transcoding
        chunks = [];

        // Start the MediaRecorder again after a 0 millisecond delay
        setTimeout(() => {
          mediaRecorder.start();

          // Stop the MediaRecorder after 500 milliseconds
          setTimeout(() => {
            mediaRecorder.stop();
          }, 500);
        }, 0);
      }

      // Start the MediaRecorder
      mediaRecorder.start();

      // Stop the MediaRecorder after 700 milliseconds
      setTimeout(() => {
        mediaRecorder.stop();
      }, 700);
    })
  }

  useEffect(() => {
    // Set up event listeners for the socket connection
    socket.on('/', function(){
      // Log a message when the client is connected to the server
      console.log("Connected to server!");
    });

    socket.on('transcoded-video', function(data){
      // Log the received data for debugging purposes
      console.log("Received transcoded video data:", data);
    });

    socket.on('notice', function(data){
      // Emit a "notice" event back to the server to acknowledge the received data
      socket.emit("notice", "ping server!");
    });

    socket.on('bye', function(data){
      // Log the received data and disconnect from the server
      console.log("Server sent:", data);
      socket.disconnect();
    });

    socket.on('disconnect', function(){
      // Log a message when the client is disconnected from the server
      console.log("Disconnected from server!");
    });
  }, [])

  return (
    <div className="App">
      <div>
          <video muted={true}></video>
          <video autoPlay></video>
      </div>
      <button>start streaming</button>
    </div>
  );
}

export default App;


    What can I do to fix it? Thank you!
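
    Two client-side details may matter here, sketched below with hypothetical names: the socket.io client fires a standard 'connect' event (there is no '/' event on the client side), and `io(url)` takes an options object rather than a callback. Creating the socket once, at module scope or inside `useEffect`, also avoids re-creating it on every render. `registerHandlers` accepts any socket-like emitter, so it can be wired to the real client.

    ```javascript
    // Sketch: register handlers on the standard socket.io client events.
    // Works with any object exposing on(event, fn) and emit(event, data),
    // e.g. the socket from io('http://localhost:8000', { transports: ['websocket'] }).
    function registerHandlers(socket) {
      const handlers = {
        // 'connect' is the client-side connection event, not '/'
        connect: () => socket.emit('notice', 'Client has connected to the server!'),
        reply: (msg) => console.log('server replied:', msg),
        'transcoded-video': (data) => console.log('received transcoded video:', data),
        disconnect: () => console.log('Disconnected from server!'),
      };
      for (const [event, fn] of Object.entries(handlers)) {
        socket.on(event, fn);
      }
      return handlers;
    }
    ```

    In a React component, this would typically be called once from a `useEffect(() => { ... }, [])` body, with the matching `socket.off` calls in the cleanup function so handlers are not duplicated under StrictMode.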
