
Media (91)

Other articles (23)

  • Websites made with MediaSPIP

    2 May 2011

    This page lists some websites based on MediaSPIP.

  • Authorisations overridden by plugins

    27 April 2010

    MediaSPIP core
    autoriser_auteur_modifier() so that visitors can modify their information on the authors page

  • Creating farms of unique websites

    13 April 2011

    MediaSPIP platforms can be installed as a farm, with a single "core" hosted on a dedicated server and used by multiple websites.
    This allows (among other things): implementation costs to be shared between several different projects / individuals; rapid deployment of multiple unique sites; creation of groups of like-minded sites, making it possible to browse media in a more controlled and selective environment than the major "open" (...)

On other sites (7467)

  • AWS Lambda execution time for FFMPEG transcoding

    4 January 2023, by FlamingMoe

    I'm using AWS Lambda to convert files from WEBM to MP4.

    I'm using the static ffmpeg 4.3.1 build from https://johnvansickle.com/ffmpeg/ (I have also run the tests below with the ffmpeg from the serverless AWS ffmpeg layer, which ships 4.1.3, but the results are even worse, about 25% slower).

    I'm using Node 10.x as the container.


    WEBM size   Time to convert   Lambda memory (MB)   Memory used (MB, as shown in log)

    80 MB           ~44 s             3008                 410
    40 MB           ~44 s             3008                 375
    80 MB           ~70 s             1024                 321
    40 MB           ~70 s             1024                 279


    


    All the videos are 80 s long. So as far as I can see, the size of the WEBM does not matter: if the length of the video is the same, the conversion takes the same time. So ffmpeg takes more time if the video is longer, not if the file is bigger ... curious ;-)
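
    That matches how video encoding works: the cost scales with the number of frames (i.e. the duration), not with the input byte size. A minimal sketch to double-check that both inputs report the same ~80 s; the /opt/bin and /tmp paths are assumptions (the johnvansickle.com static builds ship ffprobe alongside ffmpeg):

// Hedged sketch: both WEBM inputs should report ~80 s, which is what drives
// the encode time. Adjust the paths to wherever the static build is unpacked.
const { execFileSync } = require("child_process");

function probeDurationSeconds(inputPath) {
  const out = execFileSync("/opt/bin/ffprobe", [
    "-v", "error",
    "-show_entries", "format=duration",
    "-of", "default=noprint_wrappers=1:nokey=1",
    inputPath,
  ]);
  return parseFloat(out.toString());
}

console.log(probeDurationSeconds("/tmp/input-80mb.webm")); // hypothetical file names
console.log(probeDurationSeconds("/tmp/input-40mb.webm"));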

    


    But on the other hand, I'm confused about Lambda memory. I know memory and CPU come together in Lambda ... the more memory you choose, the more CPU is assigned.

    


    But...

    1. Why does ffmpeg only use about 300/400 MB if it has more available?
    2. How can I tell ffmpeg to use more memory?
    3. Is there any option to accelerate the process in Lambda?


    


    Btw, in all tests the ffmpeg invocations are the same, and:

    cpu-used parameter)

    • I added cpu-used=100 to the ffmpeg parameters, and it makes no difference at all if I set cpu-used=5 instead ... the times are the same, so I guess that parameter is useless here (I don't know why).

    threads parameter)

    • I also ran some tests with the "threads" parameter, but it made no difference either.
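
    For what it's worth, -cpu-used is an encoder-private option of libvpx/libaom (VP8/VP9/AV1); if the MP4 output is encoded with libx264 it is silently ignored, which would explain why changing it makes no difference. A minimal sketch of how the flags might be passed when spawning the static binary from Node; the paths, file names and codec choice are assumptions, not the exact command used in the tests above:

// Hedged sketch (assumed paths/filenames): spawn the static ffmpeg with an
// explicit thread setting and use -preset (x264's speed knob) instead of
// -cpu-used, which only applies to the libvpx/libaom encoders.
const { execFile } = require("child_process");

execFile(
  "/opt/bin/ffmpeg",           // assumed location of the static binary
  [
    "-y",
    "-i", "/tmp/input.webm",   // hypothetical input
    "-c:v", "libx264",
    "-preset", "veryfast",     // speed/quality trade-off for the H.264 encode
    "-threads", "0",           // let ffmpeg size its thread pool to the available vCPUs
    "-c:a", "aac",
    "/tmp/output.mp4",         // hypothetical output
  ],
  { maxBuffer: 10 * 1024 * 1024 },
  (err) => {
    if (err) throw err;
    console.log("transcode finished");
  }
);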


    


    I know it's not a good comparison, but the same files take about 5 seconds to convert on a simple dedicated server (8 vCores and 8 GB RAM, an OVH CentOS VPS).

    


    Btw, Amazon Elastic Transcoder is not an option:
a) it's far more expensive
b) it only offers its own presets, and my ffmpeg commands are very complex (watermarks, effects, etc ...)

    


  • AWS Lambda in Node JS with FFMPEG Lambda Layer

    29 March 2023, by mwcwge23

    I'm trying to make a Lambda that takes a video and puts a watermark image on it.
I'm using Lambda with NodeJS and the FFMPEG Lambda Layer I took from here:
https://serverlessrepo.aws.amazon.com/applications/us-east-1/145266761615/ffmpeg-lambda-layer
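
    One assumption the code below relies on is that the layer exposes the binary at /opt/bin/ffmpeg. A minimal sanity-check sketch for that assumption (not a diagnosis of the two errors):

// Minimal sketch: verify the ffmpeg layer actually provides the binary at the
// path the handler spawns.
const fs = require("fs");

const FFMPEG_PATH = "/opt/bin/ffmpeg";
if (!fs.existsSync(FFMPEG_PATH)) {
  console.error(`ffmpeg not found at ${FFMPEG_PATH}; is the layer attached to this function?`);
}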

    


    I got these two errors and I don't have a clue what I did wrong:
errors

    


    Please help me :)

    


    (by the way, if you have an easier solution to put a watermark image on a video, that'll also be great)

    


    That's my code (trying to put a watermark image on a video file):

    


    const express = require("express");
const childProcess = require("child_process");
const path = require("path");
const fs = require("fs");
const util = require("util");
const os = require("os");
const { fileURLToPath } = require("url");
const { v4: uuidv4 } = require("uuid");
const bodyParser = require("body-parser");
const awsServerlessExpressMiddleware = require("aws-serverless-express/middleware");
const AWS = require("aws-sdk");
const workdir = os.tmpdir();

const s3 = new AWS.S3();

// declare a new express app
const app = express();
app.use(bodyParser.json());
app.use(awsServerlessExpressMiddleware.eventContext());

// Enable CORS for all methods
app.use(function (req, res, next) {
  res.header("Access-Control-Allow-Origin", "*");
  res.header("Access-Control-Allow-Headers", "*");
  next();
});

const downloadFileFromS3 = function (bucket, fileKey, filePath) {
  "use strict";
  console.log("downloading", bucket, fileKey, filePath);
  return new Promise(function (resolve, reject) {
    const file = fs.createWriteStream(filePath),
      stream = s3
        .getObject({
          Bucket: bucket,
          Key: fileKey,
        })
        .createReadStream();
    stream.on("error", reject);
    file.on("error", reject);
    file.on("finish", function () {
      console.log("downloaded", bucket, fileKey);
      resolve(filePath);
    });
    stream.pipe(file);
  });
};

const uploadFileToS3 = function (bucket, fileKey, filePath, contentType) {
  "use strict";
  console.log("uploading", bucket, fileKey, filePath);
  return s3
    .upload({
      Bucket: bucket,
      Key: fileKey,
      Body: fs.createReadStream(filePath),
      ACL: "private",
      ContentType: contentType,
    })
    .promise();
};

const spawnPromise = function (command, argsarray, envOptions) {
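  // Wrap child_process.spawn in a Promise: collect stdout, log stderr as it
  // arrives, and resolve or reject based on the exit code/signal.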
  return new Promise((resolve, reject) => {
    console.log("executing", command, argsarray.join(" "));
    const childProc = childProcess.spawn(
        command,
        argsarray,
        envOptions || { env: process.env, cwd: process.cwd() }
      ),
      resultBuffers = [];
    childProc.stdout.on("data", (buffer) => {
      console.log(buffer.toString());
      resultBuffers.push(buffer);
    });
    childProc.stderr.on("data", (buffer) => console.error(buffer.toString()));
    childProc.on("exit", (code, signal) => {
      console.log(`${command} completed with ${code}:${signal}`);
      if (code || signal) {
        reject(`${command} failed with ${code || signal}`);
      } else {
        resolve(Buffer.concat(resultBuffers).toString().trim());
      }
    });
  });
};

app.post("/api/addWatermark", async (req, res) => {
  try {
    const bucketName = "bucketName ";
    const uniqeName = uuidv4() + Date.now();
    const outputPath = path.join(workdir, uniqeName + ".mp4");
    const key = "file_example_MP4_480_1_5MG.mp4";
    const localFilePath = path.join(workdir, key);
    const watermarkPngKey = "watermark.png";
    const watermarkLocalFilePath = path.join(workdir, watermarkPngKey);

    downloadFileFromS3(bucketName, key, localFilePath)
      .then(() => {
        downloadFileFromS3(bucketName, watermarkPngKey, watermarkLocalFilePath)
          .then(() => {
            fs.readFile(localFilePath, (err, data) => {
              if (!err && data) {
                console.log("successsss111");
              }
            });
            fs.readFile(watermarkLocalFilePath, (err, data) => {
              if (!err && data) {
                console.log("successsss222");
              }
            });

            fs.readFile(outputPath, (err, data) => {
              if (!err && data) {
                console.log("successsss3333");
              }
            });

            spawnPromise(
              "/opt/bin/ffmpeg",
              [
                "-i",
                localFilePath,
                "-i",
                watermarkLocalFilePath,
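                // Make the watermark PNG 50% transparent, then overlay it
                // 5px from the bottom-left corner of the video.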
                "-filter_complex",
                `[1]format=rgba,colorchannelmixer=aa=0.5[logo];[0][logo]overlay=5:H-h-5:format=auto,format=yuv420p`,
                "-c:a",
                "copy",
                outputPath,
              ],
              { env: process.env, cwd: workdir }
            )
              .then(() => {
                uploadFileToS3(
                  bucketName,
                  uniqeName + ".mp4",
                  outputPath,
                  "mp4"
                );
              });
          });
      });
  } catch (err) {
    console.log({ err });
    res.json({ err });
  }
});

app.listen(8136, function () {
  console.log("App started");
});

module.exports = app;



    


  • ffmpeg transcode to live stream

    14 September 2016, by brayancastrop

    I need to display an IP camera stream in an HTML video tag. I have figured out how to transcode the RTSP stream to a file, like this:

    ffmpeg -i "rtsp://user:password@ip" -s 640x480 /tmp/output.mp4

    Now I need to be able to live-stream the RTSP input in a video tag like this:

    <video src="http://domain:port/output.mp4" autoplay="autoplay"></video>

    I was trying to do something like this on my server (an Ubuntu micro instance on Amazon) in order to play the video in the video tag, but it didn't work:

    ffmpeg -i "rtsp://user:password@ip" -s 640x480 http://localhost:8080/stream.mp4

    Instead I got this log:

    [tcp @ 0x747b40] Connection to tcp://localhost:8080 failed: Connection refused
    http://localhost:8080/stream.mp4: Connection refused

    I don't really understand what's happening; I'm not sure whether it's sending the output to that URL or serving the output there. I've been checking the ffmpeg man docs but didn't find any example related to this use case, and I also tried other questions like this one, FFmpeg Stream Transcoding, which is similar to my last attempt, without success.
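
    For context on the "Connection refused": in that command ffmpeg acts as an HTTP client pushing to http://localhost:8080, so something has to be listening on that port already. A minimal sketch of one common way to wire this up instead, a small Node server that spawns ffmpeg per request and pipes fragmented MP4 to the response (the camera URL is the placeholder from above; everything else is an untested assumption):

// Hedged sketch: serve the RTSP camera as fragmented MP4 over HTTP so a
// <video> tag can play it. Spawns one ffmpeg process per connected client.
const http = require("http");
const { spawn } = require("child_process");

http.createServer((req, res) => {
  res.writeHead(200, { "Content-Type": "video/mp4" });
  const ffmpeg = spawn("ffmpeg", [
    "-i", "rtsp://user:password@ip",          // placeholder camera URL from above
    "-s", "640x480",
    "-f", "mp4",
    "-movflags", "frag_keyframe+empty_moov",  // fragmented MP4: playable while still being written
    "pipe:1",                                 // write to stdout instead of a file
  ]);
  ffmpeg.stdout.pipe(res);                       // stream the muxed output to the browser
  req.on("close", () => ffmpeg.kill("SIGKILL")); // stop transcoding when the client disconnects
}).listen(8080);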

    Btw, this is the camera I'm using: DS-2CD2020F-I(W) - http://www.hikvision.com/en/Products_accessries_157_i5847.html
They offer an HTTP preview, but it's just an img tag whose source updates, and it appears to be unstable.

    This is my first time trying to do something like this, so any insight into how to achieve it will be really useful and appreciated.