Advanced search

Media (0)

Word: - Tags -/performance

No media matching your criteria is available on this site.

Other articles (105)

  • Encoding and processing into web-friendly formats

    13 April 2011, by

    MediaSPIP automatically converts uploaded files to internet-compatible formats.
    Video files are encoded in Ogv and WebM (supported by HTML5) and MP4 (supported by Flash).
    Audio files are encoded in Ogg (supported by HTML5) and MP3 (supported by Flash).
    Where possible, text is analyzed to retrieve the data needed for search-engine indexing, and then exported as a series of image files.
    All uploaded files are stored online in their original format, so you can (...)

  • Customizable form

    21 June 2013, by

    This page presents the fields available in the media publication form and indicates the fields that can be added. Media creation form
    For a media-type document, the default fields are: Text; Enable/disable the forum (the invitation to comment can be disabled for each article); Licence; Add/remove authors; Tags
    This form can be modified under:
    Administration > Configuration des masques de formulaire. (...)

  • What is a form mask

    13 June 2013, by

    A form mask is a customization of the upload form for media, sections, news items, editorials and links to other sites.
    Each object's publication form can therefore be customized.
    To customize the form fields, go to your MediaSPIP administration area and select "Configuration des masques de formulaires".
    Then select the form to modify by clicking on its object type. (...)

On other sites (6282)

  • Livestream not reaching AWS endpoint

    13 August 2024, by NoobAmI

    I'm trying to stream my live video into Amazon IVS and I don't see it on the live channels.

    


    Is it possible I have a mistake in my FFmpeg configuration?
    I'm expecting to see the stream at my playback URL or on the console's playback screen, but I see nothing at the moment.

    


    As I understand it, shouldn't I see some kind of playback in the live channels if a stream is being sent to that channel?

    


    // requires: import { spawn } from 'child_process';
    async ivsStreamingService(payload: any): Promise<void> {
      // IVS ingest endpoint and stream key (values redacted)
      const ingestServer = '***.global-contribute.live-video.net:443/app/';
      const streamKey = 'sk_us-east-1_*****';

      // Spawn FFmpeg reading from stdin and pushing FLV over RTMPS to IVS
      const ffmpeg = spawn('ffmpeg', [
        '-re',
        '-i', '-',
        '-r', '30',
        '-c:v', 'libx264',
        '-pix_fmt', 'yuv420p',
        '-profile:v', 'main',
        '-preset', 'veryfast',
        '-x264opts', 'nal-hrd=cbr:no-scenecut',
        '-minrate', '3000k',
        '-maxrate', '3000k',
        '-g', '60',
        '-c:a', 'aac',
        '-b:a', '160k',
        '-ac', '2',
        '-ar', '44100',
        '-f', 'flv',
        `rtmps://${ingestServer}${streamKey}`
      ]);

      // Write the incoming payload to FFmpeg's stdin
      ffmpeg.stdin.write(payload, (err) => {
        console.log(payload);
        if (err) console.error('Error writing payload to FFmpeg stdin:', err);
      });

      ffmpeg.on('close', (code) => {
        console.log(`FFmpeg process exited with code ${code}`);
      });

      ffmpeg.stdin.on('error', (err) => {
        console.error('Error writing to FFmpeg stdin:', err);
      });

      ffmpeg.stderr.on('data', (data) => {
        console.error(`FFmpeg error: ${data}`);
      });
    }

    Example of the logged console data (screenshot omitted).

    I'm not quite sure why it wouldn't receive the stream, as it would appear everything is correct.
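
    One way to rule out the FFmpeg arguments themselves would be to push a known-good local file to the same ingest URL with the same settings, bypassing the stdin path entirely. A minimal sketch, assuming a hypothetical local file test.mp4 and the same (redacted) ingest server and stream key:

    // Hypothetical one-off check: stream a local file to the IVS ingest endpoint
    // with the same encoder settings, so the stdin/payload path is taken out of
    // the equation. The file name and credentials below are placeholders.
    import { spawn } from 'child_process';

    const ingestServer = '***.global-contribute.live-video.net:443/app/';
    const streamKey = 'sk_us-east-1_*****';

    const test = spawn('ffmpeg', [
      '-re',
      '-i', 'test.mp4',            // local file instead of stdin
      '-c:v', 'libx264',
      '-pix_fmt', 'yuv420p',
      '-preset', 'veryfast',
      '-g', '60',
      '-c:a', 'aac',
      '-b:a', '160k',
      '-ar', '44100',
      '-f', 'flv',
      `rtmps://${ingestServer}${streamKey}`
    ]);

    test.stderr.on('data', (d) => console.error(String(d)));    // FFmpeg logs go to stderr
    test.on('close', (code) => console.log(`ffmpeg exited with code ${code}`));

    If such a test stream shows up in the live channels, the ingest URL and encoder flags are presumably fine and the problem is more likely in how the payload is written to stdin; if it does not, the endpoint or stream key would be the first thing to check.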


  • ffmpeg remote url not working in lambda function

    9 September 2024, by Komal

    I have created a Lambda function to trim a video.

    I run this command:

    /opt/bin/ffmpeg -protocol_whitelist file,http,https,tcp,tls -i 'https://source-bucket.s3.amazonaws.com/1.mp4' -ss 5 -t 10 -loglevel 48 -y -avoid_negative_ts 1 -acodec copy /tmp/output.mp4

    It gives the following output:

    "ffmpeg version 7.0.2-static https://johnvansickle.com/ffmpeg/  Copyright (c) 2000-2024 the FFmpeg developers\n  built with gcc 8 (Debian 8.3.0-6)\n  configuration: --enable-gpl --enable-version3&#xA;--enable-static --disable-debug --disable-ffplay --disable-indev=sndio --disable-outdev=sndio --cc=gcc --enable-fontconfig --enable-frei0r --enable-gnutls --enable-gmp --enable-libgme --enable-gray --enable-libaom --enable-libfribidi --enable-libass --enable-libvmaf --enable-libfreetype --enable-libmp3lame --enable-libopencore-amrnb --enable-libopencore-amrwb --enable-libopenjpeg --enable-librubberband --enable-libsoxr --enable-libspeex --enable-libsrt --enable-libvorbis --enable-libopus --enable-libtheora --enable-libvidstab --enable-libvo-amrwbenc --enable-libvpx --enable-libwebp --enable-libx264 --enable-libx265 --enable-libxml2 --enable-libdav1d --enable-libxvid --enable-libzvbi --enable-libzimg\n  libavutil      59.  8.100 / 59.  8.100\n  libavcodec     61.  3.100 / 61.  3.100\n  libavformat    61.  1.100 / 61.  1.100\n  libavdevice    61.  1.100 /&#xA;61.  1.100\n  libavfilter    10.  1.100 / 10.  1.100\n  libswscale      8.  1.100 /  8.  1.100\n  libswresample   5.  1.100 /  5.  1.100\n  libpostproc    58.  1.100 / 58.  1.100\nSplitting the commandline.\nReading option &#x27;-protocol_whitelist&#x27; ... matched as AVOption &#x27;protocol_whitelist&#x27; with argument &#x27;file,http,https,tcp,tls&#x27;.\nReading option &#x27;-i&#x27; ... matched as input url with argument &#x27;https://source-bucket.s3.amazonaws.com/1.mp4&#x27;.\nReading option &#x27;-ss&#x27; ... matched as option &#x27;ss&#x27; (start transcoding at specified time) with argument &#x27;50&#x27;.\nReading option &#x27;-t&#x27; ... matched as option &#x27;t&#x27; (stop transcoding after specified duration) with argument &#x27;100&#x27;.\nReading option &#x27;-loglevel&#x27; ... matched as option &#x27;loglevel&#x27; (set logging level) with argument &#x27;48&#x27;.\nReading option &#x27;-y&#x27; ... matched as option &#x27;y&#x27; (overwrite output files) with argument &#x27;1&#x27;.\nReading option &#x27;-avoid_negative_ts&#x27; ... matched as AVOption &#x27;avoid_negative_ts&#x27; with argument &#x27;1&#x27;.\nReading option &#x27;-acodec&#x27; ... matched as option &#x27;acodec&#x27; (alias for -c:a (select encoder/decoder for audio streams)) with argument &#x27;copy&#x27;.\nReading option &#x27;/tmp/output.mp4&#x27; ... matched as output url.\nFinished splitting the commandline.\nParsing a group of options: global .\nApplying option loglevel (set logging level) with argument 48.\nApplying option y (overwrite output files) with argument&#xA;1.\nSuccessfully parsed a group of options.\nParsing a group of options: input url https://source-bucket.s3.amazonaws.com/1.mp4.\nSuccessfully parsed a group of options.\nOpening an input file: https://source-bucket.s3.amazonaws.com/1.mp4.\n[AVFormatContext @ 0x70ea0c0] Opening &#x27;https://source-bucket.s3.amazonaws.com/1.mp4&#x27; for reading\n"&#xA;

    &#xA;

    So it's not going beyond this: Opening 'https://source-bucket.s3.amazonaws.com/1.mp4' for reading

    And after this, it exits immediately. I need suggestions on where it is going wrong; the same command works fine on my local computer.

    I tried CloudFront and a signed URL as well, as mentioned in this post: How to read remote video on Amazon S3 using ffmpeg. But it doesn't work with a URL; it only works when the file is downloaded into the /tmp folder.
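
    For reference, the /tmp workaround described above could look roughly like the following in a Node.js handler; this is a sketch only, and the source URL, file paths and handler wiring are placeholders rather than values from the actual function:

    // Hypothetical sketch of the fallback: download the object to /tmp first,
    // then run the bundled ffmpeg on the local copy with the same trim options.
    import { get } from 'https';
    import { createWriteStream } from 'fs';
    import { spawnSync } from 'child_process';

    function download(url: string, dest: string): Promise<void> {
      return new Promise((resolve, reject) => {
        const file = createWriteStream(dest);
        get(url, (res) => {
          if (res.statusCode !== 200) {
            reject(new Error(`Download failed: HTTP ${res.statusCode}`));
            return;
          }
          res.pipe(file);
          file.on('finish', () => file.close((err) => (err ? reject(err) : resolve())));
        }).on('error', reject);
      });
    }

    export const handler = async (): Promise<void> => {
      const src = 'https://source-bucket.s3.amazonaws.com/1.mp4';   // signed URL in practice
      await download(src, '/tmp/1.mp4');

      // Same trim as the original command, but reading the local copy.
      const result = spawnSync('/opt/bin/ffmpeg', [
        '-i', '/tmp/1.mp4',
        '-ss', '5', '-t', '10',
        '-y', '-avoid_negative_ts', '1',
        '-acodec', 'copy',
        '/tmp/output.mp4',
      ]);
      console.log(result.stderr?.toString());
    };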


    Following is the final log, which shows there is no timeout; the function actually finishes execution within 2-3 seconds. I have the function timeout set to 15 minutes, which is the maximum:

    REPORT RequestId: 4acd7b38-017c-4dce-bb65-8f6fd3cf37e0  Duration: 1297.93 ms    Billed Duration: 1298 ms    Memory Size: 1024 MB    Max Memory Used: 103 MB    Init Duration: 328.76 ms

  • Record video stream in rust

    19 November 2024, by El_Loco

    I have bought a stereo camera with global shutter and a frame rate of at most 120 fps. https://www.amazon.com/dp/B0D8T3ZSL4?ref_=pe_386300_442618370_TE_sc_as_ri_0#


    My next step is to write a program that can show and record video at the desired fps and resolution.

    use opencv::{
        core, highgui,
        prelude::*,
        videoio::{self, VideoCapture},
        Result,
    };

    fn open_camera() -> Result<VideoCapture> {
        // open camera index 2 with any available backend
        let capture = videoio::VideoCapture::new(2, videoio::CAP_ANY)?;
        return Ok(capture);
    }

    fn main() -> Result<()> {
        let window = "video capture";
        highgui::named_window(window, highgui::WINDOW_AUTOSIZE)?;
        let mut cam = open_camera()?;
        let opened = videoio::VideoCapture::is_opened(&cam)?;
        if !opened {
            panic!("Unable to open default camera!");
        }
        let width = 3200.0;
        let height = 1200.0;
        cam.set(videoio::CAP_PROP_FRAME_WIDTH, width)?;
        cam.set(videoio::CAP_PROP_FRAME_HEIGHT, height)?;

        // Set the frame rate (FPS)
        let fps = 60.0;
        // note: fps is only passed to the writer below; it is never applied to the
        // capture device itself

        let fourcc = videoio::VideoWriter::fourcc('M', 'J', 'P', 'G')?;
        let mut writer = videoio::VideoWriter::new(
            "video_output.avi",
            fourcc,
            fps,
            core::Size::new(width as i32, height as i32),
            true,
        )?;

        if !writer.is_opened()? {
            println!("Error: Could not open the video writer.");
        }

        let mut frame = core::Mat::default();
        let mut ctr = 0;
        while cam.read(&mut frame)? {
            if frame.empty() {
                break;
            }
            writer.write(&frame)?;
            highgui::imshow(window, &frame)?;

            let key = highgui::wait_key(1)?;
            if key > 0 {
                break;
            }
            ctr += 1;
            if ctr == 600 {
                break;
            }
        }
        cam.release()?;
        writer.release()?;
        Ok(())
    }

    When I run this code the frame rate is terrible, maybe 1 fps. For debugging I tried running the camera in Cheese; there I got 30 fps at the full 3200x1200 resolution, but as far as I can see I cannot change the fps to 60.

    Then I tried to capture a video using ffmpeg:

    ffmpeg -f v4l2 -framerate 60 -video_size 3200x1200 -i /dev/video2 output.mp4


    With the following output:

    [video4linux2,v4l2 @ 0x5a72cbbd1400] The driver changed the time per frame from 1/60 to 1/2
    Input #0, video4linux2,v4l2, from '/dev/video2':
      Duration: N/A, start: 2744.250608, bitrate: 122880 kb/s
      Stream #0:0: Video: rawvideo (YUY2 / 0x32595559), yuyv422, 3200x1200, 122880 kb/s, 2 fps, 2 tbr, 1000k tbn
    File 'output.mp4' already exists. Overwrite? [y/N]

    The frame rate is lowered to 2 fps.


    Then I ran v4l2-ctl --list-formats-ext -d 2, with the following output:

    ioctl: VIDIOC_ENUM_FMT
            Type: Video Capture

            [0]: 'MJPG' (Motion-JPEG, compressed)
                    Size: Discrete 3200x1200
                            Interval: Discrete 0.017s (60.000 fps)
                            Interval: Discrete 0.033s (30.000 fps)
                            Interval: Discrete 0.040s (25.000 fps)
                            Interval: Discrete 0.050s (20.000 fps)
                            Interval: Discrete 0.067s (15.000 fps)
                            Interval: Discrete 0.100s (10.000 fps)
                    Size: Discrete 2560x720
                            Interval: Discrete 0.017s (60.000 fps)
                            Interval: Discrete 0.033s (30.000 fps)
                            Interval: Discrete 0.040s (25.000 fps)
                            Interval: Discrete 0.050s (20.000 fps)
                            Interval: Discrete 0.067s (15.000 fps)
                            Interval: Discrete 0.100s (10.000 fps)
                    Size: Discrete 1600x600
                            Interval: Discrete 0.008s (120.000 fps)
                            Interval: Discrete 0.017s (60.000 fps)
                            Interval: Discrete 0.033s (30.000 fps)
                            Interval: Discrete 0.040s (25.000 fps)
                            Interval: Discrete 0.050s (20.000 fps)
                            Interval: Discrete 0.067s (15.000 fps)

    I then tried to open the camera using qv4l2, and there it seemed to work. It does not seem like I can record a video with it, though.

    I am using Rust to learn. I want to be able to programmatically record a video somehow and then do computer vision. The easiest would be to do it in Rust, but other solutions are OK.

    Edit: I have found out some more this morning:

    v4l2-ctl -d 2 --list-formats-ext
    ioctl: VIDIOC_ENUM_FMT
        Type: Video Capture

        [0]: 'MJPG' (Motion-JPEG, compressed)
            Size: Discrete 3200x1200
                Interval: Discrete 0.017s (60.000 fps)
                Interval: Discrete 0.033s (30.000 fps)
                Interval: Discrete 0.040s (25.000 fps)
                Interval: Discrete 0.050s (20.000 fps)
                Interval: Discrete 0.067s (15.000 fps)
                Interval: Discrete 0.100s (10.000 fps)

        [1]: 'YUYV' (YUYV 4:2:2)
            Size: Discrete 3200x1200
                Interval: Discrete 0.500s (2.000 fps)
            Size: Discrete 2560x720
                Interval: Discrete 0.500s (2.000 fps)
            Size: Discrete 1600x600
                Interval: Discrete 0.100s (10.000 fps)

    I also found here that the order of flags is important for ffmpeg. Running this, I can actually record a video at 60 fps:

    ffmpeg -framerate 60 -f v4l2 -video_size 3200x1200 -input_format mjpeg  -i /dev/video2 output.avi


    A drawback is that the images do not look very sharp; you can clearly see the pixels. (I am new to video formats etc. as well; before, it has just worked.)

    If I change from avi to mkv it is slow again.

    In the link above I also saw a suggestion to first do:

    ffmpeg -framerate 60 -f v4l2 -video_size 3200x1200 -input_format mjpeg  -i /dev/video2 -c copy mjpeg.mkv


    and then:

    ffmpeg -i mjpeg.mkv -c:v libx264 -crf 23 -preset medium -pix_fmt yuv420p out.mkv


    which worked. But I am not sure those flags are ideal for the camera I have. I think it is a good start to make it run as expected using the command line and ffmpeg, so that I know what format to use and that it actually works as intended before doing it programmatically.
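
    Since solutions other than Rust are OK, one stopgap would be to drive the ffmpeg invocation that already works from a small script until the OpenCV capture settings are sorted out. A minimal Node/TypeScript sketch, assuming the same /dev/video2 device and MJPG settings found above (the recording duration and output name are placeholders):

    // Hypothetical sketch: spawn the ffmpeg command that gave 60 fps, record
    // for a fixed duration, then let it exit. Device, duration and output path
    // are assumptions for illustration.
    import { spawn } from 'child_process';

    const recordSeconds = 10;                       // placeholder duration
    const ffmpeg = spawn('ffmpeg', [
      '-framerate', '60',
      '-f', 'v4l2',
      '-video_size', '3200x1200',
      '-input_format', 'mjpeg',
      '-i', '/dev/video2',
      '-t', String(recordSeconds),                  // stop after the chosen duration
      'output.avi',
    ]);

    ffmpeg.stderr.on('data', (d) => process.stderr.write(d));      // ffmpeg logs to stderr
    ffmpeg.on('close', (code) => console.log(`ffmpeg exited with code ${code}`));

    The same approach would also work from the Rust program via std::process::Command if staying in Rust is preferred.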
