Advanced search

Media (0)

Keyword: - Tags -/optimisation

No media matching your criteria is available on the site.

Other articles (16)

  • APPENDIX: Plugins used specifically for the farm

    5 March 2010

    The central/master site of the farm needs several additional plugins, beyond those used by the channel sites, in order to work properly: the Gestion de la mutualisation plugin; the inscription3 plugin, to handle registrations and requests to create a mutualisation instance as soon as users sign up; the verifier plugin, which provides a field-validation API (used by inscription3); the champs extras v2 plugin, required by inscription3 (...)

  • Media quality after processing

    21 June 2013

    Configuring the software that processes media correctly matters for balancing the interests involved (the host's bandwidth, media quality for the editor and the visitor, accessibility for the visitor). How should you set the quality of your media?
    The higher the media quality, the more bandwidth is used, and a visitor on a low-speed Internet connection will have to wait longer. Conversely, the poorer the media quality, the more degraded the media becomes, or even (...)

  • Supporting all media types

    13 April 2011

    Unlike most software and media-sharing platforms, MediaSPIP aims to manage as many different media types as possible. The following are just a few examples from an ever-expanding list of supported formats: images: png, gif, jpg, bmp and more; audio: MP3, Ogg, Wav and more; video: AVI, MP4, OGV, mpg, mov, wmv and more; text, code and other data: OpenOffice, Microsoft Office (Word, PowerPoint, Excel), web (html, CSS), LaTeX, Google Earth and (...)

On other sites (4419)

  • Python running ffmpeg with Popen causes memory leak

    13 January 2020, by loretoparisi

     I’m using ffmpeg in Python 3 to run a command:

       process = subprocess.Popen(
           command,
           stdout=open(os.devnull, 'wb'),
           stdin=subprocess.PIPE,
           stderr=subprocess.PIPE)

       # Write data to STDIN.
       try:
           process.stdin.write(data.astype('<f4').tobytes())
       except IOError:
           raise IOError('FFMPEG error: %s' % process.stderr.read())

     where data represents a wave file, so it is written to stdin.
     This code has a memory leak: the memory is not released. In fact, if we trace memory usage:

       is_tracing = tracemalloc.is_tracing()
       if not is_tracing:
           nframe = 6
           tracemalloc.start(nframe)
       current_mem, peak_mem = tracemalloc.get_traced_memory()
       overhead = tracemalloc.get_tracemalloc_memory()
       summary = "traced memory: %d KiB  peak: %d KiB  overhead: %d KiB" % (
           int(current_mem // 1024), int(peak_mem // 1024), int(overhead // 1024)
       )
       print( "before save", summary )

       process = subprocess.Popen(
           command,
           stdout=open(os.devnull, 'wb'),
           stdin=subprocess.PIPE,
           stderr=subprocess.PIPE)

       try:
           process.stdin.write(data.astype('<f4').tobytes())
       except IOError:
           raise IOError('FFMPEG error: %s' % process.stderr.read())

       current_mem, peak_mem = tracemalloc.get_traced_memory()
       overhead = tracemalloc.get_tracemalloc_memory()
       summary = "traced memory: %d KiB  peak: %d KiB  overhead: %d KiB" % (
           int(current_mem // 1024), int(peak_mem // 1024), int(overhead // 1024)
       )
       print( "after save", summary )

     and running it several times gives

    after save traced memory: 18 KiB  peak: 1419 KiB  overhead: 10 KiB
    before save traced memory: 27459 KiB  peak: 28152 KiB  overhead: 28293 KiB
    after save traced memory: 27384 KiB  peak: 28872 KiB  overhead: 28267 KiB
    before save traced memory: 52707 KiB  peak: 53400 KiB  overhead: 53132 KiB
    after save traced memory: 52653 KiB  peak: 54120 KiB  overhead: 53109 KiB
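
     To dig further into where that memory is held, tracemalloc can also compare snapshots taken around the call; a minimal sketch (run_ffmpeg is a hypothetical wrapper around the Popen/write snippet above, not a function from the post):

       import tracemalloc

       tracemalloc.start(6)                      # keep 6 frames per traceback, as above
       before = tracemalloc.take_snapshot()
       run_ffmpeg(command, data)                 # hypothetical wrapper around the snippet above
       after = tracemalloc.take_snapshot()
       # Print the biggest growth first, with the file and line that allocated it.
       for stat in after.compare_to(before, 'traceback')[:5]:
           print(stat)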

     Specifically, the ffmpeg command executed was:

        command = (
           self._get_command_builder()
           .flag('-y')
           .opt('-loglevel', 'error')
           .opt('-f', 'f32le')
           .opt('-ar', sample_rate)
           .opt('-ac', data.shape[1])
           .opt('-i', '-')
           .flag('-vn')
           .opt('-acodec', codec)
           .opt('-ar', sample_rate)
           .opt('-strict', '-2')
           .opt('-ab', bitrate)
           .flag(path)
           .command())
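
     For reference, the builder above expands to an argument list roughly like the following; the sample rate, channel count, codec, bitrate and output path are illustrative values, not taken from the post:

        command = ['ffmpeg', '-y', '-loglevel', 'error',
                   '-f', 'f32le', '-ar', '44100', '-ac', '2', '-i', '-',
                   '-vn', '-acodec', 'aac', '-ar', '44100',
                   '-strict', '-2', '-ab', '128k', 'output.m4a']

     In other words, ffmpeg reads raw float32 PCM from stdin and encodes it to the output path.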

     Which variable is leaking memory? I have tried to clear data and to flush stdin, without success.
     I have also tried running it inside a threading.Thread, like:

    class MyClass(threading.Thread):
       def __init__(self, *args):
           self.stdout = None
           self.stderr = None
           self.args = args
           threading.Thread.__init__(self)

       def run(self):
           command = self.args[ 0 ]
           data = self.args[ 1 ]
           process = subprocess.Popen(
               command,
               stdout=open(os.devnull, 'wb'),
               stdin=subprocess.PIPE,
               stderr=subprocess.PIPE)
            try:
                process.stdin.write(data.astype('<f4').tobytes())
            except IOError:
                raise IOError('FFMPEG error: %s' % process.stderr.read())

    and then

    myclass = MyClass(command, data)
    myclass.start()
    myclass.join()
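
     For comparison, here is the same call with every handle released explicitly; this is only a sketch of one more thing to rule out (it assumes the same command and data as above and is not a confirmed fix), since the devnull file object and the unread stderr pipe otherwise stay open until garbage collection:

     import os
     import subprocess

     devnull = open(os.devnull, 'wb')
     try:
         process = subprocess.Popen(
             command,
             stdout=devnull,
             stdin=subprocess.PIPE,
             stderr=subprocess.PIPE)
         process.stdin.write(data.astype('<f4').tobytes())
         process.stdin.close()
         process.stderr.read()    # drain stderr so the pipe buffer can be freed
         process.stderr.close()
         process.wait()
     finally:
         devnull.close()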

  • Memory leak reading video frames to numpy array using ffmpeg as a python subprocess

    19 November 2023, by paddyg

     I can stream videos frame by frame to an OpenGL Texture2D in Python (pi3d module, example in pi3d_demos/VideoWalk.py), but I've noticed that it gradually leaks memory. Below is a stripped-down version of the code that shows the problem.

    


     Can anyone see where I'm leaking? The memory seems to be recovered when Python stops. I've tried explicitly setting things to None or calling the garbage collector manually.

    


#!/usr/bin/python
import os
import numpy as np
import subprocess
import threading
import time
import json

def get_dimensions(video_path):
    probe_cmd = f'ffprobe -v error -show_entries stream=width,height,avg_frame_rate -of json "{video_path}"'
    probe_result = subprocess.check_output(probe_cmd, shell=True, text=True)
    video_info_list = [vinfo for vinfo in json.loads(probe_result)['streams'] if 'width' in vinfo]
    if len(video_info_list) > 0:
        video_info = video_info_list[0] # use first if more than one!
        return(video_info['width'], video_info['height'])
    else:
        return None

class VideoStreamer:
    def __init__(self, video_path):
        self.flag = False # use to signal new texture
        self.kill_thread = False
        self.command = [ 'ffmpeg', '-i', video_path, '-f', 'image2pipe',
                        '-pix_fmt', 'rgb24', '-vcodec', 'rawvideo', '-']
        dimensions = get_dimensions(video_path)
        if dimensions is not None:
            (self.W, self.H) = dimensions
            self.P = 3
            self.image = np.zeros((self.H, self.W, self.P), dtype='uint8')
            self.t = threading.Thread(target=self.pipe_thread)
            self.t.start()
        else: # couldn't get dimensions for some reason - assume not able to read video
            self.W = 240
            self.H = 180
            self.P = 3
            self.image = np.zeros((self.H, self.W, self.P), dtype='uint8')
            self.t = None

    def pipe_thread(self):
        pipe = None
        while not self.kill_thread:
            st_tm = time.time()
            if pipe is None:
                pipe = subprocess.Popen(self.command, stdout=subprocess.PIPE, stderr=subprocess.PIPE, bufsize=-1)
            self.image = np.frombuffer(pipe.stdout.read(self.H * self.W * self.P), dtype='uint8') # overwrite array
            pipe.stdout.flush() # presumably nothing else has arrived since read()
            pipe.stderr.flush() # ffmpeg sends commentary to stderr
            if len(self.image) < self.H * self.W * self.P: # end of video, reload
                pipe.terminate()
                pipe = None
            else:
                self.image.shape = (self.H, self.W, self.P)
                self.flag = True
            step = time.time() - st_tm
            time.sleep(max(0.04 - step, 0.0)) # adding fps info to ffmpeg doesn't seem to have any effect
        if pipe is not None:
            pipe.terminate()
            pipe = None

    def kill(self):
        self.kill_thread = True
        if self.t is not None:
            self.t.join()

vs = None
try:
    while True:
        for (path, _, videos) in os.walk("/home/patrick/Pictures/videos"):
            for video in videos:
                print(video)
                os.system("free") # shows gradually declining memory available
                vs = VideoStreamer(os.path.join(path, video))
                for i in range(500):
                    tries = 0
                    while not vs.flag and tries < 5:
                        time.sleep(0.001)
                        tries += 1
                    # at this point vs.image is a numpy array HxWxP bytes
                    vs.flag = False
                vs.kill()
except KeyboardInterrupt:
    if vs is not None:
        vs.kill()


os.system("free")
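
     A variant of the read loop worth testing (a sketch under the assumption that the per-frame bytes object and the never-read stderr pipe are involved, not a confirmed fix): reuse one preallocated frame buffer with readinto() and send ffmpeg's commentary to DEVNULL:

# Sketch only: command, W, H and P are assumed to be the same values
# VideoStreamer computes above.
pipe = subprocess.Popen(command, stdout=subprocess.PIPE,
                        stderr=subprocess.DEVNULL, bufsize=-1)
frame = np.empty((H, W, P), dtype='uint8')   # allocated once, reused for every frame
while True:
    n = pipe.stdout.readinto(frame)          # fill the existing buffer in place
    if n < frame.nbytes:                     # short read -> end of video
        pipe.terminate()
        break
    # frame now holds one H x W x P image, ready to hand to the texture code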


    


  • FFMPEG not enough data (x < y), trying to decode anyway

    7 June 2016, by Forest J. Handford

     I’m trying to make videos of Direct3D games using a C# app. For non-Direct3D games I stream images from Graphics.CopyFromScreen, which works. When I copy the screen from Direct3D and stream it to FFMPEG I get:

    [bmp @ 00000276b0b9c280] not enough data (5070 < 129654), trying to
    decode anyway

    An MP4 file is created, but it is always 0 bytes.

    To get screenshots from Direct3D, I am using Justin Stenning’s Direct3DHook. This produces images MUCH bigger than when I get images from Graphics.CopyFromScreen (8 MB vs 136 KB). I’ve tried increasing the buffer (-bufsize) but the number on the left of the error is not impacted.
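
     As a sanity check on those two numbers: the right-hand figure matches the size declared in the BMP header for a full uncompressed frame, while the left-hand figure is how many bytes actually arrived. Using the bgra 1920x1080 stream reported in the log further down (a quick check, assuming that reading of the error):

     width, height, bytes_per_pixel = 1920, 1080, 4       # bgra, per the ffmpeg log below
     bmp_header = 54                                       # 14-byte file header + 40-byte DIB header
     print(width * height * bytes_per_pixel)               # 8294400, as in "13016 < 8294400"
     print(width * height * bytes_per_pixel + bmp_header)  # 8294454, as in "13070 < 8294454"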

     I’ve tried resizing the image to 1/6th of the original. That reduces the number on the right, but does not eliminate it. Even when the number on the right is close to what I get from Graphics.CopyFromScreen, I still get an error. Here is a sample of the current code:

    using System;
    using System.Diagnostics;
    using System.Threading;
    using System.Drawing;
    using Capture.Hook;
    using Capture.Interface;
    using Capture;
    using System.IO;

    namespace GameRecord
    {
       public class Video
       {
           private const int VID_FRAME_FPS = 8;
           private const int SIZE_MODIFIER = 6;
           private const double FRAMES_PER_MS = VID_FRAME_FPS * 0.001;
           private const int SLEEP_INTERVAL = 2;
           private const int CONSTANT_RATE_FACTOR = 18; // Lower crf = Higher Quality https://trac.ffmpeg.org/wiki/Encode/H.264
           private Image image;
           private Capture captureScreen;
           private int processId = 0;
           private Process process;
           private CaptureProcess captureProcess;
           private Process launchingFFMPEG;
           private string arg;
           private int frame = 0;
           private Size? resize = null;


           /// <summary>
           /// Generates the Videos by gathering frames and processing via FFMPEG.
           /// </summary>
           public void RecordScreenTillGameEnd(string exe, OutputDirectory outputDirectory, CustomMessageBox alertBox, Thread workerThread)
           {
               AttachProcess(exe);
               RequestD3DScreenShot();
               while (image == null) ;
               Logger.log.Info("Launching FFMPEG ....");
               resize = new Size(image.Width / SIZE_MODIFIER, image.Height / SIZE_MODIFIER);
               // H.264 can let us do 8 FPS in high res . . . but must be licensed for commercial use.
               arg = "-f image2pipe -framerate " + VID_FRAME_FPS + " -i pipe:.bmp -pix_fmt yuv420p -crf " +
                   CONSTANT_RATE_FACTOR + " -preset ultrafast -s " + resize.Value.Width + "x" +
                   resize.Value.Height + " -vcodec libx264 -bufsize 30000k -y \"" +
                   outputDirectory.pathToVideo + "\"";

               launchingFFMPEG = new Process
               {
                   StartInfo = new ProcessStartInfo
                   {
                       FileName = "ffmpeg",
                       Arguments = arg,
                       UseShellExecute = false,
                       CreateNoWindow = true,
                       RedirectStandardInput = true,
                       RedirectStandardError = true
                   }
               };
               launchingFFMPEG.Start();

               Stopwatch stopWatch = Stopwatch.StartNew(); //creates and start the instance of Stopwatch

               do
               {
                   Thread.Sleep(SLEEP_INTERVAL);
               } while (workerThread.IsAlive);

               Logger.log.Info("Total frames: " + frame + " Expected frames: " + (ExpectedFrames(stopWatch.ElapsedMilliseconds) - 1));

               launchingFFMPEG.StandardInput.Close();

    #if DEBUG
               string line;
               while ((line = launchingFFMPEG.StandardError.ReadLine()) != null)
               {
                   Logger.log.Debug(line);
               }
    #endif
               launchingFFMPEG.Close();
               alertBox.Show();
           }

           void RequestD3DScreenShot()
           {
               captureProcess.CaptureInterface.BeginGetScreenshot(new Rectangle(0, 0, 0, 0), new TimeSpan(0, 0, 2), Callback, resize, (ImageFormat)Enum.Parse(typeof(ImageFormat), "Bitmap"));
           }

           private void AttachProcess(string exe)
           {
               Thread.Sleep(300);
               Process[] processes = Process.GetProcessesByName(Path.GetFileNameWithoutExtension(exe));
               foreach (Process currProcess in processes)
               {
                   // Simply attach to the first one found.

                   // If the process doesn't have a mainwindowhandle yet, skip it (we need to be able to get the hwnd to set foreground etc)
                   if (currProcess.MainWindowHandle == IntPtr.Zero)
                   {
                       continue;
                   }

                   // Skip if the process is already hooked (and we want to hook multiple applications)
                   if (HookManager.IsHooked(currProcess.Id))
                   {
                       continue;
                   }

                   Direct3DVersion direct3DVersion = Direct3DVersion.AutoDetect;

                   CaptureConfig cc = new CaptureConfig()
                   {
                       Direct3DVersion = direct3DVersion,
                       ShowOverlay = false
                   };

                   processId = currProcess.Id;
                   process = currProcess;

                   var captureInterface = new CaptureInterface();
                   captureInterface.RemoteMessage += new MessageReceivedEvent(CaptureInterface_RemoteMessage);
                   captureProcess = new CaptureProcess(process, cc, captureInterface);

                   break;
               }
               Thread.Sleep(10);

               if (captureProcess == null)
               {
                   ShowUser.Exception("No executable found matching: '" + exe + "'");
               }
           }

           /// <summary>
           /// The callback for when the screenshot has been taken
           /// </summary>
           ///
           ///
           ///
           void Callback(IAsyncResult result)
           {
               using (Screenshot screenshot = captureProcess.CaptureInterface.EndGetScreenshot(result))
            if (screenshot != null && screenshot.Data != null && arg != null)
               {
                   if (image != null)
                   {
                       image.Dispose();
                   }

                   image = screenshot.ToBitmap();
                   // image.Save("D3DImageTest.bmp");
                   image.Save(launchingFFMPEG.StandardInput.BaseStream, System.Drawing.Imaging.ImageFormat.Bmp);
                   launchingFFMPEG.StandardInput.Flush();
                   frame++;
               }

                if (frame < 5)
               {
                   Thread t = new Thread(new ThreadStart(RequestD3DScreenShot));
                   t.Start();
               }
               else
               {
                   Logger.log.Info("Done getting shots from D3D.");
               }
           }

           /// <summary>
           /// Display messages from the target process
           /// </summary>
           ///
           private void CaptureInterface_RemoteMessage(MessageReceivedEventArgs message)
           {
               Logger.log.Info(message);
           }
       }
    }

    When I search the internet for the error all I get is the FFMPEG source code, which has not proven to be illuminating. I have been able to save the image directly to disk, which makes me feel like it is not an issue with disposing the data. I have also tried only grabbing one frame, but that produces the same error, which suggests to me it is not a threading issue.

     Here is the full stderr output:

    2016-06-02 18:29:38,046 === ffmpeg version N-79143-g8ff0f6a Copyright (c) 2000-2016 the FFmpeg developers

    2016-06-02 18:29:38,047 ===   built with gcc 5.3.0 (GCC)

    2016-06-02 18:29:38,048 ===   configuration: --enable-gpl
    --enable-version3 --disable-w32threads --enable-avisynth --enable-bzlib --enable-fontconfig --enable-frei0r --enable-gnutls --enable-iconv --enable-libass --enable-libbluray --enable-libbs2b --enable-libcaca --enable-libdcadec --enable-libfreetype --enable-libgme --enable-libgsm --enable-libilbc --enable-libmodplug --enable-libmfx --enable-libmp3lame --enable-libopencore-amrnb --enable-libopencore-amrwb --enable-libopenjpeg --enable-libopus --enable-librtmp --enable-libschroedinger --enable-libsnappy --enable-libsoxr --enable-libspeex --enable-libtheora --enable-libtwolame --enable-libvidstab --enable-libvo-amrwbenc --enable-libvorbis --enable-libvpx --enable-libwavpack --enable-libwebp --enable-libx264 --enable-libx265 --enable-libxavs --enable-libxvid --enable-libzimg --enable-lzma --enable-decklink --enable-zlib

    2016-06-02 18:29:38,062 ===   libavutil      55. 19.100 / 55. 19.100

    2016-06-02 18:29:38,063 ===   libavcodec     57. 30.100 / 57. 30.100

    2016-06-02 18:29:38,064 ===   libavformat    57. 29.101 / 57. 29.101

    2016-06-02 18:29:38,064 ===   libavdevice    57.  0.101 / 57.  0.101

    2016-06-02 18:29:38,065 ===   libavfilter     6. 40.102 /  6. 40.102

    2016-06-02 18:29:38,066 ===   libswscale      4.  0.100 /  4.  0.100

    2016-06-02 18:29:38,067 ===   libswresample   2.  0.101 /  2.  0.101

    2016-06-02 18:29:38,068 ===   libpostproc    54.  0.100 / 54.  0.100

     2016-06-02 18:29:38,068 === [bmp @ 000002cd7e5cc280] not enough data (13070 < 8294454), trying to decode anyway

     2016-06-02 18:29:38,069 === [bmp @ 000002cd7e5cc280] not enough data (13016 < 8294400)

    2016-06-02 18:29:38,069 === Input #0, image2pipe, from 'pipe:.bmp':

    2016-06-02 18:29:38,262 ===   Duration: N/A, bitrate: N/A

    2016-06-02 18:29:38,262 ===     Stream #0:0: Video: bmp, bgra, 1920x1080, 8 tbr, 8 tbn, 8 tbc

    2016-06-02 18:29:38,263 === [libx264 @ 000002cd7e5d59a0] VBV bufsize set but maxrate unspecified, ignored

    2016-06-02 18:29:38,264 === [libx264 @ 000002cd7e5d59a0] using cpu capabilities: MMX2 SSE2Fast SSSE3 SSE4.2 AVX FMA3 AVX2 LZCNT BMI2

    2016-06-02 18:29:38,265 === [libx264 @ 000002cd7e5d59a0] profile Constrained Baseline, level 1.1

    2016-06-02 18:29:38,266 === [libx264 @ 000002cd7e5d59a0] 264 - core 148 r2665 a01e339 - H.264/MPEG-4 AVC codec - Copyleft 2003-2016 - http://www.videolan.org/x264.html - options: cabac=0 ref=1 deblock=0:0:0 analyse=0:0 me=dia subme=0 psy=1 psy_rd=1.00:0.00 mixed_ref=0 me_range=16 chroma_me=1 trellis=0 8x8dct=0 cqm=0 deadzone=21,11 fast_pskip=1 chroma_qp_offset=0 threads=6 lookahead_threads=1 sliced_threads=0 nr=0 decimate=1 interlaced=0 bluray_compat=0 constrained_intra=0 bframes=0 weightp=0 keyint=250 keyint_min=8 scenecut=0 intra_refresh=0 rc=crf mbtree=0 crf=18.0 qcomp=0.60 qpmin=0 qpmax=69 qpstep=4 ip_ratio=1.40 aq=0

    2016-06-02 18:29:38,463 === Output #0, mp4, to 'C:\Users\fores\AppData\Roaming\Affectiva\n_Artifacts_20160602_182857\GameplayVidOut.mp4':

    2016-06-02 18:29:38,464 ===   Metadata:

    2016-06-02 18:29:38,465 ===     encoder         : Lavf57.29.101

    2016-06-02 18:29:38,469 ===     Stream #0:0: Video: h264 (libx264) ([33][0][0][0] / 0x0021), yuv420p, 320x180, q=-1--1, 8 fps, 16384 tbn, 8 tbc

    2016-06-02 18:29:38,470 ===     Metadata:

    2016-06-02 18:29:38,472 ===       encoder         : Lavc57.30.100 libx264

    2016-06-02 18:29:38,474 ===     Side data:

    2016-06-02 18:29:38,475 ===       cpb: bitrate max/min/avg: 0/0/0 buffer size: 30000000 vbv_delay: -1

    2016-06-02 18:29:38,476 === Stream mapping:

    2016-06-02 18:29:38,477 ===   Stream #0:0 -> #0:0 (bmp (native) -> h264 (libx264))

     2016-06-02 18:29:38,480 === [bmp @ 000002cd7e5cc9a0] not enough data (13070 < 8294454), trying to decode anyway

     2016-06-02 18:29:38,662 === [bmp @ 000002cd7e5cc9a0] not enough data (13016 < 8294400)

    2016-06-02 18:29:38,662 === Error while decoding stream #0:0: Invalid data found when processing input

    2016-06-02 18:29:38,663 === frame=    0 fps=0.0 q=0.0 Lsize=       0kB time=00:00:00.00 bitrate=N/A speed=   0x    

    2016-06-02 18:29:38,663 === video:0kB audio:0kB subtitle:0kB other streams:0kB global headers:0kB muxing overhead: unknown

    2016-06-02 18:29:38,664 === Conversion failed!

     In memory, the current image is 320 pixels wide and 180 pixels tall. The pixel format is Format32bppRgb. The horizontal and vertical resolutions seem odd; they are both 96.01199. When saved to disk, here is the ffprobe output for the file:

    ffprobe version N-79143-g8ff0f6a Copyright (c) 2007-2016 the FFmpeg developers
     built with gcc 5.3.0 (GCC)
     configuration: --enable-gpl --enable-version3 --disable-w32threads --enable-avisynth --enable-bzlib --enable-fontconfig --enable-frei0r --enable-gnutls --enable-iconv --enable-libass --enable-libbluray --enable-libbs2b --enable-libcaca --enable-libdcadec --enable-libfreetype --enable-libgme --enable-libgsm --enable-libilbc --enable-libmodplug --enable-libmfx --enable-libmp3lame --enable-libopencore-amrnb --enable-libopencore-amrwb --enable-libopenjpeg --enable-libopus --enable-librtmp --enable-libschroedinger --enable-libsnappy --enable-libsoxr --enable-libspeex --enable-libtheora --enable-libtwolame --enable-libvidstab --enable-libvo-amrwbenc --enable-libvorbis --enable-libvpx --enable-libwavpack --enable-libwebp --enable-libx264 --enable-libx265 --enable-libxavs --enable-libxvid --enable-libzimg --enable-lzma --enable-decklink --enable-zlib
     libavutil      55. 19.100 / 55. 19.100
     libavcodec     57. 30.100 / 57. 30.100
     libavformat    57. 29.101 / 57. 29.101
     libavdevice    57.  0.101 / 57.  0.101
     libavfilter     6. 40.102 /  6. 40.102
     libswscale      4.  0.100 /  4.  0.100
     libswresample   2.  0.101 /  2.  0.101
     libpostproc    54.  0.100 / 54.  0.100
    Input #0, png_pipe, from 'C:\Users\fores\git\game-playtest-tool\GamePlayTest\bin\x64\Debug\D3DFromCapture.bmp':
     Duration: N/A, bitrate: N/A
       Stream #0:0: Video: png, rgba(pc), 1920x1080 [SAR 3779:3779 DAR 16:9], 25 tbr, 25 tbn, 25 tbc

     Here is a PNG version of an example screenshot from the current code (playing Portal 2):
    Portal 2 Screenshot

     Any ideas would be greatly appreciated. My current workaround is to save the files to the HDD and compile the video after gameplay, but it's a far less performant option. Thank you!