Media (91)

Other articles (44)

  • Emballe Médias: Putting documents online simply

    29 October 2010

    The emballe médias plugin was developed mainly for the MediaSPIP distribution, but it is also used in other related projects, such as géodiversité. Required and compatible plugins
    For this plugin to work, the following plugins must be installed: CFG, Saisies, SPIP Bonux, Diogène, swfupload, jqueryui
    Other plugins can be used alongside it to extend its capabilities: Ancres douces, Légendes, photo_infos, spipmotion (...)

  • The farm's regular Cron tasks

    1 December 2010

    Managing the farm relies on running several repetitive tasks, known as Cron tasks, at regular intervals.
    The super Cron (gestion_mutu_super_cron)
    This task, scheduled every minute, simply calls the Cron of every instance in the shared-hosting farm on a regular basis. Combined with a system Cron on the central site of the farm, this is enough to generate regular visits to the various sites and to keep the tasks of rarely visited sites from being too (...)

  • Configuring language support

    15 November 2010

    Accessing the configuration and adding supported languages
    To configure support for new languages, you need to go to the "Administer" section of the site.
    From there, the navigation menu gives access to a "Language management" section where support for new languages can be enabled.
    Each newly added language can still be deactivated as long as no object has been created in that language. Once one has, the language is greyed out in the configuration and (...)

On other sites (6027)

  • Faster way of getting number of key frames than "show_frames" in ffprobe ?

    19 November 2016, by Will Tower

    I'm making a little in-house utility using ffmpeg and ffprobe. It works fine and does what is needed: give a count of the number of key frames in a video file, plus some other details.

    Alas, with the large video files this will be used on, it can take many seconds for show_frames to return – and I then have to parse the JSON dump of frame data and keep a running count of the total key frames.

    Is there a faster way? Perhaps it is listed in the "stream" or "format" data dumps and I am just not recognizing what it is called? I've been through the ffmpeg and ffprobe docs and didn't find anything else.
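
    One commonly suggested alternative (not from the original post, and worth timing on your own files) is to let ffprobe itself count only key frames instead of dumping every frame as JSON. Something along these lines should work:

        ffprobe -v error -select_streams v:0 -skip_frame nokey -count_frames -show_entries stream=nb_read_frames -of default=nokey=1:noprint_wrappers=1 input.mp4

    Here -skip_frame nokey makes the decoder drop everything but key frames and -count_frames exposes the result as nb_read_frames, so a single number comes back rather than a full frame dump.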

  • FFMPEG not enough data (x < y), trying to decode anyway

    7 June 2016, by Forest J. Handford

    I'm trying to make videos of Direct3D games using a C# app. For non-Direct3D games I stream images from Graphics.CopyFromScreen, which works. When I copy the screen from Direct3D and stream it to FFMPEG I get:

    [bmp @ 00000276b0b9c280] not enough data (5070 < 129654), trying to
    decode anyway

    An MP4 file is created, but it is always 0 bytes.

    To get screenshots from Direct3D, I am using Justin Stenning’s Direct3DHook. This produces images MUCH bigger than when I get images from Graphics.CopyFromScreen (8 MB vs 136 KB). I’ve tried increasing the buffer (-bufsize) but the number on the left of the error is not impacted.

    I've tried resizing the image to 1/6th the original. That reduces the number on the right, but does not eliminate the error. Even when the number on the right is close to what I get from Graphics.CopyFromScreen I still get an error. Here is a sample of the current code:

    using System;
    using System.Diagnostics;
    using System.Threading;
    using System.Drawing;
    using Capture.Hook;
    using Capture.Interface;
    using Capture;
    using System.IO;

    namespace GameRecord
    {
       public class Video
       {
           private const int VID_FRAME_FPS = 8;
           private const int SIZE_MODIFIER = 6;
           private const double FRAMES_PER_MS = VID_FRAME_FPS * 0.001;
           private const int SLEEP_INTERVAL = 2;
           private const int CONSTANT_RATE_FACTOR = 18; // Lower crf = Higher Quality https://trac.ffmpeg.org/wiki/Encode/H.264
           private Image image;
           private Capture captureScreen;
           private int processId = 0;
           private Process process;
           private CaptureProcess captureProcess;
           private Process launchingFFMPEG;
           private string arg;
           private int frame = 0;
           private Size? resize = null;


           /// <summary>
           /// Generates the Videos by gathering frames and processing via FFMPEG.
           /// </summary>
           public void RecordScreenTillGameEnd(string exe, OutputDirectory outputDirectory, CustomMessageBox alertBox, Thread workerThread)
           {
               AttachProcess(exe);
               RequestD3DScreenShot();
               while (image == null) ; // spin until the first D3D screenshot callback delivers a frame
               Logger.log.Info("Launching FFMPEG ....");
               resize = new Size(image.Width / SIZE_MODIFIER, image.Height / SIZE_MODIFIER);
               // H.264 can let us do 8 FPS in high res . . . but must be licensed for commercial use.
               arg = "-f image2pipe -framerate " + VID_FRAME_FPS + " -i pipe:.bmp -pix_fmt yuv420p -crf " +
                   CONSTANT_RATE_FACTOR + " -preset ultrafast -s " + resize.Value.Width + "x" +
                   resize.Value.Height + " -vcodec libx264 -bufsize 30000k -y \"" +
                   outputDirectory.pathToVideo + "\"";

               launchingFFMPEG = new Process
               {
                   StartInfo = new ProcessStartInfo
                   {
                       FileName = "ffmpeg",
                       Arguments = arg,
                       UseShellExecute = false,
                       CreateNoWindow = true,
                       RedirectStandardInput = true,
                       RedirectStandardError = true
                   }
               };
               launchingFFMPEG.Start();

               Stopwatch stopWatch = Stopwatch.StartNew(); // creates and starts the Stopwatch instance

               do
               {
                   Thread.Sleep(SLEEP_INTERVAL);
               } while (workerThread.IsAlive);

               Logger.log.Info("Total frames: " + frame + " Expected frames: " + (ExpectedFrames(stopWatch.ElapsedMilliseconds) - 1));

               launchingFFMPEG.StandardInput.Close();

    #if DEBUG
               string line;
               while ((line = launchingFFMPEG.StandardError.ReadLine()) != null)
               {
                   Logger.log.Debug(line);
               }
    #endif
               launchingFFMPEG.Close();
               alertBox.Show();
           }

           void RequestD3DScreenShot()
           {
               captureProcess.CaptureInterface.BeginGetScreenshot(new Rectangle(0, 0, 0, 0), new TimeSpan(0, 0, 2), Callback, resize, (ImageFormat)Enum.Parse(typeof(ImageFormat), "Bitmap"));
           }

           private void AttachProcess(string exe)
           {
               Thread.Sleep(300);
               Process[] processes = Process.GetProcessesByName(Path.GetFileNameWithoutExtension(exe));
               foreach (Process currProcess in processes)
               {
                   // Simply attach to the first one found.

                   // If the process doesn't have a mainwindowhandle yet, skip it (we need to be able to get the hwnd to set foreground etc)
                   if (currProcess.MainWindowHandle == IntPtr.Zero)
                   {
                       continue;
                   }

                   // Skip if the process is already hooked (and we want to hook multiple applications)
                   if (HookManager.IsHooked(currProcess.Id))
                   {
                       continue;
                   }

                   Direct3DVersion direct3DVersion = Direct3DVersion.AutoDetect;

                   CaptureConfig cc = new CaptureConfig()
                   {
                       Direct3DVersion = direct3DVersion,
                       ShowOverlay = false
                   };

                   processId = currProcess.Id;
                   process = currProcess;

                   var captureInterface = new CaptureInterface();
                   captureInterface.RemoteMessage += new MessageReceivedEvent(CaptureInterface_RemoteMessage);
                   captureProcess = new CaptureProcess(process, cc, captureInterface);

                   break;
               }
               Thread.Sleep(10);

               if (captureProcess == null)
               {
                   ShowUser.Exception("No executable found matching: '" + exe + "'");
               }
           }

           /// <summary>
           /// The callback for when the screenshot has been taken
           /// </summary>
           void Callback(IAsyncResult result)
           {
               using (Screenshot screenshot = captureProcess.CaptureInterface.EndGetScreenshot(result))
            if (screenshot != null && screenshot.Data != null && arg != null)
               {
                   if (image != null)
                   {
                       image.Dispose();
                   }

                   image = screenshot.ToBitmap();
                   // image.Save("D3DImageTest.bmp");
                   image.Save(launchingFFMPEG.StandardInput.BaseStream, System.Drawing.Imaging.ImageFormat.Bmp);
                   launchingFFMPEG.StandardInput.Flush();
                   frame++;
               }

               if (frame < 5)
               {
                   Thread t = new Thread(new ThreadStart(RequestD3DScreenShot));
                   t.Start();
               }
               else
               {
                   Logger.log.Info("Done getting shots from D3D.");
               }
           }

           /// <summary>
           /// Display messages from the target process
           /// </summary>
           private void CaptureInterface_RemoteMessage(MessageReceivedEventArgs message)
           {
               Logger.log.Info(message);
           }
       }
    }

    When I search the internet for the error all I get is the FFMPEG source code, which has not proven to be illuminating. I have been able to save the image directly to disk, which makes me feel like it is not an issue with disposing the data. I have also tried only grabbing one frame, but that produces the same error, which suggests to me it is not a threading issue.

    Here is the full sample of stderr:

    2016-06-02 18:29:38,046 === ffmpeg version N-79143-g8ff0f6a Copyright (c) 2000-2016 the FFmpeg developers

    2016-06-02 18:29:38,047 ===   built with gcc 5.3.0 (GCC)

    2016-06-02 18:29:38,048 ===   configuration: --enable-gpl
    --enable-version3 --disable-w32threads --enable-avisynth --enable-bzlib --enable-fontconfig --enable-frei0r --enable-gnutls --enable-iconv --enable-libass --enable-libbluray --enable-libbs2b --enable-libcaca --enable-libdcadec --enable-libfreetype --enable-libgme --enable-libgsm --enable-libilbc --enable-libmodplug --enable-libmfx --enable-libmp3lame --enable-libopencore-amrnb --enable-libopencore-amrwb --enable-libopenjpeg --enable-libopus --enable-librtmp --enable-libschroedinger --enable-libsnappy --enable-libsoxr --enable-libspeex --enable-libtheora --enable-libtwolame --enable-libvidstab --enable-libvo-amrwbenc --enable-libvorbis --enable-libvpx --enable-libwavpack --enable-libwebp --enable-libx264 --enable-libx265 --enable-libxavs --enable-libxvid --enable-libzimg --enable-lzma --enable-decklink --enable-zlib

    2016-06-02 18:29:38,062 ===   libavutil      55. 19.100 / 55. 19.100

    2016-06-02 18:29:38,063 ===   libavcodec     57. 30.100 / 57. 30.100

    2016-06-02 18:29:38,064 ===   libavformat    57. 29.101 / 57. 29.101

    2016-06-02 18:29:38,064 ===   libavdevice    57.  0.101 / 57.  0.101

    2016-06-02 18:29:38,065 ===   libavfilter     6. 40.102 /  6. 40.102

    2016-06-02 18:29:38,066 ===   libswscale      4.  0.100 /  4.  0.100

    2016-06-02 18:29:38,067 ===   libswresample   2.  0.101 /  2.  0.101

    2016-06-02 18:29:38,068 ===   libpostproc    54.  0.100 / 54.  0.100

     2016-06-02 18:29:38,068 === [bmp @ 000002cd7e5cc280] not enough data (13070 < 8294454), trying to decode anyway

     2016-06-02 18:29:38,069 === [bmp @ 000002cd7e5cc280] not enough data (13016 < 8294400)

    2016-06-02 18:29:38,069 === Input #0, image2pipe, from 'pipe:.bmp':

    2016-06-02 18:29:38,262 ===   Duration: N/A, bitrate: N/A

    2016-06-02 18:29:38,262 ===     Stream #0:0: Video: bmp, bgra, 1920x1080, 8 tbr, 8 tbn, 8 tbc

    2016-06-02 18:29:38,263 === [libx264 @ 000002cd7e5d59a0] VBV bufsize set but maxrate unspecified, ignored

    2016-06-02 18:29:38,264 === [libx264 @ 000002cd7e5d59a0] using cpu capabilities: MMX2 SSE2Fast SSSE3 SSE4.2 AVX FMA3 AVX2 LZCNT BMI2

    2016-06-02 18:29:38,265 === [libx264 @ 000002cd7e5d59a0] profile Constrained Baseline, level 1.1

    2016-06-02 18:29:38,266 === [libx264 @ 000002cd7e5d59a0] 264 - core 148 r2665 a01e339 - H.264/MPEG-4 AVC codec - Copyleft 2003-2016 - http://www.videolan.org/x264.html - options: cabac=0 ref=1 deblock=0:0:0 analyse=0:0 me=dia subme=0 psy=1 psy_rd=1.00:0.00 mixed_ref=0 me_range=16 chroma_me=1 trellis=0 8x8dct=0 cqm=0 deadzone=21,11 fast_pskip=1 chroma_qp_offset=0 threads=6 lookahead_threads=1 sliced_threads=0 nr=0 decimate=1 interlaced=0 bluray_compat=0 constrained_intra=0 bframes=0 weightp=0 keyint=250 keyint_min=8 scenecut=0 intra_refresh=0 rc=crf mbtree=0 crf=18.0 qcomp=0.60 qpmin=0 qpmax=69 qpstep=4 ip_ratio=1.40 aq=0

    2016-06-02 18:29:38,463 === Output #0, mp4, to 'C:\Users\fores\AppData\Roaming\Affectiva\n_Artifacts_20160602_182857\GameplayVidOut.mp4':

    2016-06-02 18:29:38,464 ===   Metadata:

    2016-06-02 18:29:38,465 ===     encoder         : Lavf57.29.101

    2016-06-02 18:29:38,469 ===     Stream #0:0: Video: h264 (libx264) ([33][0][0][0] / 0x0021), yuv420p, 320x180, q=-1--1, 8 fps, 16384 tbn, 8 tbc

    2016-06-02 18:29:38,470 ===     Metadata:

    2016-06-02 18:29:38,472 ===       encoder         : Lavc57.30.100 libx264

    2016-06-02 18:29:38,474 ===     Side data:

    2016-06-02 18:29:38,475 ===       cpb: bitrate max/min/avg: 0/0/0 buffer size: 30000000 vbv_delay: -1

    2016-06-02 18:29:38,476 === Stream mapping:

    2016-06-02 18:29:38,477 ===   Stream #0:0 -> #0:0 (bmp (native) -> h264 (libx264))

     2016-06-02 18:29:38,480 === [bmp @ 000002cd7e5cc9a0] not enough data (13070 < 8294454), trying to decode anyway

     2016-06-02 18:29:38,662 === [bmp @ 000002cd7e5cc9a0] not enough data (13016 < 8294400)

    2016-06-02 18:29:38,662 === Error while decoding stream #0:0: Invalid data found when processing input

    2016-06-02 18:29:38,663 === frame=    0 fps=0.0 q=0.0 Lsize=       0kB time=00:00:00.00 bitrate=N/A speed=   0x    

    2016-06-02 18:29:38,663 === video:0kB audio:0kB subtitle:0kB other streams:0kB global headers:0kB muxing overhead: unknown

    2016-06-02 18:29:38,664 === Conversion failed!

    In memory, the current image is 320 pixels wide and 180 pixels tall. The pixel format is Format32bppRgb. The horizontal and vertical resolutions seem odd; they are both 96.01199. When saved to disk, here is the ffprobe output for the file:

    ffprobe version N-79143-g8ff0f6a Copyright (c) 2007-2016 the FFmpeg developers
     built with gcc 5.3.0 (GCC)
     configuration: --enable-gpl --enable-version3 --disable-w32threads --enable-avisynth --enable-bzlib --enable-fontconfig --enable-frei0r --enable-gnutls --enable-iconv --enable-libass --enable-libbluray --enable-libbs2b --enable-libcaca --enable-libdcadec --enable-libfreetype --enable-libgme --enable-libgsm --enable-libilbc --enable-libmodplug --enable-libmfx --enable-libmp3lame --enable-libopencore-amrnb --enable-libopencore-amrwb --enable-libopenjpeg --enable-libopus --enable-librtmp --enable-libschroedinger --enable-libsnappy --enable-libsoxr --enable-libspeex --enable-libtheora --enable-libtwolame --enable-libvidstab --enable-libvo-amrwbenc --enable-libvorbis --enable-libvpx --enable-libwavpack --enable-libwebp --enable-libx264 --enable-libx265 --enable-libxavs --enable-libxvid --enable-libzimg --enable-lzma --enable-decklink --enable-zlib
     libavutil      55. 19.100 / 55. 19.100
     libavcodec     57. 30.100 / 57. 30.100
     libavformat    57. 29.101 / 57. 29.101
     libavdevice    57.  0.101 / 57.  0.101
     libavfilter     6. 40.102 /  6. 40.102
     libswscale      4.  0.100 /  4.  0.100
     libswresample   2.  0.101 /  2.  0.101
     libpostproc    54.  0.100 / 54.  0.100
    Input #0, png_pipe, from 'C:\Users\fores\git\game-playtest-tool\GamePlayTest\bin\x64\Debug\D3DFromCapture.bmp':
     Duration: N/A, bitrate: N/A
       Stream #0:0: Video: png, rgba(pc), 1920x1080 [SAR 3779:3779 DAR 16:9], 25 tbr, 25 tbn, 25 tbc

    Here is a PNG version of an example screenshot from the current code (playing Portal 2):
    Portal 2 Screenshot

    Any ideas would be greatly appreciated. My current workaround is to save the files to the HDD and compile the video after gameplay, but it's a far less performant option. Thank you!
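
    (Not from the original post, just to illustrate that workaround: assuming the frames were saved as numbered BMP files, an offline pass with ffmpeg could look roughly like the line below; the file names and frame rate are placeholders.)

        ffmpeg -framerate 8 -i frame_%05d.bmp -c:v libx264 -crf 18 -pix_fmt yuv420p -y GameplayVidOut.mp4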

  • Subtitling Sierra RBT Files

    2 June 2016, by Multimedia Mike — Game Hacking

    This is part 2 of the adventure started in my Subtitling Sierra VMD Files post. After I completed the VMD subtitling, The Translator discovered a wealth of animation files in a format called RBT (this apparently stands for "Robot" but I think "Ribbit" format could be more fun). What are we going to do? We had come so far by solving the VMD subtitling problem for Phantasmagoria. It would be a shame if the effort ground to a halt due to this.

    Fortunately, the folks behind the ScummVM project already figured out enough of the format to be able to decode the RBT files in Phantasmagoria.

    In the end, I was successful in creating a completely standalone tool that can take a Robot file and a subtitle file and create a new Robot file with subtitles. The source code is here (subtitle-rbt.c). Here's what the final result looks like:


    Spanish refrigerator
    "What's in the refrigerator?" I should note at this juncture that I am not sure if this particular Robot file even has sound or dialogue, since I was conducting these experiments on a computer with non-working audio.

    The RBT Format
    I have created a new MultimediaWiki page describing the Robot Animation format based on the ScummVM source code. I have not worked with a format quite like this before. These are paletted animations which consist of a sequence of independent frames that are designed to be overlaid on top of a static background. Because of these characteristics, each frame encodes its own unique dimensions and origin coordinate within the frame. While the Phantasmagoria VMD files are usually 288×144 (which are usually double-sized for the benefit of a 640×400 Super VGA canvas), these frames are meant to be plotted on a game field that was roughly 576×288 (288×144 double-sized).

    For example, 2 minimalist animation frames from a desk investigation Robot file:


    Robot Animation Frame #1
    100×147

    Robot Animation Frame #2
    101×149

    As for compression, my first impression was that the algorithm was the same as VMD. This is wrong. It evidently uses an unmodified version of a standard algorithm called Lempel-Ziv-Stac (LZS). It shows up in several RFCs and was apparently used in MS-DOS’s transparent disk compression scheme.

    Approach
    Thankfully, many of the lessons I learned from the previous project are applicable to this project, including: subtitle library interfacing, subtitling in the paletted colorspace, and replacing encoded frames from the original file instead of trying to create a new file.

    Here is the pitch for this project:

    • Create a C program that can traverse through an input file, piece by piece, and generate an output file. The result of this should be a bitwise identical file.
    • Adapt the LZS compression decoding algorithm from ScummVM into the new tool. Make the tool dump raw portable anymap (PNM) files of varying dimensions and ensure that they look correct (see the sketch after this list).
    • Compress using LZS.
    • Stretch the frames and draw subtitles.
    • More compression. Find the minimum window for each frame.
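
    For reference, writing such a dump takes only a few lines; here is a sketch (not the actual subtitle-rbt.c code; the decoded-frame and palette arguments are placeholders):

    #include <stdio.h>

    /* Write one decoded, paletted frame as a binary PPM (P6) so it can be checked
     * in an image viewer. pixels holds one palette index per pixel; palette holds
     * 256 RGB triples. */
    static int dump_frame_ppm(const char *path, const unsigned char *pixels,
                              const unsigned char palette[256][3], int width, int height)
    {
        FILE *out = fopen(path, "wb");
        if (!out)
            return -1;
        fprintf(out, "P6\n%d %d\n255\n", width, height);
        for (int i = 0; i < width * height; i++)
            fwrite(palette[pixels[i]], 1, 3, out);
        fclose(out);
        return 0;
    }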

    Compression
    Normally, my first goal is to decompress the video and store the data in a raw form. However, this turned out to be mathematically intractable. While the format does support both compressed and uncompressed frames (even though ScummVM indicates that the uncompressed path is yet unexercised), the goal of this project requires making the frames so large that they overflow certain parameters of the file.

    A Robot file has a sequence of frames and 2 tables describing the size of each frame. One table describes the entire frame size (audio + video) while the second table describes just the video frame size. Since these tables only use 16 bits to specify a size, the maximum frame size is 65536 bytes. Leaving space for the audio portion of the frame, this only leaves a per-frame byte budget of about 63000 bytes for the video. Expanding the frame to 576×288 (165,888 pixels) would overflow this limit.

    Anyway, the upshot is that I needed to compress the data up front.

    Fortunately, the LZS compressor is pretty straightforward, at least if you have experience writing VLC-oriented codecs. While the algorithm revolves around back references, my approach was to essentially write an RLE encoder. My compressor would search for runs of data (plentiful when I started to stretch the frame for subtitling purposes). When a run length of n=3 or more of the same pixel is found, encode the pixel by itself, and then store a back reference of offset -1 and length (n-1). It took a little while to iron out a few problems, but I eventually got it to work perfectly.
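
    A minimal sketch of that idea (this is not the actual subtitle-rbt.c code; emit_literal() and emit_backref() are illustrative stand-ins for the tool's real bitstream writers):

    #include <stdio.h>

    /* Illustrative stand-ins for the real LZS bitstream writers. */
    static void emit_literal(unsigned char pixel)    { printf("LIT %02x\n", pixel); }
    static void emit_backref(int offset, int length) { printf("REF offset=-%d len=%d\n", offset, length); }

    /* One pass over a row of pixels: any run of n >= 3 identical pixels becomes a
     * literal followed by a back reference of offset -1 and length (n - 1). */
    static void encode_runs(const unsigned char *pixels, int count)
    {
        int i = 0;
        while (i < count) {
            int run = 1;
            while (i + run < count && pixels[i + run] == pixels[i])
                run++;
            if (run >= 3) {
                emit_literal(pixels[i]);
                emit_backref(1, run - 1);   /* copies the previous pixel (n - 1) more times */
                i += run;
            } else {
                emit_literal(pixels[i]);
                i++;
            }
        }
    }

    int main(void)
    {
        const unsigned char row[] = { 7, 7, 7, 7, 7, 3, 3, 9, 9, 9 };
        encode_runs(row, (int)sizeof(row));
        return 0;
    }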

    I have to say, however, that the format is a little bit weird in how it codes very large numbers. The length encoding is somewhat Golomb-like, i.e., smaller values are encoded with fewer bits. However, when it gets to large numbers, it starts encoding counts of 15 as blocks of 1111. For example, 24 is bigger than 7. Thus, emit 1111 into the bitstream and subtract 8 from 24 -> 16. Still bigger than 15, so stuff another 1111 into the bitstream and subtract 15. Now we're at 1, so stuff 0001. So 24 is 11111111 0001. 12 bits is not too horrible. But for very large values the encoded length works out to roughly (value / 30) bytes, so a value of 300 takes around 10 bytes (80 bits) to encode.
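
    As code, one plausible reading of that scheme looks like this (it reproduces the worked example for 24; the exact boundary behaviour and the shorter codes used for small lengths are assumptions, and put_nibble() is an illustrative stand-in):

    #include <stdio.h>

    /* Illustrative 4-bit writer: prints each nibble as bits. */
    static void put_nibble(unsigned v)
    {
        printf("%u%u%u%u ", (v >> 3) & 1, (v >> 2) & 1, (v >> 1) & 1, v & 1);
    }

    /* Length coding for values of 8 or more: a 1111 prefix, subtract 8, then keep
     * emitting 1111 blocks of 15 until a final nibble in the range 0..14 remains. */
    static void encode_long_length(unsigned length)
    {
        put_nibble(0xF);
        length -= 8;
        while (length >= 15) {
            put_nibble(0xF);
            length -= 15;
        }
        put_nibble(length);
    }

    int main(void)
    {
        encode_long_length(24);   /* prints 1111 1111 0001, i.e. 12 bits */
        printf("\n");
        return 0;
    }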

    Palette Slices
    As in the VMD subtitling project, I took the subtitle color offered in the subtitle spec file as a suggestion and used Euclidean distance to match to the closest available color in the palette. One problem, however, is that the palette is a lot smaller in these animations. According to my notes, for the set of animations I scanned, only about 80 colors were specified, starting at palette index 55. I hypothesize that different slices of the palette are reserved for different uses. E.g., animation, background, and user interface. Thus, there is a smaller number of colors to draw upon for subtitling purposes.
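
    The matching step itself is simple; a sketch under assumed names (the slice of roughly 80 entries starting at palette index 55 comes from the notes above, everything else is illustrative):

    /* Nearest-color lookup over one slice of a 256-entry RGB palette,
     * using squared Euclidean distance. */
    typedef struct { unsigned char r, g, b; } rgb_t;

    static int closest_palette_index(const rgb_t *palette, int first, int count, rgb_t target)
    {
        int best = first;
        long best_dist = -1;
        for (int i = first; i < first + count; i++) {
            long dr = (long)palette[i].r - target.r;
            long dg = (long)palette[i].g - target.g;
            long db = (long)palette[i].b - target.b;
            long dist = dr * dr + dg * dg + db * db;
            if (best_dist < 0 || dist < best_dist) {
                best_dist = dist;
                best = i;
            }
        }
        return best;
    }

    /* e.g. closest_palette_index(palette, 55, 80, requested_subtitle_color) */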

    Scaling
    One bit of residual weirdness in this format is the presence of a per-frame scale factor. While most frames set this to 100 (100% scale), I have observed 70%, 80%, and 90%. ScummVM is a bit unsure about how to handle these, so I am as well. However, I eventually realized I didn’t really need to care, at least not when decoding and re-encoding the frame. Just preserve the scale factor. I intend to modify the tool further to take scale factor into account when creating the subtitle.

    The Final Resolution
    Right around the time that I was composing this post, The Translator emailed me and notified me that he had found a better way to subtitle the Robot files by modifying the scripts, rendering my entire approach moot. The result is much cleaner:


    Proper RBT Subtitles
    Turns out that the engine supported subtitles all along

    It’s a good thing that I enjoyed the challenge or I might be annoyed at this point.

    See Also