Advanced search

Media (1)

Keyword: - Tags -/artwork

Other articles (95)

  • Custom menus

    14 November 2010, by

    MediaSPIP uses the Menus plugin to manage several configurable menus for navigation.
    This lets channel administrators fine-tune these menus.
    Menus created at site initialization
    By default, three menus are created automatically when the site is initialized: the main menu; identifier: barrenav; this menu is generally inserted at the top of the page, after the header block, and its identifier makes it compatible with skeletons based on Zpip; (...)

  • Apache-specific configuration

    4 February 2011, by

    Specific modules
    For Apache configuration, it is advisable to enable certain modules that are not specific to MediaSPIP but improve performance: mod_deflate and mod_headers, to have Apache compress pages automatically (see this tutorial); mod_expires, to handle hit expiration correctly (see this tutorial);
    It is also advisable to add Apache support for the WebM MIME type, as described in this tutorial.
    Creation of a (...)
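    The excerpt above can be illustrated with the kind of directives involved (a sketch only; the content types and expiry durations are illustrative, not MediaSPIP's official configuration):

    ```apache
    # Compress text responses on the fly (mod_deflate)
    AddOutputFilterByType DEFLATE text/html text/css application/javascript

    # Sensible expiration headers for static media (mod_expires)
    ExpiresActive On
    ExpiresByType image/jpeg "access plus 1 month"
    ExpiresByType video/webm "access plus 1 month"

    # Serve .webm files with the correct MIME type
    AddType video/webm .webm
    ```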

  • Making files available

    14 April 2011, by

    By default, at initialization, MediaSPIP does not allow visitors to download files, whether they are originals or the result of transformation or encoding. It only allows them to be viewed.
    However, it is possible and easy to give visitors access to these documents, in various forms.
    All of this happens on the skeleton's configuration page. You need to go to the channel's administration area and choose in the navigation (...)

On other sites (5947)

  • How to properly pipe adb screenrecord (h264 stream) to ffplay in a WinForms app?

    23 April 2022, by GeneralFuzz

    How to pipe ADB's exec-out to ffplay?

    


    I have been struggling to get this "Live view" C# WinForms app working properly for the past week. The goal is to show the Android screen in the native app window, where I then have other controls implemented as an overlay.

    


    I am able to live-stream by piping adb's screenrecord H264 output into FFplay via CMD.
A CMD process that launches a .BAT does work, but I can't manipulate FFplay, as control seems to be lost with how it's launched (correct me if I'm wrong).
I just need a programmatic version of this, where I can then control the FFplay window and merge it as a child into my form.

    


    adb exec-out screenrecord --output-format=h264 - | ffplay -window_title "Live View" -framerate 60 -framedrop -probesize 32 -sync video  -


    


    I also attempted creating an ADB process and an FFplay process, manually writing ADB's standard output to FFplay's standard input. The standard output was received, but I couldn't figure out how to write to ffplay correctly. I may also have had a same-thread deadlock issue.

    


        // Configure the ffplay process and start it
        ffplayProcess.OutputDataReceived += (o, ev) => Debug.WriteLine(ev.Data ?? "NULL", "ffplay");
        ffplayProcess.ErrorDataReceived += (o, ev) => Debug.WriteLine(ev.Data ?? "NULL", "ffplay");
        ffplayProcess.Exited += (o, ev) => Debug.WriteLine("Exited", "ffplay");
        try
        {
            ffplayProcess.Start();
        }
        catch (Exception err)
        {
            MessageBox.Show($"Failed to start livestream. {err.Message}", "Live Stream Error", MessageBoxButtons.OK, MessageBoxIcon.Error);
            return;
        }

        // Wait/check that the process started, then...
        System.Threading.Thread.Sleep(200);

        // Run only if ffplay has not exited
        if (!ffplayProcess.HasExited)
        {
            // Make 'this' the parent of the ffplay window (presuming we are in scope of a Form or Control)
            SetParent(ffplayProcess.MainWindowHandle, this.Handle);
            MoveWindow(ffplayProcess.MainWindowHandle, 0, 0, 240, 320, true);
        }

        adbProcess.OutputDataReceived += (o, ev) => {
            Debug.WriteLine(ev.Data ?? "NULL", "adb");

            // was: if (ev.Data != "NULL" || ev.Data != null), which is always true
            if (ev.Data != null)
            {
                // Caveat: OutputDataReceived delivers decoded text lines, so re-encoding
                // them with Encoding.UTF8.GetBytes corrupts the binary H.264 stream.
                byte[] dataBytes = Encoding.UTF8.GetBytes(ev.Data);
                ffplayProcess.StandardInput.BaseStream.WriteAsync(dataBytes, 0, dataBytes.Length);
                ffplayProcess.StandardInput.BaseStream.FlushAsync();
            }
        };
        adbProcess.ErrorDataReceived += (o, ev) => Debug.WriteLine(ev.Data ?? "NULL", "adb");
        adbProcess.Exited += (o, ev) => Debug.WriteLine("Exited", "adb");

        adbProcess.Start();
        adbProcess.BeginOutputReadLine();
        adbProcess.BeginErrorReadLine();
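    A likely reason the manual stdin/stdout approach fails is that `OutputDataReceived` hands over decoded text lines rather than raw bytes, which corrupts a binary H.264 stream. The safe pattern is to copy the producer's raw stdout stream directly into the consumer's stdin. A minimal sketch of that pattern in Python (the two inline commands below are stand-ins for adb and ffplay, which cannot run here):

    ```python
    import shutil
    import subprocess
    import sys

    # Stand-in for the producer ("adb exec-out screenrecord ... -"):
    # it writes raw bytes to stdout.
    producer = subprocess.Popen(
        [sys.executable, "-c",
         "import sys; sys.stdout.buffer.write(b'\\x00\\x01BINARY\\xff\\xd9')"],
        stdout=subprocess.PIPE,
    )
    # Stand-in for the consumer ("ffplay -"): it reads raw bytes from stdin.
    consumer = subprocess.Popen(
        [sys.executable, "-c",
         "import sys; sys.stdout.buffer.write(sys.stdin.buffer.read())"],
        stdin=subprocess.PIPE,
        stdout=subprocess.PIPE,
    )

    # Copy raw bytes, never decoded text: this is what keeps H.264 intact.
    shutil.copyfileobj(producer.stdout, consumer.stdin)
    consumer.stdin.close()
    out, _ = consumer.communicate()
    print(out)  # prints b'\x00\x01BINARY\xff\xd9'
    ```

    The C# equivalent of this pattern would be to set RedirectStandardOutput/RedirectStandardInput on the two Process objects and copy adbProcess.StandardOutput.BaseStream into ffplayProcess.StandardInput.BaseStream with Stream.CopyToAsync, bypassing the event-based text API entirely.
    
    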


    


    My current attempt uses MedallionShell to pipe into the FFplay process. ADB and FFplay launch, but I never get FFplay's video output window.

    


    private void FormLiveView_Load(object sender, EventArgs e)
    {
        // Arguments must be passed as separate array elements, not as one
        // space-separated string (which would be quoted as a single argument).
        var command = Medallion.Shell.Command.Run(
            tmpPath + "/adb.exe",
            new[] { "exec-out", "screenrecord", "--output-format=h264", "-" },
            options => { options.DisposeOnExit(false); });

        command.PipeTo(Medallion.Shell.Command.Run(
            tmpPath + "/ffplay.exe",
            new[] { "-framerate", "60", "-framedrop", "-probesize", "32", "-sync", "video", "-" },
            options => { options.DisposeOnExit(false); }));
    }


    


  • How to implement Seekbar for a video player playing FFmpeg pipe output in Flutter ?

    19 April 2022, by Lins Louis

    I wanted to create a video player that can play FFmpeg pipe output in Flutter. Luckily I found a solution in the GitHub project flutter-ffmpeg. Thanks @tanersener for this amazing project: https://github.com/tanersener/flutter-ffmpeg

    


    Below is the comment that helped me achieve the feature I was looking for:

    


    https://github.com/tanersener/flutter-ffmpeg/issues/92#issuecomment-606051974

    


    Thanks also to @reeckset.

    


    BTW, my current issue is that I haven't found any solution for seeking in a video player that plays FFmpeg pipe output. Is there anything I can do to implement a seek bar in my video player?

    


  • Receiving multiple files from ffmpeg via subprocess.PIPE

    11 April 2022, by Per Plexi

    I am using ffmpeg to convert a video into images. These images are then processed by my Python program. Originally I used ffmpeg to first save the images to disk, then read them one by one with Python.

    



    This works fine, but in an effort to speed up the program I am trying to skip the storage step and only work with the images in memory.

    



    I use the following ffmpeg command and Python subprocess call to pipe the output from ffmpeg to Python:

    



    command = "ffmpeg.exe -i ADD\\sg1-original.mp4 -r 1 -f image2pipe pipe:1"
pipe = subprocess.Popen(command.split(), stdout=subprocess.PIPE, stderr=subprocess.PIPE)
image = Image.open(io.BytesIO(pipe.communicate()[0]))


    



    The image variable can then be used by my program. The problem is that if I send more than one image from ffmpeg, all the data ends up concatenated in this variable. I need a way to separate the images. The only way I can think of is splitting on the JPEG end-of-image marker (0xFF, 0xD9). This works, but seems unreliable.
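    A splitter along those lines can be sketched as a generator that scans for the JPEG SOI (FF D8) and EOI (FF D9) markers. This is a sketch with a known caveat: an EOI inside an embedded thumbnail (e.g. an EXIF preview) would split a frame too early. The synthetic "frames" below are markers only, for illustration:

    ```python
    def split_jpegs(data: bytes):
        """Yield one JPEG per frame from a concatenated byte stream.

        Splits on the SOI (FF D8) / EOI (FF D9) marker pair. Caveat: an EOI
        inside an embedded thumbnail would cause a premature split.
        """
        start = 0
        while True:
            soi = data.find(b"\xff\xd8", start)
            if soi == -1:
                return
            eoi = data.find(b"\xff\xd9", soi + 2)
            if eoi == -1:
                return
            yield data[soi:eoi + 2]
            start = eoi + 2

    # Two minimal fake "JPEGs" back to back (markers only, for illustration)
    stream = b"\xff\xd8AAAA\xff\xd9" + b"\xff\xd8BBBB\xff\xd9"
    frames = list(split_jpegs(stream))
    print(len(frames))  # prints 2
    ```

    In practice this would be applied incrementally to the bytes read from pipe.stdout rather than to the full communicate() result, so each frame can be handed to the program as soon as its EOI arrives.
    
    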

    



    What have I missed regarding piping files with subprocess? Is there a way to read only one file at a time from the pipe?