
Other articles (66)

  • Improving the base version

    13 September 2013

    Nicer multiple selection
    The Chosen plugin improves the usability of multiple-selection fields. See the two images below for a comparison.
    To do this, simply enable the Chosen plugin (Configuration générale du site > Gestion des plugins), then configure it (Les squelettes > Chosen) by enabling Chosen on the public site and specifying which form elements to enhance, for example select[multiple] for multiple-selection lists (...)

  • The farm's regular Cron tasks

    1 December 2010

    Managing the farm relies on running several repetitive tasks, known as Cron tasks, at regular intervals.
    The super Cron (gestion_mutu_super_cron)
    This task, scheduled every minute, simply calls the Cron of every instance of the shared hosting farm on a regular basis. Combined with a system Cron on the central site of the farm, this generates regular visits to the various sites and prevents the tasks of rarely visited sites from being too (...)

  • Custom menus

    14 November 2010

    MediaSPIP uses the Menus plugin to manage several configurable navigation menus.
    This lets channel administrators fine-tune these menus.
    Menus created when the site is initialized
    By default, three menus are created automatically when the site is initialized: the main menu; identifier: barrenav; this menu is usually inserted at the top of the page after the header block, and its identifier makes it compatible with Zpip-based skeletons; (...)

On other sites (8099)

  • How to send written frames in real time/synchronized with FFmpeg and UDP?

    20 June 2018, by potu1304

    I wanted to stream my Unity game nearly live with FFmpeg to a simple client. I have a Unity game in which each frame is saved as a jpg image. These images are wrapped by ffmpeg and sent over UDP to a simple C# client, where I use ffplay to play the stream. The problem is that FFmpeg consumes the images much faster than the Unity app can write them, so ffmpeg quits while Unity is still writing frames. Is there a way to make ffmpeg wait for the next image, or can I somehow build a loop without invoking ffmpeg every time?

    Here is my function from my capturing script in Unity:

    // Requires "using System.Diagnostics;" and "using UnityEngine;" at the top of the file.
    Process process;
    //BinaryWriter _stdin;

    public void encodeFrame()
    {
        // Path to the ffmpeg binary shipped with the project.
        var basePath = Application.streamingAssetsPath + "/FFmpegOut/Windows/ffmpeg.exe";

        var info = new ProcessStartInfo();
        info.FileName = basePath;   // the original snippet never assigned FileName
        info.Arguments = "-re -i screen_%d.jpg -vcodec libx264 -r 24 -f mpegts udp://127.0.0.1:1100";
        info.RedirectStandardOutput = true;
        info.RedirectStandardInput = true;
        info.RedirectStandardError = true;
        info.CreateNoWindow = true;
        info.UseShellExecute = false;

        UnityEngine.Debug.Log(string.Format(
            "Executing \"{0}\" with arguments \"{1}\".\r\n",
            info.FileName,
            info.Arguments));

        process = Process.Start(info);
        //_stdin = new BinaryWriter(process.StandardInput.BaseStream);

        // Read stderr before waiting, otherwise a full pipe buffer can deadlock ffmpeg.
        string error = process.StandardError.ReadToEnd();
        process.WaitForExit();
        UnityEngine.Debug.Log(error);
    }
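
    A sketch of one way around the "ffmpeg consumes the images faster than Unity writes them" problem: instead of letting ffmpeg scan screen_%d.jpg on disk, push each JPEG through ffmpeg's standard input with the image2pipe demuxer, so ffmpeg blocks on the pipe and waits for the next frame rather than exiting. This is not the original poster's code; the helper names (StartPipedEncoder, PushFrame, StopPipedEncoder) and the ffmpegPath parameter are illustrative.

    // Requires "using System.Diagnostics;" and "using System.IO;".
    Process pipeProc;
    BinaryWriter pipeStdin;

    public void StartPipedEncoder(string ffmpegPath)
    {
        var info = new ProcessStartInfo
        {
            FileName = ffmpegPath,
            // Read MJPEG frames from stdin at 24 fps, encode with libx264, stream over UDP.
            Arguments = "-f image2pipe -c:v mjpeg -framerate 24 -i - " +
                        "-c:v libx264 -r 24 -f mpegts udp://127.0.0.1:1100",
            UseShellExecute = false,
            RedirectStandardInput = true,
            CreateNoWindow = true
        };
        pipeProc = Process.Start(info);
        pipeStdin = new BinaryWriter(pipeProc.StandardInput.BaseStream);
    }

    // Call once per captured frame with the JPEG bytes instead of writing them to disk.
    public void PushFrame(byte[] jpegBytes)
    {
        pipeStdin.Write(jpegBytes);
        pipeStdin.Flush();
    }

    // Closing stdin tells ffmpeg that the stream is finished.
    public void StopPipedEncoder()
    {
        pipeStdin.Close();
        pipeProc.WaitForExit();
    }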

    And here is the function from the .cs file of my simple WinForms application:

    private void xxxFFplay()
    {
       // Tell the sender (over the already-connected UDP socket) that it can start streaming.
       text = "start";
       byte[] send_buffer = Encoding.ASCII.GetBytes(text);
       sock.SendTo(send_buffer, endPoint);

       ffplay.StartInfo.FileName = "ffplay.exe";
       ffplay.StartInfo.Arguments = "udp://127.0.0.1:1100";
       ffplay.StartInfo.CreateNoWindow = true;
       ffplay.StartInfo.RedirectStandardOutput = true;
       ffplay.StartInfo.RedirectStandardError = true;
       ffplay.StartInfo.UseShellExecute = false;

       ffplay.EnableRaisingEvents = true;
       ffplay.OutputDataReceived += (o, e) => Debug.WriteLine(e.Data ?? "NULL", "ffplay");
       ffplay.ErrorDataReceived += (o, e) => Debug.WriteLine(e.Data ?? "NULL", "ffplay");
       ffplay.Exited += (o, e) => Debug.WriteLine("Exited", "ffplay");
       ffplay.Start();

       // Begin the asynchronous reads, otherwise the DataReceived handlers never fire.
       ffplay.BeginOutputReadLine();
       ffplay.BeginErrorReadLine();

       Thread.Sleep(500); // wait/check that the process has started, then...

       // child, new parent
       // make 'this' the parent of ffplay (presuming you are in scope of a Form or Control)
       //SetParent(ffplay.MainWindowHandle, this.panel1.Handle);

       // window, x, y, width, height, repaint
       // move the ffplay window to the top-left corner and set the size to 320x280
       //MoveWindow(ffplay.MainWindowHandle, -5, -300, 320, 280, true);
    }

    Does anybody have some ideas? I am really stuck trying to create a somewhat "live" stream.

    Best regards

  • Output a Java window as a webcam stream

    15 June 2012, by Zac

    I would like to write a program, preferably in Java, that can display animated overlays on a screen.

    The screen will then be broadcast over the internet using a separate program called x-split.

    A good way to do this would be to create a transparent window in Java that displays animated files (with transparency); the output of this window should ideally appear in the webcam device list so it can be easily picked up by x-split, which would let it be arranged on top of the game screen I'm currently broadcasting.

    An example program of this type would be one where a webcam image is displayed and "virtual glasses" are overlaid on the image of a person's face, which could then be transmitted as an output cam.

    I have found the Java 6u10 translucent-shapes library to create the transparent window, but I don't know how to stream it.

    I've read a few things suggesting that JMF and FFmpeg might be the way to go, but I'm not sure what to install and how.

    Any help or pointers to tutorials would be greatly appreciated.

  • Compression of a video with constant background

    28 June 2017, by Spirou003

    (Sorry if my English is bad; I can read it but I don't write it very well...)

    I want to compress some videos which have two particularities:

    • there is a background that covers 90% of the area during the whole video
    • most of the other elements can be described separately as a moving picture

    My videos are like this one, and have no audio. As you can see, almost everything can be described with a fixed background, a few small moving images, plus some noise. Moreover, this noise is almost null, so entropy coding would be very efficient. I think it would produce tiny files (< 5 MB) even for durations of hours, which would be very welcome since I have already recorded 30 hours of gameplay (current size: 3 GB).

    Is there any way to produce new video files that take advantage of this information? If yes, what are the implications of such an encoding for watching these videos with Windows Media Player, or for use with ffmpeg?

    I searched Google for anything that could help me, but I don't know which keywords to use for this, so I didn't find anything useful :-(

    Thanks in advance :-)

    PS: another example; the video is sped up but shows the interesting moves
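
    A minimal sketch, not an answer from the thread: standard inter-frame codecs such as H.264 already encode only what changes from frame to frame, so a fixed background costs almost nothing after the first keyframe. Re-encoding with libx264, a long GOP (few keyframes), a slow preset and a moderate CRF is therefore the usual way to exploit this kind of content; the file names and parameter values below are illustrative, and ffmpeg is assumed to be on the PATH.

    // Re-encode a screen recording whose background never changes.
    using System.Diagnostics;

    class Recompress
    {
        static void Main()
        {
            var info = new ProcessStartInfo
            {
                FileName = "ffmpeg",
                // -an: drop audio (the videos have none)
                // -preset veryslow: spend CPU time for a smaller file
                // -crf 26: moderate constant quality
                // -g 600: keyframe only every 600 frames, since the background is static
                Arguments = "-i capture.avi -an -c:v libx264 -preset veryslow -crf 26 -g 600 compressed.mp4",
                UseShellExecute = false,
                CreateNoWindow = true
            };
            using (var p = Process.Start(info))
            {
                p.WaitForExit();
            }
        }
    }

    The resulting H.264/MP4 file should play in Windows Media Player 12 (Windows 7 and later) and can still be processed with ffmpeg afterwards.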