Other articles (60)

  • Customizing by adding your logo, banner, or background image

    5 September 2013

    Some themes support three customization elements: adding a logo; adding a banner; adding a background image.

  • MediaSPIP v0.2

    21 June 2013

    MediaSPIP 0.2 is the first stable version of MediaSPIP.
    Its official release date is 21 June 2013, and it is announced here.
    The zip file provided here contains only the MediaSPIP sources in standalone form.
    As with the previous version, all software dependencies must be installed manually on the server.
    If you wish to use this archive for a farm-mode installation, you will also need to make further modifications (...)

  • Publishing on MediaSPIP

    13 June 2013

    Can I post content from an iPad tablet?
    Yes, if your installed MediaSPIP is at version 0.2 or later. If necessary, contact the administrator of your MediaSPIP to find out.

On other sites (10849)

  • ValueError: I/O operation on closed file when making animation

    3 July 2018, by user3851187

    I am using matplotlib and ffmpeg to do some animations. I usually code on a remote server because the code runs faster; we are having some issues making animations on the remote server. Here is an example of code that works perfectly on my local Mac but does not work remotely.

    import matplotlib as mpl
    mpl.use('agg')  # non-interactive backend, suitable for a headless server
    from matplotlib import animation
    import pylab

    def init():
        pylab.plot(pylab.arange(10), [0]*10)

    def redraw(frame):
        pylab.plot(pylab.arange(10), pylab.arange(10) * frame)

    fig = pylab.figure()
    ani = animation.FuncAnimation(fig, redraw, frames=10, interval=1000, init_func=init)
    ani.save('animation.mp4')

    I get the animation I want on my local machine (macOS Sierra). When I run it on the remote host (Debian GNU/Linux 8 (jessie)), I get the following error message after 5 frames:

    Traceback (most recent call last):
     File "animation.py", line 14, in <module>
       ani.save('animation.mp4')
     File "/usr/local/lib/python2.7/dist-packages/matplotlib/animation.py", line 1200, in save
       writer.grab_frame(**savefig_kwargs)
     File "/usr/lib/python2.7/contextlib.py", line 35, in __exit__
       self.gen.throw(type, value, traceback)
     File "/usr/local/lib/python2.7/dist-packages/matplotlib/animation.py", line 241, in saving
       self.finish()
     File "/usr/local/lib/python2.7/dist-packages/matplotlib/animation.py", line 367, in finish
       self.cleanup()
     File "/usr/local/lib/python2.7/dist-packages/matplotlib/animation.py", line 405, in cleanup
       out, err = self._proc.communicate()
     File "/usr/local/lib/python2.7/dist-packages/subprocess32.py", line 724, in communicate
       stdout, stderr = self._communicate(input, endtime, timeout)
     File "/usr/local/lib/python2.7/dist-packages/subprocess32.py", line 1535, in _communicate
       orig_timeout)
     File "/usr/local/lib/python2.7/dist-packages/subprocess32.py", line 1591, in _communicate_with_poll
       register_and_append(self.stdout, select_POLLIN_POLLPRI)
     File "/usr/local/lib/python2.7/dist-packages/subprocess32.py", line 1570, in register_and_append
       poller.register(file_obj.fileno(), eventmask)
    ValueError: I/O operation on closed file

    My local machine uses matplotlib version 2.0.0; the remote machine uses matplotlib version 2.2.2.

    On my local machine I have ffmpeg version 3.2.4:

    $ ffmpeg -version
    ffmpeg version 3.2.4 Copyright (c) 2000-2017 the FFmpeg developers
    built with Apple LLVM version 8.0.0 (clang-800.0.42.1)
    configuration: --prefix=/usr/local/Cellar/ffmpeg/3.2.4 --enable-shared --enable-pthreads --enable-gpl --enable-version3 --enable-hardcoded-tables --enable-avresample --cc=clang --host-cflags= --host-ldflags= --enable-libmp3lame --enable-libx264 --enable-libxvid --enable-opencl --disable-lzma --enable-vda
    libavutil      55. 34.101 / 55. 34.101
    libavcodec     57. 64.101 / 57. 64.101
    libavformat    57. 56.101 / 57. 56.101
    libavdevice    57.  1.100 / 57.  1.100
    libavfilter     6. 65.100 /  6. 65.100
    libavresample   3.  1.  0 /  3.  1.  0
    libswscale      4.  2.100 /  4.  2.100
    libswresample   2.  3.100 /  2.  3.100
    libpostproc    54.  1.100 / 54.  1.100

    On the remote host I have ffmpeg version 4.0.1:

    $ ffmpeg -version
    ffmpeg version 4.0.1 Copyright (c) 2000-2018 the FFmpeg developers
    built with gcc 4.9.2 (Debian 4.9.2-10+deb8u1)
    configuration: --prefix=/usr/local
    libavutil      56. 14.100 / 56. 14.100
    libavcodec     58. 18.100 / 58. 18.100
    libavformat    58. 12.100 / 58. 12.100
    libavdevice    58.  3.100 / 58.  3.100
    libavfilter     7. 16.100 /  7. 16.100
    libswscale      5.  1.100 /  5.  1.100
    libswresample   3.  1.100 /  3.  1.100

    If I recall correctly, I installed ffmpeg locally through Homebrew; I have the Anaconda distribution of Python. On the remote machine we have the default version of Python that comes with Jessie; I'm not sure how the sysadmin installed ffmpeg.

    I am by no means an expert on ffmpeg, but I have never had issues making animations in matplotlib on my local machine, and I would really like to be able to make videos more quickly on the remote machine. Any help would be appreciated!

    Edit
    On the remote machine, the animation works if I use avconv as the writer instead of ffmpeg (see the sketch below). I installed avconv locally... which led me to the same ffmpeg issues locally (probably due to updating shared dependencies). However, I uninstalled ffmpeg and reinstalled it with the x264 codec enabled. Related: Animations in ipython (jupyter) notebook - ValueError: I/O operation on closed file.
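
    For reference, a minimal sketch of selecting avconv as the writer by name; it assumes avconv is installed and on the PATH of the machine doing the rendering:

    import matplotlib as mpl
    mpl.use('agg')
    from matplotlib import animation
    import pylab

    fig = pylab.figure()

    def redraw(frame):
        pylab.plot(pylab.arange(10), pylab.arange(10) * frame)

    ani = animation.FuncAnimation(fig, redraw, frames=10, interval=1000)

    # Pass the writer by name: matplotlib registers both 'ffmpeg' and 'avconv'
    # writer classes and invokes the matching external binary.
    ani.save('animation.mp4', writer='avconv')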

  • Unity: Converting Texture2D to YUV420P and sending with UDP using FFmpeg

    22 June 2018, by potu1304

    In my Unity game each frame is rendered into a texture and then put together into a video using FFmpeg. Now my question is whether I am doing this right, because avcodec_send_frame throws an exception every time.
    I am pretty sure that I am doing something wrong, or doing it in the wrong order, or simply missing something.

    Here is the code for capturing the texture:

    void Update() {
        //StartCoroutine(CaptureFrame());

        if (rt == null)
        {
            rect = new Rect(0, 0, captureWidth, captureHeight);
            rt = new RenderTexture(captureWidth, captureHeight, 24);
            frame = new Texture2D(captureWidth, captureHeight, TextureFormat.RGB24, false);
        }

        Camera camera = this.GetComponent<Camera>(); // NOTE: added because there was no reference to camera in original script; must add this script to Camera
        camera.targetTexture = rt;
        camera.Render();

        RenderTexture.active = rt;
        frame.ReadPixels(rect, 0, 0);
        frame.Apply();

        camera.targetTexture = null;
        RenderTexture.active = null;

        byte[] fileData = null;
        fileData = frame.GetRawTextureData();
        encoding(fileData, fileData.Length);
    }

    And here is the code for encoding and sending the byte data:

    private unsafe void encoding(byte[] bytes, int size)
    {
        Debug.Log("Encoding...");
        AVCodec* codec;
        codec = ffmpeg.avcodec_find_encoder(AVCodecID.AV_CODEC_ID_H264);
        int ret, got_output = 0;

        AVCodecContext* codecContext = null;
        codecContext = ffmpeg.avcodec_alloc_context3(codec);
        codecContext->bit_rate = 400000;
        codecContext->width = captureWidth;
        codecContext->height = captureHeight;
        //codecContext->time_base.den = 25;
        //codecContext->time_base.num = 1;

        AVRational timeBase = new AVRational();
        timeBase.num = 1;
        timeBase.den = 25;
        codecContext->time_base = timeBase;
        //AVStream* videoAVStream = null;
        //videoAVStream->time_base = timeBase;

        AVRational frameRate = new AVRational();
        frameRate.num = 25;
        frameRate.den = 1;
        codecContext->framerate = frameRate;

        codecContext->gop_size = 10;
        codecContext->max_b_frames = 1;
        codecContext->pix_fmt = AVPixelFormat.AV_PIX_FMT_YUV420P;

        AVFrame* inputFrame;
        inputFrame = ffmpeg.av_frame_alloc();
        inputFrame->format = (int)codecContext->pix_fmt;
        inputFrame->width = captureWidth;
        inputFrame->height = captureHeight;
        inputFrame->linesize[0] = inputFrame->width;

        AVPixelFormat dst_pix_fmt = AVPixelFormat.AV_PIX_FMT_YUV420P, src_pix_fmt = AVPixelFormat.AV_PIX_FMT_RGBA;
        int src_w = 1920, src_h = 1080, dst_w = 1920, dst_h = 1080;
        SwsContext* sws_ctx;

        GCHandle pinned = GCHandle.Alloc(bytes, GCHandleType.Pinned);
        IntPtr address = pinned.AddrOfPinnedObject();

        sbyte** inputData = (sbyte**)address;
        sws_ctx = ffmpeg.sws_getContext(src_w, src_h, src_pix_fmt,
                             dst_w, dst_h, dst_pix_fmt,
                             0, null, null, null);

        fixed (int* lineSize = new int[1])
        {
            lineSize[0] = 4 * captureHeight;
            // Convert RGBA to YUV420P
            ffmpeg.sws_scale(sws_ctx, inputData, lineSize, 0, codecContext->width, inputFrame->extended_data, inputFrame->linesize);
        }

        inputFrame->pts = counter++;

        if (ffmpeg.avcodec_send_frame(codecContext, inputFrame) < 0)
            throw new ApplicationException("Error sending a frame for encoding!");

        AVPacket pkt;
        pkt = new AVPacket();
        //pkt.data = inData;
        AVPacket* packet = &pkt;
        ffmpeg.av_init_packet(packet);

        Debug.Log("pkt.size " + pkt.size);
        pinned.Free();
        AVDictionary* options = null;
        ffmpeg.av_dict_set(&options, "pkt_size", "1300", 0);
        ffmpeg.av_dict_set(&options, "buffer_size", "65535", 0);
        AVIOContext* server = null;
        ffmpeg.avio_open2(&server, "udp://192.168.0.1:1111", ffmpeg.AVIO_FLAG_WRITE, null, &options);
        Debug.Log("encoded");
        ret = ffmpeg.avcodec_encode_video2(codecContext, &pkt, inputFrame, &got_output);
        ffmpeg.avio_write(server, pkt.data, pkt.size);
        ffmpeg.av_free_packet(&pkt);
        pkt.data = null;
        pkt.size = 0;
    }

    And every time I start the game,

    if (ffmpeg.avcodec_send_frame(codecContext, inputFrame) < 0)
        throw new ApplicationException("Error sending a frame for encoding!");

    throws the exception.
    Any help in fixing the issue would be greatly appreciated :)
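
    For reference, the usual libavcodec sequence opens the codec context with avcodec_open2 before any call to avcodec_send_frame; the code above never does this. A minimal sketch in the same FFmpeg.AutoGen style, reusing the question's names (the error handling and values are assumptions, not a verified fix):

    int captureWidth = 1920, captureHeight = 1080;   // as in the question

    AVCodec* codec = ffmpeg.avcodec_find_encoder(AVCodecID.AV_CODEC_ID_H264);
    AVCodecContext* codecContext = ffmpeg.avcodec_alloc_context3(codec);

    // Configure the context before opening it.
    codecContext->width = captureWidth;
    codecContext->height = captureHeight;
    codecContext->time_base = new AVRational { num = 1, den = 25 };
    codecContext->pix_fmt = AVPixelFormat.AV_PIX_FMT_YUV420P;

    // Open the encoder; without this, avcodec_send_frame is expected to fail.
    if (ffmpeg.avcodec_open2(codecContext, codec, null) < 0)
        throw new ApplicationException("Could not open codec!");

    // Only now send frames (inputFrame prepared as in the question).
    if (ffmpeg.avcodec_send_frame(codecContext, inputFrame) < 0)
        throw new ApplicationException("Error sending a frame for encoding!");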

  • Java/OpenCV - How to do lossless h264 video writing in OpenCV?

    15 August 2018, by JohnDoeAnon

    Lately I have had some struggles with the VideoWriter in OpenCV under Java. I want to write a video file into an *.mp4 container with the h.264 codec, but I see no option to set bitrate or quality in the OpenCV VideoWriter. I did build OpenCV with ffmpeg as the backend. I just want to write the video file with exactly the same quality as the original input video.
    I also have some code to do the job:

    import org.opencv.core.Mat;
    import org.opencv.core.Size;
    import org.opencv.videoio.VideoWriter;
    import org.opencv.videoio.Videoio;

    public class VideoOutput
    {
        private final int H264_CODEC = 33;

        private VideoWriter writer;

        private String filename;

        public VideoOutput(String filename)
        {
            writer = null;

            this.filename = filename;
        }

        public void initialize(double framesPerSecond, int height, int width) throws Exception
        {
            this.writer = new VideoWriter();

            this.writer.open(filename, H264_CODEC, framesPerSecond, new Size(width, height));

            if (!writer.isOpened())
            {
                Logging.LOGGER.severe("Could not create video output file " + filename + "\n");

                throw new Exception("Could not create video output file " + filename + "\n");
            }
        }

        public void setFrame(VideoFrame videoFrame) throws Exception
        {
            if (writer.isOpened())
            {
                Mat frame = ImageUtil.imageToMat(videoFrame.getFrame());

                writer.write(frame);

                frame.release();
            }
        }
    }

    I hoped the VideoWriter would offer some options to do the job, but it seems that is not the case.

    So is there an option or flag that I am missing for lossless h264 video writing under OpenCV and Java, or maybe there is another way to do this?
    Please help me; if you have done this already, I would really appreciate some example code to get things done.
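
    A possible partial workaround, sketched under the assumption that the backend honors it: OpenCV's VideoWriter exposes a generic VIDEOWRITER_PROP_QUALITY property, but only some backends (e.g. MJPEG) respect it, so it may well be ignored for h264 written through ffmpeg:

    import org.opencv.core.Size;
    import org.opencv.videoio.VideoWriter;
    import org.opencv.videoio.Videoio;

    public class QualityProbe
    {
        public static void main(String[] args)
        {
            // 33 is the same H264 fourcc constant used in the class above.
            VideoWriter writer = new VideoWriter();
            writer.open("out.mp4", 33, 25.0, new Size(1920, 1080));

            // Request maximum quality; returns false if the backend ignores the property.
            boolean honored = writer.set(Videoio.VIDEOWRITER_PROP_QUALITY, 100.0);
            System.out.println("quality property honored: " + honored);
        }
    }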

    UPDATE

    I now have a solution that fits my application, so here it is:

    String fps = Double.toString(this.config.getInputConfig().getFramesPerSecond());

    Runtime.getRuntime().exec(
           new String[] {
           "C:\\ffmpeg-3.4.2-win64-static\\bin\\ffmpeg.exe",
           "-framerate",
           fps,
           "-i",
           imageOutputPath + File.separator +  "%01d.jpg",
           "-c:v",
           "libx265",
           "-crf",
           "1",
           imageOutputPath + File.separator +  "ffmpeg.mp4"
           }
       );

    Credits to @Gyan, who gave me the correct ffmpeg call in this post:

    Win/ffmpeg - How to generate a video from images under ffmpeg?
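
    A note on the flags, with a sketched variant: with libx265, -crf 1 is near-lossless rather than strictly lossless (x265 needs -x265-params lossless=1 for that), while libx264 treats -crf 0 as lossless. Since the goal was lossless h264, the same call with libx264 might look like this (same assumed paths and fps variable as above):

    Runtime.getRuntime().exec(
        new String[] {
            "C:\\ffmpeg-3.4.2-win64-static\\bin\\ffmpeg.exe",
            "-framerate",
            fps,
            "-i",
            imageOutputPath + File.separator + "%01d.jpg",
            "-c:v",
            "libx264",
            "-crf",
            "0",   // crf 0 is lossless for libx264
            imageOutputPath + File.separator + "ffmpeg_lossless.mp4"
        }
    );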

    Greets