
Other articles (50)

  • Multilang: improving the interface for multilingual blocks

    18 February 2011, by

    Multilang is an additional plugin that is not enabled by default when MediaSPIP is initialised.
    Once it is activated, MediaSPIP init automatically applies a preconfiguration so that the new feature is immediately operational. A configuration step is therefore not required.

  • The farm's regular Cron tasks

    1 December 2010, by

    Managing the farm relies on running several repetitive tasks, known as Cron tasks, at regular intervals.
    The super Cron (gestion_mutu_super_cron)
    This task, scheduled every minute, simply calls the Cron of every instance in the shared-hosting farm on a regular basis. Combined with a system Cron on the central site of the farm, this generates regular visits to the various sites and prevents the tasks of rarely visited sites from being too (...)

  • Supported formats

    28 January 2010, by

    The following commands give information about the formats and codecs supported by the local installation of ffmpeg:
    ffmpeg -codecs
    ffmpeg -formats
    Video formats accepted as input
    This list is not exhaustive; it highlights the main formats in use: h264: H.264 / AVC / MPEG-4 AVC / MPEG-4 part 10; m4v: raw MPEG-4 video format; flv: Flash Video (FLV) / Sorenson Spark / Sorenson H.263; Theora; wmv:
    Possible output video formats
    As a first step, we (...)
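    In a shell, the two commands above can be filtered with grep to check support for one specific codec or container; a minimal sketch, assuming an ffmpeg binary on the PATH:

    ```shell
    # List codecs and container formats supported by the local ffmpeg build,
    # keeping only the lines that mention H.264 and MP4 respectively.
    ffmpeg -hide_banner -codecs | grep -i h264
    ffmpeg -hide_banner -formats | grep -i mp4
    ```

    The decoder/encoder capability flags appear in the left-hand column of the `-codecs` output.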

On other sites (6227)

  • How to extract frames from AVPackets transmitted over a TCP stream using FFmpeg.AutoGen?

    28 July 2020, by Ahmad

    I'm using FFmpeg.AutoGen to encode screenshots (on the client side), send them, and decode them (on the server side) in real time. However, I couldn't initialize the AVFormatContext properly so that I can decode and retrieve the frames from the network stream.

    


    I send the encoded packets from the client to the server through a NetworkStream. At the server, I receive a byte[] payload, which I pass to the retrieve_avPacket function as follows:

    


    // Retrieve AV packet from the received data
    public static unsafe AVPacket retrieve_avPacket(byte[] payload)
    {
        var avPacket = new AVPacket();
        ffmpeg.av_init_packet(&avPacket);

        fixed (byte* pData = payload)
        {
            avPacket.data = pData;
            avPacket.size = payload.Length;
        }

        return avPacket;
    }


    


    I need to retrieve the frame from the avPacket. Therefore, I changed TryDecodeNextFrame as follows:

    


    public bool TryDecodeNextFrame(AVPacket* _pPacket, out AVFrame frame)
    {
        ffmpeg.av_frame_unref(_pFrame);
        ffmpeg.av_frame_unref(_receivedFrame);
        int error;
        do
        {
            try
            {
                ffmpeg.avcodec_send_packet(_pCodecContext, _pPacket).ThrowExceptionIfError();
            }
            finally
            {
                ffmpeg.av_packet_unref(_pPacket);
            }

            error = ffmpeg.avcodec_receive_frame(_pCodecContext, _pFrame);
            if (error == ffmpeg.AVERROR_EOF)
            {
                frame = *_pFrame;
                return false;
            }

        } while (error == ffmpeg.AVERROR(ffmpeg.EAGAIN));
        error.ThrowExceptionIfError();
        frame = *_pFrame;

        return true;
    }


    


    For some reason, I receive this exception: System.ApplicationException: 'End of file'.

    


    I saved the first AVPacket to a file, out.h264, to use as the url for VideoStreamDecoder. Afterwards, I decode the AVPackets as I receive them from the network stream as follows:

    


    private static unsafe void DecodeStreamToImages(string url, AVPacket avPacket, NetworkStream stream)
    {
        using (var vsd = new VideoStreamDecoder(url, HWDevice))
        {
            var sourceSize = vsd.FrameSize;
            var sourcePixelFormat = vsd.PixelFormat;
            var destinationSize = sourceSize;
            var destinationPixelFormat = AVPixelFormat.AV_PIX_FMT_RGB24;

            using (var vfc = new VideoFrameConverter(sourceSize, sourcePixelFormat, destinationSize, destinationPixelFormat))
            {
                var frameNumber = 0;
                byte[] payload;
                while (frameNumber < 200)
                {
                    while (vsd.TryDecodeNextFrame(&avPacket, out var frame))
                    {
                        var convertedFrame = vfc.Convert(frame);

                        using (var bitmap = new Bitmap(convertedFrame.width, convertedFrame.height, convertedFrame.linesize[0], PixelFormat.Format24bppRgb, (IntPtr)convertedFrame.data[0]))
                            bitmap.Save($"frame.{frameNumber:D8}.jpg", ImageFormat.Jpeg);

                        frameNumber++;
                    }
                    // Receive & retrieve the next AV packet
                    payload = ReceiveImageDataTCP(stream);
                    avPacket = retrieve_avPacket(payload);
                }
            }
        }
    }


    


    In the main method, the transmission and decoding logic is as follows:

    


    NetworkStream stream = client.GetStream();

    AVPacket avPacket;
    using (var fs = File.Open("out.h264", FileMode.Create))
    {
        byte[] payload = ReceiveImageDataTCP(stream);

        // Retrieve the AV packet and save it as an H264 encoded packet ==> out.h264
        avPacket = retrieve_avPacket(payload);
        using (var packetStream = new UnmanagedMemoryStream((&avPacket)->data, (&avPacket)->size)) packetStream.CopyTo(fs);
    }

    // Decode frames to images; the first packet was saved to out.h264
    DecodeStreamToImages("out.h264", avPacket, stream);


    


    I'm trying to decode the AVPackets as I receive them from the network stream, not from a url. Specifically, the error is raised in the TryDecodeNextFrame function; for some reason _pCodecContext is still tied to the url, although I'm passing the AVPacket explicitly without reading it from the input url.

    What is wrong, and how can I solve it?
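    One way to narrow a problem like this down is to check whether the bytes dumped to out.h264 really form a decodable raw H.264 (Annex B) elementary stream; a minimal sketch using the ffmpeg command-line tools (an assumption on my part — the question itself only uses FFmpeg.AutoGen), run next to the saved file:

    ```shell
    # Probe the dumped packets as a raw H.264 elementary stream; if this fails,
    # the payload bytes are not valid Annex B data and the decode error comes
    # from the packets themselves, not from the codec context.
    ffprobe -hide_banner -f h264 out.h264

    # As a further check, try to decode the first frame to an image.
    ffmpeg -hide_banner -y -f h264 -i out.h264 -frames:v 1 check.jpg
    ```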

    


  • How to extract frames from AVPackets with ffmpeg using FFmpeg.AutoGen?

    28 July 2020, by Ahmad

    I'm using FFmpeg.AutoGen to encode captured frames: https://github.com/Ruslan-B/FFmpeg.AutoGen

    


    I send the encoded packets from the client to the server through a NetworkStream. At the server, I receive a byte[] payload, which I pass to the retrieve_avPacket function as follows:

    


    // Retrieve AV packet from the received data
    public static unsafe AVPacket retrieve_avPacket(byte[] payload)
    {
        var avPacket = new AVPacket();
        ffmpeg.av_init_packet(&avPacket);

        fixed (byte* pData = payload)
        {
            avPacket.data = pData;
            avPacket.size = payload.Length;
        }

        return avPacket;
    }


    


    I aim to retrieve the frame from the avPacket. Therefore, I tried to use this function:

    


    public bool TryDecodeNextFrame(AVPacket* _pPacket, out AVFrame frame)
    {
        ffmpeg.av_frame_unref(_pFrame);
        ffmpeg.av_frame_unref(_receivedFrame);
        int error;
        do
        {
            try
            {
                ffmpeg.avcodec_send_packet(_pCodecContext, _pPacket).ThrowExceptionIfError();
            }
            finally
            {
                ffmpeg.av_packet_unref(_pPacket);
            }

            error = ffmpeg.avcodec_receive_frame(_pCodecContext, _pFrame);
            if (error == ffmpeg.AVERROR_EOF)
            {
                frame = *_pFrame;
                return false;
            }

        } while (error == ffmpeg.AVERROR(ffmpeg.EAGAIN));
        error.ThrowExceptionIfError();
        frame = *_pFrame;

        return true;
    }


    


    For some reason, I receive this exception: System.ApplicationException: 'End of file'.
    I saved the first avPacket to a file, out.h264, to use as the url for VideoStreamDecoder; the decoding process is then as follows:

    


    private static unsafe void DecodeStreamToImages(string url, AVPacket avPacket, NetworkStream stream)
    {
        using (var vsd = new VideoStreamDecoder(url, HWDevice))
        {
            var sourceSize = vsd.FrameSize;
            var sourcePixelFormat = vsd.PixelFormat;
            var destinationSize = sourceSize;
            var destinationPixelFormat = AVPixelFormat.AV_PIX_FMT_RGB24;

            using (var vfc = new VideoFrameConverter(sourceSize, sourcePixelFormat, destinationSize, destinationPixelFormat))
            {
                var frameNumber = 0;
                byte[] payload;
                while (frameNumber < 200)
                {
                    while (vsd.TryDecodeNextFrame(&avPacket, out var frame))
                    {
                        var convertedFrame = vfc.Convert(frame);

                        using (var bitmap = new Bitmap(convertedFrame.width, convertedFrame.height, convertedFrame.linesize[0], PixelFormat.Format24bppRgb, (IntPtr)convertedFrame.data[0]))
                            bitmap.Save($"frame.{frameNumber:D8}.jpg", ImageFormat.Jpeg);

                        frameNumber++;
                    }
                    // Receive & retrieve the next AV packet
                    payload = ReceiveImageDataTCP(stream);
                    avPacket = retrieve_avPacket(payload);
                }
            }
        }
    }


    


    In the main method, the transmission and decoding logic is as follows:

    


    NetworkStream stream = client.GetStream();

    var outputFileName = "out.h264";
    AVPacket avPacket;
    using (var fs = File.Open(outputFileName, FileMode.Create))
    {
        byte[] _serverImageData = ReceiveImageDataTCP(stream);

        // Retrieve the AV packet and save it as an H264 encoded packet ==> out.h264
        avPacket = retrieve_avPacket(_serverImageData);
        using (var packetStream = new UnmanagedMemoryStream((&avPacket)->data, (&avPacket)->size)) packetStream.CopyTo(fs);
    }

    // Decode frames to images
    DecodeStreamToImages(outputFileName, avPacket, stream);


    


    I'm trying to decode the AVPackets as I receive them from the network stream, not from a url. Specifically, the error is raised in the TryDecodeNextFrame function, in which _pCodecContext is still tied to the url although I'm passing the AVPackets explicitly.

    What is wrong, and how can I solve it?

    


  • HEVC/H.265 interlaced format support in ffmpeg or VLC

    30 December 2020, by Ernestas Gruodis

    The "Music Box Russia" channel transmits over satellite in HEVC 1920x1080 25fps interlaced. After recording, VLC recognizes the file as 50 fps with a resolution of 1920x540, half the height. But on the satellite tuner the player works fine: it plays the file as 1920x1080 25fps... When can we expect support for interlaced HEVC/H.265? Here is the recorded file (Garry Grey & Eva Miller - wtf). Also, a lot of lost frames show up in the VLC player statistics...

    


    EDIT :

    


    I found some interesting info here on how interlaced video content can be signalled in HEVC:

    


    


    Unlike H.264/AVC, HEVC has no interlace-dedicated coding:

    


      

    • No mixed frame-field interaction (like PAFF in H.264/AVC)
    • No interlace scanning of transform coefficients
    • No correction of MVX[1] (the y-component of the MV) if the current and reference pictures are of different polarity (top-bottom or bottom-top).

    


    However, in HEVC interlaced video content can be indicated (signalled in the VPS/SPS and in pic_timing SEI messages; the latter are transmitted for every picture in the sequence). Interlace-related settings:

    


      

    • In the VPS/SPS, set general_interlaced_source_flag=1 and general_progressive_source_flag=0. Indeed, the HEVC standard says:

      if general_progressive_source_flag is equal to 0 and general_interlaced_source_flag is equal to 1, the source scan type of the pictures in the CVS should be interpreted as interlaced only.

    • In the VPS/SPS, set general_frame_only_constraint_flag=0.

    • In the SPS VUI, set field_seq_flag=1 and frame_field_info_present_flag=1. Notice that if these flags are ON, then picture timing SEIs shall be present for each picture.

    • Transmit a Picture Timing SEI per picture with the following parameters: source_scan_type=0 to indicate interlaced mode; for a top-field picture, signal pict_struct=1, and for a bottom-field picture, pict_struct=2.


    


    


    Perhaps it is possible to pass these parameters to ffmpeg/VLC before playing the file?
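
    One first step is to inspect what interlace information the recorded file actually carries; a minimal sketch with ffprobe, assuming it is on the PATH and taking recording.ts as a stand-in name for the captured file (the field names reflect recent FFmpeg releases):

    ```shell
    # Stream-level field order as ffmpeg sees it
    # (progressive, or tt/bb/tb/bt for interlaced material).
    ffprobe -hide_banner -select_streams v:0 \
            -show_entries stream=field_order -of default=nw=1 recording.ts

    # Per-frame interlace flags for the first few frames, derived from
    # the picture timing SEI messages discussed above.
    ffprobe -hide_banner -select_streams v:0 -read_intervals "%+#5" \
            -show_entries frame=interlaced_frame,top_field_first -of default=nw=1 recording.ts
    ```

    If the file reports field_order=progressive despite carrying field pictures, the interlace flags above were likely never set by the broadcaster, which would explain players weaving the fields incorrectly.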