
Other articles (38)
-
Updating from version 0.1 to 0.2
24 June 2013 — An explanation of the notable changes when moving from MediaSPIP version 0.1 to version 0.2. What's new?
Software dependencies: the latest versions of FFmpeg (>= v1.2.1) are now used; the dependencies for Smush are installed; MediaInfo and FFprobe are installed for metadata retrieval; ffmpeg2theora is no longer used; flvtool2 is dropped in favour of flvtool++; ffmpeg-php, which is no longer maintained, is no longer installed (...)
-
Personalise by adding your logo, banner, or background image
5 September 2013 — Some themes support three customisation elements: adding a logo; adding a banner; adding a background image.
-
Writing a news item
21 June 2013 — Present changes to your MediaSPIP, or news from your projects, via the news section.
In the default MediaSPIP theme, spipeo, news items are displayed at the bottom of the main page, below the editorials.
You can customise the form used to create a news item.
News item creation form: for a document of type "news item", the default fields are: publication date (customise the publication date) (...)
On other sites (5275)
-
Decode multiple video on GPU (FFmpeg.autogen + C#)
5 December 2024, by Sang Nguyen — I'm trying to use the basic decoding example (FFmpeg.AutoGen.Example) from the FFmpeg.autogen 4.3.0.3 library https://github.com/Ruslan-B/FFmpeg.AutoGen to decode multiple videos on a GPU (AMD Radeon R7 430). My main function is as follows:


private static void Main(string[] args)
 {
 for (int i = 0; i < 11; i++)
 {
 // Declare url inside the loop so each lambda captures its own copy;
 // a single shared variable would race with the loop and threads could
 // decode the wrong file (or the same file several times).
 var url = @"D:\video\abc" + i + ".mp4";
 new Thread(() =>
 {
 DecodeAllFramesToImages(AVHWDeviceType.AV_HWDEVICE_TYPE_D3D11VA, url);
 }).Start();
 }
 }



I try to decode the videos with GPU hardware acceleration. However, an error occurs when more than 10 threads are running. The errors are as follows:


- "System.AccessViolationException: Attempted to read or write protected memory. This is often an indication that other memory is corrupt."
- In the console there are the error messages "Failed to create D3D11VA video decoder" and "Failed setup for format d3d11: hwaccel initialisation returned error".






I'm new to the FFmpeg library, so I don't understand this problem well. I'd be grateful for any help with this error!
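One plausible cause, offered as a guess rather than a confirmed diagnosis: consumer GPUs and their drivers usually allow only a limited number of simultaneous hardware decode sessions, and "Failed to create D3D11VA video decoder" is consistent with running out of them once enough threads start. A minimal sketch of capping concurrency with `SemaphoreSlim` (the limit of 4 is an illustrative guess, not a driver-specified value, and `DecodeAllFramesToImages` is the method from the question):

```csharp
using System.Threading;
using FFmpeg.AutoGen;

public static class LimitedDecoding
{
    // Illustrative cap on concurrent hardware decode sessions; the real
    // limit depends on the GPU and driver.
    private static readonly SemaphoreSlim DecodeSlots = new SemaphoreSlim(4);

    public static void DecodeWithLimit(string url)
    {
        DecodeSlots.Wait();          // block until a decode slot is free
        try
        {
            // Method from the question above.
            Program.DecodeAllFramesToImages(
                AVHWDeviceType.AV_HWDEVICE_TYPE_D3D11VA, url);
        }
        finally
        {
            DecodeSlots.Release();   // free the slot for the next thread
        }
    }
}
```

Each of the 11 threads would call `DecodeWithLimit(url)` instead of `DecodeAllFramesToImages` directly; excess threads simply wait for a slot rather than failing decoder creation.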


private static unsafe void DecodeAllFramesToImages(AVHWDeviceType HWDevice, string url)
 { 
 using (var vsd = new VideoStreamDecoder(url,HWDevice))
 {
 Console.WriteLine($"codec name: {vsd.CodecName}");

 var info = vsd.GetContextInfo();
 info.ToList().ForEach(x => Console.WriteLine($"{x.Key} = {x.Value}"));

 var sourceSize = vsd.FrameSize;
 var sourcePixelFormat = HWDevice == AVHWDeviceType.AV_HWDEVICE_TYPE_NONE ? vsd.PixelFormat : GetHWPixelFormat(HWDevice);
 var destinationSize = sourceSize;
 var destinationPixelFormat = AVPixelFormat.AV_PIX_FMT_YUV420P;
 using (var vfc = new VideoFrameConverter(sourceSize, sourcePixelFormat, destinationSize, destinationPixelFormat))
 {
 var frameNumber = 0;
 while (vsd.TryDecodeNextFrame(out var frame))
 {
 //var convertedFrame = vfc.Convert(frame); 
 // using (var bitmap = new Bitmap(convertedFrame.width, convertedFrame.height, convertedFrame.linesize[0], PixelFormat.Format24bppRgb, (IntPtr) convertedFrame.data[0]))
 // bitmap.Save($"frame.{frameNumber:D8}.jpg", ImageFormat.Jpeg);
 
 Console.WriteLine($"frame: {frameNumber}");
 frameNumber++;
 }
 }
 }
 }



using System;
using System.Collections.Generic;
using System.Drawing;
using System.IO;
using System.Runtime.InteropServices;

namespace FFmpeg.AutoGen.Example
{
 public sealed unsafe class VideoStreamDecoder : IDisposable
 {
 private readonly AVCodecContext* _pCodecContext;
 private readonly AVFormatContext* _pFormatContext;
 private readonly int _streamIndex;
 private readonly AVFrame* _pFrame;
 private readonly AVFrame* _receivedFrame;
 private readonly AVPacket* _pPacket;

 public VideoStreamDecoder(string url, AVHWDeviceType HWDeviceType = AVHWDeviceType.AV_HWDEVICE_TYPE_NONE)
 {
 _pFormatContext = ffmpeg.avformat_alloc_context();
 _receivedFrame = ffmpeg.av_frame_alloc();
 var pFormatContext = _pFormatContext;
 ffmpeg.avformat_open_input(&pFormatContext, url, null, null).ThrowExceptionIfError();
 ffmpeg.avformat_find_stream_info(_pFormatContext, null).ThrowExceptionIfError();
 AVCodec* codec = null;
 _streamIndex = ffmpeg.av_find_best_stream(_pFormatContext, AVMediaType.AVMEDIA_TYPE_VIDEO, -1, -1, &codec, 0).ThrowExceptionIfError();
 _pCodecContext = ffmpeg.avcodec_alloc_context3(codec);
 if (HWDeviceType != AVHWDeviceType.AV_HWDEVICE_TYPE_NONE)
 {
 ffmpeg.av_hwdevice_ctx_create(&_pCodecContext->hw_device_ctx, HWDeviceType, null, null, 0).ThrowExceptionIfError();
 }
 ffmpeg.avcodec_parameters_to_context(_pCodecContext, _pFormatContext->streams[_streamIndex]->codecpar).ThrowExceptionIfError();
 ffmpeg.avcodec_open2(_pCodecContext, codec, null).ThrowExceptionIfError(); 
 CodecName = ffmpeg.avcodec_get_name(codec->id);
 FrameSize = new Size(_pCodecContext->width, _pCodecContext->height);
 PixelFormat = _pCodecContext->pix_fmt;
 _pPacket = ffmpeg.av_packet_alloc();
 _pFrame = ffmpeg.av_frame_alloc();
 }

 public string CodecName { get; }
 public Size FrameSize { get; }
 public AVPixelFormat PixelFormat { get; }

 public void Dispose()
 {
 ffmpeg.av_frame_unref(_pFrame);
 ffmpeg.av_free(_pFrame);

 // _receivedFrame is allocated in the constructor and must be freed too.
 ffmpeg.av_frame_unref(_receivedFrame);
 ffmpeg.av_free(_receivedFrame);

 ffmpeg.av_packet_unref(_pPacket);
 ffmpeg.av_free(_pPacket);

 ffmpeg.avcodec_close(_pCodecContext);
 var pFormatContext = _pFormatContext;
 ffmpeg.avformat_close_input(&pFormatContext);
 }

 public bool TryDecodeNextFrame(out AVFrame frame)
 {
 ffmpeg.av_frame_unref(_pFrame);
 ffmpeg.av_frame_unref(_receivedFrame);
 int error;
 do
 {
 try
 {
 do
 {
 error = ffmpeg.av_read_frame(_pFormatContext, _pPacket);
 if (error == ffmpeg.AVERROR_EOF)
 {
 frame = *_pFrame;
 return false;
 }

 error.ThrowExceptionIfError();
 } while (_pPacket->stream_index != _streamIndex);

 ffmpeg.avcodec_send_packet(_pCodecContext, _pPacket).ThrowExceptionIfError();
 }
 finally
 {
 ffmpeg.av_packet_unref(_pPacket);
 }

 error = ffmpeg.avcodec_receive_frame(_pCodecContext, _pFrame);
 } while (error == ffmpeg.AVERROR(ffmpeg.EAGAIN));
 error.ThrowExceptionIfError();
 if (_pCodecContext->hw_device_ctx != null)
 {
 ffmpeg.av_hwframe_transfer_data(_receivedFrame, _pFrame, 0).ThrowExceptionIfError();
 frame = *_receivedFrame;
 }
 else
 {
 frame = *_pFrame;
 }
 return true;
 }
 public IReadOnlyDictionary<string, string> GetContextInfo()
 {
 AVDictionaryEntry* tag = null;
 var result = new Dictionary<string, string>();
 while ((tag = ffmpeg.av_dict_get(_pFormatContext->metadata, "", tag, ffmpeg.AV_DICT_IGNORE_SUFFIX)) != null)
 {
 var key = Marshal.PtrToStringAnsi((IntPtr) tag->key);
 var value = Marshal.PtrToStringAnsi((IntPtr) tag->value);
 result.Add(key, value);
 }
 return result;
 }
 }
}



-
FFmpeg Autogen and Unity C# to generate video from screenshots (FFmpeg.Autogen)
1 June 2022, by cameron gibbs — I've taken the FFmpegHelper, VideoFrameConverter, and H264VideoStreamEncoder classes straight from FFmpeg.AutoGen.Example, rolled my own FFmpegBinariesHelper class and Size struct, and mangled the EncodeImagesToH264 from Program.cs into the code below. I capture a bunch of frames into textures and feed them into Encoder.EncodeImagesToH264. It produces a file I'm calling outputFileName.h264 just fine, no errors. I've changed H264VideoStreamEncoder a little based on ffmpeg's own C++ examples, because they had a few things the C# example seemed to be missing, but that hasn't made any difference.

The video is weird:


- it only plays in VLC; is there another AVPixelFormat I should be using for the destinationPixelFormat so that anything can play it?
- VLC is unable to detect the video length or show the current time
- it plays back oddly, as if the first few seconds are all the same frame, then starts playing what appear to be some of the frames I'd expect
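On the first two points, a likely explanation (offered as a suggestion, not a confirmed diagnosis): a bare .h264 file is a raw elementary stream with no container, so there is no duration or seek index for players to read, and many players refuse raw streams outright. Wrapping the stream in a container without re-encoding usually fixes both; assuming the ffmpeg command-line tool is available:

```shell
# Wrap the raw H.264 elementary stream in an MP4 container without re-encoding.
# A raw stream carries no timestamps, so -framerate supplies the timing.
ffmpeg -framerate 30 -i outputFileName.h264 -c copy outputFileName.mp4
```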








public static class Encoder
 {
 public static unsafe void EncodeImagesToH264(Texture2D[] images, int fps, string outputFileName)
 {
 FFmpegBinariesHelper.RegisterFFmpegBinaries();

 var firstFrameImage = images[0];
 outputFileName = Path.ChangeExtension(outputFileName, ".h264");
 var sourceSize = new Size(firstFrameImage.width, firstFrameImage.height);
 var sourcePixelFormat = AVPixelFormat.AV_PIX_FMT_RGB24;
 var destinationSize = sourceSize;
 var destinationPixelFormat = AVPixelFormat.AV_PIX_FMT_YUV420P;

 try
 {
 using (var vfc = new VideoFrameConverter(
 sourceSize,
 sourcePixelFormat,
 destinationSize,
 destinationPixelFormat))
 {
 using var fs = File.Open(outputFileName, FileMode.Create);
 using var vse = new H264VideoStreamEncoder(fs, fps, destinationSize);
 var frameNumber = 0;
 foreach (var frameFile in images)
 {
 var bitmapData = GetBitmapData(frameFile);

 //var pBitmapData = (byte*)NativeArrayUnsafeUtility
 // .GetUnsafeBufferPointerWithoutChecks(bitmapData);

 fixed (byte* pBitmapData = bitmapData)
 {
 var data = new byte_ptrArray8 { [0] = pBitmapData };
 var linesize = new int_array8 { [0] = bitmapData.Length / sourceSize.Height };
 var frame = new AVFrame
 {
 data = data,
 linesize = linesize,
 width = sourceSize.Width,
 height = sourceSize.Height,
 format = (int)sourcePixelFormat
 };

 var convertedFrame = vfc.Convert(frame);
 convertedFrame.pts = frameNumber;

 vse.Encode(convertedFrame);

 Debug.Log($"frame: {frameNumber}");
 frameNumber++;
 }
 }
 byte[] endcode = { 0, 0, 1, 0xb7 };
 fs.Write(endcode, 0, endcode.Length);
 }
 Debug.Log(outputFileName);
 }
 catch (Exception ex)
 {
 Debug.LogException(ex);
 }
 }

 private static byte[] GetBitmapData(Texture2D frameBitmap)
 {
 return frameBitmap.GetRawTextureData();
 }
 }

 public sealed unsafe class H264VideoStreamEncoder : IDisposable
 {
 private readonly Size _frameSize;
 private readonly int _linesizeU;
 private readonly int _linesizeV;
 private readonly int _linesizeY;
 private readonly AVCodec* _pCodec;
 private readonly AVCodecContext* _pCodecContext;
 private readonly Stream _stream;
 private readonly int _uSize;
 private readonly int _ySize;

 public H264VideoStreamEncoder(Stream stream, int fps, Size frameSize)
 {
 _stream = stream;
 _frameSize = frameSize;

 var codecId = AVCodecID.AV_CODEC_ID_H264;
 _pCodec = ffmpeg.avcodec_find_encoder(codecId);
 if (_pCodec == null)
 throw new InvalidOperationException("Codec not found.");

 _pCodecContext = ffmpeg.avcodec_alloc_context3(_pCodec);
 _pCodecContext->bit_rate = 400000;
 _pCodecContext->width = frameSize.Width;
 _pCodecContext->height = frameSize.Height;
 _pCodecContext->time_base = new AVRational { num = 1, den = fps };
 _pCodecContext->gop_size = 10;
 _pCodecContext->max_b_frames = 1;
 _pCodecContext->pix_fmt = AVPixelFormat.AV_PIX_FMT_YUV420P;

 if (codecId == AVCodecID.AV_CODEC_ID_H264)
 ffmpeg.av_opt_set(_pCodecContext->priv_data, "preset", "veryslow", 0);

 ffmpeg.avcodec_open2(_pCodecContext, _pCodec, null).ThrowExceptionIfError();

 _linesizeY = frameSize.Width;
 _linesizeU = frameSize.Width / 2;
 _linesizeV = frameSize.Width / 2;

 _ySize = _linesizeY * frameSize.Height;
 _uSize = _linesizeU * frameSize.Height / 2;
 }

 public void Dispose()
 {
 ffmpeg.avcodec_close(_pCodecContext);
 ffmpeg.av_free(_pCodecContext);
 }

 public void Encode(AVFrame frame)
 {
 if (frame.format != (int)_pCodecContext->pix_fmt)
 throw new ArgumentException("Invalid pixel format.", nameof(frame));
 if (frame.width != _frameSize.Width)
 throw new ArgumentException("Invalid width.", nameof(frame));
 if (frame.height != _frameSize.Height)
 throw new ArgumentException("Invalid height.", nameof(frame));
 if (frame.linesize[0] < _linesizeY)
 throw new ArgumentException("Invalid Y linesize.", nameof(frame));
 if (frame.linesize[1] < _linesizeU)
 throw new ArgumentException("Invalid U linesize.", nameof(frame));
 if (frame.linesize[2] < _linesizeV)
 throw new ArgumentException("Invalid V linesize.", nameof(frame));
 if (frame.data[1] - frame.data[0] < _ySize)
 throw new ArgumentException("Invalid Y data size.", nameof(frame));
 if (frame.data[2] - frame.data[1] < _uSize)
 throw new ArgumentException("Invalid U data size.", nameof(frame));

 var pPacket = ffmpeg.av_packet_alloc();
 try
 {
 // Send the frame exactly once; resending it on EAGAIN would encode
 // the same frame repeatedly, producing duplicated frames on playback.
 ffmpeg.avcodec_send_frame(_pCodecContext, &frame).ThrowExceptionIfError();

 // Drain every packet the encoder has ready. With B-frames enabled it
 // may legitimately return EAGAIN until more input arrives; the
 // buffered packets are delivered on later calls.
 while (true)
 {
 ffmpeg.av_packet_unref(pPacket);
 var error = ffmpeg.avcodec_receive_packet(_pCodecContext, pPacket);
 if (error == ffmpeg.AVERROR(ffmpeg.EAGAIN))
 break;
 error.ThrowExceptionIfError();

 using var packetStream = new UnmanagedMemoryStream(pPacket->data, pPacket->size);
 packetStream.CopyTo(_stream);
 }
 }
 finally
 {
 ffmpeg.av_packet_free(&pPacket);
 }
 }
 }
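A related gap worth noting: with gop_size = 10 and B-frames enabled, the encoder buffers frames internally, and anything still buffered when the stream is closed is lost, which can truncate the end of the video and distort short clips. A hypothetical Drain method (the name and placement are my suggestion, not part of the original example), called once after the last Encode and before writing the endcode, could flush it; sending a null frame is FFmpeg's standard way of entering draining mode:

```csharp
// Hypothetical flush helper for H264VideoStreamEncoder: a null frame puts
// the encoder into draining mode; receive packets until AVERROR_EOF.
public void Drain()
{
    ffmpeg.avcodec_send_frame(_pCodecContext, null).ThrowExceptionIfError();

    var pPacket = ffmpeg.av_packet_alloc();
    try
    {
        while (true)
        {
            ffmpeg.av_packet_unref(pPacket);
            var error = ffmpeg.avcodec_receive_packet(_pCodecContext, pPacket);
            if (error == ffmpeg.AVERROR_EOF)
                break; // all buffered packets delivered
            error.ThrowExceptionIfError();

            using var packetStream = new UnmanagedMemoryStream(pPacket->data, pPacket->size);
            packetStream.CopyTo(_stream);
        }
    }
    finally
    {
        ffmpeg.av_packet_free(&pPacket);
    }
}
```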



-
C# WinForms, play video using the FFmpeg library
25 September 2020, by Damn Vegetables — According to web search results, FFmpeg.AutoGen provides video/audio frames with timing data, and I could use them to render the way I want. But I cannot find a simple example of playing a video file in a WinForms window using FFmpeg.


In the example file, I can see code for initialisation and decoding video frames. How do I play those frames smoothly in WinForms? Should I use some sort of DirectX library?
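Not an authoritative answer, but a minimal pacing sketch may help: decode on a background thread, convert each frame to a GDI+ Bitmap (AV_PIX_FMT_BGR24 matches Format24bppRgb's byte order), sleep until the frame's presentation time, and marshal the image onto the UI thread with Invoke. It assumes the VideoStreamDecoder and VideoFrameConverter classes from the examples above, a Form with a PictureBox named pictureBox1, and a known constant frame rate; DirectX is not required for a simple player, though it helps for low-latency rendering.

```csharp
using System;
using System.Diagnostics;
using System.Drawing;
using System.Drawing.Imaging;
using System.Threading;
using FFmpeg.AutoGen;

// Inside the Form class:
private unsafe void PlayVideo(string url, double fps)
{
    new Thread(() =>
    {
        using var vsd = new VideoStreamDecoder(url);
        using var vfc = new VideoFrameConverter(
            vsd.FrameSize, vsd.PixelFormat,
            vsd.FrameSize, AVPixelFormat.AV_PIX_FMT_BGR24);

        var clock = Stopwatch.StartNew();
        var frameNumber = 0;
        while (vsd.TryDecodeNextFrame(out var frame))
        {
            var converted = vfc.Convert(frame);

            // Pace presentation against a wall clock; decoding is usually
            // faster than real time, so the sleep supplies the timing.
            var due = TimeSpan.FromSeconds(frameNumber / fps);
            var wait = due - clock.Elapsed;
            if (wait > TimeSpan.Zero) Thread.Sleep(wait);

            // The converter reuses its buffer, so copy the pixels before
            // the next Convert call; Invoke is synchronous, making this safe.
            using var wrapped = new Bitmap(converted.width, converted.height,
                converted.linesize[0], PixelFormat.Format24bppRgb,
                (IntPtr)converted.data[0]);
            var copy = (Bitmap)wrapped.Clone();

            pictureBox1.Invoke((Action)(() =>
            {
                pictureBox1.Image?.Dispose();
                pictureBox1.Image = copy;
            }));
            frameNumber++;
        }
    }) { IsBackground = true }.Start();
}
```

Sleeping against a Stopwatch rather than a fixed per-frame delay prevents drift, because decode and UI time are absorbed into the wait calculation.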