
Media (1)
-
Richard Stallman and free software
19 October 2011
Updated: May 2013
Language: French
Type: Text
Other articles (79)
-
MediaSPIP Core: Configuration
9 November 2010
MediaSPIP Core provides three different configuration pages by default (these pages rely on the CFG configuration plugin to work): a page specific to the general configuration of the template; a page specific to the configuration of the site's home page; and a page specific to the configuration of the sections.
It also provides an additional page, which only appears when certain plugins are activated, for controlling their display and specific features (...)
-
Emballe médias: what is it for?
4 February 2011
This plugin is designed to manage sites for publishing documents of all types.
It creates "media", meaning: a "media" is an article in the SPIP sense, created automatically when a document is uploaded, whether audio, video, image or text; only a single document can be linked to a given "media" article;
-
Use, discuss, criticize
13 April 2011
Talk to people directly involved in MediaSPIP's development, or to people around you who could use MediaSPIP to share, enhance or develop their creative projects.
The bigger the community, the more MediaSPIP’s potential will be explored and the faster the software will evolve.
A discussion list is available for all exchanges between users.
On other sites (5691)
-
ffmpeg splits video into segments of different sizes
22 May 2018, by Rafael Lima
I'm trying to split a video into segments of the same length using ffmpeg:
ffmpeg -y -i teste.mp4 -map 0 -segment_time 10 -f segment -c:v libx264 -preset veryfast -crf 24 -tune film -vf "drawtext=text='teste':start_number=0:x=0:y=0" -an teste\out%d.mp4
The input file is 3 minutes long.
In the output folder I get several files, each with a duration between 8 and 13 seconds.
During the process I get many messages like:
Past duration 0.994987 too large
Past duration 0.998329 too large
Past duration 0.995995 too large
Past duration 0.997993 too large
Past duration 0.998665 too large
Past duration 0.999321 too large
Also, the output files do not show the "teste" drawtext overlay...
Does anyone know what the problem is?
How can I really force ffmpeg to split into 10-second segments? I can accept 8- or 9-second segments as an "error margin", but getting 13-second segments when asking for 10 seems really wrong to me.
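A hedged note rather than a definitive answer: the segment muxer can only cut on keyframes, so each cut drifts to the next keyframe after the 10-second mark, which would explain the 8-13 second spread; forcing a keyframe at every multiple of the segment time makes the cuts land where requested. As for the missing text, drawtext's default fontcolor is black, so the overlay may simply be invisible at x=0:y=0 against dark footage. A minimal sketch combining both guesses, using the same input and output names as in the question:
ffmpeg -y -i teste.mp4 -map 0 -c:v libx264 -preset veryfast -crf 24 -tune film -force_key_frames "expr:gte(t,n_forced*10)" -vf "drawtext=text='teste':x=0:y=0:fontcolor=white" -f segment -segment_time 10 -reset_timestamps 1 -an teste\out%d.mp4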
-
Real Time Audio and Video Streaming in C#
16 November 2014, by Nuwan
I am developing an application that streams audio and video in real time.
I can stream in two different ways: I use a capture card to capture a live HD stream and resend it, and I also need to stream a local video file in real time.
Currently I capture video using OpenCV and store the frames as bitmaps in a BlockingCollection bitmap queue. After that I encode the video frames with ffmpeg (via the C# library NReco) and store them in a queue. Then I send the encoded data over UDP (I did not use RTP/RTSP) to omxplayer on a Raspberry Pi, and it works very well.
Then I captured audio data using ffmpeg.
I used this code to capture and encode the audio data:
data = ms.ToArray();
// Drop the video track (-vn) and encode the audio to AAC, writing the result into the MemoryStream 'ms'.
ffMpegTask = ffmpegConverter.ConvertLiveMedia(
    fileName,
    null,
    ms,
    Format.avi,
    new ConvertSettings()
    {
        CustomOutputArgs = " -tune zerolatency -ss " + second + " -t " + endTime + " -strict experimental -acodec aac -ab 160k -ac 2 -ar 44100 -vn ",
    });
ffMpegTask.Start();
ffMpegTask.Stop();
byte[] data = ms.ToArray();
After that I saved every audio data packet to a queue.
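For reference, and as an assumption about the underlying ffmpeg invocation rather than something stated in the post, the ConvertLiveMedia call above corresponds roughly to a command along these lines, where <fileName>, <second> and <endTime> are placeholders and pipe:1 stands in for the MemoryStream output:
ffmpeg -i <fileName> -tune zerolatency -ss <second> -t <endTime> -strict experimental -acodec aac -ab 160k -ac 2 -ar 44100 -vn -f avi pipe:1
In other words, the video track is dropped and only an AAC audio track is produced, which is what gets queued as "audio data packets".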
I then tried to stream these separate audio and video streams to omxplayer over two different ports and received them with two omxplayers, and that works fine. But what I need to do is multiplex this audio and video and send it as one stream.
What I do first is stream the two streams as UDP://224.1.1.1:1250 (video) and UDP://224.1.1.1:1260 (audio); then I use the NReco Invoke method, which can be used to execute ffmpeg commands:
" -re -i udp://224.1.1.1:1250 -i udp://224.1.1.1:1260 -c copy -f avi udp://224.1.1.1:1270"
This works for both the audio and the video stream, but they are completely out of sync.
The next thing I did was create another ffmpeg ConvertLiveMedia task and write the audio and video data to that task using its Write method. I stream that muxed data and receive it with ffplay. It plays the stream and the sync problem is solved, but sometimes audio and video frames are dropped and then it starts to
play out of sync.
combine = new MemoryStream();
// Mux the video and audio packets written via Write() below into an AVI stream written to 'combine'.
ffMpegTaskcom = ffmpegConvertercom.ConvertLiveMedia(
    Format.mpeg,
    combine,
    Format.avi,
    new ConvertSettings()
    {
        CustomInputArgs = " ", // windows bitmap pixel format
        CustomOutputArgs = " -threads 7 -c:v libx264 -preset ultrafast -tune zerolatency -strict experimental -profile:v baseline -movflags +faststart -tune film -level 3.0 -tune zerolatency -tune film -pix_fmt yuv420p -g 250 -crf 22 -b:v 4000k -minrate 3000k -maxrate 5000k -acodec aac -ab 160k -ac 2 -ar 44100",
    });
ffMpegTaskcom.Start();
byte[] streamBytesvi = null;
byte[] streamBytesau = null;
// Pull one encoded video packet and one encoded audio packet from the queues and push them into the muxing task.
encodeQueqe.TryDequeue(out streamBytesvi);
encodeQueqeau.TryDequeue(out streamBytesau);
ffMpegTaskcom.Write(streamBytesvi, 0, streamBytesvi.Length);
ffMpegTaskcom.Write(streamBytesau, 0, streamBytesau.Length);
//ffMpegTaskcom.Wait();
ffMpegTaskcom.Stop();
Now I need to know a good way to deliver the audio and video data in sync.
Please tell me what I have done wrong, or suggest a better way to do this. Thank you!
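A hedged suggestion rather than a definitive fix: AVI is a poor container for live muxing over UDP, whereas MPEG-TS is designed for streaming and carries the timestamps a player needs to keep audio and video aligned. A minimal sketch that remuxes the two multicast inputs above into a single MPEG-TS output, assuming both inputs carry usable timestamps:
ffmpeg -i udp://224.1.1.1:1250 -i udp://224.1.1.1:1260 -map 0:v -map 1:a -c copy -f mpegts udp://224.1.1.1:1270
ffplay udp://224.1.1.1:1270 (or omxplayer on the Raspberry Pi) should then receive one stream with both tracks muxed against a common clock.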
-
avcodec/cbs_av1: fix decoder/encoder_buffer_delay variable types
4 November 2018, by James Almer
buffer_delay_length_minus_1 is five bits long (maximum value 31), meaning decoder_buffer_delay and encoder_buffer_delay can have values up to 31 + 1 = 32 bits long.
Reviewed-by: Mark Thompson <sw@jkqxz.net>
Signed-off-by: James Almer <jamrial@gmail.com>