
Media (1)
-
Ogg detection bug
22 March 2013, by
Updated: April 2013
Language: French
Type: Video
Other articles (42)
-
Publishing on MediaSPIP
13 June 2013
Can I post content from an iPad tablet?
Yes, if your MediaSPIP installation is at version 0.2 or higher. If needed, contact your MediaSPIP administrator to find out. -
HTML5 audio and video support
13 April 2011, by
MediaSPIP uses HTML5 video and audio tags to play multimedia files, taking advantage of the latest W3C innovations supported by modern browsers.
The MediaSPIP player used has been created specifically for MediaSPIP and can be easily adapted to fit in with a specific theme.
For older browsers the Flowplayer flash fallback is used.
MediaSPIP allows for media playback on major mobile platforms with the above (...)
On other sites (6995)
-
FFmpeg uses too much memory when repeating split, select, overlay
13 November 2020, by finefoot
I'm running


ffmpeg -i input.mp4 -filter_complex_script script.txt output.mp4



with the following minimal example script:


split[tmp],
select='between(t,1,2)',
select='between(n,0,1)',
[tmp]overlay=enable='between(t,1,2)':eof_action=repeat,
split[tmp],
select='between(t,3,4)',
select='between(n,0,1)',
[tmp]overlay=enable='between(t,3,4)':eof_action=repeat



What I want to do is to take 1 frame at a certain position and repeat it for a certain duration, basically "pausing" the video, while overlaying it so the output keeps the same length. In the example, I'm doing this twice: I use split[tmp] to get a second input stream to work on, select the time at position 00:01 with select='between(t,1,2)', select the first frame from that position with select='between(n,0,1)', and finally overlay that frame over the input. This repeats a second time at position 00:03. I have tested this and it does exactly what I'm looking for.

However, in my real script, I'm repeating this about 1000 times for different positions in the stream (and for shorter durations than 1 second), which results in running out of memory. What am I doing wrong? What can I do to optimize?
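For reference, a filter script with many such freeze segments can be generated programmatically rather than written by hand. The sketch below is illustrative (the `make_freeze_filter` helper is hypothetical, not from the original post); it reproduces the same repeating split/select/overlay pattern for a list of freeze positions:

```python
# Sketch: generate the repeating split/select/overlay chains for a list
# of freeze positions. Helper name and structure are illustrative.

def make_freeze_filter(positions, hold=1):
    """Build a filter_complex script that freezes one frame at each
    position for `hold` seconds, mirroring the pattern in the question."""
    chains = []
    for t in positions:
        chains.append(
            "split[tmp],\n"
            f"select='between(t,{t},{t + hold})',\n"
            "select='between(n,0,1)',\n"
            f"[tmp]overlay=enable='between(t,{t},{t + hold})':eof_action=repeat"
        )
    return ",\n".join(chains)

# For positions [1, 3] this reproduces the minimal example script above.
script = make_freeze_filter([1, 3])
print(script)
```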


-
checkasm/hevc_pel : Fix stack buffer overreads
28 September 2021, by Andreas Rheinhardt
checkasm/hevc_pel: Fix stack buffer overreads
This patch increases several stack buffers in order to fix
stack-buffer-overflows (e.g. in put_hevc_qpel_uni_hv_9 in
line 814 of hevcdsp_template.c) detected with ASAN in the hevc_pel
checkasm test.
The buffers are increased by the minimal amount necessary
in order not to mask potential future bugs.
Reviewed-by: Martin Storsjö <martin@martin.st>
Reviewed-by: "zhilizhao(赵志立)" <quinkblack@foxmail.com>
Signed-off-by: Andreas Rheinhardt <andreas.rheinhardt@outlook.com> -
Create a mkv file with colored background and containing a given audio and subtitle stream
25 May 2023, by rdrg109
Table of contents


- The context
- Minimal working example
- What I've tried
  - Create a mkv file with colored background and an audio stream
  - Create a mkv file with colored background, an audio stream and a subtitles stream
- The question
The context


I have a *.flac file and a *.srt file. I want to merge those files into an MKV file, but at the same time, I want to add a video stream. I want the video stream to show a green background the entire time.



Minimal working example


For our experimentation, let's create two sample files: one *.flac file and one *.srt file.

The following command creates a *.flac file that lasts 60 seconds and contains a sine wave.

$ ffmpeg -y -f lavfi -i "sine=f=1000:d=60" input.flac



The following command creates a *.srt file. Note that our last subtitle lasts until the sixth second; this is intended.

$ cat << EOF > input.srt
1
00:00:00,000 --> 00:00:03,000
This is the first subtitle in a
SRT file.

2
00:00:03,000 --> 00:00:06,000
This is the second subtitle in a
SRT file.
EOF





What I've tried




Create a mkv file with colored background and an audio stream


I know how to create a MKV file containing a given audio stream and a colored background as the video stream.


The following command creates a MKV file containing input.flac as the audio stream and a green background as the video stream. The MKV file has the same duration as input.flac.

$ ffmpeg \
 -y \
 -f lavfi \
 -i color=c=green:s=2x2 \
 -i input.flac \
 -c:v libx264 \
 -c:a copy \
 -shortest \
 output.mkv



The following command shows the duration of the streams in the resulting file.


$ ffprobe -v error -print_format json -show_entries stream=codec_type:stream_tags=duration output.mkv | jq -r ''



{
 "programs": [],
 "streams": [
 {
 "codec_type": "video",
 "tags": {
 "DURATION": "00:00:58.200000000"
 }
 },
 {
 "codec_type": "audio",
 "tags": {
 "DURATION": "00:01:00.000000000"
 }
 }
 ]
}





Create a mkv file with colored background, an audio stream and a subtitles stream


To add a subtitles stream, I just need to specify the *.srt file. However, when I do this, the duration of the video is set to the time of the last subtitle in the *.srt file. This is expected because I have used -shortest. I would get the result I'm looking for if it were possible to specify the stream that -shortest gives top priority to. I haven't found this information on the Internet.

$ ffmpeg \
 -y \
 -f lavfi \
 -i color=c=green:s=2x2 \
 -i input.flac \
 -i input.srt \
 -c:v libx264 \
 -c:a copy \
 -shortest \
 output.mkv



The following command shows the duration of the streams in the resulting file. Note that the maximum duration of the resulting file is 6 seconds, while in the file from the previous section it was 1 minute.


$ ffprobe -v error -print_format json -show_entries stream=codec_type:stream_tags=duration output.mkv | jq -r ''



{
 "programs": [],
 "streams": [
 {
 "codec_type": "video",
 "tags": {
 "DURATION": "00:00:01.160000000"
 }
 },
 {
 "codec_type": "audio",
 "tags": {
 "DURATION": "00:00:03.134000000"
 }
 },
 {
 "codec_type": "subtitle",
 "tags": {
 "DURATION": "00:00:06.000000000"
 }
 }
 ]
}





The question


Given a *.flac file and a *.srt file, how can I merge them into a *.mkv file so that it has the *.flac file as the audio stream, the *.srt file as the subtitles stream, and a green background as the video stream?
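One possible direction (an assumption, not from the original post): instead of relying on -shortest, probe the audio duration with ffprobe and cap the output with -t, so the output length follows the audio rather than the shortest stream. The sketch below only assembles the command; `build_mux_command` is a hypothetical helper, and the duration would normally come from `ffprobe -v error -show_entries format=duration`:

```python
# Sketch (assumption, not the author's solution): cap the output with -t
# at the audio duration instead of using -shortest, so the green video
# and the subtitle stream both run for the full length of the audio.

def build_mux_command(flac="input.flac", srt="input.srt",
                      out="output.mkv", duration="60.0"):
    """Assemble the ffmpeg command; `duration` is assumed to have been
    obtained beforehand, e.g. via ffprobe on the *.flac file."""
    return [
        "ffmpeg", "-y",
        "-f", "lavfi", "-i", "color=c=green:s=2x2",
        "-i", flac,
        "-i", srt,
        "-c:v", "libx264",
        "-c:a", "copy",
        "-t", duration,  # cap output length at the audio duration
        out,
    ]

cmd = build_mux_command()
print(" ".join(cmd))
```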