
Advanced search
Other articles (31)
-
Adding notes and captions to images
7 February 2011
To add notes and captions to images, the first step is to install the "Légendes" plugin.
Once the plugin is activated, you can configure it in the configuration area to change the rights for creating, modifying and deleting notes. By default, only site administrators can add notes to images.
Changes when adding a media item
When adding a media item of type "image", a new button appears above the preview (...) -
Support de tous types de médias
Support for all media types
10 April 2011
Unlike many modern document-sharing programs and platforms, MediaSPIP aims to handle as many different document formats as possible, whether they are: images (png, gif, jpg, bmp and others...); audio (MP3, Ogg, Wav and others...); video (Avi, MP4, Ogv, mpg, mov, wmv and others...); textual content, code or other formats (OpenOffice, Microsoft Office (spreadsheets, presentations), web (html, css), LaTeX, Google Earth) (...)
-
Farm notifications
1 December 2010
To ensure correct management of the farm, several things need to be notified during specific actions, both to the user and to all of the farm's administrators.
Status-change notifications
When the status of an instance changes, all of the farm's administrators must be notified of the change, as well as the administrator user of that instance.
When a channel is requested
Switching to the "publie" status
Switching to (...)
On other sites (3353)
-
How to overlay two mp3 audio files with different bitrates using fluent-ffmpeg
7 January 2023, by Alex P Mnz
I am trying to overlay one mp3 file (a speech track, the file name of which is passed into the function below) on top of another (a background music track, the url of which is passed into the function), so that they both play simultaneously, using NodeJS and fluent-ffmpeg.


What happens when I call the function is that I get an output file which 1) only plays the background audio track, and 2) when opened, has its time seeker skip straight to where the end of the first audio track would have been (around 1 minute 30 seconds).


In the first instance, I'd like them to play at the same time as each other in the output file, without this skipping-ahead effect (even if I drag the slider back to within the first 1 minute 30, it just jumps straight back to 1 minute 31 seconds, as if there were no data in that first minute and a half).


In the second instance, I'd also like the background audio track to loop until the first track finishes, so any help with that as part of an answer would be very much appreciated. But the immediate problem is just getting the two audio tracks to actually play simultaneously, starting from 0 seconds.


I have tried the below to get to this point:


const fs = require('fs');
const ffmpegPath = require('@ffmpeg-installer/ffmpeg').path;
const ffmpeg = require('fluent-ffmpeg');
ffmpeg.setFfmpegPath(ffmpegPath);
var ffprobe = require('ffprobe-static');
ffmpeg.setFfprobePath(ffprobe.path);
const axios = require('axios');
const crypto = require('crypto');
const path = require('path');


const overlayBackgroundAudio = async (inputFileName, backgroundFileUrl) => {

  const backgroundTrack = await axios({
    method: 'GET',
    url: backgroundFileUrl,
    responseType: 'arraybuffer'
  });

  // write a file with the retrieved background audio track
  const backgroundTrackFileName = crypto.randomBytes(12).toString('hex');
  fs.writeFileSync(`${backgroundTrackFileName}.mp3`, backgroundTrack.data); //todo - delete after

  // set the name of the output file
  const newFileName = crypto.randomBytes(12).toString('hex');
  const outputFileName = `${newFileName}.mp3`;

  // run the relevant ffmpeg commands
  const overlayTracks = async () => {
    return new Promise((resolve, reject) => {
      ffmpeg()
        .input(`${inputFileName}.mp3`)
        .input(`${backgroundTrackFileName}.mp3`)
        .complexFilter([
          {
            filter: 'volume',
            options: '1',
            inputs: '[0:0]',
            outputs: '[a]'
          },
          {
            filter: 'volume',
            options: '1',
            inputs: '[1:0]',
            outputs: '[b]'
          },
          {
            filter: 'adelay',
            options: '0',
            inputs: '[a]',
            outputs: '[a1]'
          },
          {
            filter: 'adelay',
            options: '0|0',
            inputs: '[b]',
            outputs: '[b1]'
          },
          {
            filter: 'amix',
            options: 'inputs=2:duration=first',
            inputs: '[a1][b1]',
            outputs: '[out]'
          }
        ])
        .outputOptions(['-map', '[out]', outputFileName])
        .output(outputFileName)
        .on('end', function() {
          resolve(outputFileName);
        })
        .on('stderr', console.log) // log ffmpeg output to console
        .on('error', function(err) {
          console.log(`An error occurred overlaying tracks: ${err.message}`);
          reject(err);
        })
        .run()
    })
  }

  const overlaidTracksFileName = await overlayTracks();

  console.log('overlaid file name:', overlaidTracksFileName)

  return overlaidTracksFileName;

}

module.exports = overlayBackgroundAudio;



Here is what the ffmpeg library is logging to my console (which may help in figuring out why this is not working as intended):


Input #0, mp3, from '95c8ec8ccbb100d2bfe81ffd.mp3':
 Metadata:
 encoder : Lavf58.24.101
 Duration: 00:02:41.42, start: 0.046042, bitrate: 48 kb/s
 Stream #0:0: Audio: mp3, 24000 Hz, mono, fltp, 48 kb/s
[mp3 @ 000002c77978ddc0] Estimating duration from bitrate, this may be inaccurate
Input #1, mp3, from '14e70bbd339b612e96f29017.mp3':
 Metadata:
 date : 2022-12-30 17:56
 id3v2_priv.XMP : <?xpacket begin="\xef\xbb\xbf" id="W5M0MpCehiHzreSzNTczkc9d"?>\x0a\x0a \x0a s
 Stream #1:0: Audio: mp3, 48000 Hz, stereo, fltp, 192 kb/s
Stream mapping:
 Stream #0:0 (mp3float) -> volume (graph 0)
 Stream #1:0 (mp3float) -> volume (graph 0)
 amix (graph 0) -> Stream #0:0 (libmp3lame)
 Stream #1:0 -> #1:0 (mp3 (mp3float) -> mp3 (libmp3lame))
Press [q] to stop, [?] for help
Output #0, mp3, to '0c0404b793d488bf38e882a0.mp3':
 Metadata:
 TSSE : Lavf58.24.101
 Stream #0:0: Audio: mp3 (libmp3lame), 24000 Hz, mono, fltp (default)
 Metadata:
 encoder : Lavc58.42.102 libmp3lame
Output #1, mp3, to '0c0404b793d488bf38e882a0.mp3':
 Metadata:
 TSSE : Lavf58.24.101
 Stream #1:0: Audio: mp3 (libmp3lame), 48000 Hz, stereo, fltp
 Metadata:
 encoder : Lavc58.42.102 libmp3lame





Thank you very much in advance for any help you can provide!
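For reference, here is a minimal command-line sketch of the filter graph the code above is trying to build. The file names (speech.mp3, background.mp3, out.mp3) and the background volume of 0.3 are placeholders, not values from the question. The log above shows two outputs being written to the same file ("Output #0" and "Output #1"), which is consistent with the output file name being passed inside outputOptions() as well as to output(); the sketch below therefore maps only the mixed stream, and uses -stream_loop -1 on the background input so that it loops until the speech track (the first amix input, with duration=first) ends.

ffmpeg -i speech.mp3 -stream_loop -1 -i background.mp3 \
  -filter_complex "[0:a]volume=1[a];[1:a]volume=0.3[b];[a][b]amix=inputs=2:duration=first[out]" \
  -map "[out]" out.mp3

Whether the same changes fix the fluent-ffmpeg version above is untested; the intent is only to illustrate the single-output mapping and the looping background input.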


-
ffmpeg not capturing a screenshot on SiteGround
20 March 2023, by saddam
I'm using ffmpeg (v5.1) in Laravel with PHP 7.4.


//where to save the image 
$new_file_name = time() . '.jpg';
$image = public_path() . '/uploads/images/' . $new_file_name; //Full path of image
$path = public_path() . '/uploads/' . $path; //Full path of video
//ffmpeg command to find thumbnail from video
$interval = env('PLAEHOLDER_CAPTURE_TIME', 2); //time to take screenshot at 
$cmd = "ffmpeg -i $path -ss 00:00:01 -vframes 1 $image 2>&1";
exec($cmd, $output);
var_dump($output); 
exit;



It's throwing an error:




array(32) {
  [0]=> string(64) "ffmpeg version 5.1 Copyright (c) 2000-2022 the FFmpeg developers"
  [1]=> string(25) " built with gcc 12 (GCC)"
  [2]=> string(128) " configuration: --enable-pic --enable-static --disable-ffplay --disable-ffprobe --disable-doc --disable-avdevice --disable-alsa"
  [3]=> string(40) " libavutil 57. 28.100 / 57. 28.100"
  [4]=> string(40) " libavcodec 59. 37.100 / 59. 37.100"
  [5]=> string(40) " libavformat 59. 27.100 / 59. 27.100"
  [6]=> string(40) " libavfilter 8. 44.100 / 8. 44.100"
  [7]=> string(40) " libswscale 6. 7.100 / 6. 7.100"
  [8]=> string(40) " libswresample 4. 7.100 / 4. 7.100"
  [9]=> string(130) "Input #0, mov,mp4,m4a,3gp,3g2,mj2, from '/home/customer/www/backend.clavora.com/public_html/public/uploads/videos/1679277906.mp4':"
  [10]=> string(11) " Metadata:"
  [11]=> string(26) " major_brand : isom"
  [12]=> string(25) " minor_version : 512"
  [13]=> string(39) " compatible_brands: isomiso2avc1mp41"
  [14]=> string(35) " encoder : Lavf58.45.100"
  [15]=> string(59) " Duration: 00:00:32.58, start: 0.000000, bitrate: 515 kb/s"
  [16]=> string(161) " Stream #0:0[0x1]: Video: h264 (Main) (avc1 / 0x31637661), yuv420p(progressive), 576x1024 [SAR 1:1 DAR 9:16], 380 kb/s, 30 fps, 30 tbr, 15360 tbn (default)"
  [17]=> string(13) " Metadata:"
  [18]=> string(56) " handler_name : ?Mainconcept Video Media Handler"
  [19]=> string(36) " vendor_id : [0][0][0][0]"
  [20]=> string(104) " Stream #0:1[0x2]: Audio: aac (LC) (mp4a / 0x6134706D), 44100 Hz, stereo, fltp, 129 kb/s (default)"
  [21]=> string(13) " Metadata:"
  [22]=> string(60) " handler_name : #Mainconcept MP4 Sound Media Handler"
  [23]=> string(36) " vendor_id : [0][0][0][0]"
  [24]=> string(15) "Stream mapping:"
  [25]=> string(55) " Stream #0:0 -> #0:0 (h264 (native) -> mjpeg (native))"
  [26]=> string(31) "Press [q] to stop, [?] for help"
  [27]=> string(78) "[auto_scale_0 @ 0x564f48c3b180] Failed to configure output pad on auto_scale_0"
  [28]=> string(29) "Error reinitializing filters!"
  [29]=> string(76) "Failed to inject frame into filter network: Resource temporarily unavailable"
  [30]=> string(55) "Error while processing the decoded data for stream #0:0"
  [31]=> string(18) "Conversion failed!"
}
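The failing step in that log appears to be auto_scale_0, the conversion filter ffmpeg inserts automatically to turn the decoded yuv420p frames into a pixel format the mjpeg (jpeg) encoder accepts. As a hedged sketch only, the same capture can be written with that conversion made explicit and with the seek placed before the input; the paths below are placeholders, and whether this works around the failure on this particular static 5.1 build is untested:

ffmpeg -ss 00:00:01 -i /path/to/video.mp4 -frames:v 1 -vf format=yuvj420p -q:v 2 /path/to/thumbnail.jpg 2>&1

If the error persists with the explicit format filter, running the same command with a different ffmpeg build would help isolate whether the problem is specific to this build rather than to the command itself.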




-
How to play HDR10 videos using ffplay and ffmpeg?
6 January 2023, by befandy
I'm trying to play an HDR10 video on Windows. My display supports 10-bit HDR content.
The command that I'm using to play the HDR10 video is the following:
ffplay -sws_flags print_info -i video.mp4


But the output looks washed out.


The logs are the following:


ffplay version 2022-03-28-git-5ee198f9aa-full_build-www.gyan.dev Copyright (c) 2003-2022 the FFmpeg developers
  built with gcc 11.2.0 (Rev7, Built by MSYS2 project)
  configuration: --enable-gpl --enable-version3 --enable-static --disable-w32threads --disable-autodetect --enable-fontconfig --enable-iconv --enable-gnutls --enable-libxml2 --enable-gmp --enable-bzlib --enable-lzma --enable-libsnappy --enable-zlib --enable-librist --enable-libsrt --enable-libssh --enable-libzmq --enable-avisynth --enable-libbluray --enable-libcaca --enable-sdl2 --enable-libdav1d --enable-libdavs2 --enable-libuavs3d --enable-libzvbi --enable-librav1e --enable-libsvtav1 --enable-libwebp --enable-libx264 --enable-libx265 --enable-libxavs2 --enable-libxvid --enable-libaom --enable-libopenjpeg --enable-libvpx --enable-mediafoundation --enable-libass --enable-frei0r --enable-libfreetype --enable-libfribidi --enable-libvidstab --enable-libvmaf --enable-libzimg --enable-amf --enable-cuda-llvm --enable-cuvid --enable-ffnvcodec --enable-nvdec --enable-nvenc --enable-d3d11va --enable-dxva2 --enable-libmfx --enable-libshaderc --enable-vulkan --enable-libplacebo --enable-opencl --enable-libcdio --enable-libgme --enable-libmodplug --enable-libopenmpt --enable-libopencore-amrwb --enable-libmp3lame --enable-libshine --enable-libtheora --enable-libtwolame --enable-libvo-amrwbenc --enable-libilbc --enable-libgsm --enable-libopencore-amrnb --enable-libopus --enable-libspeex --enable-libvorbis --enable-ladspa --enable-libbs2b --enable-libflite --enable-libmysofa --enable-librubberband --enable-libsoxr --enable-chromaprint
  libavutil 57. 24.101 / 57. 24.101
  libavcodec 59. 25.100 / 59. 25.100
  libavformat 59. 20.101 / 59. 20.101
  libavdevice 59. 6.100 / 59. 6.100
  libavfilter 8. 29.100 / 8. 29.100
  libswscale 6. 6.100 / 6. 6.100
  libswresample 4. 6.100 / 4. 6.100
  libpostproc 56. 5.100 / 56. 5.100
Input #0, mov,mp4,m4a,3gp,3g2,mj2, from 'video.mp4':
  Metadata:
    major_brand     : isom
    minor_version   : 512
    compatible_brands: isomiso2mp41
    encoder         : Lavf58.29.100
  Duration: 00:00:08.78, start: 0.000000, bitrate: 159424 kb/s
  Stream #0:0[0x1](und): Video: hevc (Main 10) (hev1 / 0x31766568), yuv420p10le(tv, bt2020nc/bt2020/smpte2084, progressive), 3840x2160 [SAR 1:1 DAR 16:9], 159427 kb/s, 60 fps, 60 tbr, 15360 tbn (default)
    Metadata:
      handler_name    : VideoHandler
      vendor_id       : [0][0][0][0]
[swscaler @ 000001d62a81d000] [swscaler @ 000001d622aebf80] bicubic scaler, from yuv420p10le to yuv420p using MMXEXT
[swscaler @ 000001d62a81d000] [swscaler @ 000001d622aebf80] using unscaled yuv420p10le -> yuv420p special converter
[swscaler @ 000001d62a81d000] [swscaler @ 000001d635638600] bicubic scaler, from yuv420p10le to yuv420p using MMXEXT
[swscaler @ 000001d62a81d000] [swscaler @ 000001d635638600] using unscaled yuv420p10le -> yuv420p special converter
[swscaler @ 000001d62a81d000] [swscaler @ 000001d635661d80] bicubic scaler, from yuv420p10le to yuv420p using MMXEXT
[swscaler @ 000001d62a81d000] [swscaler @ 000001d635661d80] using unscaled yuv420p10le -> yuv420p special converter
[swscaler @ 000001d62a81d000] [swscaler @ 000001d63567df80] bicubic scaler, from yuv420p10le to yuv420p using MMXEXT
[swscaler @ 000001d62a81d000] [swscaler @ 000001d63567df80] using unscaled yuv420p10le -> yuv420p special converter
[swscaler @ 000001d62a81d000] [swscaler @ 000001d6356bbf80] bicubic scaler, from yuv420p10le to yuv420p using MMXEXT
   3.44 M-V: 0.204 fd= 91 aq= 0KB vq=15552KB sq= 0B f=0/0


With the help of the '-sws_flags print_info' flag, we can see that there is a conversion from 10 bits to 8 bits: "yuv420p10le -> yuv420p".


I have tried playing the video using VLC and it works fine on my display: colours are vivid and the picture is bright.


I have also tried decoding the video first and then passing the metadata directly to ffplay, as follows.
Decoding to yuv format with:
ffmpeg -i video.mp4 video.yuv

Then playing the yuv file:
ffplay -framerate 60 -video_size 3840x2160 -pixel_format yuv420p10le -color_range tv -color_trc smpte2084 -color_primaries bt2020 -colorspace bt2020nc -i video.yuv


But the result is the same: washed-out colours and the "yuv420p10le -> yuv420p" conversion.


Is there a way to play an HDR video (encoded or decoded stream) with ffplay?
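One commonly used approach, sketched below, is to tone-map the HDR10 stream down to SDR bt709 in the filter graph, so that the 10-bit-to-8-bit conversion no longer produces washed-out colours. This gives an SDR approximation rather than true HDR output, and it requires a build with libzimg for the zscale filter (the configuration above shows --enable-libzimg); the filter values are the generic zscale/tonemap chain, not something specific to this video:

ffplay -i video.mp4 -vf "zscale=t=linear:npl=100,format=gbrpf32le,zscale=p=bt709,tonemap=hable:desat=0,zscale=t=bt709:m=bt709:r=tv,format=yuv420p"

For genuine HDR output on an HDR-capable display, an HDR-aware player such as VLC (which already works here, as noted above) remains the more direct route, since ffplay's SDL-based display path is not HDR-aware.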