
Other articles (32)
-
Use it, talk about it, critique it
10 April 2011
The first attitude to adopt is to talk about it, either directly with the people involved in its development, or with those around you in order to convince new people to use it.
The larger the community, the faster the project will evolve...
A discussion list is available for any exchange between users.
-
Installation in standalone mode
4 February 2011
Installing the MediaSPIP distribution takes place in several steps: retrieving the necessary files (at this point two methods are possible: installing the ZIP archive containing the whole distribution, or retrieving the sources of each module separately via SVN); preconfiguration; and the final installation.
Installing the MediaSPIP ZIP archive
This installation mode is the simplest method for installing the whole distribution (...)
-
Installation in farm mode
4 February 2011
Farm mode makes it possible to host several MediaSPIP-type sites while installing the functional core only once.
This is the method we use on this very platform.
Using farm mode requires some familiarity with how SPIP works, unlike the standalone version, which does not really require any specific knowledge since SPIP's usual private area is no longer used.
To begin with, you must have installed the same files as for the installation (...)
On other sites (4221)
-
Node 18 or Node 20 breaks ffmpeg (in Google Cloud Functions -> ffprobe was killed with signal SIGSEGV)
10 January 2024, by user20206929
Please see below: the code works on Node.js 16, but not after upgrading to Node 18 or 20.


const ffmpeg = require("fluent-ffmpeg");

// The following runs inside a .https.onRequest Google Cloud Function with enough memory;
// videoUrl is the video URL and res is the HTTP response object of the function.

let videoDuration;
try {
  const duration = new Promise((resolve, reject) => {
    ffmpeg.ffprobe(videoUrl, (err, metadata) => {
      if (err) {
        console.log("Metadata:", metadata);
        console.log("err: " + err);
        if (res.headersSent) {
          console.error("Response already sent");
        } else {
          res.status(400).send("Error getting video metadata");
        }
        // Reject so the awaiting code below does not hang forever
        reject(err);
        return;
      }
      const duration = metadata.format.duration;
      console.log("video duration in seconds: " + duration);
      resolve(duration);
    });
  });
  videoDuration = await duration;
} catch (err) {
  console.log(err);
  throw err;
}



When upgrading to Node 18/20 (with no change other than the Node version), the error "ffprobe not found" appears.


But setting the path manually using ffmpeg.setFfprobePath(ffprobePath);
triggers the error: Error: ffprobe was killed with signal SIGSEGV


So it seems it's a permissions issue.


However, I tried a lot of different solutions, and none of them made this work.
For instance, I tried manually downloading ffprobe from the official website https://ffbinaries.com/downloads and then adding it to the code manually.


I tried to use https://www.npmjs.com/package/@ffprobe-installer/ffprobe and other packages like https://www.npmjs.com/package/ffprobe-static
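
The wiring I tried with those packages looked roughly like this (a minimal sketch; it assumes ffprobe-static is listed in package.json and that videoUrl is defined in the surrounding function):

const ffmpeg = require("fluent-ffmpeg");
const ffprobeStatic = require("ffprobe-static"); // assumption: installed as a dependency

// Point fluent-ffmpeg at the binary bundled by ffprobe-static instead of a system ffprobe
ffmpeg.setFfprobePath(ffprobeStatic.path);

ffmpeg.ffprobe(videoUrl, (err, metadata) => {
  if (err) {
    // On Node 18/20 this is where "ffprobe was killed with signal SIGSEGV" shows up
    console.error("ffprobe failed:", err);
    return;
  }
  console.log("duration in seconds:", metadata.format.duration);
});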


I also tried downloading the ffprobe binary to the temporary folder of Google Cloud Functions and changing the permissions of that folder.


All of those produced the same error.


None of what I could think of made any difference.


Please help, because I need to upgrade from Node 16 to 18 or 20 before Google removes Node 16 on January 31, 2024, and for now I don't see a solution.


I also looked for other solutions to get the duration from a video file URL, but using ffmpeg seems to be the only one that should work out of the box, as it does on Node 16.


Thank you,


UPDATE - 11/26/2023


The GCP Functions Node.js 16 runtime uses Ubuntu 18.04 with FFmpeg installed.
The Node.js 18/20 runtimes use Ubuntu 22.04, and Google decided not to include FFmpeg.


https://cloud.google.com/functions/docs/runtime-support#node.js
https://cloud.google.com/functions/docs/reference/system-packages


No workaround or solution found as of now.
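
For anyone debugging the same thing, here is a minimal sketch of a check that tells you whether a system ffprobe binary is present on the runtime's PATH (it only diagnoses the missing binary, it does not fix anything):

const { execFileSync } = require("child_process");

function hasSystemFfprobe() {
  try {
    // Ask the system ffprobe for its version; this throws if the binary is not on the PATH
    execFileSync("ffprobe", ["-version"], { stdio: "ignore" });
    return true;
  } catch (e) {
    return false; // e.g. on the Node 18/20 runtime images, where FFmpeg is not preinstalled
  }
}

console.log("system ffprobe available:", hasSystemFfprobe());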


UPDATE - 01/10/2024


Google added FFmpeg back to the latest runtime version, so this is working as before now.


-
add libaribb24 ARIB STD-B24 caption decoder
14 January 2019, by Jan Ekström
* Outputs ASS lines with basic coloring and font scaling for each given region.
* Sets the default style to the resolution of the subtitle plane (for example, 960x540 / 36pt font for profile A).
* Has options to:
  * Disable ruby text (which is coded as regions which have half-height text in libaribb24). Enabled by default as, without positioning, ruby text only confuses, as it is usually coded in the beginning of the decoded subtitle line.
  * Set the working directory, in which libaribb24 will read configuration as well as into which it may save broadcast extra symbols as PNG. Unset by default.
The unconventional library check can be explained by the library's current master branch being licensed as LGPLv3, but at the time of writing the latest official release is still licensed under GPLv3. Thus, one either has to wait for the following release, or enable GPLv3.
-
Tools/Techniques for investigating video corruption — ffmpeg / libavcodec
17 July 2013, by Gopherkhan
In my current work I'm trying to encode some images to h264 video using FFmpeg's C library. The resulting video plays fine in VLC, but has no preview image. The video can play in VLC and MPlayer on Ubuntu, but won't play on Mac or PC (in fact, it causes a "VTDecoderXPCService quit unexpectedly" error on Mac).
If I run the resulting file through FFmpeg using the command line, the output has a preview image and plays correctly everywhere.
Apparently the file that I get out of the program is corrupt in some weird place, but I don't have any output during my compilation or run to indicate where. I can't share my code at the moment (work code isn't open source yet :-( ), but I have tried a number of things:
- Writing only header and trailer data (av_write_trailer) and no frames
- Writing frames only minus the trailer (using avcodec_encode_video2 and av_write_frame)
- Adjusting our time_base and frame pts values to encode only one frame per second
- Removing all variable frame rate code
- Numerous other variants that I won't bother you with here
In creating my project, I've also followed the following tutorials:
And consulted the deprecated ffmpeg functions list
And compiled FFmpeg on Ubuntu according to the official doc
And consulted numerous StackOverflow questions:
- Raw H264 frames in mpegts container using libavcodec
- How to encode Bitmaps into a video using MediaCodec?
- How to convert RGB from YUV420p for ffmpeg encoder?
- Encoding H.264 video using FFmpeg C API
- ffmpeg : how to save h264 raw data as mp4 file
But every run of the program runs into the exact same problem.
My question is: is there anything obvious that causes a programmatic run of FFmpeg to differ from a console run (e.g., an incomplete finalization, some threading issues, etc.)? Like some obvious reason that a console run could repair a corrupted file? Or is there a decent tool/method for inspecting a video file and finding the point of corruption?