
Media (91)
-
Spoon - Revenge !
15 September 2011
Updated: September 2011
Language: English
Type: Audio
-
My Morning Jacket - One Big Holiday
15 September 2011
Updated: September 2011
Language: English
Type: Audio
-
Zap Mama - Wadidyusay ?
15 September 2011
Updated: September 2011
Language: English
Type: Audio
-
David Byrne - My Fair Lady
15 September 2011
Updated: September 2011
Language: English
Type: Audio
-
Beastie Boys - Now Get Busy
15 September 2011
Updated: September 2011
Language: English
Type: Audio
-
Granite de l’Aber Ildut
9 September 2011
Updated: September 2011
Language: French
Type: Text
Other articles (23)
-
Use it, talk about it, critique it
10 April 2011. The first thing to do is to talk about it, either directly with the people involved in its development, or with those around you, to convince new people to use it.
The larger the community, the faster the software will evolve ...
A mailing list is available for any exchange between users. -
MediaSPIP version 0.1 Beta
16 April 2011. MediaSPIP 0.1 beta is the first version of MediaSPIP declared "usable".
The zip file provided here contains only the MediaSPIP sources in standalone form.
To get a working installation, all software dependencies must be installed manually on the server.
If you want to use this archive for a farm-mode installation, further modifications are also required (...) -
Submitting bugs and patches
10 April 2011. Unfortunately, no software is ever perfect...
If you think you have found a bug, report it in our ticket system, taking care to include the relevant information: the browser type and exact version with which you encountered the anomaly; as precise an explanation of the problem as possible; if possible, the steps to reproduce it; a link to the site / page in question.
If you think you have fixed the bug yourself (...)
On other sites (3416)
-
javacv and moviepy comparison for video generation
15 September 2024, by Vikram
I am trying to generate a video from images that have some overlay text and PNG icons on them. I am using the javacv library for this.
The final output video looks pixelated. I don't understand why, since I have no video-processing domain knowledge; I am a beginner at this.
I know that video bitrate and the choice of video encoder are important factors for video quality, and that there are many more factors too.
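As a rough sanity check of those numbers, bits per pixel (bitrate divided by width * height * fps) is a handy back-of-the-envelope quality metric; for H.264, values much below about 0.1 bpp tend to show blocking on detailed, moving content. The resolution below is an assumption for illustration, since the post doesn't state screensizeX/screensizeY:

```java
public class BppCheck {
    public static void main(String[] args) {
        int bitrate = 800_000;          // 800 kbps, as set via setVideoBitrate
        int width = 1280, height = 720; // assumed resolution (not given in the post)
        double fps = 24.0;
        // bits available per pixel per frame
        double bpp = bitrate / (width * height * fps);
        System.out.printf("%.3f bits/pixel%n", bpp); // ~0.036: quite low for H.264
    }
}
```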


I am providing two outputs for comparison: one generated with javacv and the other with the moviepy library.


Please watch them in full screen, since the problem I am talking about only shows up there: you will see the pixels dancing in the javacv-generated video, while the Python output looks stable.


https://imgur.com/a/aowNnKg - javacv generated


https://imgur.com/a/eiLXrbk - Moviepy generated


I am using the same encoder in both implementations.


Encoder: libx264 for both
Bitrate: 800 kbps for javacv, 500 kbps for moviepy
Frame rate: 24 fps for both
Output video size: 7 MB (javacv), 5 MB (moviepy)





The output generated by javacv is larger than the moviepy-generated video.


Here is my Java configuration for FFmpegFrameRecorder:


FFmpegFrameRecorder recorder = new FFmpegFrameRecorder(this.outputPath,
        this.screensizeX, this.screensizeY);
if (this.videoCodecName != null && "libx264".equals(this.videoCodecName)) {
    recorder.setVideoCodec(avcodec.AV_CODEC_ID_H264);
}
recorder.setFormat("mp4");
// the FFmpeg constant is AV_PIX_FMT_YUV420P; AV_PIX_FMT_YUV420 does not exist
recorder.setPixelFormat(avutil.AV_PIX_FMT_YUV420P);
recorder.setVideoBitrate(800000);
recorder.setImageWidth(this.screensizeX);
recorder.setImageHeight(this.screensizeY);
recorder.setFrameRate(24);




And here is the Python configuration for writing the video file:


Full_final_clip.write_videofile(
 f"{video_folder_path}/{FILE_ID}_INTERMEDIATE.mp4",
 codec="libx264",
 audio_codec="aac",
 temp_audiofile=f"{FILE_ID}_INTER_temp-audio.m4a",
 remove_temp=True,
 fps=24,
 )




As you can see, I am not specifying a bitrate in Python, but I checked that the bitrate of the final output is around 500 kbps, lower than what I specified in Java, yet the Java-generated video quality looks worse.


I have also tried setting a CRF value, but it seems to have no effect when used.


Increasing the bitrate improves quality somewhat, at the cost of file size, but the output still looks pixelated.
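One likely explanation: moviepy passes no bitrate to ffmpeg by default, so libx264 runs in constant-quality mode (CRF 23) and spends bits where the motion needs them, while a hard 800 kbps target starves exactly the frames the zoom animation makes expensive to encode. A hedged sketch of constant-quality encoding with javacv's FFmpegFrameRecorder (setVideoOption forwards codec-private options such as crf to the encoder; setting the bitrate to 0 avoids mixing the two rate-control modes):

```java
// sketch: constant quality instead of a fixed bitrate target
recorder.setVideoCodec(avcodec.AV_CODEC_ID_H264);
recorder.setFormat("mp4");
recorder.setPixelFormat(avutil.AV_PIX_FMT_YUV420P);
recorder.setVideoBitrate(0);               // no bitrate target, so CRF takes effect
recorder.setVideoOption("crf", "23");      // libx264 default; lower = better quality
recorder.setVideoOption("preset", "slow"); // better compression at the same CRF
recorder.setFrameRate(24);
```

File size then follows content complexity rather than the other way around; it is worth comparing the result at CRF 23 and 18 before reaching for higher bitrates.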


Can someone please point out what the issue might be, and how Python produces better-quality video, when both libraries use FFmpeg under the hood?


Edit 1: I am also adding the code used to create the zoom animation over consecutive frames, since I read somewhere that this might be the cause of the pixel jitter. Please take a look and let me know if there is any improvement we can make to remove it.


private Mat applyZoomEffect(Mat frame, int currentFrame, long effectFrames, int width, int height, String mode, String position, double speed) {
    double i = currentFrame;
    if ("out".equals(mode)) {
        i = effectFrames - i;
    }
    // zoom grows linearly from 1.0 to 1.0 + 0.1 * speed over effectFrames
    double zoom = 1 + (i * ((0.1 * speed) / effectFrames));

    // translation that keeps the zoom centred on the frame;
    // x and y are sub-pixel (double) offsets
    double x = 0, y = 0;
    switch (position.toLowerCase()) {
        case "center":
            x = (width - (width * zoom)) / 2.0;
            y = (height - (height * zoom)) / 2.0;
            break;
    }

    // 2x3 affine matrix [zoom 0 x; 0 zoom y]
    double[] flatData = {zoom, 0, x, 0, zoom, y};
    DoublePointer doublePointer = new DoublePointer(flatData);
    Mat mat = new Mat(2, 3, org.bytedeco.opencv.global.opencv_core.CV_64F); // CV_64F holds doubles
    mat.data().put(doublePointer);

    Mat transformedFrame = new Mat();
    opencv_imgproc.warpAffine(frame, transformedFrame, mat, new Size(frame.cols(), frame.rows()),
            opencv_imgproc.INTER_LANCZOS4, 0, new Scalar(0, 0, 0, 0));
    return transformedFrame;
}



-
Node.js readable maximize throughput/performance for compute intense readable - Writable doesn't pull data fast enough
31 December 2022, by flohall
General setup


I developed an application using AWS Lambda with Node.js 14. I use a custom Readable implementation, FrameCreationStream, that uses node-canvas to draw images, SVGs and more on a canvas. The result is then extracted as a raw image buffer in BGRA. A single image buffer contains 1920 * 1080 * 4 bytes = 8294400 bytes, roughly 8 MB. This is then piped to stdin of a child_process running ffmpeg. The highWaterMark of my Readable in objectMode: true is set to 25, so the internal buffer can use up to 8 MB * 25 = 200 MB.
All this works fine and doesn't consume too much RAM. But I noticed after some time that the performance is not ideal.


Performance not optimal


I have an example input that generates a video of 315 frames. If I set highWaterMark to a value above 25, the performance keeps increasing, right up to the point where I set it to 315 or above.

For some reason ffmpeg doesn't start to pull any data until highWaterMark is reached. That is obviously not what I want. ffmpeg should consume data whenever at least one frame is cached in the Readable and it has finished processing the previous frame, and the Readable should keep producing frames as long as highWaterMark hasn't been reached or the last frame hasn't been produced. Ideally, the Readable and the Writable would both be busy all the time.

I found another way to improve the speed: if I add a timeout of 100 ms in the _read() method of the Readable after, say, every tenth frame, then the ffmpeg Writable uses this pause to write some frames to ffmpeg.

It seems that frames aren't passed to ffmpeg during frame creation because the Node.js main thread is busy?

The fastest result I get is by increasing highWaterMark above the number of frames, which doesn't work for longer videos, as it would make the AWS Lambda RAM explode, and it defeats the whole point of streaming. Using timeouts always gives me stomach pain; besides, a well-fitting timeout may differ between execution environments. Any ideas?

FrameCreationStream


import canvas from 'canvas';
import {Readable} from 'stream';
import {IMAGE_STREAM_BUFFER_SIZE, PerformanceUtil, RenderingLibraryError, VideoRendererInput} from 'vm-rendering-backend-commons';
import {AnimationAssets, BufferType, DrawingService, FullAnimationData} from 'vm-rendering-library';

/**
 * This is a proper back-pressure-compatible implementation of Readable, providing a stream to read single frames from.
 * Whenever read() is called, a new frame is created and added to the stream.
 * read() will be called internally until options.highWaterMark has been reached;
 * then calling read will be paused until one frame is read from the stream.
 */
export class FrameCreationStream extends Readable {

 drawingService: DrawingService;
 endFrameIndex: number;
 currentFrameIndex: number = 0;
 startFrameIndex: number;
 frameTimer: [number, number];
 readTimer: [number, number];
 fullAnimationData: FullAnimationData;

 constructor(animationAssets: AnimationAssets, fullAnimationData: FullAnimationData, videoRenderingInput: VideoRendererInput, frameTimer: [number, number]) {
 super({highWaterMark: IMAGE_STREAM_BUFFER_SIZE, objectMode: true});

 this.frameTimer = frameTimer;
 this.readTimer = PerformanceUtil.startTimer();

 this.fullAnimationData = fullAnimationData;

 this.startFrameIndex = Math.floor(videoRenderingInput.startFrameId);
 this.currentFrameIndex = this.startFrameIndex;
 this.endFrameIndex = Math.floor(videoRenderingInput.endFrameId);

 this.drawingService = new DrawingService(animationAssets, fullAnimationData, videoRenderingInput, canvas);
 console.time("read");
 }

 /**
 * this method is only overwritten for debugging
 * @param size
 */
 read(size?: number): string | Buffer {

 console.log("read("+size+")");
 const buffer = super.read(size);
 console.log(buffer);
 console.log(buffer?.length);
 if(buffer) {
 console.timeLog("read");
 }
 return buffer;
 }

 // _read() will be called when the stream wants to pull more data in.
 // _read() will be called again after each call to this.push(dataChunk) once the stream is ready to accept more data. https://nodejs.org/api/stream.html#readable_readsize
 // this way it is ensured, that even though this.createImageBuffer() is async, only one frame is created at a time and the order is kept
 _read(): void {
 // as frame numbers are consecutive and unique, we have to draw each frame number (also the first and the last one)
 if (this.currentFrameIndex <= this.endFrameIndex) {
 PerformanceUtil.logTimer(this.readTimer, 'WAIT -> READ\t');
 this.createImageBuffer()
 .then(buffer => this.optionalTimeout(buffer))
 // push means adding a buffered raw frame to the stream
 .then((buffer: Buffer) => {
 this.readTimer = PerformanceUtil.startTimer();
 // the following two frame numbers start with 1 as first value
 const processedFrameNumberOfScene = 1 + this.currentFrameIndex - this.startFrameIndex;
 const totalFrameNumberOfScene = 1 + this.endFrameIndex - this.startFrameIndex;
 // the overall frameId or frameIndex starts with frameId 0
 const processedFrameIndex = this.currentFrameIndex;
 this.currentFrameIndex++;
 this.push(buffer); // nothing besides logging should happen after calling this.push(buffer)
 console.log(processedFrameNumberOfScene + ' of ' + totalFrameNumberOfScene + ' processed - full video frameId: ' + processedFrameIndex + ' - buffered frames: ' + this.readableLength);
 })
 .catch(err => {
 // errors will be finally handled, when subscribing to frameCreation stream in ffmpeg service
 // this log is just generated for tracing errors and if for some reason the handling in ffmpeg service doesn't work
 console.log("createImageBuffer: ", err);
 this.emit("error", err);
 });
 } else {
 // push(null) makes clear that this stream has ended
 this.push(null);
 PerformanceUtil.logTimer(this.frameTimer, 'FRAME_STREAM');
 }
 }

 private optionalTimeout(buffer: Buffer): Promise<Buffer> {
 if(this.currentFrameIndex % 10 === 0) {
 return new Promise(resolve => setTimeout(() => resolve(buffer), 140));
 }
 return Promise.resolve(buffer);
 }

 // prevent memory leaks - without this lambda memory will increase with every call
 _destroy(): void {
 this.drawingService.destroyStage();
 }

 /**
 * This creates a raw pixel buffer that contains a single frame of the video drawn by the rendering library
 *
 */
 public async createImageBuffer(): Promise<Buffer> {

 const drawTimer = PerformanceUtil.startTimer();
 try {
 await this.drawingService.drawForFrame(this.currentFrameIndex);
 } catch (err: any) {
 throw new RenderingLibraryError(err);
 }

 PerformanceUtil.logTimer(drawTimer, 'DRAW -> FRAME\t');

 const bufferTimer = PerformanceUtil.startTimer();
 // Creates a raw pixel buffer, containing simple binary data
 // the exact same information (BGRA/screen ratio) has to be provided to ffmpeg, because ffmpeg cannot detect format for raw input
 const buffer = await this.drawingService.toBuffer(BufferType.RAW);
 PerformanceUtil.logTimer(bufferTimer, 'CANVAS -> BUFFER');

 return buffer;
 }
}


FfmpegService


import {ChildProcess, execFile} from 'child_process';
import {Readable} from 'stream';
import {FPS, StageSize} from 'vm-rendering-library';
import {
 FfmpegError,
 LOCAL_MERGE_VIDEOS_TEXT_FILE, LOCAL_SOUND_FILE_PATH,
 LOCAL_VIDEO_FILE_PATH,
 LOCAL_VIDEO_SOUNDLESS_MERGE_FILE_PATH
} from "vm-rendering-backend-commons";

/**
 * This class bundles all ffmpeg usages for rendering one scene.
 * FFmpeg is a console program which can transcode nearly all types of sounds, images and videos from one to another.
 */
export class FfmpegService {

 ffmpegPath: string = null;


 constructor(ffmpegPath: string) {
 this.ffmpegPath = ffmpegPath;
 }

 /**
 * Convert a stream of raw images into an .mp4 video using the command line program ffmpeg.
 *
 * @param inputStream an input stream containing images in raw format BGRA
 * @param stageSize the size of a single frame in pixels (minimum is 2*2)
 * @param outputPath the filepath to write the resulting video to
 */
 public imageToVideo(inputStream: Readable, stageSize: StageSize, outputPath: string): Promise<void> {
 const args: string[] = [
 '-f',
 'rawvideo',
 '-r',
 `${FPS}`,
 '-pix_fmt',
 'bgra',
 '-s',
 `${stageSize.width}x${stageSize.height}`,
 '-i',
 // input "-" means input will be passed via pipe (streamed)
 '-',
 // codec that also QuickTime player can understand
 '-vcodec',
 'libx264',
 '-pix_fmt',
 'yuv420p',
 /*
 * "-movflags faststart":
 * metadata at beginning of file
 * needs more RAM
 * file will be broken, if not finished properly
 * higher application compatibility
 * better for browser streaming
 */
 '-movflags',
 'faststart',
 // "-preset ultrafast", //use this to speed up compression, but quality/compression ratio gets worse
 // don't overwrite an existing file here,
 // but delete file in the beginning of execution index.ts
 // (this is better for local testing believe me)
 outputPath
 ];

 return this.execFfmpegPromise(args, inputStream);
 }

 private execFfmpegPromise(args: string[], inputStream?: Readable): Promise<void> {
 const ffmpegServiceSelf = this;
 return new Promise(function (resolve, reject) {
 const executionProcess: ChildProcess = execFile(ffmpegServiceSelf.ffmpegPath, args, (err) => {
 if (err) {
 reject(new FfmpegError(err));
 } else {
 console.log('ffmpeg finished');
 resolve();
 }
 });
 if (inputStream) {
 // it's important to listen on errors of input stream before piping it into the write stream
 // if we don't do this here, we get an unhandled promise exception for every issue in the input stream
 inputStream.on("error", err => {
 reject(err);
 });
 // don't reject promise here as the error will also be thrown inside execFile and will contain more debugging info
 // this log is just generated for tracing errors and if for some reason the handling in execFile doesn't work
 inputStream.pipe(executionProcess.stdin).on("error", err => console.log("pipe stream: " , err));
 }
 });
 }
}


-
Flutter FFMPEG : The BackgroundIsolateBinaryMessenger.instance value is invalid until BackgroundIsolateBinaryMessenger.ensureInitialized is executed
25 June 2023, by Danny
Hey guys, I have a function that uses ffmpeg to convert images to GIFs. I am using the simple compute function provided by Flutter, but I am getting this error.







Logs:




I/flutter (12889) : Loading ffmpeg-kit-flutter. E/flutter (12889) :
[ERROR:flutter/runtime/dart_vm_initializer.cc(41)] Unhandled
Exception : Bad state : The BackgroundIsolateBinaryMessenger.instance
value is invalid until
BackgroundIsolateBinaryMessenger.ensureInitialized is executed.
E/flutter (12889) : #0 BackgroundIsolateBinaryMessenger.instance
(package:flutter/src/services/_background_isolate_binary_messenger_io.dart:27:7)
E/flutter (12889) : #1 _findBinaryMessenger
(package:flutter/src/services/platform_channel.dart:135:42) E/flutter
(12889) : #2 EventChannel.binaryMessenger
(package:flutter/src/services/platform_channel.dart:619:27) E/flutter
(12889) : #3 EventChannel.receiveBroadcastStream. (package:flutter/src/services/platform_channel.dart:639:7)
E/flutter (12889) : #4 _runGuarded
(dart:async/stream_controller.dart:815:24) E/flutter (12889) : #5

_BroadcastStreamController._subscribe (dart:async/broadcast_stream_controller.dart:207:7) E/flutter (12889) :
#6 _ControllerStream._createSubscription (dart:async/stream_controller.dart:828:19) E/flutter (12889) : #7

_StreamImpl.listen (dart:async/stream_impl.dart:471:9) E/flutter (12889) : #8 FFmpegKitInitializer._initialize
(package:ffmpeg_kit_flutter_min_gpl/src/ffmpeg_kit_flutter_initializer.dart:311:44)
E/flutter (12889) : #9 FFmpegKitInitializer.initialize
(package:ffmpeg_kit_flutter_min_gpl/src/ffmpeg_kit_flutter_initializer.dart:54:23)
E/flutter (12889) : #10 FFmpegKitConfig.init
(package:ffmpeg_kit_flutter_min_gpl/ffmpeg_kit_config.dart:50:32)
E/flutter (12889) : #11 AbstractSession.createFFmpegSession
(package:ffmpeg_kit_flutter_min_gpl/abstract_session.dart:69:29)
E/flutter (12889) : #12 FFmpegSession.create
(package:ffmpeg_kit_flutter_min_gpl/ffmpeg_session.dart:40:43)
E/flutter (12889) : #13 FFmpegKit.executeWithArguments
(package:ffmpeg_kit_flutter_min_gpl/ffmpeg_kit.dart:44:29) E/flutter
(12889) : #14 FFmpegKit.execute
(package:ffmpeg_kit_flutter_min_gpl/ffmpeg_kit.dart:38:17) E/flutter
(12889) : #15 _shareMoments
(package:carefour/presentation/maker/create_function.dart:217:19)
E/flutter (12889) : #16 compute.
(package:flutter/src/foundation/_isolates_io.dart:19:20) E/flutter
(12889) : #17 _RemoteRunner._run (dart:isolate:1021:47) E/flutter
(12889) : #18 _RemoteRunner._remoteExecute (dart:isolate:1015:12)
E/flutter (12889) : #19 _delayEntrypointInvocation. (dart:isolate-patch/isolate_patch.dart:299:17) E/flutter
(12889) : #20 _RawReceivePort._handleMessage
(dart:isolate-patch/isolate_patch.dart:189:12)



This is the code:


Future<bool> shareMoments(ComputeMomentModel data) async {
  File? imgFile;
  File? paletteFile;
  var uuid = const Uuid();
  String newUuid = uuid.v4();
  String finalImagePath = "momentGif-$newUuid.gif";
  String paletteFileName = "momentPalette-$newUuid.png";
  File? finalImage;
  finalImage = null;

  await FFmpegKit.execute('-i ${data.directoryPath}/image%d.png -vf palettegen ${data.directoryPath}/$paletteFileName').then((session) async {
    final returnCode = await session.getReturnCode();

    if (ReturnCode.isSuccess(returnCode)) {
      paletteFile = File("${data.directoryPath}/$paletteFileName");

      await FFmpegKit.execute('-f image2 -y -r 8 -i ${data.directoryPath}/image%d.png -i ${paletteFile?.path} -filter_complex fps=8,scale=720:-1:flags=lanczos,split[s0][s1];[s0]palettegen=max_colors=32[p];[s1][p]paletteuse=dither=bayer ${data.directoryPath}/$finalImagePath').then((session) async {
        final returnCode = await session.getReturnCode();
        if (ReturnCode.isSuccess(returnCode)) {
          finalImage = File("${data.directoryPath}/$finalImagePath");
        }
      });
    } else {
      debugPrint("Failed");
    }
  });

  // Clear the cache below - otherwise ffmpeg keeps recreating the first one
  int i = 0;
  for (i = 0; i < 24; i++) {
    imgFile = File('${data.directoryPath}/image$i.png');
    imgFile.delete(recursive: true);
  }

  // --> Calling Backend API

  return true;
}


compute function


Future<bool> computeShareMoments({required String directoryPath}) async {
  ComputeMomentModel data = ComputeMomentModel(null, directoryPath);
  return await compute(shareMoments, data);
}


Can anyone help with a solution? I have been stuck for a while now. Thanks in advance.
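For what it's worth, the error message itself names the usual fix: on Flutter 3.7+, a background isolate has no binary messenger until you hand it a RootIsolateToken captured on the root isolate and call BackgroundIsolateBinaryMessenger.ensureInitialized before touching any plugin, ffmpeg_kit included. A sketch under that assumption (the record passed to compute is illustrative, not part of your model classes):

```dart
import 'package:flutter/foundation.dart';
import 'package:flutter/services.dart';

Future<bool> computeShareMoments({required String directoryPath}) async {
  // capture the token on the root isolate, before spawning the worker
  final RootIsolateToken token = RootIsolateToken.instance!;
  final data = ComputeMomentModel(null, directoryPath);
  return compute((args) async {
    // runs inside the background isolate, before any FFmpegKit call
    BackgroundIsolateBinaryMessenger.ensureInitialized(args.$1);
    return shareMoments(args.$2);
  }, (token, data));
}
```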