
Media (16)
-
#7 Ambience
16 October 2011
Updated: June 2015
Language: English
Type: Audio
-
#6 Teaser Music
16 October 2011
Updated: February 2013
Language: English
Type: Audio
-
#5 End Title
16 October 2011
Updated: February 2013
Language: English
Type: Audio
-
#3 The Safest Place
16 October 2011
Updated: February 2013
Language: English
Type: Audio
-
#4 Emo Creates
15 October 2011
Updated: February 2013
Language: English
Type: Audio
-
#2 Typewriter Dance
15 October 2011
Updated: February 2013
Language: English
Type: Audio
On other sites (1907)
-
How to use an ffmpeg command on Android
23 January 2015, by user2830969
I downloaded the static ffmpeg build from http://ffmpeg.gusari.org/static/ and ran this command:
./ffmpeg -i inputFile.mp4 -vf drawtext="fontsize=60:fontfile=/usr/share/fonts/truetype/freefont/FreeSerif.ttf:fontcolor=green:text=AAAA:x=(w-max_glyph_w)/2:y=h/2-ascent" outputFile.mp4
It works fine on my desktop.
I want to run this command on Android. I copied the ffmpeg binary into my Android app and tried to run the command from code, but it does not work. Here is the code that builds the command:

public ProcessRunnable create() {
    if (inputPath == null || outputPath == null) {
        throw new IllegalStateException("Need an input and output filepath!");
    }
    final List<String> cmd = new LinkedList<String>();
    cmd.add(mFfmpegPath);
    cmd.add("-i");
    cmd.add(inputPath);
    cmd.add("-vf");
    cmd.add("drawtext=\"fontsize=60:fontfile=/system/fonts/DroidSans.ttf:fontcolor=green:text=AAAA:x=(w-max_glyph_w)/2:y=h/2-ascent\"");
    cmd.add(outputPath);
    Log.w("Command", cmd.toString());
    final ProcessBuilder pb = new ProcessBuilder(cmd);
    return new ProcessRunnable(pb);
}
Please tell me how I can do that. Thanks so much.
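The ProcessRunnable class referenced above is not shown in the question. A minimal sketch of what it might look like (hypothetical, not the asker's actual class) would need to drain the process output, since ffmpeg writes its log to stderr and a full pipe buffer will block the child process:

import android.util.Log;

import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;

// Hypothetical sketch of the ProcessRunnable referenced above.
// ffmpeg logs to stderr; if that stream is never consumed, the pipe
// buffer fills up and the child process blocks.
public class ProcessRunnable implements Runnable {
    private final ProcessBuilder pb;

    public ProcessRunnable(ProcessBuilder pb) {
        this.pb = pb;
    }

    @Override
    public void run() {
        try {
            pb.redirectErrorStream(true); // merge stderr into stdout
            final Process process = pb.start();
            final BufferedReader reader = new BufferedReader(
                    new InputStreamReader(process.getInputStream()));
            String line;
            while ((line = reader.readLine()) != null) {
                Log.d("ffmpeg", line); // drain the output so ffmpeg never stalls
            }
            Log.d("ffmpeg", "exited with code " + process.waitFor());
        } catch (IOException | InterruptedException e) {
            Log.e("ffmpeg", "failed to run ffmpeg", e);
        }
    }
}

Note, too, that desktop static builds such as those from ffmpeg.gusari.org are x86 Linux binaries and will not execute on an ARM Android device; an Android-targeted ffmpeg build (with execute permission set) is required.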
-
Send image and audio data to FFmpeg via named pipes
19 April, by Nicke Manarin
I'm able to send frames one by one to FFmpeg via a named pipe to create a video out of them, but if I try sending audio through a second named pipe, FFmpeg accepts only one frame on the frame pipe and starts reading from the audio pipe soon after it.


ffmpeg.exe -loglevel debug -hwaccel auto 
-f:v rawvideo -r 25 -pix_fmt bgra -video_size 782x601 -i \\.\pipe\video_to_ffmpeg 
-f:a s16le -ac 2 -ar 48000 -i \\.\pipe\audio_to_ffmpeg 
-c:v libx264 -preset fast -pix_fmt yuv420p 
-vf "scale=trunc(iw/2)*2:trunc(ih/2)*2" -crf 23 -f:v mp4 -vsync vfr 
-c:a aac -b:a 128k -ar 48000 -ac 2 
-y "C:\Users\user\Desktop\video.mp4"



I start both pipes like so:


_imagePipeServer = new NamedPipeServerStream(ImagePipeName, PipeDirection.Out, 1, PipeTransmissionMode.Byte, PipeOptions.Asynchronous);
_imagePipeStreamWriter = new StreamWriter(_imagePipeServer);
_imagePipeServer.BeginWaitForConnection(null, null);

if (hasAudio)
{
    _audioPipeServer = new NamedPipeServerStream(AudioPipeName, PipeDirection.Out, 1, PipeTransmissionMode.Byte, PipeOptions.Asynchronous);
    _audioPipeStreamWriter = new StreamWriter(_audioPipeServer);
    _audioPipeServer.BeginWaitForConnection(null, null);
}
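(Side note, sketch only: BeginWaitForConnection here is fire-and-forget, so the first Write can race the client's connection. Waiting explicitly, using the same fields, would look like this:)

// Sketch only: wait until FFmpeg has actually connected to each pipe
// before any data is written, instead of fire-and-forget BeginWaitForConnection.
await _imagePipeServer.WaitForConnectionAsync();

if (hasAudio)
    await _audioPipeServer.WaitForConnectionAsync();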



And I send the data to the pipes using these methods:


public void EncodeFrame(nint bufferAddress, int height, int bufferStride)
{
    var frameSize = height * bufferStride;
    var frameBytes = new byte[frameSize];
    System.Runtime.InteropServices.Marshal.Copy(bufferAddress, frameBytes, 0, frameSize);

    if (_imagePipeServer?.IsConnected != true)
        throw new FFmpegException("Pipe not connected", Arguments, Output);

    _imagePipeStreamWriter?.BaseStream.Write(frameBytes, 0, frameBytes.Length);
}



public void EncodeAudio(ISampleProvider provider, long length)
{
    if (_audioPipeServer?.IsConnected != true)
        throw new FFmpegException("Pipe not connected", Arguments, Output);

    var buffer = new byte[provider.WaveFormat.AverageBytesPerSecond * length / TimeSpan.TicksPerSecond];
    var bytesRead = provider.ToWaveProvider().Read(buffer, 0, buffer.Length);

    if (bytesRead < 1)
        return;

    _audioPipeStreamWriter?.BaseStream.Write(buffer, 0, bytesRead);
    _audioPipeStreamWriter?.BaseStream.Flush();
}



Not sending the audio (and thus not creating the audio pipe) works: FFmpeg takes one frame at a time and creates the video normally.


But if I try sending the audio via a second pipe, I can only send one frame. This is the output when that happens (by the way, this is FFmpeg v7.1):


Splitting the commandline.
Reading option '-loglevel' ... matched as option 'loglevel' (set logging level) with argument 'debug'.
Reading option '-hwaccel' ... matched as option 'hwaccel' (use HW accelerated decoding) with argument 'auto'.
Reading option '-f:v' ... matched as option 'f' (force container format (auto-detected otherwise)) with argument 'rawvideo'.
Reading option '-r' ... matched as option 'r' (override input framerate/convert to given output framerate (Hz value, fraction or abbreviation)) with argument '25'.
Reading option '-pix_fmt' ... matched as option 'pix_fmt' (set pixel format) with argument 'bgra'.
Reading option '-video_size' ... matched as AVOption 'video_size' with argument '782x601'.
Reading option '-i' ... matched as input url with argument '\\.\pipe\video_to_ffmpeg'.
Reading option '-f:a' ... matched as option 'f' (force container format (auto-detected otherwise)) with argument 's16le'.
Reading option '-ac' ... matched as option 'ac' (set number of audio channels) with argument '2'.
Reading option '-ar' ... matched as option 'ar' (set audio sampling rate (in Hz)) with argument '48000'.
Reading option '-i' ... matched as input url with argument '\\.\pipe\audio_to_ffmpeg'.
Reading option '-c:v' ... matched as option 'c' (select encoder/decoder ('copy' to copy stream without reencoding)) with argument 'libx264'.
Reading option '-preset' ... matched as AVOption 'preset' with argument 'fast'.
Reading option '-pix_fmt' ... matched as option 'pix_fmt' (set pixel format) with argument 'yuv420p'.
Reading option '-vf' ... matched as option 'vf' (alias for -filter:v (apply filters to video streams)) with argument 'scale=trunc(iw/2)*2:trunc(ih/2)*2'.
Reading option '-crf' ... matched as AVOption 'crf' with argument '23'.
Reading option '-f:v' ... matched as option 'f' (force container format (auto-detected otherwise)) with argument 'mp4'.
Reading option '-fps_mode' ... matched as option 'fps_mode' (set framerate mode for matching video streams; overrides vsync) with argument 'vfr'.
Reading option '-c:a' ... matched as option 'c' (select encoder/decoder ('copy' to copy stream without reencoding)) with argument 'aac'.
Reading option '-b:a' ... matched as option 'b' (video bitrate (please use -b:v)) with argument '128k'.
Reading option '-ar' ... matched as option 'ar' (set audio sampling rate (in Hz)) with argument '48000'.
Reading option '-ac' ... matched as option 'ac' (set number of audio channels) with argument '2'.
Reading option '-y' ... matched as option 'y' (overwrite output files) with argument '1'.
Reading option 'C:\Users\user\Desktop\video.mp4' ... matched as output url.
Finished splitting the commandline.

Parsing a group of options: global.
Applying option loglevel (set logging level) with argument debug.
Applying option y (overwrite output files) with argument 1.
Successfully parsed a group of options.

Parsing a group of options: input url \\.\pipe\video_to_ffmpeg.
Applying option hwaccel (use HW accelerated decoding) with argument auto.
Applying option f:v (force container format (auto-detected otherwise)) with argument rawvideo.
Applying option r (override input framerate/convert to given output framerate (Hz value, fraction or abbreviation)) with argument 25.
Applying option pix_fmt (set pixel format) with argument bgra.
Successfully parsed a group of options.

Opening an input file: \\.\pipe\video_to_ffmpeg.
[rawvideo @ 000001c302ee08c0] Opening '\\.\pipe\video_to_ffmpeg' for reading
[file @ 000001c302ee1000] Setting default whitelist 'file,crypto,data'
[rawvideo @ 000001c302ee08c0] Before avformat_find_stream_info() pos: 0 bytes read:65536 seeks:0 nb_streams:1
[rawvideo @ 000001c302ee08c0] All info found
[rawvideo @ 000001c302ee08c0] After avformat_find_stream_info() pos: 1879928 bytes read:1879928 seeks:0 frames:1
Input #0, rawvideo, from '\\.\pipe\video_to_ffmpeg':
 Duration: N/A, start: 0.000000, bitrate: 375985 kb/s
 Stream #0:0, 1, 1/25: Video: rawvideo, 1 reference frame (BGRA / 0x41524742), bgra, 782x601, 0/1, 375985 kb/s, 25 tbr, 25 tbn
Successfully opened the file.

Parsing a group of options: input url \\.\pipe\audio_to_ffmpeg.
Applying option f:a (force container format (auto-detected otherwise)) with argument s16le.
Applying option ac (set number of audio channels) with argument 2.
Applying option ar (set audio sampling rate (in Hz)) with argument 48000.
Successfully parsed a group of options.

Opening an input file: \\.\pipe\audio_to_ffmpeg.
[s16le @ 000001c302ef5380] Opening '\\.\pipe\audio_to_ffmpeg' for reading
[file @ 000001c302ef58c0] Setting default whitelist 'file,crypto,data'



If I instead send one frame and then some bytes of audio (of arbitrary length, based on the fps), the only difference is this extra line at the end:


[s16le @ 0000025948c96d00] Before avformat_find_stream_info() pos: 0 bytes read:15360 seeks:0 nb_streams:1



Extra calls to EncodeFrame() hang forever at the BaseStream.Write(frameBytes, 0, frameBytes.Length) call, suggesting that FFmpeg is no longer reading the data.

Something is causing FFmpeg to close or stop reading the first pipe and only accept data from the second one.


Perhaps the command is missing something?
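One direction worth checking (a sketch under assumptions, not a verified fix): after probing, FFmpeg interleaves reads from its two inputs by timestamp, so it may block reading the audio pipe while the producer is itself blocked writing the next video frame. Feeding each pipe from its own task avoids that mutual wait; frames and audioChunks below are hypothetical stand-ins for the real capture sources:

// Sketch only: feed each pipe from an independent task so a blocking Write
// on one pipe cannot deadlock against FFmpeg reading from the other.
// 'frames' and 'audioChunks' are hypothetical stand-ins for the real sources.
var videoTask = Task.Run(() =>
{
    foreach (byte[] frame in frames)
        _imagePipeStreamWriter.BaseStream.Write(frame, 0, frame.Length);

    _imagePipeStreamWriter.BaseStream.Close(); // EOF marks the end of the video stream
});

var audioTask = Task.Run(() =>
{
    foreach (byte[] chunk in audioChunks)
        _audioPipeStreamWriter.BaseStream.Write(chunk, 0, chunk.Length);

    _audioPipeStreamWriter.BaseStream.Close();
});

Task.WaitAll(videoTask, audioTask);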


-
Real-time compression/encoding using ffmpeg in Objective-C
20 February 2014, by halfwaythru
I have a small application written in Objective-C that looks for the video devices on the machine and allows the user to record video. I need to be able to compress this video stream in real time. I do not want to save the whole video; I want to compress it as much as possible and only write out the compressed version.
I also don't want to use AVFoundation's built-in compression methods and need to use a third-party library like ffmpeg.
So far, I have been able to record the video and get individual frames using 'AVCaptureVideoDataOutputSampleBufferDelegate' in this method:
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
fromConnection:(AVCaptureConnection *)connection

So I have a stream of images, basically, and I want to throw them into ffmpeg (which is all set up on my machine). Do I need to call a terminal command to do this? And if I do, how do I use the image stack as the input to the ffmpeg command instead of a video file? Also, how do I combine all the little videos in the end?
Any help is appreciated. Thanks!
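A common pattern here (a sketch under assumptions, not a verified answer for this setup) is to launch ffmpeg as a child process with NSTask and stream the raw frames to its stdin, so only the compressed output ever touches disk and there are no little videos to combine. The ffmpeg path, frame size, frame rate, and pixel format below are assumptions that must match the actual capture output:

// Sketch only: launch ffmpeg and pipe raw BGRA frames to its stdin.
// The path, size, rate, and pixel format are assumed values.
NSTask *task = [[NSTask alloc] init];
task.launchPath = @"/usr/local/bin/ffmpeg"; // assumed install location
task.arguments = @[@"-f", @"rawvideo", @"-pix_fmt", @"bgra",
                   @"-video_size", @"1280x720", @"-framerate", @"30",
                   @"-i", @"-",                 // read frames from stdin
                   @"-c:v", @"libx264", @"-preset", @"veryfast",
                   @"-pix_fmt", @"yuv420p", @"out.mp4"];

NSPipe *stdinPipe = [NSPipe pipe];
task.standardInput = stdinPipe;
[task launch];

// Then, inside captureOutput:didOutputSampleBuffer:fromConnection:,
// copy each frame's bytes into the pipe:
// CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
// CVPixelBufferLockBaseAddress(imageBuffer, kCVPixelBufferLock_ReadOnly);
// NSData *frame = [NSData dataWithBytes:CVPixelBufferGetBaseAddress(imageBuffer)
//                                length:CVPixelBufferGetDataSize(imageBuffer)];
// CVPixelBufferUnlockBaseAddress(imageBuffer, kCVPixelBufferLock_ReadOnly);
// [stdinPipe.fileHandleForWriting writeData:frame];

Closing the write end of the pipe when capture stops signals EOF to ffmpeg, which then finalizes the file.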