
On other sites (1907)

  • How to run an ffmpeg command on Android

    23 January 2015, by user2830969

    I downloaded a static ffmpeg build from http://ffmpeg.gusari.org/static/ and ran this command:

    ./ffmpeg -i inputFile.mp4 -vf drawtext="fontsize=60:fontfile=/usr/share/fonts/truetype/freefont/FreeSerif.ttf:fontcolor=green:text=AAAA:x=(w-max_glyph_w)/2:y=h/2-ascent" outputFile.mp4

    It works fine on my desktop.
    I want to run the same command on Android. I copied the ffmpeg binary into my Android app and tried to run the command there, but it does not work.

    public ProcessRunnable create() {
        if (inputPath == null || outputPath == null) {
            throw new IllegalStateException("Need an input and output filepath!");
        }

        final List<String> cmd = new LinkedList<String>();

        cmd.add(mFfmpegPath);
        cmd.add("-i");
        cmd.add(inputPath);
        cmd.add("-vf");
        cmd.add("drawtext=\"fontsize=60:fontfile=/system/fonts/DroidSans.ttf:fontcolor=green:text=AAAA:x=(w-max_glyph_w)/2:y=h/2-ascent\"");
        cmd.add(outputPath);
        Log.w("Command", cmd.toString());
        final ProcessBuilder pb = new ProcessBuilder(cmd);
        return new ProcessRunnable(pb);
    }

    Please tell me how I can do that. Thanks so much.
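A ProcessBuilder-based launcher for a bundled ffmpeg binary can be sketched as follows (a hedged sketch, not the asker's exact app: the class name, paths, and helper split are illustrative). Three things commonly break this setup on Android: the shell-style quotes around the drawtext filter must be dropped, since ProcessBuilder passes each argument verbatim and the quotes become part of the filter string; the copied binary must be marked executable; and the process output must be drained, or ffmpeg can block on a full pipe buffer.

```java
import java.io.BufferedReader;
import java.io.File;
import java.io.IOException;
import java.io.InputStreamReader;
import java.util.ArrayList;
import java.util.List;

class FfmpegRunner {

    // Builds the same drawtext command; the font path is a device-dependent
    // assumption (DroidSans.ttf may not exist on every Android version).
    static List<String> buildCommand(String ffmpegPath, String inputPath, String outputPath) {
        List<String> cmd = new ArrayList<>();
        cmd.add(ffmpegPath);
        cmd.add("-i");
        cmd.add(inputPath);
        cmd.add("-vf");
        // No surrounding \" quotes: ProcessBuilder passes each element
        // verbatim, so shell quoting would corrupt the filter argument.
        cmd.add("drawtext=fontsize=60:fontfile=/system/fonts/DroidSans.ttf:"
                + "fontcolor=green:text=AAAA:x=(w-max_glyph_w)/2:y=h/2-ascent");
        cmd.add(outputPath);
        return cmd;
    }

    static int run(List<String> cmd) throws IOException, InterruptedException {
        // The binary copied into the app's files dir must be executable.
        new File(cmd.get(0)).setExecutable(true);
        ProcessBuilder pb = new ProcessBuilder(cmd).redirectErrorStream(true);
        Process process = pb.start();
        // Drain ffmpeg's output so it cannot block on a full pipe buffer.
        try (BufferedReader reader = new BufferedReader(
                new InputStreamReader(process.getInputStream()))) {
            String line;
            while ((line = reader.readLine()) != null) {
                System.out.println(line);
            }
        }
        return process.waitFor();
    }
}
```

On API 29+ the binary generally has to live in the app's native library directory to be executable at all, which is a further constraint this sketch does not address.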

  • Send image and audio data to FFmpeg via named pipes

    19 April, by Nicke Manarin

    I'm able to send frames one by one to FFmpeg via a named pipe to create a video out of them, but if I try sending audio through a second named pipe, FFmpeg accepts only one frame on the frame pipe and starts reading from the audio pipe soon after.

    ffmpeg.exe -loglevel debug -hwaccel auto
      -f:v rawvideo -r 25 -pix_fmt bgra -video_size 782x601 -i \\.\pipe\video_to_ffmpeg
      -f:a s16le -ac 2 -ar 48000 -i \\.\pipe\audio_to_ffmpeg
      -c:v libx264 -preset fast -pix_fmt yuv420p
      -vf "scale=trunc(iw/2)*2:trunc(ih/2)*2" -crf 23 -f:v mp4 -vsync vfr
      -c:a aac -b:a 128k -ar 48000 -ac 2
      -y "C:\Users\user\Desktop\video.mp4"

    I start both pipes like so:

    _imagePipeServer = new NamedPipeServerStream(ImagePipeName, PipeDirection.Out, 1, PipeTransmissionMode.Byte, PipeOptions.Asynchronous);
    _imagePipeStreamWriter = new StreamWriter(_imagePipeServer);
    _imagePipeServer.BeginWaitForConnection(null, null);

    if (hasAudio)
    {
        _audioPipeServer = new NamedPipeServerStream(AudioPipeName, PipeDirection.Out, 1, PipeTransmissionMode.Byte, PipeOptions.Asynchronous);
        _audioPipeStreamWriter = new StreamWriter(_audioPipeServer);
        _audioPipeServer.BeginWaitForConnection(null, null);
    }

    And send the data to the pipes using these methods:

    public void EncodeFrame(nint bufferAddress, int height, int bufferStride)
    {
        var frameSize = height * bufferStride;
        var frameBytes = new byte[frameSize];
        System.Runtime.InteropServices.Marshal.Copy(bufferAddress, frameBytes, 0, frameSize);

        if (_imagePipeServer?.IsConnected != true)
            throw new FFmpegException("Pipe not connected", Arguments, Output);

        _imagePipeStreamWriter?.BaseStream.Write(frameBytes, 0, frameBytes.Length);
    }

    public void EncodeAudio(ISampleProvider provider, long length)
    {
        if (_audioPipeServer?.IsConnected != true)
            throw new FFmpegException("Pipe not connected", Arguments, Output);

        var buffer = new byte[provider.WaveFormat.AverageBytesPerSecond * length / TimeSpan.TicksPerSecond];
        var bytesRead = provider.ToWaveProvider().Read(buffer, 0, buffer.Length);

        if (bytesRead < 1)
            return;

        _audioPipeStreamWriter?.BaseStream.Write(buffer, 0, bytesRead);
        _audioPipeStreamWriter?.BaseStream.Flush();
    }

    Not sending the audio (and thus not creating the audio pipe) works, with FFmpeg taking one frame at a time and creating the video normally.

    But if I try sending the audio via a secondary pipe, I can only send one frame. This is the output when that happens (by the way, this is FFmpeg v7.1):

    Splitting the commandline.
    Reading option '-loglevel' ... matched as option 'loglevel' (set logging level) with argument 'debug'.
    Reading option '-hwaccel' ... matched as option 'hwaccel' (use HW accelerated decoding) with argument 'auto'.
    Reading option '-f:v' ... matched as option 'f' (force container format (auto-detected otherwise)) with argument 'rawvideo'.
    Reading option '-r' ... matched as option 'r' (override input framerate/convert to given output framerate (Hz value, fraction or abbreviation)) with argument '25'.
    Reading option '-pix_fmt' ... matched as option 'pix_fmt' (set pixel format) with argument 'bgra'.
    Reading option '-video_size' ... matched as AVOption 'video_size' with argument '782x601'.
    Reading option '-i' ... matched as input url with argument '\\.\pipe\video_to_ffmpeg'.
    Reading option '-f:a' ... matched as option 'f' (force container format (auto-detected otherwise)) with argument 's16le'.
    Reading option '-ac' ... matched as option 'ac' (set number of audio channels) with argument '2'.
    Reading option '-ar' ... matched as option 'ar' (set audio sampling rate (in Hz)) with argument '48000'.
    Reading option '-i' ... matched as input url with argument '\\.\pipe\audio_to_ffmpeg'.
    Reading option '-c:v' ... matched as option 'c' (select encoder/decoder ('copy' to copy stream without reencoding)) with argument 'libx264'.
    Reading option '-preset' ... matched as AVOption 'preset' with argument 'fast'.
    Reading option '-pix_fmt' ... matched as option 'pix_fmt' (set pixel format) with argument 'yuv420p'.
    Reading option '-vf' ... matched as option 'vf' (alias for -filter:v (apply filters to video streams)) with argument 'scale=trunc(iw/2)*2:trunc(ih/2)*2'.
    Reading option '-crf' ... matched as AVOption 'crf' with argument '23'.
    Reading option '-f:v' ... matched as option 'f' (force container format (auto-detected otherwise)) with argument 'mp4'.
    Reading option '-fps_mode' ... matched as option 'fps_mode' (set framerate mode for matching video streams; overrides vsync) with argument 'vfr'.
    Reading option '-c:a' ... matched as option 'c' (select encoder/decoder ('copy' to copy stream without reencoding)) with argument 'aac'.
    Reading option '-b:a' ... matched as option 'b' (video bitrate (please use -b:v)) with argument '128k'.
    Reading option '-ar' ... matched as option 'ar' (set audio sampling rate (in Hz)) with argument '48000'.
    Reading option '-ac' ... matched as option 'ac' (set number of audio channels) with argument '2'.
    Reading option '-y' ... matched as option 'y' (overwrite output files) with argument '1'.
    Reading option 'C:\Users\user\Desktop\video.mp4' ... matched as output url.
    Finished splitting the commandline.

    Parsing a group of options: global.
    Applying option loglevel (set logging level) with argument debug.
    Applying option y (overwrite output files) with argument 1.
    Successfully parsed a group of options.

    Parsing a group of options: input url \\.\pipe\video_to_ffmpeg.
    Applying option hwaccel (use HW accelerated decoding) with argument auto.
    Applying option f:v (force container format (auto-detected otherwise)) with argument rawvideo.
    Applying option r (override input framerate/convert to given output framerate (Hz value, fraction or abbreviation)) with argument 25.
    Applying option pix_fmt (set pixel format) with argument bgra.
    Successfully parsed a group of options.

    Opening an input file: \\.\pipe\video_to_ffmpeg.
    [rawvideo @ 000001c302ee08c0] Opening '\\.\pipe\video_to_ffmpeg' for reading
    [file @ 000001c302ee1000] Setting default whitelist 'file,crypto,data'
    [rawvideo @ 000001c302ee08c0] Before avformat_find_stream_info() pos: 0 bytes read:65536 seeks:0 nb_streams:1
    [rawvideo @ 000001c302ee08c0] All info found
    [rawvideo @ 000001c302ee08c0] After avformat_find_stream_info() pos: 1879928 bytes read:1879928 seeks:0 frames:1
    Input #0, rawvideo, from '\\.\pipe\video_to_ffmpeg':
      Duration: N/A, start: 0.000000, bitrate: 375985 kb/s
      Stream #0:0, 1, 1/25: Video: rawvideo, 1 reference frame (BGRA / 0x41524742), bgra, 782x601, 0/1, 375985 kb/s, 25 tbr, 25 tbn
    Successfully opened the file.

    Parsing a group of options: input url \\.\pipe\audio_to_ffmpeg.
    Applying option f:a (force container format (auto-detected otherwise)) with argument s16le.
    Applying option ac (set number of audio channels) with argument 2.
    Applying option ar (set audio sampling rate (in Hz)) with argument 48000.
    Successfully parsed a group of options.

    Opening an input file: \\.\pipe\audio_to_ffmpeg.
    [s16le @ 000001c302ef5380] Opening '\\.\pipe\audio_to_ffmpeg' for reading
    [file @ 000001c302ef58c0] Setting default whitelist 'file,crypto,data'

    If instead I send one frame and then some bytes of audio (an arbitrary length based on the fps), the only difference is this extra line at the end:

    [s16le @ 0000025948c96d00] Before avformat_find_stream_info() pos: 0 bytes read:15360 seeks:0 nb_streams:1

    Extra calls to EncodeFrame() hang forever at the BaseStream.Write(frameBytes, 0, frameBytes.Length) call, suggesting that FFmpeg is no longer reading the data.

    Something is causing FFmpeg to close or stop reading the first pipe and only accept data from the second one.

    Perhaps the command is missing something?
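One reading of the debug log above: ffmpeg opens its inputs sequentially, and while probing the video pipe it consumed exactly 1,879,928 bytes, which is one 782×601 BGRA frame (782 × 601 × 4); it then blocked inside avformat_find_stream_info() on the audio pipe. If both pipes are written from the same thread, the next video write blocks on a pipe ffmpeg is not currently reading, and the audio ffmpeg is waiting for never arrives: a mutual wait. A common remedy is to feed each pipe from its own thread. A minimal sketch of that pattern in Java (ordinary OutputStreams stand in for the named pipe streams; the class and names are illustrative, not the asker's API):

```java
import java.io.IOException;
import java.io.OutputStream;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;

// Each pipe gets its own writer thread, so a write that blocks on one pipe
// (e.g. while ffmpeg is still probing the other input) cannot stall the other.
class PipeFeeder implements Runnable {
    private final OutputStream pipe; // stand-in for the named-pipe stream
    private final BlockingQueue<byte[]> queue = new LinkedBlockingQueue<>();

    PipeFeeder(OutputStream pipe) { this.pipe = pipe; }

    // Called by the producer (frame/audio capture); never blocks on ffmpeg.
    void enqueue(byte[] chunk) { queue.add(chunk); }

    @Override
    public void run() {
        try {
            byte[] chunk;
            // An empty array acts as a poison pill that stops the feeder.
            while ((chunk = queue.take()).length > 0) {
                pipe.write(chunk); // may block; only this thread waits
                pipe.flush();
            }
            pipe.close(); // EOF lets ffmpeg finish the stream
        } catch (IOException | InterruptedException e) {
            Thread.currentThread().interrupt();
        }
    }
}
```

With one feeder per pipe, EncodeFrame/EncodeAudio just enqueue their bytes and return immediately, so neither stream can stall the other regardless of the order in which ffmpeg probes its inputs.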

  • Real-time compression/encoding using ffmpeg in Objective-C

    20 February 2014, by halfwaythru

    I have a small application written in Objective-c that looks for the video devices on the machine and allows the user to record video. I need to be able to compress this video stream in real time. I do not want to save the whole video, I want to compress it as much as possible and only write out the compressed version.

    I also don't want to use AVFoundation's built-in compression methods; I need to use a third-party library like ffmpeg.

    So far, I have been able to record video and get individual frames using 'AVCaptureVideoDataOutputSampleBufferDelegate' in this method:

    - (void)captureOutput:(AVCaptureOutput *)captureOutput
      didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
      fromConnection:(AVCaptureConnection *)connection

    So I basically have a stream of images, and I want to feed them into ffmpeg (which is all set up on my machine). Do I need to call a terminal command to do this? And if so, how do I use the image stack as my input to the ffmpeg command instead of a video file? Also, how do I combine all the little videos in the end?

    Any help is appreciated. Thanks !
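Rather than shelling out once per clip and stitching "little videos" together afterwards, the usual pattern is to spawn a single long-lived ffmpeg process that reads raw frames from stdin (-f rawvideo -i -) and write each captured pixel buffer to that pipe. A hedged sketch of the pattern, in Java rather than Objective-C for brevity (the frame size, pixel format, and rate are assumptions that must match the capture session, and ffmpeg is assumed to be on the PATH):

```java
import java.io.IOException;
import java.io.OutputStream;

class StdinEncoder {

    // Builds the ffmpeg argument list for reading raw BGRA frames on stdin.
    // Width, height, and fps must match the frames the capture delegate emits.
    static String[] ffmpegArgs(int w, int h, int fps, String outPath) {
        return new String[] {
            "ffmpeg",
            "-f", "rawvideo", "-pix_fmt", "bgra",
            "-s", w + "x" + h, "-r", String.valueOf(fps),
            "-i", "-",                      // read frames from stdin
            "-c:v", "libx264", "-pix_fmt", "yuv420p",
            outPath
        };
    }

    public static void main(String[] args) throws IOException, InterruptedException {
        Process ffmpeg = new ProcessBuilder(ffmpegArgs(640, 480, 30, "out.mp4"))
                .redirectErrorStream(true)
                .start();
        OutputStream stdin = ffmpeg.getOutputStream();
        // In the real app, each capture callback supplies one frame's bytes;
        // here a zeroed buffer stands in for the pixel data.
        byte[] frame = new byte[640 * 480 * 4];
        for (int i = 0; i < 30; i++) {
            stdin.write(frame);             // stream each frame as it arrives
        }
        stdin.close();                      // EOF lets ffmpeg finalize the file
        ffmpeg.waitFor();
    }
}
```

In the Objective-C app the equivalent would be an NSTask (or posix_spawn) whose standardInput is a pipe: each captureOutput:didOutputSampleBuffer: callback writes the locked pixel buffer's bytes to it, and closing the pipe finalizes the single output file, so there is nothing to combine at the end.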