Other articles (84)

  • Accepted formats

    28 January 2010, by

    The following commands give information about the formats and codecs supported by the local ffmpeg installation:
    ffmpeg -codecs
    ffmpeg -formats
    Accepted input video formats
    This list is not exhaustive; it highlights the main formats in use: h264: H.264 / AVC / MPEG-4 AVC / MPEG-4 part 10 m4v: raw MPEG-4 video format flv: Flash Video (FLV) / Sorenson Spark / Sorenson H.263 Theora wmv:
    Possible output video formats
    To begin with, we (...)
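    A minimal sketch of how the same codec inventory that "ffmpeg -codecs" prints can be queried programmatically through libavcodec; it assumes FFmpeg 4.0 or newer (for av_codec_iterate()) and the libavcodec development headers, and the file name is arbitrary:

    #include <stdio.h>
    #include <libavcodec/avcodec.h>

    /* List every codec known to the local libavcodec build, flagging
       decoders (D) and encoders (E) -- roughly what "ffmpeg -codecs" shows. */
    int main(void)
    {
        void *iter = NULL;
        const AVCodec *c;
        while ((c = av_codec_iterate(&iter)))
            printf("%c%c %-24s %s\n",
                   av_codec_is_decoder(c) ? 'D' : '.',
                   av_codec_is_encoder(c) ? 'E' : '.',
                   c->name,
                   c->long_name ? c->long_name : "");
        return 0;
    }

    Linking against libavcodec (for example via pkg-config --cflags --libs libavcodec) should be enough to build it.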

  • Permissions overridden by plugins

    27 April 2010, by

    MediaSPIP core
    autoriser_auteur_modifier() so that visitors are able to edit their own information on the authors page

  • Automated installation script of MediaSPIP

    25 April 2011, by

    To overcome the difficulties caused mainly by the installation of server-side software dependencies, an "all-in-one" installation script written in Bash was created to make this step easier on a server running a compatible Linux distribution.
    To use it you must have SSH access to your server and a root account, since the script installs the dependencies as root. Contact your provider if you do not have that.
    Documentation on using this installation script is available here.
    The code of this (...)

On other sites (5207)

  • Encoding images into a movie file

    5 April 2014, by RuAware

    I am trying to save JPEGs into a movie. I have tried jcodec, and although my S3 plays the result fine, other devices do not, including VLC and Windows Media Player.

    I have just spent most of the day playing with MediaCodec; the required SDK level is high, but it would at least help people on Jelly Bean and above. However, I cannot work out how to get the files into the encoder and then write out the movie file.

    Ideally I want to support down to SDK 9/8.

    Has anyone got any code they can share, either to get MediaCodec working or another option? If you say ffmpeg, I'd love to, but my JNI knowledge is non-existent and I would need a very good guide.

    Code for MediaCodec so far

    public class EncodeAndMux extends AsyncTask<Integer, Void, Boolean> {
       private static int bitRate = 2000000;
       private static int MAX_INPUT = 100000;
       private static String mimeType = "video/avc";

       private int frameRate = 15;    
       private int colorFormat;
       private int stride = 1;
       private int sliceHeight = 2;        

       private MediaCodec encoder = null;
       private MediaFormat inputFormat;
       private MediaCodecInfo codecInfo = null;
       private MediaMuxer muxer;
       private boolean mMuxerStarted = false;
       private int mTrackIndex = 0;  
       private long presentationTime = 0;
       private Paint bmpPaint;

       private static int WAITTIME = 10000;
       private static String TAG = "ENCODE";

       private ArrayList<String> mFilePaths;
       private String mPath;

       private EncodeListener mListener;
       private int width = 320;
       private int height = 240;
       private double mSpeed = 1;

       public EncodeAndMux(ArrayList<String> filePaths, String savePath) {
           mFilePaths = filePaths;
           mPath = savePath;  

           // Create paint to draw BMP
           bmpPaint = new Paint();
           bmpPaint.setAntiAlias(true);
           bmpPaint.setFilterBitmap(true);
           bmpPaint.setDither(true);
       }

       public void setListner(EncodeListener listener) {
           mListener = listener;
       }

       // set the speed, how many frames a second
       public void setSpead(int speed) {
           mSpeed = speed;
       }

       public double getSpeed() {
           return mSpeed;
       }

       private long computePresentationTime(int frameIndex) {
           final long ONE_SECOND = 1000000;
           return (long) (frameIndex * (ONE_SECOND / mSpeed));
       }

       public interface EncodeListener {
           public void finished();
           public void errored();
       }

       @TargetApi(Build.VERSION_CODES.JELLY_BEAN_MR2)
       @Override
       protected Boolean doInBackground(Integer... params) {

           try {
               muxer = new MediaMuxer(mPath, OutputFormat.MUXER_OUTPUT_MPEG_4);
           } catch (Exception e){
               e.printStackTrace();
           }

           // Find a codec that supports the MIME type
           int numCodecs = MediaCodecList.getCodecCount();
           for (int i = 0; i < numCodecs && codecInfo == null; i++) {
               MediaCodecInfo info = MediaCodecList.getCodecInfoAt(i);
               if (!info.isEncoder()) {
                   continue;
               }
               String[] types = info.getSupportedTypes();
               boolean found = false;

               for (int j = 0; j < types.length && !found; j++) {
                   if (types[j].equals(mimeType))
                       found = true;
               }

               if (!found)
                   continue;
               codecInfo = info;
           }


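             // Second pass: prefer an encoder that advertises AVC High profile at Level 4.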
             for (int i = 0; i < MediaCodecList.getCodecCount(); i++) {
                    MediaCodecInfo info = MediaCodecList.getCodecInfoAt(i);
                    if (!info.isEncoder()) {
                        continue;
                    }

                    String[] types = info.getSupportedTypes();
                     for (int j = 0; j < types.length; ++j) {
                         if (!types[j].equals(mimeType))
                             continue;
                         MediaCodecInfo.CodecCapabilities caps = info.getCapabilitiesForType(types[j]);
                         for (int k = 0; k < caps.profileLevels.length; k++) {
                             if (caps.profileLevels[k].profile == MediaCodecInfo.CodecProfileLevel.AVCProfileHigh && caps.profileLevels[k].level == MediaCodecInfo.CodecProfileLevel.AVCLevel4) {
                                codecInfo = info;
                            }
                        }
                    }
            }

           Log.d(TAG, "Found " + codecInfo.getName() + " supporting " + mimeType);

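           // Pick the first YUV 4:2:0 color format (planar or semi-planar) that this encoder supports.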
           MediaCodecInfo.CodecCapabilities capabilities = codecInfo.getCapabilitiesForType(mimeType);
           for (int i = 0; i < capabilities.colorFormats.length && colorFormat == 0; i++) {
               int format = capabilities.colorFormats[i];
               switch (format) {
                   case MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420Planar:
                   case MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420PackedPlanar:
                   case MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420SemiPlanar:
                   case MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420PackedSemiPlanar:
                   case MediaCodecInfo.CodecCapabilities.COLOR_TI_FormatYUV420PackedSemiPlanar:
                   colorFormat = format;
                   break;
               }
           }
           Log.d(TAG, "Using color format " + colorFormat);

           // Determine width, height and slice sizes
           if (codecInfo.getName().equals("OMX.TI.DUCATI1.VIDEO.H264E")) {
               // This codec doesn't support a width not a multiple of 16,
               // so round down.
               width &= ~15;
           }

           stride = width;
           sliceHeight = height;

           if (codecInfo.getName().startsWith("OMX.Nvidia.")) {
               stride = (stride + 15) / 16 * 16;
               sliceHeight = (sliceHeight + 15) / 16 * 16;
           }

           inputFormat = MediaFormat.createVideoFormat(mimeType, width, height);
           inputFormat.setInteger(MediaFormat.KEY_BIT_RATE, bitRate);
           inputFormat.setInteger(MediaFormat.KEY_FRAME_RATE, frameRate);
           inputFormat.setInteger(MediaFormat.KEY_COLOR_FORMAT, colorFormat);
           inputFormat.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 5);
    //          inputFormat.setInteger("stride", stride);
    //          inputFormat.setInteger("slice-height", sliceHeight);
           inputFormat.setInteger(MediaFormat.KEY_MAX_INPUT_SIZE, MAX_INPUT);

           encoder = MediaCodec.createByCodecName(codecInfo.getName());
           encoder.configure(inputFormat, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
           encoder.start();

           ByteBuffer[] inputBuffers = encoder.getInputBuffers();
           ByteBuffer[] outputBuffers = encoder.getOutputBuffers();

           int inputBufferIndex= -1, outputBufferIndex= -1;
           BufferInfo info = new BufferInfo();
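           // Feed each image to the encoder as one video frame, draining encoded output into the muxer as it becomes available.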
           for (int i = 0; i < mFilePaths.size(); i++) {

               // use decode sample to calculate inSample size and then resize
               Bitmap bitmapIn = Images.decodeSampledBitmapFromPath(mFilePaths.get(i), width, height);  

               // Create blank bitmap
               Bitmap bitmap = Bitmap.createBitmap(width, height, Config.ARGB_8888);                  

               // Center scaled image
               Canvas canvas = new Canvas(bitmap);                
               canvas.drawBitmap(bitmapIn,(bitmap.getWidth()/2)-(bitmapIn.getWidth()/2),(bitmap.getHeight()/2)-(bitmapIn.getHeight()/2), bmpPaint);

               Log.d(TAG, "Bitmap width: " + bitmapIn.getWidth() + " height: " + bitmapIn.getHeight() + " WIDTH: " + width + " HEIGHT: " + height);
               byte[] dat = getNV12(width, height, bitmap);
               bitmap.recycle();

               // Exception occurred on the line below in the emulator (line 182 of my file)
               inputBufferIndex = encoder.dequeueInputBuffer(WAITTIME);
               Log.i("DAT", "Size= "+dat.length);

               if(inputBufferIndex >= 0){
                   int samplesiz= dat.length;
                   inputBuffers[inputBufferIndex].put(dat);
                   presentationTime = computePresentationTime(i);
                   if (i == mFilePaths.size() - 1) {
                       encoder.queueInputBuffer(inputBufferIndex, 0, samplesiz, presentationTime, MediaCodec.BUFFER_FLAG_END_OF_STREAM);
                       Log.i(TAG, "Last Frame");
                   } else {
                       encoder.queueInputBuffer(inputBufferIndex, 0, samplesiz, presentationTime, 0);
                   }

                   while(true) {
                      outputBufferIndex = encoder.dequeueOutputBuffer(info, WAITTIME);
                      Log.i("BATA", "outputBufferIndex="+outputBufferIndex);
                      if (outputBufferIndex >= 0) {
                          ByteBuffer encodedData = outputBuffers[outputBufferIndex];
                          if (encodedData == null) {
                              throw new RuntimeException("encoderOutputBuffer " + outputBufferIndex +
                                      " was null");
                          }

                          if ((info.flags & MediaCodec.BUFFER_FLAG_CODEC_CONFIG) != 0) {
                              // The codec config data was pulled out and fed to the muxer when we got
                              // the INFO_OUTPUT_FORMAT_CHANGED status.  Ignore it.
                              Log.d(TAG, "ignoring BUFFER_FLAG_CODEC_CONFIG");
                              info.size = 0;
                          }

                          if (info.size != 0) {
                              if (!mMuxerStarted) {
                               throw new RuntimeException("muxer hasn't started");
                              }

                              // adjust the ByteBuffer values to match BufferInfo (not needed?)
                              encodedData.position(info.offset);
                              encodedData.limit(info.offset + info.size);

                              muxer.writeSampleData(mTrackIndex, encodedData, info);
                              Log.d(TAG, "sent " + info.size + " bytes to muxer");
                          }

                          encoder.releaseOutputBuffer(outputBufferIndex, false);

                          inputBuffers[inputBufferIndex].clear();
                          outputBuffers[outputBufferIndex].clear();

                          if ((info.flags & MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0) {
                              break;      // out of while
                          }

                      } else if (outputBufferIndex == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
                          // Subsequent data will conform to new format.
                          MediaFormat opmediaformat = encoder.getOutputFormat();
                          if (!mMuxerStarted) {
                              mTrackIndex = muxer.addTrack(opmediaformat);
                              muxer.start();
                              mMuxerStarted = true;
                          }
                          Log.i(TAG, "op_buf_format_changed: " + opmediaformat);
                      } else if(outputBufferIndex == MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED) {
                          outputBuffers = encoder.getOutputBuffers();
                          Log.d(TAG, "Output Buffer changed " + outputBuffers);
                      } else if(outputBufferIndex == MediaCodec.INFO_TRY_AGAIN_LATER) {
                          // No Data, break out
                          break;
                      } else {
                          // Unexpected State, ignore it
                          Log.d(TAG, "Unexpected State " + outputBufferIndex);
                      }
                   }

               }    
           }

           if (encoder != null) {
               encoder.flush();
               encoder.stop();
               encoder.release();
               encoder = null;
           }

           if (muxer != null) {
               muxer.stop();
               muxer.release();
               muxer = null;
           }

           return true;

       };

       @Override
       protected void onPostExecute(Boolean result) {
           if (result) {
               if (mListener != null)
                   mListener.finished();
           } else {
               if (mListener != null)
                   mListener.errored();
           }
           super.onPostExecute(result);
       }



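       // Convert an ARGB_8888 bitmap into YUV 4:2:0 semi-planar bytes (Y plane followed by interleaved chroma) for the encoder's input buffer.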
       byte [] getNV12(int inputWidth, int inputHeight, Bitmap scaled) {
           int [] argb = new int[inputWidth * inputHeight];
           scaled.getPixels(argb, 0, inputWidth, 0, 0, inputWidth, inputHeight);
           byte [] yuv = new byte[inputWidth*inputHeight*3/2];
           encodeYUV420SP(yuv, argb, inputWidth, inputHeight);
           scaled.recycle();
           return yuv;
       }


       void encodeYUV420SP(byte[] yuv420sp, int[] argb, int width, int height) {
           final int frameSize = width * height;
           int yIndex = 0;
           int uvIndex = frameSize;
           int a, R, G, B, Y, U, V;
           int index = 0;
           for (int j = 0; j < height; j++) {
               for (int i = 0; i < width; i++) {

                   a = (argb[index] & 0xff000000) >> 24; // a is not used obviously
                   R = (argb[index] & 0xff0000) >> 16;
                   G = (argb[index] & 0xff00) >> 8;
                   B = (argb[index] & 0xff) >> 0;

                   // well known RGB to YVU algorithm
                   Y = ( (  66 * R + 129 * G +  25 * B + 128) >> 8) +  16;
                   V = ( ( -38 * R -  74 * G + 112 * B + 128) >> 8) + 128;
                   U = ( ( 112 * R -  94 * G -  18 * B + 128) >> 8) + 128;

                   yuv420sp[yIndex++] = (byte) ((Y < 0) ? 0 : ((Y > 255) ? 255 : Y));
                   if (j % 2 == 0 && index % 2 == 0) {
                       yuv420sp[uvIndex++] = (byte)((V < 0) ? 0 : ((V > 255) ? 255 : V));
                       yuv420sp[uvIndex++] = (byte)((U < 0) ? 0 : ((U > 255) ? 255 : U));
                   }

                   index ++;
               }
           }
       }
    }

    This has now been tested on 4 of my devices and works fine. Is there a way to:

    1/ Calculate MAX_INPUT (too high and it crashes on the N7 II; I don't want that happening once released)
    2/ Offer an API 16 solution?
    3/ Do I need stride and slice height?

    Thanks

  • How can I find out what this ffmpeg error code means?

    3 March 2015, by Asik

    I'm using the function avcodec_decode_video2. On an encoding change in the stream, it returns -1094995529. The documentation only states:

    On error a negative value is returned, otherwise the number of bytes
    used or zero if no frame could be decompressed.

    But there doesn't seem to be an enum of return codes or any other form of documentation. What does the error mean and how can I determine that in general?
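    FFmpeg reports errors as negative integers (negated errno values or FourCC-style tags) rather than through a documented enum; av_strerror() from libavutil, or the av_err2str() convenience macro, turns such a code into a readable message. A minimal sketch, independent of the asker's code:

    #include <stdio.h>
    #include <libavutil/error.h>

    int main(void)
    {
        char msg[AV_ERROR_MAX_STRING_SIZE];
        int err = -1094995529;   /* value returned by avcodec_decode_video2 */

        /* av_strerror() fills msg and returns 0 when the code is known.
           -1094995529 is AVERROR_INVALIDDATA, i.e. "Invalid data found
           when processing input". */
        if (av_strerror(err, msg, sizeof(msg)) == 0)
            printf("error %d: %s\n", err, msg);
        else
            printf("error %d: unknown\n", err);
        return 0;
    }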

  • version string: add copyright line to version string

    11 June 2014, by Simon Thelen
    version string: add copyright line to version string
    

    Show the copyright when running `ffmpeg -version’. This is useful for
    end users trying to determine whether they are using FFmpeg or Libav.

    Signed-off-by: Simon Thelen <ffmpeg@c-14.de>

    • [DH] cmdutils.c