
Media (1)

Keyword: Tags / lev manovitch

Other articles (20)

  • Multilang: improving the interface for multilingual blocks

    18 February 2011

    Multilang is an additional plugin that is not enabled by default when MediaSPIP is initialised.
    Once it is activated, MediaSPIP init automatically sets up a preconfiguration so that the new feature is immediately operational. No configuration step is therefore required.

  • List of compatible distributions

    26 April 2011

    The table below lists the Linux distributions compatible with the automated installation script of MediaSPIP.

    Distribution name | Version name         | Version number
    Debian            | Squeeze              | 6.x.x
    Debian            | Wheezy               | 7.x.x
    Debian            | Jessie               | 8.x.x
    Ubuntu            | The Precise Pangolin | 12.04 LTS
    Ubuntu            | The Trusty Tahr      | 14.04

    If you want to help us improve this list, you can give us access to a machine whose distribution is not mentioned above or send the necessary fixes to add (...)

  • Selection of projects using MediaSPIP

    2 May 2011

    The examples below are representative of how MediaSPIP is used in specific projects.
    MediaSPIP farm @ Infini
    The non-profit organization Infini develops hospitality activities: an internet access point, training, innovative projects in the field of information and communication technologies, and website hosting. It plays a unique and prominent role in the Brest (France) area and at the national level, among the half-dozen such associations. Its members (...)

On other sites (7206)

  • Error while opening encoder for output stream #0:0 when creating a video from image, GIF and music

    29 February 2020, by brijesh

     I am trying to create a video from an image, a GIF and music.

     Here is the code I used:

    {"-y", "-i", imagepath, "-ignore_loop", "0", "-i", gif, "-filter_complex", "[1:v]scale=" + filterdBitmap.getWidth() + ":" + filterdBitmap.getHeight() + "[ovrl];[0:v][ovrl]overlay=0:0", "-ss", "" + startMs / 1000, "-t", "" + endMs / 1000, "-i", songpath, "-c:v", "libx264", "-preset", "ultrafast", "-r", "30", "-pix_fmt", "yuva420p", "-c:a", "aac", "-shortest", outputLocation.getPath()};

     The error I received was this:

    Error while opening encoder for output stream #0:0 - maybe incorrect parameters such as bit_rate, rate, width or height

     Specifically, here is the complete output:

    2020-02-29 10:16:01.043 14913-14913/com.photocreator E/fail: ffmpeg version n3.0.1 Copyright (c) 2000-2016 the FFmpeg developers
         built with gcc 4.8 (GCC)
         configuration: --target-os=linux --cross-prefix=/home/vagrant/SourceCode/ffmpeg-android/toolchain-android/bin/arm-linux-androideabi- --arch=arm --cpu=cortex-a8 --enable-runtime-cpudetect --sysroot=/home/vagrant/SourceCode/ffmpeg-android/toolchain-android/sysroot --enable-pic --enable-libx264 --enable-libass --enable-libfreetype --enable-libfribidi --enable-libmp3lame --enable-fontconfig --enable-pthreads --disable-debug --disable-ffserver --enable-version3 --enable-hardcoded-tables --disable-ffplay --disable-ffprobe --enable-gpl --enable-yasm --disable-doc --disable-shared --enable-static --pkg-config=/home/vagrant/SourceCode/ffmpeg-android/ffmpeg-pkg-config --prefix=/home/vagrant/SourceCode/ffmpeg-android/build/armeabi-v7a --extra-cflags='-I/home/vagrant/SourceCode/ffmpeg-android/toolchain-android/include -U_FORTIFY_SOURCE -D_FORTIFY_SOURCE=2 -fno-strict-overflow -fstack-protector-all' --extra-ldflags='-L/home/vagrant/SourceCode/ffmpeg-android/toolchain-android/lib -Wl,-z,relro -Wl,-z,now -pie' --extra-libs='-lpng -lexpat -lm' --extra-cxxflags=
         libavutil      55. 17.103 / 55. 17.103
         libavcodec     57. 24.102 / 57. 24.102
         libavformat    57. 25.100 / 57. 25.100
         libavdevice    57.  0.101 / 57.  0.101
         libavfilter     6. 31.100 /  6. 31.100
         libswscale      4.  0.100 /  4.  0.100
         libswresample   2.  0.101 /  2.  0.101
         libpostproc    54.  0.100 / 54.  0.100
       Input #0, png_pipe, from 'file:///storage/emulated/0/1582951553006.jpg':
         Duration: N/A, bitrate: N/A
           Stream #0:0: Video: png, rgba(pc), 639x812, 25 tbr, 25 tbn, 25 tbc
       Input #1, gif, from 'http://13.232.145.224:3003/getpath/video_maker/new/35.gif':
         Duration: N/A, bitrate: N/A
           Stream #1:0: Video: gif, bgra, 288x480, 15 fps, 15 tbr, 100 tbn, 100 tbc
       [mp3 @ 0xaea97200] Skipping 0 bytes of junk at 253.
       Input #2, mp3, from '/storage/emulated/0/Download/supnaringtone-49332.mp3':
         Metadata:
           encoder         : Lavf58.20.100
         Duration: 00:00:21.76, start: 0.025057, bitrate: 64 kb/s
           Stream #2:0: Audio: mp3, 44100 Hz, stereo, s16p, 64 kb/s
           Metadata:
             encoder         : Lavc58.35
       Incompatible pixel format 'yuva420p' for codec 'libx264', auto-selecting format 'yuv420p'
       [libx264 @ 0xaeacfc00] width not divisible by 2 (639x812)
       Output #0, mp4, to '/storage/emulated/0/allkotlin/video/movie_1582951554388.mp4':
           Stream #0:0: Video: h264, none, q=2-31, 128 kb/s, 30 fps (default)
           Metadata:
             encoder         : Lavc57.24.102 libx264
           Stream #0:1: Audio: aac, 0 channels, 128 kb/s
           Metadata:
             encoder         : Lavc57.24.102 aac
       Stream mapping:
         Stream #0:0 (png) -> overlay:main (graph 0)
         Stream #1:0 (gif) -> scale (graph 0)
         overlay (graph 0) -> Stream #0:0 (libx264)
         Stream #2:0 -> #0:1 (mp3 (native) -> aac (native))
       Error while opening encoder for output stream #0:0 - maybe incorrect parameters such as bit_rate, rate, width or height

     If I use -s 560x560, it works, but then I can't keep my aspect ratio.
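
     Not part of the original post: since the log shows libx264 rejecting the odd 639-pixel width and auto-replacing yuva420p, one hedged workaround that keeps the aspect ratio is to round the composited frame down to even dimensions with a scale filter and to request yuv420p directly (variable names reused from the command above; treat this as a sketch to verify):

     // Sketch only: same command as above, but the overlay result is scaled to even
     // width/height (libx264 requires both to be divisible by 2) and yuv420p is
     // requested, since libx264 cannot encode the alpha plane implied by yuva420p.
     String[] complexCommand = {"-y", "-i", imagepath,
             "-ignore_loop", "0", "-i", gif,
             "-filter_complex",
             "[1:v]scale=" + filterdBitmap.getWidth() + ":" + filterdBitmap.getHeight() + "[ovrl];"
                     + "[0:v][ovrl]overlay=0:0,scale=trunc(iw/2)*2:trunc(ih/2)*2",
             "-ss", "" + startMs / 1000, "-t", "" + endMs / 1000, "-i", songpath,
             "-c:v", "libx264", "-preset", "ultrafast", "-r", "30",
             "-pix_fmt", "yuv420p",
             "-c:a", "aac", "-shortest", outputLocation.getPath()};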

     And here is my code:

    public class PhotoEditing extends AppCompatActivity implements GetGifAdapter.GlideInterface, SdCardSongAdapter.MediaInterface
           , FiltersListFragmentListener, EmojiFragmentListener, AddTextFragmentListener {

       public static final String FILE_PROVIDER_AUTHORITY = "com.burhanrashid52.photoeditor.fileprovider";
       @Nullable
       @VisibleForTesting
       Uri mSaveImageUri;
       PhotoEditorView image_preview;
       ImageView image_gif/*,image_preview*/;
       ImageView save, back;
       LinearLayout linearLayout;
       public String sessionId, sessionId1;
       Uri image_selected_uri;

       public Bitmap originalBitmap, filterdBitmap, finalBitmap;
       LinearLayout btn_music_list, btn_music_cut, btn_add_gif, btn_filters_list, btn_emoji, btn_add_text;
       MediaPlayer mediaPlayer;
       String mediaData;
       LinearLayout relativeLayout;
       RelativeLayout seekbar_layout, fm;
       RelativeLayout.LayoutParams layoutparam;
       RangeSeekBar rangeSeekBar;
       Runnable r;
       Handler mHandler;
       private int duration;
       private TextView tvLeft, tvRight;
       RelativeLayout rl_replace, music_fragment;
       FilterListFragment filterListFragment;
       EmojiFragment emojiFragment;
       PhotoEditor photoEditor;
       int screenWidth, screenHeight;
       Bitmap bitmap;
       int brightnessFinal = 0;
       int saturationFinal = 0;
       int constrantFinal = 0;
       int hue = 0;
       String glideData;
       FFmpeg ffmpeg;
       String s;
       String imageHeight;
       String imageWidth ;


       private static final String TAG = "BRIJESH";

       Context context = this;

       static {
           System.loadLibrary("NativeImageProcessor");
       }

       public Bitmap resizeImageToNewSize(Bitmap bitmap, int i, int i2) {
           try {
               int width = bitmap.getWidth();
               int height = bitmap.getHeight();
               float f = (float) i;
               float f2 = (float) i2;
               if (!(height == i2 && width == i)) {
                   float f3 = (float) width;
                   float f4 = f / f3;
                   float f5 = (float) height;
                   float f6 = f2 / f5;
                   if (f4 < f6) {
                       f6 = f4;
                   }
                   f = f3 * f6;
                   f2 = f5 * f6;
               }
               Bitmap bitmap1 = Bitmap.createScaledBitmap(bitmap, (int) f, (int) f2, true);
               fm.removeView(image_preview);
               RelativeLayout.LayoutParams params = new RelativeLayout.LayoutParams(bitmap1.getWidth(), bitmap1.getHeight());

               params.addRule(RelativeLayout.CENTER_HORIZONTAL);
               image_preview.setLayoutParams(params);

               fm.addView(image_preview);
               return bitmap1;
           } catch (Exception unused) {
               fm.removeView(image_preview);
               RelativeLayout.LayoutParams params = new RelativeLayout.LayoutParams(200, 100);

               params.addRule(RelativeLayout.CENTER_IN_PARENT);
               image_preview.setLayoutParams(params);
               fm.addView(image_preview);

               return bitmap;
           }
       }

       @Override
       protected void onCreate(Bundle savedInstanceState) {
           super.onCreate(savedInstanceState);
           setContentView(R.layout.activity_photo_editing);

           initialize(this);
           image_preview = findViewById(R.id.image_preview);
           image_preview.getSource().setScaleType(ImageView.ScaleType.FIT_XY);
           photoEditor = new PhotoEditor.Builder(this, image_preview)
                   .setPinchTextScalable(true)
                   .build();
           image_gif = findViewById(R.id.image_gif);
           linearLayout = findViewById(R.id.linearLayout);
           btn_music_list = findViewById(R.id.btn_music_list);
    //        btn_music_cut = findViewById(R.id.btn_music_cut);
           btn_add_gif = findViewById(R.id.btn_add_gif);
           btn_filters_list = findViewById(R.id.btn_filters_list);
           btn_emoji = findViewById(R.id.btn_emoji);
           btn_add_text = findViewById(R.id.btn_add_text);
           fm = findViewById(R.id.frame);
           rl_replace = findViewById(R.id.replace_fragment);
           music_fragment = findViewById(R.id.music_fragment);
           seekbar_layout = findViewById(R.id.seekbar_layout);
           rangeSeekBar = findViewById(R.id.rangeSeekBar);
           rangeSeekBar.setNotifyWhileDragging(true);
           mHandler = new Handler();
           tvLeft = findViewById(R.id.tvLeft);
           tvRight = findViewById(R.id.tvRight);
           relativeLayout = findViewById(R.id.relativeLayout);
           save = findViewById(R.id.btndone);
           back = findViewById(R.id.btnhome);
           mediaPlayer = new MediaPlayer();
           back.setOnClickListener(new View.OnClickListener() {
               @Override
               public void onClick(View v) {
                   if (mediaPlayer.isPlaying()) {
                       mediaPlayer.stop();
                       onBackPressed();
                   } else onBackPressed();
               }
           });

           sessionId1 = getIntent().getStringExtra("gallary");
           sessionId = getPathFromUri(PhotoEditing.this, Uri.parse(sessionId1));
           image_selected_uri = Uri.parse(sessionId);
           loadImage();

           btn_music_list.setOnClickListener(new View.OnClickListener() {
               @Override
               public void onClick(View v) {
    //                if (seekbar_layout.getVisibility() == View.GONE){
                   if (mediaPlayer != null && mediaPlayer.isPlaying()) {
                       mediaPlayer.stop();
                   }
                   MusicListFragment musicListFragment = new MusicListFragment();
                   FragmentManager fragmentManager = getSupportFragmentManager();
                   FragmentTransaction transaction = fragmentManager.beginTransaction();
                   transaction.replace(R.id.replace_fragment, musicListFragment);
                   transaction.addToBackStack(null);
                   transaction.commit();
                   seekbar_layout.setVisibility(View.VISIBLE);

    //                }
    //                else if (seekbar_layout.getVisibility() == View.VISIBLE) {
    //                    seekbar_layout.setVisibility(View.GONE);
    //                }
               }
           });

    //        btn_music_cut.setOnClickListener(new View.OnClickListener() {
    //            @Override
    //            public void onClick(View v) {
    //                if (seekbar_layout.getVisibility() == View.GONE) {
    //                    if (mediaPlayer != null && mediaPlayer.isPlaying()) {
    //                        seekbar_layout.setVisibility(View.VISIBLE);
    //                    }
    //                } else if (seekbar_layout.getVisibility() == View.VISIBLE) {
    //                    seekbar_layout.setVisibility(View.GONE);
    //                }
    //            }
    //        });

           btn_add_gif.setOnClickListener(new View.OnClickListener() {
               @Override
               public void onClick(View v) {
                   GIfFragment gIfFragment = new GIfFragment();
                   FragmentManager fragmentManager = getSupportFragmentManager();
                   FragmentTransaction transaction = fragmentManager.beginTransaction();
                   transaction.replace(R.id.replace_fragment, gIfFragment);
                   transaction.addToBackStack(null);
                   transaction.commit();
               }
           });



           save.setOnClickListener(new View.OnClickListener() {
               @Override
               public void onClick(View v) {
    //                image_preview.getSource().setDrawingCacheEnabled(true);
    //                Bitmap b = image_preview.getSource().getDrawingCache();
    //                MediaStore.Images.Media.insertImage(context.getContentResolver(), b,"", "");
                   saveImageToGallery();

               }
           });
       }

       private void loadImage() {
           bitmap = BitmapFactory.decodeFile(image_selected_uri.toString());
           originalBitmap = bitmap.copy(Bitmap.Config.ARGB_8888, true);
           originalBitmap = modifyOrientation(originalBitmap, sessionId);
           finalBitmap = originalBitmap.copy(Bitmap.Config.ARGB_8888, true);
           filterdBitmap = originalBitmap.copy(Bitmap.Config.ARGB_8888, true);

           DisplayMetrics displayMetrics = new DisplayMetrics();
           getWindowManager().getDefaultDisplay().getMetrics(displayMetrics);
           int f3672y = displayMetrics.widthPixels;
           int f3673z = displayMetrics.heightPixels;
           float f = getResources().getDisplayMetrics().density;
           int i = f3672y - ((int) (40.0f * f));
           int i2 = f3673z - ((int) (f * 100.0f));
           image_preview.getSource().setImageBitmap(resizeImageToNewSize(bitmap, i, i2));
       }

       private void getDropboxIMGSize(Uri uri){

           BitmapFactory.Options options = new BitmapFactory.Options();
           options.inJustDecodeBounds = true;
           BitmapFactory.decodeFile(new File(uri.getPath()).getAbsolutePath(), options);
           imageHeight= String.valueOf(options.outHeight);
           imageWidth = String.valueOf(options.outWidth);

       }

       @SuppressLint("MissingPermission")
       private void saveImageToGallery() {
           File file = new File(Environment.getExternalStorageDirectory()
                   + File.separator + ""
                   + System.currentTimeMillis() + ".jpg");
           try {
               file.createNewFile();

               SaveSettings saveSettings = new SaveSettings.Builder()
                       .setClearViewsEnabled(true)
                       .setTransparencyEnabled(true)
                       .build();

               photoEditor.saveAsFile(file.getAbsolutePath(), saveSettings, new PhotoEditor.OnSaveListener() {
                   @Override
                   public void onSuccess(@NonNull String imagePath) {
                       Toast.makeText(context, "Image Saved", Toast.LENGTH_SHORT).show();

                       mSaveImageUri = Uri.fromFile(new File(imagePath));
                       getDropboxIMGSize(mSaveImageUri);
                       executeCmd(String.valueOf(mSaveImageUri), mediaData, glideData, rangeSeekBar.getSelectedMinValue().intValue() * 1000, rangeSeekBar.getSelectedMaxValue().intValue() * 1000);

                   }

                   @Override
                   public void onFailure(@NonNull Exception exception) {
                   }
               });
           } catch (IOException e) {
               e.printStackTrace();
           }
       }


       private void executeCmd(String imagepath, String songpath, String gif, int startMs, int endMs) {
           File outputLocation = getConvertedFile(outputPath() + "video", "movie_" + System.currentTimeMillis() + ".mp4");
           Log.e("videofilepath", songpath);
           String[] complexCommand = {"-y", "-i", imagepath,
                   "-ignore_loop", "0",
                   "-i", gif, "-filter_complex", "[1:v]scale=w='bitand(iw,65534)':h='bitand(ih,65534)' [ovrl];[0:v][ovrl]overlay=0:0",
                   "-ss", "" + startMs / 1000, "-t", "" + endMs / 1000, "-i", songpath,
                   "-c:v", "libx264", "-preset", "ultrafast", "-r", "30", "-pix_fmt", "yuva420p", "-c:a", "aac", "-shortest", outputLocation.getPath()};

           try {
               ffmpeg.execute(complexCommand, new ExecuteBinaryResponseHandler() {

                   @Override
                   public void onSuccess(String s) {
                       Log.e("onSuccess", s);
                   }

                   @Override
                   public void onFailure(String s) {
                       Log.e("fail", s);
                   }
               });
           } catch (FFmpegCommandAlreadyRunningException e) {
               Log.e("catch", e.getMessage());
           }
       }
    }

     Any idea what is going on here? Thanks!

  • When I used ffmpeg to trim a video, I observed an error [closed]

    6 March 2020, by israfilll

     I want to use ffmpeg to trim a video, but when I build the app I see this: I/System.out: ssssCANNOT LINK EXECUTABLE "/data/user/0/com.example.newapplication/files/ffmpeg": "/data/data/com.example.newapplication/files/ffmpeg" has text relocations (https://android.googlesource.com/platform/bionic/+/master/android-changes-for-ndk-developers.md#Text-Relocations-Enforced-for-API-level-23)

     Why can't I use this library? My code is below, and a possible workaround is sketched after it.

    public class MainActivity extends AppCompatActivity {
    String outPutFile;
    Uri filePath, filePathSecond, photoUri, AudioUri;
    ProgressDialog progressDialog;
    private int REQUEST_TAKE_GALLERY_VIDEO = 110;
    private int REQUEST_TAKE_GALLERY_VIDEO_2 = 115;
    private FFmpeg ffmpeg;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
       super.onCreate(savedInstanceState);
       setContentView(R.layout.activity_main);
       if(ContextCompat.checkSelfPermission(this, Manifest.permission.READ_EXTERNAL_STORAGE)
               != PackageManager.PERMISSION_GRANTED){
           ActivityCompat.requestPermissions(this,new String[]{Manifest.permission.READ_EXTERNAL_STORAGE},1);


       }
       progressDialog = new ProgressDialog(this);
       findViewById(R.id.selectVideo).setOnClickListener(new View.OnClickListener() {
           @Override
           public void onClick(View v) {
               Intent intent = new Intent();
               intent.setType("video/*");
               intent.setAction(Intent.ACTION_GET_CONTENT);
               startActivityForResult(Intent.createChooser(intent, "Select Video"), REQUEST_TAKE_GALLERY_VIDEO);
           }
       });
       findViewById(R.id.start).setOnClickListener(new View.OnClickListener() {
           @Override
           public void onClick(View v) {
               if (filePath != null) {
                   executeCutVideoCommand(1000, 4 * 1000, filePath);
               }
           }
       });



       findViewById(R.id.marge_video).setOnClickListener(new View.OnClickListener() {
           @Override
           public void onClick(View v) {
               if (filePath != null && filePathSecond != null)
                   executeMargeVideoCommand(filePath, filePathSecond);
           }
       });
       findViewById(R.id.selectVideo_2).setOnClickListener(new View.OnClickListener() {
           @Override
           public void onClick(View v) {
               Intent intent = new Intent();
               intent.setType("video/*");
               intent.setAction(Intent.ACTION_GET_CONTENT);
               startActivityForResult(Intent.createChooser(intent, "Select Video"), REQUEST_TAKE_GALLERY_VIDEO_2);
           }
       });

       findViewById(R.id.filter_video).setOnClickListener(new View.OnClickListener() {
           @Override
           public void onClick(View v) {
               if(filePath!=null){
                   executeFilterCommand(filePath);
               }
           }
       });
       loadFFMpegBinary();
    }


    @Override
    protected void onActivityResult(int requestCode, int resultCode, @Nullable Intent data) {
       super.onActivityResult(requestCode, resultCode, data);
       if (requestCode == REQUEST_TAKE_GALLERY_VIDEO) {
           Uri uri = data.getData();
           if (uri != null) {
               filePath = uri;
           }
       }
       if (requestCode == REQUEST_TAKE_GALLERY_VIDEO_2) {
           Uri uri = data.getData();
           if (uri != null) {
               filePathSecond = uri;
           }
       }


    }



    // video trim
    private void executeCutVideoCommand(int startMs, int endMs, Uri filePath) {
       File moviesDir = Environment.getExternalStoragePublicDirectory(
               Environment.DIRECTORY_MOVIES
       );
       String filePrefix = "cut_video";
       String fileExtn = ".mp4";
       String yourRealPath = getPath(MainActivity.this, filePath);
       File dest = new File(moviesDir, filePrefix + fileExtn);
       int fileNo = 0;
       while (dest.exists()) {
           fileNo++;
           dest = new File(moviesDir, filePrefix + fileNo + fileExtn);
       }
      /* Log.d(TAG, "startTrim: src: " + yourRealPath);
       Log.d(TAG, "startTrim: dest: " + dest.getAbsolutePath());
       Log.d(TAG, "startTrim: startMs: " + startMs);
       Log.d(TAG, "startTrim: endMs: " + endMs);*/
       outPutFile = dest.getAbsolutePath();
       //String[] complexCommand = {"-i", yourRealPath, "-ss", "" + startMs / 1000, "-t", "" + endMs / 1000, dest.getAbsolutePath()};
       String[] complexCommand = {"-ss",
               "" + (startMs / 1000),
               "-y",
               "-i",
               yourRealPath,
               "-t",
               "" + ((endMs - startMs) / 1000),
               "-vcodec",
               "mpeg4",
               "-b:v",
               "2097152",
               "-b:a",
               "48000",
               "-ac",
               "2",
               "-ar",
               "22050",
               outPutFile};
       execFFmpegBinary(complexCommand);
    }

    private String getPath(final Context context, final Uri uri) {
       final boolean isKitKat = Build.VERSION.SDK_INT >= Build.VERSION_CODES.KITKAT;
       // DocumentProvider
       if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.KITKAT) {
           if (isKitKat && DocumentsContract.isDocumentUri(context, uri)) {
               // ExternalStorageProvider
               if (isExternalStorageDocument(uri)) {
                   final String docId = DocumentsContract.getDocumentId(uri);
                   final String[] split = docId.split(":");
                   final String type = split[0];
                   if ("primary".equalsIgnoreCase(type)) {
                       return Environment.getExternalStorageDirectory() + "/" + split[1];
                   }
                   // TODO handle non-primary volumes
               }
               // DownloadsProvider
               else if (isDownloadsDocument(uri)) {
                   final String id = DocumentsContract.getDocumentId(uri);
                   final Uri contentUri = ContentUris.withAppendedId(
                           Uri.parse("content://downloads/public_downloads"), Long.valueOf(id));
                   return getDataColumn(context, contentUri, null, null);
               }
               // MediaProvider
               else if (isMediaDocument(uri)) {
                   final String docId = DocumentsContract.getDocumentId(uri);
                   final String[] split = docId.split(":");
                   final String type = split[0];
                   Uri contentUri = null;
                   if ("image".equals(type)) {
                       contentUri = MediaStore.Images.Media.EXTERNAL_CONTENT_URI;
                   } else if ("video".equals(type)) {
                       contentUri = MediaStore.Video.Media.EXTERNAL_CONTENT_URI;
                   } else if ("audio".equals(type)) {
                       contentUri = MediaStore.Audio.Media.EXTERNAL_CONTENT_URI;
                   }
                   final String selection = "_id=?";
                   final String[] selectionArgs = new String[]{
                           split[1]
                   };
                   return getDataColumn(context, contentUri, selection, selectionArgs);
               }
           }
           // MediaStore (and general)
           else if ("content".equalsIgnoreCase(uri.getScheme())) {
               return getDataColumn(context, uri, null, null);
           }
           // File
           else if ("file".equalsIgnoreCase(uri.getScheme())) {
               return uri.getPath();
           }
       }
       return null;
    }

    private boolean isExternalStorageDocument(Uri uri) {
       return "com.android.externalstorage.documents".equals(uri.getAuthority());
    }

    /**
    * @param uri The Uri to check.
    * @return Whether the Uri authority is DownloadsProvider.
    */
    private boolean isDownloadsDocument(Uri uri) {
       return "com.android.providers.downloads.documents".equals(uri.getAuthority());
    }

    /**
    * @param uri The Uri to check.
    * @return Whether the Uri authority is MediaProvider.
    */
    private boolean isMediaDocument(Uri uri) {
       return "com.android.providers.media.documents".equals(uri.getAuthority());
    }

    private String getDataColumn(Context context, Uri uri, String selection,
                                String[] selectionArgs) {
       Cursor cursor = null;
       final String column = "_data";
       final String[] projection = {
               column
       };
       try {
           cursor = context.getContentResolver().query(uri, projection, selection, selectionArgs,
                   null);
           if (cursor != null && cursor.moveToFirst()) {
               final int column_index = cursor.getColumnIndexOrThrow(column);
               return cursor.getString(column_index);
           }
       } finally {
           if (cursor != null)
               cursor.close();
       }
       return null;
    }

    private void loadFFMpegBinary() {
       try {
           if (ffmpeg == null) {
               ffmpeg = FFmpeg.getInstance(this);
           }
           ffmpeg.loadBinary(new LoadBinaryResponseHandler() {
               @Override
               public void onFailure() {
               }

               @Override
               public void onSuccess() {
               }
           });
       } catch (FFmpegNotSupportedException e) {
       } catch (Exception e) {
       }
    }

    private void execFFmpegBinary(final String[] command) {
       progressDialog.show();
       try {
           ffmpeg.execute(command, new ExecuteBinaryResponseHandler() {
               @Override
               public void onFailure(String s) {


                   System.out.println("ssss"+s);
               }

               @Override
               public void onSuccess(String s) {
                   //    Log.d(TAG, "SUCCESS with output : " + s);
                   Toast.makeText(MainActivity.this, "SUCCESS" + outPutFile, Toast.LENGTH_LONG).show();
               }

               @Override
               public void onProgress(String s) {
               }

               @Override
               public void onStart() {
                   progressDialog.setMessage("Processing...");
                   progressDialog.show();
               }

               @Override
               public void onFinish() {
                   progressDialog.dismiss();
               }
           });
       } catch (Exception e ) {
           Toast.makeText(MainActivity.this, "EXCEPTION", Toast.LENGTH_LONG).show();

           // do nothing for now
       }
    }


    private void executeMargeVideoCommand(Uri filePath, Uri filePathSecond) {
       File moviesDir = Environment.getExternalStoragePublicDirectory(
               Environment.DIRECTORY_MOVIES
       );
       String filePrefix = "marge_change_video";
       String fileExtn = ".mp4";
       String yourRealPath = getPath(MainActivity.this, filePath);
       String yourRealPath2 = getPath(MainActivity.this, filePathSecond);
       File dest = new File(moviesDir, filePrefix + fileExtn);
       int fileNo = 0;
       while (dest.exists()) {
           fileNo++;
           dest = new File(moviesDir, filePrefix + fileNo + fileExtn);
       }

       outPutFile = dest.getAbsolutePath();
       String complexCommand[] = {"-y", "-i", yourRealPath, "-i", yourRealPath2, "-strict", "experimental", "-filter_complex",
               "[0:v]scale=iw*min(1920/iw\\,1080/ih):ih*min(1920/iw\\,1080/ih), pad=1920:1080:(1920-iw*min(1920/iw\\,1080/ih))/2:(1080-ih*min(1920/iw\\,1080/ih))/2,setsar=1:1[v0];[1:v] scale=iw*min(1920/iw\\,1080/ih):ih*min(1920/iw\\,1080/ih), pad=1920:1080:(1920-iw*min(1920/iw\\,1080/ih))/2:(1080-ih*min(1920/iw\\,1080/ih))/2,setsar=1:1[v1];[v0][0:a][v1][1:a] concat=n=2:v=1:a=1",
               "-ab", "48000", "-ac", "2", "-ar", "22050", "-s", "1920x1080", "-vcodec", "libx264", "-crf", "27",
               "-q", "4", "-preset", "ultrafast", outPutFile};
       execFFmpegBinary(complexCommand);
    }


    private void executeFilterCommand(Uri filePath) {
       File moviesDir = Environment.getExternalStoragePublicDirectory(
               Environment.DIRECTORY_MOVIES
       );
       String filePrefix = "filter_change_video";
       String fileExtn = ".mp4";
       String yourRealPath = getPath(MainActivity.this, filePath);
       File dest = new File(moviesDir, filePrefix + fileExtn);
       int fileNo = 0;
       while (dest.exists()) {
           fileNo++;
           dest = new File(moviesDir, filePrefix + fileNo + fileExtn);
       }

       outPutFile = dest.getAbsolutePath();
       //  String complexCommand[] = {"-i", yourRealPath,"-r", "15", "-aspect" ,""+w+":"+""+h ,"-strict" ,"-2",outPutFile};
       String complexCommand[] = { "-i",yourRealPath,"-vf", "split [main][tmp]; [tmp] lutyuv=","y=val*5"," [tmp2]; [main][tmp2] overlay", outPutFile};
       execFFmpegBinary(complexCommand);
    }

    }
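
     Not from the original post: the linked page explains that binaries with text relocations are rejected from API level 23 onward, so the prebuilt ffmpeg executable bundled with this library will not load on modern targets. A commonly suggested workaround is to switch to a maintained wrapper such as ffmpeg-kit instead of shipping a standalone binary. The sketch below assumes the com.arthenica ffmpeg-kit dependency and quotes its API from memory, so verify the names before relying on it:

     import com.arthenica.ffmpegkit.FFmpegKit;
     import com.arthenica.ffmpegkit.FFmpegSession;
     import com.arthenica.ffmpegkit.ReturnCode;

     public class TrimHelper {
         // Trims (endMs - startMs) milliseconds starting at startMs, copying streams without re-encoding.
         public static boolean trim(String inputPath, String outputPath, int startMs, int endMs) {
             String cmd = "-ss " + (startMs / 1000f) + " -i \"" + inputPath + "\""
                     + " -t " + ((endMs - startMs) / 1000f) + " -c copy \"" + outputPath + "\"";
             FFmpegSession session = FFmpegKit.execute(cmd);
             return ReturnCode.isSuccess(session.getReturnCode());
         }
     }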

  • Trying to cancel execution and delete file using ffmpeg C API

    6 March 2020, by Vuwox

     The code below is a class that handles the conversion of multiple images, added through the add_frame() method, into a GIF with encode(). It also uses a filter to generate and apply the palette. The usage is like this:

    Code call example

     std::unique_ptr<px::GIF::FFMPEG> gif_obj = nullptr;
    try
    {
        gif_obj = std::make_unique<px::GIF::FFMPEG>({1000,1000}, 12, "C:/out.gif",
                 "format=pix_fmts=rgb24,split [a][b];[a]palettegen[p];[b][p]paletteuse");

       // Example: a simple vector of images (usually process internally)
       for(auto img : image_vector)
            gif_obj->add_frame(img);

       // Once all frame were added, encode the final GIF with the filter applied.
       gif_obj->encode();
    }
    catch(const std::exception& e)
    {
        // An error occurred! We must close FFMPEG properly and delete the created file.
       gif_obj->cancel();
    }

     I have the following issue: if the code throws an exception for any reason, I call ffmpeg->cancel(), which is supposed to delete the GIF file on disk. But this never works; I assume there is a lock on the file or something like that. So here is my question:

     What is the proper way to close/free the FFmpeg objects so that the file can be removed afterward?


    Full class code below

    Header

    // C++ Standard includes    
    #include <memory>
    #include <string>
    #include <vector>


     // 3rd Party includes
    #ifdef __cplusplus
    extern "C" {
    #include "libavformat/avformat.h"
    #include "libavfilter/avfilter.h"
    #include "libavutil/opt.h"
    #include "libavfilter/buffersrc.h"
    #include "libavfilter/buffersink.h"
    #include "libswscale/swscale.h"
    #include "libavutil/imgutils.h"
    }
    #endif

    #define FFMPEG_MSG_LEN 2000

    namespace px
    {
       namespace GIF
       {
           class FFMPEG
           {
           public:
                FFMPEG(const px::Point2D<int>& dim,
                       const int framerate,
                       const std::string& filename,
                       const std::string& filter_cmd);

               ~FFMPEG();

               void add_frame(pxImage * const img);
               void encode();
               void cancel();

           private:

               void init_filters();            // Init everything that needed to filter the input frame.
               void init_muxer();              // The muxer that creates the output file.
               void muxing_one_frame(AVFrame* frame);
               void release();

               int _ret = 0;                   // status code from FFMPEG.
               char _err_msg[FFMPEG_MSG_LEN];  // Error message buffer.


                int m_width = 0;                // The width that all future images must have to be accepted.
                int m_height = 0;               // The height that all future images must have to be accepted.

               int m_framerate = 0;            // GIF Framerate.
               std::string m_filename = "";    // The GIF filename (on cache?)
               std::string m_filter_desc = ""; // The FFMPEG filter to apply over the frames.

               bool as_frame = false;

               AVFrame* picture_rgb24 = nullptr;           // Temporary frame that will hold the pxImage in an RGB24 format (NOTE: TOP-LEFT origin)

                AVFormatContext* ofmt_ctx = nullptr;        // output format context associated to the
               AVCodecContext* o_codec_ctx = nullptr;      // output codec for the GIF

               AVFilterGraph* filter_graph = nullptr;      // filter graph associate with the string we want to execute
               AVFilterContext* buffersrc_ctx = nullptr;   // The buffer that will store all the frames in one place for the palette generation.
               AVFilterContext* buffersink_ctx = nullptr;  // The buffer that will store the result afterward (once the palette are used).

               int64_t m_pts_increment = 0;
           };
       };
    };

    ctor

     px::GIF::FFMPEG::FFMPEG(const px::Point2D<int>& dim,
                            const int framerate,
                            const std::string& filename,
                            const std::string& filter_cmd) :
       m_width(dim.x()),
       m_height(dim.y()),
       m_framerate(framerate),
       m_filename(filename),
       m_filter_desc(filter_cmd)
    {
    #if !_DEBUG
       av_log_set_level(AV_LOG_QUIET); // Set the FFMPEG log to quiet to avoid too much logs.
    #endif

       // Allocate the temporary buffer that hold the ffmpeg image (pxImage to AVFrame conversion).
       picture_rgb24 = av_frame_alloc();
       picture_rgb24->pts = 0;
       picture_rgb24->data[0] = NULL;
       picture_rgb24->linesize[0] = -1;
       picture_rgb24->format = AV_PIX_FMT_RGB24;
       picture_rgb24->height = m_height;
       picture_rgb24->width = m_width;

       if ((_ret = av_image_alloc(picture_rgb24->data, picture_rgb24->linesize, m_width, m_height, (AVPixelFormat)picture_rgb24->format, 24)) < 0)
           throw px::GIF::Error("Failed to allocate the AVFrame for pxImage conversion with error: " +
                                std::string(av_make_error_string(_err_msg, FFMPEG_MSG_LEN, _ret)),
                                "GIF::FFMPEG CTOR");  

       //printf("allocated picture of size %d, linesize %d %d %d %d\n", _ret, picture_rgb24->linesize[0], picture_rgb24->linesize[1], picture_rgb24->linesize[2], picture_rgb24->linesize[3]);

       init_muxer();   // Prepare the GIF encoder (open it on disk).
       init_filters(); // Prepare the filter that will be applied over the frame.

       // Instead of hardcoding {1,100}, which is the GIF tbn, we collect it from its stream.
       // This will avoid future problem if the codec change in ffmpeg.
       if (ofmt_ctx && ofmt_ctx->nb_streams > 0)
           m_pts_increment = av_rescale_q(1, { 1, m_framerate }, ofmt_ctx->streams[0]->time_base);
       else
           m_pts_increment = av_rescale_q(1, { 1, m_framerate }, { 1, 100 });
    }

    FFMPEG Initialization (Filter and muxer)

    void px::GIF::FFMPEG::init_filters()
    {
       const AVFilter* buffersrc = avfilter_get_by_name("buffer");
       const AVFilter* buffersink = avfilter_get_by_name("buffersink");

       AVRational time_base = { 1, m_framerate };
       AVRational aspect_pixel = { 1, 1 };

       AVFilterInOut* inputs = avfilter_inout_alloc();
       AVFilterInOut* outputs = avfilter_inout_alloc();

       filter_graph = avfilter_graph_alloc();

       try
       {
           if (!outputs || !inputs || !filter_graph)
               throw px::GIF::Error("Failed to 'init_filters' could not allocated the graph/filters.", "GIF::FFMPEG init_filters");

           char args[512];
           snprintf(args, sizeof(args),
                    "video_size=%dx%d:pix_fmt=%d:time_base=%d/%d:pixel_aspect=%d/%d",
                    m_width, m_height,
                    picture_rgb24->format,
                    time_base.num, time_base.den,
                    aspect_pixel.num, aspect_pixel.den);

           if (avfilter_graph_create_filter(&buffersrc_ctx, buffersrc, "in", args, nullptr, filter_graph) < 0)
               throw px::GIF::Error("Failed to create the 'source buffer' in init_filer method.", "GIF::FFMPEG init_filters");


           if (avfilter_graph_create_filter(&buffersink_ctx, buffersink, "out", nullptr, nullptr, filter_graph) < 0)
               throw px::GIF::Error("Failed to create the 'sink buffer' in init_filer method.", "GIF::FFMPEG init_filters");

           // GIF has possible output of PAL8.
           enum AVPixelFormat pix_fmts[] = { AV_PIX_FMT_PAL8, AV_PIX_FMT_NONE };

           if (av_opt_set_int_list(buffersink_ctx, "pix_fmts", pix_fmts, AV_PIX_FMT_NONE, AV_OPT_SEARCH_CHILDREN) < 0)
               throw px::GIF::Error("Failed to set the output pixel format.", "GIF::FFMPEG init_filters");

           outputs->name = av_strdup("in");
           outputs->filter_ctx = buffersrc_ctx;
           outputs->pad_idx = 0;
           outputs->next = nullptr;

           inputs->name = av_strdup("out");
           inputs->filter_ctx = buffersink_ctx;
           inputs->pad_idx = 0;
           inputs->next = nullptr;

           // GIF has possible output of PAL8.
           if (avfilter_graph_parse_ptr(filter_graph, m_filter_desc.c_str(), &inputs, &outputs, nullptr) < 0)
               throw px::GIF::Error("Failed to parse the filter graph (bad string!).", "GIF::FFMPEG init_filters");

           if (avfilter_graph_config(filter_graph, nullptr) < 0)
               throw px::GIF::Error("Failed to configure the filter graph (bad string!).", "GIF::FFMPEG init_filters");

           avfilter_inout_free(&inputs);
           avfilter_inout_free(&outputs);
       }
       catch (const std::exception& e)
       {
           // Catch exception to delete element.
           avfilter_inout_free(&inputs);
           avfilter_inout_free(&outputs);
           throw e; // re-throw
       }
    }


    void px::GIF::FFMPEG::init_muxer()
    {
       AVOutputFormat* o_fmt = av_guess_format("gif", m_filename.c_str(), "video/gif");

       if ((_ret = avformat_alloc_output_context2(&ofmt_ctx, o_fmt, "gif", m_filename.c_str())) < 0)
           throw px::GIF::Error(std::string(av_make_error_string(_err_msg, FFMPEG_MSG_LEN, _ret)) + " allocate output format.", "GIF::FFMPEG init_muxer");

       AVCodec* codec = avcodec_find_encoder(AV_CODEC_ID_GIF);
       if (!codec) throw px::GIF::Error("Could not find the 'GIF' codec.", "GIF::FFMPEG init_muxer");

    #if 0
       const AVPixelFormat* p = codec->pix_fmts;
       while (p != NULL && *p != AV_PIX_FMT_NONE) {
           printf("supported pix fmt: %s\n", av_get_pix_fmt_name(*p));
           ++p;
       }
    #endif

       AVStream* stream = avformat_new_stream(ofmt_ctx, codec);

       AVCodecParameters* codec_paramters = stream->codecpar;
       codec_paramters->codec_tag = 0;
       codec_paramters->codec_id = codec->id;
       codec_paramters->codec_type = AVMEDIA_TYPE_VIDEO;
       codec_paramters->width = m_width;
       codec_paramters->height = m_height;
       codec_paramters->format = AV_PIX_FMT_PAL8;

       o_codec_ctx = avcodec_alloc_context3(codec);
       avcodec_parameters_to_context(o_codec_ctx, codec_paramters);

       o_codec_ctx->time_base = { 1, m_framerate };

       if (ofmt_ctx->oformat->flags & AVFMT_GLOBALHEADER)
           o_codec_ctx->flags |= AV_CODEC_FLAG_GLOBAL_HEADER;

       if ((_ret = avcodec_open2(o_codec_ctx, codec, NULL)) < 0)
           throw px::GIF::Error(std::string(av_make_error_string(_err_msg, FFMPEG_MSG_LEN, _ret)) + " open output codec.", "GIF::FFMPEG init_muxer");

       if ((_ret = avio_open(&ofmt_ctx->pb, m_filename.c_str(), AVIO_FLAG_WRITE)) < 0)
           throw px::GIF::Error(std::string(av_make_error_string(_err_msg, FFMPEG_MSG_LEN, _ret)) + " avio open error.", "GIF::FFMPEG init_muxer");

       if ((_ret = avformat_write_header(ofmt_ctx, NULL)) < 0)
           throw px::GIF::Error(std::string(av_make_error_string(_err_msg, FFMPEG_MSG_LEN, _ret)) + " write GIF header", "GIF::FFMPEG init_muxer");

    #if _DEBUG
       // This print the stream/output format.
       av_dump_format(ofmt_ctx, -1, m_filename.c_str(), 1);
    #endif
    }

    Add frame (usually in a loop)

    void px::GIF::FFMPEG::add_frame(pxImage * const img)
    {
       if (img->getImageType() != PXT_BYTE || img->getNChannels() != 4)
           throw px::GIF::Error("Failed to 'add_frame' since image is not PXT_BYTE and 4-channels.", "GIF::FFMPEG add_frame");

       if (img->getWidth() != m_width || img->getHeight() != m_height)
           throw px::GIF::Error("Failed to 'add_frame' since the size is not same to other inputs.", "GIF::FFMPEG add_frame");

       const int pitch = picture_rgb24->linesize[0];
       auto px_ptr = getImageAccessor(img);

       for (int y = 0; y < m_height; y++)
       {
           const int px_row = img->getOrigin() == ORIGIN_BOT_LEFT ? m_height - y - 1 : y;
           for (int x = 0; x < m_width; x++)
           {
               const int idx = y * pitch + 3 * x;
               picture_rgb24->data[0][idx] = px_ptr[px_row][x].ch[PX_RE];
               picture_rgb24->data[0][idx + 1] = px_ptr[px_row][x].ch[PX_GR];
               picture_rgb24->data[0][idx + 2] = px_ptr[px_row][x].ch[PX_BL];
           }
       }

       // palettegen needs the whole stream; just add the frame to the buffer.
       if ((_ret = av_buffersrc_add_frame_flags(buffersrc_ctx, picture_rgb24, AV_BUFFERSRC_FLAG_KEEP_REF)) < 0)
           throw px::GIF::Error("Failed to 'add_frame' to global buffer with error: " +
                                std::string(av_make_error_string(_err_msg, FFMPEG_MSG_LEN, _ret)),
                                "GIF::FFMPEG add_frame");

       // Increment the PTS of the picture for the next frame added to the buffer.
       picture_rgb24->pts += m_pts_increment;

       as_frame = true;
    }    

    Encoder (final step)

    void px::GIF::FFMPEG::encode()
    {
       if (!as_frame)
           throw px::GIF::Error("Please 'add_frame' before running the Encoding().", "GIF::FFMPEG encode");

       // end of buffer
       if ((_ret = av_buffersrc_add_frame_flags(buffersrc_ctx, nullptr, AV_BUFFERSRC_FLAG_KEEP_REF)) < 0)
           throw px::GIF::Error("error add frame to buffer source: " + std::string(av_make_error_string(_err_msg, FFMPEG_MSG_LEN, _ret)), "GIF::FFMPEG encode");

       do {
           AVFrame* filter_frame = av_frame_alloc();
           _ret = av_buffersink_get_frame(buffersink_ctx, filter_frame);
           if (_ret == AVERROR(EAGAIN) || _ret == AVERROR_EOF) {
               av_frame_unref(filter_frame);
               break;
           }

           // write the filter frame to output file
           muxing_one_frame(filter_frame);

           av_frame_unref(filter_frame);
       } while (_ret >= 0);

       av_write_trailer(ofmt_ctx);
    }

    void px::GIF::FFMPEG::muxing_one_frame(AVFrame* frame)
    {
       int ret = avcodec_send_frame(o_codec_ctx, frame);
       AVPacket *pkt = av_packet_alloc();
       av_init_packet(pkt);

       while (ret >= 0) {
           ret = avcodec_receive_packet(o_codec_ctx, pkt);
           if (ret == AVERROR(EAGAIN) || ret == AVERROR_EOF) {
               break;
           }

           av_write_frame(ofmt_ctx, pkt);
       }
       av_packet_unref(pkt);
    }

    DTOR, Release and Cancel

    px::GIF::FFMPEG::~FFMPEG()
    {
       release();
    }


    void px::GIF::FFMPEG::release()
    {
       // Muxer stuffs
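       // Note: ofmt_ctx->pb was opened with avio_open() in init_muxer() but is never closed here;
       // without a matching avio_closep(&ofmt_ctx->pb) before the context is freed, the OS can keep
       // the output file open, which would explain why remove() in cancel() fails to delete it.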
       if (ofmt_ctx != nullptr) avformat_free_context(ofmt_ctx);
       if (o_codec_ctx != nullptr) avcodec_close(o_codec_ctx);
       if (o_codec_ctx != nullptr) avcodec_free_context(&o_codec_ctx);

       ofmt_ctx = nullptr;
       o_codec_ctx = nullptr;

       // Filter stuffs
       if (buffersrc_ctx != nullptr) avfilter_free(buffersrc_ctx);
       if (buffersink_ctx != nullptr) avfilter_free(buffersink_ctx);
       if (filter_graph != nullptr) avfilter_graph_free(&filter_graph);

       buffersrc_ctx = nullptr;
       buffersink_ctx = nullptr;
       filter_graph = nullptr;

       // Conversion image.
       if (picture_rgb24 != nullptr) av_frame_free(&picture_rgb24);
       picture_rgb24 = nullptr;
    }

    void px::GIF::FFMPEG::cancel()
    {
       // In case of failure we must close FFMPEG and exit.
       av_write_trailer(ofmt_ctx);

       // Release and close all elements.
       release();

       // Delete the file on disk.
       if (remove(m_filename.c_str()) != 0)
           PX_LOG0(PX_LOGLEVEL_ERROR, "GIF::FFMPEG - On 'cancel' failed to remove the file.");
    }