Other articles (84)

  • Writing a news item

    21 June 2013

    Present the changes in your MediaSPIP, or news about your projects, on your MediaSPIP using the news section.
    In MediaSPIP's default theme, spipeo, news items are displayed at the bottom of the main page, below the editorials.
    You can customize the form used to create a news item.
    News item creation form: for a document of type news item, the fields offered by default are: Publication date (customize the publication date) (...)

  • User profiles

    12 April 2011

    Each user has a profile page where they can edit their personal information. In the default top-of-page menu, a menu item is created automatically when MediaSPIP is initialized, visible only when the visitor is logged in to the site.
    The user can also edit their profile from their author page; a link in the navigation, "Modifier votre profil", is (...)

  • Publishing on MediaSPIP

    13 June 2013

    Can I post content from an iPad tablet?
    Yes, if your MediaSPIP installation is at version 0.2 or higher. If needed, contact your MediaSPIP administrator to find out.

On other sites (7163)

  • Transcoding videos with LibAvFormat for playback on iOS devices

    18 May 2016, by user361526

    I'm trying to transcode a video in my iOS app using FFmpeg/LibAv.
    What I'm trying to accomplish is to resize each frame and possibly lower the bitrate, in order to save valuable megabytes on the device.

    The resulting video must be playable on all iPhone5+ devices.

    After reading the documentation, I found out that:

    • I do not need to encode/decode the audio stream -> I'll copy it as-is to the output file
    • I need to encode the video with the h264 codec (LibX264), using a profile supported by iOS (baseline profile with level 3.0 - https://trac.ffmpeg.org/wiki/Encode/H.264#Compatibility)
    • I'm also setting the picture format to planar YUV, since it's the only one supported by iOS
    • For the sake of testing I'm not using any filter at all (only a dummy/passthrough one) and I'm not yet trying to lower the bitrate; I'm just decoding the video stream and encoding it again
    • Most of the code is based on transcoding.c and filtering.c, available in the FFmpeg examples directory

    FFmpeg-wise, what I'm trying to achieve with LibAv is:

    ffmpeg -i INPUT.MOV -c:v libx264 -preset ultrafast -profile:v baseline -level 3.0  -c:a copy output.MOV

    (the resulting file, which can be found below, plays in QuickTime when it's generated by FFmpeg on the command line)

    The original video was recorded on a regular iPhone running iOS 8.2, but the problem is not device- or iOS-specific: it occurs with all videos generated through LibAv.

    Although both resulting files are playable in VideoLAN (VLC), the one I generated through LibAv is not playable in QuickTime, even though I can't find anything wrong with it.

    As you can see below, I create the video stream with the proper video codec in the call to avformat_new_stream:

    AVStream *out_stream; // output stream
    AVStream *in_stream; // input stream
    AVCodecContext *dec_ctx, *enc_ctx; // codec context for the stream
    AVCodec *encoder; // codec used
    int ret;
    unsigned int i;

    ofmt_ctx = NULL;
    // Allocate an AVFormatContext for an output format (similar to avformat_open_input, but starting from zeroed memory)
    avformat_alloc_output_context2(&ofmt_ctx, NULL, NULL, filename);
    if (!ofmt_ctx) {
       av_log(NULL, AV_LOG_ERROR, "Could not create output context\n");
       [self errorWith:kErrorCreatingOutputContext and:@"Could not create output context"];
       return AVERROR_UNKNOWN;
    }
    // we must not use the AVCodecContext from the video stream directly! So we have to use avcodec_copy_context() to copy the context to a new location (after allocating memory for it, of course).
    // iterate over all input streams
    for (i = 0; i < ifmt_ctx->nb_streams; i++) {
       in_stream = ifmt_ctx->streams[i]; // input stream
       dec_ctx = in_stream->codec; // get the codec context for the decoder

       if (dec_ctx->codec_type == AVMEDIA_TYPE_VIDEO) {
           // lets use h264
           encoder = avcodec_find_encoder(AV_CODEC_ID_H264);
           if (!encoder) {
               [self errorWith:kErrorCodecNotFound and:@"H264 Codec Not Found"];
               return AVERROR_UNKNOWN;
           }
           out_stream = avformat_new_stream(ofmt_ctx, encoder); // create a new stream with h264 codec
           if (!out_stream) {
               av_log(NULL, AV_LOG_ERROR, "Failed allocating output stream\n");
               [self errorWith:kErrorAllocateOutputStream and:@"Failed allocating output stream"];
               return AVERROR_UNKNOWN;
           }
           enc_ctx = out_stream->codec; // pointer to the stream codec context
           /* we transcode to same properties (picture size,
            * sample rate etc.). These properties can be changed for output
            * streams easily using filters */
           if (dec_ctx->codec_type == AVMEDIA_TYPE_VIDEO) {
               enc_ctx->width = dec_ctx->width;
               enc_ctx->height = dec_ctx->height;
               enc_ctx->sample_aspect_ratio = dec_ctx->sample_aspect_ratio;
               enc_ctx->pix_fmt = AV_PIX_FMT_YUV420P;
               enc_ctx->time_base = dec_ctx->time_base;
               av_opt_set(enc_ctx->priv_data, "preset", "ultrafast", 0);
               av_opt_set(enc_ctx->priv_data, "profile", "baseline", 0);
               av_opt_set(enc_ctx->priv_data, "level", "3.0", 0);
           }

           out_stream->time_base = in_stream->time_base;

           AVDictionaryEntry *tag = NULL;
           while ((tag = av_dict_get(in_stream->metadata, "", tag, AV_DICT_IGNORE_SUFFIX))) {
               printf("%s=%s\n", tag->key, tag->value);
               char *k = av_strdup(tag->key);       // if your strings are already allocated,
               char *v = av_strdup(tag->value);     // you can avoid copying them like this
               av_dict_set(&out_stream->metadata, k, v, 0);
           }

           ret = avcodec_open2(enc_ctx, encoder, NULL);
           if (ret < 0) {
               av_log(NULL, AV_LOG_ERROR, "Cannot open video encoder for stream #%u\n", i);
               [self errorWith:kErrorCantOpenOutputFile and:[NSString stringWithFormat:@"Cannot open video encoder for stream #%u",i]];
               return ret;
           }

       }
       else if(dec_ctx->codec_type == AVMEDIA_TYPE_UNKNOWN) {
           // if we cant figure out the stream type, fail
           av_log(NULL, AV_LOG_FATAL, "Elementary stream #%d is of unknown type, cannot proceed\n", i);
           [self errorWith:kErrorUnknownStream and:[NSString stringWithFormat:@"Elementary stream #%d is of unknown type, cannot proceed",i]];
           return AVERROR_INVALIDDATA;
       }
       else {
           out_stream = avformat_new_stream(ofmt_ctx, NULL);
           if (!out_stream) {
               av_log(NULL, AV_LOG_ERROR, "Failed allocating output stream\n");
               [self errorWith:kErrorAllocateOutputStream and:@"Failed allocating output stream"];
               return AVERROR_UNKNOWN;
           }
           enc_ctx = out_stream->codec;
           /* this stream must be remuxed */
           // copies ifmt_ctx->streams[i]->codec into ofmt_ctx->streams[i]->codec - Copy the settings of the source AVCodecContext into the destination AVCodecContext.
           ret = avcodec_copy_context(ofmt_ctx->streams[i]->codec,
                                      ifmt_ctx->streams[i]->codec);
           if (ret < 0) {
               av_log(NULL, AV_LOG_ERROR, "Copying stream context failed\n");
               [self errorWith:kErrorCopyStreamFailed and:@"Copying stream context failed"];
               return ret;
           }

       }
       // if the container requires global headers (e.g. MP4/MOV), ask the encoder to put codec extradata in a global header instead of in-band
       if (ofmt_ctx->oformat->flags & AVFMT_GLOBALHEADER)
           enc_ctx->flags |= AV_CODEC_FLAG_GLOBAL_HEADER;

    }
    if (!(ofmt_ctx->oformat->flags & AVFMT_NOFILE)) {
       // Create and initialize a AVIOContext for accessing the
       // resource indicated by url.
       ret = avio_open(&ofmt_ctx->pb, filename, AVIO_FLAG_WRITE);
       if (ret < 0) {
           av_log(NULL, AV_LOG_ERROR, "Could not open output file '%s'", filename);
           [self errorWith:kErrorCantOpenOutputFile and:[NSString stringWithFormat:@"Could not open output file '%s'", filename]];
           return ret;
       }
    }

    /* init muxer, write output file header */
    // Allocate the stream private data and write the stream header to an output media file.
    ret = avformat_write_header(ofmt_ctx, NULL);
    if (ret < 0) {
       av_log(NULL, AV_LOG_ERROR, "Error occurred when opening output file\n");
       [self errorWith:kErrorOutFileCantWriteHeader and:@"Error occurred when opening output file"];
       return ret;
    }
    return 0;

    You can find the files here:

    1. Original file: https://www.dropbox.com/s/2jjs1uy2pu2veyy/IMG_5705.MOV?dl=0
    2. File generated with FFmpeg: https://www.dropbox.com/s/9hfmq3fcifgpfqc/local-ffmpeg.MOV?dl=0
    3. File generated by my code: https://www.dropbox.com/s/rttvny39rj7ejpf/generated-by-Ze.MOV?dl=0

    Thank you so much,
    Ze

  • Android: Pass video path to FFmpeg

    7 January 2016, by marian

    I have developed an app that plays videos from the gallery. I would like to add a watermark to the selected video using an FFmpeg command, but I do not know how to pass the video's path to that command. I could not find proper tutorials or references on this. My code is as follows:

    MainActivity.java:

    import android.app.Activity;
    import android.app.ProgressDialog;
    import android.content.DialogInterface;
    import android.content.Intent;
    import android.net.Uri;
    import android.os.Bundle;
    import android.os.Handler;
    import android.os.Message;
    import android.os.PowerManager;
    import android.util.Log;
    import android.view.View;
    import android.widget.Button;
    import android.widget.Toast;
    import android.widget.VideoView;

    import com.netcompss.ffmpeg4android.CommandValidationException;
    import com.netcompss.ffmpeg4android.GeneralUtils;
    import com.netcompss.ffmpeg4android.Prefs;
    import com.netcompss.ffmpeg4android.ProgressCalculator;
    import com.netcompss.loader.LoadJNI;

    public class MainActivity extends Activity {
    public ProgressDialog progressBar;

    String workFolder = null;
    String demoVideoFolder = null;
    String demoVideoPath = null;
    String vkLogPath = null;
    LoadJNI vk;
    private final int STOP_TRANSCODING_MSG = -1;
    private final int FINISHED_TRANSCODING_MSG = 0;
    private boolean commandValidationFailedFlag = false;

    Button button;
    VideoView videoView;
    private static final int PICK_FROM_GALLERY = 1;


    private void runTranscodingUsingLoader() {
       Log.i(Prefs.TAG, "runTranscodingUsingLoader started...");

       PowerManager powerManager = (PowerManager)MainActivity.this.getSystemService(Activity.POWER_SERVICE);
       PowerManager.WakeLock wakeLock = powerManager.newWakeLock(PowerManager.PARTIAL_WAKE_LOCK, "VK_LOCK");
       Log.d(Prefs.TAG, "Acquire wake lock");
       wakeLock.acquire();



       String[] complexCommand = {"ffmpeg","-y" ,"-i", "/sdcard/videokit/in.mp4","-strict","experimental",
               "-vf", "movie=/sdcard/videokit/watermark.png [watermark];" +
               " [in][watermark] overlay=main_w-overlay_w-10:10 [out]","-s",
               "320x240","-r", "30", "-b", "15496k", "-vcodec", "mpeg4","-ab",
               "48000", "-ac", "2", "-ar", "22050", "/sdcard/videokit/out1.mp4"};
       ///////////////////////////////////////////////////////////////////////


       vk = new LoadJNI();
       try {
           // running complex command with validation
           vk.run(complexCommand, workFolder, getApplicationContext());

           // running without command validation
           //vk.run(complexCommand, workFolder, getApplicationContext(), false);

           // running regular command with validation
           //vk.run(GeneralUtils.utilConvertToComplex(commandStr), workFolder, getApplicationContext());

           Log.i(Prefs.TAG, "vk.run finished.");
           // copying vk.log (internal native log) to the videokit folder
           GeneralUtils.copyFileToFolder(vkLogPath, demoVideoFolder);

       } catch (CommandValidationException e) {
           Log.e(Prefs.TAG, "vk run exception.", e);
           commandValidationFailedFlag = true;

       } catch (Throwable e) {
           Log.e(Prefs.TAG, "vk run exception.", e);
       }
       finally {
           if (wakeLock.isHeld()) {
               wakeLock.release();
               Log.i(Prefs.TAG, "Wake lock released");
           }
           else{
               Log.i(Prefs.TAG, "Wake lock is already released, doing nothing");
           }
       }

       // finished Toast
       String rc = null;
       if (commandValidationFailedFlag) {
           rc = "Command Validation Failed";
       }
       else {
           rc = GeneralUtils.getReturnCodeFromLog(vkLogPath);
       }
       final String status = rc;
       MainActivity.this.runOnUiThread(new Runnable() {
           public void run() {
               Toast.makeText(MainActivity.this, status, Toast.LENGTH_LONG).show();
               if (status.equals("Transcoding Status: Failed")) {
                   Toast.makeText(MainActivity.this, "Check: " + vkLogPath + " for more information.", Toast.LENGTH_LONG).show();
               }
           }
       });
    }


    @Override
    public void onCreate(Bundle savedInstanceState) {
       super.onCreate(savedInstanceState);
       setContentView(R.layout.activity_main);

       button = (Button) findViewById(R.id.button);

       videoView = (VideoView) findViewById(R.id.videoview);

       button.setOnClickListener(new View.OnClickListener() {

           public void onClick(View v) {
               // TODO Auto-generated method stub
               Intent intent = new Intent();

               intent.setType("video/*");
               intent.setAction(Intent.ACTION_GET_CONTENT);

               startActivityForResult(Intent.createChooser(intent, "Complete action using"), PICK_FROM_GALLERY);
           }
       });

    }

    @Override
    public void onActivityResult(int requestCode, int resultCode, Intent data) {
       if (resultCode != RESULT_OK) return;

       if (requestCode == PICK_FROM_GALLERY) {
           Uri mVideoURI = data.getData();
           videoView.setVideoURI(mVideoURI);
           videoView.start();
           demoVideoFolder = mVideoURI.getPath();
           demoVideoPath = demoVideoFolder;
           savevideo(mVideoURI);

       }


    }
    private Handler handler = new Handler() {
       @Override
       public void handleMessage(Message msg) {
           Log.i(Prefs.TAG, "Handler got message");
           if (progressBar != null) {
               progressBar.dismiss();

               // stopping the transcoding native
               if (msg.what == STOP_TRANSCODING_MSG) {
                   Log.i(Prefs.TAG, "Got cancel message, calling fexit");
                   vk.fExit(getApplicationContext());


               }
           }
       }
    };

    public void runTranscoding() {
       progressBar = new ProgressDialog(MainActivity.this);
       progressBar.setProgressStyle(ProgressDialog.STYLE_HORIZONTAL);
       progressBar.setTitle("FFmpeg4Android Direct JNI");
       progressBar.setMessage("Press the cancel button to end the operation");
       progressBar.setMax(100);
       progressBar.setProgress(0);

       progressBar.setCancelable(false);
       progressBar.setButton(DialogInterface.BUTTON_NEGATIVE, "Cancel", new DialogInterface.OnClickListener() {
           @Override
           public void onClick(DialogInterface dialog, int which) {
               handler.sendEmptyMessage(STOP_TRANSCODING_MSG);
           }
       });

       progressBar.show();

       new Thread() {
           public void run() {
               Log.d(Prefs.TAG,"Worker started");
               try {
                   //sleep(5000);
                   runTranscodingUsingLoader();
                   handler.sendEmptyMessage(FINISHED_TRANSCODING_MSG);

               } catch(Exception e) {
                   Log.e("threadmessage",e.getMessage());
               }
           }
       }.start();

       // Progress update thread
       new Thread() {
           ProgressCalculator pc = new ProgressCalculator(vkLogPath);
           public void run() {
               Log.d(Prefs.TAG,"Progress update started");
               int progress = -1;
               try {
                   while (true) {
                       sleep(300);
                       progress = pc.calcProgress();
                       if (progress != 0 && progress < 100) {
                           progressBar.setProgress(progress);
                       }
                       else if (progress == 100) {
                           Log.i(Prefs.TAG, "==== progress is 100, exiting Progress update thread");
                           pc.initCalcParamsForNextInter();
                           break;
                       }
                   }

               } catch(Exception e) {
                   Log.e("threadmessage",e.getMessage());
               }
           }
       }.start();
    }

    public void savevideo (Uri mVideoURI){
       demoVideoFolder = mVideoURI.getPath();
       demoVideoPath = demoVideoFolder;
       Log.i(Prefs.TAG, getString(R.string.app_name) + " version: " + GeneralUtils.getVersionName(getApplicationContext()));

       Button invoke = (Button) findViewById(R.id.button);
       invoke.setOnClickListener(new View.OnClickListener() {
           public void onClick(View v) {
               Log.i(Prefs.TAG, "run clicked.");
               runTranscoding();
           }
       });

       workFolder = getApplicationContext().getFilesDir() + "/";
       Log.i(Prefs.TAG, "workFolder (license and logs location) path: " + workFolder);
       vkLogPath = workFolder + "vk.log";
       Log.i(Prefs.TAG, "vk log (native log) path: " + vkLogPath);
       GeneralUtils.copyLicenseFromAssetsToSDIfNeeded(this, workFolder);
       GeneralUtils.copyDemoVideoFromAssetsToSDIfNeeded(this, demoVideoFolder);
       int rc = GeneralUtils.isLicenseValid(getApplicationContext(), workFolder);
       Log.i(Prefs.TAG, "License check RC: " + rc);

    }
    }

    ffmpeg command:

    String[] complexCommand = {"ffmpeg","-y" ,"-i",  "/sdcard/videokit/in.mp4","-strict","experimental",
               "-vf", "movie=/sdcard/videokit/watermark.png [watermark];" +
               " [in][watermark] overlay=main_w-overlay_w-10:10 [out]","-s",
               "320x240","-r", "30", "-b", "15496k", "-vcodec", "mpeg4","-ab",
               "48000", "-ac", "2", "-ar", "22050", "/sdcard/videokit/out1.mp4"};

    This command is from a sample project. How do I pass the video path to this command? I do not know how to edit the command to fit my requirement. Can someone guide me through this? Any help would be really helpful. Thank you.
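
    One way to do this, sketched below under the assumption that the selected video's path has already been resolved (for example the demoVideoPath value set in onActivityResult), is to build the command array at runtime instead of hard-coding /sdcard/videokit/in.mp4. The WatermarkCommandBuilder class is a hypothetical helper, not part of FFmpeg4Android:

    // Hypothetical helper, not part of FFmpeg4Android: builds the sample
    // watermark command with the input and output paths supplied at runtime.
    public final class WatermarkCommandBuilder {

        private WatermarkCommandBuilder() {}

        /** Same command as the sample project, but without hard-coded paths. */
        public static String[] build(String inputPath, String watermarkPath, String outputPath) {
            return new String[] {
                    "ffmpeg", "-y",
                    "-i", inputPath,                  // the video picked from the gallery
                    "-strict", "experimental",
                    "-vf", "movie=" + watermarkPath + " [watermark];"
                            + " [in][watermark] overlay=main_w-overlay_w-10:10 [out]",
                    "-s", "320x240",
                    "-r", "30",
                    "-b", "15496k",
                    "-vcodec", "mpeg4",
                    "-ab", "48000", "-ac", "2", "-ar", "22050",
                    outputPath                        // e.g. /sdcard/videokit/out1.mp4
            };
        }
    }

    In runTranscodingUsingLoader() the array would then come from something like WatermarkCommandBuilder.build(demoVideoPath, "/sdcard/videokit/watermark.png", "/sdcard/videokit/out1.mp4") before being passed to vk.run(...). Note that Uri.getPath() only returns a usable filesystem path for file:// URIs; a content:// URI from the gallery would first have to be resolved (or copied) to a real file.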

  • FFmpeg4Android: How to select videos from the gallery?

    4 December 2015, by marian

    I have created an app using the FFmpeg4Android library. I would like to add a watermark to videos using FFmpeg. The command for adding the watermark works fine, but the code has a fixed video name in the command. I would like to choose videos from the gallery folder and then add the watermark to them. I do not know how to wire up an intent for choosing videos with FFmpeg. I have tried adding the intent like this, replacing in.mp4 with the PICK_FROM_GALLERY intent:

    private void runTranscodingUsingLoader() {
       Log.i(Prefs.TAG, "runTranscodingUsingLoader started...");

       PowerManager powerManager = (PowerManager)ProgressBarExample.this.getSystemService(Activity.POWER_SERVICE);
       WakeLock wakeLock = powerManager.newWakeLock(PowerManager.PARTIAL_WAKE_LOCK, "VK_LOCK");
       Log.d(Prefs.TAG, "Acquire wake lock");
       wakeLock.acquire();

       Intent intent = new Intent();

       intent.setType("video/*");
       intent.setAction(Intent.ACTION_GET_CONTENT);

       startActivityForResult(Intent.createChooser(intent, "Complete action using"), PICK_FROM_GALLERY);


       String[] complexCommand = {"ffmpeg","-y" ,"-i", String.valueOf(PICK_FROM_GALLERY),"-strict","experimental",
               "-vf", "movie=/sdcard/videokit/watermark.png [watermark];" +
               " [in][watermark] overlay=main_w-overlay_w-10:10 [out]","-s",
               "320x240","-r", "30", "-b", "15496k", "-vcodec", "mpeg4","-ab",
               "48000", "-ac", "2", "-ar", "22050", "/sdcard/videokit/out1.mp4"};

    It opens the gallery, but the watermarking command does not get executed on the selected video. Can someone help me resolve this issue? I could not find proper references/examples on this. My full code is as follows.

    public class ProgressBarExample extends Activity  {

    public ProgressDialog progressBar;

    String workFolder = null;
    String demoVideoFolder = null;
    String demoVideoPath = null;
    String vkLogPath = null;
    LoadJNI vk;
    private final int STOP_TRANSCODING_MSG = -1;
    private final int FINISHED_TRANSCODING_MSG = 0;
    private boolean commandValidationFailedFlag = false;
    private static final int PICK_FROM_GALLERY = 1;

    private void runTranscodingUsingLoader() {
       Log.i(Prefs.TAG, "runTranscodingUsingLoader started...");

       PowerManager powerManager = (PowerManager)ProgressBarExample.this.getSystemService(Activity.POWER_SERVICE);
       WakeLock wakeLock = powerManager.newWakeLock(PowerManager.PARTIAL_WAKE_LOCK, "VK_LOCK");
       Log.d(Prefs.TAG, "Acquire wake lock");
       wakeLock.acquire();




       String[] complexCommand = {"ffmpeg","-y" ,"-i", "/sdcard/videokit/in.mp4","-strict","experimental",
               "-vf", "movie=/sdcard/videokit/watermark.png [watermark];" +
               " [in][watermark] overlay=main_w-overlay_w-10:10 [out]","-s",
               "320x240","-r", "30", "-b", "15496k", "-vcodec", "mpeg4","-ab",
               "48000", "-ac", "2", "-ar", "22050", "/sdcard/videokit/out1.mp4"};
       ///////////////////////////////////////////////////////////////////////


       vk = new LoadJNI();
       try {
           // running complex command with validation
           vk.run(complexCommand, workFolder, getApplicationContext());

           // running without command validation
           //vk.run(complexCommand, workFolder, getApplicationContext(), false);

           // running regular command with validation
           //vk.run(GeneralUtils.utilConvertToComplex(commandStr), workFolder, getApplicationContext());

           Log.i(Prefs.TAG, "vk.run finished.");
           // copying vk.log (internal native log) to the videokit folder
           GeneralUtils.copyFileToFolder(vkLogPath, demoVideoFolder);

       } catch (CommandValidationException e) {
           Log.e(Prefs.TAG, "vk run exception.", e);
           commandValidationFailedFlag = true;

       } catch (Throwable e) {
           Log.e(Prefs.TAG, "vk run exception.", e);
       }
       finally {
           if (wakeLock.isHeld()) {
               wakeLock.release();
               Log.i(Prefs.TAG, "Wake lock released");
           }
           else{
               Log.i(Prefs.TAG, "Wake lock is already released, doing nothing");
           }
       }

       // finished Toast
       String rc = null;
       if (commandValidationFailedFlag) {
           rc = "Command Validation Failed";
       }
       else {
           rc = GeneralUtils.getReturnCodeFromLog(vkLogPath);
       }
       final String status = rc;
       ProgressBarExample.this.runOnUiThread(new Runnable() {
           public void run() {
               Toast.makeText(ProgressBarExample.this, status, Toast.LENGTH_LONG).show();
               if (status.equals("Transcoding Status: Failed")) {
                   Toast.makeText(ProgressBarExample.this, "Check: " + vkLogPath + " for more information.", Toast.LENGTH_LONG).show();
               }
           }
       });
    }


    @Override
    public void onCreate(Bundle savedInstanceState) {
       Log.i(Prefs.TAG, "onCreate ffmpeg4android ProgressBarExample");

       super.onCreate(savedInstanceState);
       setContentView(R.layout.ffmpeg_demo_client_2);

       demoVideoFolder = Environment.getExternalStorageDirectory().getAbsolutePath() + "/videokit/";
       demoVideoPath = demoVideoFolder + "in.mp4";

       Log.i(Prefs.TAG, getString(R.string.app_name) + " version: " + GeneralUtils.getVersionName(getApplicationContext()) );

       Button invoke =  (Button)findViewById(R.id.invokeButton);
       invoke.setOnClickListener(new OnClickListener() {
           public void onClick(View v){
               Log.i(Prefs.TAG, "run clicked.");
               runTranscoding();
           }
       });

       workFolder = getApplicationContext().getFilesDir() + "/";
       Log.i(Prefs.TAG, "workFolder (license and logs location) path: " + workFolder);
       vkLogPath = workFolder + "vk.log";
       Log.i(Prefs.TAG, "vk log (native log) path: " + vkLogPath);
       GeneralUtils.copyLicenseFromAssetsToSDIfNeeded(this, workFolder);
       GeneralUtils.copyDemoVideoFromAssetsToSDIfNeeded(this, demoVideoFolder);
       int rc = GeneralUtils.isLicenseValid(getApplicationContext(), workFolder);
       Log.i(Prefs.TAG, "License check RC: " + rc);
    }



    private Handler handler = new Handler() {
       @Override
       public void handleMessage(Message msg) {
           Log.i(Prefs.TAG, "Handler got message");
           if (progressBar != null) {
               progressBar.dismiss();

               // stopping the transcoding native
               if (msg.what == STOP_TRANSCODING_MSG) {
                   Log.i(Prefs.TAG, "Got cancel message, calling fexit");
                   vk.fExit(getApplicationContext());


               }
           }
       }
    };

    public void runTranscoding() {
       progressBar = new ProgressDialog(ProgressBarExample.this);
       progressBar.setProgressStyle(ProgressDialog.STYLE_HORIZONTAL);
       progressBar.setTitle("FFmpeg4Android Direct JNI");
       progressBar.setMessage("Press the cancel button to end the operation");
       progressBar.setMax(100);
       progressBar.setProgress(0);

       progressBar.setCancelable(false);
       progressBar.setButton(DialogInterface.BUTTON_NEGATIVE, "Cancel", new DialogInterface.OnClickListener() {
           @Override
           public void onClick(DialogInterface dialog, int which) {
               handler.sendEmptyMessage(STOP_TRANSCODING_MSG);
           }
       });

       progressBar.show();

       new Thread() {
           public void run() {
               Log.d(Prefs.TAG,"Worker started");
               try {
                   //sleep(5000);
                   runTranscodingUsingLoader();
                   handler.sendEmptyMessage(FINISHED_TRANSCODING_MSG);

               } catch(Exception e) {
                   Log.e("threadmessage",e.getMessage());
               }
           }
       }.start();

       // Progress update thread
       new Thread() {
           ProgressCalculator pc = new ProgressCalculator(vkLogPath);
           public void run() {
               Log.d(Prefs.TAG,"Progress update started");
               int progress = -1;
               try {
                   while (true) {
                       sleep(300);
                       progress = pc.calcProgress();
                       if (progress != 0 && progress < 100) {
                           progressBar.setProgress(progress);
                       }
                       else if (progress == 100) {
                           Log.i(Prefs.TAG, "==== progress is 100, exiting Progress update thread");
                           pc.initCalcParamsForNextInter();
                           break;
                       }
                   }

               } catch(Exception e) {
                   Log.e("threadmessage",e.getMessage());
               }
           }
       }.start();
    }

    }

    Any help/guidance would be really helpful. Thank you.
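
    A possible shape for the missing piece, sketched under the assumption that the rest of ProgressBarExample stays as posted: startActivityForResult() returns immediately, and String.valueOf(PICK_FROM_GALLERY) is just the request code ("1"), not a file path, so the command array built right after launching the chooser never sees the selection. The selection only arrives in onActivityResult, which is therefore where the path has to be captured before the transcoding is started. The selectedVideoPath field below is hypothetical; it would replace the hard-coded /sdcard/videokit/in.mp4 in runTranscodingUsingLoader().

    // Sketch only - these members would live inside the posted ProgressBarExample
    // activity; selectedVideoPath is a hypothetical field that
    // runTranscodingUsingLoader() would read instead of "/sdcard/videokit/in.mp4".
    private String selectedVideoPath;

    @Override
    protected void onActivityResult(int requestCode, int resultCode, Intent data) {
        super.onActivityResult(requestCode, resultCode, data);
        if (requestCode != PICK_FROM_GALLERY || resultCode != RESULT_OK || data == null) {
            return;                        // user cancelled or nothing was selected
        }
        Uri videoUri = data.getData();
        // getPath() only yields a real filesystem path for file:// URIs; a
        // content:// URI would first have to be resolved or copied to a file.
        selectedVideoPath = videoUri.getPath();
        runTranscoding();                  // start the job only now that the path is known
    }

    With that in place, the Intent/ACTION_GET_CONTENT block can move out of runTranscodingUsingLoader() and into the button's click listener, and the command array can be built from selectedVideoPath in the same way as in the builder sketch after the previous question.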