Advanced search

Media (91)

Other articles (62)

  • MediaSPIP v0.2

    21 June 2013

    MediaSPIP 0.2 is the first stable version of MediaSPIP.
    Its official release date is 21 June 2013, announced here.
    The zip file provided here contains only the MediaSPIP sources, in the standalone version.
    As with the previous version, all of the software dependencies must be installed manually on the server.
    If you want to use this archive for a farm-mode installation, you will also need to make other changes (...)

  • Making files available

    14 April 2011

    By default, when it is initialised, MediaSPIP does not allow visitors to download files, whether they are the originals or the result of their transformation or encoding. It only allows them to be viewed.
    However, it is possible, and easy, to give visitors access to these documents in various forms.
    All of this is done on the skeleton configuration page. You need to go to the channel's administration area and choose in the navigation (...)

  • MediaSPIP version 0.1 Beta

    16 April 2011

    MediaSPIP 0.1 beta is the first version of MediaSPIP deemed "usable".
    The zip file provided here contains only the MediaSPIP sources, in the standalone version.
    To get a working installation, all of the software dependencies must be installed manually on the server.
    If you want to use this archive for a farm-mode installation, you will also need to make other changes (...)

On other sites (7011)

  • MySQL stops running in combination with Laravel Queue, Supervisor, and FFmpeg

    13 June 2014, by egekhter

    After setting up a queue listener to process uploaded videos with FFmpeg, I've come back to the server several times to find that MySQL has stopped running. I checked drive space and it's about 77% used (43G out of 60G).

    Here’s my code in case it’s useful :

    public function fire($job, $data)
    {
        $data = json_decode($data['transcoding_message'], true);
        $output_directory = '/home/ubuntu/transcodes/';
        $amazon_array = array();

        $s3 = AWS::get('s3');

        // NOTE: the guard below is a placeholder; the original post ends with an
        // unmatched "else { return; }", so the real condition is not shown.
        if ($data)
        {
            // execute main transcoding thread
            // ($sq_width, $sq_height and $seek_half are defined elsewhere in the class, not shown in the post)
            $cmd = 'sudo ffmpeg -i ' . $data['temp_file_url'] . ' -y -vcodec libx264 -tune zerolatency -movflags faststart -crf 20 -profile:v main -level:v 3.1 -acodec libfdk_aac -b:a 256k ' . $output_directory . $data['temp_file_key'] . '_HQ.mp4'
                . ' -vcodec libx264 -s ' . $sq_width . 'x' . $sq_height . ' -tune zerolatency -movflags faststart -crf 25 -profile:v main -level:v 3.1 -acodec libfdk_aac -b:a 256k ' . $output_directory . $data['temp_file_key'] . '_SQ.mp4'
                . ' -ss ' . $seek_half . ' -f image2 -vf scale=iw/2:-1 -vframes 1 ' . $output_directory . $data['temp_file_key'] . '_thumb.jpg';

            exec($cmd . " 2>&1", $out, $ret);

            if ($ret)
            {
                // there was a problem
                Log::error($cmd);
                echo 'Processing error' . PHP_EOL;
                return;
            }
            else
            {
                // setup file urls
                echo 'about to move files';

                $hq_url    = $this->bucket_root . $data['user_id'] . '/' . $data['temp_file_key'] . '_HQ.mp4';
                $sq_url    = $this->bucket_root . $data['user_id'] . '/' . $data['temp_file_key'] . '_SQ.mp4';
                $thumb_url = $this->bucket_root . $data['user_id'] . '/' . $data['temp_file_key'] . '_thumb.jpg';

                $amazon_array['video_hq_url']    = $data['temp_file_key'] . '_HQ.mp4';
                $amazon_array['video_sq_url']    = $data['temp_file_key'] . '_SQ.mp4';
                $amazon_array['video_thumb_url'] = $data['temp_file_key'] . '_thumb.jpg';

                // copy from temp to permanent
                foreach ($amazon_array as $k => $f)
                {
                    $uploader = UploadBuilder::newInstance()
                        ->setClient($s3)
                        ->setSource($output_directory . $f)
                        ->setBucket($this->bucket)
                        ->setKey('users/' . $data['user_id'] . '/' . $f)
                        ->setConcurrency(10)
                        ->setOption('ACL', 'public-read')
                        ->build();

                    $uploader->getEventDispatcher()->addListener(
                        'multipart_upload.after_part_upload',
                        function ($event) use ($f) {
                            // Do whatever you want
                        }
                    );

                    try {
                        $uploader->upload();
                        echo "{$k} => Upload complete.\n" . PHP_EOL;

                        DB::table('items')->where('id', $data['item_id'])->update(array($k => $this->bucket_root . $data['user_id'] . '/' . $f, 'deleted_at' => NULL));

                        // delete local copy
                        unlink($output_directory . $f);

                        unset($uploader);
                    } catch (MultipartUploadException $e) {
                        $uploader->abort();
                        echo "{$k} => Upload failed.\n" . PHP_EOL;
                        continue;
                    }
                }

                // write to database
                DB::table('archives_items')->where('id', $data['archive_item_id'])->update(array('deleted_at' => NULL));
                DB::connection('mysql3')->table('video_processing')->where('id', $data['id'])->update(array('finished_processing' => 1));

                // delete the source file from the temp s3 bucket
                $s3->deleteObject(
                    array(
                        'Bucket' => $this->temp_bucket,
                        'Key'    => $data['file_name']
                    )
                );
                echo $data['temp_file_url'] . '=>' . " deleted from temp bucket.\n" . PHP_EOL;
                DB::connection('mysql3')->table('video_processing')->where('id', $data['id'])->update(array('deleted_at' => \Carbon\Carbon::now()));
            }

            $job->delete();
            // end of processing uploaded video
        }
        else
        {
            return;
        }
    }

    Any ideas as to why MySQL would die like that?

    ERROR 2002 (HY000): Can't connect to local MySQL server through socket '/var/run/mysqld/mysqld.sock' (111)

    Edit: I wanted to add that the php artisan queue:listen command is triggered via Supervisor and that I have 4 concurrent processes running.
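
    A first thing to check, as a rough diagnostic sketch: whether the kernel OOM killer is reaping mysqld while several ffmpeg transcodes run at once (the log paths below are the usual Ubuntu defaults, not taken from the question):

    dmesg | grep -iE 'out of memory|killed process'
    grep -iE 'oom|killed' /var/log/syslog /var/log/mysql/error.log
    free -m    # memory headroom while a transcode is running

    If that is what the logs show, lowering Supervisor's numprocs for the queue listener or staggering the transcodes would be the usual levers to try.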

  • H.264 to DNxHR 444 issue. Colors are not transcoded correctly (HDR project). Note: issue not solved yet

    22 January 2020, by Raulo1985

    I'm having an issue transcoding an H.264 UHD HDR file to a DNxHR file in an mxf container with FFmpeg. The issue is that the two files don't look the same at all: the colors look washed out on the DNxHR video, even though I tried to make the transcode as lossless as possible (the DNxHR 444 flavor). The original file is a movie I ripped a while ago: H.264, UHD, HDR, in an mkv container.

    My goal: to create an almost lossless DNxHR file to use as the source file in Adobe Premiere Pro, and another, lower-quality DNxHR file as the proxy for editing. I wanted to do it that way rather than use the original H.264 as the source file because it's out of sync with the proxy file (when I toggle the proxy icon on and off, you can tell there's a short delay between them, which defeats the purpose of editing). My guess is that this is because H.264 is compressed and DNxHR isn't, and since I edit with a lot of fast cuts, I need the source file and the proxy file to be as closely synced as possible. When the source file and the proxy file are both DNxHR, no matter the flavor, they are perfectly synced. I don't want to go with ProRes for the proxies, because the sync problem is a lot worse (several seconds of delay between files), maybe because it's a VBR codec while my original file and DNxHR are CBR (for the record, I always prefer CBR).

    Well, the thing is that when I import the original H.264 file into Premiere Pro, use a DNxHR proxy, edit a little, and export directly from the original file (H.264 10-bit, with all the settings required for HDR output enabled), the colors look as they should. When I do the same with the high-quality DNxHR as the source file, with the exact same export settings, the colors look washed out. The same happens with any DNxHR flavor.

    Then I opened both files (the original H.264 and the high-quality DNxHR transcoded from it) with VLC, and I can also tell that the mxf file looks washed out while the H.264 file doesn't. So it's not an export issue on Premiere's side; it's something to do with the original transcoding.

    I understand that DNxHR 444 is as lossless as you can get with that codec, preserving all the required HDR data, and I believe that the mxf container has some advantages over MOV, which is the other container that supports DNxHD/DNxHR. So I don't know what's really happening.

    The command I used was :

    ffmpeg -channel_layout 63 -i input.mkv -map 0:0 -c:v dnxhd -vf "scale=in_range=limited:out_range=full" -color_range 2 -profile:v dnxhr_444 -pix_fmt yuv444p10le -acodec pcm_s24le -ar 48000 -ac 6 -channel_layout 63 -map 0:2 -hide_banner output.mxf

    Like I said, after the transcoding, both video files look a lot different from each other, color wise. And after using them in Premiere and exporting with the exact same settings, the output files suffer from the same difference.

    Mediainfo shows the expected data for both files :
    - 10 bits, main 10, level 5, 4:2:0, CBR, BT.2020 for the original h.264 file.
    - 10 bits, 4:4:4, CBR for the DNxHR 444 file.

    One thing I noticed in Mediainfo is that both have YUV as the color space, but the DNxHR 444 video has an extra field that says ColorSpace_Original : RGB. Honestly, I don't know what that means, since the original is YUV. Color range is fine, from 0 to 1023 (and chroma range 1023). The other thing is that it says "limited" in the color range field of the H.264 file, but I've read that that could be a bug or a misinterpretation of the file by Mediainfo.

    Well, that’s it, any help would be appreciated. I’d really like to edit with DNxHR 444 as source file and DNxHR LB for the proxies, so I can edit in a fast pace and without sync issues, but the color is just not acceptable. And I do understand that I’m adding an extra transcoding step (from original to DNxHR), but the sync issue between the original and the DNxHR proxies, even though it may be a delay of a fraction of a second, makes my workflow a lot harder since I’ll have to export many times to see if the cuts are made exactly where I want them to be. Not ideal by any means. And Prores is not an option apparently, the sync issue is a lot worse. For me, it all comes down to being able to get a DNxHR 444 file that looks, well, as close to lossless as it can be, and that goal obviously involves the colors.

    Thanks in advance.

    PS : file size is not an issue for me, so having an entire UHD HDR movie transcoded to DNxHR 444 is not a problem.

    PS2: I tried with a different chroma subsampling (like DNxHR HQX 10 bits, which is 4:2:2), same result. Haven't tried 8 bits yet, but I don't see the point since this is an HDR project.

    UPDATE :

    I tried to transcode from the H.264 source file to a DNxHR video file in an MXF container using Adobe Media Encoder instead of FFmpeg, and the colors are again not transcoded correctly, but this time they seem oversaturated instead of washed out. Adobe Media Encoder doesn't give much room for tweaking, but I made sure to select the 444 10-bit profile, the same resolution (UHD), the same frame rate, and rendering with maximum quality and maximum bit depth. FFprobe output of the resulting file again shows BT709 as the color space (the same thing happens with the output file after transcoding with FFmpeg). So it seems to be something not related to FFmpeg. Any ideas? It's like there's no way I can transcode from H.264 to DNxHR and retain the colors correctly, even using its highest-quality flavor and what look to me like correct command settings. How can I post this so that developers or people with lots of experience here can give us a clue about what's happening? Thanks.
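
    One thing still to try: the ffprobe output below reports the DNxHR stream as bt709/unknown/unknown while the source is tagged bt2020nc/bt2020/smpte2084, so a variant of the command above that drops the range-expanding scale filter and tags the output's color properties explicitly might be worth testing (these are standard FFmpeg output options; whether Premiere and VLC honour them on DNxHR in MXF is unclear):

    ffmpeg -i input.mkv -map 0:0 -c:v dnxhd -profile:v dnxhr_444 -pix_fmt yuv444p10le -color_range tv -color_primaries bt2020 -color_trc smpte2084 -colorspace bt2020nc -map 0:2 -acodec pcm_s24le -ar 48000 -ac 6 output.mxf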

    PS: More potentially useful info in the comments below.

    EXTRA INFO :

    1) FFprobe output of the MXF DNxHR file (this one is 4:2:2; the only difference from the command stated in the OP is -pix_fmt yuv422p10le instead of -pix_fmt yuv444p10le):

     libavutil      56. 31.100 / 56. 31.100
     libavcodec     58. 54.100 / 58. 54.100
     libavformat    58. 29.100 / 58. 29.100
     libavdevice    58.  8.100 / 58.  8.100
     libavfilter     7. 57.100 /  7. 57.100
     libswscale      5.  5.100 /  5.  5.100
     libswresample   3.  5.100 /  3.  5.100
     libpostproc    55.  5.100 / 55.  5.100
    [mxf @ 000001f4d17fbac0] Stream #0: not enough frames to estimate rate; consider increasing probesize
    Input #0, mxf, from 'Interstellar_Master_DNxHR_444_UHD_422_PCM24_5.1.mxf':
     Metadata:
       operational_pattern_ul: 060e2b34.04010101.0d010201.01010900
       uid             : adab4424-2f25-4dc7-92ff-29bd000c0000
       generation_uid  : adab4424-2f25-4dc7-92ff-29bd000c0001
       company_name    : FFmpeg
       product_name    : OP1a Muxer
       product_version : 58.29.100
       product_uid     : adab4424-2f25-4dc7-92ff-29bd000c0002
       material_package_umid: 0x060A2B340101010501010D001393EE79529471348D93EE7900529471348D9300
       timecode        : 00:00:00:00
     Duration: 02:49:03.97, start: 0.000000, bitrate: 1404833 kb/s
       Stream #0:0: Video: dnxhd (DNXHR 444), yuv444p10le(bt709/unknown/unknown, progressive), 3840x2160, SAR 1:1 DAR 16:9, 23.98 tbr, 23.98 tbn, 23.98 tbc
       Metadata:
         file_package_umid: 0x060A2B340101010501010D001393EE79529471348D93EE7900529471348D9301
       Stream #0:1: Audio: pcm_s24le, 48000 Hz, 6 channels, s32 (24 bit), 6912 kb/s
       Metadata:
         file_package_umid: 0x060A2B340101010501010D001393EE79529471348D93EE7900529471348D9301

    2) FFprobe output of the MP4 H.264 source file (this one is 4:2:0, 10 bits, HDR) :

       Stream #0:0(eng): Video: hevc (Main 10) (hev1 / 0x31766568), yuv420p10le(tv, bt2020nc/bt2020/smpte2084), 3840x2160 [SAR 1:1 DAR 16:9], 15584 kb/s, 23.98 fps, 23.98 tbr, 16k tbn, 23.98 tbc (default)
       Metadata:
         handler_name    : VideoHandler
       Stream #0:1(eng): Audio: ac3 (ac-3 / 0x332D6361), 48000 Hz, 5.1(side), fltp, 640 kb/s (default)
       Metadata:
         handler_name    : SoundHandler
       Side data:
         audio service type: main
       Stream #0:2(eng): Data: bin_data (text / 0x74786574)
       Metadata:
         handler_name    : SubtitleHandler
    Unsupported codec with id 100359 for input stream 2

  • FFmpeg seeking (-ss) not working on some devices

    4 June 2020, by ClassA

    As the title states, the -ss command is not working on some devices, like the Huawei Mate 10.

    Here is the command I'm using and how I get the values :

    Format formatter = new SimpleDateFormat("00:" + "mm:ss.SS");
    // Video duration
    long duration = getDuration();
    // Video duration divided by 6 (I want to get 5 images)
    long img1 = duration / 6;
    // NOTE: SimpleDateFormat.format() treats the long as epoch milliseconds in the
    // device's default time zone, so on a device in a UTC+x:30 zone a ~2.5 s offset
    // formats as "00:30:02.58" with this pattern (cf. the -ss value in the log below).
    String firstThumbTime = formatter.format(img1);
    // Screen dimensions divided by 7
    String dimentions = width + ":" + height;

    String[] a = {"-ss", firstThumbTime, "-i", mStringFilePath, "-vframes", "1", "-s", dimentions, imageThumbsDirectory + "/" + "thumb1.bmp"};

    The command above looks like this :

    -ss 00:00:00.47 -i /storage/emulated/0/Android/data/com.my.package/files/MyVideos/2020_02_19_16_00_20.mp4 -vframes 1 -s 154:274 /storage/emulated/0/Android/data/com.my.package/files/ThumbTemp/thumb1.bmp

    The strange thing is that it completes without an error, but the output file is unreadable.

    I do not have the logs currently, but I have a user who can send them to me if needed.

    I'm 100% sure that it is caused by -ss because I use a similar command for trimming a video elsewhere in my application, and when the user exports the video without setting trimming points (-ss), the video works.

    Here are the 2 commands I use for trimming a video :

    Working :

    String[] s = {"-i", videonInputPath, "-crf", "18", "-c:v", "libx264", "-preset", "ultrafast", videoOutputPath};

    Not working :

    String[] s = {"-ss", startValue, "-i", videonInputPath, "-crf", "18", "-c:v", "libx264", "-preset", "ultrafast", videoOutputPath};

    This is the first time a user has sent me this issue, so it has to be something to do with his device.

    Any advice would be greatly appreciated.

    EDIT :

    Log as requested by @Gyan in the comment section below (Added -v 48 to command) :

    ffmpeg version n4.0-39-gda39990 Copyright (c) 2000-2018 the FFmpeg developers
built with gcc 4.9.x (GCC) 20150123 (prerelease)
configuration: --target-os=linux --cross-prefix=/root/bravobit/ffmpeg-android/toolchain-android/bin/arm-linux-androideabi- --arch=arm --cpu=cortex-a8 --enable-runtime-cpudetect --sysroot=/root/bravobit/ffmpeg-android/toolchain-android/sysroot --enable-pic --enable-libx264 --enable-ffprobe --enable-libopus --enable-libvorbis --enable-libfdk-aac --enable-libfreetype --enable-libfribidi --enable-libmp3lame --enable-fontconfig --enable-libvpx --enable-libass --enable-yasm --enable-pthreads --disable-debug --enable-version3 --enable-hardcoded-tables --disable-ffplay --disable-linux-perf --disable-doc --disable-shared --enable-static --enable-runtime-cpudetect --enable-nonfree --enable-network --enable-avresample --enable-avformat --enable-avcodec --enable-indev=lavfi --enable-hwaccels --enable-ffmpeg --enable-zlib --enable-gpl --enable-small --enable-nonfree --pkg-config=pkg-config --pkg-config-flags=--static --prefix=/root/bravobit/ffmpeg-android/build/armeabi-v7a --extra-cflags='-I/root/bravobit/ffmpeg-android/toolchain-android/include -U_FORTIFY_SOURCE -D_FORTIFY_SOURCE=2 -fno-strict-overflow -fstack-protector-all' --extra-ldflags='-L/root/bravobit/ffmpeg-android/toolchain-android/lib -Wl,-z,relro -Wl,-z,now -pie' --extra-cxxflags=
libavutil 56. 14.100 / 56. 14.100
libavcodec 58. 18.100 / 58. 18.100
libavformat 58. 12.100 / 58. 12.100
libavdevice 58. 3.100 / 58. 3.100
libavfilter 7. 16.100 / 7. 16.100
libavresample 4. 0. 0 / 4. 0. 0
libswscale 5. 1.100 / 5. 1.100
libswresample 3. 1.100 / 3. 1.100
libpostproc 55. 1.100 / 55. 1.100
Splitting the commandline.
Reading option '-ss' ... matched as option 'ss' (set the start time offset) with argument '00:30:02.58'.
Reading option '-i' ... matched as input url with argument '/storage/emulated/0/Download/email/VID-20180720-WA0001.mp4'.
Reading option '-v' ... matched as option 'v' (set logging level) with argument '48'.
Reading option '-vframes' ... matched as option 'vframes' (set the number of video frames to output) with argument '1'.
Reading option '-s' ... matched as option 's' (set frame size (WxH or abbreviation)) with argument '154:308'.
Reading option '/storage/emulated/0/Android/data/com.my.package/files/ThumbTemp/thumb1.bmp' ... matched as output url.
Finished splitting the commandline.
Parsing a group of options: global .
Applying option v (set logging level) with argument 48.
Successfully parsed a group of options.
Parsing a group of options: input url /storage/emulated/0/Download/email/VID-20180720-WA0001.mp4.
Applying option ss (set the start time offset) with argument 00:30:02.58.
Successfully parsed a group of options.
Opening an input file: /storage/emulated/0/Download/email/VID-20180720-WA0001.mp4.
[NULL @ 0xf1fa5000] Opening '/storage/emulated/0/Download/email/VID-20180720-WA0001.mp4' for reading
[file @ 0xf1f94000] Setting default whitelist 'file,crypto'
[mov,mp4,m4a,3gp,3g2,mj2 @ 0xf1fa5000] Format mov,mp4,m4a,3gp,3g2,mj2 probed with size=2048 and score=100
[mov,mp4,m4a,3gp,3g2,mj2 @ 0xf1fa5000] ISO: File Type Major Brand: mp42
[mov,mp4,m4a,3gp,3g2,mj2 @ 0xf1fa5000] Unknown dref type 0x206c7275 size 12
Last message repeated 1 times
[mov,mp4,m4a,3gp,3g2,mj2 @ 0xf1fa5000] rfps: 29.833333 0.018442
[mov,mp4,m4a,3gp,3g2,mj2 @ 0xf1fa5000] rfps: 29.916667 0.003419
Last message repeated 1 times
[mov,mp4,m4a,3gp,3g2,mj2 @ 0xf1fa5000] rfps: 30.000000 0.000764
Last message repeated 1 times
[mov,mp4,m4a,3gp,3g2,mj2 @ 0xf1fa5000] rfps: 60.000000 0.003057
Last message repeated 1 times
[mov,mp4,m4a,3gp,3g2,mj2 @ 0xf1fa5000] rfps: 120.000000 0.012228
[mov,mp4,m4a,3gp,3g2,mj2 @ 0xf1fa5000] rfps: 29.970030 0.000295
Last message repeated 1 times
[mov,mp4,m4a,3gp,3g2,mj2 @ 0xf1fa5000] rfps: 59.940060 0.001178
Last message repeated 1 times
[mov,mp4,m4a,3gp,3g2,mj2 @ 0xf1fa5000] Before avformat_find_stream_info() pos: 6937 bytes read:32768 seeks:0 nb_streams:2
[h264 @ 0xf1fda380] nal_unit_type: 7, nal_ref_idc: 1
[h264 @ 0xf1fda380] nal_unit_type: 8, nal_ref_idc: 1
[h264 @ 0xf1fda380] nal_unit_type: 6, nal_ref_idc: 0
[h264 @ 0xf1fda380] nal_unit_type: 5, nal_ref_idc: 1
[h264 @ 0xf1fda380] Format yuv420p chosen by get_format().
[h264 @ 0xf1fda380] Reinit context to 848x480, pix_fmt: yuv420p
[mov,mp4,m4a,3gp,3g2,mj2 @ 0xf1fa5000] All info found
[mov,mp4,m4a,3gp,3g2,mj2 @ 0xf1fa5000] After avformat_find_stream_info() pos: 67446 bytes read:67446 seeks:0 frames:45
Input #0, mov,mp4,m4a,3gp,3g2,mj2, from '/storage/emulated/0/Download/email/VID-20180720-WA0001.mp4':
Metadata:
major_brand : mp42
minor_version : 1
compatible_brands: mp41mp42isom
creation_time : 2018-07-20T19:57:44.000000Z
Duration: 00:00:15.52, start: 0.000000, bitrate: 1679 kb/s
Stream #0:0(und), 1, 1/600: Video: h264, 1 reference frame (avc1 / 0x31637661), yuv420p(tv, bt709, left), 848x480, 0/1, 1622 kb/s, 30.02 fps, 29.97 tbr, 600 tbn, 1200 tbc (default)
Metadata:
rotate : 90
creation_time : 2018-07-20T19:57:44.000000Z
handler_name : Core Media Video
Side data:
displaymatrix: rotation of -90.00 degrees
Stream #0:1(und), 44, 1/44100: Audio: aac (mp4a / 0x6134706D), 44100 Hz, stereo, fltp, 53 kb/s (default)
Metadata:
creation_time : 2018-07-20T19:57:44.000000Z
handler_name : Core Media Audio
Successfully opened the file.
Parsing a group of options: output url /storage/emulated/0/Android/data/com.my.package/files/ThumbTemp/thumb1.bmp.
Applying option vframes (set the number of video frames to output) with argument 1.
Applying option s (set frame size (WxH or abbreviation)) with argument 154:308.
Successfully parsed a group of options.
Opening an output file: /storage/emulated/0/Android/data/com.my.package/files/ThumbTemp/thumb1.bmp.
Successfully opened the file.
detected 8 logical cores
[h264 @ 0xf1fdbf80] nal_unit_type: 7, nal_ref_idc: 1
[h264 @ 0xf1fdbf80] nal_unit_type: 8, nal_ref_idc: 1
Stream mapping:
Stream #0:0 -> #0:0 (h264 (native) -> bmp (native))
Press [q] to stop, [?] for help
cur_dts is invalid (this is harmless if it occurs once at the start per stream)
Last message repeated 1 times
[h264 @ 0xf1fdbf80] nal_unit_type: 5, nal_ref_idc: 1
[h264 @ 0xf1fdbf80] Format yuv420p chosen by get_format().
[h264 @ 0xf1fdbf80] Reinit context to 848x480, pix_fmt: yuv420p
cur_dts is invalid (this is harmless if it occurs once at the start per stream)
[h264 @ 0xf1fdc300] nal_unit_type: 1, nal_ref_idc: 1
[h264 @ 0xf1fdc680] nal_unit_type: 1, nal_ref_idc: 1
cur_dts is invalid (this is harmless if it occurs once at the start per stream)
Last message repeated 1 times
[h264 @ 0xf1fdca00] nal_unit_type: 1, nal_ref_idc: 1
cur_dts is invalid (this is harmless if it occurs once at the start per stream)
[h264 @ 0xf1fdcd80] nal_unit_type: 1, nal_ref_idc: 1
cur_dts is invalid (this is harmless if it occurs once at the start per stream)
[h264 @ 0xf1fdd100] nal_unit_type: 1, nal_ref_idc: 1
cur_dts is invalid (this is harmless if it occurs once at the start per stream)
[h264 @ 0xf1fdd480] nal_unit_type: 1, nal_ref_idc: 1
cur_dts is invalid (this is harmless if it occurs once at the start per stream)
[h264 @ 0xf1fdd800] nal_unit_type: 1, nal_ref_idc: 1
[h264 @ 0xf1fddb80] nal_unit_type: 1, nal_ref_idc: 1
[graph 0 input from stream 0:0 @ 0xf1fe9360] Setting 'video_size' to value '848x480'
[graph 0 input from stream 0:0 @ 0xf1fe9360] Setting 'pix_fmt' to value '0'
[graph 0 input from stream 0:0 @ 0xf1fe9360] Setting 'time_base' to value '1/600'
[graph 0 input from stream 0:0 @ 0xf1fe9360] Setting 'pixel_aspect' to value '0/1'
[graph 0 input from stream 0:0 @ 0xf1fe9360] Setting 'sws_param' to value 'flags=2'
[graph 0 input from stream 0:0 @ 0xf1fe9360] Setting 'frame_rate' to value '30000/1001'
[graph 0 input from stream 0:0 @ 0xf1fe9360] w:848 h:480 pixfmt:yuv420p tb:1/600 fr:30000/1001 sar:0/1 sws_param:flags=2
[transpose @ 0xf1fe9420] Setting 'dir' to value 'clock'
[scaler_out_0_0 @ 0xf1fe9540] Setting 'w' to value '154'
[scaler_out_0_0 @ 0xf1fe9540] Setting 'h' to value '308'
[scaler_out_0_0 @ 0xf1fe9540] Setting 'flags' to value 'bicubic'
[scaler_out_0_0 @ 0xf1fe9540] w:154 h:308 flags:'bicubic' interl:0
[format @ 0xf1fe95a0] Setting 'pix_fmts' to value 'bgra|bgr24|rgb565le|rgb555le|rgb444le|rgb8|bgr8|rgb4_byte|bgr4_byte|gray|pal8|monob'
[AVFilterGraph @ 0xf1fb20c0] query_formats: 7 queried, 6 merged, 0 already done, 0 delayed
[scaler_out_0_0 @ 0xf1fe9540] picking bgr24 out of 12 ref:yuv420p alpha:0
[transpose @ 0xf1fe9420] w:848 h:480 dir:1 -> w:480 h:848 rotation:clockwise vflip:0
[scaler_out_0_0 @ 0xf1fe9540] w:480 h:848 fmt:yuv420p sar:0/1 -> w:154 h:308 fmt:bgr24 sar:0/1 flags:0x4
Output #0, image2, to '/storage/emulated/0/Android/data/com.my.package/files/ThumbTemp/thumb1.bmp':
Metadata:
major_brand : mp42
minor_version : 1
compatible_brands: mp41mp42isom
encoder : Lavf58.12.100
Stream #0:0(und), 0, 1001/30000: Video: bmp, 1 reference frame, bgr24(left), 154x308, 0/1, q=2-31, 200 kb/s, 29.97 fps, 29.97 tbn, 29.97 tbc (default)
Metadata:
encoder : Lavc58.18.100 bmp
creation_time : 2018-07-20T19:57:44.000000Z
handler_name : Core Media Video
Side data:
displaymatrix: rotation of -0.00 degrees
cur_dts is invalid (this is harmless if it occurs once at the start per stream)
[h264 @ 0xf1fdbf80] nal_unit_type: 1, nal_ref_idc: 1
cur_dts is invalid (this is harmless if it occurs once at the start per stream)
[h264 @ 0xf1fdc300] nal_unit_type: 1, nal_ref_idc: 1
cur_dts is invalid (this is harmless if it occurs once at the start per stream)
[h264 @ 0xf1fdc680] nal_unit_type: 1, nal_ref_idc: 1
cur_dts is invalid (this is harmless if it occurs once at the start per stream)
[h264 @ 0xf1fdca00] nal_unit_type: 1, nal_ref_idc: 1
cur_dts is invalid (this is harmless if it occurs once at the start per stream)
[h264 @ 0xf1fdcd80] nal_unit_type: 1, nal_ref_idc: 1
cur_dts is invalid (this is harmless if it occurs once at the start per stream)
[h264 @ 0xf1fdd100] nal_unit_type: 1, nal_ref_idc: 1
cur_dts is invalid (this is harmless if it occurs once at the start per stream)
[h264 @ 0xf1fdd480] nal_unit_type: 1, nal_ref_idc: 1
cur_dts is invalid (this is harmless if it occurs once at the start per stream)
Last message repeated 9 times
[out_0_0 @ 0xf1fe94e0] EOF on sink link out_0_0:default.
No more output streams to write to, finishing.
frame= 0 fps=0.0 q=0.0 Lsize=N/A time=00:00:00.00 bitrate=N/A speed= 0x 
video:0kB audio:0kB subtitle:0kB other streams:0kB global headers:0kB muxing overhead: unknown
Input file #0 (/storage/emulated/0/Download/email/VID-20180720-WA0001.mp4):
Input stream #0:0 (video): 16 packets read (153990 bytes); 16 frames decoded; 
Input stream #0:1 (audio): 0 packets read (0 bytes); 
Total: 16 packets (153990 bytes) demuxed
Output file #0 (/storage/emulated/0/Android/data/com.my.package/files/ThumbTemp/thumb1.bmp):
Output stream #0:0 (video): 0 frames encoded; 0 packets muxed (0 bytes); 
Total: 0 packets (0 bytes) muxed
Output file is empty, nothing was encoded (check -ss / -t / -frames parameters if used)
16 frames successfully decoded, 0 decoding errors
[AVIOContext @ 0xf1fbc000] Statistics: 221436 bytes read, 1 seeks
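
    Note that the log above applies -ss 00:30:02.58 to a clip whose reported Duration is only 00:00:15.52, which is why nothing is encoded ("Output file is empty, nothing was encoded"). A quick sanity check, as a sketch, is to compare the computed offset against the real duration before building the command:

    # prints the duration in seconds; any -ss at or beyond it yields zero output frames
    ffprobe -v error -show_entries format=duration -of default=noprint_wrappers=1:nokey=1 VID-20180720-WA0001.mp4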