Advanced search

Media (0)

Word: - Tags -/signalement

No media matching your criteria is available on the site.

Other articles (64)

  • User profiles

    12 April 2011, by

    Each user has a profile page that lets them edit their personal information. In the default top-of-page menu, a menu item is created automatically when MediaSPIP is initialized, visible only when the visitor is logged in on the site.
    The user can reach profile editing from their author page; a "Modifier votre profil" ("Edit your profile") link in the navigation is (...)

  • Configuring language support

    15 November 2010, by

    Accessing the configuration and adding supported languages
    To configure support for new languages, you need to go to the "Administrer" (administration) section of the site.
    From there, in the navigation menu, you can reach a "Gestion des langues" (language management) section that lets you enable support for new languages.
    Each newly added language remains deactivatable as long as no object has been created in that language. Once an object exists in it, the language becomes greyed out in the configuration and (...)

  • XMP PHP

    13 May 2011, by

    According to Wikipedia, XMP stands for:
    Extensible Metadata Platform, or XMP, an XML-based metadata format used in PDF, photography and graphics applications. It was launched by Adobe Systems in April 2001, integrated into version 5.0 of Adobe Acrobat.
    Being based on XML, it manages a set of dynamic tags for use in the Semantic Web.
    XMP makes it possible to record, as an XML document, information about a file: its title, author, history (...)
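As an illustration of the idea (a hypothetical fragment, not taken from any particular file), an XMP packet storing a title and an author as RDF/XML looks roughly like this:

```xml
<x:xmpmeta xmlns:x="adobe:ns:meta/">
  <rdf:RDF xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#">
    <rdf:Description rdf:about=""
        xmlns:dc="http://purl.org/dc/elements/1.1/">
      <dc:title>
        <rdf:Alt><rdf:li xml:lang="x-default">Example title</rdf:li></rdf:Alt>
      </dc:title>
      <dc:creator>
        <rdf:Seq><rdf:li>Example author</rdf:li></rdf:Seq>
      </dc:creator>
    </rdf:Description>
  </rdf:RDF>
</x:xmpmeta>
```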

On other sites (5280)

  • Why does a JPEG image look different in two videos?

    28 September 2015, by user606521

    I have two frames:

    f1.jpg
    f2.jpg

    And I am creating two videos:

    ./ffmpeg -i ./f%d.jpg -r 30 -y m1.mp4 # both frames 1 and 2
    ./ffmpeg -i ./f2.jpg -r 30 -y m2.mp4 # only frame 2

    For some reason, frame 2 looks different in these two videos.

    Frame 2 from m1.mp4:

    $ ./ffmpeg -i m1.mp4 out1_%d.jpg
    ffmpeg version 2.7.2 Copyright (c) 2000-2015 the FFmpeg developers
     built with llvm-gcc 4.2.1 (LLVM build 2336.11.00)
     configuration: --prefix=/Volumes/Ramdisk/sw --enable-gpl --enable-pthreads --enable-version3 --enable-libspeex --enable-libvpx --disable-decoder=libvpx --enable-libmp3lame --enable-libtheora --enable-libvorbis --enable-libx264 --enable-avfilter --enable-libopencore_amrwb --enable-libopencore_amrnb --enable-filters --enable-libgsm --enable-libvidstab --enable-libx265 --disable-doc --arch=x86_64 --enable-runtime-cpudetect
     libavutil      54. 27.100 / 54. 27.100
     libavcodec     56. 41.100 / 56. 41.100
     libavformat    56. 36.100 / 56. 36.100
     libavdevice    56.  4.100 / 56.  4.100
     libavfilter     5. 16.101 /  5. 16.101
     libswscale      3.  1.101 /  3.  1.101
     libswresample   1.  2.100 /  1.  2.100
     libpostproc    53.  3.100 / 53.  3.100
    Input #0, mov,mp4,m4a,3gp,3g2,mj2, from 'm1.mp4':
     Metadata:
       major_brand     : isom
       minor_version   : 512
       compatible_brands: isomiso2avc1mp41
       encoder         : Lavf56.36.100
     Duration: 00:00:00.07, start: 0.000000, bitrate: 478 kb/s
       Stream #0:0(und): Video: h264 (High) (avc1 / 0x31637661), yuv420p, 980x100, 381 kb/s, 30 fps, 30 tbr, 15360 tbn, 60 tbc (default)
       Metadata:
         handler_name    : VideoHandler
    [swscaler @ 0x7f7f9a000600] deprecated pixel format used, make sure you did set range correctly
    Output #0, image2, to 'out1_%d.jpg':
     Metadata:
       major_brand     : isom
       minor_version   : 512
       compatible_brands: isomiso2avc1mp41
       encoder         : Lavf56.36.100
       Stream #0:0(und): Video: mjpeg, yuvj420p(pc), 980x100, q=2-31, 200 kb/s, 30 fps, 30 tbn, 30 tbc (default)
       Metadata:
         handler_name    : VideoHandler
         encoder         : Lavc56.41.100 mjpeg
    Stream mapping:
     Stream #0:0 -> #0:0 (h264 (native) -> mjpeg (native))
    Press [q] to stop, [?] for help
    frame=    2 fps=0.0 q=1.6 Lsize=N/A time=00:00:00.06 bitrate=N/A    
    video:6kB audio:0kB subtitle:0kB other streams:0kB global headers:0kB muxing overhead: unknown

    f1_out.jpg

    Frame 2 from m2.mp4:

    $ ./ffmpeg -i m2.mp4 out2_%d.jpg
    ffmpeg version 2.7.2 Copyright (c) 2000-2015 the FFmpeg developers
     built with llvm-gcc 4.2.1 (LLVM build 2336.11.00)
     configuration: --prefix=/Volumes/Ramdisk/sw --enable-gpl --enable-pthreads --enable-version3 --enable-libspeex --enable-libvpx --disable-decoder=libvpx --enable-libmp3lame --enable-libtheora --enable-libvorbis --enable-libx264 --enable-avfilter --enable-libopencore_amrwb --enable-libopencore_amrnb --enable-filters --enable-libgsm --enable-libvidstab --enable-libx265 --disable-doc --arch=x86_64 --enable-runtime-cpudetect
     libavutil      54. 27.100 / 54. 27.100
     libavcodec     56. 41.100 / 56. 41.100
     libavformat    56. 36.100 / 56. 36.100
     libavdevice    56.  4.100 / 56.  4.100
     libavfilter     5. 16.101 /  5. 16.101
     libswscale      3.  1.101 /  3.  1.101
     libswresample   1.  2.100 /  1.  2.100
     libpostproc    53.  3.100 / 53.  3.100
    Input #0, mov,mp4,m4a,3gp,3g2,mj2, from 'm2.mp4':
     Metadata:
       major_brand     : isom
       minor_version   : 512
       compatible_brands: isomiso2avc1mp41
       encoder         : Lavf56.36.100
     Duration: 00:00:00.03, start: 0.000000, bitrate: 824 kb/s
       Stream #0:0(und): Video: h264 (High) (avc1 / 0x31637661), yuvj420p(pc), 980x100 [SAR 1:1 DAR 49:5], 648 kb/s, 30 fps, 30 tbr, 15360 tbn, 60 tbc (default)
       Metadata:
         handler_name    : VideoHandler
    Output #0, image2, to 'out2_%d.jpg':
     Metadata:
       major_brand     : isom
       minor_version   : 512
       compatible_brands: isomiso2avc1mp41
       encoder         : Lavf56.36.100
       Stream #0:0(und): Video: mjpeg, yuvj420p(pc), 980x100 [SAR 1:1 DAR 49:5], q=2-31, 200 kb/s, 30 fps, 30 tbn, 30 tbc (default)
       Metadata:
         handler_name    : VideoHandler
         encoder         : Lavc56.41.100 mjpeg
    Stream mapping:
     Stream #0:0 -> #0:0 (h264 (native) -> mjpeg (native))
    Press [q] to stop, [?] for help
    frame=    1 fps=0.0 q=1.6 Lsize=N/A time=00:00:00.03 bitrate=N/A    
    video:4kB audio:0kB subtitle:0kB other streams:0kB global headers:0kB muxing overhead: unknown

    f2_out.jpg

    It seems that frame 2 looks different when combined with frame 1 in a video. One thing I noticed is that when dumping frames from m1.mp4 (the two frames combined), there is a warning in the ffmpeg output:

    [swscaler @ 0x7f7f9a000600] deprecated pixel format used, make sure you did set range correctly

    Also, the two frame images have the same dimensions (980x100) and their formats seem to be the same:

    $ file -b f1.jpg
    JPEG image data, JFIF standard 1.01, comment: "CS=ITU601??"
    $ file -b f2.jpg
    JPEG image data, JFIF standard 1.01

    So why is this happening? I have been struggling with this issue for quite a long time: for some reason, some JPEGs are in a different "format" than others and make a mess in videos...


    You can download the first two images included in this question and test the commands on your own.


    $ identify -verbose f1.jpg
    Image: f1.jpg
     Format: JPEG (Joint Photographic Experts Group JFIF format)
     Mime type: image/jpeg
     Class: DirectClass
     Geometry: 980x100+0+0
     Resolution: 552x551
     Print size: 1.77536x0.181488
     Units: PixelsPerInch
     Type: Palette
     Endianess: Undefined
     Colorspace: sRGB
     Depth: 8-bit
     Channel depth:
       red: 8-bit
       green: 8-bit
       blue: 8-bit
     Channel statistics:
       Red:
         min: 189 (0.741176)
         max: 233 (0.913725)
         mean: 208.261 (0.81671)
         standard deviation: 14.1741 (0.0555848)
         kurtosis: -1.40078
         skewness: 0.332664
       Green:
         min: 189 (0.741176)
         max: 233 (0.913725)
         mean: 208.261 (0.81671)
         standard deviation: 14.1741 (0.0555848)
         kurtosis: -1.40078
         skewness: 0.332664
       Blue:
         min: 189 (0.741176)
         max: 233 (0.913725)
         mean: 208.26 (0.816706)
         standard deviation: 14.1756 (0.0555905)
         kurtosis: -1.40068
         skewness: 0.332424
     Image statistics:
       Overall:
         min: 189 (0.741176)
         max: 233 (0.913725)
         mean: 208.261 (0.816709)
         standard deviation: 14.1746 (0.0555867)
         kurtosis: -1.40075
         skewness: 0.332584
     Colors: 47
     Histogram:
          368: (189,189,189) #BDBDBD grey74
         1984: (190,190,190) #BEBEBE grey
           28: (191,191,189) #BFBFBD srgb(191,191,189)
         3780: (191,191,191) #BFBFBF grey75
           28: (192,192,190) #C0C0BE srgb(192,192,190)
         6628: (192,192,192) #C0C0C0 silver
         5664: (193,193,193) #C1C1C1 srgb(193,193,193)
         4608: (194,194,194) #C2C2C2 grey76
         4480: (195,195,195) #C3C3C3 srgb(195,195,195)
         3328: (196,196,196) #C4C4C4 grey77
         2592: (197,197,197) #C5C5C5 srgb(197,197,197)
         3072: (198,198,198) #C6C6C6 srgb(198,198,198)
         2272: (199,199,199) #C7C7C7 grey78
         2112: (200,200,200) #C8C8C8 srgb(200,200,200)
         2112: (201,201,201) #C9C9C9 grey79
         1920: (202,202,202) #CACACA srgb(202,202,202)
         1728: (203,203,203) #CBCBCB srgb(203,203,203)
         1760: (204,204,204) #CCCCCC grey80
         1696: (205,205,205) #CDCDCD srgb(205,205,205)
         1248: (206,206,206) #CECECE srgb(206,206,206)
         1536: (207,207,207) #CFCFCF grey81
         1504: (208,208,208) #D0D0D0 srgb(208,208,208)
         1344: (209,209,209) #D1D1D1 grey82
         1536: (210,210,210) #D2D2D2 srgb(210,210,210)
         1472: (211,211,211) #D3D3D3 LightGray
         1088: (212,212,212) #D4D4D4 grey83
         1472: (213,213,213) #D5D5D5 srgb(213,213,213)
         1536: (214,214,214) #D6D6D6 grey84
         1344: (215,215,215) #D7D7D7 srgb(215,215,215)
         1184: (216,216,216) #D8D8D8 srgb(216,216,216)
         1408: (217,217,217) #D9D9D9 grey85
         1472: (218,218,218) #DADADA srgb(218,218,218)
         1216: (219,219,219) #DBDBDB grey86
         1280: (220,220,220) #DCDCDC gainsboro
         1536: (221,221,221) #DDDDDD srgb(221,221,221)
         1472: (222,222,222) #DEDEDE grey87
         1600: (223,223,223) #DFDFDF srgb(223,223,223)
         1696: (224,224,224) #E0E0E0 grey88
         1792: (225,225,225) #E1E1E1 srgb(225,225,225)
         1728: (226,226,226) #E2E2E2 srgb(226,226,226)
         1952: (227,227,227) #E3E3E3 grey89
         2272: (228,228,228) #E4E4E4 srgb(228,228,228)
         2752: (229,229,229) #E5E5E5 grey90
         4512: (230,230,230) #E6E6E6 srgb(230,230,230)
         4672: (231,231,231) #E7E7E7 srgb(231,231,231)
          640: (232,232,232) #E8E8E8 grey91
          576: (233,233,233) #E9E9E9 srgb(233,233,233)
     Rendering intent: Perceptual
     Gamma: 0.454545
     Chromaticity:
       red primary: (0.64,0.33)
       green primary: (0.3,0.6)
       blue primary: (0.15,0.06)
       white point: (0.3127,0.329)
     Background color: white
     Border color: srgb(223,223,223)
     Matte color: grey74
     Transparent color: black
     Interlace: None
     Intensity: Undefined
     Compose: Over
     Page geometry: 980x100+0+0
     Dispose: Undefined
     Iterations: 0
     Compression: JPEG
     Orientation: Undefined
     Properties:
       comment: CS=ITU601
       date:create: 2015-09-28T21:47:26+02:00
       date:modify: 2015-09-28T21:47:26+02:00
       jpeg:colorspace: 2
       jpeg:quality: 92
       jpeg:sampling-factor: 2x2,1x1,1x1
       signature: ca599d8ad07c79b36837cb9f4811d83e236b8d4a4cdfada8d60c4aa330f28f38
     Artifacts:
       filename: f1.jpg
       verbose: true
     Tainted: False
     Filesize: 1.36KB
     Number pixels: 98K
     Pixels per second: 9.8MB
     User time: 0.000u
     Elapsed time: 0:01.009
     Version: ImageMagick 6.8.7-0 2013-10-28 Q16 http://www.imagemagick.org

    $ identify -verbose f2.jpg
    Image: f2.jpg
     Format: JPEG (Joint Photographic Experts Group JFIF format)
     Mime type: image/jpeg
     Class: DirectClass
     Geometry: 980x100+0+0
     Resolution: 72x72
     Print size: 13.6111x1.38889
     Units: PixelsPerInch
     Type: Palette
     Endianess: Undefined
     Colorspace: sRGB
     Depth: 8-bit
     Channel depth:
       red: 8-bit
       green: 8-bit
       blue: 8-bit
     Channel statistics:
       Red:
         min: 186 (0.729412)
         max: 254 (0.996078)
         mean: 242.844 (0.952329)
         standard deviation: 11.0845 (0.0434688)
         kurtosis: 4.24417
         skewness: -2.17102
       Green:
         min: 186 (0.729412)
         max: 254 (0.996078)
         mean: 242.844 (0.952329)
         standard deviation: 11.0845 (0.0434688)
         kurtosis: 4.24417
         skewness: -2.17102
       Blue:
         min: 186 (0.729412)
         max: 254 (0.996078)
         mean: 242.842 (0.952323)
         standard deviation: 11.0886 (0.0434848)
         kurtosis: 4.24235
         skewness: -2.17103
     Image statistics:
       Overall:
         min: 186 (0.729412)
         max: 254 (0.996078)
         mean: 242.843 (0.952327)
         standard deviation: 11.0859 (0.0434741)
         kurtosis: 4.24356
         skewness: -2.17102
     Colors: 91
     Histogram:
            2: (186,186,186) #BABABA grey73
            2: (187,187,187) #BBBBBB srgb(187,187,187)
            1: (189,189,189) #BDBDBD grey74
            5: (190,190,190) #BEBEBE grey
            2: (191,191,191) #BFBFBF grey75
           20: (192,192,192) #C0C0C0 silver
           28: (193,193,193) #C1C1C1 srgb(193,193,193)
           37: (194,194,194) #C2C2C2 grey76
           46: (195,195,195) #C3C3C3 srgb(195,195,195)
           59: (196,196,196) #C4C4C4 grey77
          108: (197,197,197) #C5C5C5 srgb(197,197,197)
          134: (198,198,198) #C6C6C6 srgb(198,198,198)
          145: (199,199,199) #C7C7C7 grey78
          188: (200,200,200) #C8C8C8 srgb(200,200,200)
          230: (201,201,201) #C9C9C9 grey79
          241: (202,202,202) #CACACA srgb(202,202,202)
          236: (203,203,203) #CBCBCB srgb(203,203,203)
          252: (204,204,204) #CCCCCC grey80
          240: (205,205,205) #CDCDCD srgb(205,205,205)
          243: (206,206,206) #CECECE srgb(206,206,206)
            1: (207,207,205) #CFCFCD srgb(207,207,205)
          250: (207,207,207) #CFCFCF grey81
            1: (208,208,206) #D0D0CE srgb(208,208,206)
          267: (208,208,208) #D0D0D0 srgb(208,208,208)
            2: (209,209,207) #D1D1CF srgb(209,209,207)
          226: (209,209,209) #D1D1D1 grey82
            3: (210,210,208) #D2D2D0 srgb(210,210,208)
          193: (210,210,210) #D2D2D2 srgb(210,210,210)
            5: (211,211,209) #D3D3D1 srgb(211,211,209)
          215: (211,211,211) #D3D3D3 LightGray
            7: (212,212,210) #D4D4D2 srgb(212,212,210)
          227: (212,212,212) #D4D4D4 grey83
           11: (213,213,211) #D5D5D3 srgb(213,213,211)
          250: (213,213,213) #D5D5D5 srgb(213,213,213)
            4: (214,214,212) #D6D6D4 srgb(214,214,212)
          291: (214,214,214) #D6D6D6 grey84
           16: (215,215,213) #D7D7D5 srgb(215,215,213)
          307: (215,215,215) #D7D7D7 srgb(215,215,215)
            1: (216,216,214) #D8D8D6 srgb(216,216,214)
          371: (216,216,216) #D8D8D8 srgb(216,216,216)
            4: (217,217,215) #D9D9D7 srgb(217,217,215)
          355: (217,217,217) #D9D9D9 grey85
            3: (218,218,216) #DADAD8 srgb(218,218,216)
          398: (218,218,218) #DADADA srgb(218,218,218)
            3: (219,219,217) #DBDBD9 srgb(219,219,217)
          404: (219,219,219) #DBDBDB grey86
            5: (220,220,218) #DCDCDA srgb(220,220,218)
          435: (220,220,220) #DCDCDC gainsboro
            1: (221,221,219) #DDDDDB srgb(221,221,219)
          489: (221,221,221) #DDDDDD srgb(221,221,221)
            2: (222,222,220) #DEDEDC srgb(222,222,220)
          569: (222,222,222) #DEDEDE grey87
            1: (223,223,221) #DFDFDD srgb(223,223,221)
          552: (223,223,223) #DFDFDF srgb(223,223,223)
            2: (224,224,222) #E0E0DE srgb(224,224,222)
          595: (224,224,224) #E0E0E0 grey88
            2: (225,225,223) #E1E1DF srgb(225,225,223)
          645: (225,225,225) #E1E1E1 srgb(225,225,225)
          736: (226,226,226) #E2E2E2 srgb(226,226,226)
            3: (227,227,225) #E3E3E1 srgb(227,227,225)
          646: (227,227,227) #E3E3E3 grey89
            1: (228,228,226) #E4E4E2 srgb(228,228,226)
          707: (228,228,228) #E4E4E4 srgb(228,228,228)
            1: (229,229,227) #E5E5E3 srgb(229,229,227)
          667: (229,229,229) #E5E5E5 grey90
            1: (230,230,228) #E6E6E4 srgb(230,230,228)
          759: (230,230,230) #E6E6E6 srgb(230,230,230)
          767: (231,231,231) #E7E7E7 srgb(231,231,231)
          788: (232,232,232) #E8E8E8 grey91
          862: (233,233,233) #E9E9E9 srgb(233,233,233)
          880: (234,234,234) #EAEAEA srgb(234,234,234)
          889: (235,235,235) #EBEBEB grey92
          863: (236,236,236) #ECECEC srgb(236,236,236)
          868: (237,237,237) #EDEDED grey93
         1032: (238,238,238) #EEEEEE srgb(238,238,238)
          878: (239,239,239) #EFEFEF srgb(239,239,239)
         1083: (240,240,240) #F0F0F0 grey94
         1035: (241,241,241) #F1F1F1 srgb(241,241,241)
         1247: (242,242,242) #F2F2F2 grey95
         1610: (243,243,243) #F3F3F3 srgb(243,243,243)
         2084: (244,244,244) #F4F4F4 srgb(244,244,244)
         3473: (245,245,245) #F5F5F5 grey96
         6350: (246,246,246) #F6F6F6 srgb(246,246,246)
         9152: (247,247,247) #F7F7F7 grey97
        14755: (248,248,248) #F8F8F8 srgb(248,248,248)
        21183: (249,249,249) #F9F9F9 srgb(249,249,249)
        12507: (250,250,250) #FAFAFA grey98
         2516: (251,251,251) #FBFBFB srgb(251,251,251)
          305: (252,252,252) #FCFCFC grey99
           16: (253,253,253) #FDFDFD srgb(253,253,253)
            4: (254,254,254) #FEFEFE srgb(254,254,254)
     Rendering intent: Perceptual
     Gamma: 0.454545
     Chromaticity:
       red primary: (0.64,0.33)
       green primary: (0.3,0.6)
       blue primary: (0.15,0.06)
       white point: (0.3127,0.329)
     Background color: white
     Border color: srgb(223,223,223)
     Matte color: grey74
     Transparent color: black
     Interlace: None
     Intensity: Undefined
     Compose: Over
     Page geometry: 980x100+0+0
     Dispose: Undefined
     Iterations: 0
     Compression: JPEG
     Orientation: Undefined
     Properties:
       date:create: 2015-09-28T21:48:30+02:00
       date:modify: 2015-09-28T21:48:30+02:00
       jpeg:colorspace: 2
       jpeg:quality: 92
       jpeg:sampling-factor: 2x2,1x1,1x1
       signature: f718ab157fae4ff0395eaf07a0165897fd9de558eaed00586530690d39e5ed23
     Artifacts:
       filename: f2.jpg
       verbose: true
     Tainted: False
     Filesize: 4.32KB
     Number pixels: 98K
     Pixels per second: 0B
     User time: 0.000u
     Elapsed time: 0:01.000
     Version: ImageMagick 6.8.7-0 2013-10-28 Q16 http://www.imagemagick.org
  • FFmpeg: generating H264 video in C++

    13 October 2015, by EZEROFIVE EMD

    I'm using the ffmpeg library on Windows with VS2012 to convert a series of images into MP4 with H264 encoding. I'm new to FFmpeg.

    Below is my code. Everything runs fine and the video is created, but I can play the video in VLC only if I change the extension to ".h264", and its codec information then reads "H264 - MPEG-4 AVC (part 10) (h264)". When I check the same for other MP4 videos downloaded from the web, it says "H264 - MPEG-4 AVC (part 10) (avc1)". I don't understand where it went wrong. I also searched a lot; some suggest adding SPS and PPS.

    const uint8_t sps[] = { 0x00, 0x00, 0x00, 0x01, 0x67, 0x42, 0x00,
    0x0a, 0xf8, 0x41, 0xa2 };
    const uint8_t pps[] = { 0x00, 0x00, 0x00, 0x01, 0x68, 0xce,
    0x38, 0x80 };

    So I added the above values to the video file before adding the image stream, but no luck.
    Can anyone help with this? Thanks in advance.

    const uint8_t sps[] = { 0x00, 0x00, 0x00, 0x01, 0x67, 0x42, 0x00,
    0x0a, 0xf8, 0x41, 0xa2 };
    const uint8_t pps[] = { 0x00, 0x00, 0x00, 0x01, 0x68, 0xce,
    0x38, 0x80 };
    const uint8_t slice_header[] = { 0x00, 0x00, 0x00, 0x01, 0x05, 0x88,
    0x84, 0x21, 0xa0 };
    const uint8_t macroblock_header[] = { 0x0d, 0x00 };

    const uint8_t spspps[] = { 0x00, 0x00, 0x00, 0x01, 0x67, 0x42, 0x00, 0x0a, 0xf8, 0x41, 0xa2,
                              0x00, 0x00, 0x00, 0x01, 0x68, 0xce, 0x38, 0x80
                            };

    int ff_load_image(uint8_t *data[4], int linesize[4],
           int *w, int *h, enum PixelFormat *pix_fmt,
           const char *filename, void *log_ctx)
       {
           AVInputFormat *iformat = NULL;
           AVFormatContext *format_ctx = NULL;
           AVCodec *codec=NULL;
           AVCodecContext *codec_ctx=NULL;
           AVFrame *frame=NULL;
           int frame_decoded, ret = 0;
           AVPacket pkt;

           av_register_all();

           iformat = av_find_input_format("image2");
           if ((ret = avformat_open_input(&format_ctx, filename, iformat, NULL)) < 0) {
               return ret;
           }

           codec_ctx = format_ctx->streams[0]->codec;
           codec = avcodec_find_decoder(codec_ctx->codec_id);
           if (!codec) {
               ret = AVERROR(EINVAL);
               goto end;
           }

           if ((ret = avcodec_open2(codec_ctx, codec, NULL)) < 0) {
               goto end;
           }

           //if (!(frame = avcodec_alloc_frame()) ) {
           if (!(frame = av_frame_alloc()) ) {
               ret = AVERROR(ENOMEM);
               goto end;
           }

           ret = av_read_frame(format_ctx, &pkt);
           if (ret < 0) {
               goto end;
           }

           ret = avcodec_decode_video2(codec_ctx, frame, &frame_decoded, &pkt);
           if (ret < 0 || !frame_decoded) {
               goto end;
           }
           ret = 0;

           *w       = frame->width;
           *h       = frame->height;
           *pix_fmt = (PixelFormat)frame->format;

           if ((ret = av_image_alloc(data, linesize, *w, *h, (AVPixelFormat)*pix_fmt, 16)) < 0)
               goto end;
           ret = 0;

           av_image_copy(data, linesize, (const uint8_t **)frame->data, frame->linesize, (AVPixelFormat)*pix_fmt, *w, *h);

    end:
           if(codec_ctx) { avcodec_close(codec_ctx); }
           if(format_ctx) { avformat_close_input(&format_ctx); }
           if(frame) { av_freep(&frame); }
           av_free_packet(&pkt);
                   return ret;
       }

       int load_image_into_frame(AVFrame *frame, const char *filename)
       {
           int retval = -1, res;
           static struct SwsContext *sws_ctx;
           uint8_t *image_data[4];
           int linesize[4];
           int source_width, source_height;
           enum PixelFormat source_fmt;

           res = ff_load_image(image_data, linesize, &source_width, &source_height, &source_fmt, filename, NULL);

           if (source_fmt != frame->format) {
               sws_ctx = sws_getContext(source_width, source_height, (AVPixelFormat)source_fmt,
                   frame->width, frame->height, (AVPixelFormat)frame->format,
                   sws_flags, NULL, NULL, NULL);

               sws_scale(sws_ctx,
                   (const uint8_t * const *)image_data, linesize,
                   0, frame->height, frame->data, frame->linesize);
           }

           retval = 0;
    error:
           av_freep(&image_data[0]);
           sws_freeContext(sws_ctx);
           return retval;
       }

       int write_frame_to_file(FILE *file, AVFrame *frame, AVCodecContext *codec_context, AVPacket *pkt) {
           int res, got_output;
           av_init_packet(pkt);
           pkt->data = NULL;
           pkt->size = 0;

           /* generate synthetic video */
           frame->pts += 30;

           res = avcodec_encode_video2(codec_context, pkt, frame, &got_output);

           if (got_output) {

               fwrite(pkt->data, 1, pkt->size, file);
               av_free_packet(pkt);
           }
           return 0;
    error:
           return -1;
       }

       int write_image_to_file(FILE *file, const char *filename, int count, AVFrame *frame, AVCodecContext *codec_context, AVPacket *pkt) {
           int res, i;
           res = load_image_into_frame(frame, filename);

           for (i = 0; i < count; i++) {

               res = write_frame_to_file(file, frame, codec_context, pkt);
           }

           return 0;
    error:
           return -1;
       }

       int write_delayed_frames_to_file(FILE *file, AVFrame *frame, AVCodecContext *codec_context, AVPacket *pkt) {
           int res, got_output;

           for (got_output = 1; got_output;) {
               res = avcodec_encode_video2(codec_context, pkt, NULL, &got_output);

               if (got_output) {
                   fwrite(pkt->data, 1, pkt->size, file);
                   av_free_packet(pkt);
               }
           }

           return 0;
    error:
           return -1;
       }

       AVCodecContext *get_codec_context(int width, int height, int fps)
       {
           int res;
           avcodec_register_all();

           AVCodec *codec;
           AVCodecContext *codec_context = NULL;

           codec = avcodec_find_encoder(AV_CODEC_ID_H264);

           codec_context = avcodec_alloc_context3(codec);

           codec_context->bit_rate = 441000;
           codec_context->width = width;
           codec_context->height = height;
           AVRational temp_113 = {1, fps};
           AVRational temp_114 = {fps, 1};
           codec_context->time_base= temp_113;
           codec_context->gop_size = 10;
           codec_context->max_b_frames=1;
           codec_context->pix_fmt = AV_PIX_FMT_YUV420P;        

           res = avcodec_open2(codec_context, codec, NULL);

           return codec_context;
    error:
           return NULL;
       }

       AVFrame *get_av_frame(AVCodecContext *codec_context) {
           int res;
           AVFrame *frame;

           frame = av_frame_alloc();
           frame->height = codec_context->height;
           frame->width = codec_context->width;
           frame->format = codec_context->pix_fmt;
           frame->pts = 0;

           res = av_image_alloc(frame->data, frame->linesize, frame->width, frame->height, (AVPixelFormat)frame->format, 1);

           return frame;
    error:
           return NULL;
       }

       int main(int argc, char **argv)
       {
           const char *filename = "result video\\test.mp4";
           FILE *file=NULL;
           int res, retval=-1;
           AVCodecContext *codec_context= NULL;
           AVFrame *frame=NULL;
           AVPacket pkt;
           uint8_t endcode[] = { 0, 0, 1, 0xb7 };

           codec_context = get_codec_context(1920, 1080, 30);

           file = fopen(filename, "wb");
           //check(file != NULL, "could not open destination file %s", filename);

           frame = get_av_frame(codec_context);        

           //fwrite(sps, 1, sizeof(sps), file);
           //fwrite(pps, 1, sizeof(pps), file);

           /*codec_context->extradata = (uint8_t *)malloc(sizeof(uint8_t) * sizeof(spspps));

           for(unsigned int index = 0; index < sizeof(spspps); index++)
           {
               codec_context->extradata[index] = spspps[index];
           }

           codec_context->extradata_size = (int)sizeof(spspps);*/

           codec_context->flags |= CODEC_FLAG_GLOBAL_HEADER;

           int i, frames= 51;
           for (i = 0; i < frames; i++) {
               std::stringstream ss;
               ss<<"\\frames\\out"<<( i + 1)<<".jpg";
               res = write_image_to_file(file, ss.str().c_str(), 3, frame, codec_context, &pkt);
           }


           res = write_delayed_frames_to_file(file, frame, codec_context, &pkt);
           fwrite(endcode, 1, sizeof(endcode), file);

           retval = 0;
    error:
           if (file)
               fclose(file);
           if (codec_context) {
               avcodec_close(codec_context);
               av_free(codec_context);
           }
           if (frame) {
               av_freep(&frame->data[0]);
               av_free(frame);
           }
           return retval;
       }
  • Android + OpenCV + video recorder

    19 February 2016, by t0m

    I have a problem with code that works only on a Genymotion device with Android 4.1.1, but not on a Genymotion device with Android 5.0.1 or on a real Huawei Honor 4C with Android 4.4.2.

    I imported OpenCV 3.1 into Android Studio following: http://stackoverflow.com/a/27421494/4244605
    I added JavaCV with FFmpeg from: https://github.com/bytedeco/javacv

    minSdkVersion 15
    compileSdkVersion 23

    OpenCVCameraActivity.java:

    package co.timeiseverything.pstimeiseverything;

    import android.app.Activity;
    import android.hardware.Camera;
    import android.media.AudioFormat;
    import android.media.AudioRecord;
    import android.media.MediaRecorder;
    import android.os.Bundle;
    import android.os.Environment;
    import android.util.Log;
    import android.view.Menu;
    import android.view.MenuItem;
    import android.view.MotionEvent;
    import android.view.SubMenu;
    import android.view.SurfaceView;
    import android.view.View;
    import android.view.WindowManager;
    import android.widget.Toast;

    import org.bytedeco.javacv.FFmpegFrameRecorder;
    import org.bytedeco.javacv.Frame;
    import org.opencv.android.BaseLoaderCallback;
    import org.opencv.android.CameraBridgeViewBase;
    import org.opencv.android.LoaderCallbackInterface;
    import org.opencv.android.OpenCVLoader;
    import org.opencv.core.Mat;

    import java.io.File;
    import java.nio.ShortBuffer;
    import java.text.SimpleDateFormat;
    import java.util.Date;
    import java.util.List;
    import java.util.ListIterator;

    @SuppressWarnings("ALL")
    public class OpenCVCameraActivity extends Activity implements
           CameraBridgeViewBase.CvCameraViewListener2,
           View.OnTouchListener {

       //name of activity, for DEBUGGING
       private static final String TAG = OpenCVCameraActivity.class.getSimpleName();

       private OpenCVCameraPreview mOpenCvCameraView;
       private List mResolutionList;
       private MenuItem[] mEffectMenuItems;
       private SubMenu mColorEffectsMenu;
       private MenuItem[] mResolutionMenuItems;
       private SubMenu mResolutionMenu;

       private static long frameCounter = 0;

       long startTime = 0;
       private Mat edgesMat;
       boolean recording = false;
       private int sampleAudioRateInHz = 44100;
       private int imageWidth = 1920;
       private int imageHeight = 1080;
       private int frameRate = 30;
       private Frame yuvImage = null;
       private File ffmpeg_link;
       private FFmpegFrameRecorder recorder;

       /* audio data getting thread */
       private AudioRecord audioRecord;
       private AudioRecordRunnable audioRecordRunnable;
       private Thread audioThread;
       volatile boolean runAudioThread = true;
       ShortBuffer[] samples;


       private BaseLoaderCallback mLoaderCallback = new BaseLoaderCallback(this) {
           @Override
           public void onManagerConnected(int status) {
               switch (status) {
                   case LoaderCallbackInterface.SUCCESS:
                       Log.i(TAG, "OpenCV loaded successfully");
                       mOpenCvCameraView.enableView();
                       mOpenCvCameraView.setOnTouchListener(OpenCVCameraActivity.this);
                   break;
                   default:
                       super.onManagerConnected(status);
                   break;
               }
           }
       };

       @Override
       protected void onCreate(Bundle savedInstanceState) {
           super.onCreate(savedInstanceState);
           if(Static.DEBUG) Log.i(TAG, "onCreate()");

           getWindow().addFlags(WindowManager.LayoutParams.FLAG_KEEP_SCREEN_ON);

           try {
               setContentView(R.layout.activity_opencv);

               mOpenCvCameraView = (OpenCVCameraPreview) findViewById(R.id.openCVCameraPreview);
               mOpenCvCameraView.setVisibility(SurfaceView.VISIBLE);
               mOpenCvCameraView.setCvCameraViewListener(this);

               //mOpenCvCameraView.enableFpsMeter();

               ffmpeg_link = new File(Environment.getExternalStorageDirectory(), "stream.mp4");
           } catch (Exception e){
               e.printStackTrace();
           }
       }

       @Override
       protected void onRestart() {
           if (Static.DEBUG) Log.i(TAG, "onRestart()");
           super.onRestart();
       }

       @Override
       protected void onStart() {
           if (Static.DEBUG) Log.i(TAG, "onStart()");
           super.onStart();
       }

       @Override
       protected void onResume() {
           if (Static.DEBUG) Log.i(TAG, "onResume()");
           super.onResume();

           if (!OpenCVLoader.initDebug()) {
               Log.i(TAG, "Internal OpenCV library not found. Using OpenCV Manager for initialization");
               OpenCVLoader.initAsync(OpenCVLoader.OPENCV_VERSION_2_4_11, this, mLoaderCallback);
           } else {
               Log.i(TAG, "OpenCV library found inside package. Using it!");
               mLoaderCallback.onManagerConnected(LoaderCallbackInterface.SUCCESS);
           }

       }

       @Override
       public boolean onCreateOptionsMenu(Menu menu) {
           if (Static.DEBUG) Log.i(TAG, "onCreateOptionsMenu()");
           super.onCreateOptionsMenu(menu);

           List<String> effects = mOpenCvCameraView.getEffectList();

           if (effects == null) {
               Log.e(TAG, "Color effects are not supported by device!");
               return true;
           }

           mColorEffectsMenu = menu.addSubMenu("Color Effect");
           mEffectMenuItems = new MenuItem[effects.size()];

           int idx = 0;
           ListIterator<String> effectItr = effects.listIterator();
           while(effectItr.hasNext()) {
               String element = effectItr.next();
               mEffectMenuItems[idx] = mColorEffectsMenu.add(1, idx, Menu.NONE, element);
               idx++;
           }

           mResolutionMenu = menu.addSubMenu("Resolution");
           mResolutionList = mOpenCvCameraView.getResolutionList();
           mResolutionMenuItems = new MenuItem[mResolutionList.size()];

           ListIterator<Camera.Size> resolutionItr = mResolutionList.listIterator();
           idx = 0;
           while(resolutionItr.hasNext()) {
               Camera.Size element = resolutionItr.next();
               mResolutionMenuItems[idx] = mResolutionMenu.add(2, idx, Menu.NONE,
                       Integer.valueOf(element.width).toString() + "x" + Integer.valueOf(element.height).toString());
               idx++;
           }

           return true;
       }

       @Override
       protected void onPause() {
           if (Static.DEBUG) Log.i(TAG, "onPause()");
           super.onPause();

           if (mOpenCvCameraView != null)
               mOpenCvCameraView.disableView();

       }

       @Override
       protected void onStop() {
           if (Static.DEBUG) Log.i(TAG, "onStop()");
           super.onStop();
       }

       @Override
       protected void onDestroy() {
           if (Static.DEBUG) Log.i(TAG, "onDestroy()");
           super.onDestroy();

           if (mOpenCvCameraView != null)
               mOpenCvCameraView.disableView();
       }

       public Mat onCameraFrame(CameraBridgeViewBase.CvCameraViewFrame inputFrame) {

           ++frameCounter;
           //Log.i(TAG, "Frame number: "+frameCounter);

           return inputFrame.rgba();
       }


       @Override
       public void onCameraViewStarted(int width, int height) {
           edgesMat = new Mat();
       }

       @Override
       public void onCameraViewStopped() {
           if (edgesMat != null)
               edgesMat.release();

           edgesMat = null;
       }

       public boolean onOptionsItemSelected(MenuItem item) {
           Log.i(TAG, "called onOptionsItemSelected; selected item: " + item);
           if (item.getGroupId() == 1)
           {
               mOpenCvCameraView.setEffect((String) item.getTitle());
               Toast.makeText(this, mOpenCvCameraView.getEffect(), Toast.LENGTH_SHORT).show();
           } else if (item.getGroupId() == 2) {
               int id = item.getItemId();
               Camera.Size resolution = mResolutionList.get(id);
               mOpenCvCameraView.setResolution(resolution);
               resolution = mOpenCvCameraView.getResolution();
               String caption = Integer.valueOf(resolution.width).toString() + "x" + Integer.valueOf(resolution.height).toString();
               Toast.makeText(this, caption, Toast.LENGTH_SHORT).show();
           }

           return true;
       }

       @Override
       public boolean onTouch(View v, MotionEvent event) {
           Log.i(TAG,"onTouch event");
           SimpleDateFormat sdf = new SimpleDateFormat("yyyy-MM-dd_HH-mm-ss");
           String currentDateandTime = sdf.format(new Date());
           String fileName = Environment.getExternalStorageDirectory().getPath() +
                   "/sample_picture_" + currentDateandTime + ".jpg";
           mOpenCvCameraView.takePicture(fileName);
           Toast.makeText(this, fileName + " saved", Toast.LENGTH_SHORT).show();
           return false;
       }

       /**
        * Click to ImageButton to start recording.
        */
       public void onClickBtnStartRecord2(View v) {
           if (Static.DEBUG) Log.i(TAG, "onClickBtnStartRecord()");

           if(!recording)
               startRecording();
           else
               stopRecording();
       }

       private void startRecording() {
           if (Static.DEBUG) Log.i(TAG, "startRecording()");
           initRecorder();

           try {
               recorder.start();
               startTime = System.currentTimeMillis();
               recording = true;
               audioThread.start();
           } catch(FFmpegFrameRecorder.Exception e) {
               e.printStackTrace();
           }
       }

       private void stopRecording() {
           if (Static.DEBUG) Log.i(TAG, "stopRecording()");

           runAudioThread = false;
           try {
               audioThread.join();
           } catch(InterruptedException e) {
               e.printStackTrace();
           }
           audioRecordRunnable = null;
           audioThread = null;

           if(recorder != null && recording) {

               recording = false;
               Log.v(TAG, "Finishing recording, calling stop and release on recorder");
               try {
                   recorder.stop();
                   recorder.release();
               } catch(FFmpegFrameRecorder.Exception e) {
                   e.printStackTrace();
               }
               recorder = null;
           }
       }


       //---------------------------------------
       // initialize ffmpeg_recorder
       //---------------------------------------
       private void initRecorder() {

           Log.w(TAG, "init recorder");
           try {

               if (yuvImage == null) {
                   yuvImage = new Frame(imageWidth, imageHeight, Frame.DEPTH_UBYTE, 2);
                   Log.i(TAG, "create yuvImage");
               }

               Log.i(TAG, "ffmpeg_url: " + ffmpeg_link.getAbsolutePath());
               Log.i(TAG, "ffmpeg_url: " + ffmpeg_link.exists());
               recorder = new FFmpegFrameRecorder(ffmpeg_link, imageWidth, imageHeight, 1);
               recorder.setFormat("mp4");
               recorder.setSampleRate(sampleAudioRateInHz);
               // Set in the surface changed method
               recorder.setFrameRate(frameRate);

               Log.i(TAG, "recorder initialize success");

               audioRecordRunnable = new AudioRecordRunnable();
               audioThread = new Thread(audioRecordRunnable);
               runAudioThread = true;
           } catch (Exception e){
               e.printStackTrace();
           }
       }

       //---------------------------------------------
       // audio thread, gets and encodes audio data
       //---------------------------------------------
       class AudioRecordRunnable implements Runnable {

           @Override
           public void run() {
               android.os.Process.setThreadPriority(android.os.Process.THREAD_PRIORITY_URGENT_AUDIO);

               // Audio
               int bufferSize;
               ShortBuffer audioData;
               int bufferReadResult;

               bufferSize = AudioRecord.getMinBufferSize(sampleAudioRateInHz,
                       AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT);
               audioRecord = new AudioRecord(MediaRecorder.AudioSource.MIC, sampleAudioRateInHz,
                       AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT, bufferSize);

               audioData = ShortBuffer.allocate(bufferSize);

               Log.d(TAG, "audioRecord.startRecording()");
               audioRecord.startRecording();

               /* ffmpeg_audio encoding loop */
               while(runAudioThread) {
                   //Log.v(TAG,"recording? " + recording);
                   bufferReadResult = audioRecord.read(audioData.array(), 0, audioData.capacity());
                   if(bufferReadResult > 0) {
                       audioData.limit(bufferReadResult);
                       Log.v(TAG, "bufferReadResult: " + bufferReadResult);
                       // If "recording" isn't true when this thread starts, it never gets set according to this if statement...!!!
                       // Why?  Good question...
                       if(recording) {
                               try {
                                   recorder.recordSamples(audioData);
                                   //Log.v(TAG,"recording " + 1024*i + " to " + 1024*i+1024);
                               } catch(FFmpegFrameRecorder.Exception e) {
                                   Log.v(TAG, e.getMessage());
                                   e.printStackTrace();
                               }
                       }
                   }
               }
               Log.v(TAG, "AudioThread Finished, release audioRecord");

               /* encoding finish, release recorder */
               if(audioRecord != null) {
                   audioRecord.stop();
                   audioRecord.release();
                   audioRecord = null;
                   Log.v(TAG, "audioRecord released");
               }
           }
       }
    }
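For reference, the `audioData.limit(bufferReadResult)` call in the audio loop above is what tells `recordSamples()` how many of the buffer's slots actually hold fresh samples. A minimal plain-Java sketch of that `ShortBuffer` bookkeeping (the `fill`/`validSamples` helper names and the sizes are made up for illustration; no Android dependency):

```java
import java.nio.ShortBuffer;

public class AudioBufferDemo {

    // Mirrors the loop's pattern: one buffer allocated up front, then
    // limit() marks how many samples the latest read actually produced.
    static ShortBuffer fill(short[] samples, int readResult) {
        ShortBuffer buf = ShortBuffer.allocate(samples.length);
        buf.put(samples);
        buf.position(0);
        buf.limit(readResult); // only the first readResult samples are valid
        return buf;
    }

    // Number of valid samples a consumer like recordSamples() would see.
    static int validSamples(ShortBuffer buf) {
        return buf.remaining();
    }
}
```

The capacity stays at the allocated size; only the limit moves, which is why the same buffer can be reused on every pass of the loop.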

    OpenCVCameraPreview.java :

    package co.timeiseverything.pstimeiseverything;

    import android.content.Context;
    import android.hardware.Camera;
    import android.util.AttributeSet;
    import android.util.Log;

    import org.opencv.android.JavaCameraView;

    import java.io.FileOutputStream;
    import java.util.List;

    public class OpenCVCameraPreview extends JavaCameraView implements Camera.PictureCallback {

       private static final String TAG =  OpenCVCameraPreview.class.getSimpleName();
       private String mPictureFileName;

       public OpenCVCameraPreview(Context context, AttributeSet attrs) {
           super(context, attrs);
       }

       public List<String> getEffectList() {
           return mCamera.getParameters().getSupportedColorEffects();
       }

       public boolean isEffectSupported() {
           return (mCamera.getParameters().getColorEffect() != null);
       }

       public String getEffect() {
           return mCamera.getParameters().getColorEffect();
       }

       public void setEffect(String effect) {
           Camera.Parameters params = mCamera.getParameters();
           params.setColorEffect(effect);
           mCamera.setParameters(params);
       }

       public List<Camera.Size> getResolutionList() {
           return mCamera.getParameters().getSupportedPreviewSizes();
       }

       public void setResolution(Camera.Size resolution) {
           disconnectCamera();
           mMaxHeight = resolution.height;
           mMaxWidth = resolution.width;
           connectCamera(getWidth(), getHeight());
       }

       public Camera.Size getResolution() {
           return mCamera.getParameters().getPreviewSize();
       }

       public void takePicture(final String fileName) {
           Log.i(TAG, "Taking picture");
           this.mPictureFileName = fileName;
           // Postview and jpeg are sent in the same buffers if the queue is not empty when performing a capture.
           // Clear up buffers to avoid mCamera.takePicture to be stuck because of a memory issue
           mCamera.setPreviewCallback(null);

           // PictureCallback is implemented by the current class
           mCamera.takePicture(null, null, this);
       }

       @Override
       public void onPictureTaken(byte[] data, Camera camera) {
           Log.i(TAG, "Saving a bitmap to file");
           // The camera preview was automatically stopped. Start it again.
           mCamera.startPreview();
           mCamera.setPreviewCallback(this);

           // Write the image in a file (in jpeg format)
           try {
               FileOutputStream fos = new FileOutputStream(mPictureFileName);

               fos.write(data);
               fos.close();

           } catch (java.io.IOException e) {
               Log.e("PictureDemo", "Exception in photoCallback", e);
           }

       }
    }
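One fragile spot in `onPictureTaken()` above: if `fos.write(data)` throws, the stream is never closed. A try-with-resources sketch of the same save step (class and method names are made up for illustration; plain `java.io`, no Android dependency):

```java
import java.io.FileOutputStream;
import java.io.IOException;

public class SaveBytesDemo {

    // Same save step as onPictureTaken(), but try-with-resources
    // guarantees the stream is closed even when write() throws.
    static void save(String fileName, byte[] data) throws IOException {
        try (FileOutputStream fos = new FileOutputStream(fileName)) {
            fos.write(data);
        }
    }
}
```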

    Gradle :

    apply plugin: 'com.android.application'

    android {
       compileSdkVersion 23
       buildToolsVersion "23.0.2"

       defaultConfig {
           applicationId "co.timeiseverything.pstimeiseverything"
           minSdkVersion 15
           targetSdkVersion 23
           versionCode 1
           versionName "1.0"
       }
       buildTypes {
           release {
               minifyEnabled false
               proguardFiles getDefaultProguardFile('proguard-android.txt'), 'proguard-rules.pro'
           }
       }

       packagingOptions {
           exclude 'META-INF/maven/org.bytedeco.javacpp-presets/opencv/pom.properties'
           exclude 'META-INF/maven/org.bytedeco.javacpp-presets/opencv/pom.xml'
           exclude 'META-INF/maven/org.bytedeco.javacpp-presets/ffmpeg/pom.properties'
           exclude 'META-INF/maven/org.bytedeco.javacpp-presets/ffmpeg/pom.xml'
       }
    }

    repositories {
       mavenCentral()
    }

    dependencies {
       compile fileTree(include: ['*.jar'], dir: 'libs')
       testCompile 'junit:junit:4.12'
       compile 'com.android.support:appcompat-v7:23.1.1'
       compile 'com.google.android.gms:play-services-appindexing:8.1.0'

       compile group: 'org.bytedeco', name: 'javacv', version: '1.1'
       compile group: 'org.bytedeco.javacpp-presets', name: 'opencv', version: '3.0.0-1.1', classifier: 'android-arm'
       compile group: 'org.bytedeco.javacpp-presets', name: 'opencv', version: '3.0.0-1.1', classifier: 'android-x86'
       compile group: 'org.bytedeco.javacpp-presets', name: 'ffmpeg', version: '2.8.1-1.1', classifier: 'android-arm'
       compile group: 'org.bytedeco.javacpp-presets', name: 'ffmpeg', version: '2.8.1-1.1', classifier: 'android-x86'

       compile project(':openCVLibrary310')
    }

    proguard-rules.pro
    Edited by : link

    jniLibs (app/src/main/jniLibs) :

    armeabi armeabi-v7a arm64-v8a mips mips64 x86 x86_64
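The `dlopen` failure in the log below names `libjniavutil.so`; that file name is just the platform-decorated form of the bare library name `jniavutil`. A tiny sketch of the mapping (helper name made up; the result is platform-dependent, `libjniavutil.so` on Linux/Android):

```java
public class LibNameDemo {

    // System.mapLibraryName turns a bare JNI library name into the file
    // name the dynamic linker searches for; on Linux/Android,
    // "jniavutil" maps to "libjniavutil.so".
    static String nativeName(String lib) {
        return System.mapLibraryName(lib);
    }
}
```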

    Problem

    02-19 11:57:37.684 1759-1759/ I/OpenCVCameraActivity: onClickBtnStartRecord()
    02-19 11:57:37.684 1759-1759/ I/OpenCVCameraActivity: startRecording()
    02-19 11:57:37.684 1759-1759/ W/OpenCVCameraActivity: init recorder
    02-19 11:57:37.691 1759-1759/ I/OpenCVCameraActivity: create yuvImage
    02-19 11:57:37.691 1759-1759/ I/OpenCVCameraActivity: ffmpeg_url: /storage/emulated/0/stream.mp4
    02-19 11:57:37.696 1759-1759/ I/OpenCVCameraActivity: ffmpeg_url: false
    02-19 11:57:37.837 1759-1759/ W/linker: libjniavutil.so: unused DT entry: type 0x1d arg 0x18cc3
    02-19 11:57:37.837 1759-1759/ W/linker: libjniavutil.so: unused DT entry: type 0x6ffffffe arg 0x21c30
    02-19 11:57:37.837 1759-1759/ W/linker: libjniavutil.so: unused DT entry: type 0x6fffffff arg 0x1
    02-19 11:57:37.838 1759-1759/co.timeiseverything.pstimeiseverything E/art: dlopen("/data/app/co.timeiseverything.pstimeiseverything-2/lib/x86/libjniavutil.so", RTLD_LAZY) failed: dlopen failed: cannot locate symbol "av_version_info" referenced by "libjniavutil.so"...
    02-19 11:57:37.843 1759-1759/co.timeiseverything.pstimeiseverything I/art: Rejecting re-init on previously-failed class java.lang.Class
    02-19 11:57:37.844 1759-1759/co.timeiseverything.pstimeiseverything E/AndroidRuntime: FATAL EXCEPTION: main
                                           Process: co.timeiseverything.pstimeiseverything, PID: 1759
                                           java.lang.IllegalStateException: Could not execute method of the activity
                                               at android.view.View$1.onClick(View.java:4020)
                                               at android.view.View.performClick(View.java:4780)
                                               at android.view.View$PerformClick.run(View.java:19866)
                                               at android.os.Handler.handleCallback(Handler.java:739)
                                               at android.os.Handler.dispatchMessage(Handler.java:95)
                                               at android.os.Looper.loop(Looper.java:135)
                                               at android.app.ActivityThread.main(ActivityThread.java:5254)
                                               at java.lang.reflect.Method.invoke(Native Method)
                                               at java.lang.reflect.Method.invoke(Method.java:372)
                                               at com.android.internal.os.ZygoteInit$MethodAndArgsCaller.run(ZygoteInit.java:903)
                                               at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:698)
                                            Caused by: java.lang.reflect.InvocationTargetException
                                               at java.lang.reflect.Method.invoke(Native Method)
                                               at java.lang.reflect.Method.invoke(Method.java:372)
                                               at android.view.View$1.onClick(View.java:4015)
                                               at android.view.View.performClick(View.java:4780) 
                                               at android.view.View$PerformClick.run(View.java:19866) 
                                               at android.os.Handler.handleCallback(Handler.java:739) 
                                               at android.os.Handler.dispatchMessage(Handler.java:95) 
                                               at android.os.Looper.loop(Looper.java:135) 
                                               at android.app.ActivityThread.main(ActivityThread.java:5254) 
                                               at java.lang.reflect.Method.invoke(Native Method) 
                                               at java.lang.reflect.Method.invoke(Method.java:372) 
                                               at com.android.internal.os.ZygoteInit$MethodAndArgsCaller.run(ZygoteInit.java:903) 
                                               at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:698) 
                                            Caused by: java.lang.UnsatisfiedLinkError: org.bytedeco.javacpp.avutil
                                               at java.lang.Class.classForName(Native Method)
                                               at java.lang.Class.forName(Class.java:309)
                                               at org.bytedeco.javacpp.Loader.load(Loader.java:413)
                                               at org.bytedeco.javacpp.Loader.load(Loader.java:381)
                                               at org.bytedeco.javacpp.avcodec$AVPacket.<clinit>(avcodec.java:1650)
                                               at org.bytedeco.javacv.FFmpegFrameRecorder.<init>(FFmpegFrameRecorder.java:149)
                                               at org.bytedeco.javacv.FFmpegFrameRecorder.<init>(FFmpegFrameRecorder.java:129)
                                               at co.timeiseverything.pstimeiseverything.OpenCVCameraActivity.initRecorder(OpenCVCameraActivity.java:320)
                                               at co.timeiseverything.pstimeiseverything.OpenCVCameraActivity.startRecording(OpenCVCameraActivity.java:266)
                                               at co.timeiseverything.pstimeiseverything.OpenCVCameraActivity.onClickBtnStartRecord2(OpenCVCameraActivity.java:259)
                                               at java.lang.reflect.Method.invoke(Native Method) 
                                               at java.lang.reflect.Method.invoke(Method.java:372) 
                                               at android.view.View$1.onClick(View.java:4015) 
                                               at android.view.View.performClick(View.java:4780) 
                                               at android.view.View$PerformClick.run(View.java:19866) 
                                               at android.os.Handler.handleCallback(Handler.java:739) 
                                               at android.os.Handler.dispatchMessage(Handler.java:95) 
                                               at android.os.Looper.loop(Looper.java:135) 
                                               at android.app.ActivityThread.main(ActivityThread.java:5254) 
                                               at java.lang.reflect.Method.invoke(Native Method) 
                                               at java.lang.reflect.Method.invoke(Method.java:372) 
                                               at com.android.internal.os.ZygoteInit$MethodAndArgsCaller.run(ZygoteInit.java:903) 
                                               at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:698) 