Advanced search

Media (0)


No media matching your criteria is available on the site.

Other articles (64)

  • User profiles

    12 April 2011, by

    Each user has a profile page where they can edit their personal information. In the default top-of-page menu, a menu item is created automatically when MediaSPIP is initialized, visible only when the visitor is logged in to the site.
    The user can reach the profile editor from their author page; a "Modifier votre profil" (edit your profile) link in the navigation is (...)

  • Configuring language support

    15 November 2010, by

    Accessing the configuration and adding supported languages
    To configure support for new languages, go to the "Administrer" (administration) section of the site.
    From there, the navigation menu gives access to a "Gestion des langues" (language management) section where support for new languages can be enabled.
    Each newly added language can still be deactivated as long as no object has been created in that language; once one has, the language is greyed out in the configuration and (...)

  • XMP PHP

    13 May 2011, by

    According to Wikipedia, XMP means:
    Extensible Metadata Platform, or XMP, is an XML-based metadata format used in PDF, photography, and graphics applications. It was launched by Adobe Systems in April 2001, integrated into version 5.0 of Adobe Acrobat.
    Being based on XML, it handles a set of dynamic tags for use within the Semantic Web.
    XMP makes it possible to record, as an XML document, information about a file: title, author, history (...)

On other sites (5280)

  • Android recording video with overlay view [way 2]

    2 March 2016, by t0m

    I am working on an Android app that captures video with overlay views. I have tried two approaches (1 and 2); a third might be possible on newer APIs:
    1. Via SurfaceView and JavaCV with FFmpeg.
    2. Via OpenCV and JavaCV with FFmpeg.
    3. For API 21+, possibly with MediaProjection.

    (The question is split in two because of the Stack Overflow length limit.)

    ad 1. Via SurfaceView and JavaCV with FFmpeg:

    Here

    ad 2. Via OpenCV and JavaCV with FFmpeg:

    OpenCVCameraActivity.java:

    import android.app.Activity;
    import android.hardware.Camera;
    import android.media.AudioFormat;
    import android.media.AudioRecord;
    import android.media.MediaRecorder;
    import android.os.Bundle;
    import android.os.Environment;
    import android.util.Log;
    import android.view.Menu;
    import android.view.MenuItem;
    import android.view.MotionEvent;
    import android.view.SubMenu;
    import android.view.SurfaceView;
    import android.view.View;
    import android.view.WindowManager;
    import android.widget.Toast;

    import org.bytedeco.javacv.FFmpegFrameRecorder;
    import org.bytedeco.javacv.Frame;
    import org.opencv.android.BaseLoaderCallback;
    import org.opencv.android.CameraBridgeViewBase;
    import org.opencv.android.LoaderCallbackInterface;
    import org.opencv.android.OpenCVLoader;
    import org.opencv.core.Mat;

    import java.io.File;
    import java.nio.ByteBuffer;
    import java.nio.ShortBuffer;
    import java.text.SimpleDateFormat;
    import java.util.Date;
    import java.util.List;
    import java.util.ListIterator;

    @SuppressWarnings("ALL")
    public class OpenCVCameraActivity extends Activity implements
           CameraBridgeViewBase.CvCameraViewListener2,
           View.OnTouchListener {

       //name of activity, for DEBUGGING
       private static final String TAG = OpenCVCameraActivity.class.getSimpleName();

       private OpenCVCameraPreview mOpenCvCameraView;
       private List<Camera.Size> mResolutionList;
       private MenuItem[] mEffectMenuItems;
       private SubMenu mColorEffectsMenu;
       private MenuItem[] mResolutionMenuItems;
       private SubMenu mResolutionMenu;

       private static long frameCounter = 0;

       long startTime = 0;
       private Mat edgesMat;
       boolean recording = false;
       private int sampleAudioRateInHz = 44100;
       private int imageWidth = 1280;
       private int imageHeight = 720;
       private int frameRate = 30;
       private Frame yuvImage = null;
       private File ffmpeg_link;
       private FFmpegFrameRecorder recorder;

       /*audio data getting thread */
       private AudioRecord audioRecord;
       private AudioRecordRunnable audioRecordRunnable;
       private Thread audioThread;
       volatile boolean runAudioThread = true;
       ShortBuffer[] samples;


       private BaseLoaderCallback mLoaderCallback = new BaseLoaderCallback(this) {
           @Override
           public void onManagerConnected(int status) {
               switch (status) {
                   case LoaderCallbackInterface.SUCCESS:
                       Log.i(TAG, "OpenCV loaded successfully");
                       mOpenCvCameraView.enableView();
                       mOpenCvCameraView.setOnTouchListener(OpenCVCameraActivity.this);
                   break;
                   default:
                       super.onManagerConnected(status);
                   break;
               }
           }
       };

       @Override
       protected void onCreate(Bundle savedInstanceState) {
           super.onCreate(savedInstanceState);
           if(Static.DEBUG) Log.i(TAG, "onCreate()");

           getWindow().addFlags(WindowManager.LayoutParams.FLAG_KEEP_SCREEN_ON);

           Thread.setDefaultUncaughtExceptionHandler(uncaughtExceptionHandler);

           try {
               setContentView(R.layout.activity_opencv);

               mOpenCvCameraView = (OpenCVCameraPreview) findViewById(R.id.openCVCameraPreview);
               mOpenCvCameraView.setVisibility(SurfaceView.VISIBLE);
               mOpenCvCameraView.setCvCameraViewListener(this);

               mOpenCvCameraView.enableFpsMeter();

               ffmpeg_link = new File(Environment.getExternalStorageDirectory(), "stream.mp4");
           } catch (Exception e){
               e.printStackTrace();
           }
       }

       private Thread.UncaughtExceptionHandler uncaughtExceptionHandler =
               new Thread.UncaughtExceptionHandler() {
                   public void uncaughtException(Thread thread, Throwable ex) {
                       if(Static.DEBUG) Log.e(TAG, "Uncaught exception", ex);
                   }
               };

       @Override
       protected void onRestart() {
           if (Static.DEBUG) Log.i(TAG, "onRestart()");
           super.onRestart();
       }

       @Override
       protected void onStart() {
           if (Static.DEBUG) Log.i(TAG, "onStart()");
           super.onStart();
       }

       @Override
       protected void onResume() {
           if (Static.DEBUG) Log.i(TAG, "onResume()");
           super.onResume();

           if (!OpenCVLoader.initDebug()) {
               Log.i(TAG, "Internal OpenCV library not found. Using OpenCV Manager for initialization");
               OpenCVLoader.initAsync(OpenCVLoader.OPENCV_VERSION_3_1_0, this, mLoaderCallback);
           } else {
               Log.i(TAG, "OpenCV library found inside package. Using it!");
               mLoaderCallback.onManagerConnected(LoaderCallbackInterface.SUCCESS);
           }

       }

       @Override
       public boolean onCreateOptionsMenu(Menu menu) {
           if (Static.DEBUG) Log.i(TAG, "onCreateOptionsMenu()");
           super.onCreateOptionsMenu(menu);

        List<String> effects = mOpenCvCameraView.getEffectList();

           if (effects == null) {
               Log.e(TAG, "Color effects are not supported by device!");
               return true;
           }

           mColorEffectsMenu = menu.addSubMenu("Color Effect");
           mEffectMenuItems = new MenuItem[effects.size()];

           int idx = 0;
        ListIterator<String> effectItr = effects.listIterator();
           while(effectItr.hasNext()) {
               String element = effectItr.next();
               mEffectMenuItems[idx] = mColorEffectsMenu.add(1, idx, Menu.NONE, element);
               idx++;
           }

           mResolutionMenu = menu.addSubMenu("Resolution");
           mResolutionList = mOpenCvCameraView.getResolutionList();
           mResolutionMenuItems = new MenuItem[mResolutionList.size()];

        ListIterator<Camera.Size> resolutionItr = mResolutionList.listIterator();
           idx = 0;
           while(resolutionItr.hasNext()) {
               Camera.Size element = resolutionItr.next();
               mResolutionMenuItems[idx] = mResolutionMenu.add(2, idx, Menu.NONE,
                       Integer.valueOf(element.width).toString() + "x" + Integer.valueOf(element.height).toString());
               idx++;
           }

           return true;
       }

       @Override
       protected void onPause() {
           if (Static.DEBUG) Log.i(TAG, "onPause()");
           super.onPause();

           if (mOpenCvCameraView != null)
               mOpenCvCameraView.disableView();

       }

       @Override
       protected void onStop() {
           if (Static.DEBUG) Log.i(TAG, "onStop()");
           super.onStop();
       }

       @Override
       protected void onDestroy() {
           if (Static.DEBUG) Log.i(TAG, "onDestroy()");
           super.onDestroy();

           if (mOpenCvCameraView != null)
               mOpenCvCameraView.disableView();
       }

       public Mat onCameraFrame(CameraBridgeViewBase.CvCameraViewFrame inputFrame) {

           ++frameCounter;
           //Log.i(TAG, "Frame number: "+frameCounter);

           final Mat rgba = inputFrame.rgba();
           //Core.flip(rgba, rgba, 1);

           /*if(Static.DEBUG) Log.i(TAG,"rgba.total(): "+rgba.total());
           if(Static.DEBUG) Log.i(TAG,"rgba.channels(): " +rgba.channels());*/
           byte[] data = new byte[(int) (rgba.total() * rgba.channels())];
           rgba.get(0, 0, data);
           //if(Static.DEBUG) Log.i(TAG,"return_buff: "+return_buff.length);

           if (audioRecord == null || audioRecord.getRecordingState() != AudioRecord.RECORDSTATE_RECORDING) {
               startTime = System.currentTimeMillis();
               return rgba;
           }

           // get video data
        if (yuvImage != null && recording) {
               ByteBuffer b = (ByteBuffer)yuvImage.image[0].position(0);
               b.put(data);

               try {
                   long t = 1000 * (System.currentTimeMillis() - startTime);
                   if(Static.DEBUG) Log.i(TAG,"Writing Frame on timestamp: "+t);
                   if (t > recorder.getTimestamp()) {
                       recorder.setTimestamp(t);
                   }
                   recorder.record(yuvImage);
               } catch (FFmpegFrameRecorder.Exception e) {
                   if(Static.DEBUG) Log.i(TAG,e.getMessage());
                   e.printStackTrace();
               }
           }

           return rgba;
       }

       @Override
       public void onCameraViewStarted(int width, int height) {
           edgesMat = new Mat();
       }

       @Override
       public void onCameraViewStopped() {
           if (edgesMat != null)
               edgesMat.release();

           edgesMat = null;
       }

       public boolean onOptionsItemSelected(MenuItem item) {
           Log.i(TAG, "called onOptionsItemSelected; selected item: " + item);
           if (item.getGroupId() == 1)
           {
               mOpenCvCameraView.setEffect((String) item.getTitle());
               Toast.makeText(this, mOpenCvCameraView.getEffect(), Toast.LENGTH_SHORT).show();
           } else if (item.getGroupId() == 2) {
               int id = item.getItemId();
               Camera.Size resolution = mResolutionList.get(id);
               mOpenCvCameraView.setResolution(resolution);
               resolution = mOpenCvCameraView.getResolution();
               String caption = Integer.valueOf(resolution.width).toString() + "x" + Integer.valueOf(resolution.height).toString();
               Toast.makeText(this, caption, Toast.LENGTH_SHORT).show();
           }

           return true;
       }

       @Override
       public boolean onTouch(View v, MotionEvent event) {
           Log.i(TAG,"onTouch event");
           SimpleDateFormat sdf = new SimpleDateFormat("yyyy-MM-dd_HH-mm-ss");
           String currentDateandTime = sdf.format(new Date());
           String fileName = Environment.getExternalStorageDirectory().getPath() +
                   "/sample_picture_" + currentDateandTime + ".jpg";
           mOpenCvCameraView.takePicture(fileName);
           Toast.makeText(this, fileName + " saved", Toast.LENGTH_SHORT).show();
           return false;
       }

       /**
        * Click to ImageButton to start recording.
        */
       public void onClickBtnStartRecord2(View v) {
           if (Static.DEBUG) Log.i(TAG, "onClickBtnStartRecord()");

           if(!recording)
               startRecording();
           else
               stopRecording();
       }

       private void startRecording() {
           if (Static.DEBUG) Log.i(TAG, "startRecording()");
           initRecorder();

           try {
               recorder.start();
               startTime = System.currentTimeMillis();
               recording = true;
               audioThread.start();
               if (Static.DEBUG) Log.i(TAG, "startRecording() success");
           } catch(FFmpegFrameRecorder.Exception e) {
               e.printStackTrace();
           }
       }

       private void stopRecording() {
           if (Static.DEBUG) Log.i(TAG, "stopRecording()");

           runAudioThread = false;
           try {
               audioThread.join();
           } catch(InterruptedException e) {
               e.printStackTrace();
           }
           audioRecordRunnable = null;
           audioThread = null;
           if (Static.DEBUG) Log.i(TAG, "stopRecording() 2");
        if(recorder != null && recording) {

               recording = false;
               try {
                   recorder.stop();
                   recorder.release();
                   Log.i(TAG, "Finishing recording, calling stop and release on recorder");
               } catch(FFmpegFrameRecorder.Exception e) {
                   e.printStackTrace();
               }
               recorder = null;
           }
       }


       //---------------------------------------
       // initialize ffmpeg_recorder
       //---------------------------------------
       private void initRecorder() {

           Log.i(TAG, "init recorder");
           try {

               if (yuvImage == null) {
                   yuvImage = new Frame(imageWidth, imageHeight, Frame.DEPTH_UBYTE, 4);
                   Log.i(TAG, "create yuvImage");
               }

               Log.i(TAG, "ffmpeg_url: " + ffmpeg_link.getAbsolutePath());
               //Log.i(TAG, "ffmpeg_url: " + ffmpeg_link.exists());
               recorder = new FFmpegFrameRecorder(ffmpeg_link, imageWidth, imageHeight, 1);
               recorder.setFormat("mp4");
               recorder.setSampleRate(sampleAudioRateInHz);
               // Set in the surface changed method
               recorder.setFrameRate(frameRate);

               audioRecordRunnable = new AudioRecordRunnable();
               audioThread = new Thread(audioRecordRunnable);
               runAudioThread = true;
               Log.i(TAG, "recorder initialize success");
           } catch (Exception e){
               e.printStackTrace();
           }
       }

       //---------------------------------------------
       // audio thread, gets and encodes audio data
       //---------------------------------------------
       class AudioRecordRunnable implements Runnable {

           @Override
           public void run() {
               android.os.Process.setThreadPriority(android.os.Process.THREAD_PRIORITY_URGENT_AUDIO);

               // Audio
               int bufferSize;
               ShortBuffer audioData;
               int bufferReadResult;

               bufferSize = AudioRecord.getMinBufferSize(sampleAudioRateInHz,
                       AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT);
               audioRecord = new AudioRecord(MediaRecorder.AudioSource.MIC, sampleAudioRateInHz,
                       AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT, bufferSize);

               audioData = ShortBuffer.allocate(bufferSize);

               Log.d(TAG, "audioRecord.startRecording()");
               audioRecord.startRecording();

               // ffmpeg_audio encoding loop
               while(runAudioThread) {
                   //Log.v(TAG,"recording? " + recording);
                   bufferReadResult = audioRecord.read(audioData.array(), 0, audioData.capacity());
                   audioData.limit(bufferReadResult);
                   if(bufferReadResult > 0) {
                       Log.v(TAG, "bufferReadResult: " + bufferReadResult);
                       // If "recording" isn't true when start this thread, it never get's set according to this if statement...!!!
                       // Why?  Good question...
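                        // Editorial note (an assumption, not the author's diagnosis):
                        // 'recording' is a plain, non-volatile field, so under the Java
                        // memory model this thread may never observe the write made on
                        // the UI thread. Declaring it 'volatile boolean recording'
                        // (as runAudioThread already is) would plausibly fix this.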
                       if(recording) {
                               try {
                                   recorder.recordSamples(audioData);
                                   //Log.v(TAG,"recording " + 1024*i + " to " + 1024*i+1024);
                               } catch(FFmpegFrameRecorder.Exception e) {
                                   Log.v(TAG, e.getMessage());
                                   e.printStackTrace();
                               }
                       }
                   }
               }
               Log.v(TAG, "AudioThread Finished, release audioRecord");

               // encoding finish, release recorder
               if(audioRecord != null) {
                   audioRecord.stop();
                   audioRecord.release();
                   audioRecord = null;
                   Log.v(TAG, "audioRecord released");
               }
           }
       }
    }

    OpenCVCameraPreview.java:

    import android.content.Context;
    import android.hardware.Camera;
    import android.util.AttributeSet;
    import android.util.Log;

    import org.opencv.android.JavaCameraView;

    import java.io.FileOutputStream;
    import java.util.List;

    public class OpenCVCameraPreview extends JavaCameraView implements Camera.PictureCallback {

       private static final String TAG =  OpenCVCameraPreview.class.getSimpleName();
       private String mPictureFileName;

       public OpenCVCameraPreview(Context context, AttributeSet attrs) {
           super(context, attrs);
       }

       public List<String> getEffectList() {
           return mCamera.getParameters().getSupportedColorEffects();
       }

       public boolean isEffectSupported() {
           return (mCamera.getParameters().getColorEffect() != null);
       }

       public String getEffect() {
           return mCamera.getParameters().getColorEffect();
       }

       public void setEffect(String effect) {
           Camera.Parameters params = mCamera.getParameters();
           params.setColorEffect(effect);
           mCamera.setParameters(params);
       }

       public List<Camera.Size> getResolutionList() {
           return mCamera.getParameters().getSupportedPreviewSizes();
       }

       public void setResolution(Camera.Size resolution) {
           disconnectCamera();
           mMaxHeight = resolution.height;
           mMaxWidth = resolution.width;
           connectCamera(getWidth(), getHeight());
       }

       public Camera.Size getResolution() {
           return mCamera.getParameters().getPreviewSize();
       }

       public void takePicture(final String fileName) {
           Log.i(TAG, "Taking picture");
           this.mPictureFileName = fileName;
           // Postview and jpeg are sent in the same buffers if the queue is not empty when performing a capture.
           // Clear up buffers to avoid mCamera.takePicture to be stuck because of a memory issue
           mCamera.setPreviewCallback(null);

           // PictureCallback is implemented by the current class
           mCamera.takePicture(null, null, this);
       }

       @Override
       public void onPictureTaken(byte[] data, Camera camera) {
           Log.i(TAG, "Saving a bitmap to file");
           // The camera preview was automatically stopped. Start it again.
           mCamera.startPreview();
           mCamera.setPreviewCallback(this);

           // Write the image in a file (in jpeg format)
           try {
               FileOutputStream fos = new FileOutputStream(mPictureFileName);

               fos.write(data);
               fos.close();

           } catch (java.io.IOException e) {
               Log.e("PictureDemo", "Exception in photoCallback", e);
           }

       }
    }

    activity_opencv.xml:

     <?xml version="1.0" encoding="utf-8"?>
     <RelativeLayout xmlns:android="http://schemas.android.com/apk/res/android"
         android:layout_width="match_parent"
         android:layout_height="match_parent">

         <!-- The root attributes and the custom preview element were stripped
              when the page was rendered as HTML; both are reconstructed here.
              The preview is the view looked up as R.id.openCVCameraPreview in
              the Activity; its package path is assumed. -->
         <com.example.OpenCVCameraPreview
             android:id="@+id/openCVCameraPreview"
             android:layout_width="match_parent"
             android:layout_height="match_parent"/>

         <ImageButton
             android:id="@+id/btnStartRecord2"
             android:layout_width="70dp"
             android:layout_height="70dp"
             android:scaleType="fitXY"
             android:src="@drawable/record_icon"
             android:background="@null"
             android:text="@string/btnStartRecord"
             android:onClick="onClickBtnStartRecord2"
             android:layout_centerVertical="true"
             android:layout_alignParentRight="true"
             android:layout_alignParentEnd="true"/>

     </RelativeLayout>

    The overlay views work on screen, but the recorded video does not contain them, and recording via the onCameraFrame method is very slow.
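
    A likely reason the overlays are absent from the file (an editorial note, not part of the original question): only the camera Mat is recorded, while the overlay views live in the Android view hierarchy and never touch the frame pixels. A minimal sketch, assuming the OpenCV 3.x Java drawing API, is to burn the overlay into the RGBA Mat inside onCameraFrame() before its pixels are copied into yuvImage; the helper name drawOverlayOnFrame is illustrative only.

     // Sketch: draw the overlay into the frame so it gets encoded.
     // Requires: import org.opencv.core.Core; import org.opencv.core.Point;
     //           import org.opencv.core.Scalar; import org.opencv.imgproc.Imgproc;
     private void drawOverlayOnFrame(Mat rgba) {
         // filled black banner across the top of the frame (thickness -1 = filled)
         Imgproc.rectangle(rgba, new Point(0, 0), new Point(rgba.cols(), 80),
                 new Scalar(0, 0, 0, 255), -1);
         // red "REC" label plus the running frame counter (colors in RGBA order)
         Imgproc.putText(rgba, "REC " + frameCounter, new Point(16, 56),
                 Core.FONT_HERSHEY_SIMPLEX, 1.5, new Scalar(255, 0, 0, 255), 3);
     }

    Calling drawOverlayOnFrame(rgba) at the top of onCameraFrame() puts the drawn elements into both the preview and the recording; it does not address the speed problem, which generally calls for moving the pixel copy and recorder.record() off the camera callback thread.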

  • How to extract a fixed number of frames with ffmpeg?

    10 March 2016, by W. Han

    I am trying to extract a fixed number of frames, uniformly spaced, from a bunch of videos (say 50 frames from each video, 10,000 videos in total).

    Since the durations vary, I calculated the ideal output fps for each video and passed it to ffmpeg as the extraction parameter, but I failed to get the required number of frames.

    Does anyone know how to extract a fixed number of frames with ffmpeg, or with other tools? Thanks!
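
    One sketch that should behave as asked (an editorial suggestion, not the asker's command; the numbers are examples): compute fps = 50/duration per clip outside ffmpeg, then hard-cap the output with -frames:v so rounding can never produce more or fewer than 50 images.

     # duration of one clip in seconds (assume here it prints 20.04)
     ffprobe -v error -show_entries format=duration -of csv=p=0 input.mp4
     # 50/20.04 ≈ 2.495 output frames per second, capped at exactly 50 frames
     ffmpeg -i input.mp4 -vf fps=2.495 -frames:v 50 frame_%02d.jpg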

  • FFmpeg loses streams while using -map 0

    27 March 2016, by Ngoral

    I hit a strange issue using ffmpeg on Ubuntu 14.04.
    I run the command:

    ffmpeg -i output2.avi -c:v h264 -minrate 2000k -maxrate 5000k -bufsize 2000k -profile:v high -level:v 4 -coder 1 -s 640x360 -bf 0 -pix_fmt yuv420p -r 25 -g 25 -c:a aac -ar 48k -b:a 321k -map 0 -y outpu.mp4

    It prints seemingly normal console output (run with -loglevel verbose):

    ffmpeg version N-79004-g2e6636a Copyright (c) 2000-2016 the FFmpeg developers


    built with gcc 4.8 (Ubuntu 4.8.4-2ubuntu1~14.04)
     configuration: --prefix=/home/ngoral/ffmpeg_build --pkg-config-flags=--static --extra-cflags=-I/home/ngoral/ffmpeg_build/include --extra-ldflags=-L/home/ngoral/ffmpeg_build/lib --bindir=/home/ngoral/bin --enable-gpl --enable-libass --enable-libfdk-aac --enable-libfreetype --enable-libmp3lame --enable-libtheora --enable-libvorbis --enable-libx264 --enable-nonfree
     libavutil      55. 19.100 / 55. 19.100
     libavcodec     57. 28.101 / 57. 28.101
     libavformat    57. 28.101 / 57. 28.101
     libavdevice    57.  0.101 / 57.  0.101
     libavfilter     6. 39.102 /  6. 39.102
     libswscale      4.  0.100 /  4.  0.100
     libswresample   2.  0.101 /  2.  0.101
     libpostproc    54.  0.100 / 54.  0.100
    [avi @ 0x2707800] parser not found for codec dvvideo, packets or times may be invalid.
       Last message repeated 1 times
    Input #0, avi, from 'output2.avi':
     Metadata:
       encoder         : Lavf57.28.101
     Duration: 00:00:20.04, start: 0.000000, bitrate: 28911 kb/s
       Stream #0:0: Video: dvvideo, 1 reference frame (dvsd / 0x64737664), yuv420p, 720x576 [SAR 16:15 DAR 4:3], 28684 kb/s, 25 fps, 25 tbr, 25 tbn, 25 tbc
       Stream #0:1: Audio: mp3 (U[0][0][0] / 0x0055), 48000 Hz, stereo, s16p, 192 kb/s
       Stream #0:2: Audio: mp3 (U[0][0][0] / 0x0055), 48000 Hz, stereo, s16p, 64 kb/s
       Stream #0:3: Audio: aac ([255][0][0][0] / 0x00FF), 48000 Hz, stereo, fltp, 117 kb/s
    Matched encoder 'libx264' for codec 'h264'.
    [graph 0 input from stream 0:0 @ 0x2784f60] w:720 h:576 pixfmt:yuv420p tb:1/25 fr:25/1 sar:16/15 sws_param:flags=2
    [scaler for output stream 0:0 @ 0x2749d20] w:640 h:360 flags:'bicubic' interl:0
    [scaler for output stream 0:0 @ 0x2749d20] w:720 h:576 fmt:yuv420p sar:16/15 -> w:640 h:360 fmt:yuv420p sar:3/4 flags:0x4
    [graph 1 input from stream 0:1 @ 0x27a4fc0] tb:1/48000 samplefmt:s16p samplerate:48000 chlayout:0x3
    [audio format for output stream 0:1 @ 0x27a5380] auto-inserting filter 'auto-inserted resampler 0' between the filter 'Parsed_anull_0' and the filter 'audio format for output stream 0:1'
    [auto-inserted resampler 0 @ 0x27a7ae0] ch:2 chl:stereo fmt:s16p r:48000Hz -> ch:2 chl:stereo fmt:fltp r:48000Hz
    [graph 2 input from stream 0:2 @ 0x27a6620] tb:1/48000 samplefmt:s16p samplerate:48000 chlayout:0x3
    [audio format for output stream 0:2 @ 0x27a6440] auto-inserting filter 'auto-inserted resampler 0' between the filter 'Parsed_anull_0' and the filter 'audio format for output stream 0:2'
    [auto-inserted resampler 0 @ 0x27b6be0] ch:2 chl:stereo fmt:s16p r:48000Hz -> ch:2 chl:stereo fmt:fltp r:48000Hz
    [graph 3 input from stream 0:3 @ 0x27b6560] tb:1/48000 samplefmt:fltp samplerate:48000 chlayout:0x3
    [libx264 @ 0x27889a0] using SAR=3/4
    [libx264 @ 0x27889a0] using cpu capabilities: MMX2 SSE2Fast SSSE3 SSE4.2 AVX
    [libx264 @ 0x27889a0] profile High, level 4.0
    [libx264 @ 0x27889a0] 264 - core 148 r2643 5c65704 - H.264/MPEG-4 AVC codec - Copyleft 2003-2015 - http://www.videolan.org/x264.html - options: cabac=1 ref=3 deblock=1:0:0 analyse=0x3:0x113 me=hex subme=7 psy=1 psy_rd=1.00:0.00 mixed_ref=1 me_range=16 chroma_me=1 trellis=1 8x8dct=1 cqm=0 deadzone=21,11 fast_pskip=1 chroma_qp_offset=-2 threads=6 lookahead_threads=1 sliced_threads=0 nr=0 decimate=1 interlaced=0 bluray_compat=0 constrained_intra=0 bframes=0 weightp=2 keyint=25 keyint_min=2 scenecut=40 intra_refresh=0 rc_lookahead=25 rc=crf mbtree=1 crf=23.0 qcomp=0.60 qpmin=0 qpmax=69 qpstep=4 vbv_maxrate=5000 vbv_bufsize=2000 crf_max=0.0 nal_hrd=none filler=0 ip_ratio=1.40 aq=1:1.00
    Output #0, mp4, to 'outpu.mp4':
     Metadata:
       encoder         : Lavf57.28.101
       Stream #0:0: Video: h264 (libx264), -1 reference frame ([33][0][0][0] / 0x0021), yuv420p, 640x360 [SAR 3:4 DAR 4:3], q=-1--1, max. 5000 kb/s, 25 fps, 12800 tbn, 25 tbc
       Metadata:
         encoder         : Lavc57.28.101 libx264
       Side data:
         cpb: bitrate max/min/avg: 5000000/0/0 buffer size: 2000000 vbv_delay: -1
       Stream #0:1: Audio: aac (LC) ([64][0][0][0] / 0x0040), 48000 Hz, stereo, fltp, 321 kb/s
       Metadata:
         encoder         : Lavc57.28.101 aac
       Stream #0:2: Audio: aac (LC) ([64][0][0][0] / 0x0040), 48000 Hz, stereo, fltp, 321 kb/s
       Metadata:
         encoder         : Lavc57.28.101 aac
       Stream #0:3: Audio: aac (LC) ([64][0][0][0] / 0x0040), 48000 Hz, stereo, fltp, 321 kb/s
       Metadata:
         encoder         : Lavc57.28.101 aac
    Stream mapping:
     Stream #0:0 -> #0:0 (dvvideo (native) -> h264 (libx264))
     Stream #0:1 -> #0:1 (mp3 (native) -> aac (native))
     Stream #0:2 -> #0:2 (mp3 (native) -> aac (native))
     Stream #0:3 -> #0:3 (aac (native) -> aac (native))
    Press [q] to stop, [?] for help
    *** 3 dup!
    No more output streams to write to, finishing.e=00:00:19.84 bitrate= 359.9kbits/s dup=3 drop=0 speed=1.15x
    frame=  501 fps= 29 q=-1.0 Lsize=    1792kB time=00:00:20.05 bitrate= 732.2kbits/s dup=3 drop=0 speed=1.14x
    video:440kB audio:1331kB subtitle:0kB other streams:0kB global headers:0kB muxing overhead: 1.155376%
    Input file #0 (output2.avi):
     Input stream #0:0 (video): 498 packets read (71712000 bytes); 498 frames decoded;
     Input stream #0:1 (audio): 834 packets read (480384 bytes); 834 frames decoded (960768 samples);
     Input stream #0:2 (audio): 835 packets read (160320 bytes); 835 frames decoded (961920 samples);
     Input stream #0:3 (audio): 0 packets read (0 bytes); 0 frames decoded (0 samples);
     Total: 2167 packets (72352704 bytes) demuxed
    Output file #0 (outpu.mp4):
     Output stream #0:0 (video): 501 frames encoded; 501 packets muxed (451055 bytes);
     Output stream #0:1 (audio): 939 frames encoded (960768 samples); 940 packets muxed (724261 bytes);
     Output stream #0:2 (audio): 940 frames encoded (961920 samples); 941 packets muxed (639072 bytes);
     Output stream #0:3 (audio): 0 frames encoded (0 samples); 0 packets muxed (0 bytes);
     Total: 2382 packets (1814388 bytes) muxed
    [libx264 @ 0x27889a0] frame I:21    Avg QP:15.30  size:  8718
    [libx264 @ 0x27889a0] frame P:480   Avg QP:24.52  size:   557
    [libx264 @ 0x27889a0] mb I  I16..4: 20.4% 55.5% 24.1%
    [libx264 @ 0x27889a0] mb P  I16..4:  0.0%  0.1%  0.0%  P16..4:  7.6%  3.7%  1.7%  0.0%  0.0%    skip:86.8%
    [libx264 @ 0x27889a0] 8x8 transform intra:56.3% inter:50.3%
    [libx264 @ 0x27889a0] coded y,uvDC,uvAC intra: 42.0% 39.5% 27.5% inter: 2.6% 1.4% 0.0%
    [libx264 @ 0x27889a0] i16 v,h,dc,p: 36% 52%  3% 10%
    [libx264 @ 0x27889a0] i8 v,h,dc,ddl,ddr,vr,hd,vl,hu: 56% 20% 14%  2%  1%  1%  2%  2%  2%
    [libx264 @ 0x27889a0] i4 v,h,dc,ddl,ddr,vr,hd,vl,hu: 42% 26%  7%  4%  3%  4%  5%  5%  4%
    [libx264 @ 0x27889a0] i8c dc,h,v,p: 66% 13% 17%  4%
    [libx264 @ 0x27889a0] Weighted P-Frames: Y:0.0% UV:0.0%
    [libx264 @ 0x27889a0] ref P L0: 68.1% 11.5% 13.9%  6.5%
    [libx264 @ 0x27889a0] kb/s:179.78
    [aac @ 0x2747da0] Qavg: 62719.090
    [aac @ 0x2748b20] Qavg: 64509.496
    [aac @ 0x27498a0] Qavg: -nan

    It seems like all 3 audio streams are output, but then I run

    ffmpeg -loglevel verbose -i outpu.mp4

    and get only 2 audio streams:

    ffmpeg version N-79004-g2e6636a Copyright (c) 2000-2016 the FFmpeg developers
     built with gcc 4.8 (Ubuntu 4.8.4-2ubuntu1~14.04)
     configuration: --prefix=/home/ngoral/ffmpeg_build --pkg-config-flags=--static --extra-cflags=-I/home/ngoral/ffmpeg_build/include --extra-ldflags=-L/home/ngoral/ffmpeg_build/lib --bindir=/home/ngoral/bin --enable-gpl --enable-libass --enable-libfdk-aac --enable-libfreetype --enable-libmp3lame --enable-libtheora --enable-libvorbis --enable-libx264 --enable-nonfree
     libavutil      55. 19.100 / 55. 19.100
     libavcodec     57. 28.101 / 57. 28.101
     libavformat    57. 28.101 / 57. 28.101
     libavdevice    57.  0.101 / 57.  0.101
     libavfilter     6. 39.102 /  6. 39.102
     libswscale      4.  0.100 /  4.  0.100
     libswresample   2.  0.101 /  2.  0.101
     libpostproc    54.  0.100 / 54.  0.100
    Input #0, mov,mp4,m4a,3gp,3g2,mj2, from 'outpu.mp4':
     Metadata:
       major_brand     : isom
       minor_version   : 512
       compatible_brands: isomiso2avc1mp41
       encoder         : Lavf57.28.101
     Duration: 00:00:20.06, start: 0.021333, bitrate: 731 kb/s
       Stream #0:0(und): Video: h264 (High), 3 reference frames (avc1 / 0x31637661), yuv420p, 640x360 (640x368) [SAR 3:4 DAR 4:3], 180 kb/s, 25 fps, 25 tbr, 12800 tbn, 50 tbc (default)
       Metadata:
         handler_name    : VideoHandler
       Stream #0:1(und): Audio: aac (LC) (mp4a / 0x6134706D), 48000 Hz, stereo, fltp, 289 kb/s (default)
       Metadata:
         handler_name    : SoundHandler
       Stream #0:2(und): Audio: aac (LC) (mp4a / 0x6134706D), 48000 Hz, stereo, fltp, 254 kb/s
       Metadata:
         handler_name    : SoundHandler

    What's wrong with it?
    It works fine on my Windows machine and on an Ubuntu virtual machine, but run on a real Ubuntu box it behaves like this. Do you have any ideas?
    Thanks!
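
    An editorial observation from the log above, not a confirmed diagnosis: the input's fourth stream (#0:3, aac) yielded 0 packets read, and output stream #0:3 encoded and muxed 0 packets, so the third AAC track is written empty and may simply be skipped when the file is read back. A quick way to check what actually landed in outpu.mp4:

     ffprobe -v error -show_entries stream=index,codec_type,codec_name -of csv=p=0 outpu.mp4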