Advanced search

Media (0)

Keyword: - Tags -/optimisation

No media matching your criteria is available on this site.

Other articles (46)

  • Websites made with MediaSPIP

    2 May 2011

    This page lists some websites based on MediaSPIP.

  • Creating farms of unique websites

    13 April 2011

    MediaSPIP platforms can be installed as a farm, with a single "core" hosted on a dedicated server and used by multiple websites.
    This allows (among other things): implementation costs to be shared between several different projects/individuals; rapid deployment of multiple unique sites; and the creation of groups of like-minded sites, making it possible to browse media in a more controlled and selective environment than the major "open" (...)

  • Submit bugs and patches

    13 April 2011

    Unfortunately, software is never perfect.
    If you think you have found a bug, report it using our ticket system. Please help us fix it by providing the following information: the browser you are using, including the exact version; as precise an explanation of the problem as possible; if possible, the steps that led to the problem; and a link to the site/page in question.
    If you think you have solved the bug, fill in a ticket and attach a corrective patch to it.
    You may also (...)

On other sites (7097)

  • Audio & Video not synchronized properly if i merged more videos in mp4parser

    1 October 2013, by maniya

    I have used mp4parser to merge videos recorded with dynamic pause/record, with a maximum recording length of 6 seconds. In preview it works fine when the recording has only a few pause/record cycles, but with more than 3 pause/record cycles the last video file does not get merged properly with the audio. At the start of the video the sync is OK, but towards the end the video hangs while the audio keeps playing for the remaining file duration, about 1 second.
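    One plausible explanation for the trailing ~1 second of audio: each recorded clip's audio track is slightly longer than its video track, and appending samples clip after clip (as mp4parser's AppendTrack effectively does) lets the per-clip mismatch accumulate instead of resetting. A back-of-the-envelope sketch (a hypothetical helper, not part of mp4parser or the code below) of how such drift grows:

    ```java
    // Hypothetical illustration: estimates the audio/video drift that accumulates
    // when clips whose audio and video track durations differ are appended
    // back to back, as a track-concatenating merge effectively does.
    public class DriftEstimator {

        /**
         * Returns the absolute audio-minus-video duration mismatch (seconds)
         * after appending all clips; per-clip mismatches add up, they never reset.
         */
        public static double cumulativeDriftSeconds(double[] videoDurations,
                                                    double[] audioDurations) {
            double drift = 0.0;
            for (int i = 0; i < videoDurations.length; i++) {
                drift += audioDurations[i] - videoDurations[i];
            }
            return Math.abs(drift);
        }

        public static void main(String[] args) {
            // Three pause/record segments, each with audio slightly longer than video:
            double[] video = {2.0, 2.0, 2.0};
            double[] audio = {2.3, 2.3, 2.4};
            System.out.println("drift after merge (s): "
                    + cumulativeDriftSeconds(video, audio));
        }
    }
    ```

    If logging each clip's track duration (a Track's duration divided by its timescale) shows this pattern, trimming the longer track to the shorter one's length before appending, e.g. with mp4parser's CroppedTrack, is a commonly suggested fix.
    
    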

    My Recording manager

    public class RecordingManager implements Camera.ErrorCallback, MediaRecorder.OnErrorListener, MediaRecorder.OnInfoListener {

       private static final String TAG = RecordingManager.class.getSimpleName();
       private static final int FOCUS_AREA_RADIUS = 32;
       private static final int FOCUS_MAX_VALUE = 1000;
       private static final int FOCUS_MIN_VALUE = -1000;
       private static final long MINIMUM_RECORDING_TIME = 2000;
       private static final int MAXIMUM_RECORDING_TIME = 70 * 1000;
       private static final long LOW_STORAGE_THRESHOLD = 5 * 1024 * 1024;
       private static final long RECORDING_FILE_LIMIT = 100 * 1024 * 1024;

       private boolean paused = true;

       private MediaRecorder mediaRecorder = null;
       private boolean recording = false;

       private FrameLayout previewFrame = null;

       private boolean mPreviewing = false;

    //    private TextureView mTextureView = null;
    //    private SurfaceTexture mSurfaceTexture = null;
    //    private boolean mSurfaceTextureReady = false;
    //
       private SurfaceView surfaceView = null;
       private SurfaceHolder surfaceHolder = null;
       private boolean surfaceViewReady = false;

       private Camera camera = null;
       private Camera.Parameters cameraParameters = null;
       private CamcorderProfile camcorderProfile = null;

       private int mOrientation = -1;
       private OrientationEventListener mOrientationEventListener = null;

       private long mStartRecordingTime;
       private int mVideoWidth;
       private int mVideoHeight;
       private long mStorageSpace;

       private Handler mHandler = new Handler();
    //    private Runnable mUpdateRecordingTimeTask = new Runnable() {
    //        @Override
    //        public void run() {
    //            long recordingTime = System.currentTimeMillis() - mStartRecordingTime;
    //            Log.d(TAG, String.format("Recording time:%d", recordingTime));
    //            mHandler.postDelayed(this, CLIP_GRAPH_UPDATE_INTERVAL);
    //        }
    //    };
       private Runnable mStopRecordingTask = new Runnable() {
           @Override
           public void run() {
               stopRecording();
           }
       };

       private static RecordingManager mInstance = null;
       private Activity currentActivity = null;
       private String destinationFilepath = "";
       private String snapshotFilepath = "";

       public static RecordingManager getInstance(Activity activity, FrameLayout previewFrame) {
           if (mInstance == null || mInstance.currentActivity != activity) {
               mInstance = new RecordingManager(activity, previewFrame);
           }
           return mInstance;
       }

       private RecordingManager(Activity activity, FrameLayout previewFrame) {
           currentActivity = activity;
           this.previewFrame = previewFrame;
       }

       public int getVideoWidth() {
           return this.mVideoWidth;
       }
       public int getVideoHeight() {
           return this.mVideoHeight;
       }
       public void setDestinationFilepath(String filepath) {
           this.destinationFilepath = filepath;
       }
       public String getDestinationFilepath() {
           return this.destinationFilepath;
       }
       public void setSnapshotFilepath(String filepath) {
           this.snapshotFilepath = filepath;
       }
       public String getSnapshotFilepath() {
           return this.snapshotFilepath;
       }
       public void init(String videoPath, String snapshotPath) {
           Log.v(TAG, "init.");
           setDestinationFilepath(videoPath);
           setSnapshotFilepath(snapshotPath);
           if (!Utils.isExternalStorageAvailable()) {
               showStorageErrorAndFinish();
               return;
           }

           openCamera();
           if (camera == null) {
               showCameraErrorAndFinish();
               return;
           }
    }



       public void onResume() {
           Log.v(TAG, "onResume.");
           paused = false;

           // Open the camera
           if (camera == null) {
               openCamera();
               if (camera == null) {
                   showCameraErrorAndFinish();
                   return;
               }
           }

           // Initialize the surface texture or surface view
    //        if (useTexture() && mTextureView == null) {
    //            initTextureView();
    //            mTextureView.setVisibility(View.VISIBLE);
    //        } else if (!useTexture() && mSurfaceView == null) {
               initSurfaceView();
               surfaceView.setVisibility(View.VISIBLE);
    //        }

           // Start the preview
           if (!mPreviewing) {
               startPreview();
           }
       }

       private void openCamera() {
           Log.v(TAG, "openCamera");
           try {
               camera = Camera.open();
               camera.setErrorCallback(this);
               camera.setDisplayOrientation(90); // Since we only support portrait mode
               cameraParameters = camera.getParameters();
           } catch (RuntimeException e) {
               e.printStackTrace();
               camera = null;
           }
       }

       private void closeCamera() {
           Log.v(TAG, "closeCamera");
           if (camera == null) {
               Log.d(TAG, "Already stopped.");
               return;
           }

           camera.setErrorCallback(null);
           if (mPreviewing) {
               stopPreview();
           }
           camera.release();
           camera = null;
       }




       private void initSurfaceView() {
           surfaceView = new SurfaceView(currentActivity);
           surfaceView.getHolder().addCallback(new SurfaceViewCallback());
           surfaceView.setVisibility(View.GONE);
           FrameLayout.LayoutParams params = new LayoutParams(
                   LayoutParams.MATCH_PARENT, LayoutParams.MATCH_PARENT, Gravity.CENTER);
           surfaceView.setLayoutParams(params);
           Log.d(TAG, "add surface view to preview frame");
           previewFrame.addView(surfaceView);
       }

       private void releaseSurfaceView() {
           if (surfaceView != null) {
               previewFrame.removeAllViews();
               surfaceView = null;
               surfaceHolder = null;
               surfaceViewReady = false;
           }
       }

       private void startPreview() {
    //        if ((useTexture() && !mSurfaceTextureReady) || (!useTexture() && !mSurfaceViewReady)) {
    //            return;
    //        }

           Log.v(TAG, "startPreview.");
           if (mPreviewing) {
               stopPreview();
           }

           setCameraParameters();
           resizePreview();

           try {
    //            if (useTexture()) {
    //                mCamera.setPreviewTexture(mSurfaceTexture);
    //            } else {
                   camera.setPreviewDisplay(surfaceHolder);
    //            }
               camera.startPreview();
               mPreviewing = true;
           } catch (Exception e) {
               closeCamera();
               e.printStackTrace();
               Log.e(TAG, "startPreview failed.");
           }

       }

       private void stopPreview() {
           Log.v(TAG, "stopPreview");
           if (camera != null) {
               camera.stopPreview();
               mPreviewing = false;
           }
       }

       public void onPause() {
           paused = true;

           if (recording) {
               stopRecording();
           }
           closeCamera();

    //        if (useTexture()) {
    //            releaseSurfaceTexture();
    //        } else {
               releaseSurfaceView();
    //        }
       }

       private void setCameraParameters() {
           if (CamcorderProfile.hasProfile(CamcorderProfile.QUALITY_720P)) {
               camcorderProfile = CamcorderProfile.get(CamcorderProfile.QUALITY_720P);
           } else if (CamcorderProfile.hasProfile(CamcorderProfile.QUALITY_480P)) {
               camcorderProfile = CamcorderProfile.get(CamcorderProfile.QUALITY_480P);
           } else {
               camcorderProfile = CamcorderProfile.get(CamcorderProfile.QUALITY_HIGH);
           }
           mVideoWidth = camcorderProfile.videoFrameWidth;
           mVideoHeight = camcorderProfile.videoFrameHeight;
           camcorderProfile.fileFormat = MediaRecorder.OutputFormat.MPEG_4;
           camcorderProfile.videoFrameRate = 30;

           Log.v(TAG, "mVideoWidth=" + mVideoWidth + " mVideoHeight=" + mVideoHeight);
           cameraParameters.setPreviewSize(mVideoWidth, mVideoHeight);

           if (cameraParameters.getSupportedWhiteBalance().contains(Camera.Parameters.WHITE_BALANCE_AUTO)) {
               cameraParameters.setWhiteBalance(Camera.Parameters.WHITE_BALANCE_AUTO);
           }

           if (cameraParameters.getSupportedFocusModes().contains(Camera.Parameters.FOCUS_MODE_CONTINUOUS_VIDEO)) {
               cameraParameters.setFocusMode(Camera.Parameters.FOCUS_MODE_CONTINUOUS_VIDEO);
           }

           cameraParameters.setRecordingHint(true);
           cameraParameters.set("cam_mode", 1);

           camera.setParameters(cameraParameters);
           cameraParameters = camera.getParameters();

           camera.setDisplayOrientation(90);
        android.hardware.Camera.CameraInfo info = new android.hardware.Camera.CameraInfo();
        android.hardware.Camera.getCameraInfo(0, info); // fill in info for the back camera
        Log.d(TAG, info.orientation + " degree");
       }

       private void resizePreview() {
           Log.d(TAG, String.format("Video size:%d|%d", mVideoWidth, mVideoHeight));

           Point optimizedSize = getOptimizedPreviewSize(mVideoWidth, mVideoHeight);
           Log.d(TAG, String.format("Optimized size:%d|%d", optimizedSize.x, optimizedSize.y));

           ViewGroup.LayoutParams params = (ViewGroup.LayoutParams) previewFrame.getLayoutParams();
           params.width = optimizedSize.x;
           params.height = optimizedSize.y;
           previewFrame.setLayoutParams(params);
       }

       public void setOrientation(int ori) {
           this.mOrientation = ori;
       }

       public void setOrientationEventListener(OrientationEventListener listener) {
           this.mOrientationEventListener = listener;
       }

       public Camera getCamera() {
           return camera;
       }

       @SuppressWarnings("serial")
       public void setFocusArea(float x, float y) {
           if (camera != null) {
               int viewWidth = surfaceView.getWidth();
               int viewHeight = surfaceView.getHeight();

               int focusCenterX = FOCUS_MAX_VALUE - (int) (x / viewWidth * (FOCUS_MAX_VALUE - FOCUS_MIN_VALUE));
               int focusCenterY = FOCUS_MIN_VALUE + (int) (y / viewHeight * (FOCUS_MAX_VALUE - FOCUS_MIN_VALUE));
               final int left = focusCenterY - FOCUS_AREA_RADIUS < FOCUS_MIN_VALUE ? FOCUS_MIN_VALUE : focusCenterY - FOCUS_AREA_RADIUS;
               final int top = focusCenterX - FOCUS_AREA_RADIUS < FOCUS_MIN_VALUE ? FOCUS_MIN_VALUE : focusCenterX - FOCUS_AREA_RADIUS;
               final int right = focusCenterY + FOCUS_AREA_RADIUS > FOCUS_MAX_VALUE ? FOCUS_MAX_VALUE : focusCenterY + FOCUS_AREA_RADIUS;
               final int bottom = focusCenterX + FOCUS_AREA_RADIUS > FOCUS_MAX_VALUE ? FOCUS_MAX_VALUE : focusCenterX + FOCUS_AREA_RADIUS;

               Camera.Parameters params = camera.getParameters();
            params.setFocusAreas(new ArrayList<Camera.Area>() {
                   {
                       add(new Camera.Area(new Rect(left, top, right, bottom), 1000));
                   }
               });
               camera.setParameters(params);
               camera.autoFocus(new AutoFocusCallback() {
                   @Override
                   public void onAutoFocus(boolean success, Camera camera) {
                       Log.d(TAG, "onAutoFocus");
                   }
               });
           }
       }

       public void startRecording(String destinationFilepath) {
           if (!recording) {
               updateStorageSpace();
               setDestinationFilepath(destinationFilepath);
               if (mStorageSpace <= LOW_STORAGE_THRESHOLD) {
                   Log.v(TAG, "Storage issue, ignore the start request");
                   Toast.makeText(currentActivity, "Storage issue, ignore the recording request", Toast.LENGTH_LONG).show();
                   return;
               }

               if (!prepareMediaRecorder()) {
                   Toast.makeText(currentActivity, "prepareMediaRecorder failed.", Toast.LENGTH_LONG).show();
                   return;
               }

               Log.d(TAG, "Successfully prepare media recorder.");
               try {
                   mediaRecorder.start();
               } catch (RuntimeException e) {
                   Log.e(TAG, "MediaRecorder start failed.");
                   releaseMediaRecorder();
                   return;
               }

               mStartRecordingTime = System.currentTimeMillis();

               if (mOrientationEventListener != null) {
                   mOrientationEventListener.disable();
               }

               recording = true;
           }
       }

       public void stopRecording() {
           if (recording) {
               if (!paused) {
                // Capture at least MINIMUM_RECORDING_TIME (2 seconds) of video
                   long currentTime = System.currentTimeMillis();
                   if (currentTime - mStartRecordingTime < MINIMUM_RECORDING_TIME) {
                       mHandler.postDelayed(mStopRecordingTask, MINIMUM_RECORDING_TIME - (currentTime - mStartRecordingTime));
                       return;
                   }
               }

               if (mOrientationEventListener != null) {
                   mOrientationEventListener.enable();
               }

    //            mHandler.removeCallbacks(mUpdateRecordingTimeTask);

               try {
                   mediaRecorder.setOnErrorListener(null);
                   mediaRecorder.setOnInfoListener(null);
                   mediaRecorder.stop(); // stop the recording
                   Toast.makeText(currentActivity, "Video file saved.", Toast.LENGTH_LONG).show();

                   long stopRecordingTime = System.currentTimeMillis();
                   Log.d(TAG, String.format("stopRecording. file:%s duration:%d", destinationFilepath, stopRecordingTime - mStartRecordingTime));

                   // Calculate the duration of video
                   MediaMetadataRetriever mmr = new MediaMetadataRetriever();
                   mmr.setDataSource(this.destinationFilepath);
                   String _length = mmr.extractMetadata(MediaMetadataRetriever.METADATA_KEY_DURATION);
                   if (_length != null) {
                       Log.d(TAG, String.format("clip duration:%d", Long.parseLong(_length)));
                   }

                   // Taking the snapshot of video
                   Bitmap snapshot = ThumbnailUtils.createVideoThumbnail(this.destinationFilepath, Thumbnails.MICRO_KIND);
                   try {
                       FileOutputStream out = new FileOutputStream(this.snapshotFilepath);
                       snapshot.compress(Bitmap.CompressFormat.JPEG, 70, out);
                       out.close();
                   } catch (Exception e) {
                       e.printStackTrace();
                   }

    //                mActivity.showPlayButton();

               } catch (RuntimeException e) {
                   e.printStackTrace();
                   Log.e(TAG, e.getMessage());
                   // if no valid audio/video data has been received when stop() is
                   // called
               } finally {
    //          

                   releaseMediaRecorder(); // release the MediaRecorder object
                   if (!paused) {
                       cameraParameters = camera.getParameters();
                   }
                   recording = false;
               }

           }
       }

       public void setRecorderOrientation(int orientation) {
           // For back camera only
           if (orientation != -1) {
               Log.d(TAG, "set orientationHint:" + (orientation + 135) % 360 / 90 * 90);
               mediaRecorder.setOrientationHint((orientation + 135) % 360 / 90 * 90);
           }else {
               Log.d(TAG, "not set orientationHint to mediaRecorder");
           }
       }

       private boolean prepareMediaRecorder() {
           mediaRecorder = new MediaRecorder();

           camera.unlock();
           mediaRecorder.setCamera(camera);

           mediaRecorder.setAudioSource(MediaRecorder.AudioSource.CAMCORDER);
           mediaRecorder.setVideoSource(MediaRecorder.VideoSource.CAMERA);

           mediaRecorder.setProfile(camcorderProfile);

           mediaRecorder.setMaxDuration(MAXIMUM_RECORDING_TIME);
           mediaRecorder.setOutputFile(this.destinationFilepath);

           try {
               mediaRecorder.setMaxFileSize(Math.min(RECORDING_FILE_LIMIT, mStorageSpace - LOW_STORAGE_THRESHOLD));
        } catch (RuntimeException exception) {
            // setMaxFileSize is not supported on some devices; ignore and record without a size limit
        }

           setRecorderOrientation(mOrientation);

           if (!useTexture()) {
               mediaRecorder.setPreviewDisplay(surfaceHolder.getSurface());
           }

           try {
               mediaRecorder.prepare();
           } catch (IllegalStateException e) {
               releaseMediaRecorder();
               return false;
           } catch (IOException e) {
               releaseMediaRecorder();
               return false;
           }

           mediaRecorder.setOnErrorListener(this);
           mediaRecorder.setOnInfoListener(this);

           return true;

       }

       private void releaseMediaRecorder() {
           if (mediaRecorder != null) {
               mediaRecorder.reset(); // clear recorder configuration
               mediaRecorder.release(); // release the recorder object
               mediaRecorder = null;
               camera.lock(); // lock camera for later use
           }
       }

       private Point getOptimizedPreviewSize(int videoWidth, int videoHeight) {
           Display display = currentActivity.getWindowManager().getDefaultDisplay();
           Point size = new Point();
           display.getSize(size);

           Point optimizedSize = new Point();
           optimizedSize.x = size.x;
           optimizedSize.y = (int) ((float) videoWidth / (float) videoHeight * size.x);

           return optimizedSize;
       }

       private void showCameraErrorAndFinish() {
           DialogInterface.OnClickListener buttonListener = new DialogInterface.OnClickListener() {
               @Override
               public void onClick(DialogInterface dialog, int which) {
                   currentActivity.finish();
               }
           };
           new AlertDialog.Builder(currentActivity).setCancelable(false)
                   .setTitle("Camera error")
                   .setMessage("Cannot connect to the camera.")
                   .setNeutralButton("OK", buttonListener)
                   .show();
       }

       private void showStorageErrorAndFinish() {
           DialogInterface.OnClickListener buttonListener = new DialogInterface.OnClickListener() {
               @Override
               public void onClick(DialogInterface dialog, int which) {
                   currentActivity.finish();
               }
           };
           new AlertDialog.Builder(currentActivity).setCancelable(false)
                   .setTitle("Storage error")
                   .setMessage("Cannot read external storage.")
                   .setNeutralButton("OK", buttonListener)
                   .show();
       }

       private void updateStorageSpace() {
           mStorageSpace = getAvailableSpace();
           Log.v(TAG, "updateStorageSpace mStorageSpace=" + mStorageSpace);
       }

       private long getAvailableSpace() {
           String state = Environment.getExternalStorageState();
           Log.d(TAG, "External storage state=" + state);
           if (Environment.MEDIA_CHECKING.equals(state)) {
               return -1;
           }
           if (!Environment.MEDIA_MOUNTED.equals(state)) {
               return -1;
           }

           File directory = currentActivity.getExternalFilesDir("vine");
           directory.mkdirs();
           if (!directory.isDirectory() || !directory.canWrite()) {
               return -1;
           }

           try {
               StatFs stat = new StatFs(directory.getAbsolutePath());
               return stat.getAvailableBlocks() * (long) stat.getBlockSize();
           } catch (Exception e) {
               Log.i(TAG, "Fail to access external storage", e);
           }
           return -1;
       }

       private boolean useTexture() {
           return false;
    //        return Build.VERSION.SDK_INT >= Build.VERSION_CODES.JELLY_BEAN_MR1;
       }

       private class SurfaceViewCallback implements SurfaceHolder.Callback {

           @Override
           public void surfaceChanged(SurfaceHolder holder, int format, int width, int height) {
               Log.v(TAG, "surfaceChanged. width=" + width + ". height=" + height);
           }

           @Override
           public void surfaceCreated(SurfaceHolder holder) {
               Log.v(TAG, "surfaceCreated");
               surfaceViewReady = true;
               surfaceHolder = holder;
               startPreview();
           }

           @Override
           public void surfaceDestroyed(SurfaceHolder holder) {
               Log.d(TAG, "surfaceDestroyed");
               surfaceViewReady = false;
           }

       }

       @Override
       public void onError(int error, Camera camera) {
           Log.e(TAG, "Camera onError. what=" + error + ".");
           if (error == Camera.CAMERA_ERROR_SERVER_DIED) {

           } else if (error == Camera.CAMERA_ERROR_UNKNOWN) {

           }
       }

       @Override
       public void onInfo(MediaRecorder mr, int what, int extra) {
           if (what == MediaRecorder.MEDIA_RECORDER_INFO_MAX_DURATION_REACHED) {
               stopRecording();
           } else if (what == MediaRecorder.MEDIA_RECORDER_INFO_MAX_FILESIZE_REACHED) {
               stopRecording();
               Toast.makeText(currentActivity, "Size limit reached", Toast.LENGTH_LONG).show();
           }
       }

       @Override
       public void onError(MediaRecorder mr, int what, int extra) {
           Log.e(TAG, "MediaRecorder onError. what=" + what + ". extra=" + extra);
           if (what == MediaRecorder.MEDIA_RECORDER_ERROR_UNKNOWN) {
               stopRecording();
           }
       }

    }

    VideoUtils

    public class VideoUtils {
       private static final String TAG = VideoUtils.class.getSimpleName();

       static double[] matrix = new double[] { 0.0, 1.0, 0.0, -1.0, 0.0, 0.0, 0.0,
               0.0, 1.0 };

       public static boolean MergeFiles(String speratedDirPath,
               String targetFileName) {
           File videoSourceDirFile = new File(speratedDirPath);
           String[] videoList = videoSourceDirFile.list();
        List<Track> videoTracks = new LinkedList<Track>();
        List<Track> audioTracks = new LinkedList<Track>();
           for (String file : videoList) {
               Log.d(TAG, "source files" + speratedDirPath
                       + File.separator + file);
               try {
                   FileChannel fc = new FileInputStream(speratedDirPath
                           + File.separator + file).getChannel();
                   Movie movie = MovieCreator.build(fc);
                   for (Track t : movie.getTracks()) {
                       if (t.getHandler().equals("soun")) {
                           audioTracks.add(t);
                       }
                       if (t.getHandler().equals("vide")) {

                           videoTracks.add(t);
                       }
                   }
               } catch (FileNotFoundException e) {
                   e.printStackTrace();
                   return false;
               } catch (IOException e) {
                   e.printStackTrace();
                   return false;
               }
           }

           Movie result = new Movie();

           try {
               if (audioTracks.size() > 0) {
                   result.addTrack(new AppendTrack(audioTracks
                           .toArray(new Track[audioTracks.size()])));
               }
               if (videoTracks.size() > 0) {
                   result.addTrack(new AppendTrack(videoTracks
                           .toArray(new Track[videoTracks.size()])));
               }
               IsoFile out = new DefaultMp4Builder().build(result);



               FileChannel fc = new RandomAccessFile(
                       String.format(targetFileName), "rw").getChannel();

               Log.d(TAG, "target file:" + targetFileName);
               TrackBox tb = out.getMovieBox().getBoxes(TrackBox.class).get(1);

               TrackHeaderBox tkhd = tb.getTrackHeaderBox();
               double[] b = tb.getTrackHeaderBox().getMatrix();

               tkhd.setMatrix(matrix);

               fc.position(0);
               out.getBox(fc);
               fc.close();
               for (String file : videoList) {
                   File TBRFile = new File(speratedDirPath + File.separator + file);
                   TBRFile.delete();
               }
               boolean a = videoSourceDirFile.delete();
               Log.d(TAG, "try to delete dir:" + a);
           } catch (IOException e) {
               // TODO Auto-generated catch block
               e.printStackTrace();
               return false;
           }

           return true;
       }

       public static boolean clearFiles(String speratedDirPath) {
           File videoSourceDirFile = new File(speratedDirPath);
        if (videoSourceDirFile != null
                && videoSourceDirFile.listFiles() != null) {
               File[] videoList = videoSourceDirFile.listFiles();
               for (File video : videoList) {
                   video.delete();
               }
               videoSourceDirFile.delete();
           }
           return true;
       }

       public static int createSnapshot(String videoFile, int kind, String snapshotFilepath) {
           return 0;
       };

       public static int createSnapshot(String videoFile, int width, int height, String snapshotFilepath) {
           return 0;
       }
    }

    my reference code project link is

    https://github.com/jwfing/AndroidVideoKit

  • Crossdevice encoding static file to stream in browser using FFMPEG (segmented h264 ?)

    20 March 2014, by Vprnl

    I'm building a mediacenter application in NodeJS which is going pretty ok.
    (you can check it out on GitHub: https://github.com/jansmolders86/mediacenterjs)

    I'm using FFMPEG to transcode local (static) movies to a stream which I then send to the browser.

    At first I used h264 with Flash, which worked in browsers, but I really need it to work on Android and iOS (so no Flash), and preferably on a Raspberry Pi.

    But getting it to play on all devices is driving me absolutely insane!

    I have all these bits of the puzzle, gathered from countless hours of reading articles, tutorials and Stack Overflow posts, which led me to the conclusion that I need to produce the following:

    • Use video codec H264 to transcode to MP4
    • Move the moov atom ('-movflags') to make an MP4 streamable
    • Segment the stream so Apple devices can play it as well.
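
    The three requirements above map onto two separate outputs rather than one. A command-line sketch (untested here; the input/output file names and bitrates are placeholders):

    ```shell
    # Untested sketch -- input.avi, out.mp4/out.m3u8 and the bitrates are placeholders.

    # 1) Progressive MP4 for desktop browsers and Android: H.264 + AAC, with the
    #    moov atom moved to the front so playback can start before the download
    #    completes (+faststart rewrites the file in a second pass).
    ffmpeg -i input.avi \
           -c:v libx264 -profile:v baseline -level 3.0 -b:v 800k \
           -c:a aac -b:a 128k -ar 44100 \
           -movflags +faststart out.mp4

    # 2) HLS for iOS: the same H.264/AAC encode, cut into .ts segments with an
    #    .m3u8 playlist that Safari/iOS play natively.
    ffmpeg -i input.avi \
           -c:v libx264 -profile:v baseline -level 3.0 -b:v 800k \
           -c:a aac -b:a 128k -ar 44100 \
           -f hls -hls_time 10 -hls_list_size 0 out.m3u8

    # Note: ffmpeg builds from this era may need "-strict experimental" for the
    # native AAC encoder.
    ```

    Serving then becomes content negotiation: hand the .m3u8 playlist to iOS clients and the progressive MP4 to everyone else.
    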

    But I'm getting nowhere with this. Every attempt produces a set of FFMPEG settings that either doesn't work at all, or works on some devices but not all.

    Some of my failed attempts were:

    My Flash attempt -> main problem (not running on iOS):

       '-y','-ss 0','-b 800k','-vcodec libx264','-acodec mp3'\
       '-ab 128','-ar 44100','-bufsize 62000', '-maxrate 620k'\
       metaDuration,tDuration,'-f flv

    My HLS attempt -> main problem (not running in the browser):

           '-r 15','-b:v 128k','-c:v libx264','-x264opts level=41'\
           '-threads 4','-s 640x480','-map 0:v','-map 0:a:0','-c:a mp3'\
           '-b:a 160000','-ac 2','-f hls','-hls_time 10','-hls_list_size 6'\
           '-hls_wrap 18','-start_number 1'

    My MP4 attempt -> main problem (duration is shortened and the later part of the video speeds by):

          '-y','-ss 0','-b 800k','-vcodec libx264','-acodec mp3'\
          '-ab 128','-ar 44100','-bufsize 62000', '-maxrate 620k'\
          metaDuration,tDuration,'-f mp4','-movflags','frag_keyframe+empty_moov'

    Second MP4 attempt -> main problem (duration is shortened and the later part of the video speeds by):

       '-y','-vcodec libx264','-pix_fmt yuv420p','-b 1200k','-flags +loop+mv4'\
       '-cmp 256','-partitions +parti4x4+parti8x8+partp4x4+partp8x8+partb8x8'\
       '-me_method hex','-subq 7','-trellis 1','-refs 5','-bf 3','-coder 1'\
       '-me_range 16','-g 150','-keyint_min 25','-sc_threshold 40'\
       '-i_qfactor 0.71','-acodec mp3','-qmin 10','-qdiff 4','-qmax 51'\
       '-ab 128k','-ar 44100','-threads 2','-f mp4','-movflags','frag_keyframe+empty_moov'])

    Here is an example of the FFMPEG log when running with these settings:

           file conversion error ffmpeg version N-52458-gaa96439 Copyright (c) 2000-2013 the FFmpeg developers
             built on Apr 24 2013 22:19:32 with gcc 4.8.0 (GCC)
             configuration: --enable-gpl --enable-version3 --disable-w32threads --enable-avisynth --enable-bzlib --enable-fontconfig --e
           nable-frei0r --enable-gnutls --enable-iconv --enable-libass --enable-libbluray --enable-libcaca --enable-libfreetype --enable
           -libgsm --enable-libilbc --enable-libmp3lame --enable-libopencore-amrnb --enable-libopencore-amrwb --enable-libopenjpeg --ena
           ble-libopus --enable-librtmp --enable-libschroedinger --enable-libsoxr --enable-libspeex --enable-libtheora --enable-libtwola
           me --enable-libvo-aacenc --enable-libvo-amrwbenc --enable-libvorbis --enable-libvpx --enable-libx264 --enable-libxavs --enabl
           e-libxvid --enable-zlib
             libavutil      52. 27.101 / 52. 27.101
             libavcodec     55.  6.100 / 55.  6.100
             libavformat    55.  3.100 / 55.  3.100
             libavdevice    55.  0.100 / 55.  0.100
             libavfilter     3. 60.101 /  3. 60.101
             libswscale      2.  2.100 /  2.  2.100
             libswresample   0. 17.102 /  0. 17.102
             libpostproc    52.  3.100 / 52.  3.100
           [avi @ 02427900] non-interleaved AVI
           Guessed Channel Layout for  Input Stream #0.1 : mono
           Input #0, avi, from &#39;C:/temp/the avengers.avi&#39;:
             Duration: 00:00:34.00, start: 0.000000, bitrate: 1433 kb/s
               Stream #0:0: Video: cinepak (cvid / 0x64697663), rgb24, 320x240, 15 tbr, 15 tbn, 15 tbc
               Stream #0:1: Audio: pcm_u8 ([1][0][0][0] / 0x0001), 22050 Hz, mono, u8, 176 kb/s
           Please use -b:a or -b:v, -b is ambiguous
           [libx264 @ 02527c60] using cpu capabilities: MMX2 SSE2Fast SSSE3 SSE4.2 AVX
           [libx264 @ 02527c60] profile High, level 2.0
           [libx264 @ 02527c60] 264 - core 130 r2274 c832fe9 - H.264/MPEG-4 AVC codec - Copyleft 2003-2013 - http://www.videolan.org/x26
           4.html - options: cabac=1 ref=5 deblock=1:0:0 analyse=0x3:0x133 me=hex subme=7 psy=1 psy_rd=1.00:0.00 mixed_ref=1 me_range=16
            chroma_me=1 trellis=1 8x8dct=1 cqm=0 deadzone=21,11 fast_pskip=1 chroma_qp_offset=-2 threads=2 lookahead_threads=1 sliced_th
           reads=0 nr=0 decimate=1 interlaced=0 bluray_compat=0 constrained_intra=0 bframes=3 b_pyramid=2 b_adapt=1 b_bias=0 direct=1 we
           ightb=1 open_gop=0 weightp=2 keyint=150 keyint_min=25 scenecut=40 intra_refresh=0 rc_lookahead=40 rc=abr mbtree=1 bitrate=120
           0 ratetol=1.0 qcomp=0.60 qpmin=10 qpmax=51 qpstep=4 ip_ratio=1.41 aq=1:1.00
           Output #0, mp4, to &#39;pipe:1&#39;:
             Metadata:
               encoder         : Lavf55.3.100
               Stream #0:0: Video: h264 ([33][0][0][0] / 0x0021), yuv420p, 320x240, q=10-51, 1200 kb/s, 15360 tbn, 15 tbc
               Stream #0:1: Audio: mp3 (i[0][0][0] / 0x0069), 44100 Hz, mono, s16p, 128 kb/s
           Stream mapping:
             Stream #0:0 -> #0:0 (cinepak -> libx264)
             Stream #0:1 -> #0:1 (pcm_u8 -> libmp3lame)
           Press [q] to stop, [?] for help
           frame=  106 fps=0.0 q=10.0 size=       1kB time=00:00:06.94 bitrate=   1.4kbits/s
           frame=  150 fps=149 q=14.0 size=       1kB time=00:00:09.87 bitrate=   1.0kbits/s
           frame=  191 fps=126 q=16.0 size=       1kB time=00:00:12.61 bitrate=   0.8kbits/s
           frame=  244 fps=121 q=16.0 size=    2262kB time=00:00:16.14 bitrate=1147.6kbits/s
           frame=  303 fps=120 q=14.0 size=    2262kB time=00:00:20.08 bitrate= 922.2kbits/s
           frame=  354 fps=117 q=15.0 size=    3035kB time=00:00:23.48 bitrate=1058.6kbits/s
           frame=  402 fps=113 q=15.0 size=    3035kB time=00:00:26.67 bitrate= 932.1kbits/s
           frame=  459 fps=113 q=16.0 size=    4041kB time=00:00:30.43 bitrate=1087.7kbits/s
           frame=  510 fps=103 q=2686559.0 Lsize=    5755kB time=00:00:33.93 bitrate=1389.3kbits/s

           video:5211kB audio:531kB subtitle:0 global headers:0kB muxing overhead 0.235111%
           [libx264 @ 02527c60] frame I:6     Avg QP:10.55  size: 25921
           [libx264 @ 02527c60] frame P:245   Avg QP:12.15  size: 14543
           [libx264 @ 02527c60] frame B:259   Avg QP:15.55  size:  6242
           [libx264 @ 02527c60] consecutive B-frames:  6.1% 73.7% 14.7%  5.5%
           [libx264 @ 02527c60] mb I  I16..4: 19.9%  6.2% 73.9%
           [libx264 @ 02527c60] mb P  I16..4:  6.0%  0.2% 12.0%  P16..4: 35.4%  9.6% 16.3%  7.0%  5.6%    skip: 7.8%
           [libx264 @ 02527c60] mb B  I16..4:  0.7%  0.0%  4.3%  B16..8: 27.6% 17.2% 17.0%  direct:17.3%  skip:15.9%  L0:39.4% L1:43.2%
           BI:17.4%
           [libx264 @ 02527c60] final ratefactor: 11.41
           [libx264 @ 02527c60] 8x8 transform intra:1.6% inter:4.0%
           [libx264 @ 02527c60] coded y,uvDC,uvAC intra: 93.0% 97.0% 94.9% inter: 58.4% 58.7% 50.6%
           [libx264 @ 02527c60] i16 v,h,dc,p: 15% 26% 54%  5%
           [libx264 @ 02527c60] i8 v,h,dc,ddl,ddr,vr,hd,vl,hu: 16% 17% 39%  4%  4%  3%  1%  6%  9%
           [libx264 @ 02527c60] i4 v,h,dc,ddl,ddr,vr,hd,vl,hu: 28% 34% 21%  4%  2%  2%  2%  2%  5%
           [libx264 @ 02527c60] i8c dc,h,v,p: 51% 24% 19%  6%
           [libx264 @ 02527c60] Weighted P-Frames: Y:4.1% UV:1.2%
           [libx264 @ 02527c60] ref P L0: 68.2%  9.8% 11.0%  5.6%  4.6%  0.8%  0.0%
           [libx264 @ 02527c60] ref B L0: 87.7%  8.0%  3.9%  0.4%
           [libx264 @ 02527c60] ref B L1: 97.8%  2.2%
           [libx264 @ 02527c60] kb/s:1255.36

    Lastly, this is my node code firing up FFMPEG. (I use the module Fluent-ffmpeg: https://github.com/schaermu/node-fluent-ffmpeg)

       var proc = new ffmpeg({ source: movie, nolog: true, timeout: 15000 })
           // note: the output name 'stream.m3u8' is embedded in the last option string,
           // so it is passed as part of '-start_number' rather than as an output target
           .addOptions(['-r 15','-b:v 128k','-c:v libx264','-x264opts level=41','-threads 4','-s 640x480','-map 0:v','-map 0:a:0','-c:a mp3','-b:a 160000','-ac 2','-f hls','-hls_time 10','-hls_list_size 6','-hls_wrap 18','-start_number 1 stream.m3u8'])
           .writeToStream(res, function(retcode, error){
               if (!error){
                   console.log('file has been converted successfully', retcode.green);
               }else{
                   console.log('file conversion error', error.red);
               }
           });

    So to conclude this very long and code heavy question :

    I hope this does not come off as a lazy request, but could someone show or explain which FFMPEG settings could/should work on all platforms (modern browsers, Android and iOS), producing a stream of a static file which I can send to an HTML5 player.

    [EDIT] what I need if a generic option isn't available

    And if this is not possible, as some posts might suggest, I would love to see a set of FFMPEG settings that gets the job done properly as far as mp4 streaming is concerned (e.g. encoding a streamable mp4).

    The streaming mp4 needs the following:

    • A shifted moov atom (the index moved to the front of the file)
    • It needs to be h264
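
    Both requirements can be met in a single pass (a sketch, not a verified cross-device recipe; bitrates and the input path are placeholders):

    ```shell
    # h264 video plus a front-loaded moov atom in one encode.
    # -movflags +faststart relocates the index after encoding finishes,
    # so the output must be a seekable file, not a pipe.
    ffmpeg -i input.avi \
      -c:v libx264 -profile:v baseline -b:v 800k \
      -c:a aac -b:a 128k \
      -movflags +faststart \
      streamable.mp4
    ```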

    Thanks very much for your help!

  • Merge filename.mp4.tmp with filename.mp4

    10 January 2017, by Sathish

    How can I merge an abc.mp4.tmp swap file with the actual abc.mp4 file? I was trying to record a live event using Wowza Media Server 3.6 and the recorded file was not muxed properly at the final moment. The abc.mp4.tmp swap file was not merged with the actual abc.mp4 file. So could someone tell me how to merge abc.mp4.tmp and abc.mp4 into a new file that I can play with VLC?

    Here is the mediainfo of the files

    [root@ip-ss-21-98-2 content]# mediainfo 03_03_2014_12_03_08.mp4
    General
    Complete name                            : 03_03_2014_12_03_08.mp4
    Format                                   : MPEG-4
    Format profile                           : Adobe Flash
    Codec ID                                 : f4v
    File size                                : 5.20 GiB


    [root@ip-ss-21-98-2 content]# mediainfo 03_03_2014_12_03_08.mp4.tmp
    General
    Complete name                            : 03_03_2014_12_03_08.mp4.tmp
    File size                                : 38.3 MiB
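
    One thing worth trying (an assumption based on the mediainfo above, not a guaranteed recovery): the .tmp file is typically raw media data without a finished index, so a plain byte concatenation alone rarely yields a playable file, but appending the bytes and then asking ffmpeg to re-mux whatever it can parse sometimes works. The filenames below follow the example in the question:

    ```shell
    # 1) Append the temp data to the finished part into one candidate file
    cat 03_03_2014_12_03_08.mp4 03_03_2014_12_03_08.mp4.tmp > combined.bin

    # 2) Let ffmpeg re-mux the parseable streams into a fresh MP4 without re-encoding
    ffmpeg -i combined.bin -c copy recovered.mp4
    ```

    If ffmpeg cannot parse the combined file, this approach fails; in that case a recovery tool that rebuilds the moov atom from an intact reference recording (such as untrunc) may be a better fit.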