

Other articles (63)

  • Customizing by adding your own logo, banner, or background image

    5 September 2013

    Some themes support three customization elements: adding a logo; adding a banner; adding a background image.

  • Writing a news item

    21 June 2013

    Present changes to your MédiaSPIP site, or news about your projects, through the news section.
    In MédiaSPIP's default theme, spipeo, news items are displayed at the bottom of the main page, below the editorials.
    You can customize the form used to create a news item.
    News creation form: for a document of type "news item", the default fields are: publication date (customizing the publication date) (...)

  • Websites made with MediaSPIP

    2 May 2011

    This page lists some websites based on MediaSPIP.

On other sites (5853)

  • avformat/demux : add duration_probesize AVOption

    29 March 2024, by Nicolas Gaullier
    avformat/demux: add duration_probesize AVOption
    

    Yet another probesize, used to get durations when
    estimate_timings_from_pts is required. It is aimed at users who want
    better duration probing for its own sake, or who call
    avformat_find_stream_info indirectly and require exact values: for
    the concat demuxer, for example, especially when stream-copying on top of it.
    The current code is a performance trade-off that can fail to get video
    stream durations in a scenario with high bitrates and buffering, for
    files that end cleanly (as opposed to live captures): in such a case
    the physical gap between the last video packet and the last audio
    packet can be very large.

    Default behaviour is unchanged: 250k up to 250k << 6 (step by step).
    Setting this new option has two effects:
    - it overrides the maximum probesize (currently 250k << 6);
    - it reduces the number of steps to 1 instead of 6, to avoid
    detecting the audio "too early" and failing to reach a video packet.
    Even if a single audio stream duration is found but the other
    audio/video stream durations are not, there will be a retry, so in the
    end the full user-overridden probesize will be used, as the user expects.

    Signed-off-by: Nicolas Gaullier <nicolas.gaullier@cji.paris>

    • [DH] doc/APIchanges
    • [DH] doc/formats.texi
    • [DH] libavformat/avformat.h
    • [DH] libavformat/demux.c
    • [DH] libavformat/options_table.h
    • [DH] libavformat/version.h
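    The stepped probing described above can be modelled as a schedule of attempts. A rough sketch of that schedule follows (an illustration of the commit message, not FFmpeg's actual code; the exact default step count, seven values from 250k << 0 to 250k << 6, is an assumption):

```python
DEFAULT_DURATION_PROBESIZE = 250_000  # the "250k" default named in the commit message

def duration_probe_attempts(duration_probesize=None):
    """Model the duration-probing schedule described above.

    By default the probesize doubles step by step from 250k up to
    250k << 6; when the user sets duration_probesize, a single attempt
    at exactly that value is made instead.
    """
    if duration_probesize is not None:
        return [duration_probesize]
    return [DEFAULT_DURATION_PROBESIZE << step for step in range(7)]
```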
  • Android Merging two or more videos from the List to a single synchronised video

    18 August 2017, by Alok Kumar Verma

    I have an activity that shows a list of videos inside my RecyclerView, and a button which takes you to the next activity, which contains the VideoView.
    The main task is to merge the videos into a single synchronized video, and that video file will then be played in the VideoView.

    The flow is: EditVideoActivity.java (list of videos in the RecyclerView) -> merge the videos into one -> save the result somewhere -> play the merged video.

    I've researched this on several sites and learned that two tools are available for it:

    1. FFMPEG

    2. MP4Parser

    Now, I've followed some questions on StackOverflow about this:

    1. FFMPEG usage with an orientation problem: this doesn't serve my purpose, as that author used some custom code.
    2. Mp4Parser: they have moved their site to GitHub, but I got a 504 gateway timeout, so no guidance there.

    The Android developer documentation has MediaCodec, but when I looked into it, it was quite complex.

    I'm sending out the data, that is, the video paths as an array of strings, to the other activity, and that is working fine.

    I've done some homework: so far I have just appended the videos in the media player. But I need to actually merge the videos, and then the result has to be played inside the video player.

    1. When the button is clicked, the videos are merged and saved to the specified storage.

    2. The merged video is then played in the video player.

    EditVideo.java, from where the data is passed on the press of the button:

    audioButton.setOnClickListener(new View.OnClickListener() {
        @Override
        public void onClick(View view) {
            // the position of the selected video
            videoPosition = selectedVideo;

            if (videoPosition != null) {
                Intent intent = new Intent(EditVideoActivity.this, AudioActivity.class);
                intent.putExtra("videoData", videoPosition);
                Log.e("VIDEO_SENT_DATA=======", videoPosition);
                startActivity(intent);
            } else {
                Toast.makeText(EditVideoActivity.this, "You have not selected any video", Toast.LENGTH_LONG)
                        .show();
            }
        }
    });

    From here the data is passed; now I'm working on how to merge the videos on the press of the above button.

    And PreviewActivity.java is the page containing the VideoView, where the merged file is to be played.

    I've done this, appending the videos in onCreate():

    // getting the passed value from the previous activity
    Bundle extras = getIntent().getExtras();
    final ArrayList<String> videoReceived = extras.getStringArrayList("videos");
    Log.e("VIDEO_RECEIVED", videoReceived.toString());

    mVideoPlayer.setVideoPath(String.valueOf(videoReceived.get(0)));
    mMediaController = new MediaController(this);
    mMediaController.setMediaPlayer(mVideoPlayer);
    mVideoPlayer.setMediaController(mMediaController);
    mVideoPlayer.setBackgroundColor(Color.TRANSPARENT);
    mVideoPlayer.requestFocus();
    mVideoPlayer.start();
    Log.e("VIDEO_SIZE===", String.valueOf(videoReceived.size()));
    mVideoPlayer.setOnCompletionListener(new MediaPlayer.OnCompletionListener() {
        @Override
        public void onCompletion(MediaPlayer mediaPlayer) {
            if (currentIndex < videoReceived.size()) {
                String uri = String.valueOf(videoReceived.get(currentIndex));
                mVideoPlayer.setVideoPath(uri);
                mVideoPlayer.start();
                currentIndex++;
            }
        }
    });
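    Of the two tools listed above, FFMPEG's concat demuxer is a common way to merge files without re-encoding, provided all inputs share the same codecs and parameters. A minimal sketch of building such an invocation (the list-file name, paths, and output name are hypothetical; the command is only constructed here, not executed):

```python
import shlex

def build_concat_command(video_paths, output_path, list_file="list.txt"):
    """Build the concat-demuxer input list and the ffmpeg command line.

    -c copy stream-copies the inputs, which only works when they share
    codecs and parameters; otherwise re-encoding is required.
    """
    list_lines = "\n".join("file " + shlex.quote(p) for p in video_paths)
    cmd = ["ffmpeg", "-f", "concat", "-safe", "0",
           "-i", list_file, "-c", "copy", output_path]
    return list_lines, cmd
```

    The list file's contents (list_lines) would be written to list_file before running the command.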
  • Setting individual pixels of an RGB frame for ffmpeg encoding

    15 May 2013, by Camille Goudeseune

    I'm trying to change the test pattern of an ffmpeg streamer (see Trouble syncing libavformat/ffmpeg with x264 and RTP) into the familiar RGB format. My broader goal is to compute frames of a streamed video on the fly.

    So I replaced its AV_PIX_FMT_MONOWHITE with AV_PIX_FMT_RGB24, which is "packed RGB 8:8:8, 24bpp, RGBRGB..." according to http://libav.org/doxygen/master/pixfmt_8h.html .

    To stuff its pixel array called data, I've tried many variations on

     for (int y=0; y<HEIGHT; ++y) {
       for (int x=0; x<WIDTH; ++x) {
         unsigned char* rgb = data + 3 * (y*WIDTH + x);
         const double i = x/double(WIDTH);
         // const double j = y/double(HEIGHT);
         rgb[0] = 255*i;
         rgb[1] = 0;
         rgb[2] = 255*(1-i);
       }
     }

    At HEIGHT x WIDTH = 80x60, this version yields red-to-blue stripes, when I expect a single blue-to-red horizontal gradient.

    640x480 yields the same 4-column pattern, but with far more horizontal stripes.

    640x640, 160x160, etc, yield three columns, cyan-ish / magenta-ish / yellow-ish, with the same kind of horizontal stripiness.

    Vertical gradients behave even more weirdly.

    Appearance was unaffected by an AV_PIX_FMT_RGBA attempt (4 not 3 bytes per pixel, alpha=255). Also unaffected by a port from C to C++.

    The argument srcStrides passed to sws_scale() is a length-1 array, containing the single int HEIGHT.

    Access each Pixel of AVFrame asks the same question in less detail, so far unanswered.

    The streamer emits one warning, which I doubt affects appearance:

    [rtp @ 0x269c0a0] Encoder did not produce proper pts, making some up.

    So. How do you set the RGB value of a pixel in a frame to be sent to sws_scale() (and then to x264_encoder_encode() and av_interleaved_write_frame())?
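    For what it's worth, packed AV_PIX_FMT_RGB24 is stored row-major with a stride of WIDTH*3 bytes per row, which suggests the length-1 srcStrides array should contain WIDTH*3 rather than HEIGHT. A small sketch of the intended blue-to-red horizontal gradient under that indexing (in Python for brevity; the dimensions are arbitrary):

```python
def fill_gradient(width, height):
    """Fill a packed RGB24 buffer (RGBRGB..., row-major) with a
    horizontal blue-to-red gradient.

    For AV_PIX_FMT_RGB24 the row stride is width*3 bytes.
    """
    data = bytearray(width * height * 3)
    stride = width * 3  # bytes per row of packed RGB24
    for y in range(height):
        for x in range(width):
            i = x / width
            p = y * stride + x * 3
            data[p + 0] = int(255 * i)        # red ramps up left to right
            data[p + 1] = 0                   # no green
            data[p + 2] = int(255 * (1 - i))  # blue ramps down
    return data
```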