Advanced search

Media (0)

Word: - Tags - /xmlrpc

No media matching your criteria is available on this site.

Other articles (28)

  • Emballe Médias: putting documents online simply

    29 October 2010

    The emballe médias plugin was developed primarily for the MediaSPIP distribution, but it is also used in other related projects such as géodiversité. Required and compatible plugins
    For this plugin to work, the following plugins must be installed: CFG, Saisies, SPIP Bonux, Diogène, swfupload, jqueryui
    Other plugins can be used alongside it to extend its capabilities: Ancres douces, Légendes, photo_infos, spipmotion (...)

  • HTML5 audio and video support

    13 April 2011

    MediaSPIP uses HTML5 video and audio tags to play multimedia files, taking advantage of the latest W3C innovations supported by modern browsers.
    The MediaSPIP player was created specifically for MediaSPIP and can easily be adapted to fit a specific theme.
    For older browsers the Flowplayer flash fallback is used.
    MediaSPIP allows for media playback on major mobile platforms with the above (...)

  • General document management

    13 May 2011

    MediaSPIP never modifies the original document that is put online.
    For each uploaded document it performs two successive operations: creating an additional version that can easily be viewed online, while keeping the original downloadable in case it cannot be read in a web browser; and retrieving the original document's metadata to describe the file textually (a rough sketch follows below).
    The tables below explain what MediaSPIP can do (...)
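
    To give a concrete picture of those two operations, here is a minimal command-line sketch; the ffmpeg/ffprobe invocations and file names are illustrative assumptions, not MediaSPIP's actual pipeline:

# 1. create a browser-friendly rendition, leaving the original untouched
#    (original.avi and web_version.mp4 are placeholder names)
ffmpeg -i original.avi -c:v libx264 -c:a aac web_version.mp4

# 2. read the original file's metadata to describe it textually
ffprobe -v quiet -print_format json -show_format -show_streams original.avi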

On other sites (6394)

  • FFmpeg stream extraction modifies subtitles [closed]

    21 May 2024, by user18812922

    I have a video with the following ffprobe output:

    Input #0, matroska,webm, from 'video.mkv':
  Metadata:
    title           : Video - 01
    creation_time   : 2021-07-14T02:49:59.000000Z
    ENCODER         : Lavf58.29.100
  Duration: 00:22:57.28, start: 0.000000, bitrate: 392 kb/s
  Chapters:
    Chapter #0:0: start 0.000000, end 86.169000
      Metadata:
        title           : Opening
    Chapter #0:1: start 86.169000, end 641.266000
      Metadata:
        title           : Part A
    Chapter #0:2: start 641.266000, end 651.359000
      Metadata:
        title           : Eyecatch
    Chapter #0:3: start 651.359000, end 1286.160000
      Metadata:
        title           : Part B
    Chapter #0:4: start 1286.160000, end 1356.355000
      Metadata:
        title           : Ending
    Chapter #0:5: start 1356.355000, end 1376.876000
      Metadata:
        title           : Preview
  Stream #0:0: Video: hevc (Main 10), yuv420p10le(tv, bt709), 854x480 [SAR 1280:1281 DAR 16:9], 23.98 fps, 23.98 tbr, 1k tbn (default)
      Metadata:
        DURATION        : 00:22:56.959000000
  Stream #0:1(eng): Audio: vorbis, 48000 Hz, stereo, fltp (default)
      Metadata:
        title           : English [FLAC 2.0]
        DURATION        : 00:22:57.278000000
  Stream #0:2(jpn): Audio: vorbis, 48000 Hz, stereo, fltp
      Metadata:
        title           : Japanese [FLAC 2.0]
        DURATION        : 00:22:57.276000000
  Stream #0:3(eng): Subtitle: ass (ssa)
      Metadata:
        title           : Signs and Songs [FMA1394/Redc4t]
        DURATION        : 00:22:51.090000000
  Stream #0:4(eng): Subtitle: ass (ssa)
      Metadata:
        title           : English [FMA1394/Redc4t]
        DURATION        : 00:22:51.090000000
  Stream #0:5(eng): Subtitle: hdmv_pgs_subtitle (pgssub), 1920x1080
      Metadata:
        title           : Full English Retail
        DURATION        : 00:22:51.120000000
  Stream #0:6: Attachment: ttf
      Metadata:
        filename        : 8bitoperator.ttf
        mimetype        : application/x-truetype-font
  Stream #0:7: Attachment: ttf
      Metadata:
        filename        : Cabin-Bold.ttf
        mimetype        : application/x-truetype-font
  Stream #0:8: Attachment: ttf
      Metadata:
        filename        : calibrib.ttf
        mimetype        : application/x-truetype-font
  Stream #0:9: Attachment: ttf
      Metadata:
        filename        : daniel_0.ttf
        mimetype        : application/x-truetype-font
  Stream #0:10: Attachment: ttf
      Metadata:
        filename        : DEATH_FONT.TTF
        mimetype        : application/x-truetype-font
  Stream #0:11: Attachment: ttf
      Metadata:
        filename        : Dominican.ttf
        mimetype        : application/x-truetype-font
  Stream #0:12: Attachment: ttf
      Metadata:
        filename        : gishabd.ttf
        mimetype        : application/x-truetype-font
  Stream #0:13: Attachment: ttf
      Metadata:
        filename        : PATRICK_0.TTF
        mimetype        : application/x-truetype-font
  Stream #0:14: Attachment: ttf
      Metadata:
        filename        : Qlassik-Medium.ttf
        mimetype        : application/x-truetype-font
Unsupported codec with id 98304 for input stream 6
Unsupported codec with id 98304 for input stream 7
Unsupported codec with id 98304 for input stream 8
Unsupported codec with id 98304 for input stream 9
Unsupported codec with id 98304 for input stream 10
Unsupported codec with id 98304 for input stream 11
Unsupported codec with id 98304 for input stream 12
Unsupported codec with id 98304 for input stream 13
Unsupported codec with id 98304 for input stream 14


    


    I am trying to extract the subtitles, edit them, and reattach them to the video.
    (I need my program to do this, so I don't want to use other software.)

    


    Command 1

    ffmpeg -i video.mkv -map 0:3 -c:s ssa subs.ass
ffmpeg -i video.mkv -i subs.ass -map 0 -map -0:s -map 1 -c copy out.mkv


    


    Command 2

    ffmpeg -i video.mkv -map 0:3 subs.ass
ffmpeg -i video.mkv -i subs.ass -map 0 -map -0:s -map 1 -c copy out.mkv


    


    Command 3

    ffmpeg -i video.mkv -map 0:3 subs.srt
ffmpeg -i video.mkv -i subs.srt -map 0 -map -0:s -map 1 -c copy out.mkv


    


    Command 4

    ffmpeg -i video.mkv -map 0:3 subs.srt
ffmpeg -i subs.srt subs.ass
ffmpeg -i video.mkv -i subs.ass -map 0 -map -0:s -map 1 -c copy out.mkv


    


    Command 5

    ffmpeg -i video.mkv -map 0:3 subs.ass
ffmpeg -i subs.ass subs.srt
ffmpeg -i video.mkv -i subs.srt -map 0 -map -0:s -map 1 -c copy out.mkv


    


    The problem

    After extraction, each subtitle is displayed only very briefly: it appears and disappears almost immediately.

    


    For example, the first subtitle is as follows in the extracted .srt:

1
00:00:03,100 --> 00:00:03,560
<font face="Dominican" size="77" color="#f7f7f7">Within the spreading darkness</font>

    Now, in the .srt the size is also wrong, but I assume that is because of the conversion from ass to srt.

    If I reattach the subtitle file to the video and open it, the subtitles are displayed and disappear far too fast, and they don't match the original subtitles in the video
    (i.e., the original video subtitles are shown for at least a second).

    Expected behaviour

    The subtitles should be displayed for the same duration as the original subtitles.

    NOTE

    It's my first question about an ffmpeg-related issue, so feel free to ask for anything else you may need.

    UPDATE 1

    I realized that the timings were actually fine: the file simply repeats the same line across several consecutive cues. So the reason the subtitles don't play is something else.

    Example of the file:

1
00:00:03,100 --> 00:00:03,560
<font face="Dominican" size="77" color="#f7f7f7">Within the spreading darkness</font>

2
00:00:03,560 --> 00:00:04,650
<font face="Dominican" size="77" color="#f7f7f7">Within the spreading darkness</font>

3
00:00:04,650 --> 00:00:05,100
<font face="Dominican" size="77" color="#f7f7f7">Within the spreading darkness</font>

    So the problem is that VLC doesn't show anything beyond the first subtitle.

    The strange thing is that when I use the command below

ffmpeg -i video.mkv -i subs.srt -map 0 -map -0:s -map 1 -c copy -c:s subrip out.mkv

    then more lines of the subtitle play (but not all): it stops at the 17th line.

    I believe that's an encoder problem? But I really don't know.

    I also noticed that VLC stops showing the subtitles, while Windows Media Player (the Windows 11 version) displays them correctly even after the 17th line.

    BUT, if I add subtitles from another video, they play correctly in both VLC and Windows Media Player.

    UPDATE 2

    As @Gyan said in his answer, I should use the following command:

ffmpeg -i video.mkv -map 0:3 -c:s copy subs.ass

    But then, if I attach the subs again with

ffmpeg -i video.mkv -i subs.ass -map 0 -map -0:s -map 1 -c copy -c:s ass out.mkv

    the subtitles show up only until the 17th line in both VLC and Windows Media Player, or with

ffmpeg -i video.mkv -i .\subs.ass -map 0 -map -0:s -map 1 -c copy out.mkv

    the subtitles do not show up at all (not even in Windows Media Player).
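
    For reference, a minimal sketch of the copy-based round trip described above, plus a sanity check; the first two commands mirror the question and @Gyan's suggestion, while the ffprobe line (counting the subtitle packets that actually reached the output) is an added debugging assumption, not part of the original workflow:

# extract the ASS stream without re-encoding, preserving timing and styles
ffmpeg -i video.mkv -map 0:3 -c:s copy subs.ass

# ...edit subs.ass...

# remux: keep everything except the original subtitle streams, add the edited file
ffmpeg -i video.mkv -i subs.ass -map 0 -map -0:s -map 1 -c copy out.mkv

# count subtitle packets in the result to see whether the cues survived the remux
ffprobe -v error -select_streams s:0 -show_entries packet=pts_time -of csv out.mkv | wc -l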

  • nodejs ffmpeg play video at specific time and stream it to client

    12 March 2020, by bluejayke

    I’m trying to make a basic online video editor with nodeJS and ffmpeg.

    To do this I need two steps:

    1. Set the in and out times of the videos from the client, which requires the client to view the video at specific times and to jump around in it. That is, if a single video is used as input and is split into smaller parts, playback needs to resume from the start time of the next edited segment, if that makes sense.

    2. Send the in/out data to nodejs and export the result with ffmpeg as a finished video.

    At first I wanted to do step 1 purely on the client, then upload the source video(s) to nodeJS, generate the same result with ffmpeg, and send back the result.

    But there are many problems with client-side video processing in HTML at the moment, so now I have a change of plans: do all of the processing on the nodeJS server, including the video playback.

    This is the part I am stuck at now. I'm aware that ffmpeg can be used in many different ways from nodeJS, but I have not found a way to play an mp4/webm video in realtime with ffmpeg, starting at a specific timestamp, and send the streaming video (again, from a certain timestamp) to the client.

    I've seen ffmpeg's pipe:1 output, but I couldn't find any tutorial that gets it working with an mp4/webm video and parses the stdout data with nodejs to send it to the client. And even if I could get that part to work, I still have no idea how to play the video, in realtime, from a certain timestamp.
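
    (For what it's worth, a minimal command-line sketch of that idea; the timestamp and input file name are placeholders, and this is an illustration rather than a tested solution for this exact setup. It seeks with -ss, remuxes to fragmented MP4 so the result is playable from a non-seekable pipe, and writes to stdout; a nodejs server could spawn this with child_process.spawn and pipe the child's stdout straight to the HTTP response.)

ffmpeg -ss 00:01:30 -i input.mp4 -c copy -movflags frag_keyframe+empty_moov -f mp4 pipe:1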

    I've also seen ffplay, but that's only for testing as far as I know; I haven't seen any way of getting the video data from it in realtime with nodejs.

    So: how can I play a video in nodeJS at a specific time (preferably with ffmpeg), and send it back to the client in realtime?

    What I have already seen:

    Best approach to real time http streaming to HTML5 video client

    Live streaming using FFMPEG to web audio api

    Ffmpeg - How to force MJPEG output of whole frames ?

    ffmpeg : Render webm from stdin using NodeJS

    No data written to stdin or stderr from ffmpeg

    node.js live streaming ffmpeg stdout to res

    Realtime video conversion using nodejs and ffmpeg

    Pipe output of ffmpeg using nodejs stdout

    can’t re-stream using FFMPEG to MP4 HTML5 video

    FFmpeg live streaming webm video to multiple http clients over Nodejs

    http://www.mobiuso.com/blog/2018/04/18/video-processing-with-node-ffmpeg-and-gearman/

    stream mp4 video with node fluent-ffmpeg

    How to get specific start & end time in ffmpeg by Node JS ?

    Live streaming : node-media-server + Dash.js configured for real-time low latency

    Low Latency (50ms) Video Streaming with NODE.JS and html5

    Server node.js for livestreaming

    HLS Streaming using node JS

    Stream part of the video to the client

    Video streaming with HTML 5 via node.js

    Streaming a video file to an html5 video player with Node.js so that the video controls continue to work ?

    How to (pseudo) stream H.264 video - in a cross browser and html5 way ?

    Pseudo Streaming an MP4 file

    How to stream video data to a video element ?

    How do I convert an h.264 stream to MP4 using ffmpeg and pipe the result to the client ?

    https://medium.com/@brianshaler/on-the-fly-video-rendering-with-node-js-and-ffmpeg-165590314f2

    node.js live streaming ffmpeg stdout to res

    Can Node.js edit video files ?

  • bitmap to yuv, video recorded has only green pixels

    19 January 2016, by UserAx

    I am trying to convert a bitmap to YUV and record this YUV with the FFmpeg frame recorder...
    I am getting video output with only green pixels, though when I check the properties of this video it shows the set frame rate and the resolution...

    The YUV encoding part is correct, but I feel I am making a mistake somewhere else, most likely in passing the YUV bytes back to the recording part (getByte(byte[] yuv)), because only there does the console show yuv.length as 0; all the other methods print a large value...

    Kindly help...

    @Override
    public void onCreate(Bundle savedInstanceState) {
       super.onCreate(savedInstanceState);
       setContentView(R.layout.activity_main);
       directory.mkdirs();

       addListenerOnButton();

       play=(Button)findViewById(R.id.buttonplay);
       stop=(Button)findViewById(R.id.buttonstop);
       record=(Button)findViewById(R.id.buttonstart);

       stop.setEnabled(false);
       play.setEnabled(false);


       record.setOnClickListener(new View.OnClickListener() {
           @Override
           public void onClick(View v) {
               startRecording();
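               // note: this passes a brand-new empty array, so inside getByte()
               // yuv.length is 0 and nothing is copied into the recorder's buffer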
               getByte(new byte[]{});
           }
       });

       stop.setOnClickListener(new View.OnClickListener() {
           @Override
           public void onClick(View v) {
               stopRecording();
           }
       });


       play.setOnClickListener(new View.OnClickListener() {
           @Override
           public void onClick(View v) throws IllegalArgumentException, SecurityException, IllegalStateException {
               Intent intent = new Intent(Intent.ACTION_VIEW, Uri.parse(String.valueOf(asmileys)));
               intent.setDataAndType(Uri.parse(String.valueOf(asmileys)), "video/mp4");
               startActivity(intent);
               Toast.makeText(getApplicationContext(), "Playing Video", Toast.LENGTH_LONG).show();
           }
       });

    }

    ......//......



    public void getByte(byte[] yuv) {
       getNV21(640, 480, bitmap);
       System.out.println(yuv.length + " ");
       if (audioRecord == null || audioRecord.getRecordingState() != AudioRecord.RECORDSTATE_RECORDING) {
           startTime = System.currentTimeMillis();
           return;
       }
       if (RECORD_LENGTH > 0) {
           int i = imagesIndex++ % images.length;
           yuvimage = images[i];
           timestamps[i] = 1000 * (System.currentTimeMillis() - startTime);
       }
           /* get video data */
       if (yuvimage != null && recording) {
               ((ByteBuffer) yuvimage.image[0].position(0)).put(yuv);

               if (RECORD_LENGTH <= 0) {
                   try {
                       long t = 1000 * (System.currentTimeMillis() - startTime);
                       if (t > recorder.getTimestamp()) {
                           recorder.setTimestamp(t);
                       }
                       recorder.record(yuvimage);
                   } catch (FFmpegFrameRecorder.Exception e) {

                       e.printStackTrace();
                   }
               }
           }
    }

    public byte [] getNV21(int inputWidth, int inputHeight, Bitmap bitmap) {

       int[] argb = new int[inputWidth * inputHeight];

       bitmap.getPixels(argb, 0, inputWidth, 0, 0, inputWidth, inputHeight);

       byte[] yuv = new byte[inputWidth * inputHeight * 3 / 2];
       encodeYUV420SP(yuv, argb, inputWidth, inputHeight);

       bitmap.recycle();
       System.out.println(yuv.length + " ");
       return yuv;

    }

    void encodeYUV420SP(byte[] yuv420sp, int[] argb, int width, int height) {
       final int frameSize = width * height;

       int yIndex = 0;
       int uIndex = frameSize;
       int vIndex = frameSize;
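       // note: NV21 stores V and U interleaved in a single chroma plane; with uIndex
       // and vIndex both starting at frameSize, the two writes below land on the
       // same bytes and overwrite each other, so the chroma samples collide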
       System.out.println(yuv420sp.length + " " + frameSize);

       int a, R, G, B, Y, U, V;
       int index = 0;
       for (int j = 0; j < height; j++) {
           for (int i = 0; i < width; i++) {

               a = (argb[index] & 0xff000000) >> 24; // a is not used obviously
               R = (argb[index] & 0xff0000) >> 16;
               G = (argb[index] & 0xff00) >> 8;
               B = (argb[index] & 0xff) >> 0;

               // well known RGB to YUV algorithm

               Y = ((66 * R + 129 * G + 25 * B + 128) >> 8) + 16;
               U = ((-38 * R - 74 * G + 112 * B + 128) >> 8) + 128;
               V = ((112 * R - 94 * G - 18 * B + 128) >> 8) + 128;

               // NV21 has a plane of Y and interleaved planes of VU each sampled by a factor of 2
               //    meaning for every 4 Y pixels there are 1 V and 1 U.  Note the sampling is every other
               //    pixel AND every other scanline.
               yuv420sp[yIndex++] = (byte) ((Y < 0) ? 0 : ((Y > 255) ? 255 : Y));
               if (j % 2 == 0 && index % 2 == 0) {
                   yuv420sp[uIndex++] = (byte) ((U < 0) ? 0 : ((U > 255) ? 255 : U));
                   yuv420sp[vIndex++] = (byte) ((V < 0) ? 0 : ((V > 255) ? 255 : V));
               }

               index++;
           }
       }
    }

    .....//.....

    public void addListenerOnButton() {
    image = (ImageView) findViewById(R.id.imageView);
    image.setDrawingCacheEnabled(true);
    image.buildDrawingCache();
    bitmap = image.getDrawingCache();
    System.out.println(bitmap.getByteCount() + " " );

    button = (Button) findViewById(R.id.btn1);
    button.setOnClickListener(new OnClickListener() {
    @Override
    public void onClick(View view){
       image.setImageResource(R.drawable.image1);
     }
    });

    ......//......

    EDIT 1:

    I made a few changes to the above code:

    record.setOnClickListener(new View.OnClickListener() {
           @Override
           public void onClick(View v) {
               startRecording();
               getByte();
           }
       });
    .....//....

    public void getByte() {
       byte[] yuv = getNV21(640, 480, bitmap);

    So now in the console I get the same yuv length in this method as the yuv length from the getNV21 method.

    But now I am getting a half-black, half-green screen (black above and green below) in the recorded video...

    If I add these lines to the onCreate method:

    image = (ImageView) findViewById(R.id.imageView);
    image.setDrawingCacheEnabled(true);
    image.buildDrawingCache();
    bitmap = image.getDrawingCache();

    I do get distorted frames (frames are 1/4 of the displayed image, with colors mixed up here and there) in the video...

    All I am trying to learn is the image processing and the flow of byte[] data from one method to another, but I am still a noob...

    Kindly help..!