Advanced search

Media (16)

Keyword: - Tags -/mp3

Other articles (77)

  • MediaSPIP Core: Configuration

    9 November 2010, by

    By default, MediaSPIP Core provides three different configuration pages (these pages rely on the CFG configuration plugin to work): a page for the general configuration of the templates (squelettes); a page for the configuration of the site's home page; a page for the configuration of the sections;
    It also provides an additional page, shown only when certain plugins are enabled, for controlling their display and their specific features (...)

  • Emballe médias: what is it for?

    4 February 2011, by

    This plugin is designed to manage sites that publish documents of all types.
    It creates "media" items, namely: a "media" item is an article in the SPIP sense, created automatically when a document is uploaded, whether it is audio, video, image or text; only a single document can be linked to a so-called "media" article;

  • Installation in farm mode

    4 February 2011, by

    Farm mode makes it possible to host several MediaSPIP sites while installing their functional core only once.
    This is the method we use on this very platform.
    Using farm mode requires some knowledge of how SPIP works, unlike the standalone version, which does not really require any specific knowledge since SPIP's usual private area is no longer used.
    First, you must have installed the same files as for the installation (...)

On other sites (5333)

  • Is there an efficient way to retrieve frames from a video in Android?

    28 March 2015, by Naveed

    I have an app which requires me to retrieve frames from a video and do some processing on them. However, frame retrieval is very slow, to the point of being unacceptable: sometimes it takes up to 2.5 seconds to retrieve a single frame. I am using MediaMetadataRetriever, as most Stack Overflow answers suggest, but the performance is very poor. Here is what I have:

      private List<Bitmap> retrieveFrames() {
          MediaMetadataRetriever fmmr = new MediaMetadataRetriever();
          fmmr.setDataSource("/path/to/some/video.mp4");
          String strLength = fmmr.extractMetadata(MediaMetadataRetriever.METADATA_KEY_DURATION);
          long milliSecs = Long.parseLong(strLength);
          long microSecLength = milliSecs * 1000;

          Log.d("TAG", "length: " + microSecLength);
          long one_sec = 1000000; // one second in microseconds

          ArrayList<Bitmap> frames = new ArrayList<>();
          int j = 0;
          for (int i = 0; i < microSecLength; i += (one_sec / 5)) {
              long time = System.currentTimeMillis();
              Bitmap frame = fmmr.getFrameAtTime(i, MediaMetadataRetriever.OPTION_CLOSEST);
              j++;
              Log.d("TAG", "Frame number: " + j + " Time taken: " + (System.currentTimeMillis() - time));
              // commented out because each frame would be written to disk instead of holding them in memory
              //  frames.add(frame);
          }
          fmmr.release();
          return frames;
      }

    The above logs:

    03-26 21:49:29.781  13213-13239/com.example.naveed.myapplication D/TAG﹕ length: 4949000
    03-26 21:49:30.187  13213-13239/com.example.naveed.myapplication D/TAG﹕ Frame number: 1 Time taken: 406
    03-26 21:49:30.779  13213-13239/com.example.naveed.myapplication D/TAG﹕ Frame number: 2 Time taken: 592
    03-26 21:49:31.578  13213-13239/com.example.naveed.myapplication D/TAG﹕ Frame number: 3 Time taken: 799
    03-26 21:49:32.632  13213-13239/com.example.naveed.myapplication D/TAG﹕ Frame number: 4 Time taken: 1054
    03-26 21:49:33.895  13213-13239/com.example.naveed.myapplication D/TAG﹕ Frame number: 5 Time taken: 1262
    03-26 21:49:35.382  13213-13239/com.example.naveed.myapplication D/TAG﹕ Frame number: 6 Time taken: 1486
    03-26 21:49:37.128  13213-13239/com.example.naveed.myapplication D/TAG﹕ Frame number: 7 Time taken: 1746
    03-26 21:49:39.077  13213-13239/com.example.naveed.myapplication D/TAG﹕ Frame number: 8 Time taken: 1948
    03-26 21:49:41.287  13213-13239/com.example.naveed.myapplication D/TAG﹕ Frame number: 9 Time taken: 2210
    03-26 21:49:43.717  13213-13239/com.example.naveed.myapplication D/TAG﹕ Frame number: 10 Time taken: 2429
    03-26 21:49:44.093  13213-13239/com.example.naveed.myapplication D/TAG﹕ Frame number: 11 Time taken: 376
    03-26 21:49:44.707  13213-13239/com.example.naveed.myapplication D/TAG﹕ Frame number: 12 Time taken: 614
    03-26 21:49:45.539  13213-13239/com.example.naveed.myapplication D/TAG﹕ Frame number: 13 Time taken: 831
    03-26 21:49:46.597  13213-13239/com.example.naveed.myapplication D/TAG﹕ Frame number: 14 Time taken: 1057
    03-26 21:49:47.875  13213-13239/com.example.naveed.myapplication D/TAG﹕ Frame number: 15 Time taken: 1278
    03-26 21:49:49.384  13213-13239/com.example.naveed.myapplication D/TAG﹕ Frame number: 16 Time taken: 1508
    03-26 21:49:51.112  13213-13239/com.example.naveed.myapplication D/TAG﹕ Frame number: 17 Time taken: 1728
    03-26 21:49:53.096  13213-13239/com.example.naveed.myapplication D/TAG﹕ Frame number: 18 Time taken: 1983
    03-26 21:49:55.315  13213-13239/com.example.naveed.myapplication D/TAG﹕ Frame number: 19 Time taken: 2218
    03-26 21:49:57.711  13213-13239/com.example.naveed.myapplication D/TAG﹕ Frame number: 20 Time taken: 2396
    03-26 21:49:58.065  13213-13239/com.example.naveed.myapplication D/TAG﹕ Frame number: 21 Time taken: 354
    03-26 21:49:58.640  13213-13239/com.example.naveed.myapplication D/TAG﹕ Frame number: 22 Time taken: 574
    03-26 21:49:59.369  13213-13239/com.example.naveed.myapplication D/TAG﹕ Frame number: 23 Time taken: 728
    03-26 21:50:00.112  13213-13239/com.example.naveed.myapplication D/TAG﹕ Frame number: 24 Time taken: 742
    03-26 21:50:00.834  13213-13239/com.example.naveed.myapplication D/TAG﹕ Frame number: 25 Time taken: 721

    As you can see from the log above, it takes roughly 30 seconds in total to retrieve 25 frames from a video that is only about 5 seconds long.

    I have also tried this, which uses FFmpeg under the hood to do the same. I am not sure how well that library is implemented, but it only improves the overall performance by a couple of seconds, meaning it takes about 15-20 seconds to do the same.

    So my question is: is there a way to do this faster? My friend has an iOS app where he does something similar; it only takes a couple of seconds even though he grabs more frames, but he does not know how to do it on Android.

    Is there anything on Android that would speed up the process? Am I approaching this wrong?

    The end goal is to stitch those frames together into a GIF.
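
    A likely reason for the slowdown pattern in the log is that OPTION_CLOSEST forces the retriever to decode from the nearest preceding keyframe on every call, so each successive request inside a GOP costs more until the next keyframe resets the counter; OPTION_CLOSEST_SYNC only returns keyframes and is typically much cheaper, at the price of timestamp accuracy. Since the end goal is a GIF and an FFmpeg-based library is already under consideration, another option is to let ffmpeg decode the video once, sequentially, and dump frames at the desired rate. A minimal desktop-style sketch, assuming ffmpeg is on the PATH (paths, rate and sizes are illustrative; on Android the same arguments would have to go through an ffmpeg wrapper library):

      # Hypothetical sketch: sample 5 frames per second in one sequential decode,
      # then assemble the same sampling into a GIF. Not the asker's code.
      import subprocess

      VIDEO = "/path/to/some/video.mp4"  # placeholder input, as in the question

      # One pass over the file: the decoder never seeks back to a keyframe,
      # which is what makes the getFrameAtTime() loop above so slow.
      subprocess.run(["ffmpeg", "-i", VIDEO, "-vf", "fps=5", "frame_%03d.png"], check=True)

      # Optional: go straight to the GIF instead of writing individual frames.
      subprocess.run(["ffmpeg", "-i", VIDEO, "-vf", "fps=5,scale=320:-1", "out.gif"], check=True)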

  • Split H.264 stream into multiple parts in Python

    31 January 2023, by BillPlayz

    My objective is to split an H.264 stream into multiple parts: while reading the stream from a pipe, I would like to save it into x-second-long chunks (10 seconds in my case).


    I am using a libcamera-vid subprocess on my Raspberry Pi that writes the H.264 stream to stdout.
    This might or might not be relevant: libcamera-vid prints a status message for every frame, and I detect it with isFrameStopLine.
    To convert the stream, I use an ffmpeg subprocess, as you can see in the code below.


    Imagine it like this:
    The stream is running...
    - Start recording to a file
    - Sleep x seconds
    - Finish recording to the file
    - Start recording a new file
    - Sleep x seconds
    - Finish recording the new file
    - and so on...


    Here is my current code; however, when it runs, the first export succeeds, and after the second or third one the ffmpeg subprocess terminates with the error:

        pipe:: Invalid data found when processing input

    Shortly afterwards the Python process dies too, because of the ffmpeg termination I believe:

        Traceback (most recent call last):
          File "/home/survpi-camera/main.py", line 56, in <module>
            processStreamLine(readData)
          File "/home/survpi-camera/main.py", line 16, in processStreamLine
            streamInfo["process"].stdin.write(data)
        BrokenPipeError: [Errno 32] Broken pipe


     import subprocess
     import time

     recentStreamProcesses = []
     streamInfo = {
         "lastStreamStart": -1,
         "process": None
     }

     def processStreamLine(data):
         isInfoLine = ((data.startswith(b"[") and (b"INFO" in data)) or (data == b"Preview window unavailable"))
         isFrameStopLine = (data.startswith(b"#") and (b" fps) exp" in data))
         if ((not isInfoLine) and (not isFrameStopLine)):
             streamInfo["process"].stdin.write(data)

         if (isFrameStopLine):
             if (time.time() - streamInfo["lastStreamStart"] >= 10):
                 print("10 seconds passed, exporting...")
                 exportStream()
                 createNewStream()

     def createNewStream():
         streamInfo["lastStreamStart"] = time.time()
         streamInfo["process"] = subprocess.Popen([
             "ffmpeg",
             "-r", "30",
             "-i", "-",
             "-c", "copy", ("/home/survpi-camera/" + str(round(time.time())) + ".mp4")
         ], stdin=subprocess.PIPE, stderr=subprocess.STDOUT)
         print("Created new streamProcess.")

     def exportStream():
         print("Exporting...")
         streamInfo["process"].stdin.close()
         recentStreamProcesses.append(streamInfo["process"])


     cameraProcess = subprocess.Popen([
         "libcamera-vid",
         "-t", "0",
         "--width", "1920",
         "--height", "1080",
         "--codec", "h264",
         "--inline",
         "--listen",
         "-o", "-"
     ], stdout=subprocess.PIPE, stdin=subprocess.PIPE, stderr=subprocess.STDOUT)

     createNewStream()

     while True:
         readData = cameraProcess.stdout.readline()

         processStreamLine(readData)


    Thank you in advance!
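
    For reference, one way to sidestep the manual splitting entirely is to keep a single ffmpeg process and let its segment muxer cut the copied stream into fixed-length files; the Python side then never has to decide where a chunk begins (reading a raw H.264 byte stream with readline() is also fragile, since the data is not line-oriented). A minimal sketch under the same assumptions as the code above (paths, resolution and the 10-second target are illustrative, not the asker's code):

      # Hypothetical alternative: one libcamera-vid process piped into one ffmpeg
      # process that writes numbered 10-second MP4 files.
      import subprocess

      camera = subprocess.Popen(
          ["libcamera-vid",
           "-t", "0",
           "--width", "1920", "--height", "1080",
           "--codec", "h264", "--inline",
           "-o", "-"],
          stdout=subprocess.PIPE,
      )

      # -f segment cuts the output into numbered files; -segment_time 10 targets
      # 10 s, splitting at the first keyframe after each boundary when stream-copying;
      # -reset_timestamps 1 makes every file start at t=0 so it plays on its own.
      ffmpeg = subprocess.Popen(
          ["ffmpeg",
           "-r", "30", "-i", "-",
           "-c", "copy",
           "-f", "segment",
           "-segment_time", "10",
           "-reset_timestamps", "1",
           "-segment_format", "mp4",
           "/home/survpi-camera/part_%05d.mp4"],
          stdin=camera.stdout,
      )

      ffmpeg.wait()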


  • ffmpeg: how to choose the best bit rate for HLS streaming?

    7 February 2023, by Josef Kranz

    I'm building a white-label video streaming product, but I've run into a scenario where I'm not sure what the optimal way to get the best video quality is. At the moment I'm using CRF-based encoding, which from a streaming point of view is a very bad idea, even though from a quality point of view it definitely gives the best quality and the best efficiency. Why is CRF-based encoding bad for streaming? First, the file size is not fixed, meaning the bit rate might vary between 20 kbps and 20 Mbit/s depending on the codec and on how much motion the current frames contain...
    When the bit rate varies that heavily, automatic quality selection in the player may not work correctly, and the player switches from 1080p to 480p for no apparent reason (I'm using VideoJS here).


    To fix this, I could set minrate and maxrate with ffmpeg, but that comes at the cost that some frames might look pixelated, which I absolutely want to avoid.


    Currently, my encoding command looks something like this:


     /usr/bin/ffmpeg -i "/tmp/VODProcessing/Tester 2160p UHD-HDR.mp4" -map 0:0 -c:v libsvtav1 -crf 19 \
       -vf zscale=width=3840:height=2160 \
       -svtav1-params "profile=0:enable-force-key-frames=1:superres-mode=1:enable-tf=0:tune=0:enable-overlays=1:scd=1:scm=2:enable-mfmv=1:enable-cdef=1:enable-dlf=1:fast-decode=1:color-primaries=9:transfer-characteristics=16:matrix-coefficients=9:input-depth=10:mastering-display=G(0.265,0.69)B(0.15,0.06)R(0.68,0.32)WP(0.3127,0.329)L(4000.0,0.005):content-light=368,226:enable-hdr=1:color-range=1" \
       -pix_fmt yuv420p10le -color_trc smpte2084 -color_primaries bt2020 -colorspace bt2020nc \
       -chroma_sample_location:v topleft -color_range:v pc -max_muxing_queue_size 1024 \
       -preset 7 -bf 0 -force_key_frames "expr:gte(t,n_forced*4.004)" -keyint_min 48 -sc_threshold 0 \
       -use_timeline 1 -use_template 1 -map_metadata -1 -map_chapters -1 \
       -f hls -seg_duration 4.004 -hls_time 4.004 -streaming 1 -hls_list_size 0 \
       -hls_segment_filename "/tmp/VODProcessing/output/Tester 2160p UHD-HDR/v-av01-2160p-av01.0.12M.10_PQ/f-%04d.m4s" \
       -hls_fmp4_init_filename "init-v-av01-2160p-av01.0.12M.10_PQ.m4s" \
       -hls_segment_type fmp4 -hls_playlist_type vod \
       -movflags frag_keyframe+frag_every_frame+write_colr+prefer_icc+skip_trailer+faststart \
       -hls_flags independent_segments -strict experimental \
       "/tmp/VODProcessing/output/Tester 2160p UHD-HDR/v-av01-2160p-av01.0.12M.10_PQ/master.m3u8"


    This produces a stream made of independent m4s segments. As you can see, I'm using libsvtav1 in CRF mode to output AV1.


    Now my question is: how can I get the same nice quality as in CRF mode while having a static/fixed bit rate? Will two-pass encoding solve this problem by distributing the data rate differently?


    Thanks in advance
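
    For what it's worth, the usual middle ground between pure CRF and a fixed bit rate is capped CRF: keep -crf for quality-driven allocation, but add -maxrate and -bufsize so the VBV model limits how far the instantaneous bit rate can spike, which is what the player's ABR logic reacts to. The sketch below shows the idea with libx264, where this flag combination is well documented; whether libsvtav1 honours the same capped-CRF combination depends on the SVT-AV1 and ffmpeg versions, so treat the AV1 mapping as something to verify. The numbers are illustrative, not a recommendation:

      # Hypothetical capped-CRF sketch for one 1080p HLS rung (libx264 shown;
      # check the libsvtav1 equivalent separately). Values are illustrative only.
      import subprocess

      subprocess.run(
          ["ffmpeg", "-i", "input.mp4",
           "-c:v", "libx264",
           "-crf", "20",        # same quality-driven allocation as plain CRF
           "-maxrate", "6M",    # ceiling on the instantaneous bit rate
           "-bufsize", "12M",   # VBV buffer the cap is enforced over
           "-c:a", "copy",
           "capped_crf_1080p.mp4"],
          check=True,
      )

    Classic two-pass ABR (-b:v with -pass 1 and -pass 2) does give a truly fixed average bit rate, but inside a segment the encoder still has to spend fewer bits on complex scenes, so capped CRF is usually the better fit when the goal is CRF-like quality with bounded spikes.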
