Media (91)

Other articles (76)

  • Keeping control of your media in your hands

    13 April 2011, by

    The vocabulary used on this site, and around MediaSPIP in general, aims to avoid references to Web 2.0 and the companies that profit from media-sharing.
    While using MediaSPIP, you are invited to avoid using words like "Brand", "Cloud" and "Market".
    MediaSPIP is designed to facilitate the sharing of creative media online, while allowing authors to retain complete control of their work.
    MediaSPIP aims to be accessible to as many people as possible and development is based on expanding the (...)

  • The farm's regular Cron tasks

    1 December 2010, by

    Managing the farm involves running several repetitive tasks, known as Cron tasks, at regular intervals.
    The super Cron (gestion_mutu_super_cron)
    This task, scheduled every minute, simply calls the Cron of every instance of the shared hosting on a regular basis. Coupled with a system Cron on the central site of the farm, this makes it possible to generate regular visits to the various sites and to keep the tasks of rarely visited sites from being too (...)
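
    For illustration only (this is not from the article), the system Cron on the central site could be as simple as a crontab entry that requests the site every minute; the URL below is a placeholder and the real address depends on the installation.

    # Illustrative sketch: user crontab entry on the central site of the farm.
    # The URL is a placeholder; the real address depends on the installation.
    * * * * * curl -s http://central-site.example/ > /dev/null 2>&1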

  • Writing a news item

    21 June 2013, by

    Present the changes in your MédiaSPIP, or news about your projects, on your MédiaSPIP using the news section.
    In the default MédiaSPIP theme, spipeo, news items are displayed at the bottom of the main page, below the editorials.
    You can customise the form used to create a news item.
    News item creation form For a document of the news type, the fields offered by default are: Publication date (customise the publication date) (...)

On other sites (4968)

  • Recursively scan and identify video files with FFMPEG

    2 March 2021, by Jason Paul Michaels

    UPDATE

    


    Thank you everyone for sharing your suggestions. I was able to make the script work properly by modifying it as follows.

    


#!/bin/bash

# Manually define the root path to the folder that contains the subfolders.
ROOTPATH="/Volumes/NVME-RAID/ASSET-Processing/CORRUPT-SCAN/SCAN"
for subdir in *; do
  cd "${ROOTPATH}${subdir}" || continue
  mkdir 00errors

  for path in *.{MOV,mov,MP4,mp4}; do

    # Decode the file and discard the output; a non-zero exit status
    # means ffmpeg hit an error.
    ffmpeg -i "${path}" -f null -
    RC=$?
    echo "${RC}"

    if [ "${RC}" -ne "0" ]; then
        # Move files that failed the check into the error folder.
        mv "${path}" ./00errors
    fi

  done
  cd ../
done


    


    The only issue I have now is that it does not appear to be traversing the subfolders. As I understand it, it should create a "00errors" folder within EACH subfolder and move the error files within that subfolder.

    


    Trying to sift through 14TB of recovered video...

    


    I'm trying to figure out how to properly convert this script so it will run under bash on macOS. I'm hitting a wall with the "ROOTPATH" value because macOS doesn't use /mnt.

    


#!/bin/bash

# Manually define the root path to the folder that contains the subfolders.
ROOTPATH="/mnt/f/00test/"
for subdir in *; do
  cd "${ROOTPATH}${subdir}" || continue
  mkdir 00errors

  for path in *.mp4; do

    # Only print errors; a non-zero exit status means the file failed to decode.
    ffmpeg -v error -i "${path}" -f null -
    RC=$?

    if [ "${RC}" -ne "0" ]; then
        # Move files that failed the check into the error folder.
        mv "${path}" ./00errors
    fi

  done
  cd ../
done


    


    Reference: Quickly check the integrity of video files inside a directory with ffmpeg
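
    The first script above only globs the current working directory, so it never descends into nested folders. One way to cover every subfolder would be to let find walk the tree and run the same ffmpeg check on each match; the following is a minimal sketch (not from the thread), assuming bash on macOS and reusing the ROOTPATH and 00errors conventions from the question.

#!/bin/bash
# Sketch: recurse through every folder below ROOTPATH instead of using a
# single-level glob, moving files that fail to decode into a 00errors
# folder next to them.
ROOTPATH="/Volumes/NVME-RAID/ASSET-Processing/CORRUPT-SCAN/SCAN"

find "${ROOTPATH}" -type f \( -iname '*.mov' -o -iname '*.mp4' \) \
     ! -path '*/00errors/*' -print0 |
while IFS= read -r -d '' path; do
    # Decode and discard the output; a non-zero exit status means ffmpeg
    # hit an error. </dev/null keeps ffmpeg from consuming the file list.
    if ! ffmpeg -v error -i "${path}" -f null - </dev/null; then
        dir="$(dirname "${path}")"
        mkdir -p "${dir}/00errors"
        mv "${path}" "${dir}/00errors/"
    fi
done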

    


  • Does stream seek order matter for ffmpeg av_seek_frame()?

    2 January 2019, by necrosato

    I am attempting to seek both audio and video streams for an mp4 using the ffmpeg av_seek_frame method.

    I have encountered an issue when seeking that I have remedied by changing my seek order, but would like to make sure my fix is actually a fix and not some coincidental hack that works.

    I am attempting to seek both the audio and video stream to the first packet. For video, the first packet has a pts of 0. For audio, the first packet has a pts of -1024. The video stream has an index of 0 and the audio stream has an index of 1. This has all been verified using ffprobe on the media file to view the packets and streams.

    The following code does not work; it seeks both the audio and video streams to packets with a pts of 0:

    for (int i = format_context->nb_streams - 1; i >= 0; --i) {
       AVStream* stream = format_context->streams[i];
       av_seek_frame(format_context, i, stream->first_dts, flags);
    }

    But this properly seeks the video stream to a pts of 0 and the audio stream to a pts of -1024:

    for (int i = 0; i < format_context->nb_streams; ++i) {
       AVStream* stream = format_context->streams[i];
       av_seek_frame(format_context, i, stream->first_dts, flags);
    }

    Note that in the first example, audio is seeked before video, and in the second example video is seeked before audio.

    Does the order of the av_seek_frame calls actually matter, or is there a bug somewhere else in my code that this just so happens to cover up?
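
    For comparison, here is a minimal sketch (not the asker's code) of the common alternative of issuing a single seek on the default stream rather than one av_seek_frame() call per stream; the file name and error handling are placeholders.

    #include <libavformat/avformat.h>
    #include <stdio.h>

    int main(void) {
        AVFormatContext *format_context = NULL;

        /* Open the container and read the stream headers. */
        if (avformat_open_input(&format_context, "input.mp4", NULL, NULL) < 0) {
            fprintf(stderr, "could not open input\n");
            return 1;
        }
        if (avformat_find_stream_info(format_context, NULL) < 0) {
            fprintf(stderr, "could not read stream info\n");
            avformat_close_input(&format_context);
            return 1;
        }

        /* Stream index -1 means "default stream" with the timestamp expressed
         * in AV_TIME_BASE units; AVSEEK_FLAG_BACKWARD asks for the nearest
         * seek point at or before timestamp 0, which typically positions
         * every stream at its start. */
        if (av_seek_frame(format_context, -1, 0, AVSEEK_FLAG_BACKWARD) < 0)
            fprintf(stderr, "seek failed\n");

        avformat_close_input(&format_context);
        return 0;
    }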

  • Parsing different ffmpeg -i output

    11 August 2012, by Jizbo Jonez

    I have a bit of code that gets a video file's duration, width, height and frame rate, which works fine for some videos -

    $output = `ffmpeg -i /var/thismovie.avi`;
    preg_match('/Duration: (.*?),.*?Video:.*?0x.*?([0-9]+)x([0-9]+).*?([0-9]+) fps/i',
        $output, $result);

    The problem is that other videos give slightly different output; for example, the above code works with this output -

    Input #0, avi, from '/var/www/vhosts/thissite.com/httpdocs/video1.avi':
    Duration: 00:00:10.76, start: 0.000000, bitrate: 5180 kb/s
    Stream #0:0: Video: h264 (High) (H264 / 0x34363248), yuv420p, 1280x720 [SAR 1:1 DAR 16:9], 25 fps, 25 tbr, 25 tbn, 50 tbc
    Stream #0:1: Audio: ac3 ([0] [0][0] / 0x2000), 48000 Hz, stereo, s16, 128 kb/s

    but another video gives this info and will not give any results when used with the above code -

    Input #0, avi, from '/var/www/vhosts/thissite.com/httpdocs/video2.avi':
    Duration: 00:00:05.68, start: 0.000000, bitrate: 887 kb/s
    Stream #0:0: Video: mpeg4 (Advanced Simple Profile) (XVID / 0x44495658), yuv420p, 640x272 [SAR 1:1 DAR 40:17], 25 tbr, 25 tbn, 25 tbc
    Stream #0:1: Audio: mp3 (U[0][0][0] / 0x0055), 48000 Hz, stereo, s16, 128 kb/s

    The difference between the two outputs is in the Stream #0:0 part. The first output shows seven pieces of information separated by commas, while the second shows only six. The missing piece in the second output is the frame rate (fps), but apparently I can use the tbr value instead.

    So my question is: how can I modify the code I am using to cover both types of output?
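
    One hedged way to handle both outputs (a sketch, not from the thread) is to let the rate capture accept either the fps or the tbr field. The s modifier lets the pattern span the Duration and Stream lines, and the 2>&1 redirect is added here because ffmpeg prints this information on stderr.

    <?php
    // Sketch: capture duration, dimensions and rate from `ffmpeg -i` output,
    // taking the rate from "NN fps" when present and falling back to "NN tbr".
    $output = shell_exec('ffmpeg -i /var/thismovie.avi 2>&1');

    $pattern = '/Duration: (.*?),.*?Video:.*?0x.*?([0-9]+)x([0-9]+)'
             . '.*?([0-9]+(?:\.[0-9]+)?) (?:fps|tbr)/is';

    if (preg_match($pattern, $output, $result)) {
        $duration = $result[1];
        $width    = $result[2];
        $height   = $result[3];
        $rate     = $result[4];
        echo "$duration {$width}x{$height} @ $rate\n";
    }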