Advanced search

Media (0)

Word: - Tags -/xmlrpc

No media matching your criteria is available on the site.

Other articles (33)

  • Media quality after processing

    21 June 2013, by

    Properly configuring the software that processes the media matters for striking a balance between the parties involved (the host's bandwidth, the media quality for the editor and the visitor, accessibility for the visitor). How should you set the quality of your media?
    The higher the media quality, the more bandwidth is used, and a visitor on a low-speed internet connection will have to wait longer. Conversely, the poorer the media quality, the more degraded the media becomes, or even (...)

  • Writing a news item

    21 June 2013, by

    Present the changes on your MédiaSPIP, or news about your projects, through the news section of your MédiaSPIP.
    In MédiaSPIP's default theme, spipeo, news items are displayed at the bottom of the main page, below the editorials.
    You can customize the form used to create a news item.
    News item creation form: for a document of the news type, the fields offered by default are: Publication date (customize the publication date) (...)

  • Using and configuring the script

    19 January 2011, by

    Information specific to the Debian distribution
    If you use this distribution, you will need to enable the "debian-multimedia" repositories, as explained here:
    Since version 0.3.1 of the script, the repository can be enabled automatically in response to a prompt.
    Retrieving the script
    The installation script can be retrieved in two different ways.
    Via svn, using this command to fetch the up-to-date source code:
    svn co (...)

On other sites (5440)

  • Mac terminal command to list files and sort by date to use in ffmpeg

    22 September 2020, by Jeff

    I am using a GoPro to film a bunch of videos. I then want to take those videos directly from the SD card folder and concatenate them into a single video (bypassing an editor) using FFmpeg.

    I'm currently able to stitch together "chaptered" videos with the following example command on my Mac (10.13):

    ffmpeg -f concat -safe 0 -i <(for f in /sdcardfolder/100GOPRO/GH*488.MP4; do echo "file '$f'"; done) -c copy /folder/video.mp4

    The reason for this is that the ffmpeg command requires a text file that looks like this:

    file '/folder/GH016992.MP4'
    file '/folder/GH036990.MP4'
    ...

    The real command is this; it generates the list of files in the right format, with file in front of each one, and can be embedded into the ffmpeg command:

    for f in /Volumes/GoPro8/DCIM/100GOPRO/GH0*71*.MP4; do echo "file '$f'"; done

    I want to add two changes to this:

    1. List the files in date order (ascending): I want the list of files to be in date order, but I can't figure out how to add a -sort or something to the for f in command.

    2. Allow a more robust set of file matching/filtering: right now I can use basic patterns like GH*488.MP4 or, with chapters (which increment the first number), something like GH0[123]488.MP4 to get just the first few. But when I make it more flexible, like GH0[0-9]71[0-9][0-9].MP4 - which would be necessary to match all the files recorded yesterday but nothing before then - the command doesn't like that pattern. It seems to only accept a *.

    I looked at a few examples like https://opensource.com/article/19/6/how-write-loop-bash but there wasn't much more than just listing files.

    This boils down to a terminal command and isn't really related to FFMPEG but I hope it's helpful context.

    I imagined it would be something like this, but this definitely doesn't work:

    for f in (find /Volumes/GoPro8/DCIM/100GOPRO/GH0[0-9]71[0-9][0-9].MP4 -type f | sort); do echo "file '$f'"; done
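
    (A note on the above: find does not take a glob in its path argument; the pattern belongs in a -name test. A minimal sketch that should get close on macOS, assuming the GoPro filenames contain no spaces, since the command substitution word-splits - ls -tr sorts its file operands by modification time, oldest first, and the bracket classes are ordinary shell globs, not regexes:)

    for f in $(ls -tr /Volumes/GoPro8/DCIM/100GOPRO/GH0[0-9]71[0-9][0-9].MP4); do echo "file '$f'"; done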

    I'd appreciate any help! Thanks!

    Update

    It looks like sorting isn't easy with the stock Mac tools, so I gave up and wrote a much simpler Ruby script that could execute everything for me. This is not really an answer to my question above, but it is a solution.

    Here I can easily write the text file necessary for ffmpeg, and I can also filter the files with a regex on the name, filter for a particular date, and filter by size. Then, via the script, I simply execute the ffmpeg command with the args to concat the files. I can also have it immediately resample the output to compress it (GoPro videos are giant, and I'm OK with a much lower bitrate if I want to keep raw footage).

    I got lucky with Dir.entries in Ruby - it seems to sort by date automatically? I don't know how to sort it otherwise.

    PATH = '/Volumes/GoPro8/DCIM/100GOPRO/'
    NEW_FILENAME = '/folder/new-file.mp4'
    video_list = '/folder/ffmpeg-list.txt'

    # recreate the text file of inputs for ffmpeg's concat demuxer
    File.delete(video_list) if File.exist?(video_list)
    i = 1
    Dir.entries(PATH).each do |f|
        d = File.mtime(PATH + f)
        size = File.size(PATH + f)
        # keep GoPro clips from the given day that are under ~1 GB
        if f.match(/GH0.*\.MP4/) && d.to_s.match(/2020-07-30/) && size.to_i < 1000000000
            puts "#{i}\t#{f}\t#{d}\t#{size}"
            # quote the path so the concat demuxer is safe with special characters
            File.write(video_list, "file '#{PATH + f}'\n", mode: "a")
            i += 1
        end
    end

    command = "ffmpeg -f concat -safe 0 -i #{video_list} -c copy #{NEW_FILENAME}"

    puts "executing concatenate..."
    puts command
    system(command)
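
    (Note: Dir.entries makes no ordering guarantee - it returns entries in whatever order the filesystem reports them - so an explicit sort by modification time is safer. A minimal sketch:)

    # sort the matching clips by modification time, oldest first
    sorted = Dir.entries(PATH)
                .select { |f| f.match(/GH0.*\.MP4/) }
                .sort_by { |f| File.mtime(PATH + f) }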

  • FFmpeg audio video merge issue in Android

    20 October 2017, by djac

    The code below merges an audio file and a video file on Android. Both input files are in the app's raw folder; in onCreate they are written to the SD card and then merged.

    The issue is that the code executes, but the video input file is written into the audio input folder, and the merged output file result.mp4 is faulty.
    Could you please help find the issue in the code/command?

    public class Mrge  extends AppCompatActivity {


       Uri vuri=null;
       public String vabsolutePath=null,  aabsolutePath=null, dabsolutePath=null;


       @Override
       protected void onCreate(Bundle savedInstanceState) {
           super.onCreate(savedInstanceState);
           setContentView(R.layout.message_layout);

           OutputStream out;

           try {
               ByteArrayOutputStream stream = new ByteArrayOutputStream();
               InputStream ins = getResources().openRawResource(
                       getResources().getIdentifier("angry",
                               "raw", getPackageName()));


               byte[] buf = new byte[1024];
               int n;
               while (-1 != (n = ins.read(buf)))
                   stream.write(buf, 0, n);

               byte[] bytes = stream.toByteArray();

               String root = Environment.getExternalStorageDirectory().getAbsolutePath() + "/";
               File createDir = new File(root + "master" + File.separator);

               createDir.mkdir();


               File file = new File(root + "master" + File.separator + "master.mp4");


               file.createNewFile();
               out = new FileOutputStream(file);
               out.write(bytes);
               out.close();



               vabsolutePath = file.getAbsolutePath();

               //-------------------------------------------------------------------
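               // NOTE: 'stream' is reused below without a stream.reset(), so it still
               // holds the video bytes; the audio file written further down therefore
               // begins with the video data, which matches the faulty output described.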

               ins = getResources().openRawResource(
                       getResources().getIdentifier("audio",
                               "raw", getPackageName()));

               while (-1 != (n = ins.read(buf)))
                   stream.write(buf, 0, n);

               bytes = stream.toByteArray();


               root = Environment.getExternalStorageDirectory().getAbsolutePath() + "/";
               createDir = new File(root + "audio" + File.separator);
               createDir.mkdir();


               file = new File(root + "audio" + File.separator + "audio.aac");

               file.createNewFile();
               out = new FileOutputStream(file);
               out.write(bytes);
               out.close();

               aabsolutePath = file.getAbsolutePath();

               root = Environment.getExternalStorageDirectory().getAbsolutePath() + "/";
               createDir = new File(root + "result" + File.separator);
               createDir.mkdir();


               file = new File(root + "result" + File.separator + "result.mp4");

               file.createNewFile();

               dabsolutePath = file.getAbsolutePath();


               //------------------------------------------------------------------------

           } catch (IOException e) {
               e.printStackTrace();
           }
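           // -c:v copy keeps the video track as-is, -c:a aac re-encodes the audio,
           // and -shortest stops the output at the end of the shorter input.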
           String ccommand[] = {"-y", "-i",vabsolutePath,"-i",aabsolutePath, "-c:v", "copy", "-c:a", "aac","-shortest", dabsolutePath};

           loadFFMpegBinary();
           execFFmpegBinary(ccommand);

       }

       FFmpeg ffmpeg;

       private void loadFFMpegBinary() {
           try {
               if (ffmpeg == null) {
                   ffmpeg = FFmpeg.getInstance(this);
               }
               ffmpeg.loadBinary(new LoadBinaryResponseHandler() {
                   @Override
                   public void onFailure() {
                       //showUnsupportedExceptionDialog();
                   }

                   @Override
                   public void onSuccess() {
                   }
               });
           } catch (FFmpegNotSupportedException e) {
               //showUnsupportedExceptionDialog();
           } catch (Exception e) {
           }
       }

       private void execFFmpegBinary(final String[] command) {
           try {
               ffmpeg.execute(command, new ExecuteBinaryResponseHandler() {
                   @Override
                   public void onFailure(String s) {
                   }

                   @Override
                   public void onSuccess(String s) {
                   }

                   @Override
                   public void onProgress(String s) {
                   }

                   @Override
                   public void onStart() {
                   }

                   @Override
                   public void onFinish() {
                   }
               });
           } catch (FFmpegCommandAlreadyRunningException e) {
               String m = "hi";
           }
       }
    }

  • Real Time Audio and Video Streaming in C#

    16 November 2014, by Nuwan

    I am developing an application that can be used to stream audio and video in real time.
    I can stream in two different ways: I use a capture card to capture a live HD stream and
    re-send it, and I also need to stream a local video file in real time.

    At the moment I capture video using OpenCV and store the frames as bitmaps in a BlockingCollection queue.
    After that I encode the video frames using ffmpeg (via the C# library NReco) and store them in a queue. Then I send the encoded data over UDP (without RTP/RTSP) to omxplayer on a Raspberry Pi, and it works very well.

    Then I captured the audio data using ffmpeg,
    with this command to capture and encode it:

                    // convert the captured stream 'ms' into AAC audio only (-vn drops video)
                    ffMpegTask = ffmpegConverter.ConvertLiveMedia(
                           fileName,
                           null,
                           ms,
                           Format.avi,
                           new ConvertSettings()
                           {
                               CustomOutputArgs = " -tune zerolatency -ss " + second + " -t " + endTime + " -strict experimental -acodec aac -ab 160k -ac 2 -ar 44100 -vn ",
                           });
                    ffMpegTask.Start();
                    ffMpegTask.Stop();
                    byte[] data = ms.ToArray();  // the encoded audio bytes

    After that I saved every audio data packet to a queue.

    And I tried streaming these separate audio and video streams to omxplayer using two different
    ports, receiving them with two omxplayer instances, and it works fine.

    But what I need to do is multiplex this audio and video and send them as one stream.
    What I do first is stream the two streams as UDP://224.1.1.1:1250 (video) and UDP://224.1.1.1:1260 (audio);
    then I use the NReco Invoke method, which can be used to execute ffmpeg commands:

    " -re -i udp://224.1.1.1:1250 -i udp://224.1.1.1:1260 -c copy -f avi udp://224.1.1.1:1270"

    and this works for both the audio and the video stream, but they are completely out of sync.
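
    A minimal sketch of that invocation, assuming NReco.VideoConverter's FFMpegConverter.Invoke accepts a raw ffmpeg argument string:

                   // remux the two multicast streams into one output (sketch, not the exact code used)
                   var muxer = new NReco.VideoConverter.FFMpegConverter();
                   muxer.Invoke("-re -i udp://224.1.1.1:1250 -i udp://224.1.1.1:1260 -c copy -f avi udp://224.1.1.1:1270");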

    The next thing I did was create another ffmpeg ConvertLiveMedia task and write the audio and video data
    to that task using the Write method. I stream that muxed data, receive it with ffplay, and it plays
    with the sync problem solved. But sometimes audio and video frames are dropped, and then it begins to
    play out of sync again.

                   combine = new MemoryStream();
                   ffMpegTaskcom = ffmpegConvertercom.ConvertLiveMedia(
                           Format.mpeg,
                           combine,
                           Format.avi,
                           new ConvertSettings()
                           {
                               CustomInputArgs = " ", // windows bitmap pixel format
                               CustomOutputArgs = " -threads 7 -c:v libx264 -preset ultrafast -tune zerolatency -strict experimental -profile:v baseline -movflags +faststart -tune film -level 3.0 -tune zerolatency -tune film -pix_fmt yuv420p -g 250 -crf 22 -b:v 4000k -minrate 3000k -maxrate 5000k -acodec aac -ab 160k -ac 2 -ar 44100",
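                               // NOTE: -tune appears several times above (zerolatency, film);
                               // ffmpeg only applies the last occurrence, so the repeats conflict.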

                           });
                   ffMpegTaskcom.Start();
                   byte[] streamBytesvi = null;
                   byte[] streamBytesau = null;
                   encodeQueqe.TryDequeue(out streamBytesvi);
                   encodeQueqeau.TryDequeue(out streamBytesau);
                   ffMpegTaskcom.Write(streamBytesvi, 0, streamBytesvi.Length);
                   ffMpegTaskcom.Write(streamBytesau, 0, streamBytesau.Length);

                   //ffMpegTaskcom.Wait();
                   ffMpegTaskcom.Stop();

    Now I need to know a good method to deliver the audio and video data in sync.
    Please tell me what I have done wrong, or suggest a better way to do this.

    Thank you!