
Other articles (34)

  • Publishing on MediaSPIP

    13 June 2013

    Can I post content from an iPad tablet?
    Yes, if your MediaSPIP installation is at version 0.2 or higher. If needed, contact the administrator of your MediaSPIP to find out.

  • Using and configuring the script

    19 January 2011

    Information specific to the Debian distribution
    If you use this distribution, you will need to enable the "debian-multimedia" repositories as explained here:
    Since version 0.3.1 of the script, the repository can be enabled automatically in response to a prompt.
    Retrieving the script
    The installation script can be retrieved in two different ways.
    Via svn, using the command to fetch the up-to-date source code:
    svn co (...)

  • Standalone installation

    4 February 2011

    Installing the MediaSPIP distribution involves several steps: retrieving the necessary files (two methods are possible at this point: installing the ZIP archive containing the whole distribution, or fetching the sources of each module separately via SVN); preconfiguration; and the final installation.
    [mediaspip_zip]Installing the MediaSPIP ZIP archive
    This installation mode is the simplest way to install the whole distribution (...)

On other sites (2663)

  • Recording real-time video from images with FFmpeg

    17 July 2015, by Solarnum

    I am really not sure what else I could be doing to achieve this. I’m trying to record the actions in one of the views in my Android app so that it can be played back later and show the previous actions in real time. The major problem (among others, because there is no way I’m doing this optimally) is that the video takes at least 4 times longer to make than it does to playback. If I ask FFmpeg to create a 5 second video the process will run in the background for 20 seconds and output a greatly accelerated 5 second video.

    My current strategy is to use the -loop 1 parameter on a single image file and continuously write a jpeg to that image file. (If someone has a better idea than this for feeding continuously updated image information to FFmpeg let me know)
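    One alternative for feeding continuously updated frames (a sketch, not tested on Android, and only usable if ffmpeg can be launched with access to its stdin): pipe JPEG frames into ffmpeg's image2pipe demuxer instead of rewriting one file on disk, which also avoids the per-frame SD-card write. The builder below is illustrative Python; `make_pipe_encoder` and the 20 fps rate are assumptions, not names from the question.

```python
import subprocess

def make_pipe_encoder(out_path, fps=20):
    # Build an ffmpeg invocation that reads concatenated JPEG frames
    # from stdin via the image2pipe demuxer; fps fixes the timebase.
    return [
        "ffmpeg", "-y",
        "-f", "image2pipe",      # demuxer for images streamed over a pipe
        "-framerate", str(fps),  # declared input rate -> real-time playback
        "-c:v", "mjpeg",         # the piped frames are JPEGs
        "-i", "-",               # read frames from stdin
        "-c:v", "libx264", "-preset", "ultrafast", "-tune", "zerolatency",
        "-pix_fmt", "yuv420p",
        out_path,
    ]

# Usage sketch: write one JPEG per captured frame to the pipe.
# proc = subprocess.Popen(make_pipe_encoder("testout.mp4"),
#                         stdin=subprocess.PIPE)
# proc.stdin.write(jpeg_bytes)
```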

    encodingThread = new Thread(new Runnable() {
        private boolean running = true;

        @Override
        public void run() {
            while (running) {
                try {
                    Bitmap bitmap = makeBitmapFromView();

                    // Overwrite the single JPEG that ffmpeg loops over
                    String filepath = Environment.getExternalStorageDirectory().getAbsolutePath() + "/test.jpg";
                    FileOutputStream fout = new FileOutputStream(new File(filepath));
                    bitmap.compress(Bitmap.CompressFormat.JPEG, 100, fout);
                    fout.flush();
                    fout.close();
                    Thread.sleep(50); // roughly 20 frames per second
                } catch (IOException e) {
                    e.printStackTrace();
                } catch (InterruptedException e) {
                    running = false;
                }
            }
        }
    });
    startVideoMaking();
    encodingThread.start();

    The startVideoMaking method is as follows:

    private void startVideoMaking() {
        ffmpeg.killRunningProcesses();
        String path = Environment.getExternalStorageDirectory().getAbsolutePath() + "/test.jpg";
        String output = Environment.getExternalStorageDirectory().getAbsolutePath() + "/testout.mp4";

        String command = "-loop 1 -t 5 -re -i " + path + " -c:v libx264 -loglevel verbose -vsync 0 -threads 0 -preset ultrafast -tune zerolatency -y -pix_fmt yuv420p " + output;
        executeFFM(command);
    }

    Just to make it clear, the FFmpeg command that I am executing is

    ffmpeg -loop 1 -re -i /storage/emulated/0/test.jpg -t 5 -c:v libx264 -loglevel verbose -vsync 0 -threads 0 -preset ultrafast -tune zerolatency -y -pix_fmt yuv420p /storage/emulated/0/testout.mp4

    The makeBitmapFromView() method takes about 50ms to process and writing the bitmap to the sd card takes around 200ms, which is not great.

    I’m pretty lost as to what other solutions there would be for creating a video of a single view in Android. I know there is the MediaCodec class, but I couldn’t get it to work, and it would also raise my minimum SDK version, which is not ideal. I’m also not sure that the MediaCodec class would even solve my problem.

    Is there some way that I can get FFmpeg to create a 5 second video that is equivalent to 5 seconds of real time? I have also tried converting a single image, without updating its content continuously, and had the same results.
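    For the real-time-duration problem, one plausible cause (a hedged guess, not a verified diagnosis) is that a looped still image has no native frame rate for -re to pace against, so the pipeline just runs at whatever speed the encoder manages. Pinning the input rate with -framerate, a standard ffmpeg input option, is one thing to try; the Python command builder below is only a sketch of that idea, reusing the paths from the question:

```python
def build_still_cmd(image_path, out_path, fps=20, seconds=5):
    # Encode a looped still at an explicit input rate instead of
    # relying on -re, which has no native rate to pace against here.
    return [
        "ffmpeg", "-y",
        "-loop", "1",
        "-framerate", str(fps),  # explicit input rate
        "-t", str(seconds),
        "-i", image_path,
        "-c:v", "libx264", "-preset", "ultrafast", "-tune", "zerolatency",
        "-r", str(fps),          # matching output rate
        "-pix_fmt", "yuv420p",
        out_path,
    ]

# build_still_cmd("/storage/emulated/0/test.jpg",
#                 "/storage/emulated/0/testout.mp4")
```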

    If my question isn’t clear enough let me know.

  • FFmpeg command to capture video from a DeckLink 4K Extreme in Ubuntu Linux

    24 September 2016, by George.Ef

    I am trying to capture a video using ffmpeg, from the HDMI input port of the Blackmagic DeckLink 4K Extreme capture card in Ubuntu Linux.

    As per the ffmpeg documentation I have tried the following:

    ffmpeg -f decklink -video_input 'hdmi' -i 'DeckLink 4K Extreme (1)@14' -acodec copy -vcodec copy ~/testCapture/card1_f14_hdmi.avi

    but no matter what I do I always get the same static placeholder picture as the video (the embedded example clip is unavailable).

    My ffmpeg version is:

    ffmpeg version git-2016-08-15-4899953 Copyright (c) 2000-2016 the FFmpeg developers
    built with gcc 4.8 (Ubuntu 4.8.4-2ubuntu1~14.04.3)
    configuration: --prefix=/root/ffmpeg_build --pkg-config-flags=--static
    --extra-cflags=-I/root/ffmpeg_build/include
    --extra-ldflags=-L/root/ffmpeg_build/lib
    --bindir=/root/bin --enable-gpl --enable-libass --enable-libfdk-aac
    --enable-libfreetype --enable-libmp3lame --enable-libopus --enable-libtheora
    --enable-libvorbis --enable-libvpx --enable-libx264 --enable-libx265
    --enable-decklink
    --extra-cflags=-I/root/decklinkSDK/Blackmagic_DeckLink_SDK/Linux/include
    --extra-ldflags=-L/root/decklinkSDK/Blackmagic_DeckLink_SDK/Linux/include
    --enable-nonfree
    libavutil      55. 28.100 / 55. 28.100
    libavcodec     57. 51.102 / 57. 51.102
    libavformat    57. 46.101 / 57. 46.101
    libavdevice    57.  0.102 / 57.  0.102
    libavfilter     6. 51.100 /  6. 51.100
    libswscale      4.  1.100 /  4.  1.100
    libswresample   2.  1.100 /  2.  1.100
    libpostproc    54.  0.100 / 54.  0.100

    I have two of these cards as per the following:

    ffmpeg -f decklink -list_devices 1 -i dummy
    .....
    [decklink @ 0x2e9e440] Blackmagic DeckLink devices:
    [decklink @ 0x2e9e440]    'DeckLink 4K Extreme (1)'
    [decklink @ 0x2e9e440]    'DeckLink 4K Extreme (2)'

    I am able to get a list of the supported formats with the following:

    ffmpeg -f decklink -list_formats 1 -i 'DeckLink 4K Extreme (1)'
    ...
    [decklink @ 0x36e2440] Supported formats for 'DeckLink 4K Extreme (1)':
    [decklink @ 0x36e2440]    1   720x486 at 30000/1001 fps (interlaced, lower field first)
    [decklink @ 0x36e2440]    2   720x576 at 25000/1000 fps (interlaced, upper field first)
    [decklink @ 0x36e2440]    3   1920x1080 at 24000/1001 fps
    [decklink @ 0x36e2440]    4   1920x1080 at 24000/1000 fps
    [decklink @ 0x36e2440]    5   1920x1080 at 25000/1000 fps
    [decklink @ 0x36e2440]    6   1920x1080 at 30000/1001 fps
    [decklink @ 0x36e2440]    7   1920x1080 at 30000/1000 fps
    [decklink @ 0x36e2440]    8   1920x1080 at 25000/1000 fps (interlaced, upper field first)
    [decklink @ 0x36e2440]    9   1920x1080 at 30000/1001 fps (interlaced, upper field first)
    [decklink @ 0x36e2440]    10  1920x1080 at 30000/1000 fps (interlaced, upper field first)
    [decklink @ 0x36e2440]    11  1920x1080 at 50000/1000 fps
    [decklink @ 0x36e2440]    12  1920x1080 at 60000/1001 fps
    [decklink @ 0x36e2440]    13  1920x1080 at 60000/1000 fps
    [decklink @ 0x36e2440]    14  1280x720 at 50000/1000 fps
    [decklink @ 0x36e2440]    15  1280x720 at 60000/1001 fps
    [decklink @ 0x36e2440]    16  1280x720 at 60000/1000 fps
    ...
    DeckLink 4K Extreme (1): Immediate exit requested

    What should I use with ffmpeg in order to capture an HD video with sound from the HDMI port?
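    Two things worth checking (hedged suggestions, not verified on this card): the @14 mode suffix must match the signal the HDMI source actually sends, or the capture shows nothing; and -vcodec copy stream-copies the raw uncompressed feed into AVI, which is fragile, so encoding on the fly is safer. -video_input and -audio_input are documented options of ffmpeg's decklink input device; the command builder below is an illustrative Python sketch, and the output filename is an assumption:

```python
def build_decklink_cmd(device="DeckLink 4K Extreme (1)", mode=14,
                       out_path="card1_f14_hdmi.mp4"):
    # Capture from the card's HDMI input, encoding instead of raw
    # stream copy; mode 14 is 1280x720 at 50 fps per -list_formats.
    return [
        "ffmpeg",
        "-f", "decklink",
        "-video_input", "hdmi",      # select the HDMI connector
        "-audio_input", "embedded",  # audio embedded in the HDMI signal
        "-i", f"{device}@{mode}",    # @N picks a mode from -list_formats
        "-c:v", "libx264", "-preset", "fast", "-pix_fmt", "yuv420p",
        "-c:a", "aac", "-b:a", "160k",
        out_path,
    ]
```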

  • Real Time Audio and Video Streaming in C#

    16 November 2014, by Nuwan

    I am developing an application that streams audio and video in real time. I need to stream in two different ways: re-sending a live HD stream captured with a capture card, and streaming a local video file in real time.

    Currently I capture video using OpenCV and store the frames as bitmaps in a BlockingCollection queue. I then encode the video frames using ffmpeg (via the NReco C# wrapper) and store them in another queue, and send the encoded data over UDP (not RTP/RTSP) to omxplayer on a Raspberry Pi. This works very well.

    Then I capture audio data using ffmpeg, with this command to capture and encode it:

                    ffMpegTask = ffmpegConverter.ConvertLiveMedia(
                           fileName,
                           null,
                           ms,
                           Format.avi,
                           new ConvertSettings()
                           {
                               CustomOutputArgs = " -tune zerolatency -ss " + second + " -t " + endTime +
                                                  " -strict experimental -acodec aac -ab 160k -ac 2 -ar 44100 -vn ",
                           });
                    ffMpegTask.Start();
                    ffMpegTask.Stop();
                    byte[] data = ms.ToArray(); // encoded audio for this segment

    After that I save every audio data packet to a queue.

    I then stream the separate audio and video data to omxplayer on two different ports and receive them with two omxplayer instances. This works fine.

    But what I need to do is multiplex the audio and video and send them as one stream.
    First I stream the two streams to udp://224.1.1.1:1250 (video) and udp://224.1.1.1:1260 (audio),
    then I use the NReco Invoke method, which can execute ffmpeg commands:

    " -re -i udp://224.1.1.1:1250 -i udp://224.1.1.1:1260 -c copy -f avi udp://224.1.1.1:1270"

    This works for both the audio and the video stream, but they are completely out of sync.

    Next I create another ffmpeg ConvertLiveMedia task and write the audio and video data to it using the Write method. I stream that muxed data and receive it with ffplay; it plays, and the sync problem is solved. But sometimes audio and video frames are dropped, and then playback drifts out of sync again.

                   combine = new MemoryStream();
                   ffMpegTaskcom = ffmpegConvertercom.ConvertLiveMedia(
                           Format.mpeg,
                           combine,
                           Format.avi,
                           new ConvertSettings()
                           {
                               CustomInputArgs = " ",
                               // Only the last -tune takes effect, so pass it once
                               CustomOutputArgs = " -threads 7 -c:v libx264 -preset ultrafast -tune zerolatency" +
                                                  " -strict experimental -profile:v baseline -movflags +faststart" +
                                                  " -level 3.0 -pix_fmt yuv420p -g 250 -crf 22 -b:v 4000k" +
                                                  " -minrate 3000k -maxrate 5000k -acodec aac -ab 160k -ac 2 -ar 44100",
                           });
                   ffMpegTaskcom.Start();

                   byte[] streamBytesvi;
                   byte[] streamBytesau;
                   // Guard the writes: TryDequeue can fail and leave the buffer null
                   if (encodeQueqe.TryDequeue(out streamBytesvi))
                       ffMpegTaskcom.Write(streamBytesvi, 0, streamBytesvi.Length);
                   if (encodeQueqeau.TryDequeue(out streamBytesau))
                       ffMpegTaskcom.Write(streamBytesau, 0, streamBytesau.Length);

                   //ffMpegTaskcom.Wait();
                   ffMpegTaskcom.Stop();
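    A common cause of this kind of drift is writing one video packet and one audio packet per loop pass regardless of their timestamps, so a dropped frame shifts everything after it. One way to keep the muxer seeing packets in presentation order is to interleave the two queues by timestamp before writing. The sketch below is illustrative Python, not the NReco API; the (pts, payload) packet shape is an assumption:

```python
import heapq
from itertools import count

def interleave(video_packets, audio_packets):
    # Merge two already pts-ordered (pts, payload) streams into one
    # presentation-ordered stream; the counter breaks pts ties without
    # ever comparing payload bytes.
    tiebreak = count()
    def tag(stream, kind):
        return ((pts, next(tiebreak), kind, payload)
                for pts, payload in stream)
    for pts, _, kind, payload in heapq.merge(tag(video_packets, "V"),
                                             tag(audio_packets, "A")):
        yield kind, pts, payload
```

    Writing whichever packet has the earliest timestamp (rather than strictly alternating) is what lets the muxer recover cleanly when one stream stalls or drops frames.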

    Now I need a good method to deliver the audio and video data in sync.
    Please tell me what I have done wrong, or suggest a better way to do this.

    Thank you!