Media (1)

Keyword: - Tags - /ogv

Other articles (78)

  • Request for the creation of a channel

    12 March 2010, by

    Depending on how the platform is configured, the user may have two different ways of requesting the creation of a channel. The first is at the time of registration; the second, after registration, by filling in a request form.
    Both approaches ask for the same information and work in much the same way: the prospective user fills in a series of form fields that first of all give the administrators information about (...)

  • Improvements to the base version

    13 September 2013

    A nicer multiple-selection field
    The Chosen plugin improves the ergonomics of multiple-selection fields. See the two images below for a comparison.
    To use it, simply activate the Chosen plugin (Site general configuration > Plugin management), then configure it (Templates > Chosen) by enabling Chosen on the public site and specifying which form elements to enhance, for example select[multiple] for multiple-selection lists (...)

  • Emballe médias: what is it for?

    4 February 2011, by

    This plugin is designed to manage sites for publishing documents of all types.
    It creates "media" items: a "media" is an article in the SPIP sense, created automatically when a document is uploaded, whether audio, video, image or text; only a single document can be linked to a given "media" article;

On other sites (9523)

  • "The system cannot find the file specified" when animating with ffmpeg in matplotlib

    27 November 2019, by algol

    I am trying to generate a movie from a stack of numpy arrays on my home computer (Windows 10), using a function that I have previously used on a different computer (a Mac). Here is the function that I am using:

    import numpy as np
    import matplotlib.pyplot as plt
    import matplotlib.animation as animation

    def make_animation(frames,name):

       plt.rcParams['animation.ffmpeg_path'] = u'C:\ffmpeg-20190320-0739d5c-win64-static\bin\ffmpeg.exe'
       n_images=frames.shape[2]
       assert (n_images>1)  
       figsize=(10,10)
       fig, ax = plt.subplots(figsize=figsize)
       fig.tight_layout()
       fig.subplots_adjust(left=0, bottom=0, right=1, top=1, wspace=None, hspace=None)
       #lineR, = ax.plot(xaxis_data[0],R_data[0],'c-',label="resources")
       img = ax.imshow(frames[:,:,0], animated = True)  


       def updatefig(img_num):

           #lineR.set_data(xaxis_data[img_num],R_data[img_num],'r-')

           img.set_data(frames[:,:,img_num])

           return [img]


       ani = animation.FuncAnimation(fig, updatefig, np.arange(1, n_images), interval=50, blit=True)
       mywriter = animation.FFMpegWriter(fps = 20)
       #ani.save('mymovie.mp4',writer=mywriter)

       ani.save(f"D:\{name}.mp4",writer=mywriter)

       plt.close(fig)

    Here is the error that I am getting :

    Traceback (most recent call last):

     File "", line 1, in <module>
       make_animation(stack,'full_test')

     File "", line 27, in make_animation
       ani.save(f"D:\{name}.mp4",writer=mywriter)

     File "C:\Users\~snip~\Anaconda3\lib\site-packages\matplotlib\animation.py", line 1136, in save
       with writer.saving(self._fig, filename, dpi):

     File "C:\Users\~snip~\Anaconda3\lib\contextlib.py", line 112, in __enter__
       return next(self.gen)

     File "C:\Users\~snip~\Anaconda3\lib\site-packages\matplotlib\animation.py", line 228, in saving
       self.setup(fig, outfile, dpi, *args, **kwargs)

     File "C:\Users\~snip~\Anaconda3\lib\site-packages\matplotlib\animation.py", line 352, in setup
       self._run()

     File "C:\Users\~snip~\Anaconda3\lib\site-packages\matplotlib\animation.py", line 363, in _run
       creationflags=subprocess_creation_flags)

     File "C:\Users\~snip~\Anaconda3\lib\site-packages\spyder_kernels\customize\spydercustomize.py", line 143, in __init__
       super(SubprocessPopen, self).__init__(*args, **kwargs)

     File "C:\Users\~snip~\Anaconda3\lib\subprocess.py", line 775, in __init__
       restore_signals, start_new_session)

     File "C:\Users\~snip~\Anaconda3\lib\subprocess.py", line 1178, in _execute_child
       startupinfo)

    FileNotFoundError: [WinError 2] The system cannot find the file specified

    I know this code basically works since I have used it before on another computer. My guess is that something about ffmpeg is messed up or something about the output path is wrong. I’m not sure what could be wrong with the ffmpeg since I definitely have it installed (via conda) and the path is pretty straightforward. On the other hand I’m not sure what could be wrong with the output path.
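
    One thing worth checking, although it is only a guess from the posted code: in a normal (non-raw) Python string literal, "\f" and "\b" are escape sequences (form feed and backspace), so the path assigned to animation.ffmpeg_path above may not contain the characters you think it does, and the writer subprocess can then fail with exactly this kind of WinError 2. A minimal sketch of a more defensive way to write the same paths (same locations as in the question):

       import os

       # Raw strings keep backslashes literal, so "\f" and "\b" stay as
       # backslash + letter instead of becoming escape characters.
       ffmpeg_exe = r"C:\ffmpeg-20190320-0739d5c-win64-static\bin\ffmpeg.exe"
       print(os.path.isfile(ffmpeg_exe))  # sanity check that the binary is really there

       # Building the output path with os.path.join sidesteps the same pitfall.
       out_file = os.path.join("D:\\", "full_test.mp4")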

  • lavc/lpc: exploit even symmetry of window function

    9 March 2016, by Ganesh Ajjanagadde
    lavc/lpc: exploit even symmetry of window function
    

    Yields 2x improvement in function performance, and boosts aac encoding
    speed by 4% overall. Sample benchmark (Haswell+GCC under -march=native):
    after:
    ffmpeg -i sin.flac -acodec aac -y sin_new.aac 5.22s user 0.03s system 105% cpu 4.970 total

    before:
    ffmpeg -i sin.flac -acodec aac -y sin_new.aac 5.40s user 0.05s system 105% cpu 5.162 total

    Reviewed-by: Rostislav Pehlivanov <atomnuker@gmail.com>
    Signed-off-by: Ganesh Ajjanagadde <gajjanag@gmail.com>

    • [DH] libavcodec/lpc.c
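
    The commit only touches libavcodec/lpc.c, but the trick is general: an even-symmetric window satisfies w[i] == w[N-1-i], so only half of its values need to be evaluated and the other half can be mirrored. A rough illustration of the idea in Python (not the FFmpeg code; a Welch-style window is assumed purely for the example):

       import numpy as np

       def welch_window_mirrored(n):
           # Evaluate the quadratic only over the first half of the window.
           k = np.arange((n + 1) // 2)
           c = (n - 1) / 2.0
           half = 1.0 - ((k - c) / c) ** 2
           # Mirror it; for odd n the centre sample is not duplicated.
           return np.concatenate([half, half[::-1][n % 2:]])

       # The mirrored window matches one computed sample by sample.
       n = 9
       k = np.arange(n)
       c = (n - 1) / 2.0
       assert np.allclose(welch_window_mirrored(n), 1.0 - ((k - c) / c) ** 2)
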
  • Black screen when playing a video with ffmpeg and SDL on iOS

    1 April 2012, by patrick

    I'm attempting to create a video player on iOS using ffmpeg and SDL. I'm decoding the video stream and attempting to convert the pixel data into an SDL_Surface, then convert that over to an SDL_Texture and render it on screen. However, all I'm getting is a black screen. I know the video file is good and can be viewed fine from VLC. Any idea what I'm missing here?

    Initialization code:

       // initialize SDL (Simple DirectMedia Layer) to playback the content
       if( SDL_Init(SDL_INIT_VIDEO | SDL_INIT_AUDIO | SDL_INIT_TIMER) )
       {
           DDLogError(@"Unable to initialize SDL");
           return NO;
       }

       // create window and renderer
       window = SDL_CreateWindow(NULL, 0, 0, SCREEN_WIDTH, SCREEN_HEIGHT,
                                 SDL_WINDOW_OPENGL | SDL_WINDOW_BORDERLESS |
                                 SDL_WINDOW_SHOWN);
       if ( window == 0 )
       {
           DDLogError(@"Unable to initialize SDL Window");
       }

       renderer = SDL_CreateRenderer(window, -1, 0);
       if ( !renderer )
       {
           DDLogError(@"Unable to initialize SDL Renderer");
       }

       // Initialize the FFMpeg and register codecs and their respected file formats
       av_register_all();

    Playback code:
    AVFormatContext *formatContext = NULL;

    DDLogInfo(@"Opening media file at location:%@", filePath);

    const char *filename = [filePath cStringUsingEncoding:NSUTF8StringEncoding];
    // Open media file
    if( avformat_open_input(&formatContext, filename, NULL, NULL) != 0 )
    {
       DDLogWarn(@"Unable to open media file. [File:%@]", filePath);

       NSString *failureReason = NSLocalizedString(@"Unable to open file.", @"Media playback failed, unable to open file.");

       if ( error != NULL )
       {
           *error = [NSError errorWithDomain:MediaPlayerErrorDomain
                                        code:UNABLE_TO_OPEN
                                    userInfo:[NSDictionary dictionaryWithObject:failureReason
                                                                         forKey:NSLocalizedFailureReasonErrorKey]];
       }

       return NO; // Couldn't open file
    }

    // Retrieve stream information
    if( avformat_find_stream_info(formatContext, NULL) <= 0 )
    {
       DDLogWarn(@"Unable to locate stream information for file. [File:%@]", filePath);

       NSString *failureReason = NSLocalizedString(@"Unable to find audio/video stream information.", @"Media playback failed, unable to find stream information.");

       if ( error != NULL )
       {
           *error = [NSError errorWithDomain:MediaPlayerErrorDomain
                                        code:UNABLE_TO_FIND_STREAM
                                    userInfo:[NSDictionary dictionaryWithObject:failureReason
                                                                         forKey:NSLocalizedFailureReasonErrorKey]];
       }

       return NO;  // Missing stream information
    }

    // Find the first video or audio stream
    int videoStream = -1;
    int audioStream = -1;

    DDLogInfo(@"Locating stream information for media file");

    for( int index=0; index<(formatContext->nb_streams); index++)
    {
       if( formatContext->streams[index]->codec->codec_type==AVMEDIA_TYPE_VIDEO )
       {
           DDLogInfo(@"Found video stream");
           videoStream = index;
           break;
       }
       else if( mediaType == AUDIO_FILE &&
               (formatContext->streams[index]->codec->codec_type==AVMEDIA_TYPE_AUDIO) )
       {
           DDLogInfo(@"Found audio stream");
           audioStream = index;
           break;
       }
    }

    if( videoStream == -1 && (audioStream == -1) )
    {
       DDLogWarn(@"Unable to find video or audio stream for file");

       NSString *failureReason = NSLocalizedString(@"Unable to locate audio/video stream.", @"Media playback failed, unable to locate media stream.");

       if ( error != NULL )
       {
           *error = [NSError errorWithDomain:MediaPlayerErrorDomain
                                        code:UNABLE_TO_FIND_STREAM
                                    userInfo:[NSDictionary dictionaryWithObject:failureReason
                                                                         forKey:NSLocalizedFailureReasonErrorKey]];
       }

       return NO; // Didn't find a video or audio stream
    }

    // Get a pointer to the codec context for the video/audio stream
    AVCodecContext *codecContext;

    DDLogInfo(@"Attempting to locate the codec for the media file");

    if ( videoStream > -1 )
    {
       codecContext = formatContext->streams[videoStream]->codec;

    }
    else
    {
       codecContext = formatContext->streams[audioStream]->codec;
    }

    // Now that we have information about the codec that the file is using,
    // we need to actually open the codec to decode the content

    DDLogInfo(@"Attempting to open the codec to playback the media file");

    AVCodec *codec;

    // Find the decoder for the video stream
    codec = avcodec_find_decoder(codecContext->codec_id);
    if( codec == NULL )
    {
       DDLogWarn(@"Unsupported codec! Cannot playback media file [File:%@]", filePath);

       NSString *failureReason = NSLocalizedString(@"Unsupported file format. Cannot playback media.", @"Media playback failed, unsupported codec.");
       if ( error != NULL )
       {
           *error = [NSError errorWithDomain:MediaPlayerErrorDomain
                                        code:UNSUPPORTED_CODEC
                                    userInfo:[NSDictionary dictionaryWithObject:failureReason
                                                                         forKey:NSLocalizedFailureReasonErrorKey]];
       }

       return NO; // Codec not found
    }

    // Open codec
    if( avcodec_open2(codecContext, codec, NULL) < 0 )
    {
       DDLogWarn(@"Unable to open codec! Cannot playback media file [File:%@]", filePath);

       NSString *failureReason = NSLocalizedString(@"Unable to open media codec. Cannot playback media.", @"Media playback failed, cannot open codec.");
       if ( error != NULL )
       {
           *error = [NSError errorWithDomain:MediaPlayerErrorDomain
                                        code:UNABLE_TO_LOAD_CODEC
                                    userInfo:[NSDictionary dictionaryWithObject:failureReason
                                                                         forKey:NSLocalizedFailureReasonErrorKey]];
       }

       return NO; // Could not open codec
    }

    // Allocate player frame
    AVFrame *playerFrame=avcodec_alloc_frame();

    // Allocate an AVFrame structure
    AVFrame *RGBframe=avcodec_alloc_frame();
    if( RGBframe==NULL )
    {
       // could not create a frame to convert our video frame
       // to a 16-bit RGB565 frame.

       DDLogWarn(@"Unable to convert video frame. Cannot playback media file [File:%@]", filePath);

       NSString *failureReason = NSLocalizedString(@"Problems interpreting video frame information.", @"Media playback failed, cannot convert frame.");
       if ( error != NULL )
       {
           *error = [NSError errorWithDomain:MediaPlayerErrorDomain
                                        code:UNABLE_TO_LOAD_FRAME
                                    userInfo:[NSDictionary dictionaryWithObject:failureReason
                                                                         forKey:NSLocalizedFailureReasonErrorKey]];
       }

       return NO; // Could not open codec
    }

    int frameFinished = 0;
    AVPacket packet;

    // Figure out the destination width/height based on the screen size
    int destHeight = codecContext->height;
    int destWidth  = codecContext->width;
    if ( destHeight > SCREEN_HEIGHT || (destWidth > SCREEN_WIDTH) )
    {
       if ( destWidth > SCREEN_WIDTH )
       {
           float percentDiff = ( destWidth - SCREEN_WIDTH ) / (float)destWidth;
           destWidth  = destWidth  - (int)(destWidth * percentDiff );
           destHeight = destHeight - (int)(destHeight * percentDiff );
       }

       if ( destHeight > SCREEN_HEIGHT )
       {
           float percentDiff = (destHeight - SCREEN_HEIGHT ) / (float)destHeight;
           destWidth  = destWidth  - (int)(destWidth * percentDiff );
           destHeight = destHeight - (int)(destHeight * percentDiff );
       }
    }

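    // Create the swscale context that converts each decoded frame from the
    // codec's native pixel format and size to 16-bit RGB565 at destWidth x destHeight.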
    SwsContext *swsContext = sws_getContext(codecContext->width, codecContext->height, codecContext->pix_fmt, destWidth, destHeight, PIX_FMT_RGB565, SWS_BICUBIC, NULL, NULL, NULL);

    while( av_read_frame(formatContext, &packet) >= 0 )
    {
       // Is this a packet from the video stream?
       if( packet.stream_index == videoStream )
       {
           // Decode video frame
           avcodec_decode_video2(codecContext, playerFrame, &frameFinished, &packet);

           // Did we get a video frame?
           if( frameFinished != 0 )
           {
               // Convert the content over to RGB565 (16-bit RGB) to playback with SDL

               uint8_t *dst[3];
               int dstStride[3];

               // Set the destination stride
                for (int plane = 0; plane < 3; plane++)
               {
                   dstStride[plane] = codecContext->width*2;
                   dst[plane]= (uint8_t*) malloc(dstStride[plane]*destHeight);
               }

               sws_scale(swsContext, playerFrame->data,
                         playerFrame->linesize, 0,
                         destHeight,
                         dst, dstStride);

               // Create the SDL surface frame that we are going to use to draw our video

               // 16-bit RGB so 2 bytes per pixel (pitch = width*(bytes per pixel))
               int pitch = destWidth*2;
               SDL_Surface *frameSurface = SDL_CreateRGBSurfaceFrom(dst[0], destWidth, destHeight, 16, pitch, 0, 0, 0, 0);

               // Clear the old frame first
               SDL_RenderClear(renderer);

               // Move the frame over to a texture and render it on screen
               SDL_Texture *texture = SDL_CreateTextureFromSurface(renderer, frameSurface);
               SDL_SetTextureBlendMode(texture, SDL_BLENDMODE_BLEND);

               // Draw the new frame on the screen
               SDL_RenderPresent(renderer);

               SDL_DestroyTexture(texture);
               SDL_FreeSurface(frameSurface);
           }