
Other articles (89)

  • Websites made with MediaSPIP

    2 May 2011

    This page lists some websites based on MediaSPIP.

  • Publishing on MédiaSpip

    13 June 2013

    Can I post content from an iPad tablet?
    Yes, if your MédiaSpip installation is at version 0.2 or higher. If in doubt, contact the administrator of your MédiaSpip to find out.

  • MediaSPIP initialisation (preconfiguration)

    20 February 2010

    When MediaSPIP is installed, it is preconfigured for the most common uses.
    This preconfiguration is performed by a plugin that is enabled by default and cannot be disabled, called MediaSPIP Init.
    This plugin properly preconfigures each MediaSPIP instance. It must therefore be placed in the plugins-dist/ directory of the site or farm so that it is installed by default, before the site can be used.
    First of all, it enables or disables SPIP options that (...)

On other sites (5331)

  • Convert CCTV .264 video to one of the common formats (mp4, avi, etc.) via ffmpeg or another command-line tool

    21 March 2016, by yy502

    I’ve got a CCTV camera that produces .264 video clips. These clips play fine, just like any normal video recording, but only with the manufacturer-provided app. When played directly with VLC or mplayer, only grey blocks are visible in the picture. I doubt it is proprietary encoding; more likely it is some kind of hardware-encoded raw H.264 format for which I’m just missing the right combination of arguments/options to play back or convert with ffmpeg. ffmpeg -i does report the basic metadata correctly, but also a huge number of frame errors... yet I know the video can be played fine.

    The Android app has the following files in its lib folder:
    [screenshot of the lib folder contents]

    I understand these files are not all for video decoding; some serve other features of the app. I’m just hoping someone could determine what extra lib or option is needed to convert it with ffmpeg, e.g. libh264ToRGB565.so could be useful, maybe?

    This is a screenshot of what to expect from the sample video.
    [screenshot of a frame from the sample video]

    And here is the sample video clip (1.3 MB, 1280×720): http://146.185.145.75/vid.264
    (MD5 = 0ae871484b3832984f46e6820b21c673)

    Any suggestion is appreciated.
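    One thing worth checking before hunting for extra libraries: whether the .264 file is a plain Annex-B H.264 elementary stream at all. Many CCTV recorders prepend a proprietary per-frame header to the raw NAL units, which would explain both the grey blocks and the flood of frame errors. A minimal sketch of such a check (the function names here are mine, not from any library):

    ```python
    def looks_like_annexb_h264(data: bytes) -> bool:
        """Heuristic: a raw Annex-B H.264 stream begins with a
        00 00 00 01 (or 00 00 01) NAL start code."""
        return data.startswith(b"\x00\x00\x00\x01") or data.startswith(b"\x00\x00\x01")

    def first_nal_type(data: bytes):
        """nal_unit_type of the first NAL unit: the low 5 bits of the byte
        after the start code. Returns None if no start code is present."""
        for prefix in (b"\x00\x00\x00\x01", b"\x00\x00\x01"):
            if data.startswith(prefix) and len(data) > len(prefix):
                return data[len(prefix)] & 0x1F
        return None
    ```

    If the first bytes of the file pass this check (type 7 is an SPS, the parameter set a decoder needs first), then forcing the demuxer with ffmpeg -f h264 -i vid.264 -c copy out.mp4 should remux it; if not, a vendor header probably has to be stripped before ffmpeg can make sense of the stream.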

  • How to extract a grayscale image from a video with the ffmpeg library?

    9 March 2016, by user1587451

    I’ve compiled and tested this tutorial from here, which works just fine. Then I tried to edit the tutorial to read/convert frames to grayscale: I changed pFrameRGB to pFrameGray, PIX_FMT_RGB24 to PIX_FMT_GRAY16, and saved just the 200th frame. It compiles and runs, but the image doesn’t show what I expected. What’s wrong?

    The image:
    [screenshot of the output frame]

    The edited code :

    #include <libavcodec/avcodec.h>
    #include <libavformat/avformat.h>
    #include <libswscale/swscale.h>

    #include <stdio.h>

    // compatibility with newer API
    #if LIBAVCODEC_VERSION_INT < AV_VERSION_INT(55,28,1)
    #define av_frame_alloc avcodec_alloc_frame
    #define av_frame_free avcodec_free_frame
    #endif

    void SaveFrame(AVFrame *pFrame, int width, int height, int iFrame) {
     FILE *pFile;
     char szFilename[32];
     int  y;

     // Open file
     sprintf(szFilename, "frame%d.ppm", iFrame);
     pFile=fopen(szFilename, "wb");
     if(pFile==NULL)
       return;

     // Write header
     fprintf(pFile, "P6\n%d %d\n255\n", width, height);

     // Write pixel data
     for(y=0; y<height; y++)
       fwrite(pFrame->data[0]+y*pFrame->linesize[0], 1, width*3, pFile);

     // Close file
     fclose(pFile);
    }

    int main(int argc, char *argv[]) {
     // Initalizing these to NULL prevents segfaults!
     AVFormatContext   *pFormatCtx = NULL;
     int               i, videoStream;
     AVCodecContext    *pCodecCtxOrig = NULL;
     AVCodecContext    *pCodecCtx = NULL;
     AVCodec           *pCodec = NULL;
     AVFrame           *pFrame = NULL;
     AVFrame           *pFrameGRAY = NULL;
     AVPacket          packet;
     int               frameFinished;
     int               numBytes;
     uint8_t           *buffer = NULL;
     struct SwsContext *sws_ctx = NULL;

     if(argc < 2) {
       printf("Please provide a movie file\n");
       return -1;
     }
     // Register all formats and codecs
     av_register_all();

     // Open video file
     if(avformat_open_input(&pFormatCtx, argv[1], NULL, NULL)!=0)
       return -1; // Couldn't open file

     // Retrieve stream information
     if(avformat_find_stream_info(pFormatCtx, NULL)<0)
       return -1; // Couldn't find stream information

     // Dump information about file onto standard error
     av_dump_format(pFormatCtx, 0, argv[1], 0);

     // Find the first video stream
     videoStream=-1;
     for(i=0; i<pFormatCtx->nb_streams; i++)
       if(pFormatCtx->streams[i]->codec->codec_type==AVMEDIA_TYPE_VIDEO) {
         videoStream=i;
         break;
       }
     if(videoStream==-1)
       return -1; // Didn't find a video stream

     // Get a pointer to the codec context for the video stream
     pCodecCtxOrig=pFormatCtx->streams[videoStream]->codec;
     // Find the decoder for the video stream
     pCodec=avcodec_find_decoder(pCodecCtxOrig->codec_id);
     if(pCodec==NULL) {
       fprintf(stderr, "Unsupported codec!\n");
       return -1; // Codec not found
     }
     // Copy context
     pCodecCtx = avcodec_alloc_context3(pCodec);
     if(avcodec_copy_context(pCodecCtx, pCodecCtxOrig) != 0) {
       fprintf(stderr, "Couldn't copy codec context");
       return -1; // Error copying codec context
     }

     // Open codec
     if(avcodec_open2(pCodecCtx, pCodec, NULL)<0)
       return -1; // Could not open codec

     // Allocate video frame
     pFrame=av_frame_alloc();

     // Allocate an AVFrame structure
     pFrameGRAY=av_frame_alloc();
     if(pFrameGRAY==NULL)
       return -1;

     // Determine required buffer size and allocate buffer
     numBytes=avpicture_get_size(PIX_FMT_GRAY16, pCodecCtx->width,
                     pCodecCtx->height);
     buffer=(uint8_t *)av_malloc(numBytes*sizeof(uint8_t));

     // Assign appropriate parts of buffer to image planes in pFrameGRAY
     // Note that pFrameGRAY is an AVFrame, but AVFrame is a superset
     // of AVPicture
     avpicture_fill((AVPicture *)pFrameGRAY, buffer, PIX_FMT_GRAY16,
            pCodecCtx->width, pCodecCtx->height);

     // initialize SWS context for software scaling
     sws_ctx = sws_getContext(pCodecCtx->width,
                  pCodecCtx->height,
                  pCodecCtx->pix_fmt,
                  pCodecCtx->width,
                  pCodecCtx->height,
                  PIX_FMT_GRAY16,
                  SWS_BILINEAR,
                  NULL,
                  NULL,
                  NULL
                  );

     // Read frames and save first five frames to disk
     i=0;
     while(av_read_frame(pFormatCtx, &packet)>=0) {
       // Is this a packet from the video stream?
       if(packet.stream_index==videoStream) {
         // Decode video frame
         avcodec_decode_video2(pCodecCtx, pFrame, &frameFinished, &packet);

         // Did we get a video frame?
         if(frameFinished) {
       // Convert the image from its native format to GRAY
       sws_scale(sws_ctx, (uint8_t const * const *)pFrame->data,
             pFrame->linesize, 0, pCodecCtx->height,
             pFrameGRAY->data, pFrameGRAY->linesize);

       // Save the frame to disk
       if(++i==200)
         SaveFrame(pFrameGRAY, pCodecCtx->width, pCodecCtx->height,
               i);
         }
       }

       // Free the packet that was allocated by av_read_frame
       av_free_packet(&packet);
     }

     // Free the GRAY image
     av_free(buffer);
     av_frame_free(&pFrameGRAY);

     // Free the YUV frame
     av_frame_free(&pFrame);

     // Close the codecs
     avcodec_close(pCodecCtx);
     avcodec_close(pCodecCtxOrig);

     // Close the video file
     avformat_close_input(&pFormatCtx);

     return 0;
    }
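    A likely culprit in the edit described above: PIX_FMT_GRAY16 stores two bytes per pixel, while SaveFrame still writes a P6 header (colour PPM, three bytes per pixel) and width*3 bytes per row, so the viewer misinterprets the buffer. Requesting PIX_FMT_GRAY8 from sws_scale and writing a P5 (binary grayscale) PGM instead makes the two sides agree. The intended file layout, sketched in Python for clarity (the helper name is mine, not part of the tutorial):

    ```python
    def pgm_bytes(width, height, pixels):
        """Build a binary P5 PGM image: grayscale, one byte per pixel,
        rows packed top to bottom with no padding."""
        assert len(pixels) == width * height, "expected one byte per pixel"
        header = b"P5\n%d %d\n255\n" % (width, height)  # magic, size, max value
        return header + bytes(pixels)

    # e.g. open("frame200.pgm", "wb").write(pgm_bytes(w, h, gray_plane))
    ```

    In the C code that would correspond to fprintf(pFile, "P5\n%d %d\n255\n", width, height); for the header and fwrite(..., 1, width, pFile); per row instead of width*3.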

  • Python and FFMPEG video streaming not displaying via HTML5

    18 March 2016, by arussell

    I’m trying to write a Python script to serve a video over HTTP and display it via the HTML5 video tag. I’m using FFMPEG to send the video over HTTP and receiving it via sockets in Python. FFMPEG seems to be sending the video and my Python script is receiving it, but for some reason I’m not able to display it in my web browser, nor am I getting any visible error in my script.

    Any help will be highly appreciated.

    This is the FFMPEG command I’m using to send the video over HTTP:

    ffmpeg -re -i video_file.webm -c:v libx264 -c:a copy -f h264 http://127.0.0.1:8081

    Here is my Python code

    import socket   #for sockets handling
    import time     #for time functions
    import sys

    hostIP = '127.0.0.1'
    SourcePort = 8081 #FFMPEG
    PlayerPort = 8082 #Internet Browser

    def gen_headers():
        # determine response code
        h = ''
        h = 'HTTP/1.1 200 OK\n'
        # write further headers
        current_date = time.strftime("%a, %d %b %Y %H:%M:%S", time.localtime())
        h += 'Date: ' + current_date +'\n'
        h += 'Content-Type: video/mp4\n\n'
        return h

    def start_server():
       socketFFMPEG = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
       # this is for easy starting/killing the app
       socketFFMPEG.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
       print('Socket created')

       try:
           socketFFMPEG.bind((hostIP, SourcePort))
           print('Socket bind complete')
       except socket.error as msg:
           print('Bind failed. Error : ' + str(sys.exc_info()))
           sys.exit()

       #Start listening on socketFFMPEG
       socketFFMPEG.listen(10)
       print('Socket now listening. Waiting for video source from FFMPEG on port', SourcePort)

       conn, addr = socketFFMPEG.accept()
       ip, port = str(addr[0]), str(addr[1])
       print('Accepting connection from ' + ip + ':' + port)

       socketPlayer = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
       socketPlayer.bind((hostIP, PlayerPort))
       socketPlayer.listen(1) #listen just 1 petition
       print('Waiting for Internet Browser')
       conn2, addr2 = socketPlayer.accept()
       conn2.sendall(gen_headers().encode())

       while True:
           try :
               #receive data from client FFMPEG
               input_from_FFMPEG = conn.recv(1024)
               #send data to internet browser
               conn2.sendall(input_from_FFMPEG)
           except socket.error:
               print('Error data :' + str(input_from_FFMPEG))
               print('send Error : ' + str(sys.exc_info()))
               conn2.close()
               sys.exit()

       socketFFMPEG.close()

    start_server()  

    I’m getting error 10053, "An established connection was aborted by the software in your host machine", when sending the following byte data:

    \x00\x00\x00\x01A\x9b\x1dI\xe1\x0f&S\x02\xbf\xb1\x82j3{qz\x85\xca\\\xb2\xb7\xc5\xdfi\x92y\x0c{\xb0\xde\xd1\x96j\xccE\xa3G\x87\x84Z\x01 [... several kilobytes of raw byte data trimmed]
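    Two details in the script above are worth double-checking. First, gen_headers ends each line with a bare \n, while HTTP/1.1 requires CRLF (\r\n) line endings and a blank line to close the header block; some clients tolerate bare LF, others drop the connection, which would match error 10053. Second, the browser sends an HTTP request that the script never reads, and the payload is a raw H.264 elementary stream rather than the MP4 container the Content-Type promises, so the video element has nothing it can parse even when bytes do arrive. A CRLF-correct header sketch (this alone may not make the stream playable):

    ```python
    import time

    def gen_headers():
        """HTTP/1.1 response head: each header line ends in CRLF and an
        empty line terminates the block (RFC 7230 message framing)."""
        date = time.strftime("%a, %d %b %Y %H:%M:%S GMT", time.gmtime())
        return ("HTTP/1.1 200 OK\r\n"
                "Date: " + date + "\r\n"
                "Content-Type: video/mp4\r\n"
                "Connection: close\r\n"
                "\r\n")
    ```

    Even with correct framing, getting a browser to play a live stream over plain HTTP usually means sending a container it understands progressively (e.g. fragmented MP4 or WebM) rather than bare H.264 NAL units.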