
Other articles (46)

  • Websites made with MediaSPIP

    2 May 2011, by

    This page lists some websites based on MediaSPIP.

  • Contribute to a better visual interface

    13 April 2011

    MediaSPIP is based on a system of themes and templates. Templates define the placement of information on the page, and can be adapted to a wide range of uses. Themes define the overall graphic appearance of the site.
    Anyone can submit a new graphic theme or template and make it available to the MediaSPIP community.

  • Submit enhancements and plugins

    13 April 2011

    If you have developed a new extension that adds one or more useful features to MediaSPIP, let us know and its integration into the MediaSPIP core will be considered.
    You can use the development discussion list to ask for help with creating a plugin. As MediaSPIP is based on SPIP, you can also use the SPIP-Zone discussion list.

On other sites (4869)

  • How can I use the avformat_open_input function (FFmpeg)?

    5 May 2016, by johncarrie

    I have bought an HD HDMI to UVC device, which takes an HDMI video source as input and outputs video over UVC.
    I connect it from laptop A (HDMI input source) to laptop B (USB output).
    I have installed Ubuntu 14.04 desktop on laptop B and Windows 8.1 on laptop A.
    B also has the ffmpeg, OpenCV and SDL libraries installed.
    My goal is to capture video and audio from A on B via the HD HDMI to UVC device.
    So I decided to use FFmpeg's libav libraries.
    I saw this and used the avformat_open_input function, but it returned an error.
    I think the error occurred because the second parameter of avformat_open_input (const char *url) was invalid.
    I know that the url should look like video:video device name:audio:audio card name.
    How can I indicate the device names?

    Here’s the result of v4l2-ctl --list-devices command in terminal.

    HD WebCam (usb-0000:02:03.0-1):  
       /dev/video0
    HD TV CAM (usb-0000:03:00.0-2.1):  
       /dev/video1

    And the result of arecord -l in terminal.

    **** List of CAPTURE Hardware Devices ****
    card 0: AudioPCI [Ensoniq AudioPCI], device 0: ES1371/1 [ES1371 DAC2/ADC]
      Subdevices: 1/1
      Subdevice #0: subdevice #0
    card 1: CAM [HD TV CAM], device 0: USB Audio [USB Audio]
      Subdevices: 1/1
      Subdevice #0: subdevice #0

    Thank you.
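    For what it's worth, a sketch of the usual Linux approach: the video=...:audio=... URL form belongs to the Windows dshow input. With libav on Linux, video (v4l2) and audio (alsa) are separate demuxers, each opened with its own avformat_open_input call. The device names below are taken from the listings in the question (/dev/video1 for the HD TV CAM, hw:1,0 for its USB audio card); this is a minimal sketch, not a complete capture program.

    ```c
    #include <stdio.h>
    #include <libavdevice/avdevice.h>
    #include <libavformat/avformat.h>

    int main(void) {
        /* Register the device demuxers (v4l2, alsa, ...). */
        avdevice_register_all();

        AVFormatContext *video_ctx = NULL;
        AVFormatContext *audio_ctx = NULL;

        /* Force the input formats instead of encoding them in the URL. */
        const AVInputFormat *v4l2 = av_find_input_format("v4l2");
        const AVInputFormat *alsa = av_find_input_format("alsa");

        /* /dev/video1 is the HD TV CAM; hw:1,0 is its USB audio card. */
        if (avformat_open_input(&video_ctx, "/dev/video1", v4l2, NULL) < 0) {
            fprintf(stderr, "could not open /dev/video1\n");
            return 1;
        }
        if (avformat_open_input(&audio_ctx, "hw:1,0", alsa, NULL) < 0) {
            fprintf(stderr, "could not open hw:1,0\n");
            avformat_close_input(&video_ctx);
            return 1;
        }

        /* ...read packets from both contexts here... */

        avformat_close_input(&audio_ctx);
        avformat_close_input(&video_ctx);
        return 0;
    }
    ```

    Linking needs -lavformat -lavdevice -lavutil; on FFmpeg builds older than 4.0, av_register_all() must also be called.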

  • FFmpeg streaming from capture device Osprey 450e fails

    11 July 2018, by diegosn79

    I want to encode live video with ffmpeg, capturing from a DirectShow card (Osprey 450e) and streaming it over multicast. For the moment I have this error:

    ffmpeg -f dshow -i video="Osprey-450e Video Device 1A":audio="Osprey-450e Audio Device 1A" -f mpegts -b:v 5120k -r 30 -c:v mpeg2video -c:a ac3 -b:a 256k udp://239.192.42.61:1234

    [dshow @ 02c7f640] Could not run filter video=Osprey-450e Video Device 1A:Audio?Osprey-450e Audio Device 1A: Input/output error

    Can FFmpeg encode a DirectShow input?
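    For the record, dshow is an ordinary input device, so its streams can be encoded like any other input. The error line shows the audio half arriving mangled ("Audio?Osprey-450e ..."), which usually means the shell split the -i argument. A sketch with the whole device string quoted as one argument; the device names are the ones from the question and can be verified with ffmpeg -list_devices true -f dshow -i dummy:

    ```shell
    # Quote the entire -i value so the spaces and the ":audio=" separator
    # reach the dshow demuxer as a single string.
    ffmpeg -f dshow -i "video=Osprey-450e Video Device 1A:audio=Osprey-450e Audio Device 1A" -c:v mpeg2video -b:v 5120k -r 30 -c:a ac3 -b:a 256k -f mpegts udp://239.192.42.61:1234
    ```

    If the names check out and the error persists, another program may be holding the capture filter open; dshow devices are exclusive.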

  • AVFrame: How to get/replace plane data buffer(s) and size?

    19 July 2018, by user10099431

    I'm working on gstreamer1.0-libav (1.6.3), trying to port custom FPGA-based H264 video acceleration from GStreamer 0.10.

    The data planes (YUV) used to be allocated by a simple malloc back in GStreamer 0.10, so we simply replaced the AVFrame.data[i] pointers with pointers to memory in our video acceleration core. It seems to be MUCH more complicated in GStreamer 1.12.

    For starters, I tried copying the YUV planes from AVFrame.data[i] to a separate buffer, which worked fine! Since I haven't seen an immediate way to obtain the size of AVFrame.data[i], and I noticed that data[0], data[1] and data[2] seem to lie in a single contiguous buffer, I simply used (data[1] - data[0]) for the size of the Y plane and (data[2] - data[1]) for the sizes of the U/V planes respectively. This works fine, except for one scenario:

    • Input H264 stream with resolution of 800x600 or greater
    • The camera is covered (jacket, hand, ...)

    This causes a SEGFAULT in the memcpy of the V plane (data[2]) using the sizes determined as described above. Before the camera is covered, the stream is displayed completely fine... so for some reason the dark image changes the plane sizes?

    My ultimate goal is replacing the data[i] pointers allocated by gstreamer with my custom memory allocation (for further processing). Where exactly are these buffers assigned, can I change them, and how can I obtain the size of each plane (data[0], data[1], data[2])?
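    A plausible explanation, hedged: the planes of a decoded AVFrame are not guaranteed to come from one contiguous allocation, and each row is padded out to linesize[i], which can exceed the visible width. So (data[1] - data[0]) is only accidentally the Y-plane size; when the decoder hands out planes from separate buffers, the pointer differences are meaningless. The safe per-plane size is linesize[i] * plane height (or av_image_get_buffer_size() for the whole frame), and the supported hook for custom allocation is the get_buffer2() callback on the AVCodecContext. A self-contained sketch of the size arithmetic, assuming a hypothetical 32-byte row alignment:

    ```c
    #include <stdio.h>

    /* Illustration: per-plane sizes for an 800x600 YUV420p frame when
     * each row is padded to a 32-byte-aligned linesize.  The U linesize
     * (416) is wider than the visible half-width (400), which is why
     * sizes must come from linesize * height, not from visible pixels
     * or from pointer differences between planes. */
    int main(void) {
        int width = 800, height = 600;
        int align = 32;

        int linesize_y  = (width + align - 1) / align * align;      /* 800 */
        int linesize_uv = (width / 2 + align - 1) / align * align;  /* 416 */

        size_t y_size = (size_t)linesize_y * height;                /* full-res plane */
        size_t u_size = (size_t)linesize_uv * (height / 2);         /* half-res plane */

        printf("Y plane: %zu bytes (linesize %d)\n", y_size, linesize_y);
        printf("U plane: %zu bytes (linesize %d)\n", u_size, linesize_uv);
        return 0;
    }
    ```

    With these numbers, the U plane occupies 416 * 300 bytes even though only 400 * 300 are visible pixels; a memcpy sized by anything other than linesize * height can run past the real buffer, which matches the crash described above.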