
Other articles (79)

  • MediaSPIP Core: Configuration

    9 November 2010

    By default, MediaSPIP Core provides three different configuration pages (these pages rely on the CFG configuration plugin to work): a page specific to the general configuration of the templates; a page specific to the configuration of the site's home page; a page specific to the configuration of the sections.
    It also provides an additional page, which only appears when certain plugins are enabled, for controlling their display and their specific features (...)

  • Emballe médias: what is it for?

    4 February 2011

    This plugin is designed to manage sites for publishing documents of all kinds.
    It creates "media", namely: a "media" is an article in the SPIP sense, created automatically when a document is uploaded, whether it is audio, video, image or text; only a single document can be linked to a "media" article;

  • Use, discuss, criticize

    13 April 2011

    Talk to people directly involved in MediaSPIP’s development, or to people around you who could use MediaSPIP to share, enhance or develop their creative projects.
    The bigger the community, the more MediaSPIP’s potential will be explored and the faster the software will evolve.
    A discussion list is available for all exchanges between users.

On other sites (5691)

  • spdifenc: support ac3 core+eac3 dependent streams

    3 April 2018, by Hendrik Leppkes
    spdifenc: support ac3 core+eac3 dependent streams
    

    Such streams are found on Blu-ray, and identified as EAC3 type in
    avformat, while the bitstream of the core stream is actually a pure AC3
    frame.

    Adjust the parsing accordingly, since AC3 frames always hold 6 blocks
    and the numblkscod syntax element is not present.

    • [DH] libavformat/spdifenc.c
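
    To illustrate the rule the commit message states, here is a minimal C sketch of the block-count decision. This is not the actual FFmpeg patch: syncframe_num_blocks is a hypothetical helper name, the byte offsets follow the (E-)AC-3 bitstream layout, and the E-AC-3 reduced-sample-rate case (fscod == 3, which also implies 6 blocks) is left out.

    #include <stdint.h>

    /* numblkscod -> number of audio blocks in an E-AC-3 syncframe */
    static const uint8_t eac3_blocks[4] = { 1, 2, 3, 6 };

    /* Hypothetical helper: how many audio blocks does this syncframe carry?
     * An IEC 61937 E-AC-3 burst is written out once 6 blocks are collected. */
    static int syncframe_num_blocks(const uint8_t *frame)
    {
        int bsid = frame[5] >> 3;  /* bit stream identification; same position
                                      in AC-3 and E-AC-3 headers */

        if (bsid <= 10)            /* plain AC-3 core frame: always 6 blocks,
                                      numblkscod is not present */
            return 6;

        /* E-AC-3: numblkscod follows strmtyp, substreamid, frmsiz and fscod */
        return eac3_blocks[(frame[4] >> 4) & 3];
    }
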
  • How to build latest ffmpeg for android-ffmpeg project on github

    9 February 2015, by user2927954

    I am working on the https://github.com/guardianproject/android-ffmpeg project.
    This project uses ffmpeg version 0.11.1.
    How can I build this project with the latest ffmpeg version?

    I tried deleting the ffmpeg folder in this project and checking out the latest ffmpeg from [git://git.videolan.org/ffmpeg.git]. After that, I ran ./configure_make_everything.sh as usual, but I got this error:

       File to patch:
    Skip this patch? [y]
    Skipping patch.
    3 out of 3 hunks ignored
    patching file libavutil/arm/intmath.h
    Reversed (or previously applied) patch detected!  Skipping patch.
    3 out of 3 hunks ignored
    patching file configure
    Reversed (or previously applied) patch detected!  Skipping patch.
    1 out of 1 hunk ignored
    ~/workspace/android-ffmpeg/ffmpeg ~/workspace/android-ffmpeg ~/workspace/android-ffmpeg
    **ERROR: freetype2 not found**

    If you think configure made a mistake, make sure you are using the latest
    version from Git.  If the latest version fails, report the problem to the
    ffmpeg-user@ffmpeg.org mailing list or IRC #ffmpeg on irc.freenode.net.
    Include the log file "config.log" produced by configure as this will help
    solving the problem.
    ~/workspace/android-ffmpeg ~/workspace/android-ffmpeg
    ~/workspace/android-ffmpeg
    ~/workspace/android-ffmpeg ~/workspace/android-ffmpeg
    ~/workspace/android-ffmpeg/ffmpeg ~/workspace/android-ffmpeg ~/workspace/android-ffmpeg
    Makefile:2: config.mak: No such file or directory
    Makefile:53: /common.mak: No such file or directory
    Makefile:93: /libavutil/Makefile: No such file or directory
    Makefile:93: /library.mak: No such file or directory
    Makefile:95: /doc/Makefile: No such file or directory
    Makefile:178: /tests/Makefile: No such file or directory
    make: *** No rule to make target `/tests/Makefile'.  Stop.
    Makefile:2: config.mak: No such file or directory
    Makefile:53: /common.mak: No such file or directory
    Makefile:93: /libavutil/Makefile: No such file or directory
    Makefile:93: /library.mak: No such file or directory
    Makefile:95: /doc/Makefile: No such file or directory
    Makefile:178: /tests/Makefile: No such file or directory
    make: *** No rule to make target `/tests/Makefile'.  Stop.
    ~/workspace/android-ffmpeg ~/workspace/android-ffmpeg
    ~/workspace/android-ffmpeg
    admin@ubuntu:~/workspace/android-ffmpeg$

    I get the error "freetype2 not found", but if I build with the original ffmpeg included in this project, this error does not occur.

    How can I fix it? Please help.

  • texture rendering issue on iOS using OpenGL ES in Unity project

    28 March 2016, by Time1ess

    I'm working on a project, part of which involves streaming video to my iPhone. Currently I use my laptop to create the video stream with ffmpeg. The shell command is below:

    ffmpeg \
       -f avfoundation -i "1" -s 1280*720 -r 29.97 \
       -c:v mpeg2video -q:v 20 -pix_fmt yuv420p -g 1 -threads 4\
       -f mpegts udp://192.168.1.102:6666

    With this, I successfully create my video stream.

    In Unity, I want to decode the video stream to create a texture. After going through some ffmpeg and Unity tutorials (since I'm new to both), I created my native plugin library. Some of the code is below (ask me if more is needed):

    In my library:

    buffer alloc:

    uint8_t *buffer;
    int buffer_size;
    buffer_size = avpicture_get_size(AV_PIX_FMT_RGBA, VIEW_WIDTH, VIEW_HEIGHT);

    buffer = (uint8_t *) av_malloc(buffer_size*sizeof(uint8_t));

    avpicture_fill((AVPicture *) pFrameRGB, buffer, AV_PIX_FMT_RGBA,
                  VIEW_WIDTH, VIEW_HEIGHT);
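
    The excerpt does not show where pFrameRGB itself comes from; presumably an allocation along these lines (an assumed step, using the FFmpeg 3.0 API) precedes the avpicture_fill call:

    /* Assumed, not shown above: the destination frame must exist before
     * avpicture_fill() can attach the RGBA buffer to it. */
    AVFrame *pFrameRGB = av_frame_alloc();
    if (!pFrameRGB) {
        /* handle allocation failure */
    }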

    getContext:

       is->sws_ctx = sws_getContext
       (
        is->video_st->codec->width,
        is->video_st->codec->height,
        is->video_st->codec->pix_fmt,
        VIEW_WIDTH,
        VIEW_HEIGHT,
        AV_PIX_FMT_RGBA,
        SWS_BILINEAR,
        NULL,
        NULL,
        NULL
        );

    sws_scale:

    sws_scale(
             is->sws_ctx,
             (uint8_t const * const *)pFrame->data,
             pFrame->linesize,
             0,
             is->video_st->codec->height,
             pFrameRGB->data,
             pFrameRGB->linesize
             );
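
    For context, pFrame above is produced by a demux/decode loop that is not shown in this excerpt. A minimal sketch of such a loop, assuming FFmpeg 3.0's avcodec_decode_video2 API and hypothetical fmt_ctx / video_stream_index fields on the same state struct:

    AVPacket packet;
    int      got_frame = 0;

    while (av_read_frame(is->fmt_ctx, &packet) >= 0) {
        if (packet.stream_index == is->video_stream_index) {
            /* decode one compressed packet into pFrame */
            avcodec_decode_video2(is->video_st->codec, pFrame, &got_frame, &packet);
            if (got_frame) {
                /* convert yuv420p -> RGBA into pFrameRGB with the sws_scale
                 * call shown above, then hand pFrameRGB->data[0] to the
                 * texture render event */
            }
        }
        av_packet_unref(&packet);
    }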

    texture render:

    static void UNITY_INTERFACE_API OnRenderEvent(int texID)
    {
       GLuint gltex = (GLuint)(size_t)(texID);

       glBindTexture(GL_TEXTURE_2D, gltex);

       glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, VIEW_WIDTH, VIEW_HEIGHT,
                       GL_RGBA, GL_UNSIGNED_BYTE, pFrameRGB->data[0]);

       glGetError();
       return;
    }

    extern "C" UnityRenderingEvent UNITY_INTERFACE_EXPORT UNITY_INTERFACE_API GetRenderEventFunc()
    {
       return OnRenderEvent;
    }

    In Unity:

    texture created:

       private Texture2D texture;
       private int texID;
       texture = new Texture2D (width, height, TextureFormat.RGBA32, false);
       texture.filterMode = FilterMode.Point;
       texture.Apply ();
       GetComponent<Renderer> ().material.mainTexture = texture;
       texID = texture.GetNativeTexturePtr ().ToInt32();

    update func:

       void Update ()
       {
           GL.IssuePluginEvent(GetRenderEventFunc(), texID);
       }

    Video stream info:

    Input #0, mpegts, from 'udp://0.0.0.0:6666':
     Duration: N/A, start: 2.534467, bitrate: N/A
     Program 1
       Metadata:
         service_name    : Service01
         service_provider: FFmpeg
       Stream #0:0[0x100]: Video: mpeg2video (Main) ([2][0][0][0] / 0x0002), yuv420p(tv), 1280x720 [SAR 1:1 DAR 16:9], max. 104857 kb/s, 29.97 fps, 29.97 tbr, 90k tbn, 59.94 tbc

    Leaving other details aside, my library works fine in the Unity simulator. But when I compiled all my libraries for arm64, used the Xcode project that Unity created to build my app, and ran it, I couldn't get any texture rendered on my iPhone. I checked my network and I'm sure the data had reached my iPhone; the debug log showed that frames were successfully decoded and that the OnRenderEvent function had been called.

    I'm confused and have tried to find an answer on Stack Overflow, but maybe because I'm a beginner I can't find one, so I'm asking for your help.

    FYI:

    Unity 5.3.2f1 Personal

    Xcode 7.2.1

    iOS 9.2.1

    ffmpeg 3.0