Advanced search

Media (1)

Keyword: - Tags -/epub

Other articles (47)

  • Enabling visitor registration

    12 April 2011

    It is also possible to enable visitor registration, which lets anyone open an account on the channel in question on their own, for example as part of open projects.
    To do so, simply go to the site's configuration area and choose the "Gestion des utilisateurs" (user management) submenu. The first form shown corresponds to this feature.
    By default, MediaSPIP created a menu item during its initialization in the top menu of the page, leading (...)

  • HTML5 audio and video support

    10 April 2011

    MediaSPIP uses the HTML5 video and audio tags to play multimedia documents, taking advantage of the latest W3C innovations supported by modern browsers.
    For older browsers, the Flowplayer flash player is used instead.
    The HTML5 player used was created specifically for MediaSPIP: its appearance is fully customizable to match a chosen theme.
    These technologies make it possible to deliver video and sound both on conventional computers (...)

  • HTML5 audio and video support

    13 April 2011

    MediaSPIP uses HTML5 video and audio tags to play multimedia files, taking advantage of the latest W3C innovations supported by modern browsers.
    The MediaSPIP player used has been created specifically for MediaSPIP and can be easily adapted to fit in with a specific theme.
    For older browsers the Flowplayer flash fallback is used.
    MediaSPIP allows for media playback on major mobile platforms with the above (...)

On other sites (6052)

  • How To Install FFMPEG on Elastic Beanstalk

    26 March 2020, by Nick Lynch

    This is not a duplicate; I have found one thread, but it is outdated and does not work:
    Install ffmpeg on elastic beanstalk using ebextensions config.

    I have been trying to install this for some time and nothing seems to work.
    Please share a config.yml that will make this work.

    I am using 64bit Amazon Linux 2016.03 v2.1.6 running PHP 7.0 on Elastic Beanstalk.

    My current file is:

    branch-defaults:
     default:
       environment: Default-Environment
     master:
       environment: Default-Environment
    global:
     application_name: "My First Elastic Beanstalk Application"
     default_ec2_keyname: ~
     default_platform: "64bit Amazon Linux 2016.03 v2.1.6 running PHP 7.0"
     default_region: us-east-1
     profile: eb-cli
     sc: git
    packages: ~
    yum:
     ImageMagick: []
     ImageMagick-devel: []
     commands:
       01-wget:
         command: "wget -O /tmp/ffmpeg.tar.gz http://ffmpeg.gusari.org/static/64bit/ffmpeg.static.64bit.2014-03-05.tar.gz"
       02-mkdir:
         command: "if [ ! -d /opt/ffmpeg ] ; then mkdir -p /opt/ffmpeg; fi"
       03-tar:
         command: "tar -xzf ffmpeg.tar.gz -C /opt/ffmpeg"
         cwd: /tmp
       04-ln:
         command: "if [[ ! -f /usr/bin/ffmpeg ]] ; then ln -s /opt/ffmpeg/ffmpeg /usr/bin/ffmpeg; fi"
       05-ln:
         command: "if [[ ! -f /usr/bin/ffprobe ]] ; then ln -s /opt/ffmpeg/ffprobe /usr/bin/ffprobe; fi"
       06-pecl:
         command: "if [ `pecl list | grep imagick` ] ; then pecl install -f imagick; fi"
  • Encoding of D3D11Texture2D to an rtsp stream using libav*

    1 December 2020, by uzer

    Firstly, I'll declare that I am just beginning with the whole libav* ecosystem and have no experience with DirectX, so please go easy on me.

    I have managed to create an rtsp stream using libav* with a video file as the source. Now I am trying to create an rtsp stream from an ID3D11Texture2D, which I am obtaining from the GDI API using the BitBlt method. Here is my approach for creating the live rtsp stream:

    1. Set input context

       • AVFormatContext* ifmt_ctx = avformat_alloc_context();
       • avformat_open_input(&ifmt_ctx, _videoFileName, 0, 0);

    2. Set output context

       • avformat_alloc_output_context2(&ofmt_ctx, NULL, "rtsp", _rtspServerAdress); //RTSP
       • copy all the codec contexts and streams from input to output (see the sketch after this list)

    3. Start streaming

       • while av_read_frame(ifmt_ctx, &pkt) succeeds, call av_interleaved_write_frame(ofmt_ctx, &pkt);
       • with some timestamp checks and conditions for livestreaming

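    A minimal sketch of the "copy" item in step 2, assuming the ifmt_ctx and ofmt_ctx from the outline; this mirrors the usual libavformat remuxing pattern rather than code from the original post:

    for (unsigned int i = 0; i < ifmt_ctx->nb_streams; i++) {
        AVStream* in_stream  = ifmt_ctx->streams[i];
        AVStream* out_stream = avformat_new_stream(ofmt_ctx, NULL); // new output stream
        if (!out_stream)
            break; // allocation failed
        // copy codec id, dimensions, extradata, ... from input to output
        avcodec_parameters_copy(out_stream->codecpar, in_stream->codecpar);
        out_stream->codecpar->codec_tag = 0; // let the rtsp muxer pick its own tag
    }
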
    Now I am finding it difficult to follow the current libav* documentation (much of which covers deprecated APIs), and there is little tutorial content available online.

    The most relevant article I found on working between DirectX and libav* is this article. However, it actually does the opposite of what I need, and I am not sure how to create the input stream and context from a DirectX texture. How can I convert the texture into an AVFrame that can be encoded into an AVStream?


    Here is a rough outline of what I am expecting:

    ID3D11Texture2D* win_textureptr = WindowManager::Get().GetWindow(RtspStreaming::WindowId())->GetWindowTexture();

    D3D11_TEXTURE2D_DESC desc;
    win_textureptr->GetDesc(&desc);
    int width = desc.Width;
    int height = desc.Height;
    //double audio_time=0.0;
    auto start_time = std::chrono::system_clock::now();
    std::chrono::duration<double> video_time;

    //DirectX BGRA to h264 YUV420p
    SwsContext* conversion_ctx = sws_getContext(
        width, height, AV_PIX_FMT_BGRA,
        width, height, AV_PIX_FMT_YUV420P,
        SWS_BICUBLIN | SWS_BITEXACT, nullptr, nullptr, nullptr);

    uint8_t* sw_data[AV_NUM_DATA_POINTERS];
    int sw_linesize[AV_NUM_DATA_POINTERS];

    while (RtspStreaming::IsStreaming())
    {
        //copy the texture
        //win_textureptr->GetPrivateData();

        // convert BGRA to yuv420 pixel format
        /*
        frame = av_frame_alloc();
        //this obviously is incorrect... I would like to use d3d11 texture here instead of frame
        sws_scale(conversion_ctx, frame->data, frame->linesize, 0, frame->height,
            sw_data, sw_linesize);

        frame->format = AV_PIX_FMT_YUV420P;
        frame->width = width;
        frame->height = height;*/

        //encode to the video stream

        /* Compute current audio and video time. */
        video_time = std::chrono::system_clock::now() - start_time;

        //write frame and send
        av_interleaved_write_frame(ofmt_ctx, &pkt);

        av_frame_unref(frame);
    }

    av_write_trailer(ofmt_ctx);

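    One commonly used CPU-side route for the missing piece, not taken from the original post: copy the GPU texture into a CPU-readable staging texture, map it, and hand the mapped BGRA rows to sws_scale to fill a YUV420P AVFrame. A minimal sketch, assuming the D3D11 device and immediate context used for capture are accessible:

    extern "C" {
    #include <libavutil/frame.h>
    #include <libavutil/pixfmt.h>
    #include <libswscale/swscale.h>
    }
    #include <d3d11.h>

    AVFrame* TextureToFrame(ID3D11Device* device, ID3D11DeviceContext* context,
                            ID3D11Texture2D* gpu_tex, SwsContext* conversion_ctx,
                            int width, int height)
    {
        // 1. Describe a staging copy of the texture that the CPU may read.
        D3D11_TEXTURE2D_DESC desc;
        gpu_tex->GetDesc(&desc);
        desc.Usage = D3D11_USAGE_STAGING;
        desc.CPUAccessFlags = D3D11_CPU_ACCESS_READ;
        desc.BindFlags = 0;
        desc.MiscFlags = 0;
        ID3D11Texture2D* staging = nullptr;
        if (FAILED(device->CreateTexture2D(&desc, nullptr, &staging)))
            return nullptr;

        // 2. GPU -> staging copy, then map the staging texture for reading.
        context->CopyResource(staging, gpu_tex);
        D3D11_MAPPED_SUBRESOURCE mapped;
        if (FAILED(context->Map(staging, 0, D3D11_MAP_READ, 0, &mapped))) {
            staging->Release();
            return nullptr;
        }

        // 3. Allocate the destination frame and convert BGRA -> YUV420P.
        AVFrame* frame = av_frame_alloc();
        frame->format = AV_PIX_FMT_YUV420P;
        frame->width = width;
        frame->height = height;
        av_frame_get_buffer(frame, 0);

        const uint8_t* src_data[1] = { static_cast<const uint8_t*>(mapped.pData) };
        const int src_linesize[1] = { static_cast<int>(mapped.RowPitch) }; // bytes per row, incl. padding
        sws_scale(conversion_ctx, src_data, src_linesize, 0, height,
                  frame->data, frame->linesize);

        context->Unmap(staging, 0);
        staging->Release();
        return frame; // caller sets frame->pts, encodes it, then av_frame_free()s it
    }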

  • Cannot create SDL_Window giving SDL_Main.h error even though included

    27 February 2015, by user2270995

    I am trying to play a network stream using FFMPEG (fetching, decoding, etc.) and trying to render it using SDL.

    When I run my application it starts normally, but as soon as I call OpenFile(), which contains the code for opening the network stream and then creating the SDL_Window and SDL_Renderer, SDL_CreateWindow() fails with the error:

    Window not created: [Application didn't initialize properly, did you include SDL_main.h in the file containing your main() function?].

    Even though I have included SDL_main.h.

    #include <SDL.h>
    #include <SDL_main.h>
    ..... //other header files

    #define  LOGI(...)  __android_log_print(ANDROID_LOG_INFO,LOG_TAG,__VA_ARGS__)
    #define  LOGE(...)  __android_log_print(ANDROID_LOG_ERROR,LOG_TAG,__VA_ARGS__)

    SDL_Window *window;
    SDL_Renderer *renderer;

    int main(int argc, char *argv[])
    {
       if(SDL_Init(SDL_INIT_VIDEO | SDL_INIT_AUDIO | SDL_INIT_TIMER))
       {
         LOGE("Could not initialize SDL - %s\n", SDL_GetError());
         exit(1);
       }
       return 0;
    }

    void Java_com_my_app_MainActivity_openFile(JNIEnv * env, jobject this, jstring url)
    {
       ......
       ...... //FFMPEG code

     //---------------------------SDL part -------------------------//

     window = SDL_CreateWindow("Window", SDL_WINDOWPOS_UNDEFINED,
              SDL_WINDOWPOS_UNDEFINED, 0, 0,
              SDL_WINDOW_SHOWN | SDL_WINDOW_FULLSCREEN);

      if(window == NULL)
      {
          LOGE("Window not created: [%s]", SDL_GetError());
          return;
      }
      LOGE("Window created");

     renderer = SDL_CreateRenderer(window,-1,SDL_RENDERER_ACCELERATED | SDL_RENDERER_TARGETTEXTURE);
      if(renderer == NULL)
      {
          LOGE("renderer not created: [%s]", SDL_GetError());
          return;
      }
      LOGE("Rendering Created...");
    }

    Here is code from another project I tried (following this tutorial), in which I do not call into SDL from my MainActivity, and it works fine:

      #include
      #include
      #include

      #include "SDL.h"

    typedef struct Sprite
    {
      SDL_Texture* texture;
      Uint16 w;
      Uint16 h;
    } Sprite;

    Sprite LoadSprite(const char* file, SDL_Renderer* renderer)
    {
       Sprite result;
       result.texture = NULL;
       result.w = 0;
       result.h = 0;

       SDL_Surface* temp;

       /* Load the sprite image */
       temp = SDL_LoadBMP(file);
       if (temp == NULL)
       {
           fprintf(stderr, "Couldn't load %s: %s\n", file, SDL_GetError());
           return result;
       }
       result.w = temp->w;
       result.h = temp->h;

       /* Create texture from the image */
       result.texture = SDL_CreateTextureFromSurface(renderer, temp);
       if (!result.texture) {
           fprintf(stderr, "Couldn't create texture: %s\n", SDL_GetError());
           SDL_FreeSurface(temp);
           return result;
       }
       SDL_FreeSurface(temp);

       return result;
    }

    void draw(SDL_Window* window, SDL_Renderer* renderer, const Sprite sprite)
    {
       int w, h;
       SDL_GetWindowSize(window, &w, &h);
       SDL_Rect destRect = {w/2 - sprite.w/2, h/2 - sprite.h/2, sprite.w, sprite.h};
       /* Blit the sprite onto the screen */
       SDL_RenderCopy(renderer, sprite.texture, NULL, &destRect);
    }

    int main(int argc, char *argv[])
    {
       if(SDL_Init(SDL_INIT_VIDEO | SDL_INIT_AUDIO | SDL_INIT_TIMER))
       {
            fprintf(stderr, "Could not initialize SDL - %s\n", SDL_GetError());
            exit(1);
       }

       SDL_Window *window;
       SDL_Renderer *renderer;

       if(SDL_CreateWindowAndRenderer(0, 0, 0, &window, &renderer) < 0)
           exit(2);

       Sprite sprite = LoadSprite("image.bmp", renderer);
       if(sprite.texture == NULL)
           exit(2);

       /* Main render loop */
       Uint8 done = 0;
       SDL_Event event;
       while(!done)
       {
           /* Check for events */
           while(SDL_PollEvent(&event))
           {
               if(event.type == SDL_QUIT || event.type == SDL_KEYDOWN) //|| event.type == SDL_FINGERDOWN)
               {
                   done = 1;
               }
           }

           /* Draw a gray background */
           SDL_SetRenderDrawColor(renderer, 0xA0, 0xA0, 0xA0, 0xFF);
           SDL_RenderClear(renderer);

           draw(window, renderer, sprite);

           /* Update the screen! */
           SDL_RenderPresent(renderer);

           SDL_Delay(10);
       }

       exit(0);
    }

    Note: I created the first project (the one giving the error) from this fine, working project.

    I have tried removing SDL_main.h and using #undef main, but none of it works. Now it has started giving me an error saying 'java.lang.UnsatisfiedLinkError: Cannot load library: soinfo_relocate(linker.cpp:975): cannot locate symbol "SDL_main" referenced by "libmain.so"...

    My Android.mk

    LOCAL_PATH := $(call my-dir)

    include $(CLEAR_VARS)
    LOCAL_MODULE := ffmpeg
    LOCAL_SRC_FILES := $(TARGET_ARCH_ABI)/libffmpeg.so
    LOCAL_EXPORT_C_INCLUDES := $(LOCAL_PATH)/$(TARGET_ARCH_ABI)/include
    include $(PREBUILT_SHARED_LIBRARY)


    include $(CLEAR_VARS)
    SDL_PATH := ../SDL
    MY_FILES_PATH := ../src
    LOCAL_MODULE    := main
    # Add your application source files here...
    LOCAL_SRC_FILES := $(SDL_PATH)/src/main/android/SDL_android_main.c \
                      $(patsubst $(LOCAL_PATH)/%, %, $(wildcard $(LOCAL_PATH)/src/*.c))

    LOCAL_C_INCLUDES := $(LOCAL_PATH)/$(SDL_PATH)/include
    LOCAL_LDLIBS := -L$(SYSROOT)/usr/lib -ljnigraphics -lGLESv1_CM -lGLESv2 -llog -lz -lm
    LOCAL_ALLOW_UNDEFINED_SYMBOLS := true
    LOCAL_SHARED_LIBRARIES := SDL2
    LOCAL_SHARED_LIBRARIES += ffmpeg
    include $(BUILD_SHARED_LIBRARY)
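
    For context, and not from the original post: SDL_main.h renames main to SDL_main, and the Android glue in SDL_android_main.c expects to find that symbol, which is why #undef main produces the unresolved "SDL_main" error above. When SDL's entry point is bypassed on purpose, SDL2 provides SDL_MAIN_HANDLED and SDL_SetMainReady(); a minimal desktop-style sketch of that mechanism (on Android, the SDLActivity path still requires an SDL_main symbol to exist):

    /* Hedged sketch of SDL2's opt-out mechanism, not the poster's code. */
    #define SDL_MAIN_HANDLED    /* must appear before the SDL headers */
    #include <SDL.h>

    int main(int argc, char *argv[])
    {
        SDL_SetMainReady();     /* we take over SDL_main's initialization duty */
        if (SDL_Init(SDL_INIT_VIDEO) != 0) {
            return 1;
        }
        /* ... create the window and renderer here ... */
        SDL_Quit();
        return 0;
    }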