
Media (91)

Other articles (9)

  • Encoding and processing into web-friendly formats

    13 April 2011

    MediaSPIP automatically converts uploaded files to internet-compatible formats.
    Video files are encoded in MP4, Ogv and WebM (supported by HTML5), with MP4 also supported by Flash.
    Audio files are encoded in MP3 and Ogg (supported by HTML5), with MP3 also supported by Flash.
    Where possible, text documents are analyzed to extract the data needed for search-engine indexing, and are then exported as a series of image files.
    All uploaded files are stored online in their original format, so you can (...)
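    As an illustration only, the conversion step can be sketched with ffmpeg driven from Python; the codecs, quality settings and file names below are assumptions for the example, not MediaSPIP's actual configuration:

        import subprocess
        from pathlib import Path

        def transcode_for_web(source):
            """Produce web-friendly variants of an uploaded video with ffmpeg
            (illustrative settings only, not MediaSPIP's real ones)."""
            src = Path(source)
            jobs = {
                src.with_suffix(".webm"): ["-c:v", "libvpx-vp9", "-c:a", "libopus"],
                src.with_suffix(".ogv"):  ["-c:v", "libtheora", "-c:a", "libvorbis"],
                src.with_suffix(".mp4"):  ["-c:v", "libx264", "-c:a", "aac"],
            }
            for out, codec_args in jobs.items():
                subprocess.run(["ffmpeg", "-y", "-i", str(src), *codec_args, str(out)],
                               check=True)

        # transcode_for_web("uploads/interview.mov")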

  • Contribute to documentation

    13 April 2011

    Documentation is vital to the development of improved technical capabilities.
    MediaSPIP welcomes documentation by users as well as developers, including: critique of existing features and functions; articles contributed by developers, administrators, content producers and editors; screenshots to illustrate the above; and translations of existing documentation into other languages.
    To contribute, subscribe to the project users’ mailing (...)

  • Keeping control of your media in your hands

    13 April 2011

    The vocabulary used on this site, and around MediaSPIP in general, aims to avoid references to Web 2.0 and the companies that profit from media sharing.
    While using MediaSPIP, you are invited to avoid using words like "Brand", "Cloud" and "Market".
    MediaSPIP is designed to facilitate the sharing of creative media online, while allowing authors to retain complete control of their work.
    MediaSPIP aims to be accessible to as many people as possible and development is based on expanding the (...)

On other sites (2630)

  • Mixed Reality WebRTC without Signalling Server

    25 May 2021, by SilverLife

    I am trying to find a way to use Mixed Reality WebRTC (link to git-repo) without a signalling server.
    In detail, I want to create an SDP file from my ffmpeg video sender and use this SDP description in my Unity project to bypass the signalling process and receive the ffmpeg video stream.
    Is there a way of doing so with Mixed Reality WebRTC? I have already searched for the line of code where the SDP file is created within MR WebRTC, but I didn't find it.

    


    I am relatively new to this topic and I am not sure whether this works at all, but since ffmpeg is not directly compatible with WebRTC, I thought this might be the most promising approach.
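    For completeness, the ffmpeg side of what I have in mind looks roughly like the sketch below (Python is only used to launch ffmpeg; the input, host, port and encoder settings are placeholders). ffmpeg can write the SDP description of an RTP output to a file via -sdp_file, and that file is what I would like to hand to the Unity side instead of going through a signalling server. I realise that such a plain RTP SDP is not a full WebRTC offer (no ICE or DTLS parameters), which may be exactly why it cannot be plugged into MR WebRTC directly:

        import subprocess

        # Sender side only: stream video over RTP with ffmpeg and dump the SDP
        # description to a file instead of exchanging it via a signalling server.
        # Input, host and port are placeholders for illustration.
        def start_rtp_sender(source, host="127.0.0.1", port=5004):
            cmd = [
                "ffmpeg", "-re", "-i", source,
                "-an",                        # video only in this sketch
                "-c:v", "libx264",
                "-sdp_file", "sender.sdp",    # SDP description a receiver would need
                "-f", "rtp", f"rtp://{host}:{port}",
            ]
            return subprocess.Popen(cmd)

        # start_rtp_sender("test.mp4")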

    


  • Preventing playing video files outside the app/website

    5 October 2019, by hretic

    I feel the downvotes are coming! But I'm desperate, so here it goes.

    I have a bunch of videos that I want my users to be able to play only through my mobile app or website (not sure about the platform yet), even when they are downloaded on the client side.

    I've never worked with video files before, but I have a lot of coding experience and know multiple languages (JS, PHP, Python, Node, ...).

    But I don't know where to begin. I'm not looking for code, just someone with experience in this area to point me in the right direction and maybe give me some links to read about the subject and the technologies I need to get familiar with.

    I tried to search around, but most results are about HTML5 players or desktop video players.

    I was thinking maybe I could encode the MP4 files so they wouldn't play in commonly available players, and then create a custom video player to decode and show them?
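    To make that last idea concrete, the simplest variant I can imagine is wrapping each file in symmetric encryption so stock players can't open it, and decrypting it inside my own player right before playback. A minimal sketch, assuming Python and the cryptography package purely for illustration (key storage and delivery, which is the hard part, is not shown):

        from cryptography.fernet import Fernet

        # Sketch: wrap a video file in symmetric encryption so ordinary players
        # cannot open it; only the app holding the key can restore a playable
        # stream. Reads the whole file into memory, which is fine for a sketch.
        def protect(src_path, dst_path, key):
            with open(src_path, "rb") as src, open(dst_path, "wb") as dst:
                dst.write(Fernet(key).encrypt(src.read()))

        def unprotect(src_path, key):
            with open(src_path, "rb") as src:
                return Fernet(key).decrypt(src.read())

        # key = Fernet.generate_key()              # must be kept out of the user's reach
        # protect("movie.mp4", "movie.enc", key)
        # plain_mp4 = unprotect("movie.enc", key)  # bytes handed to a custom player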

    Any suggestion would be appreciated.

  • MPEG-TS, Android and FFMPEG

    31 January 2013, by STeN

    I am receiving MPEG-TS (MPEG transport stream) packets with multiplexed H.264 video and AAC audio streams. I need to be able to play the audio and video on an Android phone. My assumption is that I need:

    • An MPEG-TS de-multiplexer
    • An AAC decoder
    • An H.264 decoder
    • A way to synchronize the audio and video playback

    Assuming I am right, the MPEG-TS de-multiplexer is not part of the OS (in Android 2.x) and must be ported. Both the AAC and H.264 decoders are part of the Android OS, but I am not sure whether they expose an interface that allows passing data in buffers, or whether they allow mutual timing synchronization. In the worst case those components would have to be ported as well.
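    Before porting anything, I suppose the pipeline can first be verified on a desktop machine; the sketch below uses the PyAV bindings to FFmpeg (not available on Android 2.x, used here only to illustrate the de-multiplex/decode/timestamp steps on a captured TS file):

        import av  # PyAV: Python bindings to the FFmpeg libraries

        def inspect_ts(path):
            """De-multiplex an MPEG-TS capture and decode a few frames of each
            elementary stream, printing the presentation timestamps a player
            would use to keep audio and video in sync."""
            container = av.open(path)
            video = container.streams.video[0]   # expected H.264
            audio = container.streams.audio[0]   # expected AAC
            decoded = {"video": 0, "audio": 0}
            for packet in container.demux(video, audio):
                for frame in packet.decode():
                    kind = packet.stream.type
                    if decoded[kind] < 5 and frame.pts is not None:
                        print(kind, "pts =", float(frame.pts * packet.stream.time_base), "s")
                        decoded[kind] += 1
                if all(n >= 5 for n in decoded.values()):
                    break

        # inspect_ts("capture.ts")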

    Can you give me some advice on where to start? I was thinking about porting FFMPEG. Are there any other ways?

    Regards,
    STeN