Media (91)

Other articles (20)

  • What is an editorial?

    21 June 2013, by

    Write your point of view in an article. It will be filed in a section set aside for this purpose.
    An editorial is a text-only article. Its purpose is to collect points of view in a dedicated section. A single editorial is featured on the home page; to consult previous ones, see the dedicated section.
    You can customize the editorial creation form.
    Editorial creation form. In the case of a document of the editorial type, the (...)

  • Contribute to translation

    13 April 2011

    You can help us improve the language used in the software interface to make MediaSPIP more accessible and user-friendly. You can also translate the interface into any language, which allows it to spread to new linguistic communities.
    To do this, we use the SPIP translation interface, where all the language modules of MediaSPIP are available. Just subscribe to the mailing list and request further information on translation.
    MediaSPIP is currently available in French and English (...)

  • List of compatible distributions

    26 April 2011, by

    The table below lists the Linux distributions compatible with the automated installation script of MediaSPIP.

    Distribution name   Version name           Version number
    Debian              Squeeze                6.x.x
    Debian              Wheezy                 7.x.x
    Debian              Jessie                 8.x.x
    Ubuntu              The Precise Pangolin   12.04 LTS
    Ubuntu              The Trusty Tahr        14.04

    If you want to help us improve this list, you can provide us with access to a machine whose distribution is not mentioned above or send the necessary fixes to add (...)

On other sites (4460)

  • ffserver : local OOB write with custom program name

    6 January 2017, by Tobias Stoeckmann
    ffserver : local OOB write with custom program name
    

    When the command line for children is created, it is assumed that
    my_program_name always ends with "ffserver", which doesn’t have to
    be true if ffserver is called through a symbolic link.

    In such a case, it could be that not enough space for "ffmpeg" is
    available at the end, leading to a buffer overflow.

    One example would be:

    $ ln -s /usr/bin/ffserver /f ; /f

    As this is only a local buffer overflow, i.e. is based on a weird
    program call, this has NO security impact.

    Signed-off-by: Michael Niedermayer <michael@niedermayer.cc>

    • [DH] ffserver.c
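    The unsafe assumption described in the commit message can be sketched in Python (a hypothetical simplification for illustration; the actual ffserver code is C, where the failed assumption becomes an out-of-bounds write rather than a wrong string):

```python
import os

def unsafe_ffmpeg_path(my_program_name):
    """Mimics the buggy logic: assume the path always ends with "ffserver"."""
    # For a short symlink name like "/f" the slice is empty, so the prefix
    # is silently lost; in C this assumption led to writing past the buffer.
    return my_program_name[:-len("ffserver")] + "ffmpeg"

def safe_ffmpeg_path(my_program_name):
    """Build the sibling binary's path from the directory instead."""
    return os.path.join(os.path.dirname(my_program_name), "ffmpeg")

print(unsafe_ffmpeg_path("/usr/bin/ffserver"))  # /usr/bin/ffmpeg
print(unsafe_ffmpeg_path("/f"))                 # ffmpeg -- the assumption fails
print(safe_ffmpeg_path("/f"))                   # /ffmpeg
```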
  • FFMPEG on Heroku exceeds memory quota in testing

    5 July 2022, by Patrick Vellia

    After following this tutorial and getting it to work locally in my own development environment, before really getting my hands dirty and working deeper on my own project implementation, I decided to push it up to Heroku to test in a staging environment.

    I had to have Heroku add the FFMPEG buildpack and turn on the Redis server for ActionCable to work.

    I didn't link staging to a cloud storage bucket on Google or Amazon yet; I just allowed uploads to go directly to the dyno's disk for testing, so files land in the storage directory as they would in development for now.

    The test MOV file is 186 MB in size.

    The system uploaded the file fine.

    According to the logs, it then copied the file from storage to tmp, as the tutorial has us do.

    Then it called streamio-ffmpeg's transcode method.

    At this point, Heroku forcibly kills the dyno because it far exceeds the memory quota.

    As this is a test environment, it's only on the free tier of Heroku.

    I'm thinking I won't be able to process video directly on Heroku itself, unless I'm wrong? Would it be better to call an API like Cloud Functions or Amazon Lambda, or spin up a Compute Engine instance just long enough to run the FFMPEG command?
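    Whichever offloading route is chosen, the transcoding step itself reduces to running an ffmpeg command on a worker rather than in the web dyno. A minimal sketch of assembling such a command (the flags and paths are illustrative, not from the tutorial):

```python
import shlex

def build_transcode_cmd(src, dst):
    # Command a worker (Cloud Function, Lambda container, or a short-lived
    # Compute Engine VM) would pass to subprocess.run; flags are illustrative.
    return ["ffmpeg", "-y", "-i", src, "-c:v", "libx264", "-preset", "fast", dst]

cmd = build_transcode_cmd("tmp/input.mov", "tmp/output.mp4")
print(shlex.join(cmd))
```

    Building the command as a list (rather than a shell string) avoids quoting issues when file names contain spaces.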

  • Android : Recording and Streaming at the same time

    23 April 2020, by Bruno Siqueira

    This is not really a question so much as a presentation of all my attempts to solve one of the most challenging functionalities I was faced with.

    I use the libstreaming library to stream real-time video to a Wowza server, and I need to record it to the SD card at the same time. I am presenting all my attempts below in order to collect new ideas from the community.

    Copy bytes from the libstreaming stream to an MP4 file

    Development

    We created an interception in the libstreaming library to copy all the sent bytes to an MP4 file. Libstreaming sends the bytes to the Wowza server through a LocalSocket. It uses MediaRecorder to access the camera and microphone of the device and sets the output file to the LocalSocket's input stream. What we do is create a wrapper around this input stream, extending InputStream, with a file output stream inside it. So, every time libstreaming reads from the LocalSocket's input stream, we copy all the data to the output stream, trying to create a valid MP4 file.
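    The wrapper described above can be sketched in Python (a hypothetical analogue of the Java InputStream wrapper; the class and names are illustrative):

```python
import io

class TeeInputStream:
    """Wraps a readable stream and mirrors every chunk read into a sink."""

    def __init__(self, source, sink):
        self.source = source  # the stream libstreaming would normally read
        self.sink = sink      # file-like object receiving the copied bytes

    def read(self, n=-1):
        data = self.source.read(n)
        if data:
            self.sink.write(data)  # copy the bytes as they pass through
        return data

# Mirror a fake stream into an in-memory "file"
source = io.BytesIO(b"ftypmoovmdat")
copy = io.BytesIO()
tee = TeeInputStream(source, copy)
while tee.read(4):
    pass
```

    After the reader drains the stream, the sink holds an exact byte-for-byte copy of everything that passed through.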

    Impediment

    When we try to read the file, it is corrupted. We realized that metadata is missing from the MP4 file, specifically the moov atom. We tried delaying the closing of the stream to give it time to send this header (this was still guesswork), but it didn't work. To test the coherence of this data, we used a paid software tool to try to recover the video, including the header. It became playable, but it was mostly a green screen, so this proved to be an untrustworthy solution. We also tried "untrunc", a free open-source command-line program, and it couldn't even start the recovery, since there was no moov atom.
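    The missing header can be detected directly: an MP4 file is a sequence of top-level boxes, each starting with a 4-byte big-endian size and a 4-byte type. A minimal sketch (it ignores 64-bit extended sizes, where size == 1):

```python
import struct

def top_level_boxes(data):
    """Yield the four-character type of each top-level MP4 box."""
    offset = 0
    while offset + 8 <= len(data):
        size, box_type = struct.unpack_from(">I4s", data, offset)
        yield box_type.decode("ascii")
        if size < 8:  # size 0 means "extends to end of file"; stop either way
            break
        offset += size

def has_moov(data):
    return "moov" in top_level_boxes(data)

# Synthetic example: the truncated file lacks the moov box
truncated = struct.pack(">I4s", 8, b"ftyp") + struct.pack(">I4s", 8, b"mdat")
complete = truncated + struct.pack(">I4s", 8, b"moov")
```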

    Use FFmpeg compiled for Android to access the camera

    Development

    FFmpeg has a Gradle plugin with a Java interface for use inside Android apps. We thought we could access the camera via the command line (it is probably at "/dev/video0") and send the stream to the media server.

    Impediment

    We got a "Permission denied" error when trying to access the camera. The workaround would be to root the device, but that makes the phones lose their warranty and could brick them.

    Use FFmpeg compiled for Android combined with MediaRecorder

    Development

    We tried to make FFmpeg stream an MP4 file while it was still being recorded on the phone via MediaRecorder.

    Impediment

    FFmpeg cannot stream MP4 files whose recording is not yet finished, since the moov atom is only written when the recording is finalized.

    Use FFmpeg compiled for Android with libstreaming

    Development

    Libstreaming uses a LocalServerSocket as the connection between the app and the server, so we thought we could connect FFmpeg to the LocalServerSocket's local address to copy the stream directly to a local file on the SD card. Right after the streaming started, we also ran the FFmpeg command to start recording the data to a file. Using FFmpeg, we believed it would create the MP4 file properly, that is, with the moov atom header included.

    Impediment

    The "address" created is not readable from the command line as a local address inside the phone, so the copy is not possible.

    Use OpenCV

    Development

    OpenCV is an open-source, cross-platform library that provides building blocks for computer-vision experiments and applications. It offers high-level interfaces for capturing, processing, and presenting image data, and it has its own APIs for connecting to the device camera, so we started studying it to see if it had the necessary functionality to stream and record at the same time.

    Impediment

    We found out that the library is not really designed for this; it is aimed more at mathematical image manipulation. We even got the recommendation to use libstreaming (which we already do).

    Use the Kickflip SDK

    Development

    Kickflip is a media streaming service that provides its own SDK for development on Android and iOS. It also uses HLS instead of RTMP, which is a newer protocol.

    Impediment

    Their SDK requires that we create an Activity with a camera view that occupies the entire screen of the device, breaking the usability of our app.

    Use Adobe AIR

    Development

    We started consulting other developers of apps already available in the Play Store that stream to servers.

    Impediment

    After getting in touch with those developers, they confirmed that it would not be possible to record and stream at the same time using this technology. What's more, we would have to redo the entire app from scratch using Adobe AIR.

    UPDATE

    WebRTC

    Development

    We started using WebRTC, following this great project. We included the signaling server in our Node.js server and started doing the standard handshake via sockets. We were still toggling between local recording and streaming via WebRTC.

    Impediment

    WebRTC does not work in every network configuration. Beyond that, the camera acquisition is all native code, which makes it much harder to copy or intercept the bytes.