Advanced search

Media (1)

Word: - Tags -/ogv

Other articles (53)

  • Customize by adding your logo, banner, or background image

    5 September 2013, by

    Some themes support three customization elements: adding a logo; adding a banner; adding a background image.

  • Writing a news item

    21 June 2013, by

    Present changes to your MediaSPIP, or news about your projects, on your MediaSPIP using the news section.
    In MediaSPIP's default theme, spipeo, news items are displayed at the bottom of the main page, below the editorials.
    You can customize the news-item creation form.
    News-item creation form: for a document of type "news item", the default fields are: Publication date (customize the publication date) (...)

  • Publishing on MédiaSpip

    13 June 2013

    Can I post content from an iPad tablet?
    Yes, if your MédiaSpip installation is at version 0.2 or higher. If needed, contact your MédiaSpip administrator to find out.

On other sites (4528)

  • FFMPEG - Non-monotonous DTS in output stream 0:0

    6 November 2015, by Tayax

    I’m trying to save an online input stream from an m3u8 playlist to an MP4 file.

    Here is my command:

    /usr/bin/ffmpeg -y -i '.$stream.' -t 20  \
    -vcodec copy -s 640x480 \
    -acodec copy  -bsf:a aac_adtstoasc \
    -f mp4 '.$filename.'.mp4 > block.txt 2>&1 &

    And here is my output file:

    ffmpeg version 2.6.3 Copyright (c) 2000-2015 the FFmpeg developers
     built with gcc 4.8.3 (GCC) 20140911 (Red Hat 4.8.3-9)
     configuration: --prefix=/usr --bindir=/usr/bin --datadir=/usr/share/ffmpeg --incdir=/usr/include/ffmpeg --libdir=/usr/lib64 --mandir=/usr/share/man --arch=x86_64 --optflags='-O2 -g -pipe -Wall -Wp,-D_FORTIFY_SOURCE=2 -fexceptions -fstack-protector-strong --param=ssp-buffer-size=4 -grecord-gcc-switches -m64 -mtune=generic' --enable-bzlib --disable-crystalhd --enable-gnutls --enable-ladspa --enable-libass --enable-libcdio --enable-libdc1394 --disable-indev=jack --enable-libfreetype --enable-libgsm --enable-libmp3lame --enable-openal --enable-libopenjpeg --enable-libopus --enable-libpulse --enable-libschroedinger --enable-libsoxr --enable-libspeex --enable-libtheora --enable-libvorbis --enable-libv4l2 --enable-libvpx --enable-libx264 --enable-libx265 --enable-libxvid --enable-x11grab --enable-avfilter --enable-avresample --enable-postproc --enable-pthreads --disable-static --enable-shared --enable-gpl --disable-debug --disable-stripping --shlibdir=/usr/lib64 --enable-runtime-cpudetect
     libavutil      54. 20.100 / 54. 20.100
     libavcodec     56. 26.100 / 56. 26.100
     libavformat    56. 25.101 / 56. 25.101
     libavdevice    56.  4.100 / 56.  4.100
     libavfilter     5. 11.102 /  5. 11.102
     libavresample   2.  1.  0 /  2.  1.  0
     libswscale      3.  1.101 /  3.  1.101
     libswresample   1.  1.100 /  1.  1.100
     libpostproc    53.  3.100 / 53.  3.100
    Input #0, hls,applehttp, from 'playlist.m3u8':
     Duration: N/A, start: 2375.973000, bitrate: N/A
     Program 0
       Metadata:
         variant_bitrate : 613420
       Stream #0:0: Data: timed_id3 (ID3  / 0x20334449)
       Metadata:
         variant_bitrate : 613420
       Stream #0:1: Video: h264 (Main) ([27][0][0][0] / 0x001B), yuv420p(tv), 640x480 [SAR 1:1 DAR 4:3], 30 fps, 30 tbr, 90k tbn, 60 tbc
       Metadata:
         variant_bitrate : 613420
       Stream #0:2: Audio: aac (LC) ([15][0][0][0] / 0x000F), 16000 Hz, mono, fltp, 53 kb/s
       Metadata:
         variant_bitrate : 613420
    Output #0, mp4, to 'output.mp4':
     Metadata:
       encoder         : Lavf56.25.101
       Stream #0:0: Video: h264 ([33][0][0][0] / 0x0021), yuv420p, 640x480 [SAR 1:1 DAR 4:3], q=2-31, 30 fps, 30 tbr, 90k tbn, 90k tbc
       Metadata:
         variant_bitrate : 613420
       Stream #0:1: Audio: aac ([64][0][0][0] / 0x0040), 16000 Hz, mono, 53 kb/s
       Metadata:
         variant_bitrate : 613420
    Stream mapping:
     Stream #0:1 -> #0:0 (copy)
     Stream #0:2 -> #0:1 (copy)
    Press [q] to stop, [?] for help
    [mp4 @ 0x18e7700] Non-monotonous DTS in output stream 0:0; previous: 168300, current: 168300; changing to 168301. This may result in incorrect timestamps in the output file.
    frame=   53 fps=0.0 q=-1.0 size=     163kB time=00:00:02.75 bitrate= 485.9kbits/s    
    frame=   79 fps= 74 q=-1.0 size=     294kB time=00:00:04.28 bitrate= 560.7kbits/s    
    [mp4 @ 0x18e7700] Non-monotonous DTS in output stream 0:0; previous: 400320, current: 400320; changing to 400321. This may result in incorrect timestamps in the output file.
    [mp4 @ 0x18e7700] Non-monotonous DTS in output stream 0:0; previous: 416970, current: 416970; changing to 416971. This may result in incorrect timestamps in the output file.
    frame=  131 fps= 81 q=-1.0 size=     466kB time=00:00:07.10 bitrate= 537.4kbits/s    
    frame=  150 fps= 67 q=-1.0 size=     506kB time=00:00:08.12 bitrate= 510.2kbits/s    
    frame=  163 fps= 59 q=-1.0 size=     549kB time=00:00:08.89 bitrate= 505.4kbits/s    
    frame=  177 fps= 54 q=-1.0 size=     621kB time=00:00:09.79 bitrate= 519.4kbits/s    
    [mp4 @ 0x18e7700] Non-monotonous DTS in output stream 0:0; previous: 902160, current: 902160; changing to 902161. This may result in incorrect timestamps in the output file.
    frame=  206 fps= 54 q=-1.0 size=     732kB time=00:00:11.52 bitrate= 520.8kbits/s    
    frame=  236 fps= 55 q=-1.0 size=     869kB time=00:00:13.44 bitrate= 529.7kbits/s    
    frame=  276 fps= 57 q=-1.0 size=    1036kB time=00:00:15.74 bitrate= 539.0kbits/s    
    [mp4 @ 0x18e7700] Non-monotonous DTS in output stream 0:0; previous: 1433880, current: 1433880; changing to 1433881. This may result in incorrect timestamps in the output file.
    frame=  284 fps= 36 q=-1.0 size=    1079kB time=00:00:16.25 bitrate= 543.7kbits/s    
    [mp4 @ 0x18e7700] Non-monotonous DTS in output stream 0:0; previous: 1454490, current: 1454490; changing to 1454491. This may result in incorrect timestamps in the output file.
    frame=  295 fps= 35 q=-1.0 size=    1129kB time=00:00:16.70 bitrate= 553.9kbits/s    
    frame=  317 fps= 35 q=-1.0 size=    1222kB time=00:00:18.04 bitrate= 554.8kbits/s    
    [mp4 @ 0x18e7700] Non-monotonous DTS in output stream 0:0; previous: 1696050, current: 1696050; changing to 1696051. This may result in incorrect timestamps in the output file.
    frame=  347 fps= 36 q=-1.0 size=    1342kB time=00:00:19.39 bitrate= 566.9kbits/s    
    frame=  361 fps= 37 q=-1.0 Lsize=    1400kB time=00:00:20.03 bitrate= 572.5kbits/s    
    video:1274kB audio:115kB subtitle:0kB other streams:0kB global headers:0kB muxing overhead: 0.784608%

    As you can see, it seems that some frames are dropped during the recording, and I have no idea how to correct this in order to get a clean file at the end.

    Maybe it's a settings problem? But I can't really pinpoint what's wrong.
    I tried playing with the buffer and the framerate, but nothing changed.

    Any help would be appreciated.
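
    One commonly suggested variant (untested here, and not from the original post): with -vcodec copy, the -s 640x480 option has no effect, since scaling would require re-encoding; dropping it and asking the demuxer to regenerate timestamps with -fflags +genpts sometimes quiets non-monotonic DTS from HLS inputs. A sketch, keeping the same '.$stream.' and '.$filename.' placeholders:

    # -fflags +genpts (input option) regenerates missing/odd timestamps;
    # -s is dropped because it is meaningless when stream-copying.
    /usr/bin/ffmpeg -y -fflags +genpts -i '.$stream.' -t 20 \
    -vcodec copy \
    -acodec copy -bsf:a aac_adtstoasc \
    -f mp4 '.$filename.'.mp4 > block.txt 2>&1 &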

  • Read multiple frames in demuxer

    22 December 2024, by Aycon

    I use FFmpeg.AutoGen, a C# wrapper around the FFmpeg (7.0) C libraries, for reading media files and generating a stream of frames for another application.
    I want to get n frames and hold the pointers in memory indefinitely.
    However, I am completely confused trying to figure out which APIs are deprecated and how I can tell FFmpeg to keep the frames in memory until I tell it to release them.

    I don't want to copy a frame after receiving it if I can avoid it.

    I have tried many approaches.
    My last attempt was to receive the frames using the ffmpeg.avcodec_send_packet(), ffmpeg.av_read_frame() and ffmpeg.avcodec_receive_frame() functions, as specified in the current manual.

    My code fragment for reading frames:

    


    using Core.Backends.FFmpeg.Helpers;
    using Core.Backends.FFmpeg.UnsafeWrappers;
    using Core.Enums;
    using Core.Interfaces;
    using FFmpeg.AutoGen.Abstractions;
    using System.Diagnostics;
    using System.Drawing;

    namespace Core.Backends.FFmpeg.Internal;

    internal class Video : IVideo
    {
      private readonly AVFormatHandler p_format;
      private readonly AVCodecHandler p_codec;
      private readonly AVPacketWrapper p_packet;
      private readonly FramesPool p_framesPool;
      private readonly FramesPool p_framesBufferPool;
      private bool p_disposedValue;

      public Video(VideoMetadata _videoMetadata, AVFormatHandler _format, AVCodecHandler _codec, int _bufferizedFramesCount = 1)
      {
        Duration = _videoMetadata.Duration;
        FrameRate = _videoMetadata.FrameRate;
        FrameSize = _videoMetadata.FrameSize;
        PixelFormat = _videoMetadata.PixelFormat;
        SelectedStreamID = _videoMetadata.SelectedStreamID;
        p_format = _format;
        p_codec = _codec;
        p_packet = new AVPacketWrapper();
        var frame = new AVFrameWrapper(p_format, p_packet);
        p_framesPool = new(frame, _bufferizedFramesCount);
        p_framesBufferPool = new(frame, _bufferizedFramesCount);
      }

      /// <inheritdoc/>
      public long Duration { get; init; }

      /// <inheritdoc/>
      public (int num, int den) FrameRate { get; init; }

      /// <inheritdoc/>
      public Size FrameSize { get; init; }

      /// <inheritdoc/>
      public PixelFormat PixelFormat { get; init; }

      private int SelectedStreamID { get; init; }

      private unsafe int SendPacket(AVPacketWrapper? _packet)
      {
        if (_packet == null)
          return ffmpeg.avcodec_send_packet(p_codec.AVCodecContextPointer, null);

        return ffmpeg.avcodec_send_packet(p_codec.AVCodecContextPointer, _packet.AVPacketPointer);
      }

      private unsafe bool IsSelectedStream(AVPacketWrapper _packet)
      {
        return _packet.AVPacketPointer->stream_index == SelectedStreamID;
      }

      private unsafe int ReadFrame(AVPacketWrapper _packet)
      {
        return ffmpeg.av_read_frame(p_format.AVFormatPointer, _packet.AVPacketPointer);
      }

      private static unsafe void UnrefPacket(AVPacketWrapper _packet) => ffmpeg.av_packet_unref(_packet.AVPacketPointer);

      private IEnumerable<int> ReadToSelectedStream(AVPacketWrapper _packet)
      {
        do
        {
          UnrefPacket(p_packet);
          yield return ReadFrame(_packet);
        } while (!IsSelectedStream(_packet));
      }

      private unsafe void FlushBuffers() => ffmpeg.avcodec_flush_buffers(p_codec.AVCodecContextPointer);

      private IEnumerable<AVPacketWrapper> GetNextPacketPrivate()
      {
        try
        {
          while (true)
          {
            foreach (int errorCodeRead in ReadToSelectedStream(p_packet))
            {
              if (errorCodeRead == ffmpeg.AVERROR_EOF)
                break;

              errorCodeRead.ThrowInvalidOperationExceptionIfError();
            }

            int errorCodeSend = SendPacket(p_packet);

            if (errorCodeSend == ffmpeg.AVERROR(ffmpeg.EAGAIN))
            {
              yield return p_packet;
              continue;
            }

            if (errorCodeSend == ffmpeg.AVERROR_EOF)
            {
              yield return p_packet;
              break;
            }

            errorCodeSend.ThrowInvalidOperationExceptionIfError();

            yield return p_packet;
          }

          // Last iteration special case handling
          int errorCodeSendLast = SendPacket(null);

          if (errorCodeSendLast != ffmpeg.AVERROR_EOF)
            errorCodeSendLast.ThrowInvalidOperationExceptionIfError();

          yield return p_packet;
        }
        finally
        {
          UnrefPacket(p_packet);
          FlushBuffers();
        }
      }

      private unsafe int ReceiveFrame(AVFrameWrapper _frame)
      {
        return ffmpeg.avcodec_receive_frame(p_codec.AVCodecContextPointer, _frame.AVFramePointer);
      }

      private unsafe AVFrameWrapper HWFrameCopyIfRequired(AVCodecHandler _codec, AVFrameWrapper _frame, AVFrameWrapper _buffer)
      {
        if (_codec.AVCodecContextPointer->hw_device_ctx != null)
        {
          int errorCode = ffmpeg.av_hwframe_transfer_data(_buffer.AVFramePointer, _frame.AVFramePointer, flags: 0);
          errorCode.ThrowInvalidOperationExceptionIfError();
          return _buffer;
        }

        return _frame;
      }

      private IEnumerable<AVFrameWrapper> GetNextFramePrivate(AVFrameWrapper _fresh_frame, AVFrameWrapper _fresh_frameBuffer)
      {
        int readCode;

        while (true)
        {
          readCode = ReceiveFrame(_fresh_frame);

          if (readCode == ffmpeg.AVERROR(ffmpeg.EAGAIN) || readCode == ffmpeg.AVERROR_EOF)
            yield break;

          readCode.ThrowInvalidOperationExceptionIfError();

          yield return HWFrameCopyIfRequired(p_codec, _fresh_frame, _fresh_frameBuffer);
        }
      }

      private static void RefreshFrames
      (
        IEnumerator<AVFrameWrapper> _framesEnumerator,
        IEnumerator<AVFrameWrapper> _framesBufferEnumerator,
        out AVFrameWrapper _frame,
        out AVFrameWrapper _frameBuffer
      )
      {
        // Catch fresh frame from pool
        Debug.Assert(_framesEnumerator.MoveNext(), "The frame pool must never stop yielding frames.");
        _frame = _framesEnumerator.Current;

        // Catch fresh frame buffer from pool
        Debug.Assert(_framesBufferEnumerator.MoveNext(), "The frame pool must never stop yielding frames.");
        _frameBuffer = _framesBufferEnumerator.Current;
      }

      /// <inheritdoc/>
      public IEnumerable<AVFrameWrapper> GetNextFrame()
      {
        IEnumerator<AVFrameWrapper> framesEnumerator = p_framesPool.GetNextFrame().GetEnumerator();
        IEnumerator<AVFrameWrapper> framesBufferEnumerator = p_framesBufferPool.GetNextFrame().GetEnumerator();
        RefreshFrames(framesEnumerator, framesBufferEnumerator, out AVFrameWrapper fresh_frame, out AVFrameWrapper fresh_frameBuffer);
        foreach (var packet in GetNextPacketPrivate())
          foreach (var frame in GetNextFramePrivate(fresh_frame, fresh_frameBuffer))
          {
            yield return frame;
            RefreshFrames(framesEnumerator, framesBufferEnumerator, out fresh_frame, out fresh_frameBuffer);
          }
      }

      protected virtual void Dispose(bool disposing)
      {
        if (!p_disposedValue)
        {
          if (disposing)
          {
          }

          p_packet.Dispose();
          p_framesPool.Flush();
          p_framesBufferPool.Flush();

          p_disposedValue = true;
        }
      }

      ~Video()
      {
        Dispose(disposing: false);
      }

      public void Dispose()
      {
        Dispose(disposing: true);
        GC.SuppressFinalize(this);
      }
    }


    My FramesPool class:


    using Core.Backends.FFmpeg.UnsafeWrappers;
    using FFmpeg.AutoGen.Abstractions;

    namespace Core.Backends.FFmpeg.Internal;

    internal class FramesPool
    {
      private readonly AVFrameWrapper p_frameWrapper;
      private readonly Queue<AVFrameWrapper> p_frames;
      private readonly int p_count;

      public FramesPool(AVFrameWrapper _initframeWrapper, int _count = 1)
      {
        p_frameWrapper = _initframeWrapper;
        p_frames = new(_count);
        p_frames.Enqueue(p_frameWrapper);
        p_count = _count;
      }

      private static unsafe void UnrefFrame(AVFrameWrapper _frame) => ffmpeg.av_frame_unref(_frame.AVFramePointer);

      public IEnumerable<AVFrameWrapper> GetNextFrame()
      {
        // First frame case
        UnrefFrame(p_frameWrapper);
        yield return p_frameWrapper;

        while (true)
        {
          if (p_frames.Count < p_count)
          {
            var new_frame = p_frameWrapper.Clone();
            p_frames.Enqueue(new_frame);
            yield return new_frame;
          }
          else
          {
            var frame = p_frames.Dequeue();
            UnrefFrame(frame);
            yield return frame;
            p_frames.Enqueue(frame);
          }
        }
      }

      public void Flush()
      {
        foreach (var frame in p_frames)
        {
          UnrefFrame(frame);
          frame.Dispose();
        }

        p_frames.Clear();
      }
    }


    Additional calls, among others:


    ffmpeg.avformat_alloc_context();
    ffmpeg.avformat_open_input(pptr, p_filePath, null, null);
    ffmpeg.av_hwdevice_ctx_create(&p_avCodecHandler!.AVCodecContextPointer->hw_device_ctx, strongDevice, null, null, 0);
    ffmpeg.av_find_best_stream(/*args*/);
    ffmpeg.avcodec_alloc_context3(p_avCodec);
    ffmpeg.avcodec_parameters_to_context(/*args*/);
    ffmpeg.avcodec_open2(/*args*/);


    The call ffmpeg.av_hwframe_transfer_data(_buffer.AVFramePointer, _frame.AVFramePointer, flags: 0) returns -22 (message: "Invalid argument").


    Please, help me)
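
    An aside that may help narrow down the -22: av_hwframe_transfer_data() returns EINVAL when neither frame is actually a hardware frame, and it expects the destination to be a "clean" frame so it can allocate buffers and choose a transferable software format itself. A minimal sketch of that pattern, assuming the FFmpeg.AutoGen bindings used above (HwTransferSketch and DownloadIfHardware are illustrative names, not part of the post):

    using System;
    using FFmpeg.AutoGen.Abstractions;

    internal static unsafe class HwTransferSketch
    {
        // Returns a software frame: either src itself (software decoding)
        // or dst, filled by downloading the hardware surface.
        public static AVFrame* DownloadIfHardware(AVFrame* src, AVFrame* dst)
        {
            // A null hw_frames_ctx means the decoder produced a software frame
            // (possibly after silently falling back from hardware decoding);
            // calling av_hwframe_transfer_data on it fails with EINVAL (-22).
            if (src->hw_frames_ctx == null)
                return src;

            // The destination must be "clean": unref drops any old buffers and
            // resets the format so av_hwframe_transfer_data can allocate and
            // pick a transferable software pixel format itself.
            ffmpeg.av_frame_unref(dst);

            int err = ffmpeg.av_hwframe_transfer_data(dst, src, 0);
            if (err < 0)
                throw new InvalidOperationException($"av_hwframe_transfer_data failed: {err}");

            return dst;
        }
    }

    Note that HWFrameCopyIfRequired above keys off hw_device_ctx on the codec context; checking hw_frames_ctx on the received frame instead distinguishes a real hardware surface from a silent software fallback.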


  • How to stream synchronized video and audio in real-time from an Android smartphone using HLS while preserving orientation metadata?

    6 March, by Jérôme LAROSE

    Hello,
    I am working on an Android application where I need to stream video from one or two cameras on my smartphone, along with audio from the microphone, in real time via a link or web page accessible to users. The stream should be live, allow rewinding (DVR functionality), and be recorded simultaneously. A latency of 1 to 2 minutes is acceptable, and the streaming is one-way.

    I have chosen HLS (HTTP Live Streaming) for its browser compatibility and DVR support. However, I am encountering issues with audio-video synchronization, managing camera orientation metadata, and format conversions.

    Here are my attempts:

    1. MP4 segmentation with MediaRecorder

       • I used MediaRecorder with setNextOutputFile to generate short MP4 segments, then ffmpeg-kit to convert them to fMP4 for HLS (see the repackaging sketch after this list).
       • Expected: Well-aligned segments for smooth HLS playback.
       • Result: Timestamp issues causing jumps or interruptions in playback.

    2. MPEG2-TS via local socket

       • I configured MediaRecorder to produce an MPEG2-TS stream sent via a local socket to ffmpeg-kit.
       • Expected: Stable streaming with preserved metadata.
       • Result: Streaming works, but orientation metadata is lost, leading to incorrectly oriented video (e.g., rotated 90°).

    3. Orientation correction with ffmpeg

       • I tested -vf transpose=1 in ffmpeg to correct the orientation.
       • Expected: Correctly oriented video without excessive latency.
       • Result: Re-encoding takes too long for real-time streaming, causing unacceptable latency.

    4. MPEG2-TS to fMP4 conversion

       • I converted the MPEG2-TS stream to fMP4 with ffmpeg to preserve orientation.
       • Expected: Perfect audio-video synchronization.
       • Result: Slight desynchronization between audio and video, affecting the user experience.
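
    For attempt 1, the MP4-to-fMP4 repackaging step itself needs no re-encoding. A sketch of the kind of ffmpeg-kit command this involves (file names are illustrative, not from the original post):

    # Repackage an MP4 segment as fMP4 HLS without re-encoding (-c copy).
    ffmpeg -i segment_000.mp4 -c copy \
      -f hls -hls_time 5 -hls_segment_type fmp4 \
      -hls_fmp4_init_filename init.mp4 \
      -hls_segment_filename 'segment_%03d.m4s' \
      playlist.m3u8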

    I am looking for a solution to:

    • Stream an HLS feed from Android with correctly timestamped segments.
    • Preserve orientation metadata without heavy re-encoding (see the sketch after this list).
    • Ensure perfect audio-video synchronization.
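
    On the second point, MP4-family containers can carry rotation as display-matrix side data instead of baking it into the pixels, which keeps stream copy (and therefore low latency) possible. A hedged sketch, not from the original post, assuming a sufficiently recent ffmpeg build (the -display_rotation input option appeared in ffmpeg 6.0; older builds used -metadata:s:v:0 rotate=... on the output), with the angle supplied from the app's currentRotation:

    # Stamp the rotation as side data on the input stream, then stream-copy;
    # players rotate at display time from the display matrix in the fMP4 output.
    ffmpeg -display_rotation 90 -i tcp://localhost:8080 -c copy \
      -f hls -hls_time 5 -hls_segment_type fmp4 \
      playlist.m3u8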

    UPDATE


    package com.example.angegardien

    import android.Manifest
    import android.content.Context
    import android.content.pm.PackageManager
    import android.graphics.SurfaceTexture
    import android.hardware.camera2.*
    import android.media.*
    import android.os.*
    import android.util.Log
    import android.view.Surface
    import android.view.TextureView
    import android.view.WindowManager
    import androidx.activity.ComponentActivity
    import androidx.core.app.ActivityCompat
    import com.arthenica.ffmpegkit.FFmpegKit
    import fi.iki.elonen.NanoHTTPD
    import kotlinx.coroutines.*
    import java.io.File
    import java.io.IOException
    import java.net.ServerSocket
    import android.view.OrientationEventListener

    /**
     * MainActivity class:
     * - Manages camera operations using the Camera2 API.
     * - Records video using MediaRecorder.
     * - Pipes data to FFmpeg to generate HLS segments.
     * - Hosts a local HLS server using NanoHTTPD to serve the generated HLS content.
     */
    class MainActivity : ComponentActivity() {

        // TextureView used for displaying the camera preview.
        private lateinit var textureView: TextureView
        // Camera device instance.
        private lateinit var cameraDevice: CameraDevice
        // Camera capture session for managing capture requests.
        private lateinit var cameraCaptureSession: CameraCaptureSession
        // CameraManager to access camera devices.
        private lateinit var cameraManager: CameraManager
        // Directory where HLS output files will be stored.
        private lateinit var hlsDir: File
        // Instance of the HLS server.
        private lateinit var hlsServer: HlsServer

        // Camera id ("1" corresponds to the rear camera).
        private val cameraId = "1"
        // Flag indicating whether recording is currently active.
        private var isRecording = false

        // MediaRecorder used for capturing audio and video.
        private lateinit var activeRecorder: MediaRecorder
        // Surface for the camera preview.
        private lateinit var previewSurface: Surface
        // Surface provided by MediaRecorder for recording.
        private lateinit var recorderSurface: Surface

        // Port for the FFmpeg local socket connection.
        private val ffmpegPort = 8080

        // Coroutine scope to manage asynchronous tasks.
        private val scope = CoroutineScope(Dispatchers.IO + SupervisorJob())

        // Variables to track current device rotation and listen for orientation changes.
        private var currentRotation = 0
        private lateinit var orientationListener: OrientationEventListener

        override fun onCreate(savedInstanceState: Bundle?) {
            super.onCreate(savedInstanceState)

            // Initialize the TextureView and set it as the content view.
            textureView = TextureView(this)
            setContentView(textureView)

            // Get the CameraManager system service.
            cameraManager = getSystemService(CAMERA_SERVICE) as CameraManager
            // Setup the directory for HLS output.
            setupHLSDirectory()

            // Start the local HLS server on port 8081.
            hlsServer = HlsServer(8081, hlsDir, this)
            try {
                hlsServer.start()
                Log.d("HLS_SERVER", "HLS Server started on port 8081")
            } catch (e: IOException) {
                Log.e("HLS_SERVER", "Error starting HLS Server", e)
            }

            // Initialize the current rotation.
            currentRotation = getDeviceRotation()

            // Add a listener to detect orientation changes.
            orientationListener = object : OrientationEventListener(this) {
                override fun onOrientationChanged(orientation: Int) {
                    if (orientation == ORIENTATION_UNKNOWN) return // Skip unknown orientations.
                    // Determine the new rotation angle.
                    val newRotation = when {
                        orientation >= 315 || orientation < 45 -> 0
                        orientation >= 45 && orientation < 135 -> 90
                        orientation >= 135 && orientation < 225 -> 180
                        orientation >= 225 && orientation < 315 -> 270
                        else -> 0
                    }
                    // If the rotation has changed and recording is active, update the rotation.
                    if (newRotation != currentRotation && isRecording) {
                        Log.d("ROTATION", "Orientation change detected: $newRotation")
                        currentRotation = newRotation
                    }
                }
            }
            orientationListener.enable()

            // Set up the TextureView listener to know when the surface is available.
            textureView.surfaceTextureListener = object : TextureView.SurfaceTextureListener {
                override fun onSurfaceTextureAvailable(surface: SurfaceTexture, width: Int, height: Int) {
                    // Open the camera when the texture becomes available.
                    openCamera()
                }
                override fun onSurfaceTextureSizeChanged(surface: SurfaceTexture, width: Int, height: Int) {}
                override fun onSurfaceTextureDestroyed(surface: SurfaceTexture) = false
                override fun onSurfaceTextureUpdated(surface: SurfaceTexture) {}
            }
        }

        /**
         * Sets up the HLS directory in the public Downloads folder.
         * If the directory exists, it deletes it recursively and creates a new one.
         */
        private fun setupHLSDirectory() {
            val downloadsDir = Environment.getExternalStoragePublicDirectory(Environment.DIRECTORY_DOWNLOADS)
            hlsDir = File(downloadsDir, "HLS_Output")

            if (hlsDir.exists()) {
                hlsDir.deleteRecursively()
            }
            hlsDir.mkdirs()

            Log.d("HLS", "📂 HLS folder created: ${hlsDir.absolutePath}")
        }

        /**
         * Opens the camera after checking for necessary permissions.
         */
        private fun openCamera() {
            if (ActivityCompat.checkSelfPermission(this, Manifest.permission.CAMERA) != PackageManager.PERMISSION_GRANTED ||
                ActivityCompat.checkSelfPermission(this, Manifest.permission.RECORD_AUDIO) != PackageManager.PERMISSION_GRANTED) {
                // Request permissions if they are not already granted.
                ActivityCompat.requestPermissions(this, arrayOf(Manifest.permission.CAMERA, Manifest.permission.RECORD_AUDIO), 101)
                return
            }

            try {
                // Open the specified camera using its cameraId.
                cameraManager.openCamera(cameraId, object : CameraDevice.StateCallback() {
                    override fun onOpened(camera: CameraDevice) {
                        cameraDevice = camera
                        // Start the recording session once the camera is opened.
                        startNextRecording()
                    }
                    override fun onDisconnected(camera: CameraDevice) { camera.close() }
                    override fun onError(camera: CameraDevice, error: Int) { camera.close() }
                }, null)
            } catch (e: CameraAccessException) {
                e.printStackTrace()
            }
        }

        /**
         * Starts a new recording session:
         * - Sets up the preview and recorder surfaces.
         * - Creates a pipe for MediaRecorder output.
         * - Creates a capture session for simultaneous preview and recording.
         */
        private fun startNextRecording() {
            // Get the SurfaceTexture from the TextureView and set its default buffer size.
            val texture = textureView.surfaceTexture!!
            texture.setDefaultBufferSize(1920, 1080)
            // Create the preview surface.
            previewSurface = Surface(texture)

            // Create and configure the MediaRecorder.
            activeRecorder = createMediaRecorder()

            // Create a pipe to route MediaRecorder data.
            val pipe = ParcelFileDescriptor.createPipe()
            val pfdWrite = pipe[1] // Write end used by MediaRecorder.
            val pfdRead = pipe[0]  // Read end used by the local socket server.

            // Set MediaRecorder output to the file descriptor of the write end.
            activeRecorder.setOutputFile(pfdWrite.fileDescriptor)
            setupMediaRecorder(activeRecorder)
            // Obtain the recorder surface from MediaRecorder.
            recorderSurface = activeRecorder.surface

            // Create a capture request using the RECORD template.
            val captureRequestBuilder = cameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_RECORD)
            captureRequestBuilder.addTarget(previewSurface)
            captureRequestBuilder.addTarget(recorderSurface)

            // Create a capture session including both preview and recorder surfaces.
            cameraDevice.createCaptureSession(
                listOf(previewSurface, recorderSurface),
                object : CameraCaptureSession.StateCallback() {
                    override fun onConfigured(session: CameraCaptureSession) {
                        cameraCaptureSession = session
                        captureRequestBuilder.set(CaptureRequest.CONTROL_MODE, CameraMetadata.CONTROL_MODE_AUTO)
                        // Start a continuous capture request.
                        cameraCaptureSession.setRepeatingRequest(captureRequestBuilder.build(), null, null)

                        // Launch a coroutine to start FFmpeg and MediaRecorder with synchronization.
                        scope.launch {
                            startFFmpeg()
                            delay(500) // Wait for FFmpeg to be ready.
                            activeRecorder.start()
                            isRecording = true
                            Log.d("HLS", "🎥 Recording started...")
                        }

                        // Launch a coroutine to run the local socket server to forward data.
                        scope.launch {
                            startLocalSocketServer(pfdRead)
                        }
                    }
                    override fun onConfigureFailed(session: CameraCaptureSession) {
                        Log.e("Camera2", "❌ Configuration failed")
                    }
                },
                null
            )
        }

        /**
         * Coroutine to start a local socket server.
         * It reads from the MediaRecorder pipe and sends the data to FFmpeg.
         */
        private suspend fun startLocalSocketServer(pfdRead: ParcelFileDescriptor) {
            withContext(Dispatchers.IO) {
                val serverSocket = ServerSocket(ffmpegPort)
                Log.d("HLS", "Local socket server started on port $ffmpegPort")

                // Accept connection from FFmpeg.
                val socket = serverSocket.accept()
                Log.d("HLS", "Connection accepted from FFmpeg")

                // Read data from the pipe and forward it through the socket.
                val inputStream = ParcelFileDescriptor.AutoCloseInputStream(pfdRead)
                val outputStream = socket.getOutputStream()
                val buffer = ByteArray(8192)
                var bytesRead: Int
                while (inputStream.read(buffer).also { bytesRead = it } != -1) {
                    outputStream.write(buffer, 0, bytesRead)
                }
                outputStream.close()
                inputStream.close()
                socket.close()
                serverSocket.close()
            }
        }

        /**
         * Coroutine to start FFmpeg using a local TCP input.
         * Applies a video rotation filter based on device orientation and generates HLS segments.
         */
        private suspend fun startFFmpeg() {
            withContext(Dispatchers.IO) {
                // Retrieve the appropriate transpose filter based on current rotation.
                val transposeFilter = getTransposeFilter(currentRotation)

                // FFmpeg command to read from the TCP socket and generate an HLS stream.
                // Two alternative commands are commented below.
                // val ffmpegCommand = "-fflags +genpts -i tcp://localhost:$ffmpegPort -c copy -bsf:a aac_adtstoasc -movflags +faststart -f dash -seg_duration 10 -hls_playlist 1 ${hlsDir.absolutePath}/manifest.mpd"
                // val ffmpegCommand = "-fflags +genpts -i tcp://localhost:$ffmpegPort -c copy -bsf:a aac_adtstoasc -movflags +faststart -f hls -hls_time 5 -hls_segment_type fmp4 -hls_flags split_by_time -hls_list_size 0 -hls_playlist_type event -hls_fmp4_init_filename init.mp4 -hls_segment_filename ${hlsDir.absolutePath}/segment_%03d.m4s ${hlsDir.absolutePath}/playlist.m3u8"
                val ffmpegCommand = "-fflags +genpts -i tcp://localhost:$ffmpegPort -vf $transposeFilter -c:v libx264 -preset ultrafast -crf 23 -c:a copy -movflags +faststart -f hls -hls_time 0.1 -hls_segment_type mpegts -hls_flags split_by_time -hls_list_size 0 -hls_playlist_type event -hls_segment_filename ${hlsDir.absolutePath}/segment_%03d.ts ${hlsDir.absolutePath}/playlist.m3u8"

                FFmpegKit.executeAsync(ffmpegCommand) { session ->
                    if (session.returnCode.isValueSuccess) {
                        Log.d("HLS", "✅ HLS generated successfully")
                    } else {
                        Log.e("FFmpeg", "❌ Error generating HLS: ${session.allLogsAsString}")
                    }
                }
            }
        }

        /**
         * Gets the current device rotation using the WindowManager.
         */
        private fun getDeviceRotation(): Int {
            val windowManager = getSystemService(Context.WINDOW_SERVICE) as WindowManager
            return when (windowManager.defaultDisplay.rotation) {
                Surface.ROTATION_0 -> 0
                Surface.ROTATION_90 -> 90
                Surface.ROTATION_180 -> 180
                Surface.ROTATION_270 -> 270
                else -> 0
            }
        }

        /**
         * Returns the FFmpeg transpose filter based on the rotation angle.
         * Used to rotate the video stream accordingly.
         */
        private fun getTransposeFilter(rotation: Int): String {
            return when (rotation) {
                90 -> "transpose=1" // 90° clockwise
                180 -> "transpose=2,transpose=2" // 180° rotation
                270 -> "transpose=2" // 90° counter-clockwise
                else -> "transpose=0" // No rotation
            }
        }

        /**
         * Creates and configures a MediaRecorder instance.
         * Sets up audio and video sources, formats, encoders, and bitrates.
         */
        private fun createMediaRecorder(): MediaRecorder {
            return MediaRecorder().apply {
                setAudioSource(MediaRecorder.AudioSource.MIC)
                setVideoSource(MediaRecorder.VideoSource.SURFACE)
                setOutputFormat(MediaRecorder.OutputFormat.MPEG_2_TS)
                setVideoEncodingBitRate(5000000)
                setVideoFrameRate(24)
                setVideoSize(1080, 720)
                setVideoEncoder(MediaRecorder.VideoEncoder.H264)
                setAudioEncoder(MediaRecorder.AudioEncoder.AAC)
                setAudioSamplingRate(16000)
                setAudioEncodingBitRate(96000) // 96 kbps
            }
        }

        /**
         * Prepares the MediaRecorder and logs the outcome.
         */
        private fun setupMediaRecorder(recorder: MediaRecorder) {
            try {
                recorder.prepare()
                Log.d("HLS", "✅ MediaRecorder prepared")
            } catch (e: IOException) {
                Log.e("HLS", "❌ Error preparing MediaRecorder", e)
            }
        }

        /**
         * Custom HLS server class extending NanoHTTPD.
         * Serves HLS segments and playlists from the designated HLS directory.
         */
        private inner class HlsServer(port: Int, private val hlsDir: File, private val context: Context) : NanoHTTPD(port) {
            override fun serve(session: IHTTPSession): Response {
                val uri = session.uri.trimStart('/')

                // Intercept the request for `init.mp4` and serve it from assets.
                /*
                if (uri == "init.mp4") {
                    Log.d("HLS Server", "📡 Intercepting init.mp4, sending file from assets...")
                    return try {
                        val assetManager = context.assets
                        val inputStream = assetManager.open("init.mp4")
                        newFixedLengthResponse(Response.Status.OK, "video/mp4", inputStream, inputStream.available().toLong())
                    } catch (e: Exception) {
                        Log.e("HLS Server", "❌ Error reading init.mp4 from assets: ${e.message}")
                        newFixedLengthResponse(Response.Status.INTERNAL_ERROR, MIME_PLAINTEXT, "Server error")
                    }
                }
                */

                // Serve all other HLS files normally from the hlsDir.
                val file = File(hlsDir, uri)
                return if (file.exists()) {
                    newFixedLengthResponse(Response.Status.OK, getMimeTypeForFile(uri), file.inputStream(), file.length())
                } else {
                    newFixedLengthResponse(Response.Status.NOT_FOUND, MIME_PLAINTEXT, "File not found")
                }
            }
        }

        /**
         * Clean up resources when the activity is destroyed.
         * Stops recording, releases the camera, cancels coroutines, and stops the HLS server.
         */
        override fun onDestroy() {
            super.onDestroy()
            if (isRecording) {
                activeRecorder.stop()
                activeRecorder.release()
            }
            cameraDevice.close()
            scope.cancel()
            hlsServer.stop()
            orientationListener.disable()
            Log.d("HLS", "🛑 Activity destroyed")
        }
    }


    I have three examples of ffmpeg commands.

    • One command segments into DASH, but the camera does not have the correct rotation.
    • One command segments into HLS without re-encoding, with 5-second segments; it’s fast but does not have the correct rotation.
    • One command segments into HLS with re-encoding, which applies a rotation. It’s too slow for 5-second segments, so a 1-second segment was chosen.

    Note:

    • In the second command ("One command segments into HLS without re-encoding, with 5-second segments; it’s fast but does not have the correct rotation."), the output is fMP4. To achieve the correct rotation, I serve a preconfigured init.mp4 file when the HTTP request retrieves it (see the commented-out block in the HlsServer class above).
    • In the third command ("One command segments into HLS with re-encoding, which applies a rotation. It’s too slow for 5-second segments, so a 1-second segment was chosen."), the output is TS.
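
    Finally, on the residual audio-video desynchronization from attempt 4: one knob that is often suggested (not from the original post, and it does mean re-encoding the audio track, which is cheap next to video) is aresample with async, which pads or trims audio to follow its timestamps:

    # Video is still stream-copied; only audio is re-encoded so that
    # aresample=async=1 can pad/trim it to match its timestamps.
    ffmpeg -i tcp://localhost:8080 \
      -c:v copy \
      -af aresample=async=1 -c:a aac \
      -f hls -hls_time 5 -hls_segment_type fmp4 \
      playlist.m3u8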