
Media (91)
-
Corona Radiata
26 September 2011
Updated: September 2011
Language: English
Type: Audio
-
Lights in the Sky
26 September 2011
Updated: September 2011
Language: English
Type: Audio
-
Head Down
26 September 2011
Updated: September 2011
Language: English
Type: Audio
-
Echoplex
26 September 2011
Updated: September 2011
Language: English
Type: Audio
-
Discipline
26 September 2011
Updated: September 2011
Language: English
Type: Audio
-
Letting You
26 September 2011
Updated: September 2011
Language: English
Type: Audio
Other articles (103)
-
Websites made with MediaSPIP
2 May 2011
This page lists some websites based on MediaSPIP.
-
Use, discuss, criticize
13 April 2011
Talk to people directly involved in MediaSPIP’s development, or to people around you who could use MediaSPIP to share, enhance or develop their creative projects.
The bigger the community, the more MediaSPIP’s potential will be explored and the faster the software will evolve.
A discussion list is available for all exchanges between users.
-
The Podcasts plugin
14 July 2010
The problem of podcasting is, once again, one that reveals the state of standardization of data transport on the Internet.
Two interesting formats exist: the one developed by Apple, strongly geared toward the use of iTunes, whose SPEC is here; and the "Media RSS Module" format, which is more "open" and is notably backed by Yahoo and the Miro software.
File types supported in feeds
Apple's format only allows the following formats in its feeds: .mp3 audio/mpeg .m4a audio/x-m4a .mp4 (...)
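
As a quick, hedged illustration (the feed URL is a placeholder, not a real feed), the enclosure MIME types a given feed actually advertises can be listed from a shell, which makes it easy to check a feed against Apple's whitelist:

# List the enclosure MIME types declared in a podcast feed.
curl -s https://example.com/podcast.xml | xmllint --xpath '//enclosure/@type' -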
On other sites (6894)
-
How to stream synchronized video and audio in real-time from an Android smartphone using HLS while preserving orientation metadata?
6 March, by Jérôme LAROSE
Hello,
I am working on an Android application where I need to stream video
from one or two cameras on my smartphone, along with audio from the
microphone, in real-time via a link or web page accessible to users.
The stream should be live, allow rewinding (DVR functionality), and be
recorded simultaneously. A latency of 1 to 2 minutes is acceptable,
and the streaming is one-way. 

I have chosen HLS (HTTP Live Streaming) for its browser compatibility
and DVR support. However, I am encountering issues with audio-video
synchronization, managing camera orientation metadata, and format
conversions.



Here are my attempts:

1. MP4 segmentation with MediaRecorder
- I used MediaRecorder with setNextOutputFile to generate short MP4 segments, then ffmpeg-kit to convert them to fMP4 for HLS.
- Expected: Well-aligned segments for smooth HLS playback.
- Result: Timestamp issues causing jumps or interruptions in playback.

2. MPEG2-TS via local socket
- I configured MediaRecorder to produce an MPEG2-TS stream sent via a local socket to ffmpeg-kit.
- Expected: Stable streaming with preserved metadata.
- Result: Streaming works, but orientation metadata is lost, leading to incorrectly oriented video (e.g., rotated 90°).

3. Orientation correction with ffmpeg
- I tested -vf transpose=1 in ffmpeg to correct the orientation.
- Expected: Correctly oriented video without excessive latency.
- Result: Re-encoding takes too long for real-time streaming, causing unacceptable latency.

4. MPEG2-TS to fMP4 conversion
- I converted the MPEG2-TS stream to fMP4 with ffmpeg to preserve orientation.
- Expected: Perfect audio-video synchronization.
- Result: Slight desynchronization between audio and video, affecting the user experience.

I am looking for a solution to:

- Stream an HLS feed from Android with correctly timestamped segments.
- Preserve orientation metadata without heavy re-encoding.
- Ensure perfect audio-video synchronization.
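
One direction that might reconcile the last two goals: instead of rotating pixels with transpose, write the rotation into the container while stream-copying, and let the player apply it at display time. A minimal, untested sketch, reusing the port from the setup below (file names are illustrative; on FFmpeg 6+ the -display_rotation input option plays the same role as the rotate tag):

# Stream-copy into fMP4 HLS, tagging the video track with a 90° rotation
# instead of re-encoding rotated pixels.
ffmpeg -fflags +genpts -i tcp://localhost:8080 \
 -c copy \
 -metadata:s:v rotate=90 \
 -f hls -hls_time 5 -hls_segment_type fmp4 \
 -hls_fmp4_init_filename init.mp4 \
 -hls_segment_filename segment_%03d.m4s \
 playlist.m3u8

Whether the HLS fMP4 muxer actually carries the tag into the init segment should be verified (an ffprobe check is sketched after the notes further down).
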
UPDATE


package com.example.angegardien

import android.Manifest
import android.content.Context
import android.content.pm.PackageManager
import android.graphics.SurfaceTexture
import android.hardware.camera2.*
import android.media.*
import android.os.*
import android.util.Log
import android.view.Surface
import android.view.TextureView
import android.view.WindowManager
import androidx.activity.ComponentActivity
import androidx.core.app.ActivityCompat
import com.arthenica.ffmpegkit.FFmpegKit
import fi.iki.elonen.NanoHTTPD
import kotlinx.coroutines.*
import java.io.File
import java.io.IOException
import java.net.ServerSocket
import android.view.OrientationEventListener

/**
 * MainActivity class:
 * - Manages camera operations using the Camera2 API.
 * - Records video using MediaRecorder.
 * - Pipes data to FFmpeg to generate HLS segments.
 * - Hosts a local HLS server using NanoHTTPD to serve the generated HLS content.
 */
class MainActivity : ComponentActivity() {

 // TextureView used for displaying the camera preview.
 private lateinit var textureView: TextureView
 // Camera device instance.
 private lateinit var cameraDevice: CameraDevice
 // Camera capture session for managing capture requests.
 private lateinit var cameraCaptureSession: CameraCaptureSession
 // CameraManager to access camera devices.
 private lateinit var cameraManager: CameraManager
 // Directory where HLS output files will be stored.
 private lateinit var hlsDir: File
 // Instance of the HLS server.
 private lateinit var hlsServer: HlsServer

 // Camera id ("1" corresponds to the rear camera).
 private val cameraId = "1"
 // Flag indicating whether recording is currently active.
 private var isRecording = false

 // MediaRecorder used for capturing audio and video.
 private lateinit var activeRecorder: MediaRecorder
 // Surface for the camera preview.
 private lateinit var previewSurface: Surface
 // Surface provided by MediaRecorder for recording.
 private lateinit var recorderSurface: Surface

 // Port for the FFmpeg local socket connection.
 private val ffmpegPort = 8080

 // Coroutine scope to manage asynchronous tasks.
 private val scope = CoroutineScope(Dispatchers.IO + SupervisorJob())

 // Variables to track current device rotation and listen for orientation changes.
 private var currentRotation = 0
 private lateinit var orientationListener: OrientationEventListener

 override fun onCreate(savedInstanceState: Bundle?) {
 super.onCreate(savedInstanceState)

 // Initialize the TextureView and set it as the content view.
 textureView = TextureView(this)
 setContentView(textureView)

 // Get the CameraManager system service.
 cameraManager = getSystemService(CAMERA_SERVICE) as CameraManager
 // Setup the directory for HLS output.
 setupHLSDirectory()

 // Start the local HLS server on port 8081.
 hlsServer = HlsServer(8081, hlsDir, this)
 try {
 hlsServer.start()
 Log.d("HLS_SERVER", "HLS Server started on port 8081")
 } catch (e: IOException) {
 Log.e("HLS_SERVER", "Error starting HLS Server", e)
 }

 // Initialize the current rotation.
 currentRotation = getDeviceRotation()

 // Add a listener to detect orientation changes.
 orientationListener = object : OrientationEventListener(this) {
 override fun onOrientationChanged(orientation: Int) {
 if (orientation == ORIENTATION_UNKNOWN) return // Skip unknown orientations.
 // Determine the new rotation angle.
 val newRotation = when {
 orientation >= 315 || orientation < 45 -> 0
 orientation >= 45 && orientation < 135 -> 90
 orientation >= 135 && orientation < 225 -> 180
 orientation >= 225 && orientation < 315 -> 270
 else -> 0
 }
 // If the rotation has changed and recording is active, update the rotation.
 if (newRotation != currentRotation && isRecording) {
 Log.d("ROTATION", "Orientation change detected: $newRotation")
 currentRotation = newRotation
 }
 }
 }
 orientationListener.enable()

 // Set up the TextureView listener to know when the surface is available.
 textureView.surfaceTextureListener = object : TextureView.SurfaceTextureListener {
 override fun onSurfaceTextureAvailable(surface: SurfaceTexture, width: Int, height: Int) {
 // Open the camera when the texture becomes available.
 openCamera()
 }
 override fun onSurfaceTextureSizeChanged(surface: SurfaceTexture, width: Int, height: Int) {}
 override fun onSurfaceTextureDestroyed(surface: SurfaceTexture) = false
 override fun onSurfaceTextureUpdated(surface: SurfaceTexture) {}
 }
 }

 /**
 * Sets up the HLS directory in the public Downloads folder.
 * If the directory exists, it deletes it recursively and creates a new one.
 */
 private fun setupHLSDirectory() {
 val downloadsDir = Environment.getExternalStoragePublicDirectory(Environment.DIRECTORY_DOWNLOADS)
 hlsDir = File(downloadsDir, "HLS_Output")

 if (hlsDir.exists()) {
 hlsDir.deleteRecursively()
 }
 hlsDir.mkdirs()

 Log.d("HLS", "📂 HLS folder created: ${hlsDir.absolutePath}")
 }

 /**
 * Opens the camera after checking for necessary permissions.
 */
 private fun openCamera() {
 if (ActivityCompat.checkSelfPermission(this, Manifest.permission.CAMERA) != PackageManager.PERMISSION_GRANTED ||
 ActivityCompat.checkSelfPermission(this, Manifest.permission.RECORD_AUDIO) != PackageManager.PERMISSION_GRANTED) {
 // Request permissions if they are not already granted.
 ActivityCompat.requestPermissions(this, arrayOf(Manifest.permission.CAMERA, Manifest.permission.RECORD_AUDIO), 101)
 return
 }

 try {
 // Open the specified camera using its cameraId.
 cameraManager.openCamera(cameraId, object : CameraDevice.StateCallback() {
 override fun onOpened(camera: CameraDevice) {
 cameraDevice = camera
 // Start the recording session once the camera is opened.
 startNextRecording()
 }
 override fun onDisconnected(camera: CameraDevice) { camera.close() }
 override fun onError(camera: CameraDevice, error: Int) { camera.close() }
 }, null)
 } catch (e: CameraAccessException) {
 e.printStackTrace()
 }
 }

 /**
 * Starts a new recording session:
 * - Sets up the preview and recorder surfaces.
 * - Creates a pipe for MediaRecorder output.
 * - Creates a capture session for simultaneous preview and recording.
 */
 private fun startNextRecording() {
 // Get the SurfaceTexture from the TextureView and set its default buffer size.
 val texture = textureView.surfaceTexture!!
 texture.setDefaultBufferSize(1920, 1080)
 // Create the preview surface.
 previewSurface = Surface(texture)

 // Create and configure the MediaRecorder.
 activeRecorder = createMediaRecorder()

 // Create a pipe to route MediaRecorder data.
 val pipe = ParcelFileDescriptor.createPipe()
 val pfdWrite = pipe[1] // Write end used by MediaRecorder.
 val pfdRead = pipe[0] // Read end used by the local socket server.

 // Set MediaRecorder output to the file descriptor of the write end.
 activeRecorder.setOutputFile(pfdWrite.fileDescriptor)
 setupMediaRecorder(activeRecorder)
 // Obtain the recorder surface from MediaRecorder.
 recorderSurface = activeRecorder.surface

 // Create a capture request using the RECORD template.
 val captureRequestBuilder = cameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_RECORD)
 captureRequestBuilder.addTarget(previewSurface)
 captureRequestBuilder.addTarget(recorderSurface)

 // Create a capture session including both preview and recorder surfaces.
 cameraDevice.createCaptureSession(
 listOf(previewSurface, recorderSurface),
 object : CameraCaptureSession.StateCallback() {
 override fun onConfigured(session: CameraCaptureSession) {
 cameraCaptureSession = session
 captureRequestBuilder.set(CaptureRequest.CONTROL_MODE, CameraMetadata.CONTROL_MODE_AUTO)
 // Start a continuous capture request.
 cameraCaptureSession.setRepeatingRequest(captureRequestBuilder.build(), null, null)

 // Launch a coroutine to start FFmpeg and MediaRecorder with synchronization.
 scope.launch {
 startFFmpeg()
 delay(500) // Wait for FFmpeg to be ready.
 activeRecorder.start()
 isRecording = true
 Log.d("HLS", "🎥 Recording started...")
 }

 // Launch a coroutine to run the local socket server to forward data.
 scope.launch {
 startLocalSocketServer(pfdRead)
 }
 }
 override fun onConfigureFailed(session: CameraCaptureSession) {
 Log.e("Camera2", "❌ Configuration failed")
 }
 },
 null
 )
 }

 /**
 * Coroutine to start a local socket server.
 * It reads from the MediaRecorder pipe and sends the data to FFmpeg.
 */
 private suspend fun startLocalSocketServer(pfdRead: ParcelFileDescriptor) {
 withContext(Dispatchers.IO) {
 val serverSocket = ServerSocket(ffmpegPort)
 Log.d("HLS", "Local socket server started on port $ffmpegPort")

 // Accept connection from FFmpeg.
 val socket = serverSocket.accept()
 Log.d("HLS", "Connection accepted from FFmpeg")

 // Read data from the pipe and forward it through the socket.
 val inputStream = ParcelFileDescriptor.AutoCloseInputStream(pfdRead)
 val outputStream = socket.getOutputStream()
 val buffer = ByteArray(8192)
 var bytesRead: Int
 while (inputStream.read(buffer).also { bytesRead = it } != -1) {
 outputStream.write(buffer, 0, bytesRead)
 }
 outputStream.close()
 inputStream.close()
 socket.close()
 serverSocket.close()
 }
 }

 /**
 * Coroutine to start FFmpeg using a local TCP input.
 * Applies a video rotation filter based on device orientation and generates HLS segments.
 */
 private suspend fun startFFmpeg() {
 withContext(Dispatchers.IO) {
 // Retrieve the appropriate transpose filter based on current rotation.
 val transposeFilter = getTransposeFilter(currentRotation)

 // FFmpeg command to read from the TCP socket and generate an HLS stream.
 // Two alternative commands are commented below.
 // val ffmpegCommand = "-fflags +genpts -i tcp://localhost:$ffmpegPort -c copy -bsf:a aac_adtstoasc -movflags +faststart -f dash -seg_duration 10 -hls_playlist 1 ${hlsDir.absolutePath}/manifest.mpd"
 // val ffmpegCommand = "-fflags +genpts -i tcp://localhost:$ffmpegPort -c copy -bsf:a aac_adtstoasc -movflags +faststart -f hls -hls_time 5 -hls_segment_type fmp4 -hls_flags split_by_time -hls_list_size 0 -hls_playlist_type event -hls_fmp4_init_filename init.mp4 -hls_segment_filename ${hlsDir.absolutePath}/segment_%03d.m4s ${hlsDir.absolutePath}/playlist.m3u8"
 val ffmpegCommand = "-fflags +genpts -i tcp://localhost:$ffmpegPort -vf $transposeFilter -c:v libx264 -preset ultrafast -crf 23 -c:a copy -movflags +faststart -f hls -hls_time 0.1 -hls_segment_type mpegts -hls_flags split_by_time -hls_list_size 0 -hls_playlist_type event -hls_segment_filename ${hlsDir.absolutePath}/segment_%03d.ts ${hlsDir.absolutePath}/playlist.m3u8"

 FFmpegKit.executeAsync(ffmpegCommand) { session ->
 if (session.returnCode.isValueSuccess) {
 Log.d("HLS", "✅ HLS generated successfully")
 } else {
 Log.e("FFmpeg", "❌ Error generating HLS: ${session.allLogsAsString}")
 }
 }
 }
 }

 /**
 * Gets the current device rotation using the WindowManager.
 */
 private fun getDeviceRotation(): Int {
 val windowManager = getSystemService(Context.WINDOW_SERVICE) as WindowManager
 return when (windowManager.defaultDisplay.rotation) {
 Surface.ROTATION_0 -> 0
 Surface.ROTATION_90 -> 90
 Surface.ROTATION_180 -> 180
 Surface.ROTATION_270 -> 270
 else -> 0
 }
 }

 /**
 * Returns the FFmpeg transpose filter based on the rotation angle.
 * Used to rotate the video stream accordingly.
 */
 private fun getTransposeFilter(rotation: Int): String {
 return when (rotation) {
 90 -> "transpose=1" // 90° clockwise
 180 -> "transpose=2,transpose=2" // 180° rotation
 270 -> "transpose=2" // 90° counter-clockwise
 else -> "transpose=0" // No rotation
 }
 }

 /**
 * Creates and configures a MediaRecorder instance.
 * Sets up audio and video sources, formats, encoders, and bitrates.
 */
 private fun createMediaRecorder(): MediaRecorder {
 return MediaRecorder().apply {
 setAudioSource(MediaRecorder.AudioSource.MIC)
 setVideoSource(MediaRecorder.VideoSource.SURFACE)
 setOutputFormat(MediaRecorder.OutputFormat.MPEG_2_TS)
 setVideoEncodingBitRate(5000000)
 setVideoFrameRate(24)
 setVideoSize(1080, 720)
 setVideoEncoder(MediaRecorder.VideoEncoder.H264)
 setAudioEncoder(MediaRecorder.AudioEncoder.AAC)
 setAudioSamplingRate(16000)
 setAudioEncodingBitRate(96000) // 96 kbps
 }
 }

 /**
 * Prepares the MediaRecorder and logs the outcome.
 */
 private fun setupMediaRecorder(recorder: MediaRecorder) {
 try {
 recorder.prepare()
 Log.d("HLS", "✅ MediaRecorder prepared")
 } catch (e: IOException) {
 Log.e("HLS", "❌ Error preparing MediaRecorder", e)
 }
 }

 /**
 * Custom HLS server class extending NanoHTTPD.
 * Serves HLS segments and playlists from the designated HLS directory.
 */
 private inner class HlsServer(port: Int, private val hlsDir: File, private val context: Context) : NanoHTTPD(port) {
 override fun serve(session: IHTTPSession): Response {
 val uri = session.uri.trimStart('/')

 // Intercept the request for `init.mp4` and serve it from assets.
 /*
 if (uri == "init.mp4") {
 Log.d("HLS Server", "📡 Intercepting init.mp4, sending file from assets...")
 return try {
 val assetManager = context.assets
 val inputStream = assetManager.open("init.mp4")
 newFixedLengthResponse(Response.Status.OK, "video/mp4", inputStream, inputStream.available().toLong())
 } catch (e: Exception) {
 Log.e("HLS Server", "❌ Error reading init.mp4 from assets: ${e.message}")
 newFixedLengthResponse(Response.Status.INTERNAL_ERROR, MIME_PLAINTEXT, "Server error")
 }
 }
 */

 // Serve all other HLS files normally from the hlsDir.
 val file = File(hlsDir, uri)
 return if (file.exists()) {
 newFixedLengthResponse(Response.Status.OK, getMimeTypeForFile(uri), file.inputStream(), file.length())
 } else {
 newFixedLengthResponse(Response.Status.NOT_FOUND, MIME_PLAINTEXT, "File not found")
 }
 }
 }

 /**
 * Clean up resources when the activity is destroyed.
 * Stops recording, releases the camera, cancels coroutines, and stops the HLS server.
 */
 override fun onDestroy() {
 super.onDestroy()
 if (isRecording) {
 activeRecorder.stop()
 activeRecorder.release()
 }
 cameraDevice.close()
 scope.cancel()
 hlsServer.stop()
 orientationListener.disable()
 Log.d("HLS", "🛑 Activity destroyed")
 }
}
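
For quick local testing, the playlist served by NanoHTTPD can be opened with any HLS-capable player. A hedged example (DEVICE_IP is a placeholder for the phone's address, reachable on port 8081 as configured above):

ffplay http://DEVICE_IP:8081/playlist.m3u8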



I have three examples of ffmpeg commands.


- One command segments into DASH, but the camera does not have the correct rotation.
- One command segments into HLS without re-encoding, with 5-second segments; it's fast but does not have the correct rotation.
- One command segments into HLS with re-encoding, which applies a rotation. It's too slow for 5-second segments, so a 1-second segment was chosen.

Note:

- In the second command (HLS without re-encoding, 5-second segments), the output is fMP4. To achieve the correct rotation, I serve a preconfigured init.mp4 file when the HTTP request retrieves it (see the commented-out block in the HLS server).
- In the third command (HLS with re-encoding), the output is TS.
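
To verify whether a given MP4 or init segment actually carries the rotation, ffprobe can print both the legacy rotate tag and the display-matrix side data (file name illustrative; exact section names can vary slightly between ffprobe versions):

ffprobe -v error -select_streams v:0 \
 -show_entries stream_tags=rotate:side_data=rotation \
 -of default=noprint_wrappers=1 init.mp4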

-
Nginx rtmp module - on_publish fires multiple times instead of once
29 July 2017, by Stephen Wright
This is copied and pasted from the bug report I created on the rtmp-module by Arut. I am not completely sure if it is a bug or me not understanding how the module works; I have read all of the module's directives at https://github.com/arut/nginx-rtmp-module/wiki/Directives
Proper explanation below; if code is not displayed properly I will edit and fix.
Hi, been using the module and finding it very, very good!
I think I have found an issue though, although it may be me misunderstanding the directives.
Essentially I wish to fire a script (/usr/local/bin/make_thumbnail.sh) that creates a thumbnail automatically from a stream (using ffmpeg). The idea is to have this done for every stream as soon as it is published, to build something a bit like Twitch, where the streamer does not have to specify any thumbnail image; authenticated users simply start a stream (which will later be authenticated, but is not yet). The script does also write data into the database, but that stage works fine and I don't believe the issue is related: if I comment out those lines, the thumbnail creation still works and my issue continues.
Initially this was done using the "exec" command, as I believe I misread the documentation; I believe the exec command doesn't work for my problem since "When publishing stops the process is terminated." Does this mean it will continually execute until the stream stops?
I have started using the exec_publish command to try to fix this, but the same issue occurs: the entire script repeats approximately every 15-17 seconds, and each time a new thumbnail is created and a new database entry is created with all the correct information.
Below is the nginx.conf block. Please ignore incorrect indentation; I couldn't see a way to indent blocks of code and it's late here. Assume all code is indented correctly unless you believe that could be the issue, in which case I will post it indented as soon as I can.
application live {
allow play all;
live on;
record all;
record_path /var/stream/video_recordings/;
record_unique on;
hls on;
hls_nested on;
hls_path /var/stream/HLS/live;
hls_fragment 10s;
# on publish, create thumbnail using first second of stream and save in /var/stream/video_recordings/thumbnails
exec_publish usr/local/bin/make_thumbnail.sh $name;

The rest can be pasted or attached if needed, but it is a working nginx config for rtmp + website.
The simplest version of make_thumbnail.sh is pasted below. I have omitted the variables I use for database entries, but as the script works without fail from a terminal I believe this to be an nginx issue (if I run the command manually under the nginx user, e.g. sudo -u nginx /usr/local/bin/make_thumbnail.sh with a name matching any running stream, it works and only executes once as expected; all permissions in the script are OK and tested).
make_thumbnail.sh
#!/bin/bash
TIME=$(date +%s)
NAME=$1
echo "time: $TIME"
FILENAME=${TIME}_${NAME}
ffmpeg -i rtmp://192.168.0.98:1935/live/$1 -vframes 1 -s 150x150 -ss 10 -strftime 1 /var/stream/video_recordings/thumbnails/"$FILENAME.jpg";
#Writes path to video into database
mysql --user=$DB_USER --password=$DB_PASSWD $DB_NAME << EOF
INSERT INTO $TABLE3 (thumbnailfile) VALUES ('$FILENAME');
SET @last_id_in_thumbnails = LAST_INSERT_ID();
INSERT INTO $TABLE (created_at, updated_at, thumnailID) VALUES (NOW(),NOW(),@last_id_in_thumbnails);
SET @last_id_in_livestreams = LAST_INSERT_ID();
INSERT INTO $TABLE2 (created_at, updated_at, filename,liveID) VALUES (NOW(),NOW(),'$FILENAME',@last_id_in_livestreams);
EOF

I have not got the nginx rtmp logs installed; I can obviously do this, however some of the logs appear in the nginx error.log. Strangely, the latest stream I tried did not update the access log, though I think this is because I did not attempt to connect to it by any method. I don't fully understand the error.log; in my stupidity I decided to use nginx, with which I am quite inexperienced, and I am finding it very difficult to troubleshoot this issue. It appears to me that, as part of the RTMP protocol, my streaming software (OBS) is either directly pinging the rtmp stream or being pinged by the server to ensure the connection is still there. And this ping is
I have left a stream running for approx 4 minutes without interacting with the server, the streaming software, or the computer running the stream. I made sure the internet connection was constant, as my first thought was that the connection had dropped; however, on inspecting the database, the execution always happens after at least 11 seconds, though usually 16. I can't seem to figure out how to select the closest dates from the database, but there have been at least a few 17-second differences (potentially when
I am unsure if this is an issue or intended behavior, but I do require this to finish a university degree. I'm not asking for answers, but if it is a legitimate issue I would be happy to spend as much time as I can commit to it, given some insight into what is causing it; and if there is a workaround, I believe it should be documented somewhere. I have googled how to make exec commands run only once on publish. I can't seem to pinpoint where in the log the issue is happening, but I think it is something to do with the excerpts below. I would attach the file, but I can't seem to select all lines after the timestamp of starting a stream.
2017/07/26 18:17:35 [info] 1451#0: *2229 exec: starting managed child
'ffmpeg', client: 192.168.0.78, server: 0.0.0.0:1935
2017/07/26 18:17:35 [info] 1451#0: *2412 client connected '192.168.0.98'
2017/07/26 18:17:35 [info] 1451#0: *2412 connect: app='live' args=''
flashver='LNX 9,0,124,2' swf_url='' tc_url='rtmp://192.168.0.98:1935/live'
page_url='' acodecs=4071 vcodecs=252 object_encoding=0, client:
192.168.0.98, server: 0.0.0.0:1935
2017/07/26 18:17:35 [info] 1451#0: *2412 createStream, client: 192.168.0.98,
server: 0.0.0.0:1935
2017/07/26 18:17:35 [info] 1451#0: *2412 play: name='newname' args=''
start=-2000 duration=0 reset=0 silent=0, client: 192.168.0.98, server:
0.0.0.0:1935
2017/07/26 18:17:36 [info] 1451#0: *2410 recv() failed (104: Connection
reset by peer), client: 192.168.0.98, server: 0.0.0.0:1935
2017/07/26 18:17:36 [info] 1451#0: *2410 disconnect, client: 192.168.0.98,
server: 0.0.0.0:1935
2017/07/26 18:17:36 [info] 1451#0: *2410 deleteStream, client: 192.168.0.98,
server: 0.0.0.0:1935
2017/07/26 18:17:36 [notice] 1451#0: signal 17 (SIGCHLD) received
2017/07/26 18:17:36 [notice] 1451#0: unknown process 10487 exited with code
0
2017/07/26 18:17:36 [info] 1451#0: *2229 exec: child 10487 exited; ignoring,
client: 192.168.0.78, server: 0.0.0.0:1935
2017/07/26 18:17:41 [info] 1451#0: *2229 exec: starting managed child
'usr/local/bin/make_thumbnail.sh', client: 192.168.0.78, server:
0.0.0.0:1935
2017/07/26 18:17:41 [info] 1451#0: *2413 client connected '192.168.0.98'
2017/07/26 18:17:41 [info] 1451#0: *2413 connect: app='live' args=''
flashver='LNX 9,0,124,2' swf_url='' tc_url='rtmp://192.168.0.98:1935/live'
page_url='' acodecs=4071 vcodecs=252 object_encoding=0, client:
192.168.0.98,
server: 0.0.0.0:1935
2017/07/26 18:17:41 [info] 1451#0: *2413 createStream, client: 192.168.0.98,
server: 0.0.0.0:1935
2017/07/26 18:17:41 [info] 1451#0: *2413 play: name='newname' args=''
start=-2000 duration=0 reset=0 silent=0, client: 192.168.0.98, server:
0.0.0.0:1935
2017/07/26 18:17:43 [info] 1451#0: *2229 exec: starting managed child
'ffmpeg',
client: 192.168.0.78, server: 0.0.0.0:1935
2017/07/26 18:17:43 [info] 1451#0: *2414 client connected '192.168.0.98'
2017/07/26 18:17:43 [info] 1451#0: *2414 connect: app='live' args=''
flashver='LNX 9,0,124,2' swf_url='' tc_url='rtmp://192.168.0.98:1935/live'
page_url='' acodecs=4071 vcodecs=252 object_encoding=0, client:
192.168.0.98,
server: 0.0.0.0:1935
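
One way to make the thumbnail job idempotent, regardless of how many publish events fire, is to guard the real script with a per-stream marker. A minimal sketch, assuming /tmp is writable by the nginx user (the wrapper and marker path are illustrative; the marker would also need clearing when the stream ends, e.g. from exec_publish_done):

#!/bin/bash
# Hypothetical guard wrapper around make_thumbnail.sh:
# run the real script at most once per stream name, even if
# exec_publish fires repeatedly for the same stream.
NAME=$1
MARKER="/tmp/thumbnail_${NAME}.done"
# A marker file records that this stream was already handled.
[ -e "$MARKER" ] && exit 0
touch "$MARKER"
/usr/local/bin/make_thumbnail.sh "$NAME"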
-
Dockerized ffmpeg stops for no reason
2 May 2023, by Arthur Attout
I'm trying to fire up a container that reads a video stream via ffmpeg and saves the stream as 30-second segments.

When I run the container, it stops after 20-ish seconds and returns with no error.


Here is my Dockerfile


FROM linuxserver/ffmpeg
ENTRYPOINT ffmpeg -i rtsp://192.168.1.85:8554/camera -f v4l2 -c copy -reset_timestamps 1 -map 0 -f segment -segment_time 30 -segment_format mp4 "output/out%03d.mp4" -loglevel debug



Here is the output when I run sudo docker run -it --rm -v /data/camera:/output --name camera_recorder camera_recorder:latest


[+] Building 1.8s (5/5) FINISHED
 => [internal] load build definition from Dockerfile 0.3s
 => => transferring dockerfile: 777B 0.0s
 => [internal] load .dockerignore 0.5s
 => => transferring context: 2B 0.0s
 => [internal] load metadata for docker.io/linuxserver/ffmpeg:latest 1.2s
 => CACHED [1/1] FROM docker.io/linuxserver/ffmpeg@sha256:823c611e0af82b864608c21d96bf363403310d92f154e238f6d51fe3d783e53b 0.0s
 => exporting to image 0.1s
 => => exporting layers 0.0s
 => => writing image sha256:f0509ccf0b07ff53d4aafa0d3b80fd50ed53e96db906c9a1e0e8c44e163dce94 0.1s
 => => naming to docker.io/library/camera_recorder 0.0s
ffmpeg version 5.1.2 Copyright (c) 2000-2022 the FFmpeg developers
 built with gcc 11 (Ubuntu 11.3.0-1ubuntu1~22.04)
 configuration: --disable-debug --disable-doc --disable-ffplay --enable-ffprobe --enable-cuvid --enable-gpl --enable-libaom --enable-libass --enable-libfdk_aac --enable-libfreetype --enable-libkvazaar --enable-libmp3lame --enable-libopencore-amrnb --enable-libopencore-amrwb --enable-libopenjpeg --enable-libopus --enable-libtheora --enable-libv4l2 --enable-libvidstab --enable-libvmaf --enable-libvorbis --enable-libvpx --enable-libxml2 --enable-libx264 --enable-libx265 --enable-libxvid --enable-nonfree --enable-nvdec --enable-nvenc --enable-opencl --enable-openssl --enable-small --enable-stripping --enable-vaapi --enable-vdpau --enable-version3
 libavutil 57. 28.100 / 57. 28.100
 libavcodec 59. 37.100 / 59. 37.100
 libavformat 59. 27.100 / 59. 27.100
 libavdevice 59. 7.100 / 59. 7.100
 libavfilter 8. 44.100 / 8. 44.100
 libswscale 6. 7.100 / 6. 7.100
 libswresample 4. 7.100 / 4. 7.100
 libpostproc 56. 6.100 / 56. 6.100
Splitting the commandline.
Reading option '-i' ... matched as input url with argument 'rtsp://192.168.1.85:8554/camera'.
Reading option '-f' ... matched as option 'f' (force format) with argument 'v4l2'.
Reading option '-c' ... matched as option 'c' (codec name) with argument 'copy'.
Reading option '-reset_timestamps' ... matched as AVOption 'reset_timestamps' with argument '1'.
Reading option '-map' ... matched as option 'map' (set input stream mapping) with argument '0'.
Reading option '-f' ... matched as option 'f' (force format) with argument 'segment'.
Reading option '-segment_time' ... matched as AVOption 'segment_time' with argument '30'.
Reading option '-segment_format' ... matched as AVOption 'segment_format' with argument 'mp4'.
Reading option 'output/out%03d.mp4' ... matched as output url.
Reading option '-loglevel' ... matched as option 'loglevel' (set logging level) with argument 'debug'.
Finished splitting the commandline.
Parsing a group of options: global .
Applying option loglevel (set logging level) with argument debug.
Successfully parsed a group of options.
Parsing a group of options: input url rtsp://192.168.1.85:8554/camera.
Successfully parsed a group of options.
Opening an input file: rtsp://192.168.1.85:8554/camera.
[tcp @ 0x55c15e3eb040] No default whitelist set
[tcp @ 0x55c15e3eb040] Original list of addresses:
[tcp @ 0x55c15e3eb040] Address 192.168.1.85 port 8554
[tcp @ 0x55c15e3eb040] Interleaved list of addresses:
[tcp @ 0x55c15e3eb040] Address 192.168.1.85 port 8554
[tcp @ 0x55c15e3eb040] Starting connection attempt to 192.168.1.85 port 8554
[tcp @ 0x55c15e3eb040] Successfully connected to 192.168.1.85 port 8554
[rtsp @ 0x55c15e3e8300] SDP:
v=0
o=- 0 0 IN IP4 127.0.0.1
s=Stream
c=IN IP4 0.0.0.0
t=0 0
m=video 0 RTP/AVP 96
a=control:rtsp://192.168.1.85:8554/camera/trackID=0
a=rtpmap:96 MP4V-ES/90000
a=fmtp:96 config=000001B001000001B58913000001000000012000C48D88002D3C04871443000001B24C61766335392E33372E313030; profile-level-id=1

[rtsp @ 0x55c15e3e8300] video codec set to: mpeg4
[rtp @ 0x55c15e3ef600] No default whitelist set
[udp @ 0x55c15e3f0200] No default whitelist set
[udp @ 0x55c15e3f0200] end receive buffer size reported is 425984
[udp @ 0x55c15e3eff40] No default whitelist set
[udp @ 0x55c15e3eff40] end receive buffer size reported is 425984
[rtsp @ 0x55c15e3e8300] setting jitter buffer size to 500
[rtsp @ 0x55c15e3e8300] hello state=0
[rtsp @ 0x55c15e3e8300] Could not find codec parameters for stream 0 (Video: mpeg4, 1 reference frame, none(left), 1920x1080 [SAR 1:1 DAR 16:9], 1/5): unspecified pixel format
Consider increasing the value for the 'analyzeduration' (0) and 'probesize' (5000000) options
Input #0, rtsp, from 'rtsp://192.168.1.85:8554/camera':
 Metadata:
 title : Stream
 Duration: N/A, bitrate: N/A
 Stream #0:0, 0, 1/90000: Video: mpeg4, 1 reference frame, none(left), 1920x1080 [SAR 1:1 DAR 16:9], 0/1, 5 tbr, 90k tbn
Successfully opened the file.
Parsing a group of options: output url output/out%03d.mp4.
Applying option f (force format) with argument v4l2.
Applying option c (codec name) with argument copy.
Applying option map (set input stream mapping) with argument 0.
Applying option f (force format) with argument segment.
Successfully parsed a group of options.
Opening an output file: output/out%03d.mp4.
Successfully opened the file.
[segment @ 0x55c15e415a80] Selected stream id:0 type:video
[segment @ 0x55c15e415a80] Opening 'output/out000.mp4' for writing
[file @ 0x55c15e42d840] Setting default whitelist 'file,crypto,data'
Output #0, segment, to 'output/out%03d.mp4':
 Metadata:
 title : Stream
 encoder : Lavf59.27.100
 Stream #0:0, 0, 1/10240: Video: mpeg4, 1 reference frame, none(left), 1920x1080 (0x0) [SAR 1:1 DAR 16:9], 0/1, q=2-31, 5 tbr, 10240 tbn
Stream mapping:
 Stream #0:0 -> #0:0 (copy)
Press [q] to stop, [?] for help
cur_dts is invalid st:0 (0) [init:1 i_done:0 finish:0] (this is harmless if it occurs once at the start per stream)
No more output streams to write to, finishing.:00.00 bitrate=N/A speed= 0x
[segment @ 0x55c15e415a80] segment:'output/out000.mp4' count:0 ended
[AVIOContext @ 0x55c15e42d8c0] Statistics: 292 bytes written, 2 seeks, 3 writeouts
frame= 0 fps=0.0 q=-1.0 Lsize=N/A time=00:00:00.00 bitrate=N/A speed= 0x
video:0kB audio:0kB subtitle:0kB other streams:0kB global headers:0kB muxing overhead: unknown
Input file #0 (rtsp://192.168.1.85:8554/camera):
 Input stream #0:0 (video): 0 packets read (0 bytes);
 Total: 0 packets (0 bytes) demuxed
Output file #0 (output/out%03d.mp4):
 Output stream #0:0 (video): 0 packets muxed (0 bytes);
 Total: 0 packets (0 bytes) muxed
0 frames successfully decoded, 0 decoding errors



Additional info:

- The stream is up and running; ffplay rtsp://192.168.1.85:8554/camera opens normally.
- The exact command (from ENTRYPOINT) works perfectly fine on the host (it generates files for every 30 seconds).
- From inside the container, I can ping 192.168.1.85 (it is actually localhost).
- Setting -analyzeduration 1000 does not fix the issue.






Why is the container stopping for no reason?
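
A hedged reading of the log above: the RTSP session opens, but zero packets are ever demuxed ("0 packets read (0 bytes)"), the classic symptom of RTP-over-UDP traffic not reaching the container. Forcing interleaved TCP transport, and dropping the stray -f v4l2 (the later -f segment overrides it anyway), may be worth trying; a sketch of the adjusted entrypoint command:

# Force RTSP over TCP so the media packets travel on the already
# established control connection instead of separate UDP ports.
ffmpeg -rtsp_transport tcp -i rtsp://192.168.1.85:8554/camera \
 -map 0 -c copy \
 -f segment -segment_time 30 -segment_format mp4 \
 -reset_timestamps 1 \
 "output/out%03d.mp4"

Alternatively, running the container with --network host would let the UDP packets reach ffmpeg directly.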

