
Other articles (64)

  • The user profile

    12 April 2011, by

    Each user has a profile page that lets them edit their personal information. In the default top-of-page menu, a menu item is created automatically when MediaSPIP is initialised, visible only when the visitor is logged in on the site.
    The user can reach the profile editing form from their author page; a "Modifier votre profil" (edit your profile) link in the navigation is (...)

  • Configuring language support

    15 November 2010, by

    Accessing the configuration and adding supported languages
    To configure support for new languages, you need to go to the "Administrer" (administration) section of the site.
    From there, the navigation menu gives access to a "Gestion des langues" (language management) section where support for new languages can be enabled.
    Each newly added language can still be deactivated as long as no object has been created in that language; once one has, it is greyed out in the configuration and (...)

  • XMP PHP

    13 May 2011, by

    According to Wikipedia, XMP stands for:
    Extensible Metadata Platform, or XMP, is an XML-based metadata format used in PDF, photography and graphics applications. It was launched by Adobe Systems in April 2001 and integrated into version 5.0 of Adobe Acrobat.
    Being based on XML, it handles a set of dynamic tags for use within the Semantic Web.
    XMP makes it possible to store information about a file as an XML document: title, author, history (...)

On other sites (6199)

  • How can I publish an RTSP stream to a MediaMTX instance with TLS encryption configured? [closed]

    15 February 2024, by cmd

    I have an instance of MediaMTX running on my laptop and an Amcrest IP camera. I can publish the camera's RTSP stream to the server so that the stream can be accessed from the MediaMTX instance.

    However, I want to encrypt the stream. On the GitHub page for MediaMTX, they detail how TLS encryption can be enabled for incoming and outgoing RTSP streams by generating a server.key and a server.crt and editing a few lines in the .yml configuration file. Streams can then be published and read over RTSPS on port 8322.
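
    A quick way to check that the RTSPS listener is actually up and serving the generated certificate, before publishing anything to it, is to open a TLS connection to the port directly. A minimal Python sketch, assuming the defaults described in the post (MediaMTX on localhost with RTSPS on port 8322):

import socket
import ssl

# The server.crt used here is self-signed, so certificate verification is
# disabled; this only checks that the TLS handshake itself succeeds.
context = ssl.create_default_context()
context.check_hostname = False
context.verify_mode = ssl.CERT_NONE

with socket.create_connection(("localhost", 8322), timeout=5) as sock:
    with context.wrap_socket(sock, server_hostname="localhost") as tls:
        print("TLS established:", tls.version())
        print("cipher:", tls.cipher())

    If the handshake fails at this level, the problem is likely in the key/certificate configuration rather than in the publishing command.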

    I have followed these steps, and set up the following path for my IP camera in the .yml:

###############################################
# Path settings

# Settings in "paths" are applied to specific paths, and the map key
# is the name of the path.
# Any setting in "pathDefaults" can be overridden here.
# It's possible to use regular expressions by using a tilde as prefix,
# for example "~^(test1|test2)$" will match both "test1" and "test2",
# for example "~^prefix" will match all paths that start with "prefix".
paths:

  cam1:
    source: rtsp://user:password@192.168.68.142:554
    runOnInit: ffmpeg -i rtsp://user:password@192.168.68.142:554 -c:v copy -c:a copy -f rtsp rtsps://localhost:8322/stream/cam1


    But I am getting the following output from my MediaMTX instance when it runs:

    2024/02/15 18:20:59 INF [path cam1] [RTSP source] ready: 2 tracks (H264, MPEG-4 Audio)
Input #0, rtsp, from 'rtsp://user:password@192.168.68.142:554':
  Metadata:
    title           : Media Server
  Duration: N/A, start: 0.030000, bitrate: N/A
  Stream #0:0: Video: h264 (Main), yuv420p(progressive), 1920x1080, 100 tbr, 90k tbn
  Stream #0:1: Audio: aac (LC), 16000 Hz, mono, fltp
Stream mapping:
  Stream #0:0 -> #0:0 (copy)
  Stream #0:1 -> #0:1 (copy)
2024/02/15 18:21:00 INF [RTSPS] [conn [::1]:60147] opened
2024/02/15 18:21:00 INF [RTSPS] [session 07a8f23e] created by [::1]:60147
2024/02/15 18:21:00 INF [RTSPS] [conn [::1]:60147] closed: path 'stream/cam1' is not configured
2024/02/15 18:21:00 INF [RTSPS] [session 07a8f23e] destroyed: not in use
[out#0/rtsp @ 00000225c75fe9c0] Could not write header (incorrect codec parameters ?): Server returned 400 Bad Request
Conversion failed!
2024/02/15 18:21:00 INF [path cam1] runOnInit command exited: command exited with code 3486501640
2024/02/15 18:21:29 WAR [path cam1] [RTSP source] 10 RTP packets lost
2024/02/15 18:21:33 WAR [path cam1] [RTSP source] 2 RTP packets lost
2024/02/15 18:21:39 WAR [path cam1] [RTSP source] 11 RTP packets lost


    I had no errors and was able to publish the stream with the same path configuration (using rtsp://, not rtsps://) when no encryption was used. What possible solutions can I try?

  • Output file does not show up after executing ffmpeg command [closed]

    19 February 2024, by davai

    I'm using ffmpeg to combine an MP3 + G (CDG) file pair and produce an MP4 file. I've placed the ffmpeg build (the .exe) in the project folder, the MP3 and CDG files are also in the project folder, and the MP4 output is set to be written there as well. The odd thing is that I was initially producing output files, but while tweaking the constant rate factor the MP4 output stopped showing up entirely. I'm not receiving any errors while running the code, and it even prints that the file has been created successfully, yet nothing appears in the project folder.

        String mp3FilePath = "C:/Users/exampleuser/pfolder/example.mp3";
        String gFilePath = "C:/Users/exampleuser/pfolder/example.cdg";
        String mp4OutputPath = "C:/Users/exampleuser/pfolder/example.mp4";

        try
        {
            String[] command = {
                    "C:/Users/tonih/IdeaProjects/MP3GtoMP4Conversion/ffmpeg/ffmpeg-2024-02-19-git-0c8e64e268-full_build/bin/ffmpeg.exe",
                    "-i", mp3FilePath,       // Input MP3 file
                    "-r", "25",              // Frame rate
                    "-loop", "1",            // Loop input video
                    "-i", gFilePath,         // Input G file
                    "-c:v", "libx264",       // Video codec
                    "-preset", "slow",       // Encoding preset for quality (choose according to your requirement)
                    "-crf", "18",            // Constant Rate Factor (lower is higher quality, typical range 18-28)
                    "-c:a", "aac",           // Audio codec
                    "-b:a", "320k",          // Audio bitrate
                    "-shortest",             // Stop when the shortest stream ends
                    mp4OutputPath            // Output MP4 file
            };

            Process process = Runtime.getRuntime().exec(command);
            process.waitFor();
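            // Note: waitFor() returns ffmpeg's exit code, but it is discarded here,
            // and nothing reads the process's error stream, so a failing ffmpeg run
            // stays silent and the success message below is printed regardless.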
            System.out.println("MP4 file created successfully: " + mp4OutputPath);
        }
        catch (IOException | InterruptedException e)
        {
            e.printStackTrace();
        }


  • How to send camera capture frames to a YouTube stream using ffmpeg

    2 March 2024, by 유혜진

    import subprocess 
import cv2

# YouTube streaming settings
YOUTUBE_URL = "rtmp://a.rtmp.youtube.com/live2/"
KEY = "..."

# OpenCV camera setup
cap = cv2.VideoCapture(0)
cap.set(cv2.CAP_PROP_FRAME_WIDTH, 640)
cap.set(cv2.CAP_PROP_FRAME_HEIGHT, 480)

# FFmpeg command for streaming
command = [r"C:\utility\ffmpeg\ffmpeg-2024-02-22-git-76b2bb96b4-full_build\ffmpeg-2024-02-22-git-76b2bb96b4-full_build\bin\ffmpeg.exe",
            '-f', 'rawvideo',
            '-pix_fmt', 'bgr24',
            '-s', '640x480',
            '-i', '-',
            '-ar', '44100',
            '-ac', '2',
            '-acodec', 'pcm_s16le',
            '-f', 's16le',
            '-ac', '2',
            '-i', 'NUL',   
            '-acodec', 'aac',
            '-ab', '128k',
            '-strict', 'experimental',
            '-vcodec', 'h264',
            '-pix_fmt', 'yuv420p',
            '-g', '50',
            '-vb', '1000k',
            '-profile:v', 'baseline',
            '-preset', 'ultrafast',
            '-r', '30',
            '-f', 'flv', 
            f"{YOUTUBE_URL}/{KEY}",]

# Open a subprocess with FFmpeg
pipe = subprocess.Popen(command, stdin=subprocess.PIPE)

while True:
    # Read a frame from the camera
    ret, frame = cap.read()
    if not ret:
        break

    # Display the frame
    cv2.imshow('Frame', frame)
    cv2.waitKey(1)  # Wait for 1ms

    # Send the frame through the pipe for streaming
    pipe.stdin.write(frame.tobytes())

    # Check for 'q' key press to stop streaming
    if cv2.waitKey(1) & 0xFF == ord('q'):
        break

# Release resources
cap.release()
cv2.destroyAllWindows()


    I'm trying to capture the camera image with OpenCV and transmit the frames to a YouTube live broadcast via ffmpeg. The YouTube stream does start when I run this code; however, it shows a black screen instead of the camera image. I don't see what the problem is.
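
    One low-level detail worth checking (an assumption about one possible cause, not a confirmed diagnosis): with -f rawvideo, -pix_fmt bgr24 and -s 640x480, ffmpeg interprets the bytes arriving on stdin strictly as 640x480 BGR frames, so the capture has to deliver frames of exactly that shape. A minimal sketch using the same OpenCV calls as the code above:

import cv2

cap = cv2.VideoCapture(0)
# Requesting a size is only a hint; some cameras and backends ignore it.
cap.set(cv2.CAP_PROP_FRAME_WIDTH, 640)
cap.set(cv2.CAP_PROP_FRAME_HEIGHT, 480)

ret, frame = cap.read()
cap.release()

if not ret:
    print("camera returned no frame")
else:
    # Expected: (480, 640, 3) and uint8; anything else is misinterpreted by the
    # rawvideo demuxer and typically shows up as a garbled or blank picture.
    print("frame shape:", frame.shape, "dtype:", frame.dtype)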

    At first the stream would not start at all; after changing various command options and rerunning the code, the stream did start. There are many references for transmitting an MP4 file, but not many for transmitting a real-time capture. I want to process the camera image with OpenCV and then send it to the stream. I don't know what the problem is. Please help me.