Advanced search

Media (91)

Other articles (80)

  • MediaSPIP Core: Configuration

    9 November 2010, by

    By default, MediaSPIP Core provides three separate configuration pages (these pages rely on the CFG configuration plugin to work): a page for the general configuration of the template; a page for the configuration of the site's home page; a page for the configuration of sections.
    It also provides an additional page, shown only when certain plugins are enabled, for controlling their display and specific features (...)

  • Adding notes and captions to images

    7 February 2011, by

    To add notes and captions to images, the first step is to install the "Légendes" plugin.
    Once the plugin is enabled, you can configure it in the configuration area to change the rights to create, modify and delete notes. By default, only site administrators can add notes to images.
    Changes when adding a media item
    When adding a media item of type "image", a new button appears above the preview (...)

  • Specific configuration for PHP5

    4 February 2011, by

    PHP5 is required; you can install it by following this dedicated tutorial.
    It is recommended to disable safe_mode at first; however, if it is configured correctly and the required binaries are accessible, MediaSPIP should work correctly with safe_mode enabled.
    Specific modules
    Certain specific PHP modules need to be installed, via your distribution's package manager or manually: php5-mysql for connectivity with the (...)
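
    As a hedged illustration only (package names vary by distribution), the module named above could be installed on a Debian/Ubuntu system with:

        sudo apt-get install php5-mysql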

On other sites (4642)

  • How do I get a video frame buffer stream from a connected GoPro camera?

    29 December 2024, by cleanrun

    I'm creating an app that can connect to a GoPro camera, and I want to get the frame buffer stream from the connected GoPro camera and use it in my iOS app (by converting the buffer data into CMSampleBuffer). I'm currently trying to use the FFmpeg library, but so far it doesn't work. Here's the logic I've implemented (I used ChatGPT to generate the code):

import Foundation
import CoreMedia
import AVFoundation
import ffmpegkit


final class FFmpegBufferProcessor: AnyBufferProcessor {
    weak var delegate: BufferProcessorDelegate?
    
    private var pipePath: String = NSTemporaryDirectory() + "ffmpeg_pipe"
    private var isProcessing: Bool = false
    private var videoWidth = 1920
    private var videoHeight = 1080
    private let pixelFormat = kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange
    
    init() {
        setupPipe()
    }
    
    deinit {
        cleanupPipe()
    }
    
    private func setupPipe() {
        do {
            if FileManager.default.fileExists(atPath: pipePath) {
                try FileManager.default.removeItem(atPath: pipePath)
            }
            
            let result = mkfifo(pipePath, 0o644)
            if result != 0 {
                print("\(#function); Pipe creation failed.")
                return
            }
        } catch {
            print("\(#function); Setup pipe error: \(error.localizedDescription)")
        }
    }
    
    private func cleanupPipe() {
        do {
            try FileManager.default.removeItem(atPath: pipePath)
        } catch {
            print("\(#function); Cleanup pipe error: \(error.localizedDescription)")
        }
    }
    
    func startProcessingStream(from udpURL: String) {
        guard !isProcessing else {
            print("\(#function); Already processing stream.")
            return
        }
        
        isProcessing = true
        let command = """
        -i \(udpURL) -f rawvideo -pix_fmt nv12 \(pipePath)
        """
        
        FFmpegKit.executeAsync(command) { [weak self] session in
            let returnCode = session?.getReturnCode()
            if ReturnCode.isSuccess(returnCode) {
                print("\(#function); FFmpeg session completed.")
            } else {
                print("\(#function); FFmpeg session error: \(String(describing: session?.getFailStackTrace())).")
            }
            
            self?.isProcessing = false
        }
        
        readFromPipe()
    }
    
    func stopProcessingStream() {
        isProcessing = false
        FFmpegKit.cancel()
    }
}

// MARK: - Private methods

private extension FFmpegBufferProcessor {
    func readFromPipe() {
        DispatchQueue.global(qos: .background).async { [unowned self] in
            guard let fileHandle = FileHandle(forReadingAtPath: self.pipePath) else {
                print("\(#function); Fail to read file handle from pipe path.")
                return
            }
            
            autoreleasepool {
                while self.isProcessing {
                    let frameSize = self.videoWidth * self.videoHeight * 3 / 2
                    let rawData = fileHandle.readData(ofLength: frameSize)
                    
                    if rawData.isEmpty {
                        print("\(#function); Pipe closed / no more data to read.")
                        break
                    }
                    
                    self.handleRawFrameData(rawData)
                }
                
                fileHandle.closeFile()
            }
        }
    }
    
    func handleRawFrameData(_ data: Data) {
        let width = videoWidth
        let height = videoHeight
        
        // Creating the Pixel Buffer (if possible)
        guard let pixelBuffer = createPixelBuffer(from: data, width: width, height: height) else {
            print("\(#function); Failed to create pixel buffer")
            return
        }
        
        var timing = CMSampleTimingInfo(duration: CMTime(value: 1, timescale: 30), presentationTimeStamp: .zero, decodeTimeStamp: .invalid)
        // Creating the Sample Buffer (if possible)
        guard let sampleBuffer = createSampleBuffer(from: pixelBuffer, timing: &timing) else {
            print("\(#function); Failed to create sample buffer")
            return
        }
        
        delegate?.bufferProcessor(self, didOutput: sampleBuffer)
    }
}
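
    The snippet above calls createPixelBuffer(from:width:height:) and createSampleBuffer(from:timing:) without showing them. Purely as a sketch of what such helpers could look like, assuming the raw stream is tightly packed NV12 (a full Y plane followed by an interleaved CbCr plane, matching -pix_fmt nv12):

import CoreMedia
import CoreVideo
import Foundation

// Hedged sketch only: copies a tightly packed NV12 frame into a CVPixelBuffer.
func createPixelBuffer(from data: Data, width: Int, height: Int) -> CVPixelBuffer? {
    var pixelBuffer: CVPixelBuffer?
    let status = CVPixelBufferCreate(kCFAllocatorDefault, width, height,
                                     kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange,
                                     nil, &pixelBuffer)
    guard status == kCVReturnSuccess, let buffer = pixelBuffer else { return nil }

    CVPixelBufferLockBaseAddress(buffer, [])
    defer { CVPixelBufferUnlockBaseAddress(buffer, []) }

    data.withUnsafeBytes { (src: UnsafeRawBufferPointer) in
        guard let base = src.baseAddress else { return }
        // Y plane: `height` rows of `width` bytes; the destination stride may differ from width.
        if let dstY = CVPixelBufferGetBaseAddressOfPlane(buffer, 0) {
            let rowBytes = CVPixelBufferGetBytesPerRowOfPlane(buffer, 0)
            for row in 0..<height {
                memcpy(dstY + row * rowBytes, base + row * width, width)
            }
        }
        // Interleaved CbCr plane: height / 2 rows of `width` bytes.
        if let dstUV = CVPixelBufferGetBaseAddressOfPlane(buffer, 1) {
            let rowBytes = CVPixelBufferGetBytesPerRowOfPlane(buffer, 1)
            let srcUV = base + width * height
            for row in 0..<(height / 2) {
                memcpy(dstUV + row * rowBytes, srcUV + row * width, width)
            }
        }
    }
    return buffer
}

// Hedged sketch only: wraps the pixel buffer in a ready-to-display CMSampleBuffer.
func createSampleBuffer(from pixelBuffer: CVPixelBuffer, timing: inout CMSampleTimingInfo) -> CMSampleBuffer? {
    var formatDescription: CMVideoFormatDescription?
    CMVideoFormatDescriptionCreateForImageBuffer(allocator: kCFAllocatorDefault,
                                                 imageBuffer: pixelBuffer,
                                                 formatDescriptionOut: &formatDescription)
    guard let format = formatDescription else { return nil }

    var sampleBuffer: CMSampleBuffer?
    CMSampleBufferCreateReadyWithImageBuffer(allocator: kCFAllocatorDefault,
                                             imageBuffer: pixelBuffer,
                                             formatDescription: format,
                                             sampleTiming: &timing,
                                             sampleBufferOut: &sampleBuffer)
    return sampleBuffer
}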

    Here are the logs I'm getting from FFmpeg:

    Debug log

    Also a quick note, I'm using AVSampleBufferDisplayLayer to enqueue and show the buffers, but obviously it doesn't show up.
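
    For what it's worth, the delegate side of that would just enqueue each buffer; a minimal sketch, assuming a displayLayer property of type AVSampleBufferDisplayLayer already attached to a view (the method name follows the delegate call in the code above, and displayLayer is hypothetical):

func bufferProcessor(_ processor: AnyBufferProcessor, didOutput sampleBuffer: CMSampleBuffer) {
    // `displayLayer` is a hypothetical AVSampleBufferDisplayLayer in the view hierarchy.
    if displayLayer.isReadyForMoreMediaData {
        displayLayer.enqueue(sampleBuffer)
    }
}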

    What should I do to fix this? Or is there another way to get the frame buffers from a GoPro camera and show them in iOS? Any help would be appreciated. Thank you.

  • Flutter FFmpeg: The BackgroundIsolateBinaryMessenger.instance value is invalid until BackgroundIsolateBinaryMessenger.ensureInitialized is executed

    25 June 2023, by Danny

    Hey guys, I have a function that uses ffmpeg to convert images to GIFs. I am using the simple compute function provided by Flutter, but I am getting this error.

    I/flutter (12889): Loading ffmpeg-kit-flutter.
    E/flutter (12889): [ERROR:flutter/runtime/dart_vm_initializer.cc(41)] Unhandled Exception: Bad state: The BackgroundIsolateBinaryMessenger.instance value is invalid until BackgroundIsolateBinaryMessenger.ensureInitialized is executed.
    E/flutter (12889): #0  BackgroundIsolateBinaryMessenger.instance

    Logs:

    I/flutter (12889): Loading ffmpeg-kit-flutter.
    E/flutter (12889): [ERROR:flutter/runtime/dart_vm_initializer.cc(41)] Unhandled Exception: Bad state: The BackgroundIsolateBinaryMessenger.instance value is invalid until BackgroundIsolateBinaryMessenger.ensureInitialized is executed.
    E/flutter (12889): #0  BackgroundIsolateBinaryMessenger.instance (package:flutter/src/services/_background_isolate_binary_messenger_io.dart:27:7)
    E/flutter (12889): #1  _findBinaryMessenger (package:flutter/src/services/platform_channel.dart:135:42)
    E/flutter (12889): #2  EventChannel.binaryMessenger (package:flutter/src/services/platform_channel.dart:619:27)
    E/flutter (12889): #3  EventChannel.receiveBroadcastStream. (package:flutter/src/services/platform_channel.dart:639:7)
    E/flutter (12889): #4  _runGuarded (dart:async/stream_controller.dart:815:24)
    E/flutter (12889): #5  _BroadcastStreamController._subscribe (dart:async/broadcast_stream_controller.dart:207:7)
    E/flutter (12889): #6  _ControllerStream._createSubscription (dart:async/stream_controller.dart:828:19)
    E/flutter (12889): #7  _StreamImpl.listen (dart:async/stream_impl.dart:471:9)
    E/flutter (12889): #8  FFmpegKitInitializer._initialize (package:ffmpeg_kit_flutter_min_gpl/src/ffmpeg_kit_flutter_initializer.dart:311:44)
    E/flutter (12889): #9  FFmpegKitInitializer.initialize (package:ffmpeg_kit_flutter_min_gpl/src/ffmpeg_kit_flutter_initializer.dart:54:23)
    E/flutter (12889): #10 FFmpegKitConfig.init (package:ffmpeg_kit_flutter_min_gpl/ffmpeg_kit_config.dart:50:32)
    E/flutter (12889): #11 AbstractSession.createFFmpegSession (package:ffmpeg_kit_flutter_min_gpl/abstract_session.dart:69:29)
    E/flutter (12889): #12 FFmpegSession.create (package:ffmpeg_kit_flutter_min_gpl/ffmpeg_session.dart:40:43)
    E/flutter (12889): #13 FFmpegKit.executeWithArguments (package:ffmpeg_kit_flutter_min_gpl/ffmpeg_kit.dart:44:29)
    E/flutter (12889): #14 FFmpegKit.execute (package:ffmpeg_kit_flutter_min_gpl/ffmpeg_kit.dart:38:17)
    E/flutter (12889): #15 _shareMoments (package:carefour/presentation/maker/create_function.dart:217:19)
    E/flutter (12889): #16 compute. (package:flutter/src/foundation/_isolates_io.dart:19:20)
    E/flutter (12889): #17 _RemoteRunner._run (dart:isolate:1021:47)
    E/flutter (12889): #18 _RemoteRunner._remoteExecute (dart:isolate:1015:12)
    E/flutter (12889): #19 _delayEntrypointInvocation. (dart:isolate-patch/isolate_patch.dart:299:17)
    E/flutter (12889): #20 _RawReceivePort._handleMessage (dart:isolate-patch/isolate_patch.dart:189:12)

    This is the code:

    Future<bool> shareMoments(ComputeMomentModel data) async {
      File? imgFile;
      File? paletteFile;
      var uuid = const Uuid();
      String newUuid = uuid.v4();
      String finalImagePath = "momentGif-$newUuid.gif";
      String paletteFileName = "momentPalette-$newUuid.png";
      File? finalImage;
      finalImage = null;

      await FFmpegKit.execute('-i ${data.directoryPath}/image%d.png -vf palettegen ${data.directoryPath}/$paletteFileName').then((session) async {
        final returnCode = await session.getReturnCode();

        if (ReturnCode.isSuccess(returnCode)) {
          paletteFile = File("${data.directoryPath}/$paletteFileName");

          await FFmpegKit.execute('-f image2 -y -r 8 -i ${data.directoryPath}/image%d.png -i ${paletteFile?.path} -filter_complex fps=8,scale=720:-1:flags=lanczos,split[s0][s1];[s0]palettegen=max_colors=32[p];[s1][p]paletteuse=dither=bayer ${data.directoryPath}/$finalImagePath').then((session) async {
            final returnCode = await session.getReturnCode();
            if (ReturnCode.isSuccess(returnCode)) {
              finalImage = File("${data.directoryPath}/$finalImagePath");
            }
          });

        } else {
          debugPrint("Failed");
        }
      });

      // Below block to clear the cache - else ffmpeg keeps on creating the first one
      int i = 0;
      for (i = 0; i < 24; i++) {
        imgFile = File('${data.directoryPath}/image$i.png');
        imgFile.delete(recursive: true);
      }

      // —> Calling Backend API

      return true;
    }

    The compute function:

    Future<bool> computeShareMoments({required String directoryPath}) async {
      ComputeCreateMomentModel data = ComputeMomentModel(null, directoryPath);
      return await compute(shareMoments, data);
    }
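
    The error message itself points at BackgroundIsolateBinaryMessenger.ensureInitialized: plugins that use platform channels (such as ffmpeg_kit_flutter) cannot run inside a compute isolate until the background binary messenger has been initialized with the root isolate token. A hedged sketch of how the entry point could be adapted (assumes Flutter 3.7 or newer; the _ShareMomentsArgs wrapper and _shareMomentsEntry names are hypothetical):

    import 'dart:ui' show RootIsolateToken;

    import 'package:flutter/foundation.dart';
    import 'package:flutter/services.dart';

    // Hypothetical argument wrapper so the root isolate token travels with the data.
    class _ShareMomentsArgs {
      const _ShareMomentsArgs(this.token, this.directoryPath);
      final RootIsolateToken token;
      final String directoryPath;
    }

    Future<bool> computeShareMoments({required String directoryPath}) async {
      final token = RootIsolateToken.instance!;
      return compute(_shareMomentsEntry, _ShareMomentsArgs(token, directoryPath));
    }

    Future<bool> _shareMomentsEntry(_ShareMomentsArgs args) async {
      // Must run before any platform-channel call inside this background isolate.
      BackgroundIsolateBinaryMessenger.ensureInitialized(args.token);
      return shareMoments(ComputeMomentModel(null, args.directoryPath));
    }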

    Can anyone help with a solution? I have been stuck for a while now. Thanks in advance.

  • How to build ffmpeg to burn text onto HLS output while maintaining the aspect ratio

    26 February 2016, by Aameer

    My objective is to burn text (a watermark) over a particular time range of the output HLS video. I also need to change just the height while maintaining the aspect ratio, which I am able to do, but the text-burning part is not working. The ffmpeg I used is built for Ubuntu 14.04; details are here. I don't think this build supports the subtitles (docs) filter, which I could have used for my purpose as described here. When I enter ffmpeg into the terminal, this is the output:

    (ffmpeg)aameer@falcon:~/Documents/projects/ffmpeg$ ffmpeg
    ffmpeg version N-78590-g5590ab4 Copyright (c) 2000-2016 the FFmpeg developers
     built with gcc 4.8 (Ubuntu 4.8.4-2ubuntu1~14.04.1)
     configuration: --extra-libs=-ldl --prefix=/opt/ffmpeg --mandir=/usr/share/man --enable-avresample --disable-debug --enable-nonfree --enable-gpl --enable-version3 --enable-libopencore-amrnb --enable-libopencore-amrwb --disable-decoder=amrnb --disable-decoder=amrwb --enable-libpulse --enable-libfreetype --enable-gnutls --enable-libx264 --enable-libx265 --enable-libfdk-aac --enable-libvorbis --enable-libmp3lame --enable-libopus --enable-libvpx --enable-libspeex --enable-libass --enable-avisynth --enable-libsoxr --enable-libxvid --enable-libvidstab
     libavutil      55. 18.100 / 55. 18.100
     libavcodec     57. 24.103 / 57. 24.103
     libavformat    57. 25.100 / 57. 25.100
     libavdevice    57.  0.101 / 57.  0.101
     libavfilter     6. 32.100 /  6. 32.100
     libavresample   3.  0.  0 /  3.  0.  0
     libswscale      4.  0.100 /  4.  0.100
     libswresample   2.  0.101 /  2.  0.101
     libpostproc    54.  0.100 / 54.  0.100
    Hyper fast Audio and Video encoder
    usage: ffmpeg [options] [[infile options] -i infile]... {[outfile options] outfile}...

    Use -h to get full help or, even better, run 'man ffmpeg'

    I tried

    (ffmpeg)aameer@falcon:~/Documents/projects/ffmpeg$ ffmpeg -threads 4 -i "input_sintel_trailer_720p.mp4" -profile:v baseline -level 4.0 -vf scale="trunc(360*a/2)*2:360, subtitles='subtitles.srt'"  -start_number 0 -hls_time 10 -hls_list_size 0 -f hls videos/4444/index_4444_360_.m3u8

    and got an error

    [Parsed_subtitles_1 @ 0x2f1b7e0] Shaper: FriBidi 0.19.6 (SIMPLE) HarfBuzz-ng 0.9.27 (COMPLEX)
    [Parsed_subtitles_1 @ 0x2f1b7e0] Unable to open subtitles.srt
    [AVFilterGraph @ 0x2f1ac00] Error initializing filter 'subtitles' with args 'subtitles.srt'
    Error opening filters!

    Then I tried this:

    (ffmpeg)aameer@falcon:~/Documents/projects/ffmpeg$ ffmpeg -threads 4 -i "input_sintel_trailer_720p.mp4" -profile:v baseline -level 4.0 -vf "scale='trunc(360*a/2)*2:360', drawtext:drawtext"  -start_number 0 -hls_time 10 -hls_list_size 0 -f hls videos/4444/index_4444_360_.m3u8

    based on an answer here, and got this error:

    [AVFilterGraph @ 0x2695c20] No such filter: 'drawtext:drawtext'
    Error opening filters!

    I tried a static build too, which I got from here, but still couldn't make it work. With this static build I tried:

    ffmpeg version N-63893-gc69defd Copyright (c) 2000-2014 the FFmpeg developers
     built on Jul 16 2014 05:38:01 with gcc 4.6 (Debian 4.6.3-1)
     configuration: --prefix=/root/ffmpeg-static/64bit --extra-cflags='-I/root/ffmpeg-static/64bit/include -static' --extra-ldflags='-L/root/ffmpeg-static/64bit/lib -static' --extra-libs='-lxml2 -lexpat -lfreetype' --enable-static --disable-shared --disable-ffserver --disable-doc --enable-bzlib --enable-zlib --enable-postproc --enable-runtime-cpudetect --enable-libx264 --enable-gpl --enable-libtheora --enable-libvorbis --enable-libmp3lame --enable-gray --enable-libass --enable-libfreetype --enable-libopenjpeg --enable-libspeex --enable-libvo-aacenc --enable-libvo-amrwbenc --enable-version3 --enable-libvpx
     libavutil      52. 89.100 / 52. 89.100
     libavcodec     55. 66.101 / 55. 66.101
     libavformat    55. 43.100 / 55. 43.100
     libavdevice    55. 13.101 / 55. 13.101
     libavfilter     4.  8.100 /  4.  8.100
     libswscale      2.  6.100 /  2.  6.100
     libswresample   0. 19.100 /  0. 19.100
     libpostproc    52.  3.100 / 52.  3.100
    Hyper fast Audio and Video encoder
    usage: ffmpeg [options] [[infile options] -i infile]... {[outfile options] outfile}...

    Use -h to get full help or, even better, run 'man ffmpeg'

    (ffmpeg)aameer@falcon:~/Documents/projects/ffmpeg$ ./ffmpeg -threads 4 -i "input_sintel_trailer_720p.mp4" -profile:v baseline -level 4.0 -vf scale="trunc(360*a/2)*2:360, subtitles='subtitle.srt'"  -start_number 0 -hls_time 10 -hls_list_size 0 -f hls videos/4444/index_4444_360_.m3u8

    but got an error

    Could not write header for output file #0 (incorrect codec parameters ?): No such file or directory

    Any help in this regard would be appreciated. I don't have much experience with compiling, otherwise I would have built it with support for libass as mentioned in the documentation here.
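
    For reference, drawtext is a single filter that takes key=value options (the 'drawtext:drawtext' string tried above is not valid filter syntax), and it should be compiled into the first build shown above since --enable-libfreetype is listed. A hedged sketch of the kind of command that scales to 360p while keeping the aspect ratio and burns text over a time range (the font path, text and time range below are placeholders, not taken from the question):

        ffmpeg -threads 4 -i input_sintel_trailer_720p.mp4 -profile:v baseline -level 4.0 \
          -vf "scale=-2:360,drawtext=fontfile=/usr/share/fonts/truetype/dejavu/DejaVuSans.ttf:text='WATERMARK':fontcolor=white:fontsize=24:x=10:y=10:enable='between(t,5,15)'" \
          -start_number 0 -hls_time 10 -hls_list_size 0 -f hls videos/4444/index_4444_360_.m3u8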

    Console output after trying the first potential answer:

    (ffmpeg)aameer@falcon:~/Documents/projects/ffmpeg$ ls
    bbb_sunflower_1080p_30fps_normal.mp4  ffmpeg_log_multiprocess.txt  ffmpeg.static.64bit.latest  input_sintel_trailer_720p.mp4  watermark.png
    encoding_script.sh                    ffmpeg_log_simple.txt        ffprobe                     subtitle.srt
    ffmpeg                                ffmpeg_log.txt               fontconfig                  videos
    (ffmpeg)aameer@falcon:~/Documents/projects/ffmpeg$ ffmpeg -threads 4 -i input_sintel_trailer_720p.mp4 -profile:v baseline -level 4.0 -vf "scale=-2:360,subtitles='subtitle.srt'" -start_number 0 -hls_time 10 -hls_list_size 0 -f hls videos/4444/index_4444_360_.m3u8
    ffmpeg version N-78590-g5590ab4 Copyright (c) 2000-2016 the FFmpeg developers
     built with gcc 4.8 (Ubuntu 4.8.4-2ubuntu1~14.04.1)
     configuration: --extra-libs=-ldl --prefix=/opt/ffmpeg --mandir=/usr/share/man --enable-avresample --disable-debug --enable-nonfree --enable-gpl --enable-version3 --enable-libopencore-amrnb --enable-libopencore-amrwb --disable-decoder=amrnb --disable-decoder=amrwb --enable-libpulse --enable-libfreetype --enable-gnutls --enable-libx264 --enable-libx265 --enable-libfdk-aac --enable-libvorbis --enable-libmp3lame --enable-libopus --enable-libvpx --enable-libspeex --enable-libass --enable-avisynth --enable-libsoxr --enable-libxvid --enable-libvidstab
     libavutil      55. 18.100 / 55. 18.100
     libavcodec     57. 24.103 / 57. 24.103
     libavformat    57. 25.100 / 57. 25.100
     libavdevice    57.  0.101 / 57.  0.101
     libavfilter     6. 32.100 /  6. 32.100
     libavresample   3.  0.  0 /  3.  0.  0
     libswscale      4.  0.100 /  4.  0.100
     libswresample   2.  0.101 /  2.  0.101
     libpostproc    54.  0.100 / 54.  0.100
    Input #0, mov,mp4,m4a,3gp,3g2,mj2, from 'input_sintel_trailer_720p.mp4':
     Metadata:
       major_brand     : isom
       minor_version   : 512
       compatible_brands: isomiso2avc1mp41
       creation_time   : 1970-01-01 00:00:00
       title           : Sintel Trailer
       artist          : Durian Open Movie Team
       encoder         : Lavf52.62.0
       copyright       : (c) copyright Blender Foundation | durian.blender.org
       description     : Trailer for the Sintel open movie project
     Duration: 00:00:52.21, start: 0.000000, bitrate: 1165 kb/s
       Stream #0:0(und): Video: h264 (High) (avc1 / 0x31637661), yuv420p, 1280x720, 1033 kb/s, 24 fps, 24 tbr, 24 tbn, 48 tbc (default)
       Metadata:
         creation_time   : 1970-01-01 00:00:00
         handler_name    : VideoHandler
       Stream #0:1(und): Audio: aac (LC) (mp4a / 0x6134706D), 48000 Hz, stereo, fltp, 126 kb/s (default)
       Metadata:
         creation_time   : 1970-01-01 00:00:00
         handler_name    : SoundHandler
    [Parsed_subtitles_1 @ 0x2bdb700] Shaper: FriBidi 0.19.6 (SIMPLE) HarfBuzz-ng 0.9.27 (COMPLEX)
    [Parsed_subtitles_1 @ 0x2bdb700] Using font provider fontconfig
    [libx264 @ 0x2bc8b60] using cpu capabilities: MMX2 SSE2Fast SSSE3 SSE4.2 AVX FMA3 AVX2 LZCNT BMI2
    [libx264 @ 0x2bc8b60] profile Constrained Baseline, level 4.0
    Output #0, hls, to 'videos/4444/index_4444_360_.m3u8':
     Metadata:
       major_brand     : isom
       minor_version   : 512
       compatible_brands: isomiso2avc1mp41
       description     : Trailer for the Sintel open movie project
       title           : Sintel Trailer
       artist          : Durian Open Movie Team
       copyright       : (c) copyright Blender Foundation | durian.blender.org
       encoder         : Lavf57.25.100
       Stream #0:0(und): Video: h264 (libx264), yuv420p, 640x360, q=-1--1, 24 fps, 24 tbn, 24 tbc (default)
       Metadata:
         creation_time   : 1970-01-01 00:00:00
         handler_name    : VideoHandler
         encoder         : Lavc57.24.103 libx264
       Side data:
         cpb: bitrate max/min/avg: 0/0/0 buffer size: 0 vbv_delay: -1
       Stream #0:1(und): Audio: aac (LC), 48000 Hz, stereo, fltp, 128 kb/s (default)
       Metadata:
         creation_time   : 1970-01-01 00:00:00
         handler_name    : SoundHandler
         encoder         : Lavc57.24.103 aac
    Stream mapping:
     Stream #0:0 -> #0:0 (h264 (native) -> h264 (libx264))
     Stream #0:1 -> #0:1 (aac (native) -> aac (native))
    Could not write header for output file #0 (incorrect codec parameters ?): No such file or directory
    [aac @ 0x2bc9ac0] Qavg: -nan
    (ffmpeg)aameer@falcon:~/Documents/projects/ffmpeg$