
Other articles (9)
-
Requesting the creation of a channel
12 March 2010. Depending on how the platform is configured, the user may have two different ways of requesting the creation of a channel: the first at the time of registration, the second after registration, by filling in a request form.
Both approaches ask for the same information and work in much the same way: the prospective user must fill in a series of form fields which, first of all, give the administrators information about (...) -
Organising by category
17 May 2013. In MediaSPIP, a section (rubrique) goes by two names: category and rubrique.
The various documents stored in MediaSPIP can be filed under different categories. A category can be created by clicking on "publier une catégorie" (publish a category) in the publish menu at the top right (after logging in). A category can itself be filed under another category, which means a tree of categories can be built.
The next time a document is published, the newly created category will be offered (...) -
MediaSPIP themes
4 June 2013. Three themes are provided with MediaSPIP out of the box. MediaSPIP users can add further themes to suit their needs.
MediaSPIP themes
Three themes were initially developed for MediaSPIP: * SPIPeo: the default MediaSPIP theme. It highlights the site's presentation and the most recent media documents (the sort order can be changed: title, popularity, date). * Arscenic: the theme used on the project's official website, featuring in particular a red banner at the top of the page. The structure (...)
On other sites (3182)
-
Flutter error in FFmpeg, "Unhandled Exception: ProcessException: No such file or directory" in macOS desktop version
19 April 2024, by pratik vekariya. I'm trying to trim a video using ffmpeg for a macOS desktop application.


I have downloaded ffmpeg from here for macOS.


Here is my code


String mainPath = 'Users/apple/Workspace/User/2024/Project/videoapp/build/macos/Build/Products/Debug/';
 mainPath = mainPath.substring(0, mainPath.lastIndexOf("/"));
 
 Directory directoryExe3 = Directory("$mainPath");
 var dbPath = path.join(directoryExe3.path,
 "App.framework/Resources/flutter_assets/assets/ffmpeg/ffmpegmacos");
// The desktop app is generated in the "Products/Debug/" folder.

// directoryExe3.path will be: Users/apple/Workspace/User/2024/Project/videoapp/build/macos/Build/Products/Debug

// dbPath will be: Users/apple/Workspace/User/2024/Project/videoapp/build/macos/Build/Products/Debug/App.framework/Resources/flutter_assets/assets/ffmpeg/ffmpegmacos

// so the app should be able to access the binary from this path at run time

// The ffmpeg command to execute:

String transpose_str = "crop=" +
 out_w.toInt().toString() +
 ":" +
 out_h.toInt().toString() +
 ":" +
 x!.toInt().toString() +
 ":" +
 y!.toInt().toString() +
 ",";
 transpose_str += "scale=960:192";

Future<ProcessResult> result_ = Process.run(dbPath, [
 "-ss",
 timestamp,
 "-i",
 inputFilePath,
 "-t",
 endTime,
 "-vf",
 transpose_str,
 "-an",
 "./temp.mp4",
 ]);
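For reference, the Process.run call above corresponds roughly to the command line below; the timestamps and crop geometry are placeholders rather than values taken from the question.

# Hypothetical equivalent of the Process.run invocation above (placeholder values)
./ffmpegmacos -ss 00:00:05 -i input.mp4 -t 00:00:10 \
 -vf "crop=960:192:0:0,scale=960:192" -an ./temp.mp4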


Now when I run this in the macOS desktop version, Process.run throws an error for dbPath: Unhandled Exception: ProcessException: No such file or directory.


Any help would be appreciated!


When I run this as a desktop version it should pick up the file from the assets.
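As a quick sanity check, it may be worth confirming from a terminal that the binary really exists at the path the code builds and carries the execute bit; note also that the hard-coded mainPath has no leading slash, so it is resolved as a relative path. The path below is the hypothetical one from the comments above, and FFBIN is just a placeholder variable.

# Sanity checks for the bundled binary (hypothetical path taken from the comments above)
FFBIN="/Users/apple/Workspace/User/2024/Project/videoapp/build/macos/Build/Products/Debug/App.framework/Resources/flutter_assets/assets/ffmpeg/ffmpegmacos"
ls -l "$FFBIN"    # does the file exist at this exact absolute path?
file "$FFBIN"     # should report a Mach-O executable
chmod +x "$FFBIN" # Flutter assets are not guaranteed to keep the execute bit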


-
ffmpeg - convert MP4 to Panasonic Lumix playable MJPEG [closed]
21 August 2024, by Maro Natts. I'm new to ffmpeg and ffprobe; I started using them to convert videos to show on a 3DS and DSi for an art project.


Now I want to show my own videos on a Panasonic LUMIX digicam (DMC-FT2)(JPN), but have had trouble getting the device to play the video.


The camera records videos in AVCHD Lite and Motion JPEG.


I've tried converting the mp4 to the respective formats with online converters and changing the file name, but the device only recognises that the file is there; it refuses to, or can't, play it.


I also tried using this command to convert to MJPEG:

ffmpeg -i input.mp4 -c:v mjpeg -q:v 2 -an output.mjpeg


But no luck! I changed the filename to match the others, and the camera recognised that the file was there, but couldn't play it.


So I used this command to get the information of an MJPEG video taken on the camera:

/opt/ffmpeg/ffprobe file.mp4 -show_streams -select_streams v -print_format json


and got the following. Is it possible to convert my own video (mp4) to these settings, so that it can play on the device? If so, how?


Thanks!


Metadata:
 major_brand : qt 
 minor_version : 537331972
 compatible_brands: qt pana
 creation_time : 2024-08-21T21:54:59.000000Z
 Duration: 00:00:06.00, start: 0.000000, bitrate: 12287 kb/s
 Stream #0:0[0x1](eng): Video: mjpeg (Baseline) (jpeg / 0x6765706A), yuvj420p(pc, bt470bg/unknown/unknown), 640x480, 11083 kb/s, 30 fps, 30 tbr, 30 tbn (default)
 Metadata:
 creation_time : 2024-08-21T21:54:59.000000Z
 vendor_id : pana
 encoder : Photo - JPEG
 Stream #0:1[0x2](eng): Audio: pcm_s16be (twos / 0x736F7774), 16000 Hz, 1 channels, s16, 256 kb/s (default)
 Metadata:
 creation_time : 2024-08-21T21:54:59.000000Z
 vendor_id : pana
 "streams": [
 {
 "index": 0,
 "codec_name": "mjpeg",
 "codec_long_name": "Motion JPEG",
 "profile": "Baseline",
 "codec_type": "video",
 "codec_tag_string": "jpeg",
 "codec_tag": "0x6765706a",
 "width": 640,
 "height": 480,
 "coded_width": 640,
 "coded_height": 480,
 "closed_captions": 0,
 "film_grain": 0,
 "has_b_frames": 0,
 "pix_fmt": "yuvj420p",
 "level": -99,
 "color_range": "pc",
 "color_space": "bt470bg",
 "chroma_location": "center",
 "refs": 1,
 "id": "0x1",
 "r_frame_rate": "30/1",
 "avg_frame_rate": "30/1",
 "time_base": "1/30",
 "start_pts": 0,
 "start_time": "0.000000",
 "duration_ts": 180,
 "duration": "6.000000",
 "bit_rate": "11083093",
 "bits_per_raw_sample": "8",
 "nb_frames": "180",
 "disposition": {
 "default": 1,
 "dub": 0,
 "original": 0,
 "comment": 0,
 "lyrics": 0,
 "karaoke": 0,
 "forced": 0,
 "hearing_impaired": 0,
 "visual_impaired": 0,
 "clean_effects": 0,
 "attached_pic": 0,
 "timed_thumbnails": 0,
 "non_diegetic": 0,
 "captions": 0,
 "descriptions": 0,
 "metadata": 0,
 "dependent": 0,
 "still_image": 0
 },
 "tags": {
 "creation_time": "2024-08-21T21:54:59.000000Z",
 "language": "eng",
 "vendor_id": "pana",
 "encoder": "Photo - JPEG"
 }
 }
 ]
}
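For what it's worth, an ffmpeg invocation along the following lines would target the parameters reported above (Baseline MJPEG in yuvj420p at 640x480 and 30 fps, 16 kHz mono pcm_s16be audio, QuickTime container). It is only a sketch: whether the DMC-FT2 actually plays the result may also depend on details not shown here, such as the camera's file-naming scheme and the qt/pana brands.

# A sketch, not verified on the camera itself: match the stream parameters from the ffprobe output
ffmpeg -i input.mp4 \
 -vf "scale=640:480,fps=30" -c:v mjpeg -q:v 2 -pix_fmt yuvj420p \
 -c:a pcm_s16be -ar 16000 -ac 1 \
 -f mov output.mov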




-
Unable to stream video file from MediaMTX media server to browser via WebRTC
8 June 2024, by thegreatjedi. I took over a repository at work. It's a working demo comprising a web server that receives video and camera feeds from a media server (built from the rtsp-simple-server Docker image) via an RTSP relay server and streams the feeds to the client, all deployed via Docker Compose.


I'm trying to switch over to WebRTC instead. rtsp-simple-server has been renamed MediaMTX since the demo was created two years ago. This is the relevant section of the updated Docker Compose configuration:


media-server:
 image: bluenviron/mediamtx:latest-ffmpeg
 expose:
 - 8889
 init: true
 ports:
 - 8889:8889
 restart: unless-stopped
 volumes:
 - type: bind
 source: ./demo/vids
 target: /vids
 - type: bind
 source: ./demo/mediamtx.yml
 target: /mediamtx.yml



Relevant part of the custom MediaMTX configuration in mediamtx.yml:

###############################################
# Path settings

# Settings in "paths" are applied to specific paths, and the map key
# is the name of the path.
# Any setting in "pathDefaults" can be overridden here.
# It's possible to use regular expressions by using a tilde as prefix,
# for example "~^(test1|test2)$" will match both "test1" and "test2",
# for example "~^prefix" will match all paths that start with "prefix".
paths:
 # example:
 # my_camera:
 # source: rtsp://my_camera
 ~^demo\d+$:
 runOnDemand: ffmpeg -re -stream_loop -1 -i /vids/$MTX_PATH.mp4 -c:v libvpx -b:v 0 -crf 18 -qmin 18 -qmax 18 -f webm http://localhost:8889/$MTX_PATH/whip

 # Settings under path "all_others" are applied to all paths that
 # do not match another entry.
 all_others:



I've absolutely no experience with WebRTC. This is my first time hearing of this protocol, let alone working with it. From what I understand, I need to convert my demo mp4 videos (which were successfully streaming via RTSP in the previous implementation) to a compatible video codec, so I've opted for VP8.
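The conversion step in the runOnDemand command above is roughly equivalent to this standalone invocation (a sketch with assumed flags; the WHIP output is replaced by a local webm file):

# Offline mp4 -> VP8/webm conversion (assumed flags, mirroring the runOnDemand options above)
ffmpeg -i demo0.mp4 -c:v libvpx -b:v 0 -crf 18 -qmin 18 -qmax 18 -an demo0.webm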


Before trying to stream the videos into my web server, I tested the stream directly in the browser (with the latest versions of both Chrome and Edge). I went to http://localhost:8889/demo0 (which should convert demo0.mp4 to VP8 and then stream it over WebRTC). The video player loaded in the browser but no video data was received and nothing played. After several seconds, the screen displayed "Error: bad status code 400, retrying in some seconds". In the browser console, it showed:

Failed to load resource: the server responded with a status of 400 (Bad Request)


Inside the MediaMTX container's runtime logs, this is what's displayed:


2024-04-02 14:53:08 ffmpeg version 6.1.1 Copyright (c) 2000-2023 the FFmpeg developers
2024-04-02 14:53:08 built with gcc 13.2.1 (Alpine 13.2.1_git20231014) 20231014
2024-04-02 14:53:08 configuration: --prefix=/usr --disable-librtmp --disable-lzma --disable-static --disable-stripping --enable-avfilter --enable-gpl --enable-ladspa --enable-libaom --enable-libass --enable-libbluray --enable-libdav1d --enable-libdrm --enable-libfontconfig --enable-libfreetype --enable-libfribidi --enable-libharfbuzz --enable-libmp3lame --enable-libopenmpt --enable-libopus --enable-libplacebo --enable-libpulse --enable-librav1e --enable-librist --enable-libsoxr --enable-libsrt --enable-libssh --enable-libtheora --enable-libv4l2 --enable-libvidstab --enable-libvorbis --enable-libvpx --enable-libwebp --enable-libx264 --enable-libx265 --enable-libxcb --enable-libxml2 --enable-libxvid --enable-libzimg --enable-libzmq --enable-lto=auto --enable-lv2 --enable-openssl --enable-pic --enable-postproc --enable-pthreads --enable-shared --enable-vaapi --enable-vdpau --enable-version3 --enable-vulkan --optflags=-O3 --enable-libjxl --enable-libsvtav1 --enable-libvpl
2024-04-02 14:53:08 libavutil 58. 29.100 / 58. 29.100
2024-04-02 14:53:08 libavcodec 60. 31.102 / 60. 31.102
2024-04-02 14:53:08 libavformat 60. 16.100 / 60. 16.100
2024-04-02 14:53:08 libavdevice 60. 3.100 / 60. 3.100
2024-04-02 14:53:08 libavfilter 9. 12.100 / 9. 12.100
2024-04-02 14:53:08 libswscale 7. 5.100 / 7. 5.100
2024-04-02 14:53:08 libswresample 5. 0.100 / 5. 0.100
2024-04-02 14:53:08 libpostproc 57. 3.100 / 57. 3.100
2024-04-02 14:53:08 Input #0, mov,mp4,m4a,3gp,3g2,mj2, from '/vids/demo0.mp4':
2024-04-02 14:53:08 Metadata:
2024-04-02 14:53:08 major_brand : isom
2024-04-02 14:53:08 minor_version : 512
2024-04-02 14:53:08 compatible_brands: isomiso2mp41
2024-04-02 14:53:08 encoder : Lavf58.76.100
2024-04-02 14:53:08 Duration: 00:00:03.47, start: 0.000000, bitrate: 1675 kb/s
2024-04-02 14:53:08 Stream #0:0[0x1](und): Video: mpeg1video (mp4v / 0x7634706D), yuv420p(tv, progressive), 640x360 [SAR 1:1 DAR 16:9], 104857 kb/s, 30 fps, 30 tbr, 90k tbn (default)
2024-04-02 14:53:08 Metadata:
2024-04-02 14:53:08 handler_name : VideoHandler
2024-04-02 14:53:08 vendor_id : [0][0][0][0]
2024-04-02 14:53:08 Side data:
2024-04-02 14:53:08 cpb: bitrate max/min/avg: 0/0/0 buffer size: 49152 vbv_delay: N/A
2024-04-02 14:53:08 Stream mapping:
2024-04-02 14:53:08 Stream #0:0 -> #0:0 (mpeg1video (native) -> vp8 (libvpx))
2024-04-02 14:53:08 Press [q] to stop, [?] for help
2024-04-02 14:53:08 [libvpx @ 0x7faa8591b8c0] v1.13.1
2024-04-02 14:53:08 [libvpx @ 0x7faa8591b8c0] Bitrate not specified for constrained quality mode, using default of 256kbit/sec
2024-04-02 14:53:08 Output #0, webm, to 'http://localhost:8889/demo0/whip':
2024-04-02 14:53:08 Metadata:
2024-04-02 14:53:08 major_brand : isom
2024-04-02 14:53:08 minor_version : 512
2024-04-02 14:53:08 compatible_brands: isomiso2mp41
2024-04-02 14:53:08 encoder : Lavf60.16.100
2024-04-02 14:53:08 Stream #0:0(und): Video: vp8, yuv420p(tv, progressive), 640x360 [SAR 1:1 DAR 16:9], q=2-31, 256 kb/s, 30 fps, 1k tbn (default)
2024-04-02 14:53:08 Metadata:
2024-04-02 14:53:08 handler_name : VideoHandler
2024-04-02 14:53:08 vendor_id : [0][0][0][0]
2024-04-02 14:53:08 encoder : Lavc60.31.102 libvpx
2024-04-02 14:53:08 Side data:
2024-04-02 14:53:08 cpb: bitrate max/min/avg: 0/0/0 buffer size: 0 vbv_delay: N/A
2024-04-02 14:53:18 2024/04/02 06:53:18 INF [path demo0] runOnDemand command stopped: timed out
2024-04-02 14:53:18 2024/04/02 06:53:18 INF [WebRTC] [session 0f460c76] closed: source of path 'demo0' has timed out
[out#0/webm @ 0x7faa859487c0] video:272kB audio:0kB subtitle:0kB other streams:0kB global headers:0kB muxing overhead: 1.042856%
2024-04-02 14:53:18 frame= 315 fps= 32 q=18.0 Lsize= 275kB time=00:00:10.46 bitrate= 215.1kbits/s speed=1.05x 
2024-04-02 14:53:18 Exiting normally, received signal 2.



I'm not sure what this is supposed to mean. Why isn't the server able to stream this 3-second, 709 kB video even once? The browser connected to the server and the URL successfully, but no data was being transferred.


Just in case, I decided to manually convert all of my mp4 files to webm using ffmpeg, and verified with Windows Media Player that the webm videos work. Then, I modified MediaMTX's configuration to stream the webm videos directly:


paths:
 # example:
 # my_camera:
 # source: rtsp://my_camera
 ~^demo\d+$:
 runOnDemand: ffmpeg -re -stream_loop -1 -i /vids/$MTX_PATH.webm -c copy -f webm http://localhost:8889/$MTX_PATH/whip



However, the error persists:


2024-04-02 15:03:58 ffmpeg version 6.1.1 Copyright (c) 2000-2023 the FFmpeg developers
2024-04-02 15:03:58 built with gcc 13.2.1 (Alpine 13.2.1_git20231014) 20231014
2024-04-02 15:03:58 configuration: --prefix=/usr --disable-librtmp --disable-lzma --disable-static --disable-stripping --enable-avfilter --enable-gpl --enable-ladspa --enable-libaom --enable-libass --enable-libbluray --enable-libdav1d --enable-libdrm --enable-libfontconfig --enable-libfreetype --enable-libfribidi --enable-libharfbuzz --enable-libmp3lame --enable-libopenmpt --enable-libopus --enable-libplacebo --enable-libpulse --enable-librav1e --enable-librist --enable-libsoxr --enable-libsrt --enable-libssh --enable-libtheora --enable-libv4l2 --enable-libvidstab --enable-libvorbis --enable-libvpx --enable-libwebp --enable-libx264 --enable-libx265 --enable-libxcb --enable-libxml2 --enable-libxvid --enable-libzimg --enable-libzmq --enable-lto=auto --enable-lv2 --enable-openssl --enable-pic --enable-postproc --enable-pthreads --enable-shared --enable-vaapi --enable-vdpau --enable-version3 --enable-vulkan --optflags=-O3 --enable-libjxl --enable-libsvtav1 --enable-libvpl
2024-04-02 15:03:58 libavutil 58. 29.100 / 58. 29.100
2024-04-02 15:03:58 libavcodec 60. 31.102 / 60. 31.102
2024-04-02 15:03:58 libavformat 60. 16.100 / 60. 16.100
2024-04-02 15:03:58 libavdevice 60. 3.100 / 60. 3.100
2024-04-02 15:03:58 libavfilter 9. 12.100 / 9. 12.100
2024-04-02 15:03:58 libswscale 7. 5.100 / 7. 5.100
2024-04-02 15:03:58 libswresample 5. 0.100 / 5. 0.100
2024-04-02 15:03:58 libpostproc 57. 3.100 / 57. 3.100
2024-04-02 15:03:58 Input #0, matroska,webm, from '/vids/demo0.webm':
2024-04-02 15:03:58 Metadata:
2024-04-02 15:03:58 COMPATIBLE_BRANDS: isomiso2mp41
2024-04-02 15:03:58 MAJOR_BRAND : isom
2024-04-02 15:03:58 MINOR_VERSION : 512
2024-04-02 15:03:58 ENCODER : Lavf60.16.100
2024-04-02 15:03:58 Duration: 00:00:03.47, start: 0.000000, bitrate: 217 kb/s
2024-04-02 15:03:58 Stream #0:0: Video: vp8, yuv420p(tv, progressive), 640x360, SAR 1:1 DAR 16:9, 30 fps, 30 tbr, 1k tbn (default)
2024-04-02 15:03:58 Metadata:
2024-04-02 15:03:58 HANDLER_NAME : VideoHandler
2024-04-02 15:03:58 VENDOR_ID : [0][0][0][0]
2024-04-02 15:03:58 ENCODER : Lavc60.31.102 libvpx
2024-04-02 15:03:58 DURATION : 00:00:03.466000000
2024-04-02 15:03:58 Output #0, webm, to 'http://localhost:8889/demo0/whip':
2024-04-02 15:03:58 Metadata:
2024-04-02 15:03:58 COMPATIBLE_BRANDS: isomiso2mp41
2024-04-02 15:03:58 MAJOR_BRAND : isom
2024-04-02 15:03:58 MINOR_VERSION : 512
2024-04-02 15:03:58 encoder : Lavf60.16.100
2024-04-02 15:03:58 Stream #0:0: Video: vp8, yuv420p(tv, progressive), 640x360 [SAR 1:1 DAR 16:9], q=2-31, 30 fps, 30 tbr, 1k tbn (default)
2024-04-02 15:03:58 Metadata:
2024-04-02 15:03:58 HANDLER_NAME : VideoHandler
2024-04-02 15:03:58 VENDOR_ID : [0][0][0][0]
2024-04-02 15:03:58 ENCODER : Lavc60.31.102 libvpx
2024-04-02 15:03:58 DURATION : 00:00:03.466000000
2024-04-02 15:03:58 Stream mapping:
2024-04-02 15:03:58 Stream #0:0 -> #0:0 (copy)
2024-04-02 15:03:58 Press [q] to stop, [?] for help
2024-04-02 15:04:08 2024/04/02 07:04:08 INF [path demo0] runOnDemand command stopped: timed out
2024-04-02 15:04:08 2024/04/02 07:04:08 INF [WebRTC] [session 829664cb] closed: source of path 'demo0' has timed out
[out#0/webm @ 0x7f04b00515c0] video:281kB audio:0kB subtitle:0kB other streams:0kB global headers:0kB muxing overhead: 1.023511%
2024-04-02 15:04:08 size= 284kB time=00:00:10.49 bitrate= 221.3kbits/s speed=1.05x 
2024-04-02 15:04:08 Exiting normally, received signal 2.



The same thing happens when I try to stream my other videos (demo1.mp4, demo2.mp4, etc.). What am I doing wrong?