
Media (1)
-
Revolution of Open-source and film making towards open film making
6 October 2011, by
Updated: July 2013
Language: English
Type: Text
Other articles (98)
-
Customising by adding your logo, banner or background image
5 September 2013, by
Some themes support three customisation elements: adding a logo; adding a banner; adding a background image.
-
Writing a news item
21 June 2013, by
Present the changes to your MediaSPIP, or news about your projects, on your MediaSPIP site using the news section.
In MediaSPIP's default theme, spipeo, news items are displayed at the bottom of the main page, below the editorials.
You can customise the form used to create a news item.
News item creation form: for a document of type "news item", the default fields are: publication date (customise the publication date) (...)
-
Emballe médias: what is it for?
4 February 2011, by
This plugin is designed to manage sites for publishing documents of all types online.
It creates "media" items, namely: a "media" item is an article in the SPIP sense, created automatically when a document is uploaded, whether audio, video, image or text; only a single document can be linked to a so-called "media" article;
On other sites (8286)
-
Flutter (Dart): Merge two videos and view the new output in the device's gallery (photos)
17 June 2020, by Ittai Barkai
I am aware that there already exists a solution to a very similar question, which can be found at the following link: Flutter/Dart: Find two video segments and merge them into a single valid video file? However, being relatively new to Flutter (and programming in general), I cannot seem to replicate the desired result.



My app is very simple and currently looks like this:






I click on the "Record Video" button to record two videos, which are both successfully stored in the device's gallery, using the Flutter image_picker and gallery_saver packages and the following piece of code:



void _recordVideo() async {
  // Record a video with the device camera via the image_picker plugin.
  ImagePicker.pickVideo(source: ImageSource.camera)
      .then((File recordedVideo) {
    if (recordedVideo != null && recordedVideo.path != null) {
      setState(() {
        _buttonText = 'Saving in Progress...';
      });
      // Save the recording to the device gallery via gallery_saver.
      GallerySaver.saveVideo(recordedVideo.path).then((_) {
        setState(() {
          _buttonText = 'Video Saved!\n\nClick to Record New Video';
          // Keep the first recording, then merge once the second one arrives.
          if (_storedVideoOne == null) {
            _storedVideoOne = recordedVideo;
            print('video 1 stored');
          } else {
            _storedVideoTwo = recordedVideo;
            print('video 2 stored');
            _videoMerger();
          }
        });
      });
    }
  });
}




I can view these videos when I click on the "View Video From Gallery" button at the bottom.



Next, I try to merge these two stored video files using the flutter_ffmpeg package, following the solution provided in the Stack Overflow question mentioned above. I attempt this with the following function I wrote:



void _videoMerger() async {
  // Build an output path inside the app's documents directory.
  final appDir = await syspaths.getApplicationDocumentsDirectory();
  String rawDocumentPath = appDir.path;
  final outputPath = '$rawDocumentPath/output.mp4';

  final FlutterFFmpeg _flutterFFmpeg = new FlutterFFmpeg();

  // Concatenate the video streams of the two recordings with FFmpeg's concat filter.
  String commandToExecute = '-i ${_storedVideoOne.path} -i ${_storedVideoTwo.path} -filter_complex \'[0:0][1:0]concat=n=2:v=1:a=0[out]\' -map \'[out]\' outputPath';
  _flutterFFmpeg.execute(commandToExecute).then((rc) => print("FFmpeg process exited with rc $rc"));
}




But after running the function I do not seem to get a new combined video, which should be stored at outputPath and ideally also be viewable in the gallery. I have uploaded the Flutter project to GitHub here:



https://github.com/IttaiBarkai/Flutter-Video-Merger



Any help would be greatly appreciated :)



Updated:



Below is the output displayed in my debug console when FFmpeg is executed:



D/flutter-ffmpeg( 4146): Running FFmpeg with arguments: [-i, /storage/emulated/0/Android/data/com.example.video_merger_two/files/Pictures/d2b7a612-7c6d-48fe-8d06-85ceeb10e2f584195978113840656.mp4, -i, /storage/emulated/0/Android/data/com.example.video_merger_two/files/Pictures/b6cb83a3-10ac-49c7-80f3-3447bebe93ac5245748251872788895.mp4, -filter_complex, [0:0][1:0]concat=n=2:v=1:a=0[out], -map, [out], outputPath.mp4].
I/mobile-ffmpeg( 4146): ffmpeg version git-2020-01-25-fd11dd500
I/mobile-ffmpeg( 4146): Copyright (c) 2000-2020 the FFmpeg developers
I/mobile-ffmpeg( 4146):
I/mobile-ffmpeg( 4146): built with Android (5220042 based on r346389c) clang version 8.0.7 (https://android.googlesource.com/toolchain/clang b55f2d4ebfd35bf643d27dbca1bb228957008617) (https://android.googlesource.com/toolchain/llvm 3c393fe7a7e13b0fba4ac75a01aa683d7a5b11cd) (based on LLVM 8.0.7svn)
I/mobile-ffmpeg( 4146): configuration: --cross-prefix=i686-linux-android- --sysroot=/files/android-sdk/ndk-bundle/toolchains/llvm/prebuilt/linux-x86_64/sysroot --prefix=/home/taner/Projects/mobile-ffmpeg/prebuilt/android-x86/ffmpeg --pkg-config=/usr/bin/pkg-config --enable-version3 --arch=i686 --cpu=i686 --cc=i686-linux-android24-clang --cxx=i686-linux-android24-clang++ --target-os=android --disable-neon --disable-asm --disable-inline-asm --enable-cross-compile --enable-pic --enable-jni --enable-optimizations --enable-swscale --enable-shared --disable-v4l2-m2m --disable-outdev=v4l2 --disable-outdev=fbdev --disable-indev=v4l2 --disable-indev=fbdev --enable-small --disable-openssl --disable-xmm-clobber-test --disable-debug --enable-lto --disable-neon-clobber-test --disable-programs --disable-postproc --disable-doc --disable-htmlpages --disable-manpages --disable-podpages --disable-txtpages --disable-static --disable-sndio --disable-schannel --disable-securetransport --disable-xlib --disable-cuda --disable-cuvid --disable-nvenc --di
I/mobile-ffmpeg( 4146): libavutil 56. 38.100 / 56. 38.100
I/mobile-ffmpeg( 4146): libavcodec 58. 65.102 / 58. 65.102
I/mobile-ffmpeg( 4146): libavformat 58. 35.101 / 58. 35.101
I/mobile-ffmpeg( 4146): libavdevice 58. 9.103 / 58. 9.103
I/mobile-ffmpeg( 4146): libavfilter 7. 70.101 / 7. 70.101
I/mobile-ffmpeg( 4146): libswscale 5. 6.100 / 5. 6.100
I/mobile-ffmpeg( 4146): libswresample 3. 6.100 / 3. 6.100
I/mobile-ffmpeg( 4146): Input #0, mov,mp4,m4a,3gp,3g2,mj2, from '/storage/emulated/0/Android/data/com.example.video_merger_two/files/Pictures/d2b7a612-7c6d-48fe-8d06-85ceeb10e2f584195978113840656.mp4':
I/mobile-ffmpeg( 4146):   Metadata:
I/mobile-ffmpeg( 4146):     major_brand     : mp42
I/mobile-ffmpeg( 4146):     minor_version   : 0
I/mobile-ffmpeg( 4146):     compatible_brands: isommp42
I/mobile-ffmpeg( 4146):     creation_time   : 2020-06-17T12:07:20.000000Z
I/mobile-ffmpeg( 4146):     com.android.version: 10
I/mobile-ffmpeg( 4146):   Duration: 27:34:19.40, start: 0.000000, bitrate: 0 kb/s
I/mobile-ffmpeg( 4146):     Stream #0:0(eng): Video: h264 (avc1 / 0x31637661), yuv420p(tv, GBR), 1280x720, 3536 kb/s, SAR 1:1 DAR 16:9, 28.75 fps, 29.08 tbr, 90k tbn, 180k tbc (default)
I/mobile-ffmpeg( 4146):     Metadata:
I/mobile-ffmpeg( 4146):       rotate          : 90
I/mobile-ffmpeg( 4146):       creation_time   : 2020-06-17T12:07:20.000000Z
I/mobile-ffmpeg( 4146):       handler_name    : VideoHandle
I/mobile-ffmpeg( 4146):     Side data:
I/mobile-ffmpeg( 4146):       displaymatrix: rotation of -90.00 degrees
I/mobile-ffmpeg( 4146):     Stream #0:1(eng): Audio: amr_nb (samr / 0x726D6173), 8000 Hz, mono, flt, 12 kb/s (default)
I/mobile-ffmpeg( 4146):     Metadata:
I/mobile-ffmpeg( 4146):       creation_time   : 2020-06-17T12:07:20.000000Z
I/mobile-ffmpeg( 4146):       handler_name    : SoundHandle
I/mobile-ffmpeg( 4146): Input #1, mov,mp4,m4a,3gp,3g2,mj2, from '/storage/emulated/0/Android/data/com.example.video_merger_two/files/Pictures/b6cb83a3-10ac-49c7-80f3-3447bebe93ac5245748251872788895.mp4':
I/mobile-ffmpeg( 4146):   Metadata:
I/mobile-ffmpeg( 4146):     major_brand     : mp42
I/mobile-ffmpeg( 4146):     minor_version   : 0
I/mobile-ffmpeg( 4146):     compatible_brands: isommp42
I/mobile-ffmpeg( 4146):     creation_time   : 2020-06-17T12:07:32.000000Z
I/mobile-ffmpeg( 4146):     com.android.version: 10
I/mobile-ffmpeg( 4146):   Duration: 27:34:19.35, start: 0.000000, bitrate: 0 kb/s
I/mobile-ffmpeg( 4146):     Stream #1:0(eng): Video: h264 (avc1 / 0x31637661), yuv420p(tv, GBR), 1280x720, 3561 kb/s, SAR 1:1 DAR 16:9, 28.95 fps, 29 tbr, 90k tbn, 180k tbc (default)
I/mobile-ffmpeg( 4146):     Metadata:
I/mobile-ffmpeg( 4146):       rotate          : 90
I/mobile-ffmpeg( 4146):       creation_time   : 2020-06-17T12:07:32.000000Z
I/mobile-ffmpeg( 4146):       handler_name    : VideoHandle
I/mobile-ffmpeg( 4146):     Side data:
I/mobile-ffmpeg( 4146):       displaymatrix: rotation of -90.00 degrees
I/mobile-ffmpeg( 4146):     Stream #1:1(eng): Audio: amr_nb (samr / 0x726D6173), 8000 Hz, mono, flt, 12 kb/s (default)
I/mobile-ffmpeg( 4146):     Metadata:
I/mobile-ffmpeg( 4146):       creation_time   : 2020-06-17T12:07:32.000000Z
I/mobile-ffmpeg( 4146):       handler_name    : SoundHandle
E/mobile-ffmpeg( 4146): outputPath.mp4: Read-only file system
D/flutter-ffmpeg( 4146): FFmpeg exited with rc: 1
I/flutter ( 4146): FFmpeg process exited with rc 1
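
One detail stands out in this log: the output file reaches FFmpeg as the literal string outputPath.mp4 rather than as the path built in _videoMerger, so FFmpeg tries to write into its (read-only) working directory, which appears to be what produces the "Read-only file system" error above. Below is a minimal sketch, not a verified fix, of the merge function with the Dart variable actually interpolated into the command string and the result copied to the gallery with GallerySaver, mirroring how the individual recordings are saved earlier; the rc == 0 check and the unquoted filter argument are assumptions on my part.

void _videoMerger() async {
  // Build a writable output path inside the app's documents directory.
  final appDir = await syspaths.getApplicationDocumentsDirectory();
  final outputPath = '${appDir.path}/output.mp4';

  final FlutterFFmpeg _flutterFFmpeg = new FlutterFFmpeg();

  // Interpolate the Dart variable ('$outputPath') instead of passing the
  // literal word outputPath, so FFmpeg writes into the documents directory.
  String commandToExecute =
      '-i ${_storedVideoOne.path} -i ${_storedVideoTwo.path} '
      '-filter_complex [0:0][1:0]concat=n=2:v=1:a=0[out] -map [out] $outputPath';

  _flutterFFmpeg.execute(commandToExecute).then((rc) async {
    print("FFmpeg process exited with rc $rc");
    if (rc == 0) {
      // Copy the merged video into the device gallery, as is already done
      // for the two individual recordings.
      await GallerySaver.saveVideo(outputPath);
    }
  });
}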



-
How to export audio from an iPhone video file with FFmpeg?
11 January 2020, by DanScripter
UPDATE 1: the problem resides in the input read stream (see below).
I am using fluent-ffmpeg (version 2.1.2) to get an .mp3 file out of a video file that I input as a stream with .createReadStream(). I pipe the output to a remoteWriteStream.
ffmpeg -i pipe:0 -vn -f mp3 -acodec mp3 -movflags frag_keyframe+empty_moov pipe:1
This works fine with .mp4, .webm and .mov (codec: ProRes) files.
But it somehow does not work with a .mov from an iPhone 11. FFmpeg does not give me any error when running the above code. It creates the .mp3 file, but the size is just 152 B and it is not playable.
I ran ffprobe on the iPhone .mov file, and it returns this:
"{ streams:
[ { index: 0,
codec_name: 'hevc',
codec_long_name: 'H.265 / HEVC (High Efficiency Video Coding)',
profile: 'Main',
codec_type: 'video',
codec_time_base: '1111/33300',
codec_tag_string: 'hvc1',
codec_tag: '0x31637668',
width: 1920,
height: 1080,
coded_width: 1920,
coded_height: 1088,
has_b_frames: 2,
sample_aspect_ratio: '0:1',
display_aspect_ratio: '0:1',
pix_fmt: 'yuv420p',
level: 120,
color_range: 'tv',
color_space: 'bt709',
color_transfer: 'bt709',
color_primaries: 'bt709',
chroma_location: 'unspecified',
field_order: 'unknown',
timecode: 'N/A',
refs: 1,
id: 'N/A',
r_frame_rate: '30000/1001',
avg_frame_rate: '33300/1111',
time_base: '1/600',
start_pts: 'N/A',
start_time: 'N/A',
duration_ts: 6666,
duration: 11.11,
bit_rate: 7611708,
max_bit_rate: 'N/A',
bits_per_raw_sample: 'N/A',
nb_frames: 333,
nb_read_frames: 'N/A',
nb_read_packets: 'N/A',
tags: [Object],
disposition: [Object] },
{ index: 1,
codec_name: 'aac',
codec_long_name: 'AAC (Advanced Audio Coding)',
profile: 'unknown',
codec_type: 'audio',
codec_time_base: '1/44100',
codec_tag_string: 'mp4a',
codec_tag: '0x6134706d',
sample_fmt: 'fltp',
sample_rate: 44100,
channels: 2,
channel_layout: 'stereo',
bits_per_sample: 0,
id: 'N/A',
r_frame_rate: '0/0',
avg_frame_rate: '0/0',
time_base: '1/44100',
start_pts: 'N/A',
start_time: 'N/A',
duration_ts: 489951,
duration: 11.11,
bit_rate: 135091,
max_bit_rate: 192000,
bits_per_raw_sample: 'N/A',
nb_frames: 481,
nb_read_frames: 'N/A',
nb_read_packets: 'N/A',
tags: [Object],
disposition: [Object] },
{ index: 2,
codec_name: 'unknown',
codec_long_name: 'unknown',
profile: 'unknown',
codec_type: 'data',
codec_tag_string: 'mebx',
codec_tag: '0x7862656d',
id: 'N/A',
r_frame_rate: '0/0',
avg_frame_rate: '0/0',
time_base: '1/600',
start_pts: 'N/A',
start_time: 'N/A',
duration_ts: 6666,
duration: 11.11,
bit_rate: 2670,
max_bit_rate: 'N/A',
bits_per_raw_sample: 'N/A',
nb_frames: 38,
nb_read_frames: 'N/A',
nb_read_packets: 'N/A',
tags: [Object],
disposition: [Object] },
{ index: 3,
codec_name: 'unknown',
codec_long_name: 'unknown',
profile: 'unknown',
codec_type: 'data',
codec_tag_string: 'mebx',
codec_tag: '0x7862656d',
id: 'N/A',
r_frame_rate: '0/0',
avg_frame_rate: '0/0',
time_base: '1/600',
start_pts: 'N/A',
start_time: 'N/A',
duration_ts: 6666,
duration: 11.11,
bit_rate: 7,
max_bit_rate: 'N/A',
bits_per_raw_sample: 'N/A',
nb_frames: 1,
nb_read_frames: 'N/A',
nb_read_packets: 'N/A',
tags: [Object],
disposition: [Object] },
{ index: 4,
codec_name: 'unknown',
codec_long_name: 'unknown',
profile: 'unknown',
codec_type: 'data',
codec_tag_string: 'mebx',
codec_tag: '0x7862656d',
id: 'N/A',
r_frame_rate: '0/0',
avg_frame_rate: '0/0',
time_base: '1/600',
start_pts: 'N/A',
start_time: 'N/A',
duration_ts: 6666,
duration: 11.11,
bit_rate: 18117,
max_bit_rate: 'N/A',
bits_per_raw_sample: 'N/A',
nb_frames: 333,
nb_read_frames: 'N/A',
nb_read_packets: 'N/A',
tags: [Object],
disposition: [Object] } ],
format:
{ filename: 'pipe:0',
nb_streams: 5,
nb_programs: 0,
format_name: 'mov,mp4,m4a,3gp,3g2,mj2',
format_long_name: 'QuickTime / MOV',
start_time: 'N/A',
duration: 11.11,
size: 'N/A',
bit_rate: 'N/A',
probe_score: 100,
tags:
{ major_brand: 'qt ',
minor_version: '0',
compatible_brands: 'qt ',
creation_time: '2020-01-11T12:33:36.000000Z',
'com.apple.quicktime.make': 'Apple',
'com.apple.quicktime.model': 'iPhone 11',
'com.apple.quicktime.software': '13.3',
'com.apple.quicktime.creationdate': '2020-01-11T13:33:36+0100' } },
chapters: [] }
The created .mp3 file results in undefined.
I already tried taking the .mov, re-encoding it to an .mp4 via FFmpeg, and then running the above code to get the .mp3; it still does not work.
Any advice on how I can make this work?
Thanks!
UPDATE 1
The problem resides in the input stream! When I download the file to the local machine and input it as a local file, not as a read stream, it works perfectly. I am creating the stream from a Google Cloud bucket like this:
const myBucket = storage.bucket('myBucket');
const remoteReadStream = myBucket.file(file).createReadStream();
Since this code works perfectly with all other codecs, what might be the issue when creating the read stream from Google Cloud for an H.265 file?
-
CJEU rules US cloud servers don’t comply with GDPR and what this means for web analytics
17 July 2020, by Jake Thornton