
Media (2)
-
Granite of the Aber Ildut
9 September 2011
Updated: September 2011
Language: French
Type: Text
-
Geodiversity
9 September 2011
Updated: August 2018
Language: French
Type: Text
Other articles (19)
-
Support for all media types
10 April 2011. Unlike many modern document-sharing software packages and platforms, MediaSPIP aims to handle as many different document formats as possible, whether they are: images (png, gif, jpg, bmp and others); audio (MP3, Ogg, Wav and others); video (Avi, MP4, Ogv, mpg, mov, wmv and others); or textual content, code and other data (OpenOffice, Microsoft Office (spreadsheet, presentation), web (HTML, CSS), LaTeX, Google Earth) (...)
-
Supporting all media types
13 April 2011. Unlike most software and media-sharing platforms, MediaSPIP aims to manage as many different media types as possible. The following are just a few examples from an ever-expanding list of supported formats: images: png, gif, jpg, bmp and more; audio: MP3, Ogg, Wav and more; video: AVI, MP4, OGV, mpg, mov, wmv and more; text, code and other data: OpenOffice, Microsoft Office (Word, PowerPoint, Excel), web (HTML, CSS), LaTeX, Google Earth and (...)
-
Accepted formats
28 January 2010. The following commands provide information about the formats and codecs supported by the local ffmpeg installation:
ffmpeg -codecs
ffmpeg -formats
Accepted input video formats
This list is not exhaustive; it highlights the main formats in use: h264: H.264 / AVC / MPEG-4 AVC / MPEG-4 part 10; m4v: raw MPEG-4 video format; flv: Flash Video (FLV) / Sorenson Spark / Sorenson H.263; Theora; wmv:
Possible output video formats
To begin with, we (...)
On other sites (5667)
-
Have 2 blocking scripts interact with each other in linux
18 November 2014, by Ortixx. I have 2 blocking shell scripts which I want to have interact with each other. The scripts in question are peerflix (a nodejs script) and ffmpeg (a simple bash script).
What happens: peerflix fires up and feeds data to the ffmpeg bash script, which terminates peerflix on completion.
So once peerflix starts, it outputs 2 lines and blocks immediately:
[08:15 PM]-[vagrant@packer-virtualbox-iso]-[/var/www/test]-[git master]
$ node /var/www/test/node/node_modules/peerflix/app.js /var/www/test/flexget/torrents/test.torrent -r -q
listening: http://10.0.2.15:38339/
process: 9601
I have to feed the listening address to the ffmpeg bash script:
#!/bin/sh
ffmpeg -ss 00:05:00 -i {THE_LISTENING_PORT} -frames:v 1 out1.jpg
ffmpeg -ss 00:10:00 -i {THE_LISTENING_PORT} -frames:v 1 out2.jpg
After the bash script is done I have to kill the peerflix script (hence me outputting the PID).
My question is: how do I achieve this?
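A minimal sketch of one possible approach, assuming peerflix keeps printing the listening URL on a line starting with "listening:" as shown above; the wrapper script, the log file name and the variable names below are hypothetical, not part of the original question:
#!/bin/sh
# Hypothetical wrapper: start peerflix in the background, capture its output,
# extract the listening URL, run the ffmpeg commands, then kill peerflix.
node /var/www/test/node/node_modules/peerflix/app.js /var/www/test/flexget/torrents/test.torrent -r -q > peerflix.log 2>&1 &
PEERFLIX_PID=$!
# Wait until the "listening:" line shows up in the log.
until grep -q '^listening:' peerflix.log; do sleep 1; done
URL=$(grep '^listening:' peerflix.log | awk '{print $2}')
ffmpeg -ss 00:05:00 -i "$URL" -frames:v 1 out1.jpg
ffmpeg -ss 00:10:00 -i "$URL" -frames:v 1 out2.jpg
# The ffmpeg part is done, so terminate peerflix.
kill "$PEERFLIX_PID"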
-
AIR to ffmpeg via argb frames transfer
4 May 2014, by mika. Hey, I'm running into a similar problem as: Converting RGB to YUV, + ffmpeg
From AIR, I figured the encoding was too long to render frames at a reasonable rate, so I exported the argb ByteArray from bitmap.getPixels(rect) directly to a file. So for a 30 sec flash animation, I'd export, let's say, 1500 frames to 1500 .argb files. This method works great. I was able to render HD video using the ffmpeg cmd:
ffmpeg -f image2 -pix_fmt argb -vcodec rawvideo -s 640x380 -i frame_%d.argb -r 24 -qscale 1.1 -s 640x380 -i ./music.mp3 -shortest render-high.mpg
So far so good! However, in between the two processes we need to store those 3 GB of data.
I then tried appending all the argb frames to one single file and having ffmpeg consume it, but didn't get anything good out of it... I also tried messing with tcp/udp but kept getting stuck...
Does anyone know of a way to streamline that process and, hopefully, pipe AIR and ffmpeg together?
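One possible way to avoid the intermediate files is to feed the raw frames to ffmpeg's standard input. The sketch below assumes 640x380 ARGB frames at 24 fps, matching the command above; the frame file names and the use of cat as the producer are assumptions, and the AIR exporter could write to the pipe directly instead:
# Stream consecutive raw ARGB frames into ffmpeg's stdin instead of keeping 1500 .argb files on disk.
cat frame_*.argb | ffmpeg -f rawvideo -pix_fmt argb -s 640x380 -r 24 -i - \
    -i ./music.mp3 -shortest render-high.mpg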
-
How to specify the exact number of output image frames with ffmpeg?
14 November 2020, by Miloslav Číž. I have N input animation frames as images in a folder and I want to create interpolated in-between frames to create a smoother animation of length N * M, i.e. for every input frame I want to create M output frames that gradually morph into the next frame, e.g. with the minterpolate filter.

In other words, I want to increase the FPS M times, but I am not working with time as I am not working with any video formats; both input and output are image sequences stored as image files.


I was trying to combine the -r and fps options, but without success as I don't know how they work together. For example:

- I have 12 input frames.
- I want to use the minterpolate filter to achieve 120 frames.
- I use the command: ffmpeg -i frames/f%04d.png -vf "fps=10, minterpolate" -r 100 interpolated_frames/f%04d.png
- The result I get is 31 output frames.

Is there a specific combination of -r and fps I should use? Or is there another way I can achieve what I need?

Thank you!
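A sketch of one combination that should yield roughly N * M output frames under these assumptions: declare the input image rate with -framerate and let minterpolate's fps parameter do the multiplication, with no -r on the output side. The values 10 and 100 below correspond to the 12-frame, 10x example and are illustrative, not a confirmed answer:
# 12 input frames read at 10 fps form a 1.2 s stream; minterpolate resamples it to 100 fps,
# producing roughly 120 interpolated images.
ffmpeg -framerate 10 -i frames/f%04d.png -vf "minterpolate=fps=100" interpolated_frames/f%04d.png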