
Other articles (60)
-
List of compatible distributions
26 April 2011. The table below lists the Linux distributions compatible with the automated installation script of MediaSPIP.
Distribution name | Version name | Version number
Debian | Squeeze | 6.x.x
Debian | Wheezy | 7.x.x
Debian | Jessie | 8.x.x
Ubuntu | The Precise Pangolin | 12.04 LTS
Ubuntu | The Trusty Tahr | 14.04
If you want to help us improve this list, you can give us access to a machine running a distribution not mentioned above, or send us the fixes needed to add it (...) -
Selection of projects using MediaSPIP
2 May 2011. The examples below are representative of how MediaSPIP is used in specific projects.
MediaSPIP farm @ Infini
The non-profit organization Infini develops hospitality activities, an internet access point, training, innovative projects in the field of information and communication technologies, and website hosting. It plays a unique and prominent role in the Brest (France) area and, at the national level, among the half-dozen associations of its kind. Its members (...) -
Encoding and processing into web-friendly formats
13 April 2011. MediaSPIP automatically converts uploaded files to internet-compatible formats.
Video files are encoded in OGV and WebM (supported by HTML5) and MP4 (supported by Flash).
Audio files are encoded in Ogg (supported by HTML5) and MP3 (supported by Flash).
Where possible, text is analyzed in order to retrieve the data needed for search engine indexing, and then exported as a series of image files.
All uploaded files are stored online in their original format, so you can (...)
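As a rough illustration of this kind of conversion (a sketch only; these are not MediaSPIP's actual commands, and the codecs, quality settings and file names are placeholders), the same set of web formats can be produced with ffmpeg driven from Python:

import subprocess

def encode_for_web(src):
    # HTML5 video targets: Theora/Vorbis in OGV and VP8/Vorbis in WebM
    subprocess.run(["ffmpeg", "-i", src, "-c:v", "libtheora", "-c:a", "libvorbis", "out.ogv"], check=True)
    subprocess.run(["ffmpeg", "-i", src, "-c:v", "libvpx", "-c:a", "libvorbis", "out.webm"], check=True)
    # Flash fallback: H.264/AAC in MP4
    subprocess.run(["ffmpeg", "-i", src, "-c:v", "libx264", "-c:a", "aac", "out.mp4"], check=True)
    # Audio-only targets: Ogg Vorbis for HTML5, MP3 for Flash
    subprocess.run(["ffmpeg", "-i", src, "-vn", "-c:a", "libvorbis", "out.ogg"], check=True)
    subprocess.run(["ffmpeg", "-i", src, "-vn", "-c:a", "libmp3lame", "out.mp3"], check=True)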
On other sites (6323)
-
Python extract wav from video file
31 October 2015, by xolodec. Related:
How to extract audio from a video file using python ?
Extract audio from video as wav
How to rip the audio from a video ?
My question is: how can I extract the WAV audio track from a video file, say video.avi?
I have read many articles, and everywhere people suggest using ffmpeg from Python as a subprocess (because there are no reliable Python bindings for ffmpeg; the only hope was PyFFmpeg, but I found it is unmaintained now). I don't know whether this is the right solution, and I am looking for a good one.
I looked at GStreamer and found it nice, but it was unable to satisfy my needs. The only way I found to accomplish this from the command line looks like:
gst-launch-0.10 playbin2 uri=file://`pwd`/ex.mp4 audio-sink='identity single-segment=true ! audioconvert ! audio/x-raw-int, endianness=(int)1234, signed=(boolean)true, width=(int)16, depth=(int)16, rate=(int)16000, channels=(int)1 ! wavenc ! filesink location=foo.wav'
But it is not efficient, because I have to wait ages while the video plays and the WAV file is written at the same time.
ffmpeg is much better:
avconv -i foo.mp4 -ab 160k -ac 1 -ar 16000 -vn ffaudio.wav
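(For reference, this is a minimal sketch of running such a command from Python with the standard subprocess module; it is my own addition rather than part of the original post, and it assumes ffmpeg is installed and on the PATH.)

import subprocess

# Extract a mono, 16 kHz WAV track; the file names here are placeholders.
subprocess.run(
    ["ffmpeg", "-i", "video.avi", "-vn", "-ac", "1", "-ar", "16000", "ffaudio.wav"],
    check=True,  # raise CalledProcessError if ffmpeg exits with an error
)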
But I am unable to launch it from Python other than as a command-line subprocess. Could you please point out the pros and cons of launching ffmpeg from Python as a command-line utility? (I mean using the Python multiprocessing module or something similar.)
And the second question:
What is a simple way to cut a long WAV file into pieces so that I don't break any words? I mean pieces of 10-20 seconds in length, with the start and end falling during pauses between sentences/words.
I know how to break them into arbitrary pieces:
import wave

win = wave.open('ffaudio.wav', 'rb')
wout = wave.open('ffsegment.wav', 'wb')
t0, t1 = 2418, 2421  # cut audio between 2418 and 2421 seconds
s0, s1 = int(t0 * win.getframerate()), int(t1 * win.getframerate())
win.readframes(s0)  # read and discard everything before the cut
frames = win.readframes(s1 - s0)
wout.setparams(win.getparams())  # copy channels, sample width, frame rate
wout.writeframes(frames)
win.close()
wout.close()
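For the second question, below is a rough sketch of one possible approach (my own illustration, not from the post): walk the file in short windows, measure loudness with audioop.rms, and only cut once a piece is long enough and the current window is quiet. The window length and silence threshold are guesses that need tuning, and it assumes a mono 16-bit PCM WAV.

import wave, audioop

def split_at_pauses(path, prefix, min_piece_s=15, window_s=0.05, threshold=500):
    win = wave.open(path, 'rb')
    window = int(win.getframerate() * window_s)   # frames per analysis window
    width = win.getsampwidth()
    piece, elapsed, n = [], 0.0, 0
    while True:
        frames = win.readframes(window)
        if not frames:
            break
        piece.append(frames)
        elapsed += window_s
        # cut only when the piece is long enough AND the current window is quiet
        if elapsed >= min_piece_s and audioop.rms(frames, width) < threshold:
            wout = wave.open('%s%03d.wav' % (prefix, n), 'wb')
            wout.setparams(win.getparams())
            wout.writeframes(b''.join(piece))
            wout.close()
            piece, elapsed, n = [], 0.0, n + 1
    # the final tail left in `piece` is dropped here for brevity
    win.close()
-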
Can video & OS timestamps be recorded with ffmpeg?
2 September 2020, by sooyongchoi. I'm a newbie to video and image processing.
I want to record high-resolution, high-frame-rate video with FFmpeg (1920 x 1080 at 60 FPS), and I need the OS timestamp (Windows or Ubuntu) for each frame.
I would like the timestamp data to be formatted as text.
I searched for ways to do this with FFmpeg, but I couldn't find how to record video together with timestamp data.
Is there any way to record video & timestamp data with ffmpeg?
Or could you give me a simple tip for building a solution with ffmpeg?
Thanks! :)
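One possible direction, sketched below as an illustration rather than a verified recipe: have ffmpeg stamp incoming frames with the OS wall clock and write one timestamp per frame to a text file while encoding the video in a second output. The capture device, options and output names are assumptions for a Linux/V4L2 setup and would need checking against your ffmpeg build.

import subprocess

cmd = [
    "ffmpeg",
    "-f", "v4l2", "-framerate", "60", "-video_size", "1920x1080",
    "-use_wallclock_as_timestamps", "1",   # input PTS taken from the OS wall clock
    "-i", "/dev/video0",                   # assumed capture device
    "-c:v", "libx264", "-preset", "ultrafast", "out.mkv",             # the recording itself
    "-an", "-c", "copy", "-f", "mkvtimestamp_v2", "timestamps.txt",   # per-frame timestamps as text
]
subprocess.run(cmd, check=True)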


-
Video players questions
31 May 2015, by yayuj. Given that FFmpeg is the leading multimedia framework and most video/audio players use it, I'm wondering some things about audio/video players that use FFmpeg as an intermediary.
I'm studying and I want to know how audio/video players work, and I have some questions.
I was reading the ffplay source code and I saw that ffplay handles the subtitle stream. I tried to use an MKV file with a subtitle in it and it doesn't work; I tried arguments such as -sst but nothing happened. I was also reading about subtitles and how video files (or should I say containers?) use them. I saw that there are two ways of including a subtitle: hardsubs and softsubs. Roughly speaking, hardsubs are burned in and become part of the video, while softsubs are carried as a separate subtitle stream (I might be wrong; please correct me).
-
The question is: how do they handle this? I mean, when the subtitle is part of the video there's nothing to do, since the video stream itself shows the subtitle, but what about softsubs? How are they handled? (I heard something about text subs as well.) How does the subtitle appear on the screen, and how can it be configured (changing fonts, size, colors) without encoding everything again?
-
I was studying the source code of some video players; some or most of them use OpenGL to render the frame, and others use some kind of canvas (such as Qt's QWidget). Which approach is most used, and which one is faster and better: OpenGL with shaders and so on? Handling YUV or RGB? How does that work?
-
It might be a dumb question, but what is the format that an AVFrame holds? For example, when we want to save frames as images, first we need the frame and then we convert it; which format are we converting from? Does it change according to the video codec, or is it always the same?
-
Most of the videos I've been trying to handle use YUV420P; I tried to save the frames as PNG and I needed to convert to RGB first. I did a test with several players: I paused on the same frame, took screenshots, and compared them. The video players show the frame more colorfully. I tried the same with ffplay, which uses SDL (OpenGL), and the colors (quality) of the frame seem to be really low. What might it be? What do they do? Is it shaders (or a kind of magic? haha).
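On the YUV/RGB point, here is a toy sketch (my own illustration, not from the post or any player's code) of converting one planar YUV420P frame to RGB with NumPy, assuming full-range BT.601 coefficients. Real players normally do this on the GPU or via libswscale's sws_scale(), and mixing up full range and limited ("TV") range is one common reason screenshots look washed out compared with the player.

import numpy as np

def yuv420p_to_rgb(raw, width, height):
    # raw: bytes of one planar YUV420P frame (Y plane, then U, then V)
    y_size = width * height
    y = np.frombuffer(raw, np.uint8, y_size).reshape(height, width).astype(np.float32)
    u = np.frombuffer(raw, np.uint8, y_size // 4, y_size).reshape(height // 2, width // 2)
    v = np.frombuffer(raw, np.uint8, y_size // 4, y_size + y_size // 4).reshape(height // 2, width // 2)
    # nearest-neighbour chroma upsampling to full resolution
    u = u.repeat(2, axis=0).repeat(2, axis=1).astype(np.float32) - 128.0
    v = v.repeat(2, axis=0).repeat(2, axis=1).astype(np.float32) - 128.0
    # full-range BT.601 conversion
    r = y + 1.402 * v
    g = y - 0.344136 * u - 0.714136 * v
    b = y + 1.772 * u
    return np.clip(np.dstack([r, g, b]), 0, 255).astype(np.uint8)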
Well, I think that is it for now. I hope you can help me with this.
If this isn't the correct place, please let me know where it is; I haven't found another place in the Stack Exchange communities.
-