
Other articles (79)
-
Update from version 0.1 to 0.2
24 June 2013
Explanation of the notable changes when moving from version 0.1 of MediaSPIP to version 0.2. What's new?
Software dependencies: use of the latest versions of FFMpeg (>= v1.2.1); installation of the dependencies for Smush; installation of MediaInfo and FFprobe for metadata retrieval; ffmpeg2theora is no longer used; flvtool2 is no longer installed, in favour of flvtool++; ffmpeg-php, which is no longer maintained, is no longer (...) -
Customise by adding your logo, banner or background image
5 September 2013
Some themes support three customisation elements: adding a logo; adding a banner; adding a background image.
-
Writing a news item
21 June 2013
Present changes to your MediaSPIP, or news from your projects, via the news section.
In MediaSPIP's default theme, spipeo, news items are displayed at the bottom of the main page, below the editorials.
You can customise the news-item creation form.
News-item creation form
For a document of type "news item", the default fields are: publication date (customise the publication date) (...)
On other sites (6372)
-
ffmpeg extracting frames from video
22 November 2017, by TheOtherguyz4kj
I am using Writingminds FFmpeg to use FFmpeg with Android. I am currently trying to extract frames from a video.
I would like to extract 8 frames from the video, evenly distributed throughout the video. I found this tutorial; here is my implementation of it.
MediaMetadataRetriever retriever = new MediaMetadataRetriever();
retriever.setDataSource(videoCroppedFile.getAbsolutePath());
String time = retriever.extractMetadata(MediaMetadataRetriever.METADATA_KEY_DURATION);
Long videoLength = Long.parseLong(time);
final String cmd[] = {
"-i",
videoCroppedFile.getAbsolutePath(),
"-f",
"image2",
"-ss",
String.valueOf(videoLength.intValue() / ((8 + 1) * 10)),
"-r",
String.valueOf((8 + 1) / videoLength.intValue()),
mediaStorageDir.getAbsolutePath() + "/%d.jpg"
};
However, when I go to the folder where the frames should be saved, there is nothing there. There are also no error messages.
I suspect it is related to the way this library takes the String parameters. I have been stuck on this for some time and have tried lots of different versions of the cmd array. I was hoping someone could help. Here is my output from ffmpeg :
1-22 19:33:14.904 30981-30981/com.firebase.android D/videoFrames: failure reason ffmpeg version n3.0.1 Copyright (c) 2000-2016 the FFmpeg developers
built with gcc 4.8 (GCC)
configuration: --target-os=linux --cross-prefix=/home/vagrant/SourceCode/ffmpeg-android/toolchain-android/bin/arm-linux-androideabi- --arch=arm --cpu=cortex-a8 --enable-runtime-cpudetect --sysroot=/home/vagrant/SourceCode/ffmpeg-android/toolchain-android/sysroot --enable-pic --enable-libx264 --enable-libass --enable-libfreetype --enable-libfribidi --enable-libmp3lame --enable-fontconfig --enable-pthreads --disable-debug --disable-ffserver --enable-version3 --enable-hardcoded-tables --disable-ffplay --disable-ffprobe --enable-gpl --enable-yasm --disable-doc --disable-shared --enable-static --pkg-config=/home/vagrant/SourceCode/ffmpeg-android/ffmpeg-pkg-config --prefix=/home/vagrant/SourceCode/ffmpeg-android/build/armeabi-v7a --extra-cflags='-I/home/vagrant/SourceCode/ffmpeg-android/toolchain-android/include -U_FORTIFY_SOURCE -D_FORTIFY_SOURCE=2 -fno-strict-overflow -fstack-protector-all' --extra-ldflags='-L/home/vagrant/SourceCode/ffmpeg-android/toolchain-android/lib -Wl,-z,relro -Wl,-z,now -pie' --extra-libs='-lpng -lexpat -lm' --extra-cxxflags=
libavutil 55. 17.103 / 55. 17.103
libavcodec 57. 24.102 / 57. 24.102
libavformat 57. 25.100 / 57. 25.100
libavdevice 57. 0.101 / 57. 0.101
libavfilter 6. 31.100 / 6. 31.100
libswscale 4. 0.100 / 4. 0.100
libswresample 2. 0.101 / 2. 0.101
libpostproc 54. 0.100 / 54. 0.100
Input #0, mov,mp4,m4a,3gp,3g2,mj2, from '/data/user/0/com.firebase.android/cache/videovgntp6q5ar4dglkaflaoobfpcv945824159.mp4':
Metadata:
major_brand : isom
minor_version : 512
compatible_brands: isomiso2avc1mp41
encoder : Lavf57.25.100
Duration: 00:00:08.28, start: 0.000000, bitrate: 391 kb/s
Stream #0:0(eng): Video: h264 (Constrained Baseline) (avc1 / 0x31637661), yuv420p, 720x764 [SAR 1:1 DAR 180:191], 132 kb/s, 16.67 fps, 16.67 tbr, 12800 tbn, 33.33 tbc (default)
Metadata:
handler_name : VideoHandler
Stream #0:1(eng): Audio: aac (LC) (mp4a / 0x6134706D), 48000 Hz, stereo, fltp, 256 kb/s (default)
Metadata:
handler_name : SoundHandler
Invalid framerate value: 0

Update 2
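The failing -r value can be reproduced in isolation. This is a minimal sketch, assuming videoLength holds the duration in milliseconds (which is what MediaMetadataRetriever's METADATA_KEY_DURATION returns), e.g. roughly 8280 ms for the 8.28 s clip in the log:

```java
public class FramerateBug {
    public static void main(String[] args) {
        // METADATA_KEY_DURATION is reported in milliseconds.
        Long videoLength = 8280L;

        // Original cmd: integer division truncates 9 / 8280 to 0,
        // so ffmpeg receives "-r 0" and reports "Invalid framerate value: 0".
        String badRate = String.valueOf((8 + 1) / videoLength.intValue());
        System.out.println("-r " + badRate); // prints "-r 0"

        // Floating-point division, as in the Update 3 code, keeps a small
        // but non-zero frame rate, so ffmpeg accepts the option.
        String fixedRate = String.valueOf(8.0 / videoLength.floatValue());
        System.out.println("-r " + fixedRate);
    }
}
```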
Here is my Android code :
try {
ffmpeg.loadBinary(new LoadBinaryResponseHandler() {
@Override
public void onStart() {
Log.d("videoFrames", "starting to load binary");
}
@Override
public void onFailure() {
Log.d("videoFrames", "failed to load binary");
}
@Override
public void onSuccess() {
Log.d("videoFrames", "loaded binary");
try {
ffmpeg.execute(cmd, new ExecuteBinaryResponseHandler() {
@Override
public void onStart() {
Log.d("videoFrames", " starting to get frames from video");
if (mediaStorageDir.isDirectory() && mediaStorageDir.list().length > 0) {
String[] children = mediaStorageDir.list();
for (int i = 0; i < children.length; i++) {
new File(mediaStorageDir, children[i]).delete();
}
}
}
@Override
public void onProgress(String message) {
Log.d("videoFrames", " progress getting frames from video");
}
@Override
public void onFailure(String message) {
Log.d("videoFrames", " failed to get frames from video");
Log.d("videoFrames", " failure reason " + message);
Log.d("videoFrames", " ----------------------------------------------- ");
}
@Override
public void onSuccess(String message) {
Log.d("videoFrames", " success getting frames from video");
}
@Override
public void onFinish() {
Log.d("videoFrames", " finished getting frames from video");
}
});
} catch (FFmpegCommandAlreadyRunningException e) {
Log.d("videoFrames", " command already running in fmpeg");
e.printStackTrace();
}
}
@Override
public void onFinish() {}
});
} catch (FFmpegNotSupportedException e) {
// Handle if FFmpeg is not supported by device
}

Update 3
I fixed my code by doing :
final String cmd[] = {
"-i",
videoCroppedFile.getAbsolutePath(),
"-f",
"image2",
"-ss",
String.valueOf(videoLength.floatValue() / (8.0 * 10.0)),
"-r",
String.valueOf(8.0 / videoLength.floatValue()),
mediaStorageDir.getAbsolutePath() + "/%d.jpg"
};
-
FFmpeg get frames equally spaced out from video
2 December 2017, by TheOtherguyz4kj
I am having some real problems with FFmpeg in my Android application. I am currently trying to take a video recorded on the device and extract frames from it at equal intervals throughout the video. I used this blog post. The last formula :
ffmpeg -i input.avi -f image2 -ss 2 -r 0.05 frame-%05d.png
with my own -ss and -r values.
Now everything is fine. As mentioned in the blog, the first 2 frames are the same for some reason, so I delete one of them; then I feel the frames accurately represent the video. The blog also says to delete some of the last frames. I don't do this, because then I don't get frames going all the way to the end of the video.
In the output the first two images should be discarded and the next 6 images (from 3rd til 8th) will be the ones we were looking for. Most probably there’ll be a 9th image as well which can be discarded too. You can also add the -vframes 8 option (where 8 is number_of_frames + 2) to skip the creation of the last image that you won’t need anyway.
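The blog's recipe can be turned into an argument list programmatically. This is only a sketch: buildCmd is a hypothetical helper, and the blog's exact constants (-ss 2, -r 0.05) were tuned by hand, so the interval formula below (duration divided by frames + 1) only approximates the same spacing; the -vframes value of frames + 2 follows the quote above.

```java
import java.util.Arrays;

public class EvenFrameCmd {
    // Build an ffmpeg argument list that samples `frames` images roughly
    // evenly from a clip of `durationSec` seconds: seek past the first
    // interval, then emit one frame per interval.
    static String[] buildCmd(String input, String outPattern,
                             int frames, double durationSec) {
        double interval = durationSec / (frames + 1); // spacing between samples
        return new String[] {
            "-i", input,
            "-f", "image2",
            "-ss", String.valueOf(interval),        // skip the very start
            "-r", String.valueOf(1.0 / interval),   // one frame per interval
            "-vframes", String.valueOf(frames + 2), // extras are discarded, per the blog
            outPattern
        };
    }

    public static void main(String[] args) {
        // 6 frames from a 120-second clip.
        System.out.println(Arrays.toString(
                buildCmd("input.avi", "frame-%05d.png", 6, 120.0)));
    }
}
```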
How accurately are these frames captured ? I would like to add timestamps to each frame. (FFProbe is not an option, because I am using an Android application and don't want to have to build the binaries just for this.)
If, for example, I had a video 14 seconds long and ffmpeg extracted 11 frames (I remove one of the first duplicate ones, so now 10 frames), and I then did
videoLength / number of frames * which frame I want to find
how accurate would this be ? So :
14/10 * 1 = 1.4, so frame one goes from 0s - 1.4s
14/10 * 2 = 2.8, so frame two goes from 1.4s - 2.8s
14/10 * 3 = 4.2, so frame three goes from 2.8s - 4.2s
...
14/10 * 10 = 14, so frame ten goes from 12.6s - 14s
Would this accurately represent each frame ?
If not, could someone give me a better solution ? Thanks.
The idea is that I have created a video cropper with a RecyclerView, so I will be able to "go over" each frame with a drag tool, calculate how much of the frame has been covered, then find what time range that frame represents and set the start and end crop times accordingly.
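The timestamp arithmetic described above can be sketched directly. Note the assumption: this treats the extracted frames as perfectly evenly spaced, which the -ss/-r approach only approximates, so the resulting timestamps are estimates rather than exact frame times.

```java
public class FrameTimestamps {
    public static void main(String[] args) {
        double videoLength = 14.0; // clip duration in seconds
        int frames = 10;           // frames kept after deleting the duplicate
        double slot = videoLength / frames; // seconds covered by each frame

        for (int i = 1; i <= frames; i++) {
            double start = slot * (i - 1);
            double end = slot * i;
            // e.g. "frame 3 covers 2.8s - 4.2s"
            System.out.printf("frame %d covers %.1fs - %.1fs%n", i, start, end);
        }
    }
}
```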
-
vdpau : do not use buggy HEVC support by default
1 July 2017, by wm4
vdpau : do not use buggy HEVC support by default
NVIDIA broke its own API when using VDPAU decoding. If you retrieve the
decoded YUV data, or if you map the surfaces with GL interop, the result
are interlacing artifacts. The only way to get non-broken data is by
using the vdpau video mixer to convert it to RGB. There is no way to
block the non-working operations in a reasonable way (a VdpVideoSurface
has to support all operations).

NVIDIA refuses to fix this issue (they "fixed" it by making it work with
the video mixer, but the rest is still broken). There is no sign of that
changing.

Do not use HEVC by default with the generic hwaccel API. Detect whether
it's the NVIDIA native implementation, and exit with an error. (The same
thing works with the MESA implementation.)

As an escape hatch and to allow applications to use the decoder if they
really want to (perhaps because they make sure to explicitly use the
video mixer), reuse AV_HWACCEL_FLAG_ALLOW_PROFILE_MISMATCH to disable
this check.

Once NVIDIA fixes the bug, working driver versions could be detected,
and it could be allowed again.