
Advanced search
Media (3)
-
Valkaama DVD Cover Outside
4 October 2011, by
Updated: October 2011
Language: English
Type: Image
-
Valkaama DVD Label
4 October 2011, by
Updated: February 2013
Language: English
Type: Image
-
Valkaama DVD Cover Inside
4 October 2011, by
Updated: October 2011
Language: English
Type: Image
Other articles (13)
-
Adding notes and captions to images
7 February 2011, by
To add notes and captions to images, the first step is to install the "Légendes" plugin.
Once the plugin is activated, you can configure it in the configuration area to change the rights to create, modify and delete notes. By default, only site administrators can add notes to images.
Changes when adding a media item
When adding a media item of type "image", a new button appears above the preview (...) -
Accepted formats
28 January 2010, by
The following commands provide information about the formats and codecs supported by the local ffmpeg installation:
ffmpeg -codecs
ffmpeg -formats
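For instance, to check quickly whether a particular codec or container is available, the output of those commands can be filtered (a minimal sketch; the exact names listed depend on how the local ffmpeg was built):
ffmpeg -codecs | grep h264
ffmpeg -formats | grep flv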
Accepted input video formats
This list is not exhaustive; it highlights the main formats used: h264: H.264 / AVC / MPEG-4 AVC / MPEG-4 part 10; m4v: raw MPEG-4 video format; flv: Flash Video (FLV) / Sorenson Spark / Sorenson H.263; Theora; wmv:
Possible output video formats
To begin with, we (...) -
Videos
21 April 2011, by
Like "audio" documents, Mediaspip displays videos, wherever possible, using the HTML5 video tag.
One drawback of this tag is that it is not recognized correctly by some browsers (Internet Explorer, to name one) and that each browser natively supports only certain video formats.
Its main advantage is that video playback is handled natively by the browser, which removes the need for Flash and (...)
On other sites (3220)
-
Encoding images into a movie file
5 April 2014, by RuAware
I am trying to save JPGs into a movie. I have tried jcodec, and although my S3 plays it fine, other devices do not, including VLC and Windows Media.
I have just spent most of the day playing with MediaCodec; although the required SDK level is high, it will at least help people on Jelly Bean and above. But I cannot work out how to get the files to the encoder and then write the output file.
Ideally I want to support down to SDK 9/8.
Has anyone got any code they can share, either to get MediaCodec working or another option? If you say ffmpeg, I'd love to, but my JNI knowledge is non-existent and I will need a very good guide.
Code for MediaCodec so far:
public class EncodeAndMux extends AsyncTask<Integer, Void, Boolean> {
private static int bitRate = 2000000;
private static int MAX_INPUT = 100000;
private static String mimeType = "video/avc";
private int frameRate = 15;
private int colorFormat;
private int stride = 1;
private int sliceHeight = 2;
private MediaCodec encoder = null;
private MediaFormat inputFormat;
private MediaCodecInfo codecInfo = null;
private MediaMuxer muxer;
private boolean mMuxerStarted = false;
private int mTrackIndex = 0;
private long presentationTime = 0;
private Paint bmpPaint;
private static int WAITTIME = 10000;
private static String TAG = "ENCODE";
private ArrayList<String> mFilePaths;
private String mPath;
private EncodeListener mListener;
private int width = 320;
private int height = 240;
private double mSpeed = 1;
public EncodeAndMux(ArrayList<String> filePaths, String savePath) {
mFilePaths = filePaths;
mPath = savePath;
// Create paint to draw BMP
bmpPaint = new Paint();
bmpPaint.setAntiAlias(true);
bmpPaint.setFilterBitmap(true);
bmpPaint.setDither(true);
}
public void setListner(EncodeListener listener) {
mListener = listener;
}
// set the speed, how many frames a second
public void setSpead(int speed) {
mSpeed = speed;
}
public double getSpeed() {
return mSpeed;
}
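// MediaCodec presentation timestamps are expressed in microseconds;
// frames are spaced 1 second / mSpeed (frames per second) apart.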
private long computePresentationTime(int frameIndex) {
final long ONE_SECOND = 1000000;
return (long) (frameIndex * (ONE_SECOND / mSpeed));
}
public interface EncodeListener {
public void finished();
public void errored();
}
@TargetApi(Build.VERSION_CODES.JELLY_BEAN_MR2)
@Override
protected Boolean doInBackground(Integer... params) {
try {
muxer = new MediaMuxer(mPath, OutputFormat.MUXER_OUTPUT_MPEG_4);
} catch (Exception e){
e.printStackTrace();
}
// Find a code that supports the mime type
int numCodecs = MediaCodecList.getCodecCount();
for (int i = 0; i < numCodecs && codecInfo == null; i++) {
MediaCodecInfo info = MediaCodecList.getCodecInfoAt(i);
if (!info.isEncoder()) {
continue;
}
String[] types = info.getSupportedTypes();
boolean found = false;
for (int j = 0; j < types.length && !found; j++) {
if (types[j].equals(mimeType))
found = true;
}
if (!found)
continue;
codecInfo = info;
}
for (int i = 0; i < MediaCodecList.getCodecCount(); i++) {
MediaCodecInfo info = MediaCodecList.getCodecInfoAt(i);
if (!info.isEncoder()) {
continue;
}
String[] types = info.getSupportedTypes();
for (int j = 0; j < types.length; ++j) {
if (!types[j].equals(mimeType)) // compare by value, not reference
continue;
MediaCodecInfo.CodecCapabilities caps = info.getCapabilitiesForType(types[j]);
for (int k = 0; k < caps.profileLevels.length; k++) {
if (caps.profileLevels[k].profile == MediaCodecInfo.CodecProfileLevel.AVCProfileHigh && caps.profileLevels[k].level == MediaCodecInfo.CodecProfileLevel.AVCLevel4) {
codecInfo = info;
}
}
}
}
Log.d(TAG, "Found " + codecInfo.getName() + " supporting " + mimeType);
MediaCodecInfo.CodecCapabilities capabilities = codecInfo.getCapabilitiesForType(mimeType);
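// Pick the first YUV 4:2:0 color format the chosen encoder advertises;
// the raw frames fed in below must use this layout.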
for (int i = 0; i < capabilities.colorFormats.length && colorFormat == 0; i++) {
int format = capabilities.colorFormats[i];
switch (format) {
case MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420Planar:
case MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420PackedPlanar:
case MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420SemiPlanar:
case MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420PackedSemiPlanar:
case MediaCodecInfo.CodecCapabilities.COLOR_TI_FormatYUV420PackedSemiPlanar:
colorFormat = format;
break;
}
}
Log.d(TAG, "Using color format " + colorFormat);
// Determine width, height and slice sizes
if (codecInfo.getName().equals("OMX.TI.DUCATI1.VIDEO.H264E")) {
// This codec doesn't support a width not a multiple of 16,
// so round down.
width &= ~15;
}
stride = width;
sliceHeight = height;
if (codecInfo.getName().startsWith("OMX.Nvidia.")) {
stride = (stride + 15) / 16 * 16;
sliceHeight = (sliceHeight + 15) / 16 * 16;
}
inputFormat = MediaFormat.createVideoFormat(mimeType, width, height);
inputFormat.setInteger(MediaFormat.KEY_BIT_RATE, bitRate);
inputFormat.setInteger(MediaFormat.KEY_FRAME_RATE, frameRate);
inputFormat.setInteger(MediaFormat.KEY_COLOR_FORMAT, colorFormat);
inputFormat.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 5);
// inputFormat.setInteger("stride", stride);
// inputFormat.setInteger("slice-height", sliceHeight);
inputFormat.setInteger(MediaFormat.KEY_MAX_INPUT_SIZE, MAX_INPUT);
encoder = MediaCodec.createByCodecName(codecInfo.getName());
encoder.configure(inputFormat, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
encoder.start();
ByteBuffer[] inputBuffers = encoder.getInputBuffers();
ByteBuffer[] outputBuffers = encoder.getOutputBuffers();
int inputBufferIndex= -1, outputBufferIndex= -1;
BufferInfo info = new BufferInfo();
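// For each image: decode and scale the bitmap, center it on a blank frame, convert it to YUV,
// queue it into the encoder, then drain any encoded output into the muxer.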
for (int i = 0; i < mFilePaths.size(); i++) {
// use decode sample to calculate inSample size and then resize
Bitmap bitmapIn = Images.decodeSampledBitmapFromPath(mFilePaths.get(i), width, height);
// Create blank bitmap
Bitmap bitmap = Bitmap.createBitmap(width, height, Config.ARGB_8888);
// Center scaled image
Canvas canvas = new Canvas(bitmap);
canvas.drawBitmap(bitmapIn,(bitmap.getWidth()/2)-(bitmapIn.getWidth()/2),(bitmap.getHeight()/2)-(bitmapIn.getHeight()/2), bmpPaint);
Log.d(TAG, "Bitmap width: " + bitmapIn.getWidth() + " height: " + bitmapIn.getHeight() + " WIDTH: " + width + " HEIGHT: " + height);
byte[] dat = getNV12(width, height, bitmap);
bitmap.recycle();
// Exception occurred on this below line in Emulator, LINE No. 182//**
inputBufferIndex = encoder.dequeueInputBuffer(WAITTIME);
Log.i("DAT", "Size= "+dat.length);
if(inputBufferIndex >= 0){
int samplesiz= dat.length;
inputBuffers[inputBufferIndex].put(dat);
presentationTime = computePresentationTime(i);
if (i == mFilePaths.size() - 1) { // last frame: flag end of stream
encoder.queueInputBuffer(inputBufferIndex, 0, samplesiz, presentationTime, MediaCodec.BUFFER_FLAG_END_OF_STREAM);
Log.i(TAG, "Last Frame");
} else {
encoder.queueInputBuffer(inputBufferIndex, 0, samplesiz, presentationTime, 0);
}
while(true) {
outputBufferIndex = encoder.dequeueOutputBuffer(info, WAITTIME);
Log.i("BATA", "outputBufferIndex="+outputBufferIndex);
if (outputBufferIndex >= 0) {
ByteBuffer encodedData = outputBuffers[outputBufferIndex];
if (encodedData == null) {
throw new RuntimeException("encoderOutputBuffer " + outputBufferIndex +
" was null");
}
if ((info.flags & MediaCodec.BUFFER_FLAG_CODEC_CONFIG) != 0) {
// The codec config data was pulled out and fed to the muxer when we got
// the INFO_OUTPUT_FORMAT_CHANGED status. Ignore it.
Log.d(TAG, "ignoring BUFFER_FLAG_CODEC_CONFIG");
info.size = 0;
}
if (info.size != 0) {
if (!mMuxerStarted) {
throw new RuntimeException("muxer hasn't started");
}
// adjust the ByteBuffer values to match BufferInfo (not needed?)
encodedData.position(info.offset);
encodedData.limit(info.offset + info.size);
muxer.writeSampleData(mTrackIndex, encodedData, info);
Log.d(TAG, "sent " + info.size + " bytes to muxer");
}
encoder.releaseOutputBuffer(outputBufferIndex, false);
inputBuffers[inputBufferIndex].clear();
outputBuffers[outputBufferIndex].clear();
if ((info.flags & MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0) {
break; // out of while
}
} else if (outputBufferIndex == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
// Subsequent data will conform to new format.
MediaFormat opmediaformat = encoder.getOutputFormat();
if (!mMuxerStarted) {
mTrackIndex = muxer.addTrack(opmediaformat);
muxer.start();
mMuxerStarted = true;
}
Log.i(TAG, "op_buf_format_changed: " + opmediaformat);
} else if(outputBufferIndex == MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED) {
outputBuffers = encoder.getOutputBuffers();
Log.d(TAG, "Output Buffer changed " + outputBuffers);
} else if(outputBufferIndex == MediaCodec.INFO_TRY_AGAIN_LATER) {
// No Data, break out
break;
} else {
// Unexpected State, ignore it
Log.d(TAG, "Unexpected State " + outputBufferIndex);
}
}
}
}
if (encoder != null) {
encoder.flush();
encoder.stop();
encoder.release();
encoder = null;
}
if (muxer != null) {
muxer.stop();
muxer.release();
muxer = null;
}
return true;
};
@Override
protected void onPostExecute(Boolean result) {
if (result) {
if (mListener != null)
mListener.finished();
} else {
if (mListener != null)
mListener.errored();
}
super.onPostExecute(result);
}
byte [] getNV12(int inputWidth, int inputHeight, Bitmap scaled) {
int [] argb = new int[inputWidth * inputHeight];
scaled.getPixels(argb, 0, inputWidth, 0, 0, inputWidth, inputHeight);
byte [] yuv = new byte[inputWidth*inputHeight*3/2];
encodeYUV420SP(yuv, argb, inputWidth, inputHeight);
scaled.recycle();
return yuv;
}
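// Standard ARGB -> YUV 4:2:0 semi-planar conversion: a full-resolution Y plane followed by
// one pair of interleaved chroma bytes per 2x2 block (written V then U here, i.e. NV21 ordering).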
void encodeYUV420SP(byte[] yuv420sp, int[] argb, int width, int height) {
final int frameSize = width * height;
int yIndex = 0;
int uvIndex = frameSize;
int a, R, G, B, Y, U, V;
int index = 0;
for (int j = 0; j < height; j++) {
for (int i = 0; i < width; i++) {
a = (argb[index] & 0xff000000) >> 24; // a is not used obviously
R = (argb[index] & 0xff0000) >> 16;
G = (argb[index] & 0xff00) >> 8;
B = (argb[index] & 0xff) >> 0;
// well known RGB to YVU algorithm
Y = ( ( 66 * R + 129 * G + 25 * B + 128) >> 8) + 16;
V = ( ( -38 * R - 74 * G + 112 * B + 128) >> 8) + 128;
U = ( ( 112 * R - 94 * G - 18 * B + 128) >> 8) + 128;
yuv420sp[yIndex++] = (byte) ((Y < 0) ? 0 : ((Y > 255) ? 255 : Y));
if (j % 2 == 0 && index % 2 == 0) {
yuv420sp[uvIndex++] = (byte)((V<0) ? 0 : ((V > 255) ? 255 : V));
yuv420sp[uvIndex++] = (byte)((U<0) ? 0 : ((U > 255) ? 255 : U));
}
index ++;
}
}
}
}
This has now been tested on 4 of my devices and works fine. Is there a way to:
1/ Calculate MAX_INPUT (too high and it crashes on the N7 II; I don't want that happening once released)
2/ Offer an API 16 solution?
3/ Do I need stride and slice height?
Thanks
-
Corrupt playback of .flv when using JW Player
21 May 2012, by Adam Ingmansson
This question is also asked on Audio-Video Production.
I have some files whose playback is corrupt when played in JW Player.
The files are encoded to H.264 using FFmpeg, and there are other files encoded in the same way that work. At the moment we only use the Flash version of the player.
The corrupt playback looks like this:
http://adam.ingmansson.com/public/jwplayer-corrupt-video.png
This problem started showing up after we upgraded FFmpeg, so I haven't ruled out that it could be an encoding error.
The command used to run FFmpeg is:
ffmpeg
-i /home/ftp/1c8f08b7d0d9e7fa4b24066156ad50bc981497a0.mov
-vcodec libx264
-preset ultrafast
-profile baseline
-acodec libfaac
-ab 96k
-crf 19
-vf movie="/home/adam/logo.png [watermark]; [in][watermark] overlay=main_w-overlay_w-10:main_h-overlay_h-10 [out]"
-y /home/ftp/1c8f08b7d0d9e7fa4b24066156ad50bc981497a0.flv
I am in no way an expert with the FFmpeg command line, so feel free to point out any mistakes.
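For reference, the log below warns that -profile is ambiguous; here is a sketch of the same invocation with the video profile spelled out explicitly and the filtergraph quoted as a whole (paths and settings unchanged from the question; this is not a verified fix for the corruption):
ffmpeg -i /home/ftp/1c8f08b7d0d9e7fa4b24066156ad50bc981497a0.mov \
  -vcodec libx264 -preset ultrafast -profile:v baseline \
  -acodec libfaac -ab 96k -crf 19 \
  -vf "movie=/home/adam/logo.png [watermark]; [in][watermark] overlay=main_w-overlay_w-10:main_h-overlay_h-10 [out]" \
  -y /home/ftp/1c8f08b7d0d9e7fa4b24066156ad50bc981497a0.flv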
FFmpeg info:
ffmpeg version git-2012-05-02-2330eb1 Copyright (c) 2000-2012 the FFmpeg developers
built on May 3 2012 08:51:25 with gcc 4.4.3
configuration: --enable-gpl --enable-libfaac --enable-libmp3lame --enable-libopencore-amrnb --enable-libopencore-amrwb --enable-libtheora --enable-libvorbis --enable-libvpx --enable-libx264 --enable-nonfree --enable-version3 --enable-x11grab
libavutil 51. 49.100 / 51. 49.100
libavcodec 54. 17.101 / 54. 17.101
libavformat 54. 3.100 / 54. 3.100
libavdevice 53. 4.100 / 53. 4.100
libavfilter 2. 72.103 / 2. 72.103
libswscale 2. 1.100 / 2. 1.100
libswresample 0. 11.100 / 0. 11.100
libpostproc 52. 0.100 / 52. 0.100
[mov,mp4,m4a,3gp,3g2,mj2 @ 0x24300e0] max_analyze_duration 5000000 reached at 5187000
Input #0, mov,mp4,m4a,3gp,3g2,mj2, from '/home/ftp/javarecorder/1c8f08b7d0d9e7fa4b24066156ad50bc981497a0.mov':
Metadata:
major_brand : qt
minor_version : 537199360
compatible_brands: qt
creation_time : 2012-05-16 08:19:41
Duration: 00:13:33.00, start: 0.000000, bitrate: 2164 kb/s
Stream #0:0(eng): Video: qtrle (rle / 0x20656C72), rgb24, 1366x768, 1457 kb/s, 8.43 fps, 1k tbr, 1k tbn, 1k tbc
Metadata:
creation_time : 2012-05-16 08:19:41
handler_name : Apple Alias Data Handler
Stream #0:1(eng): Audio: pcm_s16be (twos / 0x736F7774), 44100 Hz, 1 channels, s16, 705 kb/s
Metadata:
creation_time : 2012-05-16 08:19:41
handler_name : Apple Alias Data Handler
Please use -profile:a or -profile:v, -profile is ambiguous
[buffer @ 0x2446ac0] w:1366 h:768 pixfmt:rgb24 tb:1/1000000 sar:0/1 sws_param:flags=2
[movie @ 0x242f840] seek_point:0 format_name:(null) file_name:/home/adam/logo.png stream_index:0
[overlay @ 0x2442840] auto-inserting filter 'auto-inserted scale 0' between the filter 'src' and the filter 'Parsed_overlay_1'
[overlay @ 0x2442840] auto-inserting filter 'auto-inserted scale 1' between the filter 'Parsed_movie_0' and the filter 'Parsed_overlay_1'
[scale @ 0x24444a0] w:1366 h:768 fmt:rgb24 sar:0/1 -> w:1366 h:768 fmt:yuv420p sar:0/1 flags:0x4
[scale @ 0x2445100] w:80 h:80 fmt:rgba sar:1/1 -> w:80 h:80 fmt:yuva420p sar:1/1 flags:0x4
[overlay @ 0x2442840] main w:1366 h:768 fmt:yuv420p overlay x:1276 y:678 w:80 h:80 fmt:yuva420p
[overlay @ 0x2442840] main_tb:1/1000000 overlay_tb:1/25 -> tb:1/1000000 exact:1
[libx264 @ 0x242d8c0] MB rate (4128000) > level limit (2073600)
[libx264 @ 0x242d8c0] using cpu capabilities: MMX2 SSE2Fast SSSE3 FastShuffle SSE4.2
[libx264 @ 0x242d8c0] profile Constrained Baseline, level 5.2
[libx264 @ 0x242d8c0] 264 - core 124 r2197 69a0443 - H.264/MPEG-4 AVC codec - Copyleft 2003-2012 - http://www.videolan.org/x264.html - options: cabac=0 ref=1 deblock=0:0:0 analyse=0:0 me=dia subme=0 psy=1 psy_rd=1.00:0.00 mixed_ref=0 me_range=16 chroma_me=1 trellis=0 8x8dct=0 cqm=0 deadzone=21,11 fast_pskip=1 chroma_qp_offset=0 threads=36 sliced_threads=0 nr=0 decimate=1 interlaced=0 bluray_compat=0 constrained_intra=0 bframes=0 weightp=0 keyint=250 keyint_min=25 scenecut=0 intra_refresh=0 rc=crf mbtree=0 crf=19.0 qcomp=0.60 qpmin=0 qpmax=69 qpstep=4 ip_ratio=1.40 aq=0
[libfaac @ 0x2443540] channel_layout not specified
Guessed Channel Layout for Input Stream #0.1 : mono
Output #0, flv, to '/home/ftp/javarecorder/1c8f08b7d0d9e7fa4b24066156ad50bc981497a0.flv':
Metadata:
major_brand : qt
minor_version : 537199360
compatible_brands: qt
creation_time : 2012-05-16 08:19:41
encoder : Lavf54.3.100
Stream #0:0(eng): Video: h264 ([7][0][0][0] / 0x0007), yuv420p, 1366x768, q=-1--1, 1k tbn, 1k tbc
Metadata:
creation_time : 2012-05-16 08:19:41
handler_name : Apple Alias Data Handler
Stream #0:1(eng): Audio: aac ([10][0][0][0] / 0x000A), 44100 Hz, mono, s16, 96 kb/s
Metadata:
creation_time : 2012-05-16 08:19:41
handler_name : Apple Alias Data Handler
Stream mapping:
Stream #0:0 -> #0:0 (qtrle -> libx264)
Stream #0:1 -> #0:1 (pcm_s16be -> libfaac)
Press [q] to stop, [?] for help
Input stream #0:1 frame changed from rate:44100 fmt:s16 ch:1 chl:0x0 to rate:44100 fmt:s16 ch:1 chl:0x4
frame= 6856 fps=105 q=-1.0 Lsize= 36030kB time=00:13:32.83 bitrate= 363.1kbits/s
video:27775kB audio:7540kB global headers:0kB muxing overhead 2.026555%
[libx264 @ 0x242d8c0] frame I:28 Avg QP: 4.61 size:238170
[libx264 @ 0x242d8c0] frame P:6828 Avg QP: 7.31 size: 3189
[libx264 @ 0x242d8c0] mb I I16..4: 100.0% 0.0% 0.0%
[libx264 @ 0x242d8c0] mb P I16..4: 0.9% 0.0% 0.0% P16..4: 2.2% 0.0% 0.0% 0.0% 0.0% skip:96.9%
[libx264 @ 0x242d8c0] coded y,uvDC,uvAC intra: 32.3% 30.0% 29.0% inter: 1.0% 1.1% 1.0%
[libx264 @ 0x242d8c0] i16 v,h,dc,p: 66% 32% 1% 1%
[libx264 @ 0x242d8c0] i8c dc,h,v,p: 62% 23% 14% 1%
[libx264 @ 0x242d8c0] kb/s:279.82
EDIT:
A coworker was able to view a "corrupted" file. The only difference between my computer and his is that he has a Mac. Same Flash version, same JW Player version. Something is not right here.
-
How do I reduce frames with blending in ffmpeg
21 March 2014, by user3213418
I am trying to convert some files into ProRes.
One fairly important part of the conversion is:
- reducing frames from 60 to 30
- blending every 2 frames into one to achieve more fluid movement (a simple sort of motion blur)
I have tried the -blend command; however, it was not recognized as a command.
-i source.mp4 -r 30 -vcodec prores_ks -profile:v 0 Output.mov
How do I reduce frames with blending in ffmpeg?
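For what it's worth, a sketch of one common approach using filters available in newer FFmpeg builds (the tblend and framestep filters may not exist in the version the asker was using): tblend=all_mode=average blends each frame with the previous one, and framestep=2 then keeps every second frame, halving 60 fps to 30:
ffmpeg -i source.mp4 -vf "tblend=all_mode=average,framestep=2" -vcodec prores_ks -profile:v 0 Output.mov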