
Other articles (46)
-
Contribute to a better visual interface
13 April 2011. MediaSPIP is based on a system of themes and templates. Templates define the placement of information on the page and can be adapted to a wide range of uses. Themes define the overall graphic appearance of the site.
Anyone can submit a new graphic theme or template and make it available to the MediaSPIP community. -
Sites built with MediaSPIP
2 May 2011. This page presents some of the sites running MediaSPIP.
You can of course add your own using the form at the bottom of the page. -
General document management
13 May 2011. MediaSPIP never modifies the original document that is uploaded.
For each uploaded document it performs two successive operations: the creation of an additional version that can easily be viewed online, while keeping the original available for download in case the original document cannot be read in a web browser; and the retrieval of the original document's metadata in order to describe the file textually.
The tables below explain what MediaSPIP can do (...)
On other sites (6172)
-
How to Make Video from Arraylist of images with Music
6 April 2018, by hardik sojitra
I am making an app which creates a video from selected images stored on the SD card. I have added the library group : ’com.github.hiteshsondhi88.libffmpeg’, name : ’FFmpegAndroid’, version : ’0.2.5’ and written the classes below, but it does not work.
Utility.java
package com.example.admin.imagetovideowithnative; // package assumed to match the other classes in the post

import android.content.Context;
import android.os.Environment;
import android.util.Log;

import java.io.BufferedReader;
import java.io.File;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.InputStreamReader;

public class Utility {

    private final static String TAG = Utility.class.getName();
    private static Context mContext;

    public Utility(Context context) {
        mContext = context;
    }

    // Runs a shell command and returns its stdout as a string.
    public static String excuteCommand(String command) {
        try {
            Log.d(TAG, "execute command : " + command);
            Process process = Runtime.getRuntime().exec(command);

            BufferedReader reader = new BufferedReader(
                    new InputStreamReader(process.getInputStream()));
            int read;
            char[] buffer = new char[4096];
            StringBuffer output = new StringBuffer();
            while ((read = reader.read(buffer)) > 0) {
                output.append(buffer, 0, read);
            }
            reader.close();

            process.waitFor();
            Log.d(TAG, "command result: " + output.toString());
            return output.toString();
        } catch (IOException e) {
            Log.e(TAG, e.getMessage(), e);
        } catch (InterruptedException e) {
            Log.e(TAG, e.getMessage(), e);
        }
        return "";
    }

    public String getPathOfAppInternalStorage() {
        // Note: despite the name, this returns the root of external storage.
        return Environment.getExternalStorageDirectory().getAbsolutePath();
    }

    public void saveFileToAppInternalStorage(InputStream inputStream, String fileName) {
        File file = new File(getPathOfAppInternalStorage() + "/" + fileName);
        if (file.exists()) {
            Log.d(TAG, "SaveRawToAppDir Delete Existing File");
            file.delete();
        }

        FileOutputStream outputStream;
        try {
            // Note: openFileOutput() writes to the app's private internal files
            // directory (getFilesDir()), not to the external-storage path checked
            // above, so the file will not appear at getPathOfAppInternalStorage().
            outputStream = mContext.openFileOutput(fileName, Context.MODE_PRIVATE);
            byte[] buffer = new byte[1024];
            int length;
            while ((length = inputStream.read(buffer)) > 0) {
                outputStream.write(buffer, 0, length);
            }
            outputStream.close();
            inputStream.close();
        } catch (Exception e) {
            Log.e(TAG, e.getMessage(), e);
        }
    }

    public static boolean isFileExsisted(String filePath) {
        File file = new File(filePath);
        return file.exists();
    }

    public static void deleteFileAtPath(String filePath) {
        File file = new File(filePath);
        file.delete();
    }
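    // Not part of the original post: a sketch of a variant that actually writes to
    // the external-storage path that the rest of the code checks, assuming the app
    // holds the WRITE_EXTERNAL_STORAGE permission. The method name is illustrative.
    public void saveFileToExternalStorage(InputStream inputStream, String fileName) {
        File file = new File(getPathOfAppInternalStorage() + "/" + fileName);
        try {
            FileOutputStream outputStream = new FileOutputStream(file);
            byte[] buffer = new byte[1024];
            int length;
            while ((length = inputStream.read(buffer)) > 0) {
                outputStream.write(buffer, 0, length);
            }
            outputStream.close();
            inputStream.close();
        } catch (Exception e) {
            Log.e(TAG, e.getMessage(), e);
        }
    }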
}

FfmpegController.java
package com.example.admin.imagetovideowithnative;

import android.content.Context;
import android.util.Log;

import java.io.ByteArrayInputStream;
import java.io.InputStream;

public class FfmpegController {

    private static Context mContext;
    private static Utility mUtility;
    private static String mFfmpegBinaryPath;

    public FfmpegController(Context context) {
        mContext = context;
        mUtility = new Utility(context);
        initFfmpeg();
    }

    private void initFfmpeg() {
        mFfmpegBinaryPath = mContext.getApplicationContext().getFilesDir().getAbsolutePath() + "/ffmpeg";
        if (Utility.isFileExsisted(mFfmpegBinaryPath))
            return;
        InputStream inputStream = mContext.getResources().openRawResource(R.raw.ffmpeg);
        mUtility.saveFileToAppInternalStorage(inputStream, "ffmpeg");
        Utility.excuteCommand(CommandHelper.commandChangeFilePermissionForExecuting(mFfmpegBinaryPath));
    }

    public void convertImageToVideo(String inputImgPath) {
        Log.e("Image Path", "inputImgPath - " + inputImgPath);
        if (Utility.isFileExsisted(pathOuputVideo()))
            Utility.deleteFileAtPath(pathOuputVideo());
        saveShellCommandImg2VideoToAppDir(inputImgPath);
        Utility.excuteCommand("sh" + " " + pathShellScriptImg2Video());
    }

    public String pathOuputVideo() {
        return mUtility.getPathOfAppInternalStorage() + "/out.mp4";
    }

    private String pathShellScriptImg2Video() {
        return mUtility.getPathOfAppInternalStorage() + "/img2video.sh";
    }

    private void saveShellCommandImg2VideoToAppDir(String inputImgPath) {
        String command = CommandHelper.commandConvertImgToVideo(mFfmpegBinaryPath, inputImgPath, pathOuputVideo());
        InputStream is = new ByteArrayInputStream(command.getBytes());
        mUtility.saveFileToAppInternalStorage(is, "img2video.sh");
    }
}

CommandHelper.java
package com.example.admin.imagetovideowithnative;

import android.util.Log;

public class CommandHelper {

    public static String commandConvertImgToVideo(String ffmpegBinaryPath, String inputImgPath, String outputVideoPath) {
        Log.e("ffmpegBinaryPath", "ffmpegBinaryPath - " + ffmpegBinaryPath);
        Log.e("inputImgPath", "inputImgPath - " + inputImgPath);
        Log.e("outputVideoPath", "outputVideoPath - " + outputVideoPath);
        return ffmpegBinaryPath + " -r 1/1 -i " + inputImgPath + " -c:v libx264 -crf 23 -pix_fmt yuv420p -s 640x480 " + outputVideoPath;
    }

    public static String commandChangeFilePermissionForExecuting(String filePath) {
        return "chmod 777 " + filePath;
    }
}

AsynckTask
class Asynck extends AsyncTask<Object, Void, Void> {

    FfmpegController mFfmpegController = new FfmpegController(PhotosActivity.this);
    Utility mUtility = new Utility(PhotosActivity.this);

    @Override
    protected void onPreExecute() {
        super.onPreExecute();
        Log.e("Video Process Start", "======================== Video Process Start ======================================");
    }

    @Override
    protected Void doInBackground(Object... objects) {
        mFfmpegController.convertImageToVideo("");
        return null;
    }

    @Override
    protected void onPostExecute(Void aVoid) {
        super.onPostExecute(aVoid);
        Log.e("Video Process Complete", "======================== Video Process Complete ======================================");
        Log.e("Video Path", "Path - " + mFfmpegController.pathOuputVideo());
        Toast.makeText(PhotosActivity.this, "Video Process Complete", Toast.LENGTH_LONG).show();
    }
}
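The command built by commandConvertImgToVideo passes a single image path to -i, so it can only ever encode whatever one file is given; it also has no audio input even though the goal is a video with music. For a list of images, ffmpeg needs either a numbered image-sequence pattern (-framerate 1 -i img%03d.jpg) or its concat demuxer. The following is a rough sketch only, not part of the original post: the method name, the listFilePath/musicPath parameters and the one-second-per-image duration are illustrative assumptions, and the ffmpeg flags used (-f concat, -safe 0, -c:v libx264, -pix_fmt yuv420p, -c:a aac, -shortest) are standard options.

// Sketch: builds an ffmpeg concat-demuxer command for an arbitrary list of image
// paths and an optional music track. Would notionally live next to
// commandConvertImgToVideo / mFfmpegBinaryPath; names are hypothetical.
private String buildImageListCommand(java.util.List<String> imagePaths,
                                     String listFilePath,
                                     String musicPath,
                                     String outputVideoPath) throws java.io.IOException {
    StringBuilder list = new StringBuilder();
    for (String path : imagePaths) {
        list.append("file '").append(path).append("'\n");
        list.append("duration 1\n");               // show each image for one second
    }
    java.io.FileWriter writer = new java.io.FileWriter(listFilePath);
    writer.write(list.toString());
    writer.close();

    // -f concat -safe 0 reads the list file; -shortest stops when the image track ends.
    return mFfmpegBinaryPath + " -f concat -safe 0 -i " + listFilePath
            + " -i " + musicPath
            + " -c:v libx264 -pix_fmt yuv420p -vf scale=640:480"
            + " -c:a aac -shortest " + outputVideoPath;
}

The loop over the list file is what the single -i argument in commandConvertImgToVideo cannot express, which is likely why only one image ends up in the output.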
-
How do I extract the color matrix from an MP4 x264 stream in Media Foundation
23 August 2016, by Jules
I am playing a video (an MP4 containing an x264-encoded video stream) with a custom player using Media Foundation.
When I convert the YUV information into RGB I need to account for the color matrix and range used at encode time.
Some of my videos have this information, I can use MediaInfo.exe or FFMPEG to see that it is present.
However, when I look at the relevant Media Foundation attributes (Extended Color Information) for these videos, the properties are not present.
So, somehow I need to find a way to access the information.
Media Foundation does provide access to MF_MT_MPEG4_SAMPLE_DESCRIPTION and MF_MT_MPEG_SEQUENCE_HEADER for the video stream but I can’t find descriptions of what these contain.
I noticed that the MF_MT_MPEG_SEQUENCE_HEADER is much longer for the videos with the information present and this (MPEG Headers Quick Reference) seems to suggest headers might contain the information I need.
I’m looking for Color Range (limited/full), Color Primaries, Transfer Characteristics and Matrix Coefficients (BT.709 etc).
I’d greatly appreciate any help finding this information from a Media Foundation video stream.
Thanks
Jules
Update - Sequence Header
The sequence header appears to be a subset of the MPEG4 sample description, though I can't find anything that indicates what either piece of data specifically does or does not contain.
The sequence header appears to contain data structured as an MP4 byte stream as described in the H264 Standards Document and includes the VUI (Video Usability Information - Annex E of document) which may then include the colour information I’m interested in.
Given that it’s a byte stream I need to know where it starts and whether there’s some existing code I could use to decode it.
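Since the attribute does seem to carry start-code-delimited data (see the update below), one first step is simply to locate the NAL unit start codes and check the NAL type: 7 is a sequence parameter set (SPS), 8 a picture parameter set (PPS), and the VUI colour fields (video_full_range_flag for limited vs full range, plus colour_primaries, transfer_characteristics and matrix_coefficients, each of which is 1 for BT.709) live inside the SPS. A minimal sketch, in Java only to match the other code on this page (the method name is illustrative):

// Sketch: scans a byte buffer for Annex-B start codes (00 00 01 or 00 00 00 01)
// and reports the NAL unit type that follows each one.
static void listNalUnits(byte[] data) {
    for (int i = 0; i + 3 < data.length; i++) {
        boolean startCode3 = data[i] == 0 && data[i + 1] == 0 && data[i + 2] == 1;
        boolean startCode4 = i + 4 < data.length
                && data[i] == 0 && data[i + 1] == 0 && data[i + 2] == 0 && data[i + 3] == 1;
        if (startCode3 || startCode4) {
            int headerIndex = i + (startCode4 ? 4 : 3);
            int nalType = data[headerIndex] & 0x1F;   // low 5 bits of the NAL header
            System.out.println("NAL at offset " + headerIndex + ", type " + nalType
                    + (nalType == 7 ? " (SPS - contains the VUI colour fields)" : ""));
            i = headerIndex;                          // continue scanning after this header
        }
    }
}

Fully decoding the SPS down to the VUI still requires an exp-Golomb bitstream reader (and removal of emulation-prevention bytes), which is what the FFmpeg function mentioned below does.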
In FFMPEG in libavcodec/h264_ps.c there is a function called ff_h264_decode_seq_parameter_set which ends up calling decode_vui_parameters. It seems possible that seq_parameter_set maps to MF_MT_MPEG_SEQUENCE_HEADER and it may be possible to use that code to decode the data.
If anyone one has any direct experience with decoding this data it would be very useful.
Thanks again
Update - Related posts
I found How to decode sprop-parameter-sets in a H264 SDP? and Possible Locations for Sequence/Picture Parameter Set(s) for H.264 Stream, which are fairly helpful.
The sequence header would appear to be a sequence and/or picture parameter set (SPS/PPS), and the parameters I want are the VUI extension subset.
Plus the post H.264 stream structure gives a high-level view of how the stream data is structured, and MF_MT_MPEG_SEQUENCE_HEADER appears to start with the 0x00 0x00 0x01 start code, so I'm guessing it is a NAL unit containing the PPS.
-
avfilter/vf_vectorscope : draw color points names
7 March 2016, by Paul B Mahol