
Advanced search
Other articles (98)
-
Customizable form
21 June 2013, by
This page presents the fields available in the publication form for a media item and indicates the different fields that can be added. Media creation form
For a media-type document, the default fields are: Text; Enable/disable the forum (the comment prompt can be disabled for each article); Licence; Add/remove authors; Tags
This form can be modified under:
Administration > Configuration des masques de formulaire. (...) -
What is a form mask
13 June 2013, by
A form mask is the customization of the upload form for media, sections, news items, editorials and links to sites.
Each object publication form can therefore be customized.
To customize the form fields, go to the administration area of your MediaSPIP and select "Configuration des masques de formulaires".
Then select the form to modify by clicking on its object type. (...) -
Sites built with MediaSPIP
2 May 2011, by
This page presents some of the sites running MediaSPIP.
You can of course add yours using the form at the bottom of the page.
On other sites (5195)
-
Image to MPEG on Linux works, same code on Android = green video
5 April 2018, by JScoobyCed
EDIT
I have checked the execution and found that the error is not (yet) at the swscale point. My current issue is that the JPG image is not found:
No such file or directory
when calling avformat_open_input(&pFormatCtx, imageFileName, NULL, NULL);
Before you tell me I need to register anything, I can tell you I already did (I updated the code below).
I also added the Android permission to access the external storage (I don't think it is related to Android, since I can already write to /mnt/sdcard/, where the image is also located).
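For this kind of "No such file or directory" failure, a minimal diagnostic sketch, assuming the path string reaches native code unchanged (the helper name try_open and the log tag "native" are illustrative, not from the post):
#include <unistd.h>
#include <android/log.h>
#include <libavformat/avformat.h>

// Sketch: confirm the path is readable by this process before handing it to FFmpeg.
static int try_open(const char *imageFileName, AVFormatContext **pFormatCtx) {
    if (access(imageFileName, R_OK) != 0) {
        // Not readable from native code: wrong path, missing permission,
        // or the JNI string was not converted as expected.
        __android_log_print(ANDROID_LOG_ERROR, "native", "cannot read %s", imageFileName);
        return -1;
    }
    int ret = avformat_open_input(pFormatCtx, imageFileName, NULL, NULL);
    if (ret < 0) {
        char errbuf[128];
        av_strerror(ret, errbuf, sizeof(errbuf)); // FFmpeg's readable error text
        __android_log_print(ANDROID_LOG_ERROR, "native", "avformat_open_input: %s", errbuf);
    }
    return ret;
}
If access() already fails here, the problem is the path or the storage permission rather than the FFmpeg build.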
END EDIT
I have been through several tutorials (including the few posted on SO, i.e. http://dranger.com/ffmpeg/, how to compile ffmpeg for Android..., and the dolphin-player source code). Here is what I have:
. Compiled ffmpeg for Android
. Ran basic tutorials using the NDK to create a dummy video on my Android device
. Been able to generate an MPEG2 video from images on Ubuntu using a modified version of the dummy video code above and a lot of Googling
. Running the new code on an Android device gives a green-screen video (duration 1 sec whatever the number of frames I encode)
I saw another post about an iPhone in a similar situation that mentioned the ARM processor optimization could be the culprit. I tried a few extra-ldflags (-arch armv7-a and similar) with no success.
I include at the end the code that loads the image. Is there something that needs to be done differently on Android than on Linux? Is my ffmpeg build not correct for Android video encoding?
void copyFrame(AVCodecContext *destContext, AVFrame* dest,
AVCodecContext *srcContext, AVFrame* source) {
struct SwsContext *swsContext;
swsContext = sws_getContext(srcContext->width, srcContext->height, srcContext->pix_fmt,
destContext->width, destContext->height, destContext->pix_fmt,
SWS_FAST_BILINEAR, NULL, NULL, NULL);
sws_scale(swsContext, source->data, source->linesize, 0, srcContext->height, dest->data, dest->linesize);
sws_freeContext(swsContext);
}
int loadFromFile(const char* imageFileName, AVFrame* realPicture, AVCodecContext* videoContext) {
AVFormatContext *pFormatCtx = NULL;
avcodec_register_all();
av_register_all();
int ret = avformat_open_input(&pFormatCtx, imageFileName, NULL, NULL);
if (ret != 0) {
// ERROR happening here
// Can't open image file. Use strerror(AVERROR(ret)) for details
return ERR_CANNOT_OPEN_IMAGE;
}
AVCodecContext *pCodecCtx;
pCodecCtx = pFormatCtx->streams[0]->codec;
pCodecCtx->width = W_VIDEO;
pCodecCtx->height = H_VIDEO;
pCodecCtx->pix_fmt = PIX_FMT_YUV420P;
// Find the decoder for the video stream
AVCodec *pCodec = avcodec_find_decoder(pCodecCtx->codec_id);
if (!pCodec) {
// Codec not found
return ERR_CODEC_NOT_FOUND;
}
// Open codec
if (avcodec_open(pCodecCtx, pCodec) < 0) {
// Could not open codec
return ERR_CANNOT_OPEN_CODEC;
}
//
AVFrame *pFrame;
pFrame = avcodec_alloc_frame();
if (!pFrame) {
// Can't allocate memory for AVFrame
return ERR_CANNOT_ALLOC_MEM;
}
int frameFinished;
int numBytes;
// Determine required buffer size and allocate buffer
numBytes = avpicture_get_size(PIX_FMT_YUV420P, pCodecCtx->width, pCodecCtx->height);
uint8_t *buffer = (uint8_t *) av_malloc(numBytes * sizeof (uint8_t));
avpicture_fill((AVPicture *) pFrame, buffer, PIX_FMT_YUV420P, pCodecCtx->width, pCodecCtx->height);
AVPacket packet;
int res = 0;
while (av_read_frame(pFormatCtx, &packet) >= 0) {
if (packet.stream_index != 0)
continue;
ret = avcodec_decode_video2(pCodecCtx, pFrame, &frameFinished, &packet);
if (ret > 0) {
// now, load the useful info into realPicture
copyFrame(videoContext, realPicture, pCodecCtx, pFrame);
// Free the packet that was allocated by av_read_frame
av_free_packet(&packet);
return 0;
} else {
// Error decoding frame. Use strerror(AVERROR(ret)) for details
res = ERR_DECODE_FRAME;
}
}
av_free(pFrame);
// close codec
avcodec_close(pCodecCtx);
// Close the image file
av_close_input_file(pFormatCtx);
return res;
}
Some ./configure options:
--extra-cflags="-O3 -fpic -DANDROID -DHAVE_SYS_UIO_H=1 -Dipv6mr_interface=ipv6mr_ifindex -fasm -Wno-psabi -fno-short-enums -fno-strict-aliasing -finline-limit=300 -mfloat-abi=softfp -mfpu=vfp -marm -march=armv7-a -D_FILE_OFFSET_BITS=64 -D_LARGEFILE_SOURCE -D_LARGEFILE64_SOURCE"
--extra-ldflags="-Wl,-rpath-link=$PLATFORM/usr/lib -L$PLATFORM/usr/lib -nostdlib -lc -lm -ldl -llog"
--arch=armv7-a --enable-armv5te --enable-armv6 --enable-armvfp --enable-memalign-hack
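One detail worth checking for the uniform green output: the destination AVFrame passed into copyFrame (realPicture) must already be backed by an allocated YUV420P buffer before sws_scale writes into it, because zeroed or unwritten YUV planes encode as solid green. A minimal allocation sketch in the same era of FFmpeg API as the code above (the helper name alloc_dest_frame is illustrative, not from the post):
#include <libavcodec/avcodec.h>
#include <libavutil/mem.h>

// Sketch: give the frame that sws_scale writes into real plane buffers.
static AVFrame *alloc_dest_frame(int width, int height) {
    AVFrame *frame = avcodec_alloc_frame();   // av_frame_alloc() in newer FFmpeg
    if (!frame)
        return NULL;
    int size = avpicture_get_size(PIX_FMT_YUV420P, width, height);
    uint8_t *buffer = (uint8_t *) av_malloc(size);
    if (!buffer) {
        av_free(frame);
        return NULL;
    }
    // Point frame->data / frame->linesize at the buffer, plane by plane.
    avpicture_fill((AVPicture *) frame, buffer, PIX_FMT_YUV420P, width, height);
    return frame;
}
If realPicture is allocated this way before loadFromFile is called, the remaining suspects are the decode check (frameFinished is never consulted, so a frame that is not yet complete could be scaled) and pixel-format mismatches between the decoder and the encoder context.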
-
SIGSEGV when loading ffmpeg library
20 February 2014, by gookman
I am trying to build ffmpeg for Android similar to AndroidFFmpeg.
I have configured ffmpeg so that I have only what I need. My android_build.sh looks like this:
#!/bin/bash
if [ "$NDK" = "" ]; then
echo NDK variable not set, exiting
echo "Use: export NDK=/your/path/to/android-ndk"
exit 1
fi
OS=`uname -s | tr '[A-Z]' '[a-z]'`
function build_ffmpeg
{
PLATFORM=$NDK/platforms/$PLATFORM_VERSION/arch-$ARCH/
CC=$PREBUILT/bin/$EABIARCH-gcc
CROSS_PREFIX=$PREBUILT/bin/$EABIARCH-
PKG_CONFIG=${CROSS_PREFIX}pkg-config
if [ ! -f $PKG_CONFIG ];
then
cat > $PKG_CONFIG << EOF
#!/bin/bash
pkg-config \$*
EOF
chmod u+x $PKG_CONFIG
fi
NM=$PREBUILT/bin/$EABIARCH-nm
cd ffmpeg
export PKG_CONFIG_LIBDIR=$(pwd)/$PREFIX/lib/pkgconfig/
export PKG_CONFIG_PATH=$(pwd)/$PREFIX/lib/pkgconfig/
./configure --target-os=linux \
--prefix=$PREFIX \
--enable-small \
--enable-cross-compile \
--extra-libs="-lgcc" \
--arch=$ARCH \
--cc=$CC \
--cross-prefix=$CROSS_PREFIX \
--nm=$NM \
--sysroot=$PLATFORM \
--extra-cflags=" -O3 -fpic -DANDROID -DHAVE_SYS_UIO_H=1 -Dipv6mr_interface=ipv6mr_ifindex -fasm -Wno-psabi -fno-short-enums -fno-strict-aliasing -finline-limit=300 $OPTIMIZE_CFLAGS " \
--disable-shared \
--enable-static \
--enable-runtime-cpudetect \
--extra-ldflags="-Wl,-rpath-link=$PLATFORM/usr/lib -L$PLATFORM/usr/lib -nostdlib -lc -lm -ldl -llog -L$PREFIX/lib" \
--extra-cflags="-I$PREFIX/include" \
--disable-everything \
--enable-muxer=mp4 \
--enable-protocol=file \
--enable-decoder=mpeg4 \
--enable-encoder=mpeg4 \
--enable-avformat \
--enable-avcodec \
--enable-avresample \
--enable-zlib \
--disable-doc \
--disable-ffplay \
--disable-ffmpeg \
--disable-ffplay \
--disable-ffprobe \
--disable-ffserver \
--disable-avdevice \
--enable-nonfree \
--enable-version3 \
--enable-memalign-hack \
--enable-asm \
$ADDITIONAL_CONFIGURE_FLAG \
|| exit 1
make clean || exit 1
make -j4 install || exit 1
cd ..
}
function build_one {
cd ffmpeg
PLATFORM=$NDK/platforms/$PLATFORM_VERSION/arch-$ARCH/
$PREBUILT/bin/$EABIARCH-ld -rpath-link=$PLATFORM/usr/lib -L$PLATFORM/usr/lib -L$PREFIX/lib -soname $SONAME -shared -nostdlib -z noexecstack -Bsymbolic --whole-archive --no-undefined -o $OUT_LIBRARY -lavcodec -lavformat -lavresample -lavutil -lc -lm -lz -ldl -llog --dynamic-linker=/system/bin/linker -zmuldefs $PREBUILT/lib/gcc/$EABIARCH/4.6/libgcc.a || exit 1
cd ..
}
#arm v5
EABIARCH=arm-linux-androideabi
ARCH=arm
CPU=armv5
OPTIMIZE_CFLAGS="-marm -march=$CPU"
PREFIX=../ffmpeg-build/armeabi
OUT_LIBRARY=$PREFIX/libffmpeg.so
ADDITIONAL_CONFIGURE_FLAG=
SONAME=libffmpeg.so
PREBUILT=$NDK/toolchains/arm-linux-androideabi-4.6/prebuilt/$OS-x86_64
PLATFORM_VERSION=android-5
build_ffmpeg
build_one
#arm v7vfpv3
EABIARCH=arm-linux-androideabi
ARCH=arm
CPU=armv7-a
OPTIMIZE_CFLAGS="-mfloat-abi=softfp -mfpu=vfpv3-d16 -marm -march=$CPU "
PREFIX=../ffmpeg-build/armeabi-v7a
OUT_LIBRARY=$PREFIX/libffmpeg.so
ADDITIONAL_CONFIGURE_FLAG=
SONAME=libffmpeg.so
PREBUILT=$NDK/toolchains/arm-linux-androideabi-4.6/prebuilt/$OS-x86_64
PLATFORM_VERSION=android-5
build_ffmpeg
build_one
My Android.mk file looks like this:
LOCAL_PATH := $(call my-dir)
include $(CLEAR_VARS)
LOCAL_MODULE := ffmpeg-prebuilt
LOCAL_SRC_FILES := ffmpeg-build/$(TARGET_ARCH_ABI)/libffmpeg.so
LOCAL_EXPORT_C_INCLUDES := ffmpeg-build/$(TARGET_ARCH_ABI)/include
LOCAL_EXPORT_LDLIBS := ffmpeg-build/$(TARGET_ARCH_ABI)/libffmpeg.so
LOCAL_PRELINK_MODULE := true
include $(PREBUILT_SHARED_LIBRARY)
include $(CLEAR_VARS)
LOCAL_ALLOW_UNDEFINED_SYMBOLS=false
LOCAL_MODULE := native
LOCAL_SRC_FILES := native.c
LOCAL_LDLIBS := -llog
LOCAL_C_INCLUDES := $(LOCAL_PATH)/ffmpeg-build/$(TARGET_ARCH_ABI)/include
LOCAL_SHARED_LIBRARY := ffmpeg-prebuilt
include $(BUILD_SHARED_LIBRARY)
The actual code I am trying to use is very simple and looks like this:
#include <jni.h>
#include <android/log.h>
#include <libavcodec/avcodec.h>
#include <libavformat/avformat.h>
static jint Java_dk_bendingspoons_clipstitch_VideoLoader_start(JNIEnv * env, jobject thiz, jstring path)
{
__android_log_print(ANDROID_LOG_VERBOSE, "native", "Before register all.");
av_register_all();
return 0;
}
The compilation of both ffmpeg and my code completes without any errors. The problem is that when I load the library in Android it fails with the error:
Fatal signal 11 (SIGSEGV) at 0xdeadbaad (code=1), thread 27050
This leads me to believe that there is something I am doing wrong when compiling. I should mention that I am compiling on Ubuntu 13.10 x64 with NDK-r9c-x86_64.
How should I proceed in finding out what the issue is?
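For a crash at 0xdeadbaad, the first question is whether it happens while the dynamic linker loads libffmpeg.so or only once FFmpeg code runs. A minimal way to tell them apart, assuming the wrapper library above, is a JNI_OnLoad that does nothing but log the linked library versions (this is a sketch, not part of the original code; the log tag is arbitrary):
#include <jni.h>
#include <android/log.h>
#include <libavcodec/avcodec.h>
#include <libavformat/avformat.h>

// Sketch: the VM calls this right after System.loadLibrary() succeeds.
// If it never logs, the failure is in loading/linking the libraries;
// if it logs and the crash comes later, look at the FFmpeg calls instead.
JNIEXPORT jint JNICALL JNI_OnLoad(JavaVM *vm, void *reserved) {
    __android_log_print(ANDROID_LOG_INFO, "native",
                        "libavcodec %u, libavformat %u",
                        avcodec_version(), avformat_version());
    return JNI_VERSION_1_6;
}
Note also that on older Android releases the linker does not resolve a shared library's dependencies from the app's lib directory on its own, so the Java side generally has to load libffmpeg.so explicitly before the wrapper library; loading them in the wrong order is a common cause of load-time aborts with prebuilt FFmpeg setups like this one.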