
Other articles (81)
-
Emballe médias: what is it for?
4 February 2011
This plugin is designed to manage sites for publishing documents of all types.
It creates "media" items, meaning: a "media" is an article, in the SPIP sense, created automatically when a document is uploaded, whether it is audio, video, image or text; only a single document can be linked to a "media" article;
-
Managing creation and editing rights for objects
8 February 2011
By default, many features are restricted to administrators, but each one can be configured independently to change the minimum status required to use it, in particular: writing content on the site, adjustable through the form template management; adding notes to articles; adding captions and annotations to images;
-
Supporting all media types
13 April 2011
Unlike most software and media-sharing platforms, MediaSPIP aims to manage as many different media types as possible. The following are just a few examples from an ever-expanding list of supported formats: images: png, gif, jpg, bmp and more; audio: MP3, Ogg, Wav and more; video: AVI, MP4, OGV, mpg, mov, wmv and more; text, code and other data: OpenOffice, Microsoft Office (Word, PowerPoint, Excel), web (html, CSS), LaTeX, Google Earth and (...)
On other sites (3541)
-
ffmpeg audio sync issue - fractional frame rate
29 March 2013, by JoeCitizen
We are working on an Android app that uses the ffmpeg library via JNI to edit the frames of a video while keeping the size, codec, etc. the same.
We have an issue where the audio of the output video drifts out of sync with the video. We believe this is because some input videos have a frame rate that is not a whole number, e.g. 25.66 fps, while our output is at 25 fps. We have tried changing the time_base field of the output codec to keep the precision, i.e. by multiplying the numerator and denominator, but that makes the output frame rate ridiculously high.
Does anyone know how to force ffmpeg to use exactly the same output frame rate as the one it reads in? We have not found a way to output videos with a fractional frame rate.
Example of setting up the output codec:
c = st->codec;
c->codec_id = codec_id;
c->codec_type = AVMEDIA_TYPE_VIDEO;
c->bit_rate = inputCodecCtx->bit_rate;
c->width = inputCodecCtx->width;
c->height = inputCodecCtx->height;
c->time_base.num = 1000;
c->time_base.den = (int)(fps*1000);//fps of the input video *1000 to keep precision
c->gop_size = inputCodecCtx->gop_size;
c->pix_fmt = inputCodecCtx->pix_fmt;
static int write_video_frame(AVFormatContext *oc, AVStream *st, AVFrame *newpict, double fps)
{
    int ret = 0;
    AVCodecContext* c = st->codec;
    AVPacket pkt;
    av_init_packet(&pkt);
    if (pkt.pts != AV_NOPTS_VALUE)
    {
        pkt.pts = av_rescale_q(c->coded_frame->pts, c->time_base, st->time_base);
    }
    if (pkt.dts != AV_NOPTS_VALUE)
    {
        pkt.dts = av_rescale_q(c->coded_frame->pts, c->time_base, st->time_base);
    }
    .....
}
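A possible direction, sketched here rather than taken from the post: derive the output time base from the input stream's frame rate as an exact rational instead of rounding it through a double. The inputStream variable is an assumption standing for the source AVStream; the rest reuses the c context from the setup code above.
/* Sketch only: inputStream stands for the source AVStream and is not part of the
 * original code. avg_frame_rate is stored as an exact rational (e.g. 77/3 for
 * roughly 25.667 fps), so inverting it keeps the fractional rate without the
 * rounding introduced by (int)(fps*1000). */
AVRational in_rate = inputStream->avg_frame_rate;
if (in_rate.num == 0 || in_rate.den == 0)
    in_rate = inputStream->r_frame_rate;   /* fall back when the average is unknown */
c->time_base.num = in_rate.den;            /* time base = 1 / frame rate */
c->time_base.den = in_rate.num;
Packet timestamps then still need to be rescaled from c->time_base to st->time_base before muxing, which is what the av_rescale_q calls in write_video_frame are already meant to do.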
-
av_read_frame function in ffmpeg on Android always returning packet.stream_index as 0
31 July 2013, by droidmad
I am using the standard code below (ref: http://dranger.com/ffmpeg/) to use ffmpeg on Android through the NDK. The code works fine on Ubuntu 10.04 with gcc, but on Android av_read_frame(pFormatCtx, &packet) always returns packet.stream_index = 0. I have tested my code with various RTSP URLs and get the same behaviour in all cases. I do not have any linking or compiling issues, as everything else seems to be working fine. I have been trying to solve this for the last two days but am badly stuck. Please point me in the right direction.
#include <jni.h>                 /* first header name lost in the page extraction; jni.h is needed for the JNI types below */
#include <android/log.h>
#include <libavcodec/avcodec.h>
#include <libavformat/avformat.h>
#include <libswscale/swscale.h>
#include <stdio.h>               /* second header name lost in extraction; stdio.h assumed */
#define DEBUG_TAG "mydebug_ndk"
jint Java_com_example_tut2_MainActivity_myfunc(JNIEnv * env, jobject this, jstring myjurl) {
    AVFormatContext *pFormatCtx = NULL;
    int i, videoStream;
    AVCodecContext *pCodecCtx = NULL;
    AVCodec *pCodec = NULL;
    AVFrame *pFrame = NULL;
    AVPacket packet;
    int frameFinished;
    AVDictionary *optionsDict = NULL;
    struct SwsContext *sws_ctx = NULL;
    jboolean isCopy;
    const char * mycurl = (*env)->GetStringUTFChars(env, myjurl, &isCopy);
    __android_log_print(ANDROID_LOG_DEBUG, DEBUG_TAG, "NDK:p2: [%s]", mycurl);
    // Register all formats and codecs
    av_register_all();
    avformat_network_init();
    // Open video file
    __android_log_print(ANDROID_LOG_DEBUG, DEBUG_TAG, "NDK:before_open");
    if(avformat_open_input(&pFormatCtx, mycurl, NULL, NULL)!=0)
        return -1;
    __android_log_print(ANDROID_LOG_DEBUG, DEBUG_TAG, "start: %d\t%d\n", pFormatCtx->raw_packet_buffer_remaining_size, pFormatCtx->max_index_size);
    (*env)->ReleaseStringUTFChars(env, myjurl, mycurl);
    __android_log_print(ANDROID_LOG_DEBUG, DEBUG_TAG, "NDK:before_stream");
    // Retrieve stream information
    if(avformat_find_stream_info(pFormatCtx, NULL)<0)
        return -1;
    __android_log_print(ANDROID_LOG_DEBUG, DEBUG_TAG, "NDK:after_stream");
    // Find the first video stream
    videoStream=-1;
    for(i=0; i<pFormatCtx->nb_streams; i++)
        if(pFormatCtx->streams[i]->codec->codec_type==AVMEDIA_TYPE_VIDEO) {
            videoStream=i;
            break;
        }
    if(videoStream==-1)
        return -1; // Didn't find a video stream
    __android_log_print(ANDROID_LOG_DEBUG, DEBUG_TAG, "NDK:after_videostream");
    // Get a pointer to the codec context for the video stream
    pCodecCtx=pFormatCtx->streams[videoStream]->codec;
    __android_log_print(ANDROID_LOG_DEBUG, DEBUG_TAG, "NDK:after_codec_context");
    // Find the decoder for the video stream
    pCodec=avcodec_find_decoder(pCodecCtx->codec_id);
    __android_log_print(ANDROID_LOG_DEBUG, DEBUG_TAG, "NDK:after_decoder");
    if(pCodec==NULL)
        return -1;
    __android_log_print(ANDROID_LOG_DEBUG, DEBUG_TAG, "NDK:found_decoder");
    // Open codec
    if(avcodec_open2(pCodecCtx, pCodec, &optionsDict)<0)
        return -1;
    // Allocate video frame
    pFrame=avcodec_alloc_frame();
    sws_ctx = sws_getContext(pCodecCtx->width, pCodecCtx->height, pCodecCtx->pix_fmt,
                             pCodecCtx->width, pCodecCtx->height, PIX_FMT_YUV420P,
                             SWS_BILINEAR, NULL, NULL, NULL);
    __android_log_print(ANDROID_LOG_DEBUG, DEBUG_TAG, "NDK:before_while");
    int count=0;
    while(av_read_frame(pFormatCtx, &packet)>=0) {
        __android_log_print(ANDROID_LOG_DEBUG, DEBUG_TAG, "entered while: %d %d %d\n", packet.duration, packet.stream_index, packet.size);
        if(packet.stream_index==videoStream) {
            // Decode video frame
            //break;
            avcodec_decode_video2(pCodecCtx, pFrame, &frameFinished, &packet);
            __android_log_print(ANDROID_LOG_DEBUG, DEBUG_TAG, "NDK:in_while");
            if(frameFinished) {
                __android_log_print(ANDROID_LOG_DEBUG, DEBUG_TAG, "NDK:gng_out_of_while");
                break;
            }
        }
        // Free the packet that was allocated by av_read_frame
        av_free_packet(&packet);
        if(++count>1000)
            return -2; // infinite while loop
    }
    __android_log_print(ANDROID_LOG_DEBUG, DEBUG_TAG, "NDK:after_while");
    // Free the YUV frame
    av_free(pFrame);
    // Close the codec
    avcodec_close(pCodecCtx);
    // Close the video file
    avformat_close_input(&pFormatCtx);
    return 0;
}
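For the stream_index question, a small diagnostic sketch (not part of the post) that could be placed after avformat_find_stream_info(): dump the demuxer's view of the input and let av_find_best_stream pick the video stream, so an unexpected stream layout becomes visible in the log.
/* Sketch only: assumes pFormatCtx has been opened and probed as in the code above. */
int video_idx = av_find_best_stream(pFormatCtx, AVMEDIA_TYPE_VIDEO, -1, -1, NULL, 0);
if (video_idx < 0)
    __android_log_print(ANDROID_LOG_DEBUG, DEBUG_TAG, "NDK: no video stream found");
else
    __android_log_print(ANDROID_LOG_DEBUG, DEBUG_TAG, "NDK: video stream index = %d", video_idx);

av_dump_format(pFormatCtx, 0, "input", 0);  /* prints every stream and its index via av_log */
Note that av_dump_format writes through av_log, so on Android its output only reaches logcat if an av_log callback forwarding to __android_log_print has been installed.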
-
Anomalie #3607: Compatibility problem with MySQL 5.7 in SPIP 3.1
26 April 2016, by Guillaume Fahrner
The format is correct; it is the value that does not fit. Changing it seems to be enough:
https://github.com/concrete5/concrete5/issues/3185 :
The DATETIME type is used for values that contain both date and time parts. MySQL retrieves and displays DATETIME values in 'YYYY-MM-DD HH:MM:SS' format. The supported range is '1000-01-01 00:00:00' to '9999-12-31 23:59:59'.