
Other articles (47)
- Sites built with MediaSPIP
2 May 2011. This page presents some of the sites running MediaSPIP.
You can of course add your own using the form at the bottom of the page.
- HTML5 audio and video support
13 April 2011. MediaSPIP uses HTML5 video and audio tags to play multimedia files, taking advantage of the latest W3C innovations supported by modern browsers.
The player was created specifically for MediaSPIP and can easily be adapted to fit a particular theme.
For older browsers, the Flowplayer Flash fallback is used.
MediaSPIP allows for media playback on major mobile platforms with the above (...)
- From upload to the final video [standalone version]
31 January 2010. The journey of an audio or video document through SPIPMotion is divided into three distinct stages.
Upload and retrieval of information about the source video
First of all, a SPIP article must be created and the "source" video document attached to it.
When this document is attached to the article, two actions are executed in addition to the normal behaviour: retrieval of the technical information of the file's audio and video streams; generation of a thumbnail: extraction of a (...)
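The information-retrieval step described above is typically done by probing the uploaded file with a media library. As a hedged illustration only (not SPIPMotion's actual code, which lives in SPIP/PHP), here is a minimal libavformat sketch in C that opens a source file and reports its stream parameters; the function name probe_source is hypothetical:

#include <libavformat/avformat.h>
#include <stdio.h>

/* Hypothetical sketch: open an uploaded "source" file and dump the
 * technical parameters of its audio/video streams.
 * Assumes FFmpeg >= 4.0, so no av_register_all() is needed. */
static int probe_source(const char *path)
{
    AVFormatContext *fmt = NULL;
    int ret = avformat_open_input(&fmt, path, NULL, NULL);
    if (ret < 0)
        return ret; /* unreadable file or unknown container */
    ret = avformat_find_stream_info(fmt, NULL);
    if (ret >= 0) {
        for (unsigned i = 0; i < fmt->nb_streams; i++) {
            const AVCodecParameters *p = fmt->streams[i]->codecpar;
            printf("stream %u: type=%d codec_id=%d %dx%d sample_rate=%d\n",
                   i, p->codec_type, p->codec_id,
                   p->width, p->height, p->sample_rate);
        }
    }
    avformat_close_input(&fmt);
    return ret < 0 ? ret : 0;
}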
On other sites (6643)
- Raw H264 + ADTS AAC audio streams muxing results in wrong duration and bitrate using ffmpeg libraries
17 April 2015, by Muhammad Ali. I have imaging hardware that outputs two streams:
Video: H264 bitstream
Audio: ADTS AAC
I am aware of the input source parameters, e.g. bitrate and FPS (video), sampling rate (audio), etc., and I have set the ffmpeg parameters accordingly.
My desired output is an FLV container with these two streams.
At stage 1, I was able to mux the H264 bitstream into an FLV container, which played just fine in ffplay. No errors were reported on the console, and the duration and bitrate calculations were fine too.
At stage 2, I tried to mux the ADTS AAC audio stream together with the video stream into the FLV container; the audio stream required the adtstoasc bitstream filter. But now the duration of the file was wrong, and so was the bitrate.
I should mention that my PTS source is hardware-provided, and the hardware claims that the audio and video streams use the same counter for PTS, so audio and video frames should always be in order.
Playback of the resulting file in ffplay gets stuck on the first video frame while the audio keeps playing fine, and the console complains repeatedly about "AVC NAL Size (very large number)".
Any ideas why the duration/bitrate is wrong when I mux the audio in as well?
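For context, a minimal sketch of how the adtstoasc filter mentioned above is typically created and torn down with the 2015-era FFmpeg bitstream filter API; the surrounding setup and error handling are assumed, and aacbsfc matches the name used in the code below:

/* Sketch, assuming the 2015-era API: create the filter once before muxing. */
AVBitStreamFilterContext *aacbsfc = av_bitstream_filter_init("aac_adtstoasc");
if (!aacbsfc) {
    fprintf(stderr, "aac_adtstoasc bitstream filter not available\n");
    /* abort setup here */
}
/* per AAC packet, as in the code below:
 *   av_bitstream_filter_filter(aacbsfc, stream->codec, NULL,
 *                              &pkt.data, &pkt.size, pkt.data, pkt.size, 0); */
/* once muxing is finished: */
av_bitstream_filter_close(aacbsfc);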
Here is the muxing code:
if ((packet.header.dataType == AM_TRANSFER_PACKET_TYPE_H264) ||
    (packet.header.dataType == AM_TRANSFER_PACKET_TYPE_AAC))
{
    AVCodecContext *cn;
    raw_fd_index = packet.header.streamId << 1;
    //printf("GOT VIDEO FRAME : DATA LEN : %d \n", packet.header.dataLen);
    if (packet.header.dataType == AM_TRANSFER_PACKET_TYPE_H264)
    {
        //printf("VIDEO STREAM ID %d | PTS : %d | V.SEQ: %d \n\n", packet.header.streamId, packet.header.dataPTS, packet.header.seqNum);
        if (!firstVideoRecvd)
        {
            // still waiting for an I-frame
            if (packet.header.frameType == AM_TRANSFER_FRAME_IDR_VIDEO)
            {
                firstVideoRecvd = 1;
                audioEnabled = 1;
                lastvPts = packet.header.dataPTS;
                printf("\n\n IDR received : AudioEnabled : true | MuxingEnabled : true \n");
            }
            else
            {
                printf("... waiting for IDR frame \n\n ");
                continue;
            }
        }
    }
    else
    {
        //printf("AUDIO STREAM ID %d | PTS : %d | A.SEQ: %d \n\n", packet.header.streamId + 1, packet.header.dataPTS, packet.header.seqNum);
        if (!firstVideoRecvd)
        {
            printf("\n\n Waiting for a video frame before we start packing audio... ignoring packet\n");
            continue;
        }
        if (!audioEnabled)
        {
            printf("\n\n First video received but audio still not enabled \n\n");
            continue;
        }
        if (recvFirstAudio)
        {
            recvFirstAudio = 0;
            lastaPts = packet.header.dataPTS;
        }
    }

    //******** FFMPEG SPECIFICS
    //printf("FRAME SIZE : %d \t FRAME TYPE : %d \n", packet.header.dataLen, packet.header.frameType);
    av_init_packet(&pkt);
    //pkt.flags |= AV_PKT_FLAG_KEY;
    if (packet.header.dataType == AM_TRANSFER_PACKET_TYPE_H264)
    {
        pkt.stream_index = packet.header.streamId; // stream index 0 for video
        outStreamIndex = outputVideoStreamIndex;
        vDuration += (packet.header.dataPTS - lastvPts);
        lastvPts = packet.header.dataPTS;
    }
    else if (packet.header.dataType == AM_TRANSFER_PACKET_TYPE_AAC)
    {
        pkt.stream_index = packet.header.streamId + 1; // stream index 1 for audio
        outStreamIndex = outputAudioStreamIndex;
        aDuration += (packet.header.dataPTS - lastaPts);
        lastaPts = packet.header.dataPTS;
    }

    pkt.data = (uint8_t *)packet.data; //raw_data;
    pkt.size = packet.header.dataLen;
    pkt.pts = pkt.dts = packet.header.dataPTS;
    //pkt.duration = 24000; // 24000 assumed based on observation

    // duration calculation
    if (packet.header.dataType == AM_TRANSFER_PACKET_TYPE_H264)
    {
        pkt.duration = vDuration;
    }
    else if (packet.header.dataType == AM_TRANSFER_PACKET_TYPE_AAC)
    {
        pkt.duration = aDuration;
    }

    in_stream = ifmt_ctx->streams[pkt.stream_index];
    out_stream = ofmt_ctx->streams[outStreamIndex];
    cn = out_stream->codec;

    if (packet.header.dataType == AM_TRANSFER_PACKET_TYPE_AAC)
        av_bitstream_filter_filter(aacbsfc, in_stream->codec, NULL,
                                   &pkt.data, &pkt.size, pkt.data, pkt.size, 0);
    //if (packet.header.dataType == AM_TRANSFER_PACKET_TYPE_H264)
    //    av_bitstream_filter_filter(h264bsfc, in_stream->codec, NULL, &pkt.data, &pkt.size, pkt.data, pkt.size, 0);

    if (packet.header.dataType == AM_TRANSFER_PACKET_TYPE_H264)
    {
        // commented on Tuesday
        av_packet_rescale_ts(&pkt, cn->time_base, out_stream->time_base);
        pkt.pts = av_rescale_q_rnd(pkt.pts, in_stream->time_base, out_stream->time_base,
                                   AV_ROUND_NEAR_INF | AV_ROUND_PASS_MINMAX);
        pkt.dts = av_rescale_q_rnd(pkt.dts, in_stream->time_base, out_stream->time_base,
                                   AV_ROUND_NEAR_INF | AV_ROUND_PASS_MINMAX);
        //printf("Pkt Duration before scaling: %d \n ", pkt.duration);
        pkt.duration = av_rescale_q(pkt.duration, in_stream->time_base, out_stream->time_base);
        //printf("Pkt Duration after scaling: %d \n ", pkt.duration);
    }

    // enabled on Tuesday
    pkt.pos = -1;
    pkt.stream_index = outStreamIndex;

    // doxygen suggests using av_write_frame if I am taking care of interleaving
    ret = av_interleaved_write_frame(ofmt_ctx, &pkt);
    //ret = av_write_frame(ofmt_ctx, &pkt);
    if (ret < 0)
    {
        fprintf(stderr, "Error muxing packet\n");
        continue;
    }
    av_free_packet(&pkt);
}

Notice that I am not setting pkt.flags. I am not sure what I should set it to, or whether it would matter. I do not set it when muxing just the video into FLV, nor when muxing both audio and video.
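Two things stand out in the code above: only video timestamps are rescaled to the output time base, and vDuration/aDuration accumulate across packets, so pkt.duration grows monotonically instead of holding the per-packet delta. A hedged sketch of the usual per-packet handling follows; it reuses the question's hardware constants, and lastPtsForThisStream is a hypothetical stand-in for the matching lastvPts/lastaPts variable:

/* Sketch only, not the asker's code: mark IDR frames as keyframes and
 * rescale timestamps for BOTH streams, using the per-packet PTS delta
 * as the duration rather than an accumulated total. */
if (packet.header.dataType == AM_TRANSFER_PACKET_TYPE_H264 &&
    packet.header.frameType == AM_TRANSFER_FRAME_IDR_VIDEO)
    pkt.flags |= AV_PKT_FLAG_KEY;

pkt.duration = packet.header.dataPTS - lastPtsForThisStream; /* delta, not sum */

pkt.pts = av_rescale_q_rnd(pkt.pts, in_stream->time_base, out_stream->time_base,
                           AV_ROUND_NEAR_INF | AV_ROUND_PASS_MINMAX);
pkt.dts = av_rescale_q_rnd(pkt.dts, in_stream->time_base, out_stream->time_base,
                           AV_ROUND_NEAR_INF | AV_ROUND_PASS_MINMAX);
pkt.duration = av_rescale_q(pkt.duration, in_stream->time_base, out_stream->time_base);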
- lavc: external hardware frame pool initialization
19 October 2017, by wm4
This adds a new API, which allows the API user to query the required
AVHWFramesContext parameters. This also reduces code duplication across
the hwaccels by introducing ff_decode_get_hw_frames_ctx(), which uses
the new API function. It takes care of initializing the hw_frames_ctx
if needed, and does additional error handling and API usage checking.
Support for VDA and Cuvid is missing.
Signed-off-by: Anton Khirnov <anton@khirnov.net>
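The new public entry point this commit adds is avcodec_get_hw_frames_parameters(). A hedged sketch of how an API user might drive it from the get_format callback; device_ref is assumed to have been created earlier with av_hwdevice_ctx_create(), and VAAPI is picked only as an example:

#include <libavcodec/avcodec.h>
#include <libavutil/hwcontext.h>

static AVBufferRef *device_ref; /* assumed: created via av_hwdevice_ctx_create() */

static enum AVPixelFormat get_hw_format(AVCodecContext *avctx,
                                        const enum AVPixelFormat *fmts)
{
    (void)fmts; /* a real callback would scan this list for AV_PIX_FMT_VAAPI */
    AVBufferRef *frames_ref = NULL;

    /* query the frame pool parameters the decoder requires */
    if (avcodec_get_hw_frames_parameters(avctx, device_ref,
                                         AV_PIX_FMT_VAAPI, &frames_ref) < 0)
        return AV_PIX_FMT_NONE;

    AVHWFramesContext *fc = (AVHWFramesContext *)frames_ref->data;
    fc->initial_pool_size += 8; /* e.g. reserve extra surfaces for filtering */

    if (av_hwframe_ctx_init(frames_ref) < 0) {
        av_buffer_unref(&frames_ref);
        return AV_PIX_FMT_NONE;
    }
    avctx->hw_frames_ctx = frames_ref; /* hand the initialized pool to lavc */
    return AV_PIX_FMT_VAAPI;
}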
Changed files:
- doc/APIchanges
- libavcodec/avcodec.h
- libavcodec/decode.c
- libavcodec/decode.h
- libavcodec/dxva2.c
- libavcodec/dxva2_h264.c
- libavcodec/dxva2_hevc.c
- libavcodec/dxva2_internal.h
- libavcodec/dxva2_mpeg2.c
- libavcodec/dxva2_vc1.c
- libavcodec/vaapi_decode.c
- libavcodec/vaapi_decode.h
- libavcodec/vaapi_h264.c
- libavcodec/vaapi_hevc.c
- libavcodec/vaapi_mpeg2.c
- libavcodec/vaapi_mpeg4.c
- libavcodec/vaapi_vc1.c
- libavcodec/vaapi_vp8.c
- libavcodec/vdpau.c
- libavcodec/vdpau_h264.c
- libavcodec/vdpau_hevc.c
- libavcodec/vdpau_internal.h
- libavcodec/vdpau_mpeg12.c
- libavcodec/vdpau_mpeg4.c
- libavcodec/vdpau_vc1.c
- libavcodec/version.h
- How Piwik uses Travis CI to deliver a reliable analytics platform to the community
26 May 2014, by Matthieu Aubry (Development, Meta). In this post we will explain how the Piwik project uses continuous integration to deliver a quality software platform to tens of thousands of users worldwide. Read this post if you are interested in the Piwik project, quality assurance or automated testing.
Why do we care about tests?
Continuous integration brings us agility and peace of mind. From the very beginning of the Piwik project, it was clear to us that writing and maintaining automated tests was a necessity in order to create a successful open source software platform.
Over the years we have invested a lot of time into writing and maintaining our test suites. This work has paid off in so many ways! The Piwik platform has fewer bugs and fewer regressions, and we are able to release new minor and major versions frequently.
Which parts of the Piwik software are automatically tested?
- Piwik back-end in PHP5: we use PHPUnit to write and run our PHP tests: unit tests, integration tests, and plugin tests.
- piwik.js tracker: the JS tracker is included in all websites that use Piwik. For this reason, it is critical that the piwik.js JavaScript tracker always works without any issue or regression. Our JavaScript tracker tests include both unit and integration tests.
- Piwik front-end: more recently we have started to write JavaScript tests for the user interface, which is partially written in AngularJS.
- Piwik front-end screenshot tests: after each change to Piwik, more than 150 different screenshots are automatically taken. For example, we take screenshots of each step of the 8-step installation process, and we take screenshots of the password reset workflow, etc. Each of these screenshots is then compared pixel by pixel with the "expected" screenshot, so we can automatically detect whether the last code change has introduced an undesired visual change (see the sketch after this list). Learn more about Piwik screenshot tests.
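Piwik's own comparison runs inside its PHP/JS test harness; purely to illustrate the pixel-by-pixel idea described above, here is a minimal hypothetical C sketch over two same-size RGB24 buffers (all names are illustrative, not Piwik code):

#include <stddef.h>
#include <stdint.h>

/* Hypothetical sketch: count mismatching pixels between an "expected"
 * and an "actual" screenshot, both width*height RGB24 buffers. */
static size_t count_pixel_diffs(const uint8_t *expected, const uint8_t *actual,
                                size_t width, size_t height)
{
    size_t diffs = 0;
    for (size_t i = 0; i < width * height; i++) {
        const uint8_t *e = expected + 3 * i, *a = actual + 3 * i;
        if (e[0] != a[0] || e[1] != a[1] || e[2] != a[2])
            diffs++; /* any channel mismatch flags the pixel */
    }
    return diffs; /* zero means no undesired visual change */
}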
How often do we run the tests?
The tests are executed by Travis CI after each change to the Piwik source code; on average, our full test suite runs 20 times per day. Whenever a Piwik developer pushes code to GitHub, or a community member opens a pull request, Travis CI automatically runs the tests. If some of the automated tests start failing after a change, the developer who made the change is notified by email.
Should I use Travis CI?
Over the last six years we have used various continuous integration servers such as Bamboo, Hudson and Jenkins, and have found that Travis CI is the ideal continuous integration service for open source projects hosted on GitHub. Travis CI is free for open source projects, and the Travis CI team is very friendly and responsive! If you work on commercial closed-source software, you can also use Travis by signing up for Travis CI Pro.
Summary
Tests make the Piwik analytics platform better. Writing tests makes Piwik contributors better developers. We save a lot of time and effort, and we are not afraid of change!
Here is the current status of our builds: the main build and the screenshot tests build.
PS: If you are a developer looking for a challenge, Piwik is hiring a software developer to join our engineering team in New Zealand or Poland.