
Other articles (100)
-
Personalizing by adding your logo, banner, or background image
5 September 2013: Some themes support three customization elements: adding a logo; adding a banner; adding a background image.
-
Writing a news item
21 June 2013: Present the changes in your MediaSPIP, or the news of your projects, on your MediaSPIP using the news section.
In the default MediaSPIP theme, spipeo, news items are displayed at the bottom of the main page, below the editorials.
You can customize the form for creating a news item.
News item creation form: for a document of type "news item", the default fields are: Publication date (customize the publication date) (...)
-
Publishing on MediaSPIP
13 June 2013: Can I post content from an iPad tablet?
Yes, if your installed MediaSPIP is at version 0.2 or higher. If needed, contact your MediaSPIP administrator to find out.
On other sites (7307)
-
Confusion about PTS in video files and media streams
11 November 2014, by user2452253: Is it possible for the PTS of a particular frame in a file to differ from the PTS of the same frame when the same file is being streamed?
When I read a frame using av_read_frame I store the video stream in an AVStream. After I decode the frame with avcodec_decode_video2, I store the time stamp of that frame in an int64_t using av_frame_get_best_effort_timestamp. Now if the program is getting its input from a file I get a different timestamp from when I stream the input (from the same file) to the program.
To change the input type I simply change the argv argument from "/path/to/file.mp4" to something like "udp://localhost:1234", then I stream the file with ffmpeg on the command line: "ffmpeg -re -i /path/to/file.mp4 -f mpegts udp://localhost:1234". Could it be that the "-f mpegts" argument changes some characteristics of the media?
Below is my code (simplified). By reading the ffmpeg mailing list archives I realized that the time_base that I’m looking for is in the AVStream and not the AVCodecContext. Instead of using av_frame_get_best_effort_timestamp I have also tried using the packet.pts but the results don’t change.
I need the time stamps to have a notion of frame number in a streaming video that is being received.
I would really appreciate any sort of help.

//..
//argv[1] = "/file.mp4";
argv[1] = "udp://localhost:7777";
// define AVFormatContext, AVFrame, etc.
// register av, avcodec, avformat_network_init(), etc.
avformat_open_input(&pFormatCtx, argv[1], NULL, NULL);
avformat_find_stream_info(pFormatCtx, NULL);
// find the video stream...
// pointer to the codec context...
// open codec...
pFrame = av_frame_alloc();
while (av_read_frame(pFormatCtx, &packet) >= 0) {
    AVStream *stream = pFormatCtx->streams[videoStream];
    if (packet.stream_index == videoStream) {
        avcodec_decode_video2(pCodecCtx, pFrame, &frameFinished, &packet);
        if (frameFinished) {
            int64_t perts = av_frame_get_best_effort_timestamp(pFrame);
            if (isMyFrame(pFrame)) {
                cout << perts * av_q2d(stream->time_base) << "\n";
            }
        }
    }
    // free allocated space
}
//..
-
Install ffmpeg on Heroku
5 October 2020, by islalobo: I have a little node.js app on Heroku and I'm trying to use ffmpeg to convert some audio that has been recorded.


To do that I need to install the ffmpeg buildpack.


When I add the buildpack and deploy, I don't get any errors in the logs, but the application doesn't load and gives me a nondescript error.


Buildpacks I've tried:

- https://elements.heroku.com/buildpacks/jonathanong/heroku-buildpack-ffmpeg-latest
- https://elements.heroku.com/buildpacks/dwnld/heroku-buildpack-ffmpeg

Code to convert audio


try {
    let process = new ffmpeg(`./public/messages/${req.body.message}`);
    process.then((audio) => {
        audio.fnExtractSoundToMP3(`/messages/${req.body.message}.mp3`, (error, file) => {
            if (!error) console.log('Audio File: ', file);
            if (error) console.log(error);
        });
    }, (error) => {
        console.log('Error: ', error);
    });
}
catch (error) {
    console.log(error.code, error.msg);
}



-
How to stream mp4 files over RTSP?
21 February 2023, by Ramil Galin: I wrote an RTSP server and now I want to stream mp4 (H264/AAC) files over it. Using FFmpeg I demux the mp4 file into audio and video streams. Then I read AVPacket packets, wrap them into RTP packets, and send them over the RTSP server.

So far I'm only trying to send the video packets. I tried to play my RTSP stream in VLC; unfortunately, VLC doesn't play anything. Then I fed my RTSP stream into ffmpeg to save it into another mp4 file and see what happens to my packets: ffmpeg reported the error "no start code is found".

I googled the issue and found that the mp4 container, unlike raw h264, doesn't put SPS/PPS info into the packets. To test this, instead of an mp4 file I read raw h264 files and sent them over my RTSP server: that worked nicely, and I could play the stream in VLC.


So, my question is: how can I get the SPS/PPS info from mp4 container files? How do I correctly pack it into RTP packets and send it over RTSP?


So far I have tried to get the SPS and PPS data using this code:


AVCodecParameters *codecpar = video_stream->codecpar;
for (int i = 0; i < codecpar->extradata_size - 4; i++)
{
    if (
        codecpar->extradata[i] == 0x00 &&
        codecpar->extradata[i + 1] == 0x00 &&
        codecpar->extradata[i + 2] == 0x00 &&
        codecpar->extradata[i + 3] == 0x01
    )
    {
        if (sps_data_ == nullptr)
        {
            sps_data_ = codecpar->extradata + i;
            sps_size_ = i > 0 ? i : codecpar->extradata_size;
        }
        else
        {
            pps_data_ = codecpar->extradata + i;
            pps_size_ = i > 0 ? i : codecpar->extradata_size - sps_size_;
            break;
        }
    }
}



But codecpar->extradata doesn't contain the needed info, so the if-condition inside the loop is always false. Any help is appreciated.