
Media (1)
-
The Great Big Beautiful Tomorrow
28 October 2011, by
Updated: October 2011
Language: English
Type: Text
Other articles (42)
-
Customizing by adding your logo, banner or background image
5 September 2013, by
Some themes support three customization elements: adding a logo; adding a banner; adding a background image.
-
Writing a news item
21 June 2013, by
Present changes to your MédiaSPIP, or news about your projects on your MédiaSPIP, using the news section.
In MédiaSPIP's default spipeo theme, news items are displayed at the bottom of the main page, below the editorials.
You can customize the form used to create a news item.
News item creation form: for a document of type news, the fields offered by default are: Publication date (customize the publication date) (...)
-
Publishing on MédiaSpip
13 June 2013
Can I post content from an iPad tablet?
Yes, if your installed Médiaspip is at version 0.2 or higher. If necessary, contact your MédiaSpip administrator to find out.
On other sites (6075)
-
Synchronize RTSP with computer time
17 December 2012, by Dídac Pérez
I am successfully using libav to receive the video stream from an RTSP network source. The point is that I need to synchronize my computer's time with the video capture, meaning that I need to know which datetime on my computer corresponds to the first frame (pts = 0). My API calls are the following:
av_register_all()
avcodec_register_all()
avformat_network_init()
avformat_open_input()
avformat_find_stream_info()
av_read_play()
loop
av_init_packet()
av_read_frame()
[...]
av_free_packet()
end loop
With the calls above, I successfully read frames, but I need to know the exact absolute datetime that corresponds to the first frame, since it has a pts of 0. Maybe I can use a time() function or GetSystemTime (I am using Windows) between two of the calls above, but I do not really know how libav works internally. I will appreciate your help,
Kind regards,
Dídac Pérez
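One minimal sketch of an answer, under the assumption that fmt_ctx and video_index come from the setup calls listed above: record the wall clock (here with av_gettime(); GetSystemTime would serve the same purpose on Windows) when the first video packet arrives, remember its pts, and rescale later pts values into elapsed microseconds. Network jitter and receive buffering mean the mapping is only approximate.
#include <libavformat/avformat.h>
#include <libavutil/time.h>

/* Sketch: after avformat_open_input()/avformat_find_stream_info(),
   read packets and map their pts to an absolute wall-clock time.
   fmt_ctx and video_index are assumed to come from the caller's setup. */
static void read_with_wallclock(AVFormatContext *fmt_ctx, int video_index)
{
    AVRational tb = fmt_ctx->streams[video_index]->time_base;
    int64_t first_pts = AV_NOPTS_VALUE; /* pts of the first video packet   */
    int64_t wall0_us = 0;               /* wall clock when it arrived (us) */
    AVPacket pkt;

    for (;;) {
        av_init_packet(&pkt);
        if (av_read_frame(fmt_ctx, &pkt) < 0)
            break;
        if (pkt.stream_index == video_index && pkt.pts != AV_NOPTS_VALUE) {
            if (first_pts == AV_NOPTS_VALUE) {
                first_pts = pkt.pts;
                wall0_us = av_gettime(); /* microseconds since the Unix epoch */
            }
            /* absolute time of this frame = reference wall clock
               + stream time elapsed since the first frame */
            int64_t abs_us = wall0_us +
                av_rescale_q(pkt.pts - first_pts, tb, AV_TIME_BASE_Q);
            (void)abs_us; /* convert to a local datetime as needed */
        }
        /* ... decode / handle the packet here ... */
        av_free_packet(&pkt);
    }
}
For tighter synchronization, the RTCP sender reports of an RTSP session pair an NTP wall-clock timestamp with an RTP timestamp, and are the protocol's own anchor to absolute time; if your libavformat version exposes AVFormatContext.start_time_realtime, it is filled from that information when available.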
-
Controlling end time in video player via AVPacket information / setting pts/dts properly
26 January 2017, by SyntheticGio
I'm currently working in C/C++ using the FFmpeg core libraries (libavcodec, etc.). I'm capturing a stream and writing it in chunks to different files. So imagine the stream is 5 minutes long and I'm writing five files of one minute each. I'm able to do this successfully.
Currently, each file after the first has a start time equal to the time it would have had in the un-chunked stream. So the second video file starts at 1 minute, the third starts at 2 minutes, etc. This was inadvertent, but it turns out to be beneficial in my particular use case.
VLC and the other video players I've tried report this start time 'properly', but the end time shows as the duration (not start time + duration). My gut feeling is that the player simply assumes all videos start at 0 and shows the length as the 'end time', but I don't actually know this. So I'd like to know whether there is any way to set the AVPacket information so that, for example, the player for the third video would show it starting at 2 minutes and ending at 3 minutes (for a one-minute video).
As an alternative, if I wanted to do this the traditional way (reset each chunk to start at time 0), I assume I'd normalize AVPacket.pts and AVPacket.dts by subtracting the values of the final packet in the previous chunk? This strategy seems like it would work for pts, but I'm less sure about dts. I feel like it would generally work for dts, but there might be times when it fails, so I'd like to know whether this is a safe method (or if there is a better method I should use in this case).
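A minimal sketch of the zero-based variant, assuming the packet is about to be written into the current chunk and that the input and output streams share a time base (otherwise av_packet_rescale_ts() would be applied as well): take the offset from the first packet of the new chunk (its dts, which is monotonic in decode order) rather than from the last packet of the previous chunk, and subtract that same offset from both fields.
#include <libavcodec/avcodec.h>

/* Sketch: rebase timestamps so the current chunk starts near 0.
   'offset' should be reset to AV_NOPTS_VALUE each time a new chunk
   (output file) is opened; pkt is about to be written to that chunk. */
static void rebase_packet(AVPacket *pkt, int64_t *offset)
{
    if (*offset == AV_NOPTS_VALUE)
        *offset = (pkt->dts != AV_NOPTS_VALUE) ? pkt->dts : pkt->pts;

    /* subtract the SAME offset from both fields so that pts - dts
       (the reordering delay for B-frames) is preserved */
    if (pkt->pts != AV_NOPTS_VALUE)
        pkt->pts -= *offset;
    if (pkt->dts != AV_NOPTS_VALUE)
        pkt->dts -= *offset;
}
Using one shared offset keeps pts - dts unchanged, so the B-frame reordering delay survives; adjusting pts and dts with independently chosen values is what tends to break dts, because decode timestamps must remain monotonically increasing and must never exceed the matching pts.
-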
ffmpeg time-lapse from raw .NEF photos
14 June 2020, by Emil Terman
I have about 3000 .NEF photos on the server, each weighing about 80MB.
I need to make a time-lapse out of them. Since it's just for a demo, it's fine if I reduce the quality, so compressing is fine.
I've tried different things, but I couldn't really find a way to make a time-lapse out of .nef files.
But I did find a way to make a time-lapse out of .jpg files:
ffmpeg -r 24 -f concat -i ordered_list_of_photos.txt -s hd1080 -vcodec libx264 out.mp4
This works perfectly, but it seems it only works with .jpgs.
The option -f concat -i ordered_list_of_photos.txt means: read the files listed in ordered_list_of_photos.txt. That file contains lines like:
file 'jpgs/file1.jpg'
file 'jpgs/file2.jpg'
...
Do you have any suggestions on how to do this? I'm pretty sure it has something to do with the rawvideo demuxer, but I can't figure it out on my own.
Converting the .nef files to .jpg seems like an option, but I can't get ufraw-batch to work: it throws segmentation faults after the first conversion. I also don't have desktop access to the computer; I'm doing all of this over ssh (so GUI apps won't work, I think).
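The rawvideo demuxer is for headerless video frames rather than camera raw stills, so it will not help here. One route that works over ssh, sketched below, is to develop the .NEF files into an intermediate format ffmpeg can read and keep the same concat approach; it assumes dcraw is installed, and the file names are placeholders.
# Develop each NEF into a PPM that ffmpeg can read
# (dcraw -c writes the developed image to stdout, -w uses the camera's
# recorded white balance).
for f in *.NEF; do
    dcraw -c -w "$f" > "${f%.NEF}.ppm"
done

# List the resulting .ppm files in ordered_list_of_photos.txt
# (file 'ppms/file1.ppm', ...) and reuse the same command:
ffmpeg -r 24 -f concat -i ordered_list_of_photos.txt -s hd1080 -vcodec libx264 out.mp4
Converting the intermediate files to .jpg first (for example with ImageMagick, which handles PPM) would keep the temporary files much smaller, at the cost of one more step.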