
Media (1)
-
The conservation of net art in museums. The strategies at work
26 May 2011
Updated: July 2013
Language: French
Type: Text
Other articles (27)
-
The plugin: Podcasts.
14 July 2010 — The problem of podcasting is once again one that reveals the state of standardization of data transport on the Internet.
Two interesting formats exist: the one developed by Apple, strongly geared towards iTunes, whose spec is available here; and the "Media RSS Module" format, which is more "open" and is notably backed by Yahoo and the Miro software.
File types supported in the feeds
Apple's format only allows the following types in its feeds: .mp3 audio/mpeg, .m4a audio/x-m4a, .mp4 (...)
-
Support for all media types
10 April 2011 — Unlike many modern software packages and document-sharing platforms, MediaSPIP aims to handle as many different document formats as possible, whether they are: images (png, gif, jpg, bmp and others); audio (MP3, Ogg, Wav and others); video (Avi, MP4, Ogv, mpg, mov, wmv and others); or textual content, code and more (OpenOffice, Microsoft Office (spreadsheet, presentation), web (html, css), LaTeX, Google Earth) (...)
-
Accepted formats
28 January 2010 — The following commands give information about the formats and codecs handled by the local ffmpeg installation:
ffmpeg -codecs
ffmpeg -formats
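To check whether one particular codec is available, this listing can simply be filtered; for example (a small illustration assuming a Unix-like shell, with theora as an arbitrary codec name):
ffmpeg -codecs 2>&1 | grep -i theora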
Accepted input video formats
This list is not exhaustive; it highlights the main formats in use: h264: H.264 / AVC / MPEG-4 AVC / MPEG-4 part 10; m4v: raw MPEG-4 video format; flv: Flash Video (FLV) / Sorenson Spark / Sorenson H.263; Theora; wmv:
Possible output video formats
To begin with, we (...)
On other sites (6020)
-
FFmpeg encoding 16-bit video data results in 10-bit
12 March 2023, by Jl arto — I want to compress a depth map that has 16 bits of information per pixel. In general, such depth maps can be stored in different ways, e.g. p016le, gray16le, yuv420p16le, yuv444p16le, ..., but for simplicity, let's assume the depth map is a yuv420p16le (where the Y channel contains the depth).


For some reason, when encoding with hevc_nvenc (I use an NVIDIA GTX 1660 Ti GPU), ffmpeg (the command-line tool) changes the pixel format to a 10- or 12-bit variant (p010le, gray12le, yuv420p10le, yuv444p12le, ...), but I would like to keep the full 16 bits, since this affects the quality of the stored depth.

For example:


ffmpeg.exe -s:v 1920x1080 -r 30 -pix_fmt yuv420p16le -i depth_yuv420p16le.yuv -c:v hevc_nvenc -pix_fmt yuv444p16le output.mp4



If I use ffprobe on output.mp4, it tells me that the underlying pixel format is actually yuv444p10le. (Decoding and looking at the raw pixel data, I can confirm that the precision has decreased from 16 bits to 10 bits.)
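For reference, the pixel format the encoder actually produced can be read out with a one-liner like the following (a hedged example using standard ffprobe options):

ffprobe -v error -select_streams v:0 -show_entries stream=pix_fmt -of default=noprint_wrappers=1:nokey=1 output.mp4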


I hope 16-bit compression is possible, since, according to


ffmpeg -h encoder=hevc_nvenc



the supported pixel formats are:


hevc_nvenc: yuv420p nv12 p010le yuv444p p016le yuv444p16le bgr0 rgb0 cuda d3d11



But p016le results in a p010le output, and yuv444p16le in yuv444p10le.


Does anyone know where the problem could lie? Should I re-install ffmpeg (version 4.3.2-2021-02-27-essentials_build-www.gyan.dev)? Is it because Windows 10 has limited encoding/decoding capabilities? Will buying the HEVC Video Extensions help solve this problem?


Additional info: using libx265 does not look like it will work for this purpose either, since its supported pixel formats are:

libx265: yuv420p yuvj420p yuv422p yuvj422p yuv444p yuvj444p gbrp yuv420p10le yuv422p10le yuv444p10le gbrp10le yuv420p12le yuv422p12le yuv444p12le gbrp12le gray gray10le gray12le
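As an aside, and only a rough, untested sketch rather than an answer to the hevc_nvenc question: the lossless FFV1 encoder does advertise 16-bit pixel formats (for example yuv420p16le and gray16le), so something along these lines should keep the full depth precision, at the cost of much larger files:

ffmpeg -s:v 1920x1080 -r 30 -pix_fmt yuv420p16le -i depth_yuv420p16le.yuv -c:v ffv1 -level 3 output.mkv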



Any help would be greatly appreciated.


-
Stream a USB webcam via HTTP using VLC without transcoding, using the MJPEG signal from the webcam
10 November 2020, by Peter Nußreiner — I want to stream my (Microsoft live) USB webcam from a Raspberry Pi using VLC.
While it works perfectly fine with transcoding, my Raspberry Pi does not have enough power for that.


So I checked the output formats of my webcam using:


ffmpeg -f v4l2 -list_formats all -i /dev/video0



which gives me the following output:


[video4linux2,v4l2 @ 0x559d5aee36c0] Raw : yuyv422 : YUYV 4:2:2 : 640x480 1280x720 960x544 800x448 640x360 424x240 352x288 320x240 800x600 176x144 160x120 1280x800
[video4linux2,v4l2 @ 0x559d5aee36c0] Compressed: mjpeg : Motion-JPEG : 640x480 1280x720 960x544 800x448 640x360 800x600 416x240 352x288 176x144 320x240 160x120
/dev/video0: Immediate exit requested
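As a side note, one quick way to confirm that the MJPEG path works at the desired resolution is a short stream-copy recording with ffmpeg (a hedged example; the duration and output file name are arbitrary):

ffmpeg -f v4l2 -input_format mjpeg -video_size 1280x720 -i /dev/video0 -t 5 -c:v copy test.mkv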



When running my streaming command:


cvlc v4l2:///dev/video0:chroma=h264:width=1280:height=720 --sout '#standard{access=http,mux=ts,dst=192.168.178.27:8080,name=stream,mime=video/ts}' -vvv



I get the following error:


mux_ts mux warning: rejecting stream with unsupported codec YUY2
main mux error: cannot add this stream
main decoder error: cannot create packetizer output (YUY2)



along with many other lines of information.
As far as I can interpret the error, VLC wants to use the YUY2 raw signal of my webcam.
Is there any way I can force VLC to use the MJPEG output of my camera?


I have already tried 'chroma=mjpg' without success.
Transcoding is not an option, as my Raspberry Pi is nowhere near powerful enough.


I have also read about using ffmpeg for streaming; maybe someone has an alternative solution, since I am very new to this whole topic of websites and streaming.
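For what it is worth, here is a rough, untested sketch of that ffmpeg route: the camera's MJPEG stream is copied without transcoding and served over HTTP as multipart JPEG (the port, path, resolution and frame rate are placeholders):

ffmpeg -f v4l2 -input_format mjpeg -video_size 1280x720 -framerate 10 -i /dev/video0 -c:v copy -f mpjpeg -listen 1 http://0.0.0.0:8080/cam.mjpg

The mpjpeg muxer emits a multipart/x-mixed-replace stream that browsers can show directly in an img tag; note that ffmpeg's built-in HTTP output serves a single client at a time, so a relay or web server would still be needed for a public page.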


It will (or should) be used as a live webcam for a windsurfing spot.
Therefore I need fluid video with at least 10 fps to judge the conditions. A delay of a minute or so is not a problem.
The basic idea was to stream the video to a web server and embed it in a website. I have a good internet connection at the spot.


But for now I want to try it locally on my PC.


-
Parsing ffmpeg output in a batch script
3 July 2016, by vlad2005 — I am totally unfamiliar with scripts on Windows, but I am forced to use one. I would like someone to help me with the following problem: I want to process the output of an ffmpeg command and save information about how to access a webcam for later use.
More precisely, the command is the following:
ffmpeg -stats -hide_banner -list_devices true -f dshow -i dummy
and the output looks like this:
[dshow @ 02cec400] DirectShow video devices (some may be both video and audio devices)
[dshow @ 02cec400] "Microsoft LifeCam Studio"
[dshow @ 02cec400] Alternative name "@device_pnp_\\?\usb#vid_045e&pid_0772&mi_00#6&2a15e69b&0&0000#{65e8773d-8f56-11d0-a3b9-00a0c9223196}\global"
[dshow @ 02cec400] DirectShow audio devices
[dshow @ 02cec400] "Desktop Microphone (3- Studio -"
[dshow @ 02cec400] Alternative name "@device_cm_{33D9A762-90C8-11D0-BD43-00A0C911CE86}\Desktop Microphone (3- Studio -"
[dshow @ 02cec400] "Line In (High Definition Audio "
[dshow @ 02cec400] Alternative name "@device_cm_{33D9A762-90C8-11D0-BD43-00A0C911CE86}\Line In (High Definition Audio "
[dshow @ 02cec400] "Microphone (High Definition Aud"
[dshow @ 02cec400] Alternative name "@device_cm_{33D9A762-90C8-11D0-BD43-00A0C911CE86}\Microphone (High Definition Aud"
Typically, the first two occurrences of "Alternative name" in the DirectShow output correspond to the video and the audio device, so for simplicity I want these two pieces of information saved in two variables.
In this example they are:
@device_pnp_\\?\usb#vid_045e&pid_0772&mi_00#6&2a15e69b&0&0000#{65e8773d-8f56-11d0-a3b9-00a0c9223196}\global
and
@device_cm_{33D9A762-90C8-11D0-BD43-00A0C911CE86}\Desktop Microphone (3- Studio -
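A minimal batch sketch of one way to do this (untested; it assumes ffmpeg is on the PATH, relies on the device list being printed to stderr, and uses the quote character as the for /f delimiter to pull out the quoted names):

@echo off
setlocal EnableDelayedExpansion
set "VIDEO_DEV="
set "AUDIO_DEV="
rem The device list goes to stderr, so redirect it and keep only the "Alternative name" lines.
for /f tokens^=2^ delims^=^" %%A in ('ffmpeg -hide_banner -list_devices true -f dshow -i dummy 2^>^&1 ^| findstr /c:"Alternative name"') do (
    rem The first match is taken as the video device, the second as the audio device.
    if not defined VIDEO_DEV (set "VIDEO_DEV=%%A") else if not defined AUDIO_DEV (set "AUDIO_DEV=%%A")
)
rem Delayed expansion keeps the & characters in the device strings from being interpreted.
echo Video device: !VIDEO_DEV!
echo Audio device: !AUDIO_DEV!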
Can someone more experienced help me with this task?
Thanks in advance!