
Other articles (106)
-
APPENDIX: The plugins used specifically for the farm
5 March 2010
The central/master site of the farm needs several additional plugins, beyond those used by the channel sites, in order to work properly: the Gestion de la mutualisation plugin; the inscription3 plugin, to manage registrations and requests for the creation of a mutualisation instance as soon as users sign up; the verifier plugin, which provides a field-validation API (used by inscription3); the champs extras v2 plugin, required by inscription3 (...)
-
Adding user-specific information and other changes to author-related behaviour
12 April 2011
The simplest way to add information to authors is to install the Inscription3 plugin. It also lets you change certain user-related behaviours (see its documentation for more information).
It is also possible to add fields to authors by installing the champs extras 2 and Interface pour champs extras plugins.
-
Automatic installation script for MediaSPIP
25 April 2011
To work around installation difficulties, mainly due to server-side software dependencies, an all-in-one bash installation script was created to make this step easier on a server running a compatible Linux distribution.
To use it, you need SSH access to your server and a "root" account, which is required to install the dependencies. Contact your hosting provider if you do not have these.
The documentation on how to use the installation script (...)
On other sites (6852)
-
Stream video from bitmap frames with x264 and FFmpeg's swscale
25 March 2016, by Jones
The goal is to create a video stream from bitmap frames.
Current solution: I am streaming raw 24bpp bitmap data over TCP from my machine to a "remote" server, which distributes the frames to the clients. This works well when everything runs on my local machine.
Problem: the size of a frame is 1,440,000 bytes (800*600*3), and I need the video to run at 25 frames per second at a resolution of 800x600, which amounts to about 36 MB/s (roughly 288 Mbps) of raw data, while I only have a 2 Mbps upstream.
Approach: so after a bit of research, my approach would be to encode the bitmap frames to H.264 and stream the resulting video.
Current situation: I compiled x264 and FFmpeg (for swscale, to convert RGB to YUV) and I am able to encode 24bpp bitmap frames with x264. The last action in my test program is a call to x264_encoder_encode. The bitmap data in the test program is randomly generated for every frame.
Now to my question(s): where do I go from here? TCP, UDP, RTMP? Which data would be transferred? Do I just transmit the frame returned by x264_encoder_encode? It still has a size of 140,000 bytes, according to the return value of the function; at 25 frames per second that would be 25*140000 = 3,500,000 bytes per second, about 28 Mbps, still far more than the 2 Mbps I have available.
While I don't really know how yet, I am confident that this task is feasible. For example, here is a spreadsheet showing the bandwidth needed at various resolutions on streaming platforms like twitch.tv (using RTMP).
My current x264_param setup (from this post) looks like this (just FYI):
x264_param_default_preset(&param, "veryfast", "zerolatency");
param.i_threads = 1;
param.i_width = width;
param.i_height = height;
param.i_fps_num = fps;
param.i_fps_den = 1;
// Intra refresh:
param.i_keyint_max = fps;
param.b_intra_refresh = 1;
//Rate control:
param.rc.i_rc_method = X264_RC_CRF;
param.rc.f_rf_constant = 25;
param.rc.f_rf_constant_max = 35;
//For streaming:
param.b_repeat_headers = 1;
param.b_annexb = 1;
x264_param_apply_profile(&param, "baseline");
Thank you in advance.
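For reference, here is a minimal sketch of the conversion-and-encode step described above, combining libswscale with the x264 API. It is an illustration built on assumptions (the encode_rgb_frame wrapper, the BGR byte order, and an encoder already opened from the parameters shown), not the asker's actual test program.
#include <stdint.h>
#include <x264.h>
#include <libswscale/swscale.h>

/* Convert one 24bpp bitmap frame to YUV 4:2:0 and encode it.
   "encoder" must already have been opened with x264_encoder_open(). */
static int encode_rgb_frame(x264_t *encoder, const uint8_t *rgb_data,
                            int width, int height, int64_t pts)
{
    x264_picture_t pic_in, pic_out;
    x264_picture_alloc(&pic_in, X264_CSP_I420, width, height);
    pic_in.i_pts = pts;

    /* BGR24 is the usual byte order of Windows bitmaps; use AV_PIX_FMT_RGB24
       if the frames really are stored as R,G,B. */
    struct SwsContext *sws = sws_getContext(width, height, AV_PIX_FMT_BGR24,
                                            width, height, AV_PIX_FMT_YUV420P,
                                            SWS_FAST_BILINEAR, NULL, NULL, NULL);
    const uint8_t *const src[1] = { rgb_data };
    const int src_stride[1] = { 3 * width };   /* 3 bytes per pixel, no row padding */
    sws_scale(sws, src, src_stride, 0, height,
              pic_in.img.plane, pic_in.img.i_stride);
    sws_freeContext(sws);

    /* The NAL units returned here (an Annex B byte stream, since b_annexb = 1)
       are the data that would be packetised and sent over the network. */
    x264_nal_t *nals;
    int nal_count;
    int frame_size = x264_encoder_encode(encoder, &nals, &nal_count,
                                         &pic_in, &pic_out);
    x264_picture_free(&pic_in);
    return frame_size;   /* encoded size in bytes, or a negative value on error */
}
One note on the parameters above: CRF alone does not cap the bitrate, so if the 2 Mbps upstream is a hard limit it may be worth looking at x264's VBV settings (param.rc.i_vbv_max_bitrate and param.rc.i_vbv_buffer_size) before deciding on a transport.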
-
Playing and recording a live stream from another computer's webcam using VLC/FFmpeg [closed]
22 October 2012, by user573014
I have lately been trying to set up a video server on one machine and play the stream on a different machine. It works, but the problem is that the stream always gets stuck and jams in the middle, and it is very slow compared to the original: something like 5 seconds of delay, which is not acceptable at all!
The warning messages I get usually look something like this, on the client side (the one that jams):
[0x24d1ab0] ts demux warning: discontinuity received 0x5 instead of 0xe (pid=68)
[0x7f4340015e50] rtp demux warning: 2 packet(s) lost
reference picture missing during reorder
Missing reference picture
mmco: unref short failure
Reference 4 >= 4 (H264 - MPEG-4 AVC (part 10)) stopped
error while decoding MB 34 14, bytestream (575)
That is a picture of the stream when it is jammed, and that is what it looks like when it is running smoothly.
This is the error message I got on the server:
[0x2513820] main generic debug: auto hidding mouse
[0x2296230] main mux warning: late buffer for mux input (1840085)
Finally, here are the command lines I am using.
On the server:
vlc -vvv v4l2:///dev/video1:v4l2-width=640:v4l2-height=480 --sout '#duplicate{dst=display,dst="transcode{vcodec=h264,vb=800,ab=128}:duplicate{dst=rtp{mux=ts,dst=172.22.2.87,port=50004}'
On the client:
vlc -vvv rtp://@:50004
I thought it might be caused by VLC or by my command, so I tried different transmission protocols, with no luck. I also tried FFmpeg and got similar results and warning messages, which made me think that both of them use the same libraries on Linux.
Here is the command using FFmpeg:
ffmpeg -f video4linux2 -i /dev/video1 -vcodec libx264 -s 320x240 -pix_fmt yuv420p -vb 200000 -minrate 200000 -maxrate 200000 -bufsize 2000000 -acodec libmp3lame -ab 128k -ar 44100 -ac 2 -f mpegts udp://172.22.2.87:5544
In conclusion, I would like to find a solution for the latency of the streaming (which is very high) and for the jamming problem.
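For the latency part specifically, one commonly suggested starting point (a sketch, not something from the original post) is to add libx264's low-latency options to the same command, keeping the device and destination unchanged:
ffmpeg -f video4linux2 -i /dev/video1 -vcodec libx264 -preset ultrafast -tune zerolatency -s 320x240 -pix_fmt yuv420p -vb 200000 -minrate 200000 -maxrate 200000 -bufsize 2000000 -acodec libmp3lame -ab 128k -ar 44100 -ac 2 -f mpegts udp://172.22.2.87:5544
The -tune zerolatency option disables B-frames and frame lookahead in libx264, which are typical sources of encoder-side delay; a large -bufsize can also add buffering delay on the rate-control side.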
I appreciate anyone's input, thank you.
-
Find the library calls from an FFmpeg command line
6 October 2013, by Budius
I'm trying to create an Android app that does video editing, using FFmpeg for the task. I have already successfully compiled FFmpeg as libraries (libavcodec, libavformat, etc.) and included them in the Android project.
Note that this build does not contain ffmpeg.c, which can be invoked as a command line, and the problem is that I only know the command lines to use for the different things I want to accomplish.
So the question is: from my Linux machine, how would I call ffmpeg's main() in a "debug mode" to follow, line by line, what is being called in those libraries, so I can write methods that mimic what I need to get done? (Currently I only have Android Studio installed, but I'm open to installing whatever IDE people might suggest.)
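As a rough illustration of where such tracing leads, here is a minimal sketch (a hypothetical example, not the asker's code) of what a bare "ffmpeg -i input.mp4" does through the libraries: open the input with libavformat, read the stream information, and dump it. The file name is a placeholder, and FFmpeg builds from that era also need av_register_all() before any other call.
#include <libavformat/avformat.h>

int main(void)
{
    AVFormatContext *fmt_ctx = NULL;

    av_register_all();                    /* required on pre-4.0 FFmpeg builds */

    if (avformat_open_input(&fmt_ctx, "input.mp4", NULL, NULL) < 0)
        return 1;                         /* could not open or recognise the file */
    if (avformat_find_stream_info(fmt_ctx, NULL) < 0) {
        avformat_close_input(&fmt_ctx);
        return 1;
    }

    /* Prints the same container/stream summary that "ffmpeg -i" shows */
    av_dump_format(fmt_ctx, 0, "input.mp4", 0);

    avformat_close_input(&fmt_ctx);
    return 0;
}
To follow ffmpeg.c itself line by line, a common approach is to build FFmpeg with --disable-optimizations and --disable-stripping and then run the ffmpeg binary under gdb with a breakpoint on main().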