
Other articles (77)
-
Support for all media types
10 April 2011
Unlike many modern document-sharing software packages and platforms, MediaSPIP aims to handle as many different document formats as possible, whether they are: images (png, gif, jpg, bmp, and others...); audio (MP3, Ogg, Wav, and others...); video (Avi, MP4, Ogv, mpg, mov, wmv, and others...); or textual content, code, and more (OpenOffice, Microsoft Office (spreadsheet, presentation), web (HTML, CSS), LaTeX, Google Earth) (...)
-
MediaSPIP v0.2
21 June 2013
MediaSPIP 0.2 is the first stable version of MediaSPIP.
Its official release date is 21 June 2013, and it is announced here.
The zip file provided here contains only the MediaSPIP sources, in the standalone version.
As with the previous version, all software dependencies must be installed manually on the server.
If you want to use this archive for a farm-mode installation, further modifications are also required (...)
-
Making files available
14 April 2011
By default, upon initialization, MediaSPIP does not allow visitors to download files, whether they are originals or the result of their transformation or encoding. It only allows them to be viewed.
However, it is possible, and easy, to give visitors access to these documents, in several different forms.
All of this happens in the skeleton's configuration page. You need to go to the channel's administration area and choose, in the navigation (...)
On other sites (6007)
-
PyAV: force new framerate while remuxing stream?
7 June 2019, by ToxicFrog
I have a Python program that receives a sequence of H264 video frames over the network, which I want to display and, optionally, record. The camera records at 30 FPS and sends frames as fast as it can, which isn't consistently 30 FPS due to changing network conditions; sometimes it falls behind and then catches up, and rarely it drops frames entirely.
The "display" part is easy ; I don’t need to care about timing or stream metadata, just display the frames as fast as they arrive :
input = av.open(get_video_stream())
for packet in input.demux(video=0):
for frame in packet.decode():
# A bunch of numpy and pygame code here to convert the frame to RGB
# row-major and blit it to the screenThe "record" part looks like it should be easy :
input = av.open(get_video_stream())
output = av.open(filename, 'w')
output.add_stream(template=input.streams[0])
for packet in input.demux(video=0):
    for frame in packet.decode():
        # ...display code...
    packet.stream = output.streams[0]
    output.mux_one(packet)
output.close()

And indeed this produces a valid MP4 file containing all the frames, and if I play it back with mplayer -fps 30 it works fine. But that -fps 30 is absolutely required:

$ ffprobe output.mp4
Stream #0:0(und): Video: h264 (Main) (avc1 / 0x31637661), yuv420p, 960x720,
1277664 kb/s, 12800 fps, 12800 tbr, 12800 tbn, 25600 tbc (default)

Note that 12,800 frames/second. It should look something like this (produced by calling mencoder -fps 30 and piping the frames into it):

$ ffprobe mencoder_test.mp4
Stream #0:0(und): Video: h264 (Main) (avc1 / 0x31637661), yuv420p, 960x720,
2998 kb/s, 30 fps, 30 tbr, 90k tbn, 180k tbc (default)

Inspecting the packets and frames I get from the input stream, I see:
stream: time_base=1/1200000
codec: framerate=25 time_base=1/50
packet: dts=None pts=None duration=48000 time_base=1/1200000
frame: dts=None pts=None time=None time_base=1/1200000

So the packets and frames don't have timestamps at all; they have a time_base which matches neither the timebase that ends up in the final file nor the actual framerate of the camera; and the codec has a framerate and timebase that match neither the final file, nor the camera framerate, nor the other video stream metadata!
The PyAV documentation is all but entirely absent when it comes to issues of timing and framerate, but I have tried manually setting various combinations of stream, packet, and frame time_base, dts, and pts, with no success. I can always remux the recorded videos again to get the correct framerate, but I'd rather write video files that are correct in the first place.

So, how do I get PyAV to remux the video in a way that produces an output that is correctly marked as 30 fps?
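For what it's worth, the arithmetic the question circles around is simple: at a fixed 30 fps, a frame's timestamp in a given time_base is its index times 1/30 s divided by the time_base. A small sketch of that calculation (a hypothetical helper, not from the original post; the PyAV usage in the closing comment is an untested assumption):

```python
from fractions import Fraction

def frame_pts(index: int, fps: int, time_base: Fraction) -> int:
    """Presentation timestamp, in time_base ticks, for frame `index`
    at a constant frame rate of `fps`."""
    return int(Fraction(index, fps) / time_base)

# With the stream time_base of 1/1200000 shown above and 30 fps,
# consecutive frames land 40000 ticks apart. In principle these values
# could be assigned to packet.pts / packet.dts before output.mux_one(packet).
```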
-
Android opengl es YUV to RGB conversion using shaders
1 February 2014, by DSG
I am working on a video player for Android, in which I am using ffmpeg for decoding and OpenGL ES for rendering. I am stuck at the point where I use OpenGL ES shaders for YUV-to-RGB conversion. The application displays an image, but the colors are wrong: after converting from YUV to RGB, the image shows only green and pink. I searched on Google, but found no solution. Can anyone please help me with this?
I am getting three separate buffers (Y, U, V) from ffmpeg, and I am passing these buffers to three textures as-is.
Here are the shaders I am using:
static const char kVertexShader[] =
"attribute vec4 vPosition; \n"
"attribute vec2 vTexCoord; \n"
"varying vec2 v_vTexCoord; \n"
"void main() { \n"
"gl_Position = vPosition; \n"
"v_vTexCoord = vTexCoord; \n"
"} \n";
static const char kFragmentShader[] =
"precision mediump float; \n"
"varying vec2 v_vTexCoord; \n"
"uniform sampler2D yTexture; \n"
"uniform sampler2D uTexture; \n"
"uniform sampler2D vTexture; \n"
"void main() { \n"
"float y=texture2D(yTexture, v_vTexCoord).r;\n"
"float u=texture2D(uTexture, v_vTexCoord).r - 0.5;\n"
"float v=texture2D(vTexture, v_vTexCoord).r - 0.5;\n"
"float r=y + 1.13983 * v;\n"
"float g=y - 0.39465 * u - 0.58060 * v;\n"
"float b=y + 2.03211 * u;\n"
"gl_FragColor = vec4(r, g, b, 1.0);\n"
"}\n";
static const GLfloat kVertexInformation[] =
{
-1.0f, 1.0f, // TexCoord 0 top left
-1.0f,-1.0f, // TexCoord 1 bottom left
1.0f,-1.0f, // TexCoord 2 bottom right
1.0f, 1.0f // TexCoord 3 top right
};
static const GLshort kTextureCoordinateInformation[] =
{
0, 0, // TexCoord 0 top left
0, 1, // TexCoord 1 bottom left
1, 1, // TexCoord 2 bottom right
1, 0 // TexCoord 3 top right
};
static const GLuint kStride = 0;//COORDS_PER_VERTEX * 4;
static const GLshort kIndicesInformation[] =
{
0, 1, 2,
0, 2, 3
};

Here is another person who asked the same question: Camera frame yuv to rgb conversion using GL shader language
Thank You.
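For cross-checking, the coefficients in the first fragment shader are the usual full-range BT.601 ones; a CPU-side reference of the same arithmetic (a hypothetical helper, not part of the post) makes it easy to verify individual pixels against the shader's output:

```python
def yuv_to_rgb(y: float, u: float, v: float) -> tuple:
    """BT.601 full-range YUV -> RGB; all inputs in [0, 1], with the chroma
    channels centered at 0.5, mirroring the first fragment shader above."""
    u -= 0.5
    v -= 0.5
    r = y + 1.13983 * v
    g = y - 0.39465 * u - 0.58060 * v
    b = y + 2.03211 * u
    clamp = lambda c: max(0.0, min(1.0, c))
    return (clamp(r), clamp(g), clamp(b))

# Neutral chroma (u = v = 0.5) must reproduce the luma unchanged,
# i.e. a grayscale image -- which is exactly what sampling only Y gives.
```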
UPDATE:
ClayMontgomery's shaders:
const char* VERTEX_SHADER = "\
attribute vec4 a_position;\
attribute vec2 a_texCoord;\
varying vec2 gsvTexCoord;\
varying vec2 gsvTexCoordLuma;\
varying vec2 gsvTexCoordChroma;\
\
void main()\
{\
gl_Position = a_position;\
gsvTexCoord = a_texCoord;\
gsvTexCoordLuma.s = a_texCoord.s / 2.0;\
gsvTexCoordLuma.t = a_texCoord.t / 2.0;\
gsvTexCoordChroma.s = a_texCoord.s / 4.0;\
gsvTexCoordChroma.t = a_texCoord.t / 4.0;\
}";
const char* YUV_FRAGMENT_SHADER = "\
precision highp float;\
uniform sampler2D y_texture;\
uniform sampler2D u_texture;\
uniform sampler2D v_texture;\
varying vec2 gsvTexCoord;\
varying vec2 gsvTexCoordLuma;\
varying vec2 gsvTexCoordChroma;\
\
void main()\
{\
float y = texture2D(y_texture, gsvTexCoordLuma).r;\
float u = texture2D(u_texture, gsvTexCoordChroma).r;\
float v = texture2D(v_texture, gsvTexCoordChroma).r;\
u = u - 0.5;\
v = v - 0.5;\
vec3 rgb;\
rgb.r = y + (1.403 * v);\
rgb.g = y - (0.344 * u) - (0.714 * v);\
rgb.b = y + (1.770 * u);\
gl_FragColor = vec4(rgb, 1.0);\
}";Here is output :
-
What is wrong with the I420 render from ffmpeg?
9 May 2022, by DLKUN
I use GLFW to render YUV frames from ffmpeg. The Y plane alone is fine (using only the Y data, with the fragment shader sampling only the Y texture, the image is correctly grayscale), but when I add U and V the display turns pink and green. I have tried changing the fragment shader and the image textures, to no avail.


#include <glad/glad.h>
#include <GLFW/glfw3.h>

#include <string>
#include <fstream>
#include <sstream>
#include <iostream>
#include

#include 

// settings
const unsigned int SCR_WIDTH = 544;
const unsigned int SCR_HEIGHT = 960;
const int len = 544 * 960 * 3/2;
BYTE YUVdata [len];//
BYTE Ydata [544 * 960];//
BYTE Udata [272 * 480];//
BYTE Vdata [272 * 480];//
unsigned int VBO = 0;
unsigned int VAO = 0;
unsigned int EBO = 0;
unsigned int texturePIC = 0;
int shaderProgram = 0;

GLuint texIndexarray[3];
GLuint texUniformY = 99;
GLuint texUniformU = 99;
GLuint texUniformV = 99;

void LoadPicture()
{


 glGenTextures(3, texIndexarray);

 glBindTexture(GL_TEXTURE_2D, texIndexarray[0]);
 
 glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
 glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
 glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
 glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

 glBindTexture(GL_TEXTURE_2D, texIndexarray[1]);
 
 glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
 glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
 glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
 glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
 glBindTexture(GL_TEXTURE_2D, texIndexarray[2]);
 
 glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
 glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
 glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
 glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);


 glValidateProgram(shaderProgram);

 texUniformY = glGetUniformLocation(shaderProgram, "dataY");//2
 texUniformU = glGetUniformLocation(shaderProgram, "dataU");//0
 texUniformV = glGetUniformLocation(shaderProgram, "dataV");//1

 
 FILE* fp = fopen("./output544_960.yuv","rb+");//I420
 int returns =fread(YUVdata,1,len,fp);
 int w = 544;
 int h = 960;
 int ysize = w*h;
 int uvsize = w * h / 4;

 void* uptr = &YUVdata[ysize];
 void* vptr = &YUVdata[ysize * 5 / 4];

 memcpy(Ydata,YUVdata,ysize);
 memcpy(Udata, uptr,uvsize);
 memcpy(Vdata, vptr,uvsize);
 glActiveTexture(GL_TEXTURE0);
 glBindTexture(GL_TEXTURE_2D, texIndexarray[0]);
 
 glTexImage2D(GL_TEXTURE_2D, 0 , GL_RED, w, h ,0, GL_RED,GL_UNSIGNED_BYTE ,Ydata);
 glUniform1i(texUniformY, texIndexarray[0]); 


 glActiveTexture(GL_TEXTURE1);
 glBindTexture(GL_TEXTURE_2D, texIndexarray[1]);
 glTexImage2D(GL_TEXTURE_2D, 0, GL_RED, w/2, h/2, 0, GL_RED, GL_UNSIGNED_BYTE,Udata );

 glUniform1i(texUniformU, texIndexarray[1]);


 glActiveTexture(GL_TEXTURE2);
 glBindTexture(GL_TEXTURE_2D, texIndexarray[2]);
 glTexImage2D(GL_TEXTURE_2D, 0, GL_RED, w/2, h/2, 0, GL_RED, GL_UNSIGNED_BYTE,Vdata);
 glUniform1i(texUniformV, texIndexarray[2]);

}


void render()
{
 glBindVertexArray(VAO);
 glUseProgram(shaderProgram);
 glDrawElements(GL_TRIANGLES,6,GL_UNSIGNED_INT,0);
 //glDrawArrays(GL_TRIANGLE_FAN,0,4);
 glUseProgram(0);
 glBindVertexArray(0);
}

void initmodule()
{
 
 float vertexs[] = {
 
 1.0f, 1.0f, 0.0f, 1.0f, 0.0f, 
 1.0f, -1.0f, 0.0f, 1.0f, 1.0f, 
 -1.0f, -1.0f, 0.0f, 0.0f, 1.0f, 
 -1.0f, 1.0f, 0.0f, 0.0f, 0.0f 
 
 
 };
 
 unsigned int indexs[] = {
 0,1,3,
 1,2,3,
 };

 
 glGenVertexArrays(1,&VAO);
 glBindVertexArray(VAO);

 

 glGenBuffers(1, &VBO);
 glBindBuffer(GL_ARRAY_BUFFER, VBO);
 
 glBufferData(GL_ARRAY_BUFFER,sizeof(vertexs), vertexs, GL_STATIC_DRAW);

 
 glGenBuffers(1,&EBO);
 glBindBuffer(GL_ELEMENT_ARRAY_BUFFER,EBO);
 glBufferData(GL_ELEMENT_ARRAY_BUFFER,sizeof(indexs),indexs,GL_STATIC_DRAW);
 
 LoadPicture();

 glVertexAttribPointer(0,3,GL_FLOAT,GL_FALSE,5*sizeof(float),(void*)0);
 
 glEnableVertexAttribArray(0);
 
 glVertexAttribPointer(1,2, GL_FLOAT, GL_FALSE, 5 * sizeof(float), (void*)(3 * sizeof(float)));
 
 glEnableVertexAttribArray(1);

 
 glBindBuffer(GL_ARRAY_BUFFER,0);

 
 glBindVertexArray(0);



}

void initshader(const char* verpath,const char* fragpath)
{
 
 std::string VerCode("");
 std::string fregCode("");
 
 std::ifstream vShaderFile;
 std::ifstream fShaderFile;

 vShaderFile.exceptions(std::ifstream::failbit | std::ifstream::badbit);
 fShaderFile.exceptions(std::ifstream::failbit | std::ifstream::badbit);

 try
 {
 vShaderFile.open(verpath);
 fShaderFile.open(fragpath);

 std::stringstream vsstream, fsstream;
 vsstream << vShaderFile.rdbuf();
 fsstream << fShaderFile.rdbuf();
 VerCode = vsstream.str();
 fregCode = fsstream.str();
 
 }
 catch (const std::exception&)
 {
 std::cout << "read file error" << std::endl;
 }

 const char* vshader = VerCode.c_str();
 const char* fshader = fregCode.c_str();

 
 unsigned int vertexID = 0, fragID = 0;
 char infoLog[512];
 int successflag = 0;
 vertexID = glCreateShader(GL_VERTEX_SHADER);
 glShaderSource(vertexID,1,&vshader,NULL );
 glCompileShader(vertexID);
 
 glGetShaderiv(vertexID,GL_COMPILE_STATUS,&successflag);
 if (!successflag)
 {
 glGetShaderInfoLog(vertexID,512,NULL,infoLog);
 std::string errstr(infoLog);
 std::cout << "v shader err" << errstr << std::endl;
 }
 fragID = glCreateShader(GL_FRAGMENT_SHADER);
 glShaderSource(fragID, 1, &fshader, NULL);
 glCompileShader(fragID);
 
 glGetShaderiv(fragID, GL_COMPILE_STATUS, &successflag);
 if (!successflag)
 {
 glGetShaderInfoLog(fragID, 512, NULL, infoLog);
 std::string errstr(infoLog);
 std::cout << "f shader err" << errstr << std::endl;
 }
 initmodule();


 
 while (!glfwWindowShouldClose(window))
 {
 
 processInput(window);

 glClearColor(0.0f,0.0f,0.0f,1.0f);
 glClear(GL_COLOR_BUFFER_BIT);
 render();
 
 
 glfwSwapBuffers(window);
 
 glfwPollEvents();
 }

 
 glfwTerminate();
 return 0;
}


I get the Y data, and running the code with Y alone is OK; the color is gray. But when I add the U plane the color turns light green, and when I also add the V plane it is pink and green.
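For what it's worth, the plane offsets used in LoadPicture (Y at byte 0, U at w*h, V at w*h*5/4, each chroma plane (w/2) x (h/2)) do match the planar I420 layout; a CPU-side sketch of the same split (an illustrative helper, not from the original code):

```python
def split_i420(buf: bytes, w: int, h: int):
    """Split a planar I420 buffer into its Y, U and V planes,
    using the same offsets as the memcpy calls in LoadPicture."""
    ysize = w * h
    uvsize = w * h // 4          # each chroma plane is (w/2) x (h/2)
    y = buf[:ysize]
    u = buf[ysize:ysize + uvsize]
    v = buf[ysize + uvsize:ysize + 2 * uvsize]
    return y, u, v

# For a 544x960 frame this yields planes of 522240, 130560 and 130560
# bytes, matching the sizes of the Ydata/Udata/Vdata arrays above.
```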


#version 330 core
layout(location = 0) out vec4 FragColor;
in vec2 TexCoord;
uniform sampler2D dataY;
uniform sampler2D dataU;
uniform sampler2D dataV;
vec3 yuv;
vec3 rgb;
void main()
{


 yuv.x = texture2D(dataY, TexCoord).r-0.0625;
 yuv.y = texture2D(dataU, TexCoord).r-0.5;
 yuv.z = texture2D(dataV, TexCoord).r-0.5;

 rgb = mat3(1, 1, 1, 
 0, -0.18732, 1.8556, 
 1.57481, -0.46813, 0) * yuv; 
 FragColor = vec4(rgb.x, rgb.y,rgb.z,1); 
};