
Other articles (47)
-
Personalize by adding your logo, banner or background image
5 September 2013. Some themes take three personalization elements into account: adding a logo; adding a banner; adding a background image.
-
Writing a news item
21 June 2013. Present changes to your MédiaSPIP, or news about your projects, on your MédiaSPIP site using the news section.
In spipeo, MédiaSPIP's default theme, news items are displayed at the bottom of the main page, below the editorials.
You can customize the news-item creation form.
News-item creation form: for a document of the "news item" type, the default fields are: publication date (customize the publication date) (...) -
Publishing on MédiaSPIP
13 June 2013. Can I post content from an iPad tablet?
Yes, if your installed MédiaSPIP is at version 0.2 or later. If needed, contact your MédiaSPIP administrator to find out.
On other sites (8683)
-
How can I make a GStreamer pipeline to read individual frames and publish a stream?
30 January 2024, by Alvan Rahimli
I have an external system which sends individual H264-encoded frames one by one via a socket. What I'm trying to do is take these frames and publish an RTSP stream to the RTSP server that I have.


After getting the frames (which is just reading the TCP socket in chunks), my current approach is this:


I read the frames, then start a process with the following command, and write every frame to the STDIN of that process.


gst-launch-1.0 -e fdsrc fd=0 ! \
 h264parse ! \
 avdec_h264 ! \
 videoconvert ! \
 videorate ! \
 video/x-raw,framerate=25/1 ! \
 avimux ! \
 filesink location=gsvideo3.avi



I know this writes the stream to an AVI file, but it is the closest I have gotten to a normal video, and it is probably very inefficient and full of redundant pipeline steps.


I am also open to FFmpeg commands, but GStreamer is preferred, as I will be able to embed it into my C# project via bindings and keep everything in-process.


Any help is appreciated, thanks in advance!
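For reference, a pipeline that pushes the same stdin stream to an RTSP server, instead of decoding and writing an AVI file, might look like the sketch below. This is only a sketch: it assumes the `rtspclientsink` element (from the GStreamer RTSP server plugins) is installed, and the URL is a placeholder for a real RTSP mount point.

```shell
# Sketch: forward H264 frames from stdin to an RTSP server without re-encoding.
# Assumes rtspclientsink is available; rtsp://localhost:8554/stream is a
# placeholder URL for the actual RTSP server mount point.
gst-launch-1.0 -e fdsrc fd=0 ! \
 h264parse ! \
 rtspclientsink location=rtsp://localhost:8554/stream
```

Because the frames are already H264, this avoids the decode/convert/re-rate steps entirely; `h264parse` only repackages the byte stream for the sink.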


-
Using FFmpeg in Java to segment a live audio stream based on silence
14 February 2024, by Christian Seyoum
I am trying to segment a live audio stream. After I get the stream, I want to detect silence and begin saving a WAV file each time a silence is detected. Currently I am attempting to use FFmpeg, but I am getting an error about the stream being closed, and I do not see a single WAV file being saved.


private void processTraffic() throws IOException {

    //- Prepare File

    // Start FFmpeg process
    ProcessBuilder processBuilder = new ProcessBuilder(
        "ffmpeg", "-i", "-",            // '-' tells FFmpeg to read from stdin
        "-af", "silencedetect=noise=-10dB",
        "-f", "segment", "-segment_time", "30",
        "-c", "copy", "output_%03d.wav"
    );

    Process ffmpegProcess = processBuilder.start();

    try (OutputStream ffmpegStdIn = ffmpegProcess.getOutputStream();
         DatagramSocket socket = new DatagramSocket(this._inputSettings.getSettings().getPort())) {

        //- Set Timeout
        socket.setSoTimeout(this._inputSettings.getSettings().getSocketTimeout());

        //- Fetch Data
        while (this._isReceiverRunning.get()) {

            byte[] buffer = new byte[this._PACKET_SIZE];
            DatagramPacket packet = new DatagramPacket(buffer, buffer.length);

            while (true) { // Replace this condition with your own logic
                socket.receive(packet); // Receive data from the network
                // Skip the 12-byte RTP header, write the payload to FFmpeg's stdin
                ffmpegStdIn.write(packet.getData(), packet.getOffset() + 12, packet.getLength() - 12);
            }
        }
    }
    catch (Exception e) {
    } finally {
        ffmpegProcess.destroy();
    }
}



I have tried saving the segments as raw files and later converting them to WAV, but the WAV files end up corrupted. This is my attempt with FFmpeg, since I have seen it work in the terminal.
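For comparison, a command line that produces playable WAV segments might look like the sketch below. Note two things about the question's arguments: `-c copy` cannot produce valid WAV segments from a raw byte stream (WAV needs PCM data with a proper header, so re-encoding with `pcm_s16le` is required), and `silencedetect` only *logs* silence points; it does not itself split the output. The input format flags here are placeholders that must match the actual stream.

```shell
# Sketch: segment a raw PCM stream from stdin into 30-second WAV files.
# Assumes the incoming bytes are signed 16-bit little-endian PCM at 48 kHz
# stereo; adjust -f/-ar/-ac to the real input. silencedetect logs silence
# timestamps to stderr, which can then drive custom splitting logic.
ffmpeg -f s16le -ar 48000 -ac 2 -i - \
  -af silencedetect=noise=-30dB:d=0.5 \
  -f segment -segment_time 30 \
  -c:a pcm_s16le output_%03d.wav
```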


-
Any tips on debugging a Java servlet for playing video
15 October 2015, by John
I have a website that allows users to upload videos they make, and another page where these videos can be viewed. However, some videos that users upload can't be viewed on my page, even though if I download such a video and open it with the browser, it displays fine. I assume there is something wrong with the way my server handles returning partial content. Any suggestions on how to figure out what the problem is?
Here's the page to view a video:
https://userbob.com/shareVideo.jsp?code=NSSQ04TBZAW1QWQ7D3NDAV393IY4B3_944_455128_2831
Here's the relevant servlet code that returns the response:
{
    try {
        video = getVideo( testResult.getVideoId() );
        String rangeHeader = request.getHeader( "Range" );
        if ( rangeHeader != null ) {
            String[] parts = rangeHeader.split( "=" );
            if ( parts.length == 2 && parts[0].equals( "bytes" )) {
                parts = parts[1].split("-");
                if ( parts.length > 0 ) {
                    rangeStart = Long.parseLong( parts[0] );
                    if ( parts.length > 1 ) {
                        rangeEnd = Long.parseLong( parts[1] );
                    }
                    else {
                        rangeEnd = video.getVideoSize() - 1;
                    }
                }
            }
        }
    } catch (Exception e) {
        e.printStackTrace();
    }
    if ( request.getParameter("download") == null ) {
        response.setContentType( "video/webm" );
        response.setHeader( "Content-Disposition", "inline; filename=screenCast" + testResult.getId() + ".webm" );
    }
    else {
        response.setHeader( "Content-Disposition", "attachment; filename=screenCast" + testResult.getId() + ".webm" );
    }
    response.setHeader( "Accept-Ranges", "bytes" );
    response.setHeader( "Content-Length", Integer.toString( video.getVideoSize() ) );
    if ( rangeStart >= 0 ) {
        if ( rangeStart >= rangeEnd ) {
            response.setStatus( HttpServletResponse.SC_REQUESTED_RANGE_NOT_SATISFIABLE );
            String contentRange = "bytes */" + video.getVideoSize();
            response.setHeader( "Content-Range", contentRange );
            return;
        }
        response.setStatus( HttpServletResponse.SC_PARTIAL_CONTENT );
        String contentRange = "bytes " + rangeStart + "-" + rangeEnd + "/" + video.getVideoSize();
        response.setHeader( "Content-Range", contentRange );
        response.setHeader( "Content-Length", Long.toString( rangeEnd - rangeStart + 1 ) );
    }
    else {
        response.setHeader( "Content-Length", Integer.toString( video.getVideoSize() ) );
    }
    if ( content ) {
        ServletOutputStream outputStream = response.getOutputStream();
        InputStream inputStream = video.getVideoStream();
        byte[] buffer = new byte[50000];
        int bytesRead;
        long pos = 0;
        if ( rangeStart > 0 ) {
            inputStream.skip( rangeStart );
            pos += rangeStart;
        }
        while ((bytesRead = inputStream.read(buffer)) != -1 ) {
            if ( rangeEnd >= 0 && pos + bytesRead > rangeEnd ) {
                outputStream.write( buffer, 0, (int) (rangeEnd - pos + 1) );
                outputStream.flush();
                pos = rangeEnd;
                break;
            }
            else {
                outputStream.write(buffer, 0, bytesRead);
                outputStream.flush();
                pos += bytesRead;
            }
        }
        outputStream.close();
    }
}
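One practical way to debug partial-content handling like this is to inspect the raw headers the servlet returns for a byte-range request and check them against RFC 7233 expectations (status 206, a `Content-Range` of the form `bytes start-end/total`, and a `Content-Length` equal to the range size). A sketch with curl; the URL is a placeholder for an actual video URL on the site:

```shell
# Sketch: request the first 100 bytes and dump only the response headers.
# (The URL is a placeholder.) A correct reply should show:
#   HTTP/1.1 206 Partial Content
#   Content-Range: bytes 0-99/<total size>
#   Content-Length: 100
curl -s -D - -o /dev/null -H "Range: bytes=0-99" \
  "https://example.com/video.webm"
```

Comparing this output for a failing video against the same request served by a plain static file server often pinpoints which header the servlet gets wrong.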