
Media (1)
-
Rennes Emotion Map 2010-11
19 October 2011
Updated: July 2013
Language: French
Type: Text
Other articles (20)
-
Organising by category
17 May 2013
In MédiaSPIP, a section has two names: category and rubrique.
The various documents stored in MédiaSPIP can be filed under different categories. A category can be created by clicking on "publier une catégorie" (publish a category) in the publish menu at the top right (after logging in). A category can also be filed under another category, which means a tree of categories can be built.
The next time a document is published, the newly created category will be offered (...)
-
The themes of MediaSPIP
4 June 2013
3 themes are provided with MédiaSPIP out of the box. MédiaSPIP users can add further themes as needed.
MediaSPIP themes
3 themes were initially developed for MediaSPIP: * SPIPeo: the default MédiaSPIP theme. It highlights the site's presentation and the most recent media documents (the sort criterion can be changed - title, popularity, date). * Arscenic: the theme used on the project's official website, featuring in particular a red banner at the top of the page. The structure (...)
-
General document management
13 May 2011
MédiaSPIP never modifies the original document that is uploaded.
For each uploaded document it carries out two successive operations: creating an additional version that can easily be viewed online, while leaving the original downloadable in case the original document cannot be read in a web browser; and retrieving the original document's metadata in order to describe the file textually;
The tables below explain what MédiaSPIP can do (...)
On other sites (6420)
-
exo player mp2, aac audio format and avi video format [closed]
27 May 2020, by Muhammet

/*
 * Copyright (C) 2016 The Android Open Source Project
 *
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
 * You may obtain a copy of the License at
 *
 * http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */
package com.google.android.exoplayer2.ext.ffmpeg;

import android.os.Handler;
import androidx.annotation.Nullable;
import com.google.android.exoplayer2.C;
import com.google.android.exoplayer2.ExoPlaybackException;
import com.google.android.exoplayer2.Format;
import com.google.android.exoplayer2.audio.AudioProcessor;
import com.google.android.exoplayer2.audio.AudioRendererEventListener;
import com.google.android.exoplayer2.audio.AudioSink;
import com.google.android.exoplayer2.audio.DefaultAudioSink;
import com.google.android.exoplayer2.audio.SimpleDecoderAudioRenderer;
import com.google.android.exoplayer2.drm.DrmSessionManager;
import com.google.android.exoplayer2.drm.ExoMediaCrypto;
import com.google.android.exoplayer2.util.Assertions;
import com.google.android.exoplayer2.util.MimeTypes;
import java.util.Collections;
import org.checkerframework.checker.nullness.qual.MonotonicNonNull;

/**
 * Decodes and renders audio using FFmpeg.
 */
public final class FfmpegAudioRenderer extends SimpleDecoderAudioRenderer {

 /** The number of input and output buffers. */
 private static final int NUM_BUFFERS = 16;
 /** The default input buffer size. */
 private static final int DEFAULT_INPUT_BUFFER_SIZE = 960 * 6;

 private final boolean enableFloatOutput;

 private @MonotonicNonNull FfmpegDecoder decoder;

 public FfmpegAudioRenderer() {
 this(/* eventHandler= */ null, /* eventListener= */ null);
 }

 /**
 * @param eventHandler A handler to use when delivering events to {@code eventListener}. May be
 * null if delivery of events is not required.
 * @param eventListener A listener of events. May be null if delivery of events is not required.
 * @param audioProcessors Optional {@link AudioProcessor}s that will process audio before output.
 */
 public FfmpegAudioRenderer(
 @Nullable Handler eventHandler,
 @Nullable AudioRendererEventListener eventListener,
 AudioProcessor... audioProcessors) {
 this(
 eventHandler,
 eventListener,
 new DefaultAudioSink(/* audioCapabilities= */ null, audioProcessors),
 /* enableFloatOutput= */ false);
 }

 /**
 * @param eventHandler A handler to use when delivering events to {@code eventListener}. May be
 * null if delivery of events is not required.
 * @param eventListener A listener of events. May be null if delivery of events is not required.
 * @param audioSink The sink to which audio will be output.
 * @param enableFloatOutput Whether to enable 32-bit float audio format, if supported on the
 * device/build and if the input format may have bit depth higher than 16-bit. When using
 * 32-bit float output, any audio processing will be disabled, including playback speed/pitch
 * adjustment.
 */
 public FfmpegAudioRenderer(
 @Nullable Handler eventHandler,
 @Nullable AudioRendererEventListener eventListener,
 AudioSink audioSink,
 boolean enableFloatOutput) {
 super(
 eventHandler,
 eventListener,
 /* drmSessionManager= */ null,
 /* playClearSamplesWithoutKeys= */ false,
 audioSink);
 this.enableFloatOutput = enableFloatOutput;
 }

 @Override
 @FormatSupport
 protected int supportsFormatInternal(
 @Nullable DrmSessionManager<ExoMediaCrypto> drmSessionManager, Format format) {
 Assertions.checkNotNull(format.sampleMimeType);
 if (!FfmpegLibrary.isAvailable()) {
 return FORMAT_UNSUPPORTED_TYPE;
 } else if (!FfmpegLibrary.supportsFormat(format.sampleMimeType) || !isOutputSupported(format)) {
 return FORMAT_UNSUPPORTED_SUBTYPE;
 } else if (!supportsFormatDrm(drmSessionManager, format.drmInitData)) {
 return FORMAT_UNSUPPORTED_DRM;
 } else {
 return FORMAT_HANDLED;
 }
 }

 @Override
 @AdaptiveSupport
 public final int supportsMixedMimeTypeAdaptation() throws ExoPlaybackException {
 return ADAPTIVE_NOT_SEAMLESS;
 }

 @Override
 protected FfmpegDecoder createDecoder(Format format, @Nullable ExoMediaCrypto mediaCrypto)
 throws FfmpegDecoderException {
 int initialInputBufferSize =
 format.maxInputSize != Format.NO_VALUE ? format.maxInputSize : DEFAULT_INPUT_BUFFER_SIZE;
 decoder =
 new FfmpegDecoder(
 NUM_BUFFERS, NUM_BUFFERS, initialInputBufferSize, format, shouldUseFloatOutput(format));
 return decoder;
 }

 @Override
 public Format getOutputFormat() {
 Assertions.checkNotNull(decoder);
 int channelCount = decoder.getChannelCount();
 int sampleRate = decoder.getSampleRate();
 @C.PcmEncoding int encoding = decoder.getEncoding();
 return Format.createAudioSampleFormat(
 /* id= */ null,
 MimeTypes.AUDIO_RAW,
 /* codecs= */ null,
 Format.NO_VALUE,
 Format.NO_VALUE,
 channelCount,
 sampleRate,
 encoding,
 Collections.emptyList(),
 /* drmInitData= */ null,
 /* selectionFlags= */ 0,
 /* language= */ null);
 }

 private boolean isOutputSupported(Format inputFormat) {
 return shouldUseFloatOutput(inputFormat)
 || supportsOutput(inputFormat.channelCount, C.ENCODING_PCM_16BIT);
 }

 private boolean shouldUseFloatOutput(Format inputFormat) {
 Assertions.checkNotNull(inputFormat.sampleMimeType);
 if (!enableFloatOutput || !supportsOutput(inputFormat.channelCount, C.ENCODING_PCM_FLOAT)) {
 return false;
 }
 switch (inputFormat.sampleMimeType) {
 case MimeTypes.AUDIO_RAW:
 // For raw audio, output in 32-bit float encoding if the bit depth is > 16-bit.
 return inputFormat.pcmEncoding == C.ENCODING_PCM_24BIT
 || inputFormat.pcmEncoding == C.ENCODING_PCM_32BIT
 || inputFormat.pcmEncoding == C.ENCODING_PCM_FLOAT;
 case MimeTypes.AUDIO_AC3:
 // AC-3 is always 16-bit, so there is no point outputting in 32-bit float encoding.
 return false;
 default:
 // For all other formats, assume that it's worth using 32-bit float encoding.
 return true;
 }
 }

}








I use ExoPlayer, but I get no sound from MP2 and AAC audio tracks in my Android application.

When I open videos with MP2 or AAC audio I get the error "media includes audio tracks but none are playable by this device". Some videos in .avi format also fail to play while others work. Please can you help me?





/*
 * Copyright (C) 2016 The Android Open Source Project
 *
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
 * You may obtain a copy of the License at
 *
 * http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */
package com.google.android.exoplayer2.ext.ffmpeg;

import androidx.annotation.Nullable;
import com.google.android.exoplayer2.ExoPlayerLibraryInfo;
import com.google.android.exoplayer2.util.LibraryLoader;
import com.google.android.exoplayer2.util.Log;
import com.google.android.exoplayer2.util.MimeTypes;

/**
 * Configures and queries the underlying native library.
 */
public final class FfmpegLibrary {

 static {
 ExoPlayerLibraryInfo.registerModule("goog.exo.ffmpeg");
 }

 private static final String TAG = "FfmpegLibrary";

 private static final LibraryLoader LOADER =
 new LibraryLoader("avutil", "swresample", "avcodec", "ffmpeg");

 private FfmpegLibrary() {}

 /**
 * Override the names of the FFmpeg native libraries. If an application wishes to call this
 * method, it must do so before calling any other method defined by this class, and before
 * instantiating a {@link FfmpegAudioRenderer} instance.
 *
 * @param libraries The names of the FFmpeg native libraries.
 */
 public static void setLibraries(String... libraries) {
 LOADER.setLibraries(libraries);
 }

 /**
 * Returns whether the underlying library is available, loading it if necessary.
 */
 public static boolean isAvailable() {
 return LOADER.isAvailable();
 }

 /** Returns the version of the underlying library if available, or null otherwise. */
 public static @Nullable String getVersion() {
 return isAvailable() ? ffmpegGetVersion() : null;
 }

 /**
 * Returns whether the underlying library supports the specified MIME type.
 *
 * @param mimeType The MIME type to check.
 */
 public static boolean supportsFormat(String mimeType) {
 if (!isAvailable()) {
 return false;
 }
 String codecName = getCodecName(mimeType);
 if (codecName == null) {
 return false;
 }
 if (!ffmpegHasDecoder(codecName)) {
 Log.w(TAG, "No " + codecName + " decoder available. Check the FFmpeg build configuration.");
 return false;
 }
 return true;
 }

 /**
 * Returns the name of the FFmpeg decoder that could be used to decode the format, or {@code null}
 * if it's unsupported.
 */
 /* package */ static @Nullable String getCodecName(String mimeType) {
 switch (mimeType) {
 case MimeTypes.AUDIO_AAC:
 return "aac";
 case MimeTypes.AUDIO_MPEG:
 case MimeTypes.AUDIO_MPEG_L1:
 case MimeTypes.AUDIO_MPEG_L2:
 return "mp3";
 case MimeTypes.AUDIO_AC3:
 return "ac3";
 case MimeTypes.AUDIO_E_AC3:
 case MimeTypes.AUDIO_E_AC3_JOC:
 return "eac3";
 case MimeTypes.AUDIO_TRUEHD:
 return "truehd";
 case MimeTypes.AUDIO_DTS:
 case MimeTypes.AUDIO_DTS_HD:
 return "dca";
 case MimeTypes.AUDIO_VORBIS:
 return "vorbis";
 case MimeTypes.AUDIO_OPUS:
 return "opus";
 case MimeTypes.AUDIO_AMR_NB:
 return "amrnb";
 case MimeTypes.AUDIO_AMR_WB:
 return "amrwb";
 case MimeTypes.AUDIO_FLAC:
 return "flac";
 case MimeTypes.AUDIO_ALAC:
 return "alac";
 case MimeTypes.AUDIO_MLAW:
 return "pcm_mulaw";
 case MimeTypes.AUDIO_ALAW:
 return "pcm_alaw";
 default:
 return null;
 }
 }

 private static native String ffmpegGetVersion();
 private static native boolean ffmpegHasDecoder(String codecName);

}
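
The "media includes audio tracks but none are playable" message means that no enabled renderer reported support for those tracks. For the FfmpegAudioRenderer above to pick up MP2/AAC, two things generally have to hold: the extension's native FFmpeg must have been built with the corresponding decoders (FfmpegLibrary.supportsFormat logs a warning and returns false otherwise), and the player must be created with a renderers factory that enables, or better prefers, extension renderers. Below is a rough sketch of that wiring, assuming a 2.10/2.11-era ExoPlayer API (the exact factory methods vary between releases, and the class name FfmpegPlayerFactory is just illustrative):

import android.content.Context;

import com.google.android.exoplayer2.DefaultRenderersFactory;
import com.google.android.exoplayer2.ExoPlayerFactory;
import com.google.android.exoplayer2.SimpleExoPlayer;
import com.google.android.exoplayer2.trackselection.DefaultTrackSelector;

/** Illustrative helper (hypothetical class): build a player that prefers extension renderers. */
public final class FfmpegPlayerFactory {

  private FfmpegPlayerFactory() {}

  public static SimpleExoPlayer create(Context context) {
    // EXTENSION_RENDERER_MODE_PREFER places extension renderers such as FfmpegAudioRenderer
    // ahead of the MediaCodec-based ones, so MP2/AAC tracks can be routed to FFmpeg when the
    // platform decoder rejects them.
    DefaultRenderersFactory renderersFactory =
        new DefaultRenderersFactory(context)
            .setExtensionRendererMode(DefaultRenderersFactory.EXTENSION_RENDERER_MODE_PREFER);
    return ExoPlayerFactory.newSimpleInstance(
        context, renderersFactory, new DefaultTrackSelector(context));
  }
}

If the tracks still refuse to play with this wiring, the remaining suspect is the decoder list that was enabled when the extension's FFmpeg was compiled.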










-
How to create a video file webm from chunks by media recorder api using ffmpeg
17 October 2020, by Caio Nakai
I'm trying to create a WebM video file from blobs generated by the MediaRecorder API, in a Node.js server, using FFmpeg. I'm able to create the .webm file but it's not playable. I ran this command
$ ffmpeg.exe -v error -i lel.webm -f null - >error.log 2>&1
to generate an error log; the error log file contains this:



[null @ 000002ce7501de40] Application provided invalid, non monotonically increasing dts to muxer in stream 0 : 1 >= 1


[h264 @ 000002ce74a727c0] Invalid NAL unit size (804 > 74).


[h264 @ 000002ce74a727c0] Error splitting the input into NAL units.


Error while decoding stream #0:0 : Invalid data found when processing input




This is my web server code


const app = require("express")();
const http = require("http").createServer(app);
const io = require("socket.io")(http);
const fs = require("fs");
const child_process = require("child_process");

app.get("/", (req, res) => {
 res.sendFile(__dirname + "/index.html");
});

io.on("connection", (socket) => {
 console.log("a user connected");

 const ffmpeg = child_process.spawn("ffmpeg", [
 "-i",
 "-",
 "-vcodec",
 "copy",
 "-f",
 "flv",
 "rtmpUrl.webm",
 ]);

 ffmpeg.on("close", (code, signal) => {
 console.log(
 "FFmpeg child process closed, code " + code + ", signal " + signal
 );
 });

 ffmpeg.stdin.on("error", (e) => {
 console.log("FFmpeg STDIN Error", e);
 });

 ffmpeg.stderr.on("data", (data) => {
 console.log("FFmpeg STDERR:", data.toString());
 });

 socket.on("message", (msg) => {
 console.log("Writing blob! ");
 ffmpeg.stdin.write(msg);
 });

 socket.on("stop", () => {
 console.log("Stop recording..");
 ffmpeg.kill("SIGINT");
 });
});

http.listen(3000, () => {
 console.log("listening on *:3000");
});




And this is my client code, using HTML and JS:


 <script src='http://stackoverflow.com/socket.io/socket.io.js'></script>

 <script>
  const socket = io();
  let mediaRecorder = null;

  const startRecording = (someStream) => {
   const mediaStream = new MediaStream();
   const videoTrack = someStream.getVideoTracks()[0];
   const audioTrack = someStream.getAudioTracks()[0];
   console.log("Video trac ", videoTrack);
   console.log("audio trac ", audioTrack);
   mediaStream.addTrack(videoTrack);
   mediaStream.addTrack(audioTrack);

   const recorderOptions = {
    mimeType: "video/webm;codecs=h264",
    videoBitsPerSecond: 3 * 1024 * 1024,
   };

   mediaRecorder = new MediaRecorder(mediaStream, recorderOptions);
   mediaRecorder.start(1000); // 1000 - the number of milliseconds to record into each Blob
   mediaRecorder.ondataavailable = (event) => {
    console.debug("Got blob data:", event.data);
    if (event.data && event.data.size > 0) {
     socket.emit("message", event.data);
    }
   };
  };

  const getVideoStream = async () => {
   try {
    const stream = await navigator.mediaDevices.getUserMedia({
     video: true,
     audio: true,
    });
    startRecording(stream);
    myVideo.srcObject = stream;
   } catch (e) {
    console.error("navigator.getUserMedia error:", e);
   }
  };

  const stopRecording = () => {
   mediaRecorder.stop();
   socket.emit("stop");
  };
 </script>

 hello world

 <script>
  const myVideo = document.getElementById("myvideo");
  myVideo.muted = true;
 </script>


Any help is appreciated!
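
For what it's worth, two details in the snippets above stand out, stated here as observations rather than a verified fix: the ffmpeg child process muxes to FLV ("-f", "flv") while writing to a file named rtmpUrl.webm, and the browser is asked to record video/webm;codecs=h264, even though the WebM container normally only carries VP8/VP9/AV1 video. Assuming the goal is a playable .webm on disk, one sketch would be to re-encode the piped input along these lines (slower, but container-compatible):
ffmpeg -i - -c:v libvpx-vp9 -c:a libopus -f webm output.webm
Alternatively, keeping the copy codec and writing Matroska instead (-f matroska output.mkv) would preserve the H.264 stream without re-encoding.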


-
Buffer overrun Blackmagic Intensity 4K as input to FFmpeg
24 May 2016, by colossus47
I am trying to take direct video output from a 4K Sony Handycam, via HDMI, directly into a Blackmagic Intensity Pro 4K. I can verify that the camera, HDMI and Blackmagic card are working, as I can capture and view video using the provided "Media Express" program. When I use ffmpeg I do get video output, but I also get a buffer overrun.
Here is the command:
time ffmpeg -f decklink -i "Intensity Pro 4K@20" -c:v nvenc -b:v 100M -vf "yadif=0:-1:0" -pix_fmt yuv420p -crf 29.97 -strict -2 output.mp4
And I get the following output:
ffmpeg version N-76538-gb83c849 Copyright (c) 2000-2015 the FFmpeg
developers built with gcc 4.8 (Ubuntu 4.8.4-2ubuntu1~14.04.3)
configuration: --enable-nonfree --enable-nvenc --enable-nvresize --extra-cflags=-I../cudautils --extra-ldflags=-L../cudautils --enable-gpl --enable-libx264 --enable-libx265 --enable-decklink --extra-cflags=-I/home/tristan/Downloads/BlackmagicDeckLinkSDK10.6.5/Linux/include --extra-ldflags=-L/home/tristan/Downloads/BlackmagicDeckLinkSDK10.6.5/Linux/include
libavutil 55. 5.100 / 55. 5.100
libavcodec 57. 15.100 / 57. 15.100
libavformat 57. 14.100 / 57. 14.100
libavdevice 57. 0.100 / 57. 0.100
libavfilter 6. 15.100 / 6. 15.100
libswscale 4. 0.100 / 4. 0.100
libswresample 2. 0.101 / 2. 0.101
libpostproc 54. 0.100 / 54. 0.100
[decklink @ 0x1ccd6e0] Found Decklink mode 3840 x 2160 with rate 29.97
[decklink @ 0x1ccd6e0] Stream #1: not enough frames to estimate rate; consider increasing probesize
Guessed Channel Layout for Input Stream #0.0 : stereo
Input #0, decklink, from 'Intensity Pro 4K@20':
Duration: N/A, start: 0.000000, bitrate: 1536 kb/s
Stream #0:0: Audio: pcm_s16le, 48000 Hz, 2 channels, s16, 1536 kb/s
Stream #0:1: Video: rawvideo (UYVY / 0x59565955), uyvy422, 3840x2160, -5 kb/s, 29.97 tbr, 1000k tbn, 29.97 tbc
Codec AVOption crf (Select the quality for constant quality mode) specified for output file #0 (output.mp4) has not been used for any stream. The most likely reason is either wrong type (e.g. a video option with no video streams) or that it is a private option of some encoder which was not actually used for any stream.
File 'output.mp4' already exists. Overwrite ? [y/N] y
Output #0, mp4, to 'output.mp4':
Metadata:
encoder : Lavf57.14.100
Stream #0:0: Video: h264 (nvenc) ([33][0][0][0] / 0x0021), yuv420p, 3840x2160, q=-1--1, 100000 kb/s, 29.97 fps, 30k tbn, 29.97 tbc
Metadata:
encoder : Lavc57.15.100 nvenc
Stream #0:1: Audio: aac ([64][0][0][0] / 0x0040), 48000 Hz, stereo, fltp, 128 kb/s
Metadata:
encoder : Lavc57.15.100 aac
Stream mapping:
Stream #0:1 -> #0:0 (rawvideo (native) -> h264 (nvenc))
Stream #0:0 -> #0:1 (pcm_s16le (native) -> aac (native))
Press [q] to stop, [?] for help
[decklink @ 0x1ccd6e0] Decklink input buffer overrun!:03.15 bitrate=70411.7kbits/s
Last message repeated 1 times
[decklink @ 0x1ccd6e0] Decklink input buffer overrun!:03.54 bitrate=73110.9kbits/s
Last message repeated 20 times
[decklink @ 0x1ccd6e0] Decklink input buffer overrun!:03.92 bitrate=76270.2kbits/s
Last message repeated 15 times
[decklink @ 0x1ccd6e0] Decklink input buffer overrun!:04.28 bitrate=78367.6kbits/s
Last message repeated 61 times
frame= 140 fps= 22 q=-0.0 Lsize= 57266kB time=00:00:04.67 bitrate=100425.2kbits/s
video:57187kB audio:72kB subtitle:0kB other streams:0kB global headers:0kB muxing overhead: 0.009844%
[decklink @ 0x1ccd6e0] Decklink input buffer overrun!
Last message repeated 7 times
[aac @ 0x1cd7020] Qavg: 215.556
real 0m8.808s
user 0m5.785s
sys 0m1.749s
Some insight into this would be appreciated, even just some commands that might fix the issue.
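
One small observation on the command itself, separate from the overrun (which looks like the 4K UYVY frames arriving faster than nvenc plus the disk can drain them): the log explicitly reports that the crf option "has not been used for any stream"; crf is a private option of encoders such as libx264, and this nvenc build ignores it, so rate control here comes from the 100M bitrate target alone. A tidied-up variant under that assumption would simply drop it:
time ffmpeg -f decklink -i "Intensity Pro 4K@20" -c:v nvenc -b:v 100M -vf "yadif=0:-1:0" -pix_fmt yuv420p -strict -2 output.mp4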