
Advanced search
Media (91)
-
Valkaama DVD Cover Outside
4 October 2011, by
Updated: October 2011
Language: English
Type: Image
-
Valkaama DVD Label
4 October 2011, by
Updated: February 2013
Language: English
Type: Image
-
Valkaama DVD Cover Inside
4 October 2011, by
Updated: October 2011
Language: English
Type: Image
-
1,000,000
27 September 2011, by
Updated: September 2011
Language: English
Type: Audio
-
Demon Seed
26 September 2011, by
Updated: September 2011
Language: English
Type: Audio
-
The Four of Us are Dying
26 September 2011, by
Updated: September 2011
Language: English
Type: Audio
Other articles (103)
-
Websites made with MediaSPIP
2 May 2011, by
This page lists some websites based on MediaSPIP.
-
Use, discuss, criticize
13 April 2011, by
Talk to people directly involved in MediaSPIP's development, or to people around you who could use MediaSPIP to share, enhance or develop their creative projects.
The bigger the community, the more MediaSPIP’s potential will be explored and the faster the software will evolve.
A discussion list is available for all exchanges between users.
-
The plugin: Podcasts
14 July 2010, by
The problem with podcasting is once again one that highlights the standardization of data transport on the Internet.
Two interesting formats exist: the one developed by Apple, heavily geared toward use with iTunes, whose SPEC is here; and the "Media RSS Module" format, which is more "open" and notably backed by Yahoo and the Miro software.
File types supported in feeds
Apple's format only allows the following formats in its feeds: .mp3 audio/mpeg .m4a audio/x-m4a .mp4 (...)
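To make the MIME-type constraint concrete, here is a small illustrative sketch (not from the article; the episode URL and metadata are invented) that builds an RSS item whose enclosure uses one of the types Apple's format accepts:

```csharp
using System;
using System.Xml.Linq;

class PodcastEnclosureSketch
{
    static void Main()
    {
        // Hypothetical episode; Apple's podcast feeds only accept a short list of
        // enclosure MIME types, e.g. audio/mpeg for .mp3 or audio/x-m4a for .m4a.
        var item = new XElement("item",
            new XElement("title", "Episode 1"),
            new XElement("enclosure",
                new XAttribute("url", "https://example.org/podcasts/episode1.mp3"),
                new XAttribute("length", 12345678),
                new XAttribute("type", "audio/mpeg")));

        Console.WriteLine(item);
    }
}
```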
On other sites (6894)
-
Play video using MSE (Media Source Extensions) in Google Chrome
23 August 2019, by liyuqihxc
I'm working on a project that converts an RTSP stream (ffmpeg) and plays it on a web page (SignalR + MSE).
So far it works pretty much as I expected on the latest versions of Edge and Firefox, but not Chrome.
Here's the code:
public class WebmMediaStreamContext
{
private Process _ffProcess;
private readonly string _cmd;
private byte[] _initSegment;
private Task _readMediaStreamTask;
private CancellationTokenSource _cancellationTokenSource;
private const string _CmdTemplate = "-i {0} -c:v libvpx -tile-columns 4 -frame-parallel 1 -keyint_min 90 -g 90 -f webm -dash 1 pipe:";
public static readonly byte[] ClusterStart = { 0x1F, 0x43, 0xB6, 0x75, 0x01, 0x00, 0x00, 0x00 };
public event EventHandler<ClusterReadyEventArgs> ClusterReadyEvent;
public WebmMediaStreamContext(string rtspFeed)
{
_cmd = string.Format(_CmdTemplate, rtspFeed);
}
public async Task StartConverting()
{
if (_ffProcess != null)
throw new InvalidOperationException();
_ffProcess = new Process();
_ffProcess.StartInfo = new ProcessStartInfo
{
FileName = "ffmpeg/ffmpeg.exe",
Arguments = _cmd,
UseShellExecute = false,
CreateNoWindow = true,
RedirectStandardOutput = true
};
_ffProcess.Start();
_initSegment = await ParseInitSegmentAndStartReadMediaStream();
}
public byte[] GetInitSegment()
{
return _initSegment;
}
// Find the first cluster, and everything before it is the InitSegment
private async Task ParseInitSegmentAndStartReadMediaStream()
{
Memory<byte> buffer = new byte[10 * 1024];
int length = 0;
while (length != buffer.Length)
{
length += await _ffProcess.StandardOutput.BaseStream.ReadAsync(buffer.Slice(length));
int cluster = buffer.Span.IndexOf(ClusterStart);
if (cluster >= 0)
{
_cancellationTokenSource = new CancellationTokenSource();
_readMediaStreamTask = new Task(() => ReadMediaStreamProc(buffer.Slice(cluster, length - cluster).ToArray(), _cancellationTokenSource.Token), _cancellationTokenSource.Token, TaskCreationOptions.LongRunning);
_readMediaStreamTask.Start();
return buffer.Slice(0, cluster).ToArray();
}
}
throw new InvalidOperationException();
}
private void ReadMoreBytes(Span<byte> buffer)
{
int size = buffer.Length;
while (size > 0)
{
int len = _ffProcess.StandardOutput.BaseStream.Read(buffer.Slice(buffer.Length - size));
size -= len;
}
}
// Parse every single cluster and fire ClusterReadyEvent
private void ReadMediaStreamProc(byte[] bytesRead, CancellationToken cancel)
{
Span<byte> buffer = new byte[5 * 1024 * 1024];
bytesRead.CopyTo(buffer);
int bufferEmptyIndex = bytesRead.Length;
do
{
if (bufferEmptyIndex < ClusterStart.Length + 4)
{
ReadMoreBytes(buffer.Slice(bufferEmptyIndex, 1024));
bufferEmptyIndex += 1024;
}
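// The 4 bytes after the cluster ID hold the cluster data size, stored big-endian, hence the Reverse() before ToInt32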
int clusterDataSize = BitConverter.ToInt32(
buffer.Slice(ClusterStart.Length, 4)
.ToArray()
.Reverse()
.ToArray()
);
int clusterSize = ClusterStart.Length + 4 + clusterDataSize;
if (clusterSize > buffer.Length)
{
byte[] newBuffer = new byte[clusterSize];
buffer.Slice(0, bufferEmptyIndex).CopyTo(newBuffer);
buffer = newBuffer;
}
if (bufferEmptyIndex < clusterSize)
{
ReadMoreBytes(buffer.Slice(bufferEmptyIndex, clusterSize - bufferEmptyIndex));
bufferEmptyIndex = clusterSize;
}
ClusterReadyEvent?.Invoke(this, new ClusterReadyEventArgs(buffer.Slice(0, bufferEmptyIndex).ToArray()));
bufferEmptyIndex = 0;
} while (!cancel.IsCancellationRequested);
}
}
I use ffmpeg to convert the RTSP stream to a VP8 WebM byte stream and parse it into an "Init Segment" (EBML head, info, tracks, ...) and "Media Segments" (clusters), then send them to the browser via SignalR.
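The server-side SignalR hub isn't shown in the question. As a rough, hypothetical sketch only (the hub class name, the ClusterBytes property and the dependency injection of WebmMediaStreamContext are my assumptions, and it presumes ASP.NET Core SignalR 3.0+ with IAsyncEnumerable streaming), the hub feeding the client might look roughly like this:

```csharp
using System;
using System.Collections.Generic;
using System.Runtime.CompilerServices;
using System.Threading;
using System.Threading.Channels;
using Microsoft.AspNetCore.SignalR;

public class VideoHub : Hub
{
    private readonly WebmMediaStreamContext _context;

    // Assumes WebmMediaStreamContext is registered as a singleton and already converting.
    public VideoHub(WebmMediaStreamContext context) => _context = context;

    // Streams the init segment first, then every cluster, as base64 strings,
    // which is what the browser code below decodes with atob().
    public async IAsyncEnumerable<string> InitVideoReceive(
        [EnumeratorCancellation] CancellationToken cancellationToken)
    {
        yield return Convert.ToBase64String(_context.GetInitSegment());

        var clusters = Channel.CreateUnbounded<byte[]>();

        // "ClusterBytes" is a guess at the payload property of ClusterReadyEventArgs.
        void OnCluster(object sender, ClusterReadyEventArgs e) =>
            clusters.Writer.TryWrite(e.ClusterBytes);

        _context.ClusterReadyEvent += OnCluster;
        try
        {
            await foreach (var cluster in clusters.Reader.ReadAllAsync(cancellationToken))
                yield return Convert.ToBase64String(cluster);
        }
        finally
        {
            _context.ClusterReadyEvent -= OnCluster;
        }
    }
}
```

With something along these lines mapped at /signalr-video (the endpoint the client's withUrl call points at), the path from ffmpeg's stdout to the SourceBuffer is complete. The client-side JavaScript that consumes this stream follows.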
$(function () {
var mediaSource = new MediaSource();
var mimeCodec = 'video/webm; codecs="vp8"';
var video = document.getElementById('video');
mediaSource.addEventListener('sourceopen', callback, false);
function callback(e) {
var sourceBuffer = mediaSource.addSourceBuffer(mimeCodec);
var queue = [];
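// Once the previous append finishes, feed the next queued base64 segment; an empty string marks end of stream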
sourceBuffer.addEventListener('updateend', function () {
if (queue.length === 0) {
return;
}
var base64 = queue[0];
if (base64.length === 0) {
mediaSource.endOfStream();
queue.shift();
return;
} else {
var buffer = new Uint8Array(atob(base64).split("").map(function (c) {
return c.charCodeAt(0);
}));
sourceBuffer.appendBuffer(buffer);
queue.shift();
}
}, false);
var connection = new signalR.HubConnectionBuilder()
.withUrl("/signalr-video")
.configureLogging(signalR.LogLevel.Information)
.build();
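// Subscribe to the server-side stream: append directly while the SourceBuffer is idle, otherwise queue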
connection.start().then(function () {
connection.stream("InitVideoReceive")
.subscribe({
next: function(item) {
if (queue.length === 0 && !!!sourceBuffer.updating) {
var buffer = new Uint8Array(atob(item).split("").map(function (c) {
return c.charCodeAt(0);
}));
sourceBuffer.appendBuffer(buffer);
console.log(blockindex++ + " : " + buffer.byteLength);
} else {
queue.push(item);
}
},
complete: function () {
queue.push('');
},
error: function (err) {
console.error(err);
}
});
});
}
video.src = window.URL.createObjectURL(mediaSource);
});
Chrome plays the video for only 3-5 seconds and then stops to buffer, even though plenty of clusters have been transferred and appended to the SourceBuffer.
Here's the information from chrome://media-internals/:
Player Properties :
render_id: 217
player_id: 1
origin_url: http://localhost:52531/
frame_url: http://localhost:52531/
frame_title: Home Page
url: blob:http://localhost:52531/dcb25d89-9830-40a5-ba88-33c13b5c03eb
info: Selected FFmpegVideoDecoder for video decoding, config: codec: vp8 format: 1 profile: vp8 coded size: [1280,720] visible rect: [0,0,1280,720] natural size: [1280,720] has extra data? false encryption scheme: Unencrypted rotation: 0°
pipeline_state: kSuspended
found_video_stream: true
video_codec_name: vp8
video_dds: false
video_decoder: FFmpegVideoDecoder
duration: unknown
height: 720
width: 1280
video_buffering_state: BUFFERING_HAVE_NOTHING
for_suspended_start: false
pipeline_buffering_state: BUFFERING_HAVE_NOTHING
event: PAUSE
Log
Timestamp Property Value
00:00:00 00 origin_url http://localhost:52531/
00:00:00 00 frame_url http://localhost:52531/
00:00:00 00 frame_title Home Page
00:00:00 00 url blob:http://localhost:52531/dcb25d89-9830-40a5-ba88-33c13b5c03eb
00:00:00 00 info ChunkDemuxer: buffering by DTS
00:00:00 35 pipeline_state kStarting
00:00:15 213 found_video_stream true
00:00:15 213 video_codec_name vp8
00:00:15 216 video_dds false
00:00:15 216 video_decoder FFmpegVideoDecoder
00:00:15 216 info Selected FFmpegVideoDecoder for video decoding, config: codec: vp8 format: 1 profile: vp8 coded size: [1280,720] visible rect: [0,0,1280,720] natural size: [1280,720] has extra data? false encryption scheme: Unencrypted rotation: 0°
00:00:15 216 pipeline_state kPlaying
00:00:15 213 duration unknown
00:00:16 661 height 720
00:00:16 661 width 1280
00:00:16 665 video_buffering_state BUFFERING_HAVE_ENOUGH
00:00:16 665 for_suspended_start false
00:00:16 665 pipeline_buffering_state BUFFERING_HAVE_ENOUGH
00:00:16 667 pipeline_state kSuspending
00:00:16 670 pipeline_state kSuspended
00:00:52 759 info Effective playback rate changed from 0 to 1
00:00:52 759 event PLAY
00:00:52 759 pipeline_state kResuming
00:00:52 760 video_dds false
00:00:52 760 video_decoder FFmpegVideoDecoder
00:00:52 760 info Selected FFmpegVideoDecoder for video decoding, config: codec: vp8 format: 1 profile: vp8 coded size: [1280,720] visible rect: [0,0,1280,720] natural size: [1280,720] has extra data? false encryption scheme: Unencrypted rotation: 0°
00:00:52 760 pipeline_state kPlaying
00:00:52 793 height 720
00:00:52 793 width 1280
00:00:52 798 video_buffering_state BUFFERING_HAVE_ENOUGH
00:00:52 798 for_suspended_start false
00:00:52 798 pipeline_buffering_state BUFFERING_HAVE_ENOUGH
00:00:56 278 video_buffering_state BUFFERING_HAVE_NOTHING
00:00:56 295 for_suspended_start false
00:00:56 295 pipeline_buffering_state BUFFERING_HAVE_NOTHING
00:01:20 717 event PAUSE
00:01:33 538 event PLAY
00:01:35 94 event PAUSE
00:01:55 561 pipeline_state kSuspending
00:01:55 563 pipeline_state kSuspended
Can someone tell me what's wrong with my code, or does Chrome require some magic configuration to work?
Thanks
Please excuse my English :)
-
Unable to Access IP Camera in OpenCV
3 January 2017, by user7258890
I'm trying to use JavaFX and OpenCV to access a webcam (Axis M1013) over wireless to run vision processing for my FRC team. When I run my code, I can access the GUI that I made using Scene Builder, but when I try to start the camera, the program crashes. It seems to me that it's having trouble using the VideoCapture class. I know for a fact that my problem is not with the camera, because I can access the feed through a browser and my laptop's webcam works fine. I'm using OpenCV version 2.4.13, JDK 8u101 x64, and ffmpeg 3.2.
I have looked at other posts at:
stackoverflow.com/questions/18625948/opencv-java-unsatisfiedlinkerror
answers.opencv.org/question/21720/java-webcam-capture
answers.opencv.org/question/20071/unsatisfiedlinkerror-given-by-highghuiimread-on-java
I have tried the solutions mentioned in all of these. So far, none have resolved my problem.
My code is based mostly on the tutorials at:
opencv-java-tutorials.readthedocs.io/en/latest/03-first-javafx-application-with-opencv.html
Here is my main Java code:
@Override
public void start(Stage primaryStage) {
System.load("C:\\opencv\\build\\x64\\vc12\\bin\\opencv_ffmpeg_64.dll");
try {
FXMLLoader loader = new FXMLLoader(getClass().getResource("Sample.fxml"));
BorderPane root = (BorderPane) loader.load();
Scene scene = new Scene(root, 400, 400);
scene.getStylesheets().add(getClass().getResource("application.css").toExternalForm());
primaryStage.setScene(scene);
primaryStage.show();
} catch(Exception e) {
System.out.println("error opening camera gui");
}
}
public static void main(String[] args) {
launch(args);
}And here is my camera controller :
public class SampleController {
@FXML
private Button Start_btn;
@FXML
private ImageView currentFrame;
private ScheduledExecutorService timer;
private VideoCapture capture;
private boolean cameraActive = false;
@FXML
public void startCamera()
{
System.loadLibrary("opencv_ffmpeg_64");
if (!this.cameraActive)
{
try
{
capture = new VideoCapture("http://FRC:FRC@axis-camera-223-catapult.lan:554/mjpg/1/video.mjpg");
}
catch (Exception e)
{
System.out.println("an error occured when attempting to access camera.");
}
// is the video stream available?
if (this.capture.isOpened())
{
this.cameraActive = true;
// grab a frame every 33 ms (30 frames/sec)
Runnable frameGrabber = new Runnable() {
@Override
public void run()
{
// effectively grab and process a single frame
Mat frame = grabFrame();
// convert and show the frame
Image imageToShow = Utils.mat2Image(frame);
updateImageView(currentFrame, imageToShow);
}
};
this.timer = Executors.newSingleThreadScheduledExecutor();
this.timer.scheduleAtFixedRate(frameGrabber, 0, 33, TimeUnit.MILLISECONDS);
// update the button content
this.Start_btn.setText("Stop Camera");
}
else
{
// log the error
System.err.println("Impossible to open the camera connection...");
}
}
else
{
// the camera is not active at this point
this.cameraActive = false;
// update again the button content
this.Start_btn.setText("Start Camera");
// stop the timer
this.stopAquisition();
}
System.out.println("Camera is now on.");
}
/**
* Get frame from open video
* @return
*/
private Mat grabFrame()
{
Mat frame = new Mat();
if (this.capture.isOpened())
{
try
{
this.capture.read(frame);
}
catch (Exception e)
{
System.err.println("Exception during the image elaboration: " + e);
}
}
return frame;
}
private void stopAquisition()
{
if (this.timer!=null && !this.timer.isShutdown())
{
try
{
this.timer.shutdown();
this.timer.awaitTermination(33, TimeUnit.MILLISECONDS);
}
catch (InterruptedException e)
{
System.err.println("Exception in stopping the frame capture, trying to release camera now..." + e);
}
}
if (this.capture.isOpened())
{
this.capture.release();
}
}
private void updateImageView(ImageView view, Image image)
{
Utils.onFXThread(view.imageProperty(), image);
}
protected void setClosed()
{
this.stopAquisition();
}
}
When I try to run this, as I have said before, the GUI launches, but when I try to open the camera, I get an error message:
Exception in thread "JavaFX Application Thread" java.lang.RuntimeException: java.lang.reflect.InvocationTargetException
at javafx.fxml.FXMLLoader$MethodHandler.invoke(FXMLLoader.java:1774)
at javafx.fxml.FXMLLoader$ControllerMethodEventHandler.handle(FXMLLoader.java:1657)
at com.sun.javafx.event.CompositeEventHandler.dispatchBubblingEvent(CompositeEventHandler.java:86)
at com.sun.javafx.event.EventHandlerManager.dispatchBubblingEvent(EventHandlerManager.java:238)
at com.sun.javafx.event.EventHandlerManager.dispatchBubblingEvent(EventHandlerManager.java:191)
at com.sun.javafx.event.CompositeEventDispatcher.dispatchBubblingEvent(CompositeEventDispatcher.java:59)
at com.sun.javafx.event.BasicEventDispatcher.dispatchEvent(BasicEventDispatcher.java:58)
at com.sun.javafx.event.EventDispatchChainImpl.dispatchEvent(EventDispatchChainImpl.java:114)
at com.sun.javafx.event.BasicEventDispatcher.dispatchEvent(BasicEventDispatcher.java:56)
at com.sun.javafx.event.EventDispatchChainImpl.dispatchEvent(EventDispatchChainImpl.java:114)
at com.sun.javafx.event.BasicEventDispatcher.dispatchEvent(BasicEventDispatcher.java:56)
at com.sun.javafx.event.EventDispatchChainImpl.dispatchEvent(EventDispatchChainImpl.java:114)
at com.sun.javafx.event.EventUtil.fireEventImpl(EventUtil.java:74)
at com.sun.javafx.event.EventUtil.fireEvent(EventUtil.java:49)
at javafx.event.Event.fireEvent(Event.java:198)
at javafx.scene.Node.fireEvent(Node.java:8411)
at javafx.scene.control.Button.fire(Button.java:185)
at com.sun.javafx.scene.control.behavior.ButtonBehavior.mouseReleased(ButtonBehavior.java:182)
at com.sun.javafx.scene.control.skin.BehaviorSkinBase$1.handle(BehaviorSkinBase.java:96)
at com.sun.javafx.scene.control.skin.BehaviorSkinBase$1.handle(BehaviorSkinBase.java:89)
at com.sun.javafx.event.CompositeEventHandler$NormalEventHandlerRecord.handleBubblingEvent(CompositeEventHandler.java:218)
at com.sun.javafx.event.CompositeEventHandler.dispatchBubblingEvent(CompositeEventHandler.java:80)
at com.sun.javafx.event.EventHandlerManager.dispatchBubblingEvent(EventHandlerManager.java:238)
at com.sun.javafx.event.EventHandlerManager.dispatchBubblingEvent(EventHandlerManager.java:191)
at com.sun.javafx.event.CompositeEventDispatcher.dispatchBubblingEvent(CompositeEventDispatcher.java:59)
at com.sun.javafx.event.BasicEventDispatcher.dispatchEvent(BasicEventDispatcher.java:58)
at com.sun.javafx.event.EventDispatchChainImpl.dispatchEvent(EventDispatchChainImpl.java:114)
at com.sun.javafx.event.BasicEventDispatcher.dispatchEvent(BasicEventDispatcher.java:56)
at com.sun.javafx.event.EventDispatchChainImpl.dispatchEvent(EventDispatchChainImpl.java:114)
at com.sun.javafx.event.BasicEventDispatcher.dispatchEvent(BasicEventDispatcher.java:56)
at com.sun.javafx.event.EventDispatchChainImpl.dispatchEvent(EventDispatchChainImpl.java:114)
at com.sun.javafx.event.EventUtil.fireEventImpl(EventUtil.java:74)
at com.sun.javafx.event.EventUtil.fireEvent(EventUtil.java:54)
at javafx.event.Event.fireEvent(Event.java:198)
at javafx.scene.Scene$MouseHandler.process(Scene.java:3757)
at javafx.scene.Scene$MouseHandler.access$1500(Scene.java:3485)
at javafx.scene.Scene.impl_processMouseEvent(Scene.java:1762)
at javafx.scene.Scene$ScenePeerListener.mouseEvent(Scene.java:2494)
at com.sun.javafx.tk.quantum.GlassViewEventHandler$MouseEventNotification.run(GlassViewEventHandler.java:380)
at com.sun.javafx.tk.quantum.GlassViewEventHandler$MouseEventNotification.run(GlassViewEventHandler.java:294)
at java.security.AccessController.doPrivileged(Native Method)
at com.sun.javafx.tk.quantum.GlassViewEventHandler.lambda$handleMouseEvent$354(GlassViewEventHandler.java:416)
at com.sun.javafx.tk.quantum.QuantumToolkit.runWithoutRenderLock(QuantumToolkit.java:389)
at com.sun.javafx.tk.quantum.GlassViewEventHandler.handleMouseEvent(GlassViewEventHandler.java:415)
at com.sun.glass.ui.View.handleMouseEvent(View.java:555)
at com.sun.glass.ui.View.notifyMouse(View.java:937)
at com.sun.glass.ui.win.WinApplication._runLoop(Native Method)
at com.sun.glass.ui.win.WinApplication.lambda$null$148(WinApplication.java:191)
at java.lang.Thread.run(Unknown Source)
Caused by: java.lang.reflect.InvocationTargetException
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
at java.lang.reflect.Method.invoke(Unknown Source)
at sun.reflect.misc.Trampoline.invoke(Unknown Source)
at sun.reflect.GeneratedMethodAccessor1.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
at java.lang.reflect.Method.invoke(Unknown Source)
at sun.reflect.misc.MethodUtil.invoke(Unknown Source)
at javafx.fxml.FXMLLoader$MethodHandler.invoke(FXMLLoader.java:1771)
... 48 more
Caused by: java.lang.UnsatisfiedLinkError: org.opencv.highgui.VideoCapture.VideoCapture_1(Ljava/lang/String;)J
at org.opencv.highgui.VideoCapture.VideoCapture_1(Native Method)
at org.opencv.highgui.VideoCapture.<init>(VideoCapture.java:128)
at jfxtest1.SampleController.startCamera(SampleController.java:36)
... 58 more
If anyone can help me, that would be much appreciated.
Thanks in advance.
-
avcodec/ac3dec: only export matrix encoding and downmix info side data when necessary
16 January, by James Almer
avcodec/ac3dec: only export matrix encoding and downmix info side data when necessary
Don't export matrix encoding side data when there's none signaled.
And if downmixing was handled by the decoder itself, then the downmix info does not apply to the frame.
Signed-off-by: James Almer <jamrial@gmail.com>