
Media (3)
-
Example of action buttons for a collaborative collection
27 February 2013, by
Updated: March 2013
Language: French
Type: Image
-
Example of action buttons for a personal collection
27 February 2013, by
Updated: February 2013
Language: English
Type: Image
-
Collections - Quick creation form
19 February 2013, by
Updated: February 2013
Language: French
Type: Image
Other articles (103)
-
Customize by adding your logo, banner, or background image
5 September 2013, by
Some themes support three customization elements: adding a logo; adding a banner; adding a background image.
-
Write a news item
21 June 2013, by
Present the changes in your MédiaSPIP, or the news about your projects, through the news section.
In MédiaSPIP's default spipeo theme, news items are displayed at the bottom of the main page, below the editorials.
You can customize the news-item creation form.
News-item creation form: for a document of type "news item", the default fields are: publication date (customize the publication date) (...)
-
Publishing on MédiaSpip
13 June 2013
Can I post content from an iPad tablet?
Yes, if your installed Médiaspip is at version 0.2 or later. If needed, contact your MédiaSpip administrator to find out.
On other sites (5965)
-
FFmpeg dumping an RTSP stream works on Windows but not in Ubuntu
9 August 2016, by cuentafalsa7
I have a camera with an RTSP server. When I try to save the stream, FFmpeg works on Windows; on Ubuntu it doesn't.
The command on Windows is:
ffmpeg.exe -i "rtsp://192.168.1.10:1236/?videoapi=mc&h264=1000-20-1280-960" -r 20 test.mp4
On Linux:
ffmpeg -i "rtsp://192.168.1.10:1236/?videoapi=mc&h264=1000-20-1280-960" -r 20 test.mp4
The output on Linux is:
$ ffmpeg -i "rtsp://192.168.1.10:1236/?videoapi=mr&h264=1000-20-1280-960" -r 20 test.mp4
ffmpeg version 2.8.6-1ubuntu2 Copyright (c) 2000-2016 the FFmpeg developers
built with gcc 5.3.1 (Ubuntu 5.3.1-11ubuntu1) 20160311
configuration: --prefix=/usr --extra-version=1ubuntu2 --build-suffix=-ffmpeg --toolchain=hardened --libdir=/usr/lib/x86_64-linux-gnu --incdir=/usr/include/x86_64-linux-gnu --cc=cc --cxx=g++ --enable-gpl --enable-shared --disable-stripping --disable-decoder=libopenjpeg --disable-decoder=libschroedinger --enable-avresample --enable-avisynth --enable-gnutls --enable-ladspa --enable-libass --enable-libbluray --enable-libbs2b --enable-libcaca --enable-libcdio --enable-libflite --enable-libfontconfig --enable-libfreetype --enable-libfribidi --enable-libgme --enable-libgsm --enable-libmodplug --enable-libmp3lame --enable-libopenjpeg --enable-libopus --enable-libpulse --enable-librtmp --enable-libschroedinger --enable-libshine --enable-libsnappy --enable-libsoxr --enable-libspeex --enable-libssh --enable-libtheora --enable-libtwolame --enable-libvorbis --enable-libvpx --enable-libwavpack --enable-libwebp --enable-libx265 --enable-libxvid --enable-libzvbi --enable-openal --enable-opengl --enable-x11grab --enable-libdc1394 --enable-libiec61883 --enable-libzmq --enable-frei0r --enable-libx264 --enable-libopencv
libavutil 54. 31.100 / 54. 31.100
libavcodec 56. 60.100 / 56. 60.100
libavformat 56. 40.101 / 56. 40.101
libavdevice 56. 4.100 / 56. 4.100
libavfilter 5. 40.101 / 5. 40.101
libavresample 2. 1. 0 / 2. 1. 0
libswscale 3. 1.101 / 3. 1.101
libswresample 1. 2.101 / 1. 2.101
libpostproc 53. 3.100 / 53. 3.100
[rtsp @ 0x1f18340] UDP timeout, retrying with TCP
[rtsp @ 0x1f18340] Nonmatching transport in server reply
[rtsp @ 0x1f18340] Could not find codec parameters for stream 0 (Video: h264, none): unspecified size
Consider increasing the value for the 'analyzeduration' and 'probesize' options
rtsp://192.168.1.10:1236/?videoapi=mr&h264=1000-20-1280-960: could not find codec parameters
Input #0, rtsp, from 'rtsp://192.168.1.10:1236/?videoapi=mr&h264=1000-20-1280-960':
Metadata:
title : Unnamed
comment : N/A
Duration: N/A, bitrate: N/A
Stream #0:0: Video: h264, none, 90k tbr, 90k tbn, 180k tbc
Output #0, mp4, to 'test.mp4':
Output file #0 does not contain any stream
The ffmpeg.exe output is:
>ffmpeg.exe
ffmpeg version N-81300-gce2217b Copyright (c) 2000-2016 the FFmpeg developers
built with gcc 5.4.0 (GCC)
configuration: --enable-gpl --enable-version3 --disable-w32threads --enable-dxva2 --enable-libmfx --enable-nvenc --enable-avisynth --enable-bzlib --enable-libebur128 --enable-fontconfig --enable-frei0r --enable-gnutls --enable-iconv --enable-libass --enable-libbluray --enable-libbs2b --enable-libcaca --enable-libfreetype --enable-libgme --enable-libgsm --enable-libilbc --enable-libmodplug --enable-libmp3lame --enable-libopencore-amrnb --enable-libopencore-amrwb --enable-libopenjpeg --enable-libopus --enable-librtmp --enable-libschroedinger --enable-libsnappy --enable-libsoxr --enable-libspeex --enable-libtheora --enable-libtwolame --enable-libvidstab --enable-libvo-amrwbenc --enable-libvorbis --enable-libvpx --enable-libwavpack --enable-libwebp --enable-libx264 --enable-libx265 --enable-libxavs --enable-libxvid --enable-libzimg --enable-lzma --enable-decklink --enable-zlib
libavutil 55. 28.100 / 55. 28.100
libavcodec 57. 51.100 / 57. 51.100
libavformat 57. 46.100 / 57. 46.100
libavdevice 57. 0.102 / 57. 0.102
libavfilter 6. 50.100 / 6. 50.100
libswscale 4. 1.100 / 4. 1.100
libswresample 2. 1.100 / 2. 1.100
libpostproc 54. 0.100 / 54. 0.100
Hyper fast Audio and Video encoder
usage: ffmpeg [options] [[infile options] -i infile]... {[outfile options] outfile}...
Use -h to get full help or, even better, run 'man ffmpeg'
The ffmpeg output on Ubuntu is:
$ ffmpeg
ffmpeg version 2.8.6-1ubuntu2 Copyright (c) 2000-2016 the FFmpeg developers
built with gcc 5.3.1 (Ubuntu 5.3.1-11ubuntu1) 20160311
configuration: --prefix=/usr --extra-version=1ubuntu2 --build-suffix=-ffmpeg --toolchain=hardened --libdir=/usr/lib/x86_64-linux-gnu --incdir=/usr/include/x86_64-linux-gnu --cc=cc --cxx=g++ --enable-gpl --enable-shared --disable-stripping --disable-decoder=libopenjpeg --disable-decoder=libschroedinger --enable-avresample --enable-avisynth --enable-gnutls --enable-ladspa --enable-libass --enable-libbluray --enable-libbs2b --enable-libcaca --enable-libcdio --enable-libflite --enable-libfontconfig --enable-libfreetype --enable-libfribidi --enable-libgme --enable-libgsm --enable-libmodplug --enable-libmp3lame --enable-libopenjpeg --enable-libopus --enable-libpulse --enable-librtmp --enable-libschroedinger --enable-libshine --enable-libsnappy --enable-libsoxr --enable-libspeex --enable-libssh --enable-libtheora --enable-libtwolame --enable-libvorbis --enable-libvpx --enable-libwavpack --enable-libwebp --enable-libx265 --enable-libxvid --enable-libzvbi --enable-openal --enable-opengl --enable-x11grab --enable-libdc1394 --enable-libiec61883 --enable-libzmq --enable-frei0r --enable-libx264 --enable-libopencv
libavutil 54. 31.100 / 54. 31.100
libavcodec 56. 60.100 / 56. 60.100
libavformat 56. 40.101 / 56. 40.101
libavdevice 56. 4.100 / 56. 4.100
libavfilter 5. 40.101 / 5. 40.101
libavresample 2. 1. 0 / 2. 1. 0
libswscale 3. 1.101 / 3. 1.101
libswresample 1. 2.101 / 1. 2.101
libpostproc 53. 3.100 / 53. 3.100
Hyper fast Audio and Video encoder
usage: ffmpeg [options] [[infile options] -i infile]... {[outfile options] outfile}...
Use -h to get full help or, even better, run 'man ffmpeg'
I'm using Windows 7 64-bit and Ubuntu 16.04.
I have tried increasing the suggested values, but that didn't work either.
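(For reference, an attempt along those lines, raising the probing limits and forcing TCP transport since the log above shows a UDP timeout followed by a transport mismatch, would look roughly like the command below; the values are only illustrative and are not taken from the question.)
ffmpeg -rtsp_transport tcp -analyzeduration 10000000 -probesize 10000000 -i "rtsp://192.168.1.10:1236/?videoapi=mc&h264=1000-20-1280-960" -r 20 test.mp4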
Any idea, other than switching to Windows?
-
FFMPEG HLS Multiple audio languages with var_stream_map - EXT-X-MEDIA:TYPE doesn't contain LANGUAGE
30 January 2020, by Moonsurfer_1
I have a question involving multiple audio languages on HLS with FFmpeg.
I'm currently using the following command to mux a transport stream with multiple audio languages into an HLS stream with a master playlist:
ffmpeg -re -i $INPUT_URL -map 0:v -c:v copy -map 0:a -c:a copy -f hls -hls_time 6 -hls_list_size 10 -hls_flags delete_segments+program_date_time -hls_segment_filename "$FULL_OUTPUT_FOLDER/stream_%v_%d.ts" -var_stream_map "v:0,agroup:groupname a:0,agroup:groupname a:1,agroup:groupname a:2,agroup:groupname a:3,agroup:groupname" -master_pl_name master.m3u8 $FULL_OUTPUT_FOLDER/stream-%v.m3u8
The output of this command looks like this:
ffmpeg version 4.0.3-1~16.04.york0 Copyright (c) 2000-2018 the FFmpeg developers
built with gcc 5.4.0 (Ubuntu 5.4.0-6ubuntu1~16.04.10) 20160609
configuration: --prefix=/usr --extra-version='1~16.04.york0' --toolchain=hardened --libdir=/usr/lib/x86_64-linux-gnu --incdir=/usr/include/x86_64-linux-gnu --arch=amd64 --enable-gpl --disable-stripping --enable-avresample --disable-filter=resample --enable-avisynth --enable-gnutls --enable-ladspa --enable-libaom --enable-libass --enable-libbluray --enable-libbs2b --enable-libcaca --enable-libcdio --enable-libcodec2 --enable-libflite --enable-libfontconfig --enable-libfreetype --enable-libfribidi --enable-libgme --enable-libgsm --enable-libjack --enable-libmp3lame --enable-libmysofa --enable-libopenjpeg --enable-libopenmpt --enable-libopus --enable-libpulse --enable-librsvg --enable-librubberband --enable-libshine --enable-libsnappy --enable-libsoxr --enable-libspeex --enable-libssh --enable-libtheora --enable-libtwolame --enable-libvidstab --enable-libvorbis --enable-libvpx --enable-libwavpack --enable-libwebp --enable-libx265 --enable-libxml2 --enable-libxvid --enable-libzmq --enable-libzvbi --enable-lv2 --enable-omx --enable-openal --enable-opengl --enable-sdl2 --enable-libdc1394 --enable-libdrm --enable-libiec61883 --enable-chromaprint --enable-frei0r --enable-libopencv --enable-libx264 --enable-shared
libavutil 56. 14.100 / 56. 14.100
libavcodec 58. 18.100 / 58. 18.100
libavformat 58. 12.100 / 58. 12.100
libavdevice 58. 3.100 / 58. 3.100
libavfilter 7. 16.100 / 7. 16.100
libavresample 4. 0. 0 / 4. 0. 0
libswscale 5. 1.100 / 5. 1.100
libswresample 3. 1.100 / 3. 1.100
libpostproc 55. 1.100 / 55. 1.100
[mpegts @ 0x55c8cb7fee40] start time for stream 5 is not set in estimate_timings_from_pts
[mpegts @ 0x55c8cb7fee40] start time for stream 6 is not set in estimate_timings_from_pts
[mpegts @ 0x55c8cb7fee40] start time for stream 7 is not set in estimate_timings_from_pts
Input #0, mpegts, from '/home/user/Videos/output/example.ts':
Duration: 00:37:50.26, start: 1.498667, bitrate: 1257 kb/s
Program 1
Metadata:
service_name : example
service_provider: FFmpeg
Stream #0:0[0x100]: Video: h264 (High) ([27][0][0][0] / 0x001B), yuv420p(progressive), 696x572 [SAR 64:45 DAR 3712:2145], 25 fps, 25 tbr, 90k tbn, 50 tbc
Stream #0:1[0x101](eng): Audio: aac (LC) ([15][0][0][0] / 0x000F), 48000 Hz, stereo, fltp, 159 kb/s
Stream #0:2[0x102](fra): Audio: aac (LC) ([15][0][0][0] / 0x000F), 48000 Hz, stereo, fltp, 126 kb/s
Stream #0:3[0x103](ita): Audio: aac (LC) ([15][0][0][0] / 0x000F), 48000 Hz, stereo, fltp, 132 kb/s
Stream #0:4[0x104](eng): Audio: aac (LC) ([15][0][0][0] / 0x000F), 48000 Hz, stereo, fltp, 132 kb/s
Stream #0:5[0x105](eng): Subtitle: dvb_subtitle ([6][0][0][0] / 0x0006)
Stream #0:6[0x106](fra): Subtitle: dvb_subtitle ([6][0][0][0] / 0x0006)
Stream #0:7[0x107](deu): Subtitle: dvb_subtitle ([6][0][0][0] / 0x0006)
[hls @ 0x55c8cb834980] Opening '/var/www/html/live/stream_0_0.ts' for writing
[hls @ 0x55c8cb834980] Opening '/var/www/html/live/stream_1_0.ts' for writing
[hls @ 0x55c8cb834980] Opening '/var/www/html/live/stream_2_0.ts' for writing
[hls @ 0x55c8cb834980] Opening '/var/www/html/live/stream_3_0.ts' for writing
[hls @ 0x55c8cb834980] Opening '/var/www/html/live/stream_4_0.ts' for writing
[mpegts @ 0x55c8cb880380] frame size not set
[mpegts @ 0x55c8cb882040] frame size not set
[mpegts @ 0x55c8cb8837c0] frame size not set
[mpegts @ 0x55c8cb885280] frame size not set
Output #0, hls, to '/var/www/html/live/stream-%v.m3u8':
Metadata:
encoder : Lavf58.12.100
Stream #0:0: Video: h264 (High) ([27][0][0][0] / 0x001B), yuv420p(progressive), 696x572 [SAR 64:45 DAR 3712:2145], q=2-31, 25 fps, 25 tbr, 90k tbn, 25 tbc
Stream #0:1(eng): Audio: aac (LC) ([15][0][0][0] / 0x000F), 48000 Hz, stereo, fltp, 159 kb/s
Stream #0:2(fra): Audio: aac (LC) ([15][0][0][0] / 0x000F), 48000 Hz, stereo, fltp, 126 kb/s
Stream #0:3(ita): Audio: aac (LC) ([15][0][0][0] / 0x000F), 48000 Hz, stereo, fltp, 132 kb/s
Stream #0:4(eng): Audio: aac (LC) ([15][0][0][0] / 0x000F), 48000 Hz, stereo, fltp, 132 kb/s
Stream mapping:
Stream #0:0 -> #0:0 (copy)
Stream #0:1 -> #0:1 (copy)
Stream #0:2 -> #0:2 (copy)
Stream #0:3 -> #0:3 (copy)
Stream #0:4 -> #0:4 (copy)
Press [q] to stop, [?] for help
[hls @ 0x55c8cb834980] Opening '/var/www/html/live/stream_1_1.ts' for writing
[hls @ 0x55c8cb834980] Opening '/var/www/html/live/stream-1.m3u8.tmp' for writing
[hls @ 0x55c8cb834980] Opening '/var/www/html/live/stream_2_1.ts' for writing
[hls @ 0x55c8cb834980] Opening '/var/www/html/live/stream-2.m3u8.tmp' for writing
[hls @ 0x55c8cb834980] Opening '/var/www/html/live/stream_3_1.ts' for writing
[hls @ 0x55c8cb834980] Opening '/var/www/html/live/stream-3.m3u8.tmp' for writing
[hls @ 0x55c8cb834980] Opening '/var/www/html/live/stream_4_1.ts' for writing
[hls @ 0x55c8cb834980] Opening '/var/www/html/live/stream-4.m3u8.tmp' for writing
[hls @ 0x55c8cb834980] Opening '/var/www/html/live/stream_1_2.ts' for writing
[hls @ 0x55c8cb834980] Opening '/var/www/html/live/stream-1.m3u8.tmp' for writing
[hls @ 0x55c8cb834980] Opening '/var/www/html/live/stream_2_2.ts' for writing
[hls @ 0x55c8cb834980] Opening '/var/www/html/live/stream-2.m3u8.tmp' for writing
[hls @ 0x55c8cb834980] Opening '/var/www/html/live/stream_3_2.ts' for writing
[hls @ 0x55c8cb834980] Opening '/var/www/html/live/stream-3.m3u8.tmp' for writing
[hls @ 0x55c8cb834980] Opening '/var/www/html/live/stream_4_2.ts' for writing
[hls @ 0x55c8cb834980] Opening '/var/www/html/live/stream-4.m3u8.tmp' for writing
[hls @ 0x55c8cb834980] Opening '/var/www/html/live/stream_0_1.ts' for writing
[hls @ 0x55c8cb834980] Opening '/var/www/html/live/stream-0.m3u8.tmp' for writing
[hls @ 0x55c8cb834980] Opening '/var/www/html/live/master.m3u8' for writing
[hls @ 0x55c8cb834980] Opening '/var/www/html/live/stream_1_3.ts' for writing
[hls @ 0x55c8cb834980] Opening '/var/www/html/live/stream-1.m3u8.tmp' for writing
The master playlist that's being output by this command looks like this:
#EXTM3U
#EXT-X-VERSION:3
#EXT-X-MEDIA:TYPE=AUDIO,GROUP-ID="group_groupname",NAME="audio_0",DEFAULT=YES,URI="stream-1.m3u8"
#EXT-X-MEDIA:TYPE=AUDIO,GROUP-ID="group_groupname",NAME="audio_0",DEFAULT=YES,URI="stream-2.m3u8"
#EXT-X-MEDIA:TYPE=AUDIO,GROUP-ID="group_groupname",NAME="audio_0",DEFAULT=YES,URI="stream-3.m3u8"
#EXT-X-MEDIA:TYPE=AUDIO,GROUP-ID="group_groupname",NAME="audio_0",DEFAULT=YES,URI="stream-4.m3u8"
#EXT-X-STREAM-INF:BANDWIDTH=174900,RESOLUTION=696x572,AUDIO="group_groupname"
stream-0.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=174900,CODECS="mp4a.40.2",AUDIO="group_groupname"
stream-1.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=174900,CODECS="mp4a.40.2",AUDIO="group_groupname"
stream-2.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=174900,CODECS="mp4a.40.2",AUDIO="group_groupname"
stream-3.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=174900,CODECS="mp4a.40.2",AUDIO="group_groupname"
stream-4.m3u8
While this does seem to work with VLC, I need the EXT-X-MEDIA entries to include the LANGUAGE attribute (the player I'm working with uses it to determine each audio track's language). I can't figure out how to do that based on the documentation I've found.
Could anyone help me with this?
Thanks!
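(For context, newer FFmpeg releases document a language: key for var_stream_map entries in the hls muxer; assuming a build that supports it, which the 4.0.3 build shown above may not, the stream map could be written along the lines below, with the language codes matching the input streams.)
ffmpeg -re -i $INPUT_URL -map 0:v -c:v copy -map 0:a -c:a copy -f hls -hls_time 6 -hls_list_size 10 -hls_flags delete_segments+program_date_time -hls_segment_filename "$FULL_OUTPUT_FOLDER/stream_%v_%d.ts" -var_stream_map "v:0,agroup:groupname a:0,agroup:groupname,default:yes,language:ENG a:1,agroup:groupname,language:FRA a:2,agroup:groupname,language:ITA a:3,agroup:groupname,language:ENG" -master_pl_name master.m3u8 $FULL_OUTPUT_FOLDER/stream-%v.m3u8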
-
Offloading to FFmpeg via named pipes in C#/.NET Core
1 April 2022, by bep
I tried to break this down to the basic elements, so I hope this is clear. I want to take in a network stream; it may be one-way, or it may be a protocol that requires two-way communication, such as RTMP during its handshake.


I want to pass that stream straight through to a spawned FFmpeg process. I then want to capture FFmpeg's output; in this example I just want to pipe it out to a file. The file is not my end goal, but for simplicity, if I can get that far I think I'll be OK.




I want the code to be as plain as possible and to offload the core processing to FFmpeg. If I ask FFmpeg to output a WebRTC stream, a file, whatever, I just want to capture that. FFmpeg shouldn't be used directly, only indirectly via IncomingConnectionHandler.

The only other component is OBS, which I am using to create the incoming RTMP stream.


As things stand now, running this results in the following error, which I'm a little unclear on. I don't feel like I'm causing concurrent reads at any point.


System.InvalidOperationException: Concurrent reads are not allowed
 at Medallion.Shell.Throw`1.If(Boolean condition, String message)
 at Medallion.Shell.Streams.Pipe.ReadAsync(Byte[] buffer, Int32 offset, Int32 count, TimeSpan timeout, CancellationToken cancellationToken)
 at Medallion.Shell.Streams.Pipe.PipeOutputStream.ReadAsync(Byte[] buffer, Int32 offset, Int32 count, CancellationToken cancellationToken)
 at System.IO.Stream.ReadAsync(Memory`1 buffer, CancellationToken cancellationToken)
 at System.IO.StreamReader.ReadBufferAsync(CancellationToken cancellationToken)
 at System.IO.StreamReader.ReadLineAsyncInternal()
 at Medallion.Shell.Streams.MergedLinesEnumerable.GetEnumeratorInternal()+MoveNext()
 at System.String.Join(String separator, IEnumerable`1 values)
 at VideoIngest.IncomingRtmpConnectionHandler.OnConnectedAsync(ConnectionContext connection) in Program.cs:line 55
 at Microsoft.AspNetCore.Server.Kestrel.Core.Internal.Infrastructure.KestrelConnection`1.ExecuteAsync()



Code:


// using directives were missing from the original paste and are inferred from the types used below
using System;
using System.IO;
using System.IO.Pipes;
using System.Threading.Tasks;
using Medallion.Shell;
using Microsoft.AspNetCore;
using Microsoft.AspNetCore.Builder;
using Microsoft.AspNetCore.Connections;
using Microsoft.AspNetCore.Hosting;
using Microsoft.AspNetCore.Http;
using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.Logging;

namespace VideoIngest
{
 public class IncomingRtmpConnectionHandler : ConnectionHandler
 {
 private readonly ILogger<IncomingRtmpConnectionHandler> logger;

 public IncomingRtmpConnectionHandler(ILogger<IncomingRtmpConnectionHandler> logger)
 {
 this.logger = logger;
 }

 public override async Task OnConnectedAsync(ConnectionContext connection)
 {
 logger?.LogInformation("connection started");

 var outputFileName = @"C:\Temp\bunny.mp4";

 var rtmpPassthroughPipeName = Guid.NewGuid().ToString();
 var cmdPath = @"C:\Opt\ffmpeg\bin\ffmpeg.exe";
 var cmdArgs = $"-i pipe:{rtmpPassthroughPipeName} -preset slow -c copy -f mp4 -y pipe:1";

 var cancellationToken = connection.ConnectionClosed;
 var rtmpStream = connection.Transport;

 using (var outputStream = new FileStream(outputFileName, FileMode.Create))
 using (var cmd = Command.Run(cmdPath, options: o => { o.StartInfo(i => i.Arguments = cmdArgs); o.CancellationToken(cancellationToken); }))
 {
 // create a pipe to pass the RTMP data straight to FFMPEG. This code should be dumb to proto etc being used
 var ffmpegPassthroughStream = new NamedPipeServerStream(rtmpPassthroughPipeName, PipeDirection.InOut, 10, PipeTransmissionMode.Byte, System.IO.Pipes.PipeOptions.Asynchronous);

 // take the network stream and pass data to/from ffmpeg process
 var fromFfmpegTask = ffmpegPassthroughStream.CopyToAsync(rtmpStream.Output.AsStream(), cancellationToken);
 var toFfmpegTask = rtmpStream.Input.AsStream().CopyToAsync(ffmpegPassthroughStream, cancellationToken);

 // take the ffmpeg process output (not stdout) into target file
 var outputTask = cmd.StandardOutput.PipeToAsync(outputStream);

 while (!outputTask.IsCompleted && !outputTask.IsCanceled)
 {
 var errs = cmd.GetOutputAndErrorLines();
 logger.LogInformation(string.Join(Environment.NewLine, errs));

 await Task.Delay(1000);
 }

 CommandResult result = cmd.Result;

 if (result != null && result.Success)
 {
 logger.LogInformation("Created file");
 }
 else
 {
 logger.LogError(result.StandardError);
 }
 }

 logger?.LogInformation("connection closed");
 }
 }

 public class Startup
 {
 public void ConfigureServices(IServiceCollection services) { }

 public void Configure(IApplicationBuilder app, IWebHostEnvironment env)
 {
 app.Run(async (context) =>
 {
 var log = context.RequestServices.GetRequiredService<ILogger<Startup>>(); // generic argument was lost in the original paste; ILogger<Startup> is assumed
 await context.Response.WriteAsync("Hello World!");
 });
 }
 }

 public class Program
 {
 public static void Main(string[] args)
 {
 CreateHostBuilder(args).Build().Run();
 }

 public static IWebHostBuilder CreateHostBuilder(string[] args) =>
 WebHost
 .CreateDefaultBuilder(args)
 .ConfigureServices(services =>
 {
 services.AddLogging(options =>
 {
 options.AddDebug().AddConsole().SetMinimumLevel(LogLevel.Information);
 });
 })
 .UseKestrel(options =>
 {
 options.ListenAnyIP(15666, builder =>
 {
 builder.UseConnectionHandler<IncomingRtmpConnectionHandler>();
 });

 options.ListenLocalhost(5000);

 // HTTPS 5001
 options.ListenLocalhost(5001, builder =>
 {
 builder.UseHttps();
 });
 })
 .UseStartup<Startup>();
 }
 

}


Questions:


1. Is this a valid approach? Do you see any fundamental issues?
2. Is the pipe naming correct? Is the convention just pipe:someName? (see the sketch after this list)
3. Any ideas on what specifically may be causing the "Concurrent reads are not allowed" error?
4. If #3 is solved, does the rest of this seem valid?
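(Regarding question 2, a hedged note: FFmpeg's pipe protocol is documented as taking a file-descriptor number, for example pipe:0 for stdin and pipe:1 for stdout, rather than an arbitrary name, while a Windows named pipe created with NamedPipeServerStream is normally addressed by its \\.\pipe\<name> path. A minimal sketch of the two spellings, with the pipe name and output file purely illustrative:)
ffmpeg -i pipe:0 -c copy out.ts
ffmpeg -i \\.\pipe\someName -c copy out.ts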