
Other articles (62)
-
Automatic backup of SPIP channels
1 April 2010. As part of setting up an open platform, it is important for hosting providers to have fairly regular backups available in order to recover from any problem that might arise.
This task relies on two SPIP plugins: Saveauto, which performs a regular backup of the database as a MySQL dump (usable in phpMyAdmin), and mes_fichiers_2, which builds a zip archive of the site's important data (the documents, the elements (...)
-
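The two plugins above can be sketched in shell terms (a rough illustration only, assuming mysqldump and zip are available; the database name spip_db, the directories IMG/ and squelettes/, and the backups/ path are all hypothetical, not taken from either plugin):

```shell
# Build a date-stamped dump filename, e.g. spip-2010-04-01.sql
# (illustrative helper, not part of either plugin).
backup_name() {
  echo "spip-$1.sql"
}

# Roughly what Saveauto automates: a MySQL dump re-importable via phpMyAdmin.
# mysqldump -u spip -p"$SPIP_DB_PASS" spip_db > "backups/$(backup_name "$(date +%F)")"

# Roughly what mes_fichiers_2 automates: a zip archive of the site's files.
# zip -r "backups/spip-files-$(date +%F).zip" IMG/ squelettes/
```

Running both on a cron schedule gives the "fairly regular" backups mentioned above.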
Videos
21 April 2011. Like "audio" documents, MediaSPIP displays videos wherever possible using the HTML5 <video> tag.
One drawback of this tag is that it is not recognized correctly by some browsers (Internet Explorer, to name one), and each browser natively supports only certain video formats.
Its main advantage is native video playback in the browser, which removes the need for Flash and (...)
-
Accepted formats
28 January 2010. The following commands give information about the formats and codecs supported by the local ffmpeg installation:
ffmpeg -codecs
ffmpeg -formats
Accepted input video formats
This list is not exhaustive; it highlights the main formats in use: h264: H.264 / AVC / MPEG-4 AVC / MPEG-4 part 10; m4v: raw MPEG-4 video format; flv: Flash Video (FLV) / Sorenson Spark / Sorenson H.263; Theora; wmv:
Possible output video formats
As a first step, we (...)
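As a small illustration of consuming that output (the helper below is my own, not an ffmpeg tool): in `ffmpeg -codecs` listings each line starts with a column of capability flags, so the codec name is the second whitespace-separated field.

```shell
# Extract the codec name from one line of `ffmpeg -codecs` output,
# e.g. " DEV.LS h264   H.264 / AVC / MPEG-4 AVC / MPEG-4 part 10".
# parse_codec_line is an illustrative helper, not part of ffmpeg.
parse_codec_line() {
  echo "$1" | awk '{print $2}'
}

# Typical use (requires ffmpeg on PATH):
# ffmpeg -codecs 2>/dev/null | grep '^ DE' | while read -r line; do
#   parse_codec_line "$line"
# done
```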
On other sites (4866)
-
Can't rotate video with Android FFMPEG
24 November 2016, by ashishguptabns. I am using https://github.com/WritingMinds/ffmpeg-android-java. I can trim and join mp4 files, but I am unable to rotate or change playback speed.
I have tried almost all the SO answers, but with no luck.
I am using the latest Android Studio.
D/ffmpeg_work: [-noautorotate, -i, /storage/emulated/0/Pictures/VideoApp/v_1479895157.mp4, -vf, transpose=1, /storage/emulated/0/Pictures/VideoApp/v_1480001945.mp4]
D/FFmpeg: Running publishing updates method
D/ffmpeg_work: onProgress: ffmpeg version n3.0.1 Copyright (c) 2000-2016 the FFmpeg developers
D/ffmpeg_work: onProgress: built with gcc 4.8 (GCC)
D/ffmpeg_work: onProgress: configuration: --target-os=linux --cross-prefix=/home/vagrant/SourceCode/ffmpeg-android/toolchain-android/bin/arm-linux-androideabi- --arch=arm --cpu=cortex-a8 --enable-runtime-cpudetect --sysroot=/home/vagrant/SourceCode/ffmpeg-android/toolchain-android/sysroot --enable-pic --enable-libx264 --enable-libass --enable-libfreetype --enable-libfribidi --enable-libmp3lame --enable-fontconfig --enable-pthreads --disable-debug --disable-ffserver --enable-version3 --enable-hardcoded-tables --disable-ffplay --disable-ffprobe --enable-gpl --enable-yasm --disable-doc --disable-shared --enable-static --pkg-config=/home/vagrant/SourceCode/ffmpeg-android/ffmpeg-pkg-config --prefix=/home/vagrant/SourceCode/ffmpeg-android/build/armeabi-v7a --extra-cflags='-I/home/vagrant/SourceCode/ffmpeg-android/toolchain-android/include -U_FORTIFY_SOURCE -D_FORTIFY_SOURCE=2 -fno-strict-overflow -fstack-protector-all' --extra-ldflags='-L/home/vagrant/SourceCode/ffmpeg-android/toolchain-android/lib -Wl,-z,relro -Wl,-z,now -pie' --extra-libs='-lpng -lexpat -lm' --extra-cxxflags=
D/ffmpeg_work: onProgress: libavutil 55. 17.103 / 55. 17.103
D/ffmpeg_work: onProgress: libavcodec 57. 24.102 / 57. 24.102
D/ffmpeg_work: onProgress: libavformat 57. 25.100 / 57. 25.100
D/ffmpeg_work: onProgress: libavdevice 57. 0.101 / 57. 0.101
D/ffmpeg_work: onProgress: libavfilter 6. 31.100 / 6. 31.100
D/ffmpeg_work: onProgress: libswscale 4. 0.100 / 4. 0.100
D/ffmpeg_work: onProgress: libswresample 2. 0.101 / 2. 0.101
D/ffmpeg_work: onProgress: libpostproc 54. 0.100 / 54. 0.100
D/ffmpeg_work: onProgress: Input #0, mov,mp4,m4a,3gp,3g2,mj2, from '/storage/emulated/0/Pictures/VideoApp/v_1479895157.mp4':
D/ffmpeg_work: onProgress: Metadata:
D/ffmpeg_work: onProgress: major_brand : isom
D/ffmpeg_work: onProgress: minor_version : 512
D/ffmpeg_work: onProgress: compatible_brands: isomiso2avc1mp41
D/ffmpeg_work: onProgress: encoder : Lavf57.25.100
D/ffmpeg_work: onProgress: Duration: 00:00:09.88, start: -3.000000, bitrate: 1816 kb/s
D/ffmpeg_work: onProgress: Stream #0:0(eng): Video: h264 (Baseline) (avc1 / 0x31637661), yuv420p, 1280x960, 1755 kb/s, 4.46 fps, 7.50 tbr, 90k tbn, 180k tbc (default)
D/ffmpeg_work: onProgress: Metadata:
D/ffmpeg_work: onProgress: rotate : 270
D/ffmpeg_work: onProgress: handler_name : VideoHandler
D/ffmpeg_work: onProgress: Side data:
D/ffmpeg_work: onProgress: displaymatrix: rotation of 90.00 degrees
D/ffmpeg_work: onProgress: Stream #0:1(eng): Audio: aac (LC) (mp4a / 0x6134706D), 44100 Hz, stereo, fltp, 96 kb/s (default)
D/ffmpeg_work: onProgress: Metadata:
D/ffmpeg_work: onProgress: handler_name : SoundHandler
I also tried these:
[-noautorotate, -i, /storage/emulated/0/Pictures/VideoApp/v_1479895157.mp4, -filter:v, transpose=1, /storage/emulated/0/Pictures/VideoApp/v_1480002208.mp4]
[-noautorotate, -i, /storage/emulated/0/Pictures/VideoApp/v_1479895157.mp4, -filter:v, transpose=1, -metadata:s:v, rotate=0, /storage/emulated/0/Pictures/VideoApp/v_1480002275.mp4]
[-noautorotate, -i, /storage/emulated/0/Pictures/VideoApp/v_1479895157.mp4, -vf, transpose=1, /storage/emulated/0/Pictures/VideoApp/v_1480002362.mp4]
After each of these, nothing happens.
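One possible direction, sketched below (untested against this particular Android build, so treat it as an assumption): `-noautorotate` tells ffmpeg to ignore the `displaymatrix` side data, so combining it with `transpose` on a file that already carries a rotate tag can leave players applying the stale metadata on top of the re-encoded frames unless that tag is also cleared. The helper mapping degrees to a transpose chain is illustrative, not an ffmpeg feature.

```shell
# Map a desired clockwise rotation in degrees to an ffmpeg -vf value.
# choose_transpose is an illustrative helper, not part of ffmpeg.
choose_transpose() {
  case "$1" in
    90)  echo "transpose=1" ;;             # 90 degrees clockwise
    180) echo "transpose=1,transpose=1" ;; # two quarter turns
    270) echo "transpose=2" ;;             # 90 degrees counter-clockwise
    *)   echo "" ;;
  esac
}

# Sketch of the rotate command (input path taken from the log above),
# clearing the rotate tag so players do not re-apply the old metadata:
# ffmpeg -i /storage/emulated/0/Pictures/VideoApp/v_1479895157.mp4 \
#        -vf "$(choose_transpose 90)" -metadata:s:v:0 rotate=0 \
#        -c:a copy /storage/emulated/0/Pictures/VideoApp/out.mp4
```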
-
What is the most efficient way to broadcast a live stream? [closed]
3 August 2020, by Harsh. I want to build a live streaming system for a classroom. The amount of information on this subject is confusing. These are the features/requirements that I want in my app:

- Room-type system.
- One teacher, N students (N < 200).
- Broadcast video/audio, one way only (1 teacher to up to 200 students).
- Audio chat should be possible if the teacher allows a student to speak.
- No need to record the session, though that would be a great feature to have.

Now, from my research I have established that there are many ways to go about it. The best one seems to me to be WebRTC; in that case I do not have to worry much about the platform.
WebRTC needs a STUN/TURN server, which can easily be set up using the coturn project.
I'll also need an SFU that forwards my stream to the clients, like Janus or Mediasoup.
But that's where I'm getting confused.


Could I not instead take the live stream, send it to the server, transcode it in real time with ffmpeg to HLS/DASH, and publish it to an S3 bucket from which the users can access it? Wouldn't that be more efficient and able to handle many more students easily?
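One trade-off worth noting with the HLS/DASH approach (a back-of-the-envelope sketch, not a benchmark): players typically buffer a few segments before starting playback, so end-to-end latency is roughly the segment duration times the number of buffered segments, which is why WebRTC remains attractive for interactive use.

```shell
# Rough HLS glass-to-glass latency: segment length (s) * segments buffered.
# hls_latency is an illustrative helper for the arithmetic only.
hls_latency() {
  echo $(( $1 * $2 ))
}

# With 4-second segments and the ~3 segments many players buffer,
# latency is on the order of seconds, before encoding/network overhead.
hls_latency 4 3

# Sketch of the server-side transcode (requires ffmpeg; the input URL and
# output path are hypothetical), producing segments that could be synced to S3:
# ffmpeg -i rtmp://localhost/live/classroom -c:v libx264 -c:a aac \
#        -f hls -hls_time 4 -hls_list_size 6 /var/www/stream/index.m3u8
```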


For the audio part I could just use the P2P functionality of WebRTC in the browser itself, so there is no need to route that through the server.


That is how far I've come in understanding the system. I still don't completely understand how an SFU works, and I'm unsure how many live streams one server (say, 4 cores / 8 GB RAM) can handle, or whether running ffmpeg on a VPS is a bad idea and I should use AWS services instead.


Can someone please help me understand this?


Thanks!


-
Merge commit '46430fd47c6239ef8742d0a34f9412d5060fa798'
15 May 2013, by Michael Niedermayer.
* commit '46430fd47c6239ef8742d0a34f9412d5060fa798':
vc1dec: Don't attempt error concealment on field pictures
vc1dec: fieldtx is only valid for interlaced frame pictures
aacenc: Fix erasure of surround channels
aacenc: Fix target bitrate for twoloop quantiser search
Conflicts:
libavcodec/vc1dec.c
Merged-by: Michael Niedermayer <michaelni@gmx.at>