
Other articles (33)
-
Support for all media types
10 April 2011
Unlike many modern document-sharing programs and platforms, MediaSPIP aims to handle as many different document formats as possible, whether they are: images (png, gif, jpg, bmp and others...); audio (MP3, Ogg, Wav and others...); video (Avi, MP4, Ogv, mpg, mov, wmv and others...); or textual content, code and other material (open office, microsoft office (spreadsheet, presentation), web (html, css), LaTeX, Google Earth) (...)
-
Adding user-specific information and other changes to author-related behaviour
12 April 2011
The simplest way to add information to authors is to install the Inscription3 plugin. It also lets you change certain user-related behaviours (refer to its documentation for more information).
It is also possible to add fields to authors by installing the champs extras 2 and Interface pour champs extras plugins. -
HTML5 audio and video support
10 April 2011
MediaSPIP uses the HTML5 video and audio tags to play multimedia documents, taking advantage of the latest W3C innovations supported by modern browsers.
For older browsers, the Flowplayer Flash player is used instead.
The HTML5 player was created specifically for MediaSPIP: its appearance is fully customisable to match a chosen theme.
These technologies make it possible to deliver video and sound both on conventional computers (...)
On other sites (4645)
-
Revision 19643: when a table is created with a tool external to SPIP, the escaping of the col...
23 June 2012, by marcimat
-
Delphi Android: deploying AND dynamically loading (external) libraries
13 December 2020, by coban
I am trying to create an application to test the FFmpeg libraries, a kind of media player application, for Android, using Delphi 10.3/10.4.


I am seeing some (strange?) behaviour that differs between machines and between the locations of the files on the phone/tablet.


The very first question is: which folder is the right place to put (external) libraries for dynamic/static loading?


I tried 2 locations: '.\assets\internal' -> the 'files' folder of the app, and 'library\lib\armeabi-v7a' -> the 'bin' folder (if I'm right).
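
(The question is about Delphi, but both deployment targets correspond to standard Android directories. As a point of reference only, and assuming the usual mapping of '.\assets\internal' to the app's internal files directory and 'library\lib\armeabi-v7a' to the APK's native library directory, a small Android/Java sketch can log where each one actually lives on the device:)

// Illustration only (Java/Android, not Delphi): log the two on-device paths
// that these deployment locations are assumed to map to.
import android.content.Context;
import android.util.Log;

public final class LibDirs {
    public static void log(Context context) {
        // '.\assets\internal' is assumed to deploy into the internal files directory.
        Log.d("LibDirs", "files dir: " + context.getFilesDir().getAbsolutePath());
        // 'library\lib\armeabi-v7a' is assumed to end up in the APK's native library directory.
        Log.d("LibDirs", "native lib dir: " + context.getApplicationInfo().nativeLibraryDir);
    }
}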


Behaviour on a mobile phone running Android 8


When I choose to place the (FFmpeg) libraries in the 'Files' folder ('.\assets\internal') and try to load them, 3 of the 7 libraries load successfully while the others do not. It is always the same libraries that fail and the same ones that succeed. The libraries that load successfully are 'libavutil.so', 'swresample.so' and 'libswscale.so'.


When I choose to place the libraries in the 'bin' folder ('library\lib\armeabi-v7a'), all libraries are loaded successfully.


Behaviour on a tablet running Android 4.4.4


When the libraries are put in the 'Files' folder, the behaviour is exactly the same as on the Android 8 phone.


The strange thing is: when I choose the 'bin' folder, none of the libraries can be loaded.


I did not compile/build the (FFmpeg) libraries myself; I downloaded them.
I tried libraries from different sources.
In every attempt I checked that the files actually exist.
I used the 'loadlibrary' function; after some reading and suggestions on the internet I also tried calling 'dlopen' directly, which in the end seems unnecessary.
I was not able to debug with D10.4 on the Android 4.4.4 tablet because of the minsdk version. With D10.3 I can test on both machines.


Delphi 10.3: 'Android SDK 25.2.5 32bit', 'jdk1.8.0_60'


Delphi 10.4: 'Android SDK 25.2.5 32bit', 'AdoptOpenJDK jdk-8.0.242.08-hotspot'


Any idea why only 3 of the libraries can be loaded when they are in the 'Files' folder, while all of them can be loaded when they are in the 'bin' folder (Android 8)?
And why does nothing load on Android 4.4.4 when the files are in the 'bin' folder, while 3 of them can be loaded when they are placed in the 'Files' folder?


I've been using the FFmpeg libraries on Windows (almost) without issues, so my question should not be FFmpeg-specific but rather Delphi + Android + (external) libraries specific, unless this behaviour turns out to be FFmpeg-specific.


Both are Samsung devices.


Android 4.4 tablet CPU (using the 'syscheck' tool Embarcadero recommends):

family = ARM
processor = ARMv7 processor rev 5 (v7I)
CPU cores = 4
NEON supported = yes
armv7 (ARMv7 compatible architecture) = yes

Android 8 phone CPU:

family = ARM
processor = unknown
CPU cores = 8
NEON = yes
armv7 = Arm
armv7 (ARMv7 compatible architecture) = yes



Edit


Test on Android 10 (Redmi Note 10 Lite)


None of the library files can be loaded from the 'Files' -> '.\assets\internal' folder. All library files are loaded successfully from the 'bin' -> 'library\lib\armeabi-v7a' folder.


I would need a reasonable explanation for this. It looks like Android-specific behaviour?


Edit 2


One of the reasons seems to be that some of those FFmpeg libraries load other FFmpeg libraries. Even if they are all in the same directory, if they are outside the folder of the EXE file, or not in a (library) folder where the OS searches by default, they cannot find/load each other.


This looks like the explanation for why only some of them can be loaded from the 'Files' -> '.\assets\internal' folder.
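
(For illustration only: the usual workaround for this kind of dependency problem is to load the libraries explicitly, dependencies first, by absolute path, so that each one is already present in the process before anything that needs it is loaded. The sketch below shows the idea in Android/Java rather than Delphi; the file names, their order and the target directory are assumptions, and whether the dynamic linker then resolves the already-loaded libraries still depends on the Android version.)

// Sketch (Java/Android, hypothetical names): pre-load FFmpeg libraries in
// dependency order from an arbitrary directory, by absolute path.
import java.io.File;

public final class FfmpegPreloader {
    // Assumed dependency order; the exact set depends on the FFmpeg build.
    private static final String[] LIBS = {
        "libavutil.so", "libswresample.so", "libswscale.so",
        "libavcodec.so", "libavformat.so", "libavfilter.so", "libavdevice.so"
    };

    public static void loadFrom(File dir) {
        for (String name : LIBS) {
            // Loading dependencies first means later libraries find them already
            // in the process instead of searching the default library path.
            System.load(new File(dir, name).getAbsolutePath());
        }
    }
}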


-
FFMPEG Android Camera Streaming to RTMP
2 February 2017, by Omer Abbas
I need help streaming the Android camera to an RTMP server using FFmpeg. I have compiled FFmpeg for Android as a shared library. Everything on the FFmpeg side works perfectly fine. I have tried streaming an existing video file to RTMP and it works great. Then I used the camera, wrote its raw video to a file and streamed that file to the RTMP server while the camera was still writing to it. That also works, but the problem is that the file size keeps growing. I have read that MediaRecorder can write to a LocalSocket, which can stream data to ffmpeg as a Unix domain socket. I took the Android MediaRecorder sample (https://github.com/googlesamples/android-MediaRecorder) and modified it to use a local socket:
receiver = new LocalSocket();
try {
    localSocketServer = new LocalServerSocket("camera2rtsp");
    // FileDescriptor the Camera can send to
    sender = localSocketServer.accept();   // blocks until a client connects
    sender.setReceiveBufferSize(500000);
    sender.setSendBufferSize(500000);
} catch (IOException e1) {
    e1.printStackTrace();
    super.onResume();
    finish();
    //return;
}
mMediaRecorder.setOutputFile(sender.getFileDescriptor());

I tried to access this socket with an ffmpeg command, but it fails with the error "unix://camera2rtsp: no directory or file found":
ffmpeg -i unix://camera2rtsp -vcodec libx264 -f flv rtmp://server
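
(For context, and only as a sketch with hypothetical field names rather than the code from the question: the pattern commonly used to feed MediaRecorder output to a reader inside the same app connects the client end of the LocalSocket before calling accept(), hands the server-side descriptor to MediaRecorder, and then reads the encoded bytes back from the client end.)

// Sketch of the usual MediaRecorder-over-LocalSocket wiring (hypothetical names).
private LocalSocket receiver, sender;

private void openLocalSocketPair() throws IOException {
    LocalServerSocket server = new LocalServerSocket("camera2rtsp");
    receiver = new LocalSocket();
    // Connect the client end first so accept() returns immediately.
    receiver.connect(new LocalSocketAddress("camera2rtsp"));
    sender = server.accept();
    sender.setSendBufferSize(500000);
    receiver.setReceiveBufferSize(500000);
    // MediaRecorder writes into the server-side descriptor...
    // mMediaRecorder.setOutputFile(sender.getFileDescriptor());
    // ...and the encoded stream is read back from receiver.getInputStream().
}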
So I tried a ParcelFileDescriptor pipe:

pipe = getPipe();
ParcelFileDescriptor parcelWrite = new ParcelFileDescriptor(pipe[1]);
mMediaRecorder.setOutputFile(parcelWrite.getFileDescriptor());

with the command:

"ffmpeg -re -r 30 -f rawvideo -i pipe:" + pipe[0].getFd() + " -vcodec libx264 -f flv rtmp://server"
But this is also not working; it seems as if ffmpeg is trying to read from an empty pipe, and it gives a warning about the size of Stream #0:0.
Also, MediaRecorder with a ParcelFileDescriptor as setOutputFile gives the error "E/MediaRecorder: start failed: -2147483648" on a Galaxy S7, but works on a Motorola phone with KitKat.
I might be using the pipe or the socket incorrectly. If anyone has ideas or experience with streaming the Android camera through ffmpeg, please help me.
I have figured out that MediaRecorder encodes to the MP4 format, which is not a seekable or streamable format because of its headers/metadata.
So I read a tutorial, http://www.oodlestechnologies.com/blogs/Stream-video-from-camera-preview-to-wowza-server-in-Android, which shows that you can write raw video to a file and pass that file to ffmpeg. It works great, but the issue is that the file size keeps growing. So now my question is: can I pass the camera's raw video to ffmpeg with a ParcelFileDescriptor pipe or a LocalSocket?
Code:
// Inside the camera preview callback; arg0 holds the NV21 frame data.
File f = new File(Environment.getExternalStorageDirectory() + "/test.data");
if (!f.exists()) {
    f.createNewFile();
}
OutputStream outStream = new FileOutputStream(f);
Camera.Parameters parameters = mCamera.getParameters();
int imageFormat = parameters.getPreviewFormat();
if (imageFormat == ImageFormat.NV21) {
    Camera.Size previewSize = parameters.getPreviewSize();
    int frameWidth = previewSize.width;
    int frameHeight = previewSize.height;
    Rect rect = new Rect(0, 0, frameWidth, frameHeight);
    YuvImage img = new YuvImage(arg0, ImageFormat.NV21, frameWidth, frameHeight, null);
    // Write the raw NV21 frame to the file.
    outStream.write(arg0);
    outStream.flush();
    //img.compressToJpeg(rect, 50, processIn);
}
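
(A minimal sketch of what that could look like with a pipe instead of a growing file, under assumptions: ParcelFileDescriptor.createPipe() stands in for the getPipe() helper above, the NV21 frames are written unmodified, and the read end's descriptor number is only meaningful to ffmpeg if ffmpeg runs inside the same process, e.g. linked in as a shared library. Class and field names are hypothetical.)

// Hypothetical sketch: feed raw NV21 preview frames into a pipe that an
// in-process ffmpeg could read via "-f rawvideo -i pipe:<fd>".
import android.hardware.Camera;
import android.os.ParcelFileDescriptor;
import java.io.IOException;
import java.io.OutputStream;

public class RawPreviewPiper implements Camera.PreviewCallback {
    private final int readFd;
    private final OutputStream pipeOut;

    public RawPreviewPiper() throws IOException {
        ParcelFileDescriptor[] pipe = ParcelFileDescriptor.createPipe();
        readFd = pipe[0].getFd();          // descriptor to hand to ffmpeg as pipe:<fd>
        pipeOut = new ParcelFileDescriptor.AutoCloseOutputStream(pipe[1]);
    }

    public int getReadFd() {
        return readFd;
    }

    @Override
    public void onPreviewFrame(byte[] data, Camera camera) {
        try {
            pipeOut.write(data);           // one raw NV21 frame per callback
            pipeOut.flush();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}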