
Media (1)
-
The Slip - Artworks
26 September 2011, by
Updated: September 2011
Language: English
Type: Text
Other articles (111)
-
Videos
21 April 2011, by
As with "audio" documents, MediaSPIP displays videos whenever possible using the HTML5 <video> tag.
One drawback of this tag is that it is not recognised correctly by some browsers (Internet Explorer, to name one) and that each browser natively handles only certain video formats.
Its main advantage is that video playback is supported natively by the browser, which removes the need for Flash and (...)
-
Contribute to its translation
10 April 2011
You can help us improve the wording used in the software, or translate it into any new language so that it can reach new linguistic communities.
To do so, we use the SPIP translation interface, where all of MediaSPIP's language modules are available. You just need to subscribe to the translators' mailing list to ask for more information.
At present, MediaSPIP is only available in French and (...)
-
Encoding and conversion into formats readable on the Internet
10 April 2011
MediaSPIP converts and re-encodes uploaded documents so that they can be read on the Internet and used automatically, without any intervention from the content creator.
Videos are automatically encoded into the formats supported by HTML5: MP4, Ogv and WebM. The "MP4" version is also used by the Flash fallback player required by older browsers.
Audio documents are likewise re-encoded into the two formats usable with HTML5: MP3 and Ogg. The "MP3" version (...) A sketch of such encoding commands is given just below this list.
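As an illustration of this kind of re-encoding, here is a minimal sketch of ffmpeg commands producing those formats (file names are placeholders, and these are not necessarily the exact options MediaSPIP uses):
# Sketch only: one HTML5-friendly output per target format.
ffmpeg -i source.mov -c:v libx264 -c:a aac video.mp4
ffmpeg -i source.mov -c:v libvpx -c:a libvorbis video.webm
ffmpeg -i source.mov -c:v libtheora -c:a libvorbis video.ogv
ffmpeg -i source.wav -c:a libmp3lame audio.mp3
ffmpeg -i source.wav -c:a libvorbis audio.ogg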
On other sites (6652)
-
java.io.IOException: Error running exec() Working Directory: null Environment: null
16 January 2018, by Muhammad Hamza Shahid
I am using WritingMinds/ffmpeg-android-java in my application.
Here is my code:
loadFFmpeg();
String cmd = "ffmpeg -i /storage/emulated/0/media/audio/a.mp3 -i /storage/emulated/0/recording.3gp -filter_complex \"[0:a][1:a]amerge=inputs=2[aout]\" -map \"[aout]\" " + outputFile;
executeFFmpeg(cmd.split(" "));
and
private void loadFFmpeg() {
    FFmpeg ffmpeg = FFmpeg.getInstance(MainActivity.this.getApplicationContext());
    try {
        ffmpeg.loadBinary(new LoadBinaryResponseHandler() {
            @Override
            public void onStart() {}
            @Override
            public void onFailure() {}
            @Override
            public void onSuccess() {}
            @Override
            public void onFinish() {}
        });
    } catch (FFmpegNotSupportedException e) {
        // Handle if FFmpeg is not supported by device
    }
}
private void executeFFmpeg(String[] cmd) {
    /*String workFolder = getApplicationContext().getFilesDir() + "/ffmpeg";
    String environment = Environment.getExternalStorageDirectory().getAbsolutePath();
    Map map = new HashMap();
    map.put("Working Directory", workFolder);
    map.put("Environment", environment);*/
    FFmpeg ffmpeg = FFmpeg.getInstance(MainActivity.this.getApplicationContext());
    try {
        // to execute the "ffmpeg -version" command you just need to pass "-version"
        ffmpeg.execute(cmd, new ExecuteBinaryResponseHandler() {
            @Override
            public void onStart() {}
            @Override
            public void onProgress(String message) {}
            @Override
            public void onFailure(String message) {}
            @Override
            public void onSuccess(String message) {}
            @Override
            public void onFinish() {
                stop.setEnabled(false);
                play.setEnabled(true);
            }
        });
    } catch (FFmpegCommandAlreadyRunningException e) {
        // Handle if FFmpeg is already running
    }
}
but I am getting the following error:
6784-6962/com.flipartstudio.playandrecord E/FFmpeg: Exception while trying to run: [Ljava.lang.String;@41803270
java.io.IOException: Error running exec(). Command: [/data/data/com.flipartstudio.playandrecord/files/ffmpeg, /system/bin/ls, -l, /data/data/com.example.foo/files/ffmpeg] Working Directory: null Environment: null
    at java.lang.ProcessManager.exec(ProcessManager.java:211)
    at java.lang.Runtime.exec(Runtime.java:168)
    at java.lang.Runtime.exec(Runtime.java:123)
    at com.github.hiteshsondhi88.libffmpeg.ShellCommand.run(ShellCommand.java:10)
    at com.github.hiteshsondhi88.libffmpeg.FFmpegExecuteAsyncTask.doInBackground(FFmpegExecuteAsyncTask.java:38)
    at com.github.hiteshsondhi88.libffmpeg.FFmpegExecuteAsyncTask.doInBackground(FFmpegExecuteAsyncTask.java:10)
    at android.os.AsyncTask$2.call(AsyncTask.java:287)
    at java.util.concurrent.FutureTask.run(FutureTask.java:234)
    at android.os.AsyncTask$SerialExecutor$1.run(AsyncTask.java:230)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1080)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:573)
    at java.lang.Thread.run(Thread.java:856)
Caused by: java.io.IOException: No such file or directory
    at java.lang.ProcessManager.exec(Native Method)
    at java.lang.ProcessManager.exec(ProcessManager.java:209)
    at java.lang.Runtime.exec(Runtime.java:168)
    at java.lang.Runtime.exec(Runtime.java:123)
    at com.github.hiteshsondhi88.libffmpeg.ShellCommand.run(ShellCommand.java:10)
    at com.github.hiteshsondhi88.libffmpeg.FFmpegExecuteAsyncTask.doInBackground(FFmpegExecuteAsyncTask.java:38)
    at com.github.hiteshsondhi88.libffmpeg.FFmpegExecuteAsyncTask.doInBackground(FFmpegExecuteAsyncTask.java:10)
    at android.os.AsyncTask$2.call(AsyncTask.java:287)
    at java.util.concurrent.FutureTask.run(FutureTask.java:234)
    at android.os.AsyncTask$SerialExecutor$1.run(AsyncTask.java:230)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1080)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:573)
    at java.lang.Thread.run(Thread.java:856)
I have also added
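One possible reading of the trace, as a sketch only: the "No such file or directory" refers to the extracted ffmpeg binary itself, so the command should only be executed once loadBinary() has reported success, and the arguments can be passed as an explicit array, without the leading "ffmpeg" token and without embedded quotes around the -filter_complex value. The helper name loadFFmpegAndMerge and the outputFile parameter below are illustrative; only the WritingMinds API already shown above is used.
private void loadFFmpegAndMerge(final String outputFile) {
    FFmpeg ffmpeg = FFmpeg.getInstance(MainActivity.this.getApplicationContext());
    try {
        ffmpeg.loadBinary(new LoadBinaryResponseHandler() {
            @Override
            public void onSuccess() {
                // The extracted binary is only guaranteed to exist once this callback fires;
                // executing earlier can fail with "Error running exec() ... No such file or directory".
                String[] cmd = {
                        "-i", "/storage/emulated/0/media/audio/a.mp3",
                        "-i", "/storage/emulated/0/recording.3gp",
                        "-filter_complex", "[0:a][1:a]amerge=inputs=2[aout]",
                        "-map", "[aout]",
                        outputFile
                };
                executeFFmpeg(cmd);
            }

            @Override
            public void onFailure() {
                // loadBinary failed: the device/ABI is not supported, ffmpeg cannot run here.
            }
        });
    } catch (FFmpegNotSupportedException e) {
        // Handle if FFmpeg is not supported by the device
    }
}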
-
What is the substitute for the ffmpeg video filter "between" in an audio filter?
18 January 2018, by Hekimen
I want to make a short video preview from a long video with audio, but I have a problem selecting audio stream segments at specific timestamps.
I am using this option for segmenting the video:
-filter_complex "[0:v]select='between(t,216,220.5)+between(t,432,436.5)+between(t,648,652.5)+between(t,864,868.5)+between(t,1080,1084.5)+between(t,1296,1300.5)+between(t,1512,1516.5)+between(t,1728,1732.5)+between(t,1944,1948.5)+between(t,2160,2164.5)'[outv]"
My question is how to rewrite this option for the audio stream, because when I use the same filter without selecting any stream I get the error: Cannot connect video filter to audio input.
I am using this ffmpeg version:
ffmpeg version 3.3.3-static http://johnvansickle.com/ffmpeg/ Copyright (c) 2000-2017 the FFmpeg developers
built with gcc 6.4.0 (Debian 6.4.0-2) 20170724
configuration: --enable-gpl --enable-version3 --enable-static --disable-debug --disable-ffplay --disable-indev=sndio --disable-outdev=sndio --cc=gcc-6 --enable-fontconfig --enable-frei0r --enable-gnutls --enable-gray --enable-libass --enable-libfreetype --enable-libfribidi --enable-libmp3lame --enable-libopencore-amrnb --enable-libopencore-amrwb --enable-libopus --enable-librtmp --enable-libsoxr --enable-libspeex --enable-libtheora --enable-libvidstab --enable-libvo-amrwbenc --enable-libvorbis --enable-libvpx --enable-libwebp --enable-libx264 --enable-libxvid
libavutil      55. 58.100 / 55. 58.100
libavcodec     57. 89.100 / 57. 89.100
libavformat    57. 71.100 / 57. 71.100
libavdevice    57.  6.100 / 57.  6.100
libavfilter     6. 82.100 /  6. 82.100
libswscale      4.  6.100 /  4.  6.100
libswresample   2.  7.100 /  2.  7.100
libpostproc    54.  5.100 / 54.  5.100
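A sketch of what the audio-side counterpart might look like, assuming the same timestamps (only the first two ranges are shown, and the input/output file names are placeholders): the audio equivalent of select is aselect, and setpts/asetpts regenerate continuous timestamps after the unselected ranges are dropped.
# Sketch only: aselect mirrors the video select filter for audio streams;
# setpts/asetpts rebuild continuous timestamps across the gaps left by select.
ffmpeg -i input.mkv -filter_complex \
"[0:v]select='between(t,216,220.5)+between(t,432,436.5)',setpts=N/FRAME_RATE/TB[outv]; \
 [0:a]aselect='between(t,216,220.5)+between(t,432,436.5)',asetpts=N/SR/TB[outa]" \
-map "[outv]" -map "[outa]" preview.mp4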
-
ffmpeg dash Segment offset
18 March 2019, by inkubux
I'm trying to integrate live transcoding like "plex" or "emby" into my application.
I am able to serve DASH content to shaka-player or dash.js, but only in 'live mode'. I want to enable seeking through the player.
I looked at Plex: to enable this, they create their own MPD file with a duration, so the player gets a full seek bar.
However, when seeking, the player asks for a segment number, e.g. 449. I need to stop ffmpeg and restart it with an offset (-ss <segment>), but ffmpeg will just restart the transcode session from segment 0 with a new initial segment. What I want is to tell ffmpeg to start at a seek point but number its output from that segment onwards.
When playing with HLS and MPEG-TS, I can tell ffmpeg to start output at a certain segment with the option
-segment_start_number
but this is not available for DASH. Plex uses its own transcoder, based on ffmpeg, with the option
-skip_to_segment
I tried to 'hack' around this by keeping a manual offset on my web server, but even if I serve the "supposedly" right segment after the seek point, dash.js and shaka-player can't recover the stream. VLC, on the other hand, is able to (it is probably more tolerant of errors in segments).
Is the "right" segment after a seek in DASH the initial segment plus the media segment, or only the media segment?
Is ffmpeg able to start DASH segmenting at a given segment number (for seek and resume)?
The same technique works in HLS with forced key frames and a custom m3u8 (with all the "predicted" segments), but calculating the right segment length and the right bandwidth is much harder and more hackish, and DASH is more tolerant of variation.
I would really like to be able to seek through my live-transcoded video.
For reference, here is the custom MPD file I serve to enable "seeking":
<MPD xmlns="urn:mpeg:dash:schema:mpd:2011" profiles="urn:mpeg:dash:profile:isoff-live:2011" type="static" suggestedPresentationDelay="PT1S" mediaPresentationDuration="PT49M2.920S" maxSegmentDuration="PT2S" minBufferTime="PT10S">
  <Period start="PT0S" duration="PT49M2.920S">
    <AdaptationSet segmentAlignment="true">
      <SegmentTemplate timescale="1" duration="1" initialization="$RepresentationID$/initial.mp4" media="$RepresentationID$/$Number$.m4s" startNumber="1"/>
      <Representation mimeType="video/mp4" codecs="avc1.640029" bandwidth="3766000" width="1920" height="1080"/>
    </AdaptationSet>
    <AdaptationSet segmentAlignment="true">
      <SegmentTemplate timescale="1" duration="1" initialization="$RepresentationID$/initial.mp4" media="$RepresentationID$/$Number$.m4s" startNumber="1"/>
      <Representation mimeType="audio/mp4" codecs="mp4a.40.2" bandwidth="188000" audioSamplingRate="48000">
        <AudioChannelConfiguration schemeIdUri="urn:mpeg:dash:23003:3:audio_channel_configuration:2011" value="6"/>
      </Representation>
    </AdaptationSet>
  </Period>
</MPD>
And here is the ffmpeg command to pull it off:
ffmpeg -ss 0 -i movie.mkv -y -acodec aac -vcodec libx264 -f dash -min_seg_duration 1000000 -individual_header_trailer 0 -pix_fmt yuv420p -vf scale=trunc(min(max(iw\,ih*dar)\,1920)/2)*2:trunc(ow/dar/2)*2 -bsf:v h264_mp4toannexb -profile:v high -level 4.1 -map_chapters -1 -map_metadata -1 -preset veryfast -movflags frag_keyframe+empty_moov -use_template 1 -use_timeline 0 -remove_at_exit 1 -crf 23 -bufsize 7532k -maxrate 3766k -start_at_zero -threads 0 -force_key_frames expr:if(isnan(prev_forced_t),eq(t,t),gte(t,prev_forced_t+1)) -init_seg_name $RepresentationID$/0_initial.mp4 -media_seg_name $RepresentationID$/0_$Number$.m4s /transcoding_temp/Z1GVWEc/index.mpd
The
media_seg_name
option is where I prepend the custom seek point. Let's say I want to seek to segment 1233; the template would be:
-media_seg_name $RepresentationID$/1233_$Number$.m4s
and the segments would be 1233_1, 1233_2, 1233_*, so I can serve the right segment after the seek. But the player does not recover and keeps downloading the subsequent segments. I guess this is because a new initial segment is generated and I am somehow missing the headers needed for continuous playback after the seek, but I'm probably wrong.
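For concreteness, a sketch of the seek-and-resume invocation described above (the unchanged options from the full command are elided as "...", and with 1-second segments the requested segment 1233 corresponds roughly to -ss 1233):
ffmpeg -ss 1233 -i movie.mkv ... -f dash -use_template 1 -use_timeline 0 -init_seg_name $RepresentationID$/1233_initial.mp4 -media_seg_name $RepresentationID$/1233_$Number$.m4s /transcoding_temp/Z1GVWEc/index.mpd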
Thanks for your help