
Other articles (95)
-
Emballe médias: what is it for?
4 February 2011
This plugin is designed to manage sites that publish documents of all types.
It creates "media" items, namely: a "media" item is an article in the SPIP sense, created automatically when a document is uploaded, whether audio, video, image or text; only a single document can be linked to a given "media" article;
-
Managing creation and editing rights for objects
8 February 2011
By default, many features are restricted to administrators, but each can be configured independently to change the minimum status required to use it, in particular: writing content on the site, adjustable in the form-template management; adding notes to articles; adding captions and annotations to images;
-
Supporting all media types
13 April 2011
Unlike most software and media-sharing platforms, MediaSPIP aims to handle as many different media types as possible. The following are just a few examples from an ever-expanding list of supported formats: images: png, gif, jpg, bmp and more; audio: MP3, Ogg, Wav and more; video: AVI, MP4, OGV, mpg, mov, wmv and more; text, code and other data: OpenOffice, Microsoft Office (Word, PowerPoint, Excel), web (html, CSS), LaTeX, Google Earth and (...)
On other sites (4425)
-
squeeze image while capturing video with FFmpegFrameRecorder
12 April 2016, by Saty
I am trying to stream video with FFmpegFrameRecorder from javacv. Everything works well, except that the recorded video comes out slightly squeezed vertically.
I am using the code below, which half of the internet community seems to use for live streaming (a possible fix is sketched after the code):
// Imports assume the pre-Bytedeco javacv packaging (com.googlecode.javacv);
// newer releases moved these classes to org.bytedeco.*.
import static com.googlecode.javacv.cpp.opencv_core.IPL_DEPTH_8U;

import java.io.IOException;
import java.nio.Buffer;
import java.nio.ShortBuffer;

import android.app.Activity;
import android.content.Context;
import android.content.pm.ActivityInfo;
import android.graphics.Bitmap;
import android.graphics.Canvas;
import android.hardware.Camera;
import android.hardware.Camera.PreviewCallback;
import android.media.AudioFormat;
import android.media.AudioRecord;
import android.media.MediaRecorder;
import android.os.Bundle;
import android.os.PowerManager;
import android.util.Log;
import android.view.KeyEvent;
import android.view.SurfaceHolder;
import android.view.SurfaceView;
import android.view.View;
import android.view.View.OnClickListener;
import android.widget.Button;
import android.widget.LinearLayout;

import com.googlecode.javacv.FFmpegFrameRecorder;
import com.googlecode.javacv.cpp.opencv_core.IplImage;

public class MainActivity extends Activity implements OnClickListener {

    private final static String LOG_TAG = "MainActivity";

    private PowerManager.WakeLock mWakeLock;

    private String ffmpeg_link = "rtmp://username:password@xxx.xxx.xxx.xxx:1935/live/test.flv";
    //private String ffmpeg_link = "/mnt/sdcard/new_stream.flv";

    private volatile FFmpegFrameRecorder recorder;
    boolean recording = false;
    long startTime = 0;

    private int sampleAudioRateInHz = 44100;
    private int imageWidth = 320;
    private int imageHeight = 240;
    private int frameRate = 30;

    private Thread audioThread;
    volatile boolean runAudioThread = true;
    private AudioRecord audioRecord;
    private AudioRecordRunnable audioRecordRunnable;

    private CameraView cameraView;
    private IplImage yuvIplimage = null;

    private Button recordButton;
    private LinearLayout mainLayout;

    @Override
    public void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setRequestedOrientation(ActivityInfo.SCREEN_ORIENTATION_LANDSCAPE);
        setContentView(R.layout.activity_main);
        initLayout();
        initRecorder();
    }
    @Override
    protected void onResume() {
        super.onResume();
        if (mWakeLock == null) {
            PowerManager pm = (PowerManager) getSystemService(Context.POWER_SERVICE);
            mWakeLock = pm.newWakeLock(PowerManager.SCREEN_BRIGHT_WAKE_LOCK, LOG_TAG);
            mWakeLock.acquire();
        }
    }

    @Override
    protected void onPause() {
        super.onPause();
        if (mWakeLock != null) {
            mWakeLock.release();
            mWakeLock = null;
        }
    }

    @Override
    protected void onDestroy() {
        super.onDestroy();
        recording = false;
    }

    private void initLayout() {
        mainLayout = (LinearLayout) this.findViewById(R.id.record_layout);
        recordButton = (Button) findViewById(R.id.recorder_control);
        recordButton.setText("Start");
        recordButton.setOnClickListener(this);
        cameraView = new CameraView(this);
        LinearLayout.LayoutParams layoutParam = new LinearLayout.LayoutParams(imageWidth, imageHeight);
        mainLayout.addView(cameraView, layoutParam);
        Log.v(LOG_TAG, "added cameraView to mainLayout");
    }
    private void initRecorder() {
        Log.w(LOG_TAG, "initRecorder");

        if (yuvIplimage == null) {
            // Recreated after the frame size is set in surfaceChanged()
            yuvIplimage = IplImage.create(imageWidth, imageHeight, IPL_DEPTH_8U, 2);
            //yuvIplimage = IplImage.create(imageWidth, imageHeight, IPL_DEPTH_32S, 2);
            Log.v(LOG_TAG, "IplImage.create");
        }

        // NOTE: the recorder is opened at the initial 320x240 and is never
        // recreated when surfaceChanged() later updates imageWidth/imageHeight,
        // so frames captured at a different preview size get rescaled on the
        // way in -- a likely cause of the vertical squeeze described above.
        recorder = new FFmpegFrameRecorder(ffmpeg_link, imageWidth, imageHeight, 1);
        Log.v(LOG_TAG, "FFmpegFrameRecorder: " + ffmpeg_link + " imageWidth: " + imageWidth + " imageHeight " + imageHeight);

        recorder.setFormat("flv");
        Log.v(LOG_TAG, "recorder.setFormat(\"flv\")");

        recorder.setSampleRate(sampleAudioRateInHz);
        Log.v(LOG_TAG, "recorder.setSampleRate(sampleAudioRateInHz)");

        // re-set in the surfaceChanged method as well
        recorder.setFrameRate(frameRate);
        Log.v(LOG_TAG, "recorder.setFrameRate(frameRate)");

        // Create audio recording thread
        audioRecordRunnable = new AudioRecordRunnable();
        audioThread = new Thread(audioRecordRunnable);
    }

    // Start the capture
    public void startRecording() {
        try {
            recorder.start();
            startTime = System.currentTimeMillis();
            recording = true;
            audioThread.start();
        } catch (FFmpegFrameRecorder.Exception e) {
            e.printStackTrace();
        }
    }

    public void stopRecording() {
        // This should stop the audio thread from running
        runAudioThread = false;

        if (recorder != null && recording) {
            recording = false;
            Log.v(LOG_TAG, "Finishing recording, calling stop and release on recorder");
            try {
                recorder.stop();
                recorder.release();
            } catch (FFmpegFrameRecorder.Exception e) {
                e.printStackTrace();
            }
            recorder = null;
        }
    }
    @Override
    public boolean onKeyDown(int keyCode, KeyEvent event) {
        // Quit when the back button is pushed
        if (keyCode == KeyEvent.KEYCODE_BACK) {
            if (recording) {
                stopRecording();
            }
            finish();
            return true;
        }
        return super.onKeyDown(keyCode, event);
    }

    @Override
    public void onClick(View v) {
        if (!recording) {
            startRecording();
            Log.w(LOG_TAG, "Start Button Pushed");
            recordButton.setText("Stop");
        } else {
            stopRecording();
            Log.w(LOG_TAG, "Stop Button Pushed");
            recordButton.setText("Start");
        }
    }

    //---------------------------------------------
    // audio thread, gets and encodes audio data
    //---------------------------------------------
    class AudioRecordRunnable implements Runnable {

        @Override
        public void run() {
            // Set the thread priority
            android.os.Process.setThreadPriority(android.os.Process.THREAD_PRIORITY_URGENT_AUDIO);

            // Audio
            int bufferSize;
            short[] audioData;
            int bufferReadResult;

            // CHANNEL_CONFIGURATION_MONO is deprecated; CHANNEL_IN_MONO is the
            // modern constant. Kept here as in the original code.
            bufferSize = AudioRecord.getMinBufferSize(sampleAudioRateInHz,
                    AudioFormat.CHANNEL_CONFIGURATION_MONO, AudioFormat.ENCODING_PCM_16BIT);
            audioRecord = new AudioRecord(MediaRecorder.AudioSource.MIC, sampleAudioRateInHz,
                    AudioFormat.CHANNEL_CONFIGURATION_MONO, AudioFormat.ENCODING_PCM_16BIT, bufferSize);
            audioData = new short[bufferSize];

            Log.d(LOG_TAG, "audioRecord.startRecording()");
            audioRecord.startRecording();

            // Audio Capture/Encoding Loop
            while (runAudioThread) {
                // Read from audioRecord
                bufferReadResult = audioRecord.read(audioData, 0, audioData.length);
                if (bufferReadResult > 0) {
                    //Log.v(LOG_TAG, "audioRecord bufferReadResult: " + bufferReadResult);

                    // Changes in this variable may not be picked up despite it being "volatile"
                    if (recording) {
                        try {
                            // Write to FFmpegFrameRecorder
                            Buffer[] buffer = {ShortBuffer.wrap(audioData, 0, bufferReadResult)};
                            recorder.record(buffer);
                        } catch (FFmpegFrameRecorder.Exception e) {
                            Log.v(LOG_TAG, e.getMessage());
                            e.printStackTrace();
                        }
                    }
                }
            }
            Log.v(LOG_TAG, "AudioThread Finished");

            /* Capture/Encoding finished, release recorder */
            if (audioRecord != null) {
                audioRecord.stop();
                audioRecord.release();
                audioRecord = null;
                Log.v(LOG_TAG, "audioRecord released");
            }
        }
    }
    class CameraView extends SurfaceView implements SurfaceHolder.Callback, PreviewCallback {

        private boolean previewRunning = false;
        private SurfaceHolder holder;
        private Camera camera;
        private byte[] previewBuffer;
        long videoTimestamp = 0;
        Bitmap bitmap;
        Canvas canvas;

        public CameraView(Context _context) {
            super(_context);
            holder = this.getHolder();
            holder.addCallback(this);
            // setType() is deprecated (ignored since API 11) but harmless here.
            holder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);
        }

        @Override
        public void surfaceCreated(SurfaceHolder holder) {
            camera = Camera.open();
            try {
                camera.setPreviewDisplay(holder);
                camera.setPreviewCallback(this);

                Camera.Parameters currentParams = camera.getParameters();
                Log.v(LOG_TAG, "Preview Framerate: " + currentParams.getPreviewFrameRate());
                Log.v(LOG_TAG, "Preview imageWidth: " + currentParams.getPreviewSize().width + " imageHeight: " + currentParams.getPreviewSize().height);

                // Use these values
                imageWidth = currentParams.getPreviewSize().width;
                imageHeight = currentParams.getPreviewSize().height;
                frameRate = currentParams.getPreviewFrameRate();

                bitmap = Bitmap.createBitmap(imageWidth, imageHeight, Bitmap.Config.ALPHA_8);

                /*
                Log.v(LOG_TAG, "Creating previewBuffer size: " + imageWidth * imageHeight * ImageFormat.getBitsPerPixel(currentParams.getPreviewFormat()) / 8);
                previewBuffer = new byte[imageWidth * imageHeight * ImageFormat.getBitsPerPixel(currentParams.getPreviewFormat()) / 8];
                camera.addCallbackBuffer(previewBuffer);
                camera.setPreviewCallbackWithBuffer(this);
                */

                camera.startPreview();
                previewRunning = true;
            } catch (IOException e) {
                Log.v(LOG_TAG, e.getMessage());
                e.printStackTrace();
            }
        }
        @Override
        public void surfaceChanged(SurfaceHolder holder, int format, int width, int height) {
            Log.v(LOG_TAG, "Surface Changed: width " + width + " height: " + height);

            // We would do this if we wanted to reset the camera parameters
            /*
            if (!recording) {
                if (previewRunning) {
                    camera.stopPreview();
                }
                try {
                    //Camera.Parameters cameraParameters = camera.getParameters();
                    //p.setPreviewSize(imageWidth, imageHeight);
                    //p.setPreviewFrameRate(frameRate);
                    //camera.setParameters(cameraParameters);
                    camera.setPreviewDisplay(holder);
                    camera.startPreview();
                    previewRunning = true;
                } catch (IOException e) {
                    Log.e(LOG_TAG, e.getMessage());
                    e.printStackTrace();
                }
            }
            */

            // Get the current parameters
            Camera.Parameters currentParams = camera.getParameters();
            Log.v(LOG_TAG, "Preview Framerate: " + currentParams.getPreviewFrameRate());
            Log.v(LOG_TAG, "Preview imageWidth: " + currentParams.getPreviewSize().width + " imageHeight: " + currentParams.getPreviewSize().height);

            // Use these values
            imageWidth = currentParams.getPreviewSize().width;
            imageHeight = currentParams.getPreviewSize().height;
            frameRate = currentParams.getPreviewFrameRate();

            // Recreate the yuvIplimage at the real preview size (the recorder,
            // created earlier at 320x240, is not recreated here -- see initRecorder)
            yuvIplimage = IplImage.create(imageWidth, imageHeight, IPL_DEPTH_8U, 2);
            //yuvIplimage = IplImage.create(imageWidth, imageHeight, IPL_DEPTH_32S, 2);
        }
        @Override
        public void surfaceDestroyed(SurfaceHolder holder) {
            try {
                camera.setPreviewCallback(null);
                previewRunning = false;
                camera.release();
            } catch (RuntimeException e) {
                Log.v(LOG_TAG, e.getMessage());
                e.printStackTrace();
            }
        }

        @Override
        public void onPreviewFrame(byte[] data, Camera camera) {
            if (yuvIplimage != null && recording) {
                videoTimestamp = 1000 * (System.currentTimeMillis() - startTime);

                // Put the camera preview frame right into the yuvIplimage object
                yuvIplimage.getByteBuffer().put(data);

                // FAQ about IplImage:
                // - For custom raw processing of data, getByteBuffer() returns an NIO direct
                //   buffer wrapped around the memory pointed to by imageData, and under Android we can
                //   also use that Buffer with Bitmap.copyPixelsFromBuffer() and copyPixelsToBuffer().
                // - To get a BufferedImage from an IplImage, we may call getBufferedImage().
                // - The createFrom() factory method can construct an IplImage from a BufferedImage.
                // - There are also a few copy*() methods for BufferedImage<->IplImage data transfers.

                // Let's try it..
                // This works but only on transparency
                // Need to find the right Bitmap and IplImage matching types
                /*
                bitmap.copyPixelsFromBuffer(yuvIplimage.getByteBuffer());
                //bitmap.setPixel(10, 10, Color.MAGENTA);
                canvas = new Canvas(bitmap);
                Paint paint = new Paint();
                paint.setColor(Color.GREEN);
                float leftx = 20;
                float topy = 20;
                float rightx = 50;
                float bottomy = 100;
                RectF rectangle = new RectF(leftx, topy, rightx, bottomy);
                canvas.drawRect(rectangle, paint);
                bitmap.copyPixelsToBuffer(yuvIplimage.getByteBuffer());
                */

                //Log.v(LOG_TAG, "Writing Frame");
                try {
                    // Get the correct time
                    recorder.setTimestamp(videoTimestamp);

                    // Record the image into FFmpegFrameRecorder
                    recorder.record(yuvIplimage);
                } catch (FFmpegFrameRecorder.Exception e) {
                    Log.v(LOG_TAG, e.getMessage());
                    e.printStackTrace();
                }
            }
        }
    }
}
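A plausible cause of the squeeze, visible in the code above: initRecorder() opens the FFmpegFrameRecorder at the initial 320x240, while surfaceChanged() later adopts whatever preview size the camera actually delivers (often 640x480 or a 16:9 size) without recreating the recorder, so every frame is rescaled to a different aspect ratio. A minimal sketch of a fix, to be called from surfaceChanged() before recording starts; the helper name is illustrative, not part of the original:

private void syncRecorderToPreview(Camera.Parameters params) {
    Camera.Size preview = params.getPreviewSize();
    if (preview.width != imageWidth || preview.height != imageHeight) {
        imageWidth = preview.width;
        imageHeight = preview.height;
        // Recreate both the YUV buffer and the recorder at the real size;
        // otherwise FFmpegFrameRecorder rescales each frame to 320x240 and
        // the aspect-ratio mismatch shows up as a vertical squeeze.
        yuvIplimage = IplImage.create(imageWidth, imageHeight, IPL_DEPTH_8U, 2);
        recorder = new FFmpegFrameRecorder(ffmpeg_link, imageWidth, imageHeight, 1);
        recorder.setFormat("flv");
        recorder.setSampleRate(sampleAudioRateInHz);
        recorder.setFrameRate(frameRate);
    }
}

Alternatively, Camera.Parameters.setPreviewSize() can request a preview size with the same aspect ratio as the recorder, provided that size appears in getSupportedPreviewSizes().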
-
Fix min/max validation. Closes gh-666. Fixes #648
20 March 2013, by ekonijn
In 1.10.0, min/max validation was supported for input type="text",
where min/max were interpreted as numbers. This means min/max
for dates would not work: min="2012-02-13" was interpreted as min="Not a Number".

In 1.11.0, min/max were no longer converted to numbers. This means
min/max for dates worked, but min/max for numbers failed:
"50" < "150" < "1000" does not hold when the values are compared as
strings (see the sketch after this excerpt).
For an example, see http://jsbin.com/awokex/3

This commit makes the behaviour of min/max dependent on the input type:
* input type=text (or no type attribute) has numeric min/max, as in 1.10.0
* input type=date has working min/max for dates;
  on mobile browsers you also get a date picker,
  plus the browser may reject invalid dates before
  javascript gets a chance to complain.
* input type=number or range gets numeric min/max,
  plus a numeric keypad or slider on mobile browsers,
  plus the browser may reject invalid input before javascript
  gets a chance to complain.

Allowing use of min/max with type=number/range/date is important
for mobile browsers, where the numeric keypad or date picker
makes the input much easier to use than a generic text input field.
In this situation jquery-validate remains necessary to support
older browsers that do not do input validation based on type
and min/max.

For situations where numeric input should be validated by jquery
without giving the browser a chance to validate the input format,
input type=text in combination with min/max can be used, as in 1.10.0.
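The heart of the regression is string-versus-numeric comparison. A tiny illustration (written in Java for brevity; the plugin itself is JavaScript, where unconverted strings compare in the same lexicographic way):

public class MinMaxComparison {
    public static void main(String[] args) {
        // Lexicographic comparison, as effectively happened in 1.11.0 when
        // min/max stayed strings: "50" sorts after "150" because '5' > '1'.
        System.out.println("50".compareTo("150") < 0);  // false
        // Numeric comparison, as 1.10.0 performed after conversion:
        System.out.println(50 < 150);                   // true
    }
}

-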
Writing an MP4 file on the Mac with OpenCV ffmpeg
13 January 2015, by Sameer Parekh
I am using OpenCV with ffmpeg on a Mac to write video. I have been able to write .avi files successfully using the fourcc code FMP4. I would like to write .mp4 files, however. When I try to write an .mp4 file using fourcc FMP4, I get this error:
[mp4 @ 0x100b4ec00] Tag FMP4/0x34504d46 incompatible with output codec id '13' ( [0][0][0])
When I use AVC1 I get the following error:
[libx264 @ 0x104003000] broken ffmpeg default settings detected
[libx264 @ 0x104003000] use an encoding preset (e.g. -vpre medium)
[libx264 @ 0x104003000] preset usage: -vpre <speed> -vpre <profile>
[libx264 @ 0x104003000] speed presets are listed in x264 --help
[libx264 @ 0x104003000] profile is optional; x264 defaults to high
Could not open codec 'libx264': Unspecified error
Does anyone here know the right codec to use with OpenCV and ffmpeg to write to an MP4 container on the Mac? (A possible workaround is sketched after this excerpt.)
If AVC1 is the right codec, how do I install ffmpeg + OpenCV correctly? I did:
brew install gpac
brew install ffmpeg
brew install opencv
The call I am using to open the VideoWriter:
fourcc = cv2.cv.CV_FOURCC('A', 'V', 'C', '1')
video_out = cv2.VideoWriter(
    filename=output_filename,
    fourcc=fourcc,
    fps=video_fps,
    frameSize=(video_width, video_height),
    isColor=1)
When I run x264 --help, I get:
% x264 --help
x264 core:125
Syntax: x264 [options] -o outfile infile
Infile can be raw (in which case resolution is required),
or YUV4MPEG (*.y4m),
or Avisynth if compiled with support (no),
or libav* formats if compiled with lavf support (no) or ffms support (no).
Outfile type is selected by filename:
.264 -> Raw bytestream
.mkv -> Matroska
.flv -> Flash Video
.mp4 -> MP4 if compiled with GPAC support (no)
Output bit depth: 8 (configured at compile time)
Options:
-h, --help List basic options
--longhelp List more options
--fullhelp List all options
Example usage:
      Constant quality mode:
            x264 --crf 24 -o <output> <input>
      Two-pass with a bitrate of 1000kbps:
            x264 --pass 1 --bitrate 1000 -o <output> <input>
            x264 --pass 2 --bitrate 1000 -o <output> <input>
      Lossless:
            x264 --qp 0 -o <output> <input>
      Maximum PSNR at the cost of speed and visual quality:
            x264 --preset placebo --tune psnr -o <output> <input>
      Constant bitrate at 1000kbps with a 2 second-buffer:
            x264 --vbv-bufsize 2000 --bitrate 1000 -o <output> <input>
Presets:
--profile <string> Force the limits of an H.264 profile
Overrides all settings.
- baseline,main,high,high10,high422,high444
--preset <string> Use a preset to select encoding settings [medium]
Overridden by user settings.
- ultrafast,superfast,veryfast,faster,fast
- medium,slow,slower,veryslow,placebo
--tune <string> Tune the settings for a particular type of source
or situation
Overridden by user settings.
Multiple tunings are separated by commas.
Only one psy tuning can be used at a time.
- psy tunings: film,animation,grain,
stillimage,psnr,ssim
- other tunings: fastdecode,zerolatency
Frame-type options:
-I, --keyint <integer or "infinite"> Maximum GOP size [250]
--tff Enable interlaced mode (top field first)
--bff Enable interlaced mode (bottom field first)
--pulldown <string> Use soft pulldown to change frame rate
- none, 22, 32, 64, double, triple, euro (requires cfr input)
Ratecontrol:
-B, --bitrate <integer> Set bitrate (kbit/s)
--crf <float> Quality-based VBR (0-51) [23.0]
--vbv-maxrate <integer> Max local bitrate (kbit/s) [0]
--vbv-bufsize <integer> Set size of the VBV buffer (kbit) [0]
-p, --pass <integer> Enable multipass ratecontrol
- 1: First pass, creates stats file
- 2: Last pass, does not overwrite stats file
Input/Output:
-o, --output <string> Specify output file
--sar width:height Specify Sample Aspect Ratio
--fps Specify framerate
--seek <integer> First frame to encode
--frames <integer> Maximum number of frames to encode
--level <string> Specify level (as defined by Annex A)
--quiet Quiet Mode
Filtering:
--vf, --video-filter <filter0>/<filter1>/... Apply video filtering to the input file
Filter options may be specified in <filter>:<option>=<value> format.
Available filters:
crop:left,top,right,bottom
select_every:step,offset1[,...]
Thanks,
s
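A note on the errors above: the "broken ffmpeg default settings detected" message comes from libx264 rejecting the encoder defaults that older OpenCV builds passed to it, so AVC1 can fail even when libx264 is installed. A common workaround at the time was to pick the MPEG-4 Part 2 codec ('mp4v'), which .mp4 containers accept and which sidesteps libx264 entirely. The sketch below shows the equivalent call in OpenCV's Java binding (org.opencv.videoio, OpenCV 3+); the question itself uses the Python 2.4-era API, so treat the class names and versions here as assumptions:

import org.opencv.core.Core;
import org.opencv.core.CvType;
import org.opencv.core.Mat;
import org.opencv.core.Size;
import org.opencv.videoio.VideoWriter;

public class Mp4WriterSketch {
    public static void main(String[] args) {
        System.loadLibrary(Core.NATIVE_LIBRARY_NAME);
        // 'mp4v' (MPEG-4 Part 2) is accepted by .mp4 containers and avoids
        // the libx264 preset problem, at some cost in compression efficiency.
        int fourcc = VideoWriter.fourcc('m', 'p', '4', 'v');
        VideoWriter out = new VideoWriter("out.mp4", fourcc, 30.0, new Size(640, 480), true);
        if (!out.isOpened()) {
            System.err.println("VideoWriter failed to open: codec/container mismatch?");
            return;
        }
        Mat frame = Mat.zeros(480, 640, CvType.CV_8UC3); // one black frame as a smoke test
        out.write(frame);
        out.release();
    }
}

If H.264 output is required, the usual route was an ffmpeg built with libx264 plus an OpenCV version recent enough to set sane x264 defaults, rather than the brew-installed combination shown above.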