
Media (1)
-
The Pirate Bay from Belgium
1 April 2013, by
Updated: April 2013
Language: French
Type: Image
Other articles (27)
-
Publishing on MediaSpip
13 June 2013. Can I post content from an iPad tablet?
Yes, if your installed MediaSpip is at version 0.2 or higher. If needed, contact the administrator of your MediaSpip to find out.
-
Final creation of the channel
12 March 2010. Once your request is approved, you can proceed with the actual creation of the channel. Each channel is a fully fledged site placed under your responsibility. The platform administrators have no access to it.
Upon validation, you receive an email inviting you to create your channel.
To do so, simply go to its address, in our example "http://votre_sous_domaine.mediaspip.net".
At that point you are asked for a password; you simply need to (...)
-
The farm's regular Cron tasks
1 December 2010. Managing the farm relies on running several repetitive tasks, known as Cron tasks, at regular intervals.
The super Cron (gestion_mutu_super_cron)
This task, scheduled to run every minute, simply calls the Cron of every instance in the shared-hosting farm on a regular basis. Combined with a system Cron on the central site of the farm, this makes it possible to generate regular visits to the various sites and to prevent the tasks of rarely visited sites from being too (...)
On other sites (2718)
-
FFmpeg drawtext not working in my code
10 July 2018, by Dev
I am trying to overlay text on an existing movie using FFmpeg. Overlaying an image works perfectly, but drawtext fails. My code is as below.

GeneralUtils.checkForPermissionsMAndAbove(RecordVideo.this, true);
LoadJNI vk = new LoadJNI();
String[] complexCommand = {
        "ffmpeg", "-i", FilePath, "-vf",
        "drawtext=text='Hello Dev..':"
                + "fontsize=24:fontfile=/system/fonts/DroidSans.ttf:box=1:boxcolor=black@0.5:x=w-tw:y=h-th",
        FilePath1};
vk.run(complexCommand, StorageDIR, getApplicationContext());

I am getting the following error:
ffmpeg version git-2016-10-26-efa89a8 Copyright (c) 2000-2016 the FFmpeg
developers
built with gcc 4.9 (GCC) 20140827 (prerelease)
ffmpeg4android 3.22_00_full_LM322
libavutil 55. 35.100 / 55. 35.100
libavcodec 57. 65.100 / 57. 65.100
libavformat 57. 57.100 / 57. 57.100
libavdevice 57. 2.100 / 57. 2.100
libavfilter 6. 66.100 / 6. 66.100
libswscale 4. 3.100 / 4. 3.100
libswresample 2. 4.100 / 2. 4.100
libpostproc 54. 2.100 / 54. 2.100
Splitting the commandline.
Reading option '-i' ... matched as input file with argument '/storage/emulated/0/Pictures/ReporterLive/VID_10072018141402.avi'.
Reading option '-vf' ... matched as option 'vf' (set video filters) with argument 'drawtext=text='Hello Dev..':fontsize=24:fontfile=/system/fonts/DroidSans.ttf:box=1:boxcolor=black@0.5:x=w-tw:y=h-th'.
Reading option '/storage/emulated/0/Pictures/ReporterLive/VID_10072018141402_1.avi' ... matched as output file.
Finished splitting the commandline.
Parsing a group of options: global .
Successfully parsed a group of options.
Parsing a group of options: input file /storage/emulated/0/Pictures/ReporterLive/VID_10072018141402.avi.
Successfully parsed a group of options.
Opening an input file: /storage/emulated/0/Pictures/ReporterLive/VID_10072018141402.avi.
Setting default whitelist 'file,crypto'
Probing mov,mp4,m4a,3gp,3g2,mj2 score:100 size:2048
Probing mp3 score:1 size:2048
Format mov,mp4,m4a,3gp,3g2,mj2 probed with size=2048 and score=100
type: 70797466 'ftyp' parent:'root' sz: 24 8 3096055
ISO: File Type Major Brand: mp42
type: 7461646d 'mdat' parent:'root' sz: 3093790 32 3096055
type: 766f6f6d 'moov' parent:'root' sz: 2241 3093822 3096055
type: 6468766d 'mvhd' parent:'moov' sz: 108 8 2233
time scale = 1000
type: 61746475 'udta' parent:'moov' sz: 52 116 2233
type: 4e4c4453 'SDLN' parent:'udta' sz: 16 8 44
type: 64726d73 'smrd' parent:'udta' sz: 16 24 44
type: 61746d73 'smta' parent:'udta' sz: 12 40 44
type: 6174656d 'meta' parent:'moov' sz: 119 168 2233
type: 726c6468 'hdlr' parent:'meta' sz: 33 8 111
ctype= (0x00000000)
stype= mdta
type: 7379656b 'keys' parent:'meta' sz: 43 41 111
type: 74736c69 'ilst' parent:'meta' sz: 35 84 111
type: 01000000 '' parent:'ilst' sz: 27 8 27
lang " " tag "com.android.version" value "7.0" atom "" 7 3
type: 6b617274 'trak' parent:'moov' sz: 1023 287 2233
type: 64686b74 'tkhd' parent:'trak' sz: 92 8 1015
type: 6169646d 'mdia' parent:'trak' sz: 923 100 1015
type: 6468646d 'mdhd' parent:'mdia' sz: 32 8 915
type: 726c6468 'hdlr' parent:'mdia' sz: 44 40 915
ctype= (0x00000000)
stype= vide
type: 666e696d 'minf' parent:'mdia' sz: 839 84 915
type: 64686d76 'vmhd' parent:'minf' sz: 20 8 831
type: 666e6964 'dinf' parent:'minf' sz: 36 28 831
type: 66657264 'dref' parent:'dinf' sz: 28 8 28
type url size 12
Unknown dref type 0x08206c7275 size 12
type: 6c627473 'stbl' parent:'minf' sz: 775 64 831
type: 64737473 'stsd' parent:'stbl' sz: 171 8 767
size=155 4CC= avc1/0x31637661 codec_type=0
type: 43637661 'avcC' parent:'stsd' sz: 34 8 69
type: 70736170 'pasp' parent:'stsd' sz: 16 42 69
type: 726c6f63 'colr' parent:'stsd' sz: 19 58 69
nclx: pri 1 trc 1 matrix 1 full 0
type: 73747473 'stts' parent:'stbl' sz: 304 179 767
track[0].stts.entries = 36
sample_count=1, sample_duration=5942
.
sample_count=2, sample_duration=5401
type: 73737473 'stss' parent:'stbl' sz: 24 483 767
keyframe_count = 2
type: 7a737473 'stsz' parent:'stbl' sz: 188 507 767
sample_size = 0 sample_count = 42
type: 63737473 'stsc' parent:'stbl' sz: 52 695 767
track[0].stsc.entries = 3
type: 6f637473 'stco' parent:'stbl' sz: 28 747 767
AVIndex stream 0, sample 0, offset 831f, dts 0, size 212746, distance 0, keyframe 1
.
AVIndex stream 0, sample 41, offset 2e28a5, dts 221935, size 68753, distance 11, keyframe 0
type: 6b617274 'trak' parent:'moov' sz: 931 1310 2233
type: 64686b74 'tkhd' parent:'trak' sz: 92 8 923
type: 6169646d 'mdia' parent:'trak' sz: 831 100 923
type: 6468646d 'mdhd' parent:'mdia' sz: 32 8 823
type: 726c6468 'hdlr' parent:'mdia' sz: 44 40 823
ctype= (0x00000000)
stype= soun
type: 666e696d 'minf' parent:'mdia' sz: 747 84 823
type: 64686d73 'smhd' parent:'minf' sz: 16 8 739
type: 666e6964 'dinf' parent:'minf' sz: 36 24 739
type: 66657264 'dref' parent:'dinf' sz: 28 8 28
type url size 12
Unknown dref type 0x08206c7275 size 12
type: 6c627473 'stbl' parent:'minf' sz: 687 60 739
type: 64737473 'stsd' parent:'stbl' sz: 91 8 679
size=75 4CC= mp4a/0x6134706d codec_type=1
audio channels 2
version =0, isom =1
type: 73647365 'esds' parent:'stsd' sz: 39 8 39
MPEG-4 description: tag=0x03 len=25
MPEG-4 description: tag=0x04 len=17
esds object type id 0x40
MPEG-4 description: tag=0x05 len=2
Specific MPEG-4 header len=2
mp4a config channels 2 obj 2 ext obj 0 sample rate 48000 ext sample rate 0
type: 73747473 'stts' parent:'stbl' sz: 32 99 679
track[1].stts.entries = 2
sample_count=1, sample_duration=1024
sample_count=113, sample_duration=1024
type: 7a737473 'stsz' parent:'stbl' sz: 476 131 679
sample_size = 0 sample_count = 114
type: 63737473 'stsc' parent:'stbl' sz: 52 607 679
track[1].stsc.entries = 3
type: 6f637473 'stco' parent:'stbl' sz: 28 659 679
AVIndex stream 1, sample 0, offset 20, dts 0, size 682, distance 0, keyframe 1
AVIndex stream 1, sample 1, offset 2ca, dts 1024, size 683, distance 0, keyframe 1
.
AVIndex stream 1, sample 99, offset 28939d, dts 101376, size 682, distance 0, keyframe 1
.
AVIndex stream 1, sample 113, offset 28b8f2, dts 115712, size 683, distance 0, keyframe 1
on_parse_exit_offset=3096055
rfps: 16.500000 0.014060
rfps: 16.583333 0.003530
rfps: 16.666667 0.000000
rfps: 16.666667 0.000000
rfps: 16.750000 0.003470
rfps: 16.750000 0.003470
rfps: 16.833333 0.013939
rfps: 50.000000 0.000001
rfps: 50.000000 0.000001
Before avformat_find_stream_info() pos: 3096055 bytes read:35009 seeks:1 nb_streams:2
nal_unit_type: 7, nal_ref_idc: 3
nal_unit_type: 8, nal_ref_idc: 3
stream 0, sample 0, dts 0
.
stream 1, sample 47, dts 1002667
nal_unit_type: 5, nal_ref_idc: 3
Reinit context to 1920x1088, pix_fmt: yuv420p
All info found
stream 0: start_time: 0.000 duration: 2.526
stream 1: start_time: 0.000 duration: 2.432
format: start_time: 0.000 duration: 2.520 bitrate=9828 kb/s
After avformat_find_stream_info() pos: 246313 bytes read:281290 seeks:2 frames:48
Input #0, mov,mp4,m4a,3gp,3g2,mj2, from '/storage/emulated/0/Pictures/ReporterLive/VID_10072018141402.avi':
Metadata:
major_brand : mp42
minor_version : 0
compatible_brands: isommp42
creation_time : 2018-07-10T08:44:06.000000Z
com.android.version: 7.0
Duration: 00:00:02.52, start: 0.000000, bitrate: 9828 kb/s
Stream #0:0(eng), 1, 1/90000: Video: h264 (High) (avc1 / 0x31637661), yuv420p(tv, bt709), 1920x1080, 9551 kb/s, SAR 1:1 DAR 16:9, 16.63 fps, 16.67 tbr, 90k tbn, 180k tbc (default)
Metadata:
creation_time : 2018-07-10T08:44:06.000000Z
handler_name : VideoHandle
Stream #0:1(eng), 47, 1/48000: Audio: aac (LC) (mp4a / 0x6134706D), 48000 Hz, stereo, fltp, 256 kb/s (default)
Metadata:
creation_time : 2018-07-10T08:44:06.000000Z
handler_name : SoundHandle
Successfully opened the file.
Parsing a group of options: output file /storage/emulated/0/Pictures/ReporterLive/VID_10072018141402_1.avi.
Applying option vf (set video filters) with argument drawtext=text='Hello Dev..':fontsize=24:fontfile=/system/fonts/DroidSans.ttf:box=1:boxcolor=black@0.5:x=w-tw:y=h-th.
Successfully parsed a group of options.
Opening an output file: /storage/emulated/0/Pictures/ReporterLive/VID_10072018141402_1.avi.
Setting default whitelist 'file,crypto'
Successfully opened the file.
No such filter: 'drawtext'
Error opening filters!
Statistics: 0 seeks, 0 writeouts
Statistics: 281290 bytes read, 2 seeks
Can anyone tell me what I am doing wrong? I want to do it in the simplest form.
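Not part of the original post, but worth noting: the decisive line in the log is "No such filter: 'drawtext'". The command array itself is fine; this particular FFmpeg build (ffmpeg4android 3.22) was simply compiled without the drawtext filter, which requires FFmpeg to be configured with --enable-libfreetype. A quick way to check any build for the filter, sketched for a desktop shell (assumes an ffmpeg binary on the PATH):

```shell
# Check whether this ffmpeg build includes the drawtext filter.
# If the grep finds nothing, the binary must be replaced by one
# compiled with --enable-libfreetype (drawtext depends on libfreetype).
if ffmpeg -hide_banner -filters 2>/dev/null | grep -qw drawtext; then
    echo "drawtext available"
else
    echo "drawtext missing: rebuild with --enable-libfreetype"
fi
```

Until the bundled binary is swapped for one built with libfreetype, no change to the Java code will make drawtext work.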
-
FFmpeg: change input without stopping the process
10 July 2018, by admin883
How can I change the input of ffmpeg without stopping the process, on Linux (Debian 9)? I am using a DeckLink input and I need to switch to an MP4 file input.

ffmpeg -f decklink -i 'DeckLink Mini Recorder' -vf setpts=PTS-STARTPTS -pix_fmt uyvy422 -s 1920x1080 -r 25000/1000 -f decklink 'DeckLink Mini Monitor'
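Not from the thread itself, but one common way to approach this is to decouple ffmpeg from its source with a named pipe: ffmpeg keeps reading the pipe while different feeder processes (the DeckLink capture, then the MP4 file) take turns writing to it. A minimal sketch under that assumption; the pipe path and container format are illustrative, and the ffmpeg invocations are shown commented out:

```shell
# Sketch: ffmpeg reads one long-lived FIFO; the feeders are swapped freely.
PIPE=/tmp/ffmpeg_input.ts
[ -p "$PIPE" ] || mkfifo "$PIPE"

# Long-running consumer (runs until the pipe is closed for good):
#   ffmpeg -f mpegts -i "$PIPE" -pix_fmt uyvy422 -s 1920x1080 \
#          -f decklink 'DeckLink Mini Monitor' &

# Feeders, run one after the other; both emit MPEG-TS so the
# consumer sees a single continuous stream:
#   ffmpeg -f decklink -i 'DeckLink Mini Recorder' -f mpegts -y "$PIPE"
#   ffmpeg -re -i input.mp4 -f mpegts -y "$PIPE"
echo "pipe ready: $PIPE"
```

Timestamps will still jump at the switch point unless the feeders normalize them (e.g. with setpts), so this is a starting point rather than a drop-in solution.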
-
Recording OpenCV-processed frames on Android using CameraBridgeViewBase
16 August 2018, by Razvan Ilin
I basically want to do the same thing as here: Recording Live OpenCV Processing on Android
I am able to preview the camera using OpenCV's JavaCameraView and process the frames (finding color blobs), but I got stuck when I had to record the video as well.
I tried to implement the highest-voted answer to the above question, but due to my limited knowledge of these technologies, I couldn't understand exactly how to do it. Then I tried to use ffmpeg, with this as an example: https://github.com/bytedeco/javacv/blob/master/samples/RecordActivity.java
The point where I got stuck was how to pass my CameraBridgeViewBase to the CameraView that is used to record the frames. I'm pretty sure I need to modify my onCreate() method somehow, but I need some pointers on how to approach it. I'll paste my code below:
OpenCVActivity.java:
package ch.hepia.iti.opencvnativeandroidstudio;
import android.Manifest;
import android.content.Context;
import android.content.pm.PackageManager;
import android.hardware.Camera;
import android.media.AudioFormat;
import android.media.AudioRecord;
import android.media.CamcorderProfile;
import android.media.MediaRecorder;
import android.media.MediaScannerConnection;
import android.os.Bundle;
import android.os.Environment;
import android.support.v4.app.ActivityCompat;
import android.support.v7.app.AppCompatActivity;
import android.util.Log;
import android.view.Display;
import android.view.KeyEvent;
import android.view.LayoutInflater;
import android.view.SurfaceHolder;
import android.view.SurfaceView;
import android.view.View;
import android.view.WindowManager;
import android.widget.Button;
import android.widget.LinearLayout;
import android.widget.RelativeLayout;
import android.widget.Toast;
import org.bytedeco.javacpp.avutil;
import org.bytedeco.javacpp.opencv_core;
import org.bytedeco.javacv.FFmpegFrameFilter;
import org.bytedeco.javacv.FFmpegFrameRecorder;
import org.bytedeco.javacv.Frame;
import org.bytedeco.javacv.FrameFilter;
import org.bytedeco.javacv.OpenCVFrameConverter;
import org.opencv.android.BaseLoaderCallback;
import org.opencv.android.CameraBridgeViewBase;
import org.opencv.android.LoaderCallbackInterface;
import org.opencv.android.OpenCVLoader;
import org.opencv.core.Mat;
import org.opencv.core.Scalar;
import java.io.File;
import java.io.IOException;
import java.nio.ByteBuffer;
import java.nio.ShortBuffer;
import java.util.Collections;
import java.util.Comparator;
import java.util.List;
public class OpenCVActivity extends AppCompatActivity implements CameraBridgeViewBase.CvCameraViewListener2, View.OnClickListener {
private String ffmpeg_link = Environment.getExternalStorageDirectory().getAbsolutePath() + "/test2Video.mp4";
private String LOG_TAG = "VideoTest";
private static final String TAG = "OCVSample::Activity";
private CameraBridgeViewBase _cameraBridgeViewBase;
private Button recordButton;
long startTime = 0;
boolean recording = false;
private FFmpegFrameRecorder recorder;
private boolean isPreviewOn = false;
/*Filter information, change boolean to true if adding a fitler*/
private boolean addFilter = true;
private String filterString = "";
FFmpegFrameFilter filter;
private int sampleAudioRateInHz = 44100;
private int imageWidth = 320;
private int imageHeight = 240;
private int frameRate = 30;
/* audio data getting thread */
private AudioRecord audioRecord;
private AudioRecordRunnable audioRecordRunnable;
private Thread audioThread;
volatile boolean runAudioThread = true;
/* video data getting thread */
private Camera cameraDevice;
private CameraView cameraView;
private Frame yuvImage = null;
/* layout setting */
private final int bg_screen_bx = 232;
private final int bg_screen_by = 128;
private final int bg_screen_width = 700;
private final int bg_screen_height = 500;
private final int bg_width = 1123;
private final int bg_height = 715;
private final int live_width = 640;
private final int live_height = 480;
private int screenWidth, screenHeight;
private Button btnRecorderControl;
/* The number of seconds in the continuous record loop (or 0 to disable loop). */
final int RECORD_LENGTH = 0;
Frame[] images;
long[] timestamps;
ShortBuffer[] samples;
int imagesIndex, samplesIndex;
private BaseLoaderCallback _baseLoaderCallback = new BaseLoaderCallback(this) {
@Override
public void onManagerConnected(int status) {
switch (status) {
case LoaderCallbackInterface.SUCCESS: {
Log.i(TAG, "OpenCV loaded successfully");
// Load ndk built module, as specified in moduleName in build.gradle
// after opencv initialization
System.loadLibrary("native-lib");
_cameraBridgeViewBase.enableView();
}
break;
default: {
super.onManagerConnected(status);
}
}
}
};
@Override
public void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
Display display = ((WindowManager) getSystemService(Context.WINDOW_SERVICE)).getDefaultDisplay();
screenWidth = display.getWidth();
screenHeight = display.getHeight();
getWindow().addFlags(WindowManager.LayoutParams.FLAG_KEEP_SCREEN_ON);
setContentView(R.layout.activity_main);
// Permissions for Android 6+
ActivityCompat.requestPermissions(OpenCVActivity.this,
new String[]{Manifest.permission.CAMERA},
1);
_cameraBridgeViewBase = (CameraBridgeViewBase) findViewById(R.id.main_surface);
_cameraBridgeViewBase.setVisibility(SurfaceView.VISIBLE);
_cameraBridgeViewBase.setCvCameraViewListener(this);
RelativeLayout.LayoutParams layoutParam = null;
LayoutInflater myInflate = null;
myInflate = (LayoutInflater) getSystemService(Context.LAYOUT_INFLATER_SERVICE);
RelativeLayout topLayout = new RelativeLayout(this);
setContentView(topLayout);
LinearLayout preViewLayout = (LinearLayout) myInflate.inflate(R.layout.activity_main, null);
layoutParam = new RelativeLayout.LayoutParams(screenWidth, screenHeight);
topLayout.addView(_cameraBridgeViewBase, layoutParam);
recordButton = (Button) findViewById(R.id.recorder_control);
recordButton.setText("Start");
recordButton.setOnClickListener(this);
/* add camera view */
int display_width_d = (int) (1.0 * bg_screen_width * screenWidth / bg_width);
int display_height_d = (int) (1.0 * bg_screen_height * screenHeight / bg_height);
int prev_rw, prev_rh;
if (1.0 * display_width_d / display_height_d > 1.0 * live_width / live_height) {
prev_rh = display_height_d;
prev_rw = (int) (1.0 * display_height_d * live_width / live_height);
} else {
prev_rw = display_width_d;
prev_rh = (int) (1.0 * display_width_d * live_height / live_width);
}
layoutParam = new RelativeLayout.LayoutParams(prev_rw, prev_rh);
layoutParam.topMargin = (int) (1.0 * bg_screen_by * screenHeight / bg_height);
layoutParam.leftMargin = (int) (1.0 * bg_screen_bx * screenWidth / bg_width);
cameraDevice = Camera.open();
cameraView = new CameraView(this, cameraDevice);
topLayout.addView(cameraView, layoutParam);
}
//---------------------------------------
// initialize ffmpeg_recorder
//---------------------------------------
private void initRecorder() {
Log.w(LOG_TAG,"init recorder");
if (RECORD_LENGTH > 0) {
imagesIndex = 0;
images = new Frame[RECORD_LENGTH * frameRate];
timestamps = new long[images.length];
for (int i = 0; i < images.length; i++) {
images[i] = new Frame(imageWidth, imageHeight, Frame.DEPTH_UBYTE, 2);
timestamps[i] = -1;
}
} else if (yuvImage == null) {
yuvImage = new Frame(imageWidth, imageHeight, Frame.DEPTH_UBYTE, 2);
Log.i(LOG_TAG, "create yuvImage");
}
Log.i(LOG_TAG, "ffmpeg_url: " + ffmpeg_link);
recorder = new FFmpegFrameRecorder(ffmpeg_link, imageWidth, imageHeight, 1);
recorder.setFormat("mp4");
recorder.setSampleRate(sampleAudioRateInHz);
// Set in the surface changed method
recorder.setFrameRate(frameRate);
// The filterString is any ffmpeg filter.
// Here is the link for a list: https://ffmpeg.org/ffmpeg-filters.html
filterString = "transpose=0";
filter = new FFmpegFrameFilter(filterString, imageWidth, imageHeight);
//default format on android
filter.setPixelFormat(avutil.AV_PIX_FMT_NV21);
Log.i(LOG_TAG, "recorder initialize success");
audioRecordRunnable = new AudioRecordRunnable();
audioThread = new Thread(audioRecordRunnable);
runAudioThread = true;
}
public void startRecording() {
initRecorder();
try {
recorder.start();
startTime = System.currentTimeMillis();
recording = true;
audioThread.start();
if(addFilter) {
filter.start();
}
} catch (FFmpegFrameRecorder.Exception | FrameFilter.Exception e) {
e.printStackTrace();
}
}
public void stopRecording() {
runAudioThread = false;
try {
audioThread.join();
} catch (InterruptedException e) {
// reset interrupt to be nice
Thread.currentThread().interrupt();
return;
}
audioRecordRunnable = null;
audioThread = null;
if (recorder != null && recording) {
if (RECORD_LENGTH > 0) {
Log.v(LOG_TAG,"Writing frames");
try {
int firstIndex = imagesIndex % images.length;
int lastIndex = (imagesIndex - 1) % images.length;
if (imagesIndex <= images.length) {
firstIndex = 0;
lastIndex = imagesIndex - 1;
}
if ((startTime = timestamps[lastIndex] - RECORD_LENGTH * 1000000L) < 0) {
startTime = 0;
}
if (lastIndex < firstIndex) {
lastIndex += images.length;
}
for (int i = firstIndex; i <= lastIndex; i++) {
long t = timestamps[i % timestamps.length] - startTime;
if (t >= 0) {
if (t > recorder.getTimestamp()) {
recorder.setTimestamp(t);
}
recorder.record(images[i % images.length]);
}
}
firstIndex = samplesIndex % samples.length;
lastIndex = (samplesIndex - 1) % samples.length;
if (samplesIndex <= samples.length) {
firstIndex = 0;
lastIndex = samplesIndex - 1;
}
if (lastIndex < firstIndex) {
lastIndex += samples.length;
}
for (int i = firstIndex; i <= lastIndex; i++) {
recorder.recordSamples(samples[i % samples.length]);
}
} catch (FFmpegFrameRecorder.Exception e) {
Log.v(LOG_TAG,e.getMessage());
e.printStackTrace();
}
}
recording = false;
Log.v(LOG_TAG,"Finishing recording, calling stop and release on recorder");
try {
recorder.stop();
recorder.release();
filter.stop();
filter.release();
} catch (FFmpegFrameRecorder.Exception | FrameFilter.Exception e) {
e.printStackTrace();
}
recorder = null;
}
}
@Override
public boolean onKeyDown(int keyCode, KeyEvent event) {
if (keyCode == KeyEvent.KEYCODE_BACK) {
if (recording) {
stopRecording();
}
finish();
return true;
}
return super.onKeyDown(keyCode, event);
}
//---------------------------------------------
// audio thread, gets and encodes audio data
//---------------------------------------------
class AudioRecordRunnable implements Runnable {
@Override
public void run() {
android.os.Process.setThreadPriority(android.os.Process.THREAD_PRIORITY_URGENT_AUDIO);
// Audio
int bufferSize;
ShortBuffer audioData;
int bufferReadResult;
bufferSize = AudioRecord.getMinBufferSize(sampleAudioRateInHz,
AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT);
audioRecord = new AudioRecord(MediaRecorder.AudioSource.MIC, sampleAudioRateInHz,
AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT, bufferSize);
if (RECORD_LENGTH > 0) {
samplesIndex = 0;
samples = new ShortBuffer[RECORD_LENGTH * sampleAudioRateInHz * 2 / bufferSize + 1];
for (int i = 0; i < samples.length; i++) {
samples[i] = ShortBuffer.allocate(bufferSize);
}
} else {
audioData = ShortBuffer.allocate(bufferSize);
}
Log.d(LOG_TAG, "audioRecord.startRecording()");
audioRecord.startRecording();
/* ffmpeg_audio encoding loop */
while (runAudioThread) {
if (RECORD_LENGTH > 0) {
audioData = samples[samplesIndex++ % samples.length];
audioData.position(0).limit(0);
}
//Log.v(LOG_TAG,"recording? " + recording);
bufferReadResult = audioRecord.read(audioData.array(), 0, audioData.capacity());
audioData.limit(bufferReadResult);
if (bufferReadResult > 0) {
Log.v(LOG_TAG,"bufferReadResult: " + bufferReadResult);
// If "recording" isn't true when start this thread, it never get's set according to this if statement...!!!
// Why? Good question...
if (recording) {
if (RECORD_LENGTH <= 0) try {
recorder.recordSamples(audioData);
//Log.v(LOG_TAG,"recording " + 1024*i + " to " + 1024*i+1024);
} catch (FFmpegFrameRecorder.Exception e) {
Log.v(LOG_TAG,e.getMessage());
e.printStackTrace();
}
}
}
}
Log.v(LOG_TAG,"AudioThread Finished, release audioRecord");
/* encoding finish, release recorder */
if (audioRecord != null) {
audioRecord.stop();
audioRecord.release();
audioRecord = null;
Log.v(LOG_TAG,"audioRecord released");
}
}
}
//---------------------------------------------
// camera thread, gets and encodes video data
//---------------------------------------------
class CameraView extends SurfaceView implements SurfaceHolder.Callback, Camera.PreviewCallback {
private SurfaceHolder mHolder;
private Camera mCamera;
public CameraView(Context context, Camera camera) {
super(context);
Log.w("camera","camera view");
mCamera = camera;
mHolder = getHolder();
mHolder.addCallback(CameraView.this);
mHolder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);
mCamera.setPreviewCallback(CameraView.this);
}
@Override
public void surfaceCreated(SurfaceHolder holder) {
try {
stopPreview();
mCamera.setPreviewDisplay(holder);
} catch (IOException exception) {
mCamera.release();
mCamera = null;
}
}
public void surfaceChanged(SurfaceHolder holder, int format, int width, int height) {
stopPreview();
Camera.Parameters camParams = mCamera.getParameters();
List sizes = camParams.getSupportedPreviewSizes();
// Sort the list in ascending order
Collections.sort(sizes, new Comparator() {
public int compare(final Camera.Size a, final Camera.Size b) {
return a.width * a.height - b.width * b.height;
}
});
// Pick the first preview size that is equal or bigger, or pick the last (biggest) option if we cannot
// reach the initial settings of imageWidth/imageHeight.
for (int i = 0; i < sizes.size(); i++) {
if ((sizes.get(i).width >= imageWidth && sizes.get(i).height >= imageHeight) || i == sizes.size() - 1) {
imageWidth = sizes.get(i).width;
imageHeight = sizes.get(i).height;
Log.v(LOG_TAG, "Changed to supported resolution: " + imageWidth + "x" + imageHeight);
break;
}
}
camParams.setPreviewSize(imageWidth, imageHeight);
Log.v(LOG_TAG,"Setting imageWidth: " + imageWidth + " imageHeight: " + imageHeight + " frameRate: " + frameRate);
camParams.setPreviewFrameRate(frameRate);
Log.v(LOG_TAG,"Preview Framerate: " + camParams.getPreviewFrameRate());
mCamera.setParameters(camParams);
// Set the holder (which might have changed) again
try {
mCamera.setPreviewDisplay(holder);
mCamera.setPreviewCallback(CameraView.this);
startPreview();
} catch (Exception e) {
Log.e(LOG_TAG, "Could not set preview display in surfaceChanged");
}
}
@Override
public void surfaceDestroyed(SurfaceHolder holder) {
try {
mHolder.addCallback(null);
mCamera.setPreviewCallback(null);
} catch (RuntimeException e) {
// The camera has probably just been released, ignore.
}
}
public void startPreview() {
if (!isPreviewOn && mCamera != null) {
isPreviewOn = true;
mCamera.startPreview();
}
}
public void stopPreview() {
if (isPreviewOn && mCamera != null) {
isPreviewOn = false;
mCamera.stopPreview();
}
}
@Override
public void onPreviewFrame(byte[] data, Camera camera) {
if (audioRecord == null || audioRecord.getRecordingState() != AudioRecord.RECORDSTATE_RECORDING) {
startTime = System.currentTimeMillis();
return;
}
if (RECORD_LENGTH > 0) {
int i = imagesIndex++ % images.length;
yuvImage = images[i];
timestamps[i] = 1000 * (System.currentTimeMillis() - startTime);
}
/* get video data */
if (yuvImage != null && recording) {
((ByteBuffer)yuvImage.image[0].position(0)).put(data);
if (RECORD_LENGTH <= 0) try {
Log.v(LOG_TAG,"Writing Frame");
long t = 1000 * (System.currentTimeMillis() - startTime);
if (t > recorder.getTimestamp()) {
recorder.setTimestamp(t);
}
if(addFilter) {
filter.push(yuvImage);
Frame frame2;
while ((frame2 = filter.pull()) != null) {
recorder.record(frame2, filter.getPixelFormat());
}
} else {
recorder.record(yuvImage);
}
} catch (FFmpegFrameRecorder.Exception | FrameFilter.Exception e) {
Log.v(LOG_TAG,e.getMessage());
e.printStackTrace();
}
}
}
}
@Override
public void onClick(View v) {
if (!recording) {
startRecording();
Log.w(LOG_TAG, "Start Button Pushed");
recordButton.setText("Stop");
} else {
// This will trigger the audio recording loop to stop and then set isRecorderStart = false;
stopRecording();
Log.w(LOG_TAG, "Stop Button Pushed");
recordButton.setText("Start");
}
}
@Override
public void onPause() {
super.onPause();
disableCamera();
}
@Override
public void onResume() {
super.onResume();
if (!OpenCVLoader.initDebug()) {
Log.d(TAG, "Internal OpenCV library not found. Using OpenCV Manager for initialization");
OpenCVLoader.initAsync(OpenCVLoader.OPENCV_VERSION_3_0_0, this, _baseLoaderCallback);
} else {
Log.d(TAG, "OpenCV library found inside package. Using it!");
_baseLoaderCallback.onManagerConnected(LoaderCallbackInterface.SUCCESS);
}
}
@Override
public void onRequestPermissionsResult(int requestCode, String permissions[], int[] grantResults) {
switch (requestCode) {
case 1: {
// If request is cancelled, the result arrays are empty.
if (grantResults.length > 0
&& grantResults[0] == PackageManager.PERMISSION_GRANTED) {
// permission was granted, yay! Do the
// contacts-related task you need to do.
} else {
// permission denied, boo! Disable the
// functionality that depends on this permission.
Toast.makeText(OpenCVActivity.this, "Permission denied to read your External storage", Toast.LENGTH_SHORT).show();
}
return;
}
// other 'case' lines to check for other
// permissions this app might request
}
}
public void onDestroy() {
super.onDestroy();
disableCamera();
}
public void disableCamera() {
if (_cameraBridgeViewBase != null)
_cameraBridgeViewBase.disableView();
}
public void onCameraViewStarted(int width, int height) {
setup();
}
public void onCameraViewStopped() {
}
public Mat onCameraFrame(CameraBridgeViewBase.CvCameraViewFrame inputFrame) {
Mat rgba = inputFrame.rgba();
salt(rgba.getNativeObjAddr());
return rgba;
}
public native void salt(long matAddrGray);
public native void setup();
}

activity_main.xml:
<?xml version="1.0" encoding="utf-8"?>
<linearlayout>
<button></button>
</linearlayout>

Could really use some help with this. Has anyone done this before?
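One subtle part of the pasted code is the circular-buffer bookkeeping in stopRecording(): when RECORD_LENGTH is positive, frames live in a fixed-size ring, and the flush loop has to find the oldest slot and unroll the wrap-around before replaying frames into the recorder. A self-contained sketch of that same index arithmetic (a hypothetical helper, not part of the question's code) may make the loop easier to follow:

```java
// Self-contained sketch of the ring-buffer index logic used in
// stopRecording(): given how many frames were ever written and the
// buffer capacity, compute the first (oldest) and last slot to flush.
public class RingIndex {
    // Returns {firstIndex, lastIndex}; lastIndex may exceed capacity - 1,
    // and callers wrap with (i % capacity), exactly as the flush loop does.
    static int[] flushRange(int writtenCount, int capacity) {
        int first = writtenCount % capacity;
        int last = (writtenCount - 1) % capacity;
        if (writtenCount <= capacity) {   // buffer never wrapped
            first = 0;
            last = writtenCount - 1;
        }
        if (last < first) {               // wrapped: unroll past the end
            last += capacity;
        }
        return new int[]{first, last};
    }

    public static void main(String[] args) {
        // Buffer of 4 slots, 6 frames written: the oldest surviving frame
        // sits in slot 2, and iterating indices 2..5 modulo 4 visits
        // slots 2, 3, 0, 1 in chronological order.
        int[] r = flushRange(6, 4);
        System.out.println(r[0] + "," + r[1]);
    }
}
```

The same pattern is applied twice in stopRecording(), once for the video frames and once for the audio sample buffers.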