
Other articles (55)
-
Managing object creation and editing rights
8 February 2011. By default, many features are restricted to administrators, but each can be configured independently to change the minimum status required to use it, notably: writing content on the site, configurable through the form-template management; adding notes to articles; adding captions and annotations to images;
-
Supporting all media types
13 April 2011. Unlike most software and media-sharing platforms, MediaSPIP aims to manage as many different media types as possible. The following are just a few examples from an ever-expanding list of supported formats: images: png, gif, jpg, bmp and more; audio: MP3, Ogg, Wav and more; video: AVI, MP4, OGV, mpg, mov, wmv and more; text, code and other data: OpenOffice, Microsoft Office (Word, PowerPoint, Excel), web (html, CSS), LaTeX, Google Earth and (...)
-
Uploading media and themes via FTP
31 May 2013. MediaSPIP also processes media transferred via FTP. If you prefer to upload this way, retrieve the access credentials for your MediaSPIP site and use your favourite FTP client.
From the start you will find the following folders in your FTP space: config/: the site's configuration folder; IMG/: media already processed and online on the site; local/: the site's cache directory; themes/: custom themes and stylesheets; tmp/: working folder (...)
On other sites (4910)
-
FFmpeg: change input without stopping the process
10 July 2018, by admin883. How can I change the input of ffmpeg without stopping the process, on Debian 9?
I am currently using a DeckLink input and need to switch to an mp4 file input. The running command is:
ffmpeg -f decklink -i 'DeckLink Mini Recorder' -vf setpts=PTS-STARTPTS -pix_fmt uyvy422 -s 1920x1080 -r 25000/1000 -f decklink 'DeckLink Mini Monitor'
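One common approach (a sketch under assumptions, not a tested DeckLink setup): have the long-running ffmpeg process read from a named pipe for its whole lifetime, and swap the producer that feeds the pipe. The pipe path and the choice of the nut container below are illustrative assumptions.

```shell
# Sketch: let ffmpeg consume a named pipe so the producer feeding it
# (DeckLink capture, later an mp4 file) can be swapped without
# restarting the consumer process. Paths and container are illustrative.
PIPE=/tmp/ffmpeg_input.nut
rm -f "$PIPE"
mkfifo "$PIPE"

# Long-running consumer (shown as a comment; it reads the pipe forever):
#   ffmpeg -f nut -i "$PIPE" -vf setpts=PTS-STARTPTS -pix_fmt uyvy422 \
#          -s 1920x1080 -r 25 -f decklink 'DeckLink Mini Monitor' &
# First producer (capture), later replaced by a file producer:
#   ffmpeg -f decklink -i 'DeckLink Mini Recorder' -f nut -y "$PIPE"
#   ffmpeg -re -i input.mp4 -f nut -y "$PIPE"

[ -p "$PIPE" ] && pipe_ok=yes || pipe_ok=no
echo "named pipe ready: $pipe_ok"
rm -f "$PIPE"
```

When one producer exits and another opens the pipe, the consumer keeps decoding the same byte stream, so it never has to be restarted; the caveat is that both producers must emit the same container and codec parameters.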
-
FFmpeg drawtext not working in my code
10 July 2018, by Dev. I am trying to overlay text on an existing movie using FFmpeg. Overlaying an image works perfectly, but drawtext fails. My code is as follows:
GeneralUtils.checkForPermissionsMAndAbove(RecordVideo.this, true);
LoadJNI vk = new LoadJNI();
String[] complexCommand = {
"ffmpeg", "-i", FilePath, "-vf",
"drawtext=text='Hello Dev..':"
+ "fontsize=24:fontfile=/system/fonts/DroidSans.ttf:box=1:boxcolor=black@0.5:x=w-tw:y=h-th",
FilePath1};
vk.run(complexCommand, StorageDIR, getApplicationContext());
I am getting the following error:
ffmpeg version git-2016-10-26-efa89a8 Copyright (c) 2000-2016 the FFmpeg
developers
built with gcc 4.9 (GCC) 20140827 (prerelease)
ffmpeg4android 3.22_00_full_LM322
libavutil 55. 35.100 / 55. 35.100
libavcodec 57. 65.100 / 57. 65.100
libavformat 57. 57.100 / 57. 57.100
libavdevice 57. 2.100 / 57. 2.100
libavfilter 6. 66.100 / 6. 66.100
libswscale 4. 3.100 / 4. 3.100
libswresample 2. 4.100 / 2. 4.100
libpostproc 54. 2.100 / 54. 2.100
Splitting the commandline.
Reading option '-i' ... matched as input file with argument '/storage/emulated/0/Pictures/ReporterLive/VID_10072018141402.avi'.
Reading option '-vf' ... matched as option 'vf' (set video filters) with argument 'drawtext=text='Hello Dev..':fontsize=24:fontfile=/system/fonts/DroidSans.ttf:box=1:boxcolor=black@0.5:x=w-tw:y=h-th'.
Reading option '/storage/emulated/0/Pictures/ReporterLive/VID_10072018141402_1.avi' ... matched as output file.
Finished splitting the commandline.
Parsing a group of options: global .
Successfully parsed a group of options.
Parsing a group of options: input file /storage/emulated/0/Pictures/ReporterLive/VID_10072018141402.avi.
Successfully parsed a group of options.
Opening an input file: /storage/emulated/0/Pictures/ReporterLive/VID_10072018141402.avi.
Setting default whitelist 'file,crypto'
Probing mov,mp4,m4a,3gp,3g2,mj2 score:100 size:2048
Probing mp3 score:1 size:2048
Format mov,mp4,m4a,3gp,3g2,mj2 probed with size=2048 and score=100
type: 70797466 'ftyp' parent:'root' sz: 24 8 3096055
ISO: File Type Major Brand: mp42
type: 7461646d 'mdat' parent:'root' sz: 3093790 32 3096055
type: 766f6f6d 'moov' parent:'root' sz: 2241 3093822 3096055
type: 6468766d 'mvhd' parent:'moov' sz: 108 8 2233
time scale = 1000
type: 61746475 'udta' parent:'moov' sz: 52 116 2233
type: 4e4c4453 'SDLN' parent:'udta' sz: 16 8 44
type: 64726d73 'smrd' parent:'udta' sz: 16 24 44
type: 61746d73 'smta' parent:'udta' sz: 12 40 44
type: 6174656d 'meta' parent:'moov' sz: 119 168 2233
type: 726c6468 'hdlr' parent:'meta' sz: 33 8 111
ctype= (0x00000000)
stype= mdta
type: 7379656b 'keys' parent:'meta' sz: 43 41 111
type: 74736c69 'ilst' parent:'meta' sz: 35 84 111
type: 01000000 '' parent:'ilst' sz: 27 8 27
lang " " tag "com.android.version" value "7.0" atom "" 7 3
type: 6b617274 'trak' parent:'moov' sz: 1023 287 2233
type: 64686b74 'tkhd' parent:'trak' sz: 92 8 1015
type: 6169646d 'mdia' parent:'trak' sz: 923 100 1015
type: 6468646d 'mdhd' parent:'mdia' sz: 32 8 915
type: 726c6468 'hdlr' parent:'mdia' sz: 44 40 915
ctype= (0x00000000)
stype= vide
type: 666e696d 'minf' parent:'mdia' sz: 839 84 915
type: 64686d76 'vmhd' parent:'minf' sz: 20 8 831
type: 666e6964 'dinf' parent:'minf' sz: 36 28 831
type: 66657264 'dref' parent:'dinf' sz: 28 8 28
type url size 12
Unknown dref type 0x08206c7275 size 12
type: 6c627473 'stbl' parent:'minf' sz: 775 64 831
type: 64737473 'stsd' parent:'stbl' sz: 171 8 767
size=155 4CC= avc1/0x31637661 codec_type=0
type: 43637661 'avcC' parent:'stsd' sz: 34 8 69
type: 70736170 'pasp' parent:'stsd' sz: 16 42 69
type: 726c6f63 'colr' parent:'stsd' sz: 19 58 69
nclx: pri 1 trc 1 matrix 1 full 0
type: 73747473 'stts' parent:'stbl' sz: 304 179 767
track[0].stts.entries = 36
sample_count=1, sample_duration=5942
.
sample_count=2, sample_duration=5401
type: 73737473 'stss' parent:'stbl' sz: 24 483 767
keyframe_count = 2
type: 7a737473 'stsz' parent:'stbl' sz: 188 507 767
sample_size = 0 sample_count = 42
type: 63737473 'stsc' parent:'stbl' sz: 52 695 767
track[0].stsc.entries = 3
type: 6f637473 'stco' parent:'stbl' sz: 28 747 767
AVIndex stream 0, sample 0, offset 831f, dts 0, size 212746, distance 0, keyframe 1
.
AVIndex stream 0, sample 41, offset 2e28a5, dts 221935, size 68753, distance 11, keyframe 0
type: 6b617274 'trak' parent:'moov' sz: 931 1310 2233
type: 64686b74 'tkhd' parent:'trak' sz: 92 8 923
type: 6169646d 'mdia' parent:'trak' sz: 831 100 923
type: 6468646d 'mdhd' parent:'mdia' sz: 32 8 823
type: 726c6468 'hdlr' parent:'mdia' sz: 44 40 823
ctype= (0x00000000)
stype= soun
type: 666e696d 'minf' parent:'mdia' sz: 747 84 823
type: 64686d73 'smhd' parent:'minf' sz: 16 8 739
type: 666e6964 'dinf' parent:'minf' sz: 36 24 739
type: 66657264 'dref' parent:'dinf' sz: 28 8 28
type url size 12
Unknown dref type 0x08206c7275 size 12
type: 6c627473 'stbl' parent:'minf' sz: 687 60 739
type: 64737473 'stsd' parent:'stbl' sz: 91 8 679
size=75 4CC= mp4a/0x6134706d codec_type=1
audio channels 2
version =0, isom =1
type: 73647365 'esds' parent:'stsd' sz: 39 8 39
MPEG-4 description: tag=0x03 len=25
MPEG-4 description: tag=0x04 len=17
esds object type id 0x40
MPEG-4 description: tag=0x05 len=2
Specific MPEG-4 header len=2
mp4a config channels 2 obj 2 ext obj 0 sample rate 48000 ext sample rate 0
type: 73747473 'stts' parent:'stbl' sz: 32 99 679
track[1].stts.entries = 2
sample_count=1, sample_duration=1024
sample_count=113, sample_duration=1024
type: 7a737473 'stsz' parent:'stbl' sz: 476 131 679
sample_size = 0 sample_count = 114
type: 63737473 'stsc' parent:'stbl' sz: 52 607 679
track[1].stsc.entries = 3
type: 6f637473 'stco' parent:'stbl' sz: 28 659 679
AVIndex stream 1, sample 0, offset 20, dts 0, size 682, distance 0, keyframe 1
AVIndex stream 1, sample 1, offset 2ca, dts 1024, size 683, distance 0, keyframe 1
.
AVIndex stream 1, sample 99, offset 28939d, dts 101376, size 682, distance 0, keyframe 1
.
AVIndex stream 1, sample 113, offset 28b8f2, dts 115712, size 683, distance 0, keyframe 1
on_parse_exit_offset=3096055
rfps: 16.500000 0.014060
rfps: 16.583333 0.003530
rfps: 16.666667 0.000000
rfps: 16.666667 0.000000
rfps: 16.750000 0.003470
rfps: 16.750000 0.003470
rfps: 16.833333 0.013939
rfps: 50.000000 0.000001
rfps: 50.000000 0.000001
Before avformat_find_stream_info() pos: 3096055 bytes read:35009 seeks:1 nb_streams:2
nal_unit_type: 7, nal_ref_idc: 3
nal_unit_type: 8, nal_ref_idc: 3
stream 0, sample 0, dts 0
.
stream 1, sample 47, dts 1002667
nal_unit_type: 5, nal_ref_idc: 3
Reinit context to 1920x1088, pix_fmt: yuv420p
All info found
stream 0: start_time: 0.000 duration: 2.526
stream 1: start_time: 0.000 duration: 2.432
format: start_time: 0.000 duration: 2.520 bitrate=9828 kb/s
After avformat_find_stream_info() pos: 246313 bytes read:281290 seeks:2 frames:48
Input #0, mov,mp4,m4a,3gp,3g2,mj2, from '/storage/emulated/0/Pictures/ReporterLive/VID_10072018141402.avi':
Metadata:
major_brand : mp42
minor_version : 0
compatible_brands: isommp42
creation_time : 2018-07-10T08:44:06.000000Z
com.android.version: 7.0
Duration: 00:00:02.52, start: 0.000000, bitrate: 9828 kb/s
Stream #0:0(eng), 1, 1/90000: Video: h264 (High) (avc1 / 0x31637661), yuv420p(tv, bt709), 1920x1080, 9551 kb/s, SAR 1:1 DAR 16:9, 16.63 fps, 16.67 tbr, 90k tbn, 180k tbc (default)
Metadata:
creation_time : 2018-07-10T08:44:06.000000Z
handler_name : VideoHandle
Stream #0:1(eng), 47, 1/48000: Audio: aac (LC) (mp4a / 0x6134706D), 48000 Hz, stereo, fltp, 256 kb/s (default)
Metadata:
creation_time : 2018-07-10T08:44:06.000000Z
handler_name : SoundHandle
Successfully opened the file.
Parsing a group of options: output file /storage/emulated/0/Pictures/ReporterLive/VID_10072018141402_1.avi.
Applying option vf (set video filters) with argument drawtext=text='Hello Dev..':fontsize=24:fontfile=/system/fonts/DroidSans.ttf:box=1:boxcolor=black@0.5:x=w-tw:y=h-th.
Successfully parsed a group of options.
Opening an output file: /storage/emulated/0/Pictures/ReporterLive/VID_10072018141402_1.avi.
Setting default whitelist 'file,crypto'
Successfully opened the file.
No such filter: 'drawtext'
Error opening filters!
Statistics: 0 seeks, 0 writeouts
Statistics: 281290 bytes read, 2 seeks
Can anyone tell me what I am doing wrong? I want to do this in the simplest way.
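The decisive line in the log is "No such filter: 'drawtext'": the drawtext filter is only compiled into FFmpeg when the build is configured with --enable-libfreetype, and the ffmpeg4android build in use apparently lacks it. A quick way to check any build for the filter (a generic sketch, not specific to that library):

```shell
# Check whether the ffmpeg build on PATH includes the drawtext filter.
# drawtext requires ffmpeg configured with --enable-libfreetype.
if command -v ffmpeg >/dev/null 2>&1 \
   && ffmpeg -hide_banner -filters 2>/dev/null | grep -q drawtext; then
  status="drawtext available"
else
  status="drawtext unavailable: use a build configured with --enable-libfreetype"
fi
echo "$status"
```

If the filter is missing, no command-line arguments will make it work; the fix is to switch to (or compile) an FFmpeg build with libfreetype support.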
-
Is there a case where I cannot create a file if I run a lot of threads? (feat. FFmpeg, threads)
26 June 2018, by Junburg. I am creating an app in which the user duets with a singer. We use a thread (mAudioPlayer) that plays the background music, a vAudioPlayer that plays the singer's voice, an mRecordThread that records the user's voice, and a thread that stitches the resulting files together (mp3ConcatThread).
It works by silencing the singer's voice while the user records over the background music; when the user is not recording, the singer's voice is played. Each section must be written out as an mp3 file and the pieces merged into a single file. However, it often happens that files are not recorded or merged properly.
Audio processing is done with FFmpeg. I suspect the following error might be the reason:
06-26 21:37:11.084 13017-13017/com.softcode.kihnoplay I/Choreographer: Skipped 72 frames! The application may be doing too much work on its main thread.
Could this kind of error prevent a file from being generated?
If you know the answer to this question, please reply. Thank you.
The related code is below. For more information, please leave a comment.
Because the code is too long, I have included only the parts that seem relevant.
Record_Thread.class
public class Record_Thread {
private static final String LOG_TAG = Record_Thread.class.getSimpleName();
private static final int SAMPLE_RATE = 44100;
private int bufferSize = 0;
private String currentOutFile = null;
private Context context;
byte RECORDER_BPP = 16;
public Record_Thread(Record_interface listener) {
mListener = listener;
Player.currentCreateFileName = SmcInfo.APPDIRPATH + "/ucc/" + Player.getCurrentTime(false);
currentOutFile = Player.currentCreateFileName + ".pcm";
}
public Record_Thread(Record_interface listener, Context context) {
mListener = listener;
RecordActivity.currentCreateFileName = SmcInfo.APPDIRPATH + "/ucc/" + RecordActivity.getCurrentTime(false);
currentOutFile = RecordActivity.currentCreateFileName + ".pcm";
this.context = context;
}
private boolean isSampleTranspo;
private boolean isRecording;
public boolean isSharding = false;
private Record_interface mListener;
private Thread mThread;
public boolean recording() {
return mThread != null;
}
public void setSampleTranspo(boolean booleanValue) {
this.isSampleTranspo = booleanValue;
}
public boolean getSampleTranspo() {
return this.isSampleTranspo;
}
long startpoint = 0;
boolean posWrite = false;
public void startRecording() {
if (mThread != null)
return;
isRecording = true;
mThread = new Thread(new Runnable() {
@Override
public void run() {
record();
}
});
mThread.start();
}
public void stopRecording() {
if (mThread == null)
return;
isRecording = false;
mThread = null;
posWrite = false;
startpoint = 0;
}
public void startFileWrite(long startpoint) {
this.startpoint = startpoint;
this.posWrite = true;
}
public void stopFileWrite() {
this.posWrite = false;
}
private void record() {
try {
Log.v(LOG_TAG, "Start");
android.os.Process.setThreadPriority(android.os.Process.THREAD_PRIORITY_AUDIO);
bufferSize = AudioRecord.getMinBufferSize(SAMPLE_RATE, AudioFormat.CHANNEL_IN_STEREO, AudioFormat.ENCODING_PCM_16BIT);
if (bufferSize == AudioRecord.ERROR || bufferSize == AudioRecord.ERROR_BAD_VALUE) {
bufferSize = SAMPLE_RATE * 2;
}
short[] audioBuffer = new short[bufferSize];
short[] audioZero = new short[bufferSize];
AudioRecord record = new AudioRecord(MediaRecorder.AudioSource.MIC, SAMPLE_RATE, AudioFormat.CHANNEL_IN_STEREO, AudioFormat.ENCODING_PCM_16BIT, bufferSize);
if (record.getState() != AudioRecord.STATE_INITIALIZED) {
Log.e(LOG_TAG, "Audio Record can't initialize!");
return;
}
record.startRecording();
Log.v(LOG_TAG, "Start recording");
long shortsRead = 0;
int readsize = 0;
File tempFile = new File(currentOutFile);
if (tempFile.exists())
tempFile.delete();
FileOutputStream fos = new FileOutputStream(currentOutFile);
byte[] audiodata = new byte[bufferSize];
while (isRecording && record != null) {
readsize = record.read(audiodata, 0, audiodata.length);
if (AudioRecord.ERROR_INVALID_OPERATION != readsize && fos != null) {
try {
if (readsize > 0 && readsize <= audiodata.length) {
fos.write(audiodata, 0, readsize);//TypeCast.shortToByte(audioBuffer)
fos.flush();
}
} catch (Exception ex) {
Log.e("AudioRecorder", ex.getMessage());
}
}
ShortBuffer sb = ByteBuffer.wrap(audiodata).order(ByteOrder.LITTLE_ENDIAN).asShortBuffer();
short[] samples = new short[sb.limit()];
sb.get(samples);
if (isSampleTranspo) {
mListener.onAudioDataReceived(samples);
} else {
mListener.onAudioDataReceived(audioZero);
}
if (posWrite) {
FileOutputStream pos = null;
try {
if (startpoint > 0) {
if (context instanceof RecordActivity) {
pos = new FileOutputStream(RecordActivity.currentCreateFileName.replaceAll("/ucc/", "/tmp/") + "_" + caltime(String.valueOf((int) (startpoint / 1000)), false) + "_uv.pcm", true); // append to the existing file
Log.d(LOG_TAG, "record: " + pos.toString());
} else {
pos = new FileOutputStream(Player.currentCreateFileName.replaceAll("/ucc/", "/tmp/") + "_" + caltime(String.valueOf((int) (startpoint / 1000)), false) + "_uv.pcm", true); // append to the existing file
}
}
// pos stays null when startpoint <= 0; guard to avoid an NPE here
if (pos != null) {
pos.write(audiodata);
pos.flush();
}
} catch (Exception e) {
e.printStackTrace();
} finally {
if (pos != null)
pos.close();
pos = null;
}
}
}
if (fos != null)
fos.close();
mListener.onRecordEnd();
record.stop();
record.release();
} catch (IOException e) {
Log.e("AudioRecorder", e.getMessage());
}
}
private String caltime(String sMillis, boolean timeFormat) {
double dMillis = 0;
int minutes = 0;
int seconds = 0;
int millis = 0;
String sTime;
try {
dMillis = Double.parseDouble(sMillis);
} catch (Exception e) {
System.out.println(e.getMessage());
}
seconds = (int) (dMillis / 1000) % 60;
millis = (int) (dMillis % 1000);
if (seconds > 0) {
minutes = (int) (dMillis / 1000 / 60) % 60;
if (minutes > 0) {
if (timeFormat)
sTime = String.format("%02d:%02d.%d", minutes, seconds, millis);
else
sTime = String.format("%02d%02d%d", minutes, seconds, millis);
} else {
if (timeFormat)
sTime = String.format("%02d:%02d.%d", 0, seconds, millis);
else
sTime = String.format("%02d%02d%d", 0, seconds, millis);
}
} else {
if (timeFormat)
sTime = String.format("%02d:%02d.%d", 0, 0, millis);
else
sTime = String.format("%02d%02d%d", 0, 0, millis);
}
return sTime;
}
}
RecordActivity.class
public class RecordActivity extends AppCompatActivity implements Player_interface, SeekBar.OnSeekBarChangeListener {
private static final String TAG = "RecordActivity";
public Context context = this;
private LinearLayout recordLayout;
private RelativeLayout recordBtn, saveBtn;
private CircleImageView userImg, artistImg;
private TextView songTitleTxt, playTimeTxt, progressTimeTxt;
private BlurBitmap blurBitmap;
private SeekBar seekBar;
private ImageView micBg1, micBg2;
private String assPath;
private String ampPath;
private int deviceWidth, deviceHeight;
public static AssRenderView assView;
public static LinearLayout lyricsLayout;
public static int lyricsWidth, lyricsHeight, layoutWidth;
public static LinearLayout.LayoutParams assViewParams;
public static String currentCreateFileName = null;
public static String mrPath;
public static String voicePath;
private String recMusicPath;
Player_Thread mAudioPlayer = null, vAudioPlayer = null, testPlayer = null;
private Record_Thread mRecordThread;
public static Mp3Concat_Thread mMp3ConcatThread;
long lastDuration = 0L;
private boolean isSeekbarTouch = false;
private ArrayList<Long> combineList;
CNetProgressdialog createMp3Dialog;
int bufferSize = 7104;
int SAMPLE_RATE = 44100;
int RECORDER_SAMPLERATE = 44100;
byte RECORDER_BPP = 16;
@Override
protected void onCreate(@Nullable Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
removeDir(SmcInfo.APPDIRPATH + "/tmp");
setContentView(R.layout.activity_record_phone);
if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.KITKAT) {
Window window = getWindow();
window.addFlags(WindowManager.LayoutParams.FLAG_LAYOUT_NO_LIMITS);
window.addFlags(WindowManager.LayoutParams.FLAG_TRANSLUCENT_NAVIGATION);
}
recordLayout = (LinearLayout) findViewById(R.id.record_layout);
userImg = (CircleImageView) findViewById(R.id.user_img);
artistImg = (CircleImageView) findViewById(R.id.artist_img);
songTitleTxt = (TextView) findViewById(R.id.song_title_txt);
progressTimeTxt = (TextView) findViewById(R.id.progress_time_txt);
playTimeTxt = (TextView) findViewById(R.id.play_time_txt);
recordBtn = (RelativeLayout) findViewById(R.id.record_btn);
saveBtn = (RelativeLayout) findViewById(R.id.save_btn);
seekBar = (SeekBar) findViewById(R.id.seek_bar);
micBg1 = (ImageView) findViewById(R.id.mic_bg_small);
micBg2 = (ImageView) findViewById(R.id.mic_bg_big);
createMp3Dialog = new CNetProgressdialog(this);
GradientDrawable drawable = new GradientDrawable();
drawable.setColors(new int[]{
Color.parseColor("#32c49b"),
Color.parseColor("#19b2c3")
});
Intent intent = getIntent();
final String artistImgPath = intent.getStringExtra("artistImgPath");
final String songTitle = intent.getStringExtra("songTitle");
assPath = intent.getStringExtra("assPath");
ampPath = intent.getStringExtra("ampPath");
String playTime = intent.getStringExtra("playTime");
blurBitmap = new BlurBitmap();
songTitleTxt.setText(songTitle);
playTimeTxt.setText(playTime);
final Bitmap artistImgBitmap = blurBitmap.toBitmap(artistImgPath);
final Bitmap userImgBitmap = BitmapFactory.decodeResource(context.getResources(), R.drawable.dummy_artist_2);
final Bitmap userBlurImg = blurBitmap.blurRenderScript(this, userImgBitmap, 25);
final Bitmap artistBlurImg = blurBitmap.blurRenderScript(this, artistImgBitmap, 25);
artistImg.setImageBitmap(artistImgBitmap);
userImg.setImageBitmap(userBlurImg);
drawable.setGradientType(GradientDrawable.LINEAR_GRADIENT);
drawable.setOrientation(GradientDrawable.Orientation.TOP_BOTTOM);
recordLayout.setBackground(drawable);
play(ampToMp3(ampPath));
mRecordThread = new Record_Thread(new Record_interface() {
@Override
public void onAudioDataReceived(short[] data) {
}
@Override
public void onRecordEnd() {
}
}, context);
mMp3ConcatThread = new Mp3Concat_Thread(new Mp3Concat_interface() {
@Override
public void onAudioDataReceived(short[] data) {
}
@Override
public void onRecordEnd() {
createMp3Dialog.dismiss();
startPrelisteningActivity(recMusicPath, songTitle);
}
}, this);
if (!mRecordThread.recording()) {
mRecordThread.startRecording();
}
final Animation animZoomIn = AnimationUtils.loadAnimation(this, R.anim.zoom_in);
final Animation animZoomOut = AnimationUtils.loadAnimation(this, R.anim.zoom_out);
final Animation animMic1 = AnimationUtils.loadAnimation(this, R.anim.bg_mic_anim_1_phone);
final Animation animMic2 = AnimationUtils.loadAnimation(this, R.anim.bg_mic_anim_2_phone);
artistImg.startAnimation(animZoomIn);
combineList = new ArrayList<Long>();
recordBtn.setOnTouchListener(new View.OnTouchListener() {
@Override
public boolean onTouch(View view, MotionEvent motionEvent) {
switch (motionEvent.getAction()) {
case MotionEvent.ACTION_DOWN: {
long currentDuration = vAudioPlayer.getCurrentDuration();
// start recording (when combineList size is even)
if (mRecordThread != null) {
if (combineList.size() % 2 == 0) {
mRecordThread.startFileWrite(currentDuration);
combineList.add(currentDuration);
}
vAudioPlayer.setSampleTranspo(true);
mRecordThread.setSampleTranspo(true);
}
}
micBg1.setVisibility(View.VISIBLE);
micBg2.setVisibility(View.VISIBLE);
micBg1.startAnimation(animMic1);
micBg2.startAnimation(animMic2);
userImg.setImageBitmap(userImgBitmap);
userImg.startAnimation(animZoomIn);
artistImg.setImageBitmap(artistBlurImg);
artistImg.startAnimation(animZoomOut);
break;
case MotionEvent.ACTION_UP: {
long currentDuration = vAudioPlayer.getCurrentDuration();
if (mRecordThread != null) {
if (combineList.size() % 2 == 1) {
mRecordThread.startRecording();
mRecordThread.stopFileWrite();
File waveFile = new File(RecordActivity.currentCreateFileName.replaceAll("/ucc/", "/tmp/")
+ "_" + caltime(combineList.get(combineList.size() - 1) / 1000, false) + "_uv.pcm");
if (waveFile.exists()) {
copyWaveFile(RecordActivity.currentCreateFileName.replaceAll("/ucc/", "/tmp/") + "_" + caltime(combineList.get(combineList.size() - 1) / 1000, false) + "_uv.pcm",
RecordActivity.currentCreateFileName.replaceAll("/ucc/", "/tmp/") + "_" + caltime(combineList.get(combineList.size() - 1) / 1000, false) + "_u0.wav");
Log.d(TAG, "onTouch: " + currentCreateFileName);
if (mMp3ConcatThread != null) {
mMp3ConcatThread.startCombine(null, 3333333333333333333L, combineList.get(combineList.size() - 1), currentDuration);
}
}
combineList.add(currentDuration);
Log.d(TAG, "onTouch: " + combineList.size());
if (combineList.size() == 2) {
mMp3ConcatThread.startCombine(null, 0, combineList.get(combineList.size() - 2), currentDuration);
} else {
mMp3ConcatThread.startCombine(null, combineList.get(combineList.size() - 3), combineList.get(combineList.size() - 2), currentDuration);
}
}
vAudioPlayer.setSampleTranspo(false);
mRecordThread.setSampleTranspo(false);
}
}
micBg1.setVisibility(View.GONE);
micBg2.setVisibility(View.GONE);
micBg1.clearAnimation();
micBg2.clearAnimation();
userImg.setImageBitmap(userBlurImg);
userImg.startAnimation(animZoomOut);
artistImg.setImageBitmap(artistImgBitmap);
artistImg.startAnimation(animZoomIn);
break;
}
return false;
}
});
saveBtn.setOnClickListener(new View.OnClickListener() {
@Override
public void onClick(View view) {
createMp3Dialog.show();
vAudioPlayer.setSampleTranspo(true);
vAudioPlayer.setlistenerStop(true);
if (assView != null)
assView.Destroy();
if (lyricsLayout != null) {
lyricsLayout.removeAllViews();
}
seekBar.setProgress(0);
seekBar.setMax(100);
Log.d(TAG, "donep3: " + "done");
if (mMp3ConcatThread != null) {
try {
mMp3ConcatThread.startCombine(combineList, 7777777777777777777L, combineList.get(combineList.size() - 1), lastDuration);
} catch (ArrayIndexOutOfBoundsException e) {
e.getMessage();
finish();
}
}
releaseAudioPlayer();
recMusicPath = SmcInfo.APPDIRPATH + "/ucc/" + currentCreateFileName.substring(currentCreateFileName.lastIndexOf('/') + 1, currentCreateFileName.length()) + ".mp3";
}
});
DisplayMetrics displayMetrics = new DisplayMetrics();
getWindowManager().getDefaultDisplay().getMetrics(displayMetrics);
deviceWidth = displayMetrics.widthPixels;
deviceHeight = displayMetrics.heightPixels;
lyricsWidth = deviceWidth;
lyricsHeight = deviceHeight;
Log.d(TAG, "onCreate: " + lyricsWidth + "/" + lyricsHeight);
layoutWidth = lyricsWidth * 2 / 3;
int parentAssViewHeight = ((lyricsHeight * 50) / 91) - 2;
if (layoutWidth > parentAssViewHeight)
layoutWidth = (parentAssViewHeight * 8) / 10;
assViewParams = new LinearLayout.LayoutParams(new ViewGroup.LayoutParams(layoutWidth * 2, layoutWidth));
assViewParams.gravity = Gravity.CENTER;
lyricsLayout = (LinearLayout)
findViewById(R.id.lyrics_layout);
if (assView != null) {
assView.Destroy();
}
if (lyricsLayout != null) {
lyricsLayout.removeAllViews();
}
assView = new AssRenderView(getApplicationContext(), layoutWidth * 13 / 10, layoutWidth);
File assFile = new File(assPath);
if (assFile.exists()) {
assView.ReadASSFile(assFile.toString(), true, layoutWidth * 2, layoutWidth * 5 / 7);
}
lyricsLayout.addView(assView, assViewParams);
lyricsLayout.setGravity(Gravity.CENTER);
assView.ShowASS(true);
seekBar.setOnSeekBarChangeListener(this);
seekBar.setProgress(0);
seekBar.setMax(100);
}
private void startPrelisteningActivity(String recMusicPath, String songTitle) {
Intent intent = new Intent(RecordActivity.this, PrelisteningActivity.class);
intent.putExtra("recMusicPath", recMusicPath);
intent.putExtra("songTitle", songTitle);
startActivityForResult(intent, 1);
}
private String[] ampToMp3(String ampPath) {
String[] pathArray = new String[2];
try {
File ampFile = new File(ampPath);
String ampName = ampFile.getName();
int size;
BufferedInputStream buf = null;
FileInputStream fis = null;
size = (int) ampFile.length();
byte[] bytes = new byte[size];
fis = new FileInputStream(ampFile);
buf = new BufferedInputStream(fis, 8 * 1024);
buf.read(bytes, 0, bytes.length);
byte[] vocalbytes = AMPFileUtility.getByteData(bytes, "voice");
byte[] mrbytes = AMPFileUtility.getByteData(bytes, "mr");
voicePath = SmcInfo.APPDIRPATH + "/audio/" + ampName.replaceAll(".amp", "") + "_voice.mp3";
mrPath = SmcInfo.APPDIRPATH + "/audio/" + ampName.replaceAll(".amp", "") + "_mr.mp3";
BufferedOutputStream bosVocal = new BufferedOutputStream(new FileOutputStream(voicePath));
bosVocal.write(vocalbytes);
bosVocal.flush();
bosVocal.close();
BufferedOutputStream bosMr = new BufferedOutputStream(new FileOutputStream(mrPath));
bosMr.write(mrbytes);
bosMr.flush();
bosMr.close();
} catch (Exception e) {
e.getMessage();
}
pathArray[0] = voicePath;
pathArray[1] = mrPath;
return pathArray;
}
private void play(String[] pathArray) {
releaseAudioPlayer();
String voicePath = pathArray[0];
String mrPath = pathArray[1];
mAudioPlayer = new Player_Thread();
mAudioPlayer.setOnAudioStreamInterface(this);
mAudioPlayer.setUrlString(mrPath);
mAudioPlayer.setlistenerStop(true);
vAudioPlayer = new Player_Thread();
vAudioPlayer.setOnAudioStreamInterface(this);
vAudioPlayer.setUrlString(voicePath);
vAudioPlayer.setlistenerStop(false);
try {
mAudioPlayer.play();
vAudioPlayer.play();
} catch (IOException e) {
e.printStackTrace();
}
}
private void releaseAudioPlayer() {
if (mAudioPlayer != null) {
mAudioPlayer.stop();
mAudioPlayer.release();
mAudioPlayer = null;
}
if (vAudioPlayer != null) {
vAudioPlayer.stop();
vAudioPlayer.release();
vAudioPlayer = null;
}
if (mRecordThread != null) {
mRecordThread.stopRecording();
}
}
public static String getCurrentTime(boolean dateForm) {
SimpleDateFormat dateFormat;
if (dateForm)
dateFormat = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss.SSS"); // SSS shows milliseconds
else
dateFormat = new SimpleDateFormat("yyyyMMdd_HHmmssSSS");
Calendar calendar = Calendar.getInstance();
return dateFormat.format(calendar.getTime());
}
private String caltime(long sMillis, boolean timeFormat) {
double dMillis = 0;
int minutes = 0;
int seconds = 0;
int millis = 0;
String sTime;
try {
dMillis = Double.parseDouble(String.valueOf(sMillis));
} catch (Exception e) {
System.out.println(e.getMessage());
}
seconds = (int) (dMillis / 1000) % 60;
millis = (int) (dMillis % 1000);
if (seconds > 0) {
minutes = (int) (dMillis / 1000 / 60) % 60;
if (minutes > 0) {
if (timeFormat)
sTime = String.format("%02d:%02d.%d", minutes, seconds, millis);
else
sTime = String.format("%02d%02d%d", minutes, seconds, millis);
} else {
if (timeFormat)
sTime = String.format("%02d:%02d.%d", 0, seconds, millis);
else
sTime = String.format("%02d%02d%d", 0, seconds, millis);
}
} else {
if (timeFormat)
sTime = String.format("%02d:%02d.%d", 0, 0, millis);
else
sTime = String.format("%02d%02d%d", 0, 0, millis);
}
Log.d(TAG, "caltime: " + sTime);
return sTime;
}
public void copyWaveFile(String inFilename, String outFilename) {
FileInputStream in = null;
FileOutputStream out = null;
long totalAudioLen = 0;
long totalDataLen = totalAudioLen + 36;
long longSampleRate = SAMPLE_RATE;
int channels = 2; // byte writing is reliable with 1; AudioFormat.CHANNEL_IN_MONO: channels = 1; AudioFormat.CHANNEL_IN_STEREO: channels = 2;
long byteRate = RECORDER_BPP * SAMPLE_RATE * channels / 8;
try {
in = new FileInputStream(inFilename);
out = new FileOutputStream(outFilename);
byte[] data = new byte[bufferSize];
totalAudioLen = in.getChannel().size();
totalDataLen = totalAudioLen + 36;
AppLog.logString("File size: " + totalDataLen);
WriteWaveFileHeader(out, totalAudioLen, totalDataLen, longSampleRate, channels, byteRate);
while (in.read(data) != -1) {
out.write(data);
}
in.close();
out.close();
} catch (FileNotFoundException e) {
e.printStackTrace();
} catch (IOException e) {
e.printStackTrace();
}
}
public void WriteWaveFileHeader(FileOutputStream out, long totalAudioLen, long totalDataLen, long longSampleRate, int channels, long byteRate) throws IOException {
byte[] header = new byte[44];
header[0] = 'R';
header[1] = 'I';
header[2] = 'F';
header[3] = 'F';
header[4] = (byte) (totalDataLen & 0xff);
header[5] = (byte) ((totalDataLen >> 8) & 0xff);
header[6] = (byte) ((totalDataLen >> 16) & 0xff);
header[7] = (byte) ((totalDataLen >> 24) & 0xff);
header[8] = 'W';
header[9] = 'A';
header[10] = 'V';
header[11] = 'E';
header[12] = 'f';
header[13] = 'm';
header[14] = 't';
header[15] = ' ';
header[16] = 16;
header[17] = 0;
header[18] = 0;
header[19] = 0;
header[20] = 1;
header[21] = 0;
header[22] = (byte) channels;
header[23] = 0;
header[24] = (byte) (longSampleRate & 0xff);
header[25] = (byte) ((longSampleRate >> 8) & 0xff);
header[26] = (byte) ((longSampleRate >> 16) & 0xff);
header[27] = (byte) ((longSampleRate >> 24) & 0xff);
header[28] = (byte) (byteRate & 0xff);
header[29] = (byte) ((byteRate >> 8) & 0xff);
header[30] = (byte) ((byteRate >> 16) & 0xff);
header[31] = (byte) ((byteRate >> 24) & 0xff);
header[32] = (byte) (2 * 16 / 8);
header[33] = 0;
header[34] = RECORDER_BPP;
header[35] = 0;
header[36] = 'd';
header[37] = 'a';
header[38] = 't';
header[39] = 'a';
header[40] = (byte) (totalAudioLen & 0xff);
header[41] = (byte) ((totalAudioLen >> 8) & 0xff);
header[42] = (byte) ((totalAudioLen >> 16) & 0xff);
header[43] = (byte) ((totalAudioLen >> 24) & 0xff);
out.write(header, 0, 44);
}
}
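For the merging step itself, FFmpeg's concat demuxer can join the per-section files in one call, which is simpler and less error-prone than weaving them together across threads. A hedged sketch; the segment file names are placeholders, and the actual ffmpeg invocation is shown as a comment:

```shell
# Sketch: merge recorded sections with ffmpeg's concat demuxer.
# Segment names below are illustrative placeholders.
cat > /tmp/segments.txt <<'EOF'
file '/tmp/segment_000.mp3'
file '/tmp/segment_001.mp3'
EOF
# The actual merge (same codec in every segment assumed, so streams
# are copied rather than re-encoded):
#   ffmpeg -f concat -safe 0 -i /tmp/segments.txt -c copy /tmp/merged.mp3
entries=$(grep -c "^file" /tmp/segments.txt)
echo "playlist entries: $entries"
rm -f /tmp/segments.txt
```

Separately, the Choreographer "Skipped 72 frames" warning only indicates heavy work on the UI thread; moving the file I/O and FFmpeg calls fully off the main thread addresses the warning, but the missing output files are more likely caused by write failures such as the null-stream path in Record_Thread.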