
Other articles (34)
-
The farm's regular Cron tasks
1 December 2010, by
Managing the farm relies on running several repetitive tasks, known as Cron tasks, at regular intervals.
The super Cron (gestion_mutu_super_cron)
This task, scheduled every minute, simply calls the Cron of every instance in the shared-hosting farm on a regular basis. Combined with a system Cron on the central site of the farm, this generates regular visits to the various sites and keeps the tasks of rarely visited sites from being too (...)
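The excerpt stops there. Purely as an illustration of the idea described above, generating regular visits from a system Cron so that each instance's periodic tasks get a chance to run, here is a minimal Python sketch; it is not part of the article, and the instance URLs are placeholders.

#!/usr/bin/env python3
# Illustration only: a system Cron on the central site could run a script like
# this every minute so that each instance of the farm receives a visit, which
# lets its own periodic tasks run even on rarely visited sites.
# The URLs below are placeholders, not real farm addresses.
import urllib.request

INSTANCE_URLS = [
    "https://site1.example.org/",
    "https://site2.example.org/",
]

def visit(url, timeout=10):
    # Fetch the home page of one instance; a successful hit is enough to
    # count as a visit.
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return resp.status
    except OSError as exc:
        print("could not reach {}: {}".format(url, exc))
        return None

if __name__ == "__main__":
    for url in INSTANCE_URLS:
        visit(url)

On a real farm such a script would be listed in the central server's system crontab with a one-minute schedule, matching the cadence described above.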
-
Adding notes and captions to images
7 February 2011, by
To be able to add notes and captions to images, the first step is to install the "Légendes" plugin.
Once the plugin is activated, you can configure it in the configuration area to change the rights to create, modify and delete notes. By default, only site administrators can add notes to images.
Changes when adding a media item
When adding a media item of type "image", a new button appears above the preview (...)
-
Emballe médias: what is it for?
4 February 2011, by
This plugin is designed to manage sites that publish documents of all kinds.
It creates "media" items, namely: a "media" item is an article in the SPIP sense, created automatically when a document is uploaded, whether audio, video, image or text; only one document can be linked to a given "media" article;
On other sites (3914)
-
Correctly encode .mov file for various browsers and devices
7 December 2015, by khinester
I have the following template, which uses node-blade (similar to jade):
.show-for-medium-up
  -var poster_url = '//' + settings.cloudFrontDomain + '/images/poster/2015_2016.jpeg'
  .fullwidth
    .about-page
      video.video-js.vjs-default-skin.vjs-playing.vjs-big-play-centered(id="home" poster=poster_url)
        source(src="//#{settings.cloudFrontDomain}/assets/videos/home/winter_2015/SD_720.webm" type='video/webm')
        source(src="//#{settings.cloudFrontDomain}/assets/videos/home/winter_2015/SD_720.mp4" type='video/mp4')
        source(src="//#{settings.cloudFrontDomain}/assets/videos/home/winter_2015/SD_720.ogg" type='video/ogg')
        source(src="//#{settings.cloudFrontDomain}/assets/videos/home/winter_2015/SD_360.m4v" type='video/mp4')
:javascript
  videojs("home", {
    preload: 'auto',
    autoplay: true,
    loop: true,
    fluid: true,
    aspectRatio: '16:9',
    controls: false
  });
I have encoded the videos with ffmpeg using the following commands:
ffmpeg -i $s.mov -vcodec libvpx -acodec libvorbis -aq 5 -ac 2 -qmax 25 -b 614400 -s 1280x720 $s-SD_720.webm
ffmpeg -i $s.mov -c:v libtheora -c:a libvorbis -q:v 6 -q:a 5 $s-SD_720.ogg
ffmpeg -i $s.mov -vcodec h264 -acodec mp2 $s-SD_720.mp4
The original video is:
ffprobe ../HD.mov
ffprobe version N-76639-g58d32c0 Copyright (c) 2007-2015 the FFmpeg developers
built with gcc 4.8 (Ubuntu 4.8.4-2ubuntu1~14.04)
configuration: --prefix=/home/khine/ffmpeg_build --pkg-config-flags=--static --extra-cflags=-I/home/khine/ffmpeg_build/include --extra-ldflags=-L/home/khine/ffmpeg_build/lib --bindir=/home/khine/bin --enable-gpl --enable-libass --enable-libfdk-aac --enable-libfreetype --enable-libmp3lame --enable-libopus --enable-libtheora --enable-libvorbis --enable-libvpx --enable-libx264 --enable-libx265 --enable-nonfree
libavutil 55. 5.100 / 55. 5.100
libavcodec 57. 15.100 / 57. 15.100
libavformat 57. 14.100 / 57. 14.100
libavdevice 57. 0.100 / 57. 0.100
libavfilter 6. 15.100 / 6. 15.100
libswscale 4. 0.100 / 4. 0.100
libswresample 2. 0.101 / 2. 0.101
libpostproc 54. 0.100 / 54. 0.100
Input #0, mov,mp4,m4a,3gp,3g2,mj2, from '../HD.mov':
Metadata:
major_brand : qt
minor_version : 537199360
compatible_brands: qt
creation_time : 2015-12-02 22:35:26
Duration: 00:01:19.20, start: 0.000000, bitrate: 55745 kb/s
Stream #0:0(eng): Video: h264 (Main) (avc1 / 0x31637661), yuv420p(tv, bt709), 1920x1080, 55733 kb/s, SAR 1:1 DAR 16:9, 25 fps, 25 tbr, 25 tbn, 50 tbc (default)
Metadata:
creation_time : 2015-12-02 22:35:26
handler_name : Apple Alias Data Handler
encoder : H.264
timecode : 00:03:02:19
Stream #0:1(eng): Data: none (tmcd / 0x64636D74) (default)
Metadata:
creation_time : 2015-12-02 23:44:32
handler_name : Apple Alias Data Handler
timecode : 00:03:02:19
What is the correct way to reduce the size for use on the web, and also for use on mobile devices without Flash?
Any advice is much appreciated.
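The post ends with the question above. For reference only, and not taken from the original post, one commonly used approach is CRF-based (quality-targeted) encoding with a scale filter and the faststart flag, which usually gives much smaller files than the fixed-bitrate commands shown earlier. A minimal sketch driving ffmpeg from Python follows; the file names and quality values are assumptions.

#!/usr/bin/env python3
# Sketch only: CRF-based 720p encodes for MP4 (H.264/AAC) and WebM (VP8/Vorbis).
# Source/output names and quality settings are illustrative assumptions.
import subprocess

SRC = "HD.mov"  # hypothetical source file

def encode_mp4(src, dst="SD_720.mp4"):
    # Quality-based rate control (-crf) usually gives smaller files than a
    # fixed bitrate; +faststart lets playback begin before the download ends.
    subprocess.check_call([
        "ffmpeg", "-y", "-i", src,
        "-vf", "scale=1280:-2",
        "-c:v", "libx264", "-crf", "23", "-preset", "medium",
        "-c:a", "aac", "-b:a", "128k",
        "-movflags", "+faststart",
        dst,
    ])

def encode_webm(src, dst="SD_720.webm"):
    # libvpx uses -crf as a quality target with -b:v as a bitrate ceiling.
    subprocess.check_call([
        "ffmpeg", "-y", "-i", src,
        "-vf", "scale=1280:-2",
        "-c:v", "libvpx", "-crf", "10", "-b:v", "1M",
        "-c:a", "libvorbis", "-q:a", "5",
        dst,
    ])

if __name__ == "__main__":
    encode_mp4(SRC)
    encode_webm(SRC)

An H.264/AAC MP4 plays natively on most mobile browsers without Flash, and the WebM serves browsers that prefer that format.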
-
How to queue ffmpeg jobs for transcoding?
25 July 2019, by sujit patel
The script below checks an ftp location for new media files and starts transcoding them with ffmpeg. The problem is that it launches many ffmpeg processes simultaneously; with so many ffmpeg processes running, the server becomes very slow, transcoding takes a long time, and sometimes the server stops working altogether. How can I put the jobs in a queue? (A sketch of one possible approach follows the script below.)
#!/usr/bin/env python3
import os, sys, time, threading, subprocess
import logging
from config import MEDIA_SERVER, MEDIA_SERVER_USERNAME, MEDIA_DIRS, LOCAL_MEDIA_DIR_ROOT, TRANSCODING_SERVER, TRANSCODING_SERVER_USERNAME, RSYNC_SERVER, RSYNC_USERNAME, RSYNC_DIR, PROCESSING_DIR, PROCESSING_GPU_SCRIPT, PROCESSING_CPU_SCRIPT, EMAIL_SEND_TO, EMAIL_SEND_FROM
from send_email import sendEmail
import sqlite3

logger = logging.getLogger(__name__)

class FuncThread(threading.Thread):
    def __init__(self, target, *args):
        self._targett = target
        self._argst = args
        threading.Thread.__init__(self)
    def run(self):
        self._targett(*self._argst)

class Automator(threading.Thread):
    def __init__(self):
        threading.Thread.__init__(self)
        self.sleepTime=60
        self.lastSMILFileCheckTime = 0
        self.newlastSMILFileCheckTime = 0
        self.lastMediaFileCheckTime = 0
        self.newLastMediaFileCheckTime = 0
        self.db = None
        self.fluid_threads = []
        self.scriptRoot = os.path.dirname(os.path.realpath(__file__))
    def fluid_thread_add(self, thd):
        self.fluid_threads.append(thd)
        thd.start()
    def processFluidThreads(self):
        fluid_cond = [x.is_alive() for x in self.fluid_threads]
        finished_threads = [x for x, y in zip(self.fluid_threads, fluid_cond) if not y]
        self.fluid_threads = [x for x, y in zip(self.fluid_threads, fluid_cond) if y]
        if len(finished_threads) > 0:
            logger.info('Fluid threads finished: %s (joining on them now...)' % str(len(finished_threads)))
            [thd.join() for thd in finished_threads]
            logger.info('Joined finished threads successfully')
        if len(self.fluid_threads) > 0:
            logger.info('Fluid threads remaining: %s' % str(len(self.fluid_threads)))
    def run(self):
        self.setupDB()
        self.fetchlastCheckTime()
        while True:
            self.process()
            time.sleep(self.sleepTime)
    def process(self):
        logger.debug("process")
        try:
            self.handleNewSMILFiles()
            self.rsyncFromRemote()
            self.handleNewSourceFiles()
            # fluid_thread_add(FuncThread(start_single_task, user, task_name, task_path, selected_region))
            self.processFluidThreads()
            self.updatelastCheckTimes()
        except Exception as e:
            print(e)
            logger.error("Something went wrong while running this")
    def handleNewSourceFiles(self):
        logger.info("Looking for medial files since " + str(self.lastMediaFileCheckTime))
        for root, dirs, filenames in os.walk(PROCESSING_DIR):
            for subdir in dirs:
                pass
            for f in filenames:
                if (f.lower().endswith("complete")):
                    file_path = os.path.join(root, f)
                    mod_time = self.modification_date(file_path)
                    if (mod_time > self.lastMediaFileCheckTime):
                        logger.info("Found a new media File " + file_path)
                        relDir = os.path.relpath(root, PROCESSING_DIR)
                        f_name = root.split("/")[-1]
                        new_output_localdir = os.path.join(LOCAL_MEDIA_DIR_ROOT, relDir)
                        new_output_localdir = os.path.abspath(os.path.join(new_output_localdir, os.pardir))
                        new_output_remotedir = new_output_localdir
                        if new_output_remotedir.startswith(LOCAL_MEDIA_DIR_ROOT):
                            new_output_remotedir = new_output_remotedir[len(LOCAL_MEDIA_DIR_ROOT):]
                        if(self.startATranscodingThread(root, new_output_localdir, new_output_remotedir, f_name+".mp4")):
                            if(mod_time > self.newLastMediaFileCheckTime):
                                self.newLastMediaFileCheckTime = mod_time
    def startATranscodingThread(self, inputFile, outputLocalDIR, outputRemoteDIR, fileName):
        self.fluid_thread_add(FuncThread(self.runTranscoder, inputFile, outputLocalDIR, outputRemoteDIR, fileName, MEDIA_SERVER))
        return True
    def handleNewSMILFiles(self):
        if (MEDIA_SERVER != TRANSCODING_SERVER):
            logger.info("Media server is separate, fetching last 24 hours SMIL files from " + MEDIA_SERVER)
            self.rsyncSMILFiles()
        logger.info("Looking for SMIL files since " + str(self.lastSMILFileCheckTime) + " in " + LOCAL_MEDIA_DIR_ROOT)
        for root, dirs, filenames in os.walk(LOCAL_MEDIA_DIR_ROOT):
            for subdir in dirs:
                pass
            for f in filenames:
                file_path = os.path.join(root, f)
                if (f.lower().endswith("stream.smil")):
                    file_path = os.path.join(root, f)
                    mod_time = self.modification_date(file_path)
                    if(mod_time > self.lastSMILFileCheckTime):
                        logger.info("Found a new SMIL File " + file_path)
                        relDir = os.path.relpath(root, LOCAL_MEDIA_DIR_ROOT)
                        f = f.split(".")[0]
                        new_dir_name = os.path.splitext(os.path.basename(f))[0]
                        new_dir_name = os.path.join(relDir, new_dir_name)
                        if(self.createARemoteDirectory(new_dir_name)):
                            if(mod_time > self.newlastSMILFileCheckTime):
                                self.newlastSMILFileCheckTime = mod_time
    def modification_date(self, filename):
        t = os.path.getmtime(filename)
        return t
    def createARemoteDirectory(self, dirName):
        HOST = RSYNC_SERVER
        DIR_NAME=RSYNC_DIR + "/" + dirName
        COMMAND = "ssh {}@{} mkdir -p {}".format(RSYNC_USERNAME, RSYNC_SERVER, DIR_NAME)
        logger.info("Going to execute :-- " + COMMAND)
        rv = subprocess.check_call(COMMAND, shell=True)
        return True
    def rsyncSMILFiles(self):
        HOST = RSYNC_SERVER
        for MEDIA_DIR in MEDIA_DIRS:
            epoch_time = int(time.time())
            TEMP_FILE="/tmp/rsync_files.{}".format(epoch_time)
            COMMAND = "ssh -o ConnectTimeout=10 {}@{} \"cd {} && find . -mtime -3 -name *.stream.smil > {} && rsync -azP --files-from={} . {}@{}:{}/{}\"".format(MEDIA_SERVER_USERNAME, MEDIA_SERVER, MEDIA_DIR, TEMP_FILE, TEMP_FILE, TRANSCODING_SERVER_USERNAME, TRANSCODING_SERVER, LOCAL_MEDIA_DIR_ROOT, MEDIA_DIR)
            logger.info("Going to execute :-- " + COMMAND)
            try:
                rv = subprocess.check_call(COMMAND, shell=True)
            except Exception as e:
                logger.error("Unable to connect to media server")
        return True
    def rsyncFromRemote(self):
        HOST = RSYNC_SERVER
        COMMAND="rsync -azP --delete {}@{}:{} {} ".format(RSYNC_USERNAME, RSYNC_SERVER, RSYNC_DIR+"/", PROCESSING_DIR)
        logger.info("Going to execute :-- " + COMMAND)
        rv = subprocess.check_call(COMMAND, shell=True)
        return True
    def runTranscoder(self, inputDIR, outputLocalDIR, outputRemoteDIR, fileName, media_server):
        HOST = RSYNC_SERVER
        COMMAND="bash {} {} {} {} {} {}".format(PROCESSING_CPU_SCRIPT, inputDIR, outputLocalDIR, outputRemoteDIR, fileName, media_server)
        # if (len(self.fluid_threads)>2):
        #     COMMAND="bash {} {} {} {} {} {}".format(PROCESSING_CPU_SCRIPT, inputDIR, outputLocalDIR, outputRemoteDIR, fileName, media_server)
        logger.info("Going to execute :-- " + COMMAND)
        # sendEmail(EMAIL_SEND_TO, EMAIL_SEND_FROM, "Transcoding started for file" + fileName.replace('_','-').replace('/','-'), outputRemoteDIR+fileName)
        # sendEmail(EMAIL_SEND_TO, EMAIL_SEND_FROM, "Transcoding started for file" , outputRemoteDIR+fileName)
        try:
            rv = subprocess.check_call(COMMAND, shell=True)
            # if (rv !=0):
            #     sendEmail(EMAIL_SEND_TO, EMAIL_SEND_FROM, "Transcoding Failed for a file", outputRemoteDIR+fileName)
        except Exception as e:
            logger.error("Transcoding Failed for a file :- " + outputRemoteDIR+fileName);
            # sendEmail(EMAIL_SEND_TO, EMAIL_SEND_FROM, "Transcoding Failed for a file", outputRemoteDIR+fileName+"\n contact dev@example.com")
        return True
    def setupDB(self):
        self.db = sqlite3.connect('automator.db', detect_types=sqlite3.PARSE_DECLTYPES | sqlite3.PARSE_COLNAMES)
        sql = "create table if not exists last_smil_check (last_check_time int)"
        self.db.execute(sql)
        self.db.commit()
        sql = "create table if not exists last_mediafile_check(last_check_time int)"
        self.db.execute(sql)
        self.db.commit()
    def fetchlastCheckTime(self):
        cursor = self.db.execute("select * from last_smil_check")
        count = 0
        for row in cursor:
            logger.info(row)
            count = count+1
            self.lastSMILFileCheckTime = row[0]
        self.newlastSMILFileCheckTime = self.lastSMILFileCheckTime
        cursor = self.db.execute("select * from last_mediafile_check")
        count = 0
        for row in cursor:
            logger.info(row)
            count = count+1
            self.lastMediaFileCheckTime = row[0]
        self.newLastMediaFileCheckTime = self.lastMediaFileCheckTime
    def updatelastCheckTimes(self):
        self.lastSMILFileCheckTime = self.newlastSMILFileCheckTime
        self.lastMediaFileCheckTime = self.newLastMediaFileCheckTime
        cursor = self.db.execute("select * from last_smil_check")
        count = 0
        for row in cursor:
            count = count +1
        if(count == 0):
            sql_query = "insert into last_smil_check values ({})".format(self.lastSMILFileCheckTime)
            logger.info("Executing " + sql_query)
            self.db.execute(sql_query)
        else:
            self.db.execute("update last_smil_check set last_check_time ={}".format(self.lastSMILFileCheckTime))
        self.db.commit()
        cursor = self.db.execute("select * from last_mediafile_check")
        logger.info(cursor)
        count = 0
        for row in cursor:
            count = count +1
        if(count == 0):
            sql_query = "insert into last_mediafile_check values ({})".format(self.lastMediaFileCheckTime)
            logger.info("Executing " + sql_query)
            self.db.execute(sql_query)
        else:
            sql_query = "update last_mediafile_check set last_check_time ={}".format(self.lastMediaFileCheckTime)
            logger.info("Executing " + sql_query)
            self.db.execute(sql_query)
        self.db.commit()

if __name__ == '__main__':
    logging.basicConfig(level=logging.DEBUG)
    automator = Automator()
    automator.start()
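The question ends with the script above. As a sketch only, and not part of the asker's code, one way to keep the number of simultaneous ffmpeg processes bounded is to submit jobs to a fixed-size worker pool instead of starting one thread per discovered file; the command and file names below are hypothetical.

#!/usr/bin/env python3
# Sketch: bound concurrent transcodes with a fixed-size worker pool. Jobs can
# be submitted as fast as they are discovered, but at most MAX_WORKERS ffmpeg
# processes run at any one time; the rest wait in the pool's internal queue.
import subprocess
from concurrent.futures import ThreadPoolExecutor

MAX_WORKERS = 2  # tune to what the server can sustain

def transcode(src, dst):
    # Hypothetical ffmpeg invocation; in the script above this would be the
    # PROCESSING_CPU_SCRIPT call made by runTranscoder.
    cmd = ["ffmpeg", "-y", "-i", src, "-c:v", "libx264", "-crf", "23",
           "-c:a", "aac", dst]
    return subprocess.call(cmd)

if __name__ == "__main__":
    jobs = [("in1.mov", "out1.mp4"), ("in2.mov", "out2.mp4"), ("in3.mov", "out3.mp4")]
    with ThreadPoolExecutor(max_workers=MAX_WORKERS) as pool:
        results = list(pool.map(lambda job: transcode(*job), jobs))
    print(results)

Applied to the script above, startATranscodingThread would submit work to such a pool (or to a queue.Queue drained by a few worker threads) instead of starting an unbounded FuncThread per file, which caps the load without changing the rest of the flow.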
-
Android FFmpegPlayer Streaming Service onClick notification
8 October 2013, by agony
I have a MainActivity class that displays the list of available streams for my project, and a StreamingActivity class where the streaming is done.
When the user selects an item from the list, it starts the StreamingActivity and begins playing the stream.
I'm having trouble keeping the music streaming, and returning the user to the StreamingActivity class when they tap the notification, after they have pressed the home button or the app has gone through onDestroy(). I'm using FFmpegPlayer for my project because it has to play mms:// live streams for a local FM station.
Here's my code:
public class StreamingActivity extends BaseActivity implements ActionBar.TabListener,
PlayerControlListener, IMediaPlayerServiceClient {
private StatefulMediaPlayer mMediaPlayer;
private FFmpegService mService;
private boolean mBound;
public static final String TAG = "StationActivity";
private static Bundle mSavedInstanceState;
private static PlayerFragment mPlayerFragment;
private static DJListFragment mDjListFragment;
private SectionsPagerAdapter mSectionsPagerAdapter;
private ViewPager mViewPager;
private String stream = "";
private String fhz = "";
private String page = "0";
private Dialog shareDialog;
private ProgressDialog dialog;
private boolean isStreaming;
/*************************************************************************************************************/
@Override
public void onCreate(Bundle savedInstanceState){
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_station);
Bundle bundle = getIntent().getExtras();
if(bundle !=null){
fhz = bundle.getString("fhz");
stream = bundle.getString("stream");
}
Log.d(TAG, "page: " + page + " fhz: " + fhz + " stream: " + stream + " isStreaming: " + isStreaming);
getSupportActionBar().setTitle("Radio \n" + fhz);
getSupportActionBar().setDisplayHomeAsUpEnabled(true);
getSupportActionBar().setNavigationMode(ActionBar.NAVIGATION_MODE_TABS);
mPlayerFragment = (PlayerFragment) Fragment.instantiate(this, PlayerFragment.class.getName(), null);
mDjListFragment = (DJListFragment) Fragment.instantiate(this, DJListFragment.class.getName(), null);
mSectionsPagerAdapter = new SectionsPagerAdapter(getSupportFragmentManager());
mViewPager = (ViewPager) findViewById(R.id.pager);
mViewPager.setAdapter(mSectionsPagerAdapter);
mViewPager.setCurrentItem(Integer.parseInt(page));
mSavedInstanceState = savedInstanceState;
Tab playingTab = getSupportActionBar().newTab();
playingTab.setText(getString(R.string.playing_label));
playingTab.setTabListener(this);
Tab djTab = getSupportActionBar().newTab();
djTab.setText(getString(R.string.dj_label));
djTab.setTabListener(this);
getSupportActionBar().addTab(playingTab);
getSupportActionBar().addTab(djTab);
// When swiping between different sections, select the corresponding
// tab. We can also use ActionBar.Tab#select() to do this if we have
// a reference to the Tab.
mViewPager.setOnPageChangeListener(new ViewPager.SimpleOnPageChangeListener() {
@Override
public void onPageSelected(int position) {
StreamingActivity.this.getSupportActionBar().setSelectedNavigationItem(position);
}
});
if (mSavedInstanceState != null) {
getSupportActionBar().setSelectedNavigationItem(mSavedInstanceState.getInt("tab", 0));
}
dialog = new ProgressDialog(this);
bindToService();
UriBean.getInstance().setStream(stream);
Log.d(TAG ,"stream: " + UriBean.getInstance().getStream());
}
/********************************************************************************************************/
public class SectionsPagerAdapter extends FragmentPagerAdapter {
public SectionsPagerAdapter(FragmentManager fm) {
super(fm);
}
@Override
public Fragment getItem(int position) {
if (position == 0) {
return mPlayerFragment;
} else {
return mDjListFragment;
}
}
@Override
public int getCount() {
return 2;
}
}
@Override
public void onTabSelected(Tab tab, FragmentTransaction ft) {
// When the given tab is selected, switch to the corresponding page in the ViewPager.
mViewPager.setCurrentItem(tab.getPosition());
}
@Override
public void onTabUnselected(Tab tab, FragmentTransaction ft) { }
@Override
public void onTabReselected(Tab tab, FragmentTransaction ft) { }
/********************************************************************************************************/
public void showLoadingDialog() {
dialog.setMessage("Buffering...");
dialog.show();
}
public void dismissLoadingDialog() {
dialog.dismiss();
}
/********************************************************************************************************/
/**
* Binds to the instance of MediaPlayerService. If no instance of MediaPlayerService exists, it first starts
* a new instance of the service.
*/
public void bindToService() {
Intent intent = new Intent(this, FFmpegService.class);
if (Util.isFFmpegServiceRunning(getApplicationContext())){
// Bind to Service
Log.i(TAG, "bindService");
bindService(intent, mConnection, Context.BIND_AUTO_CREATE);
} else {
//start service and bind to it
Log.i(TAG, "startService & bindService");
startService(intent);
bindService(intent, mConnection, Context.BIND_AUTO_CREATE);
}
}
/**
* Defines callbacks for service binding, passed to bindService()
*/
private ServiceConnection mConnection = new ServiceConnection() {
@Override
public void onServiceConnected(ComponentName className, IBinder serviceBinder) {
Log.d(TAG,"service connected");
//bound with Service. get Service instance
MediaPlayerBinder binder = (FFmpegService.MediaPlayerBinder) serviceBinder;
mService = binder.getService();
//send this instance to the service, so it can make callbacks on this instance as a client
mService.setClient(StreamingActivity.this);
mBound = true;
Log.d(TAG, "isPlaying === SERVICE: " + mService.isPlaying());
//if
startStreaming();
}
@Override
public void onServiceDisconnected(ComponentName arg0) {
mBound = false;
mService = null;
}
};
/********************************************************************************************************/
@Override
public void onPlayerPlayStop() {
Log.d(TAG, "onPlayerPlayStop");
Log.v(TAG, "isStreaming: " + isStreaming);
Log.v(TAG, "mBound: " + mBound);
if (mBound) {
Log.d(TAG, "bound.............");
mMediaPlayer = mService.getMediaPlayer();
//pressed pause ->pause
if (!PlayerFragment.play.isChecked()) {
if (mMediaPlayer.isStarted()) {
Log.d(TAG, "pause");
mService.pauseMediaPlayer();
}
} else { //pressed play
// STOPPED, CREATED, EMPTY, -> initialize
if (mMediaPlayer.isStopped() || mMediaPlayer.isCreated() || mMediaPlayer.isEmpty()) {
startStreaming();
} else if (mMediaPlayer.isPrepared() || mMediaPlayer.isPaused()) { //prepared, paused -> resume play
Log.d(TAG, "start");
mService.startMediaPlayer();
}
}
Log.d(TAG, "isPlaying === SERVICE: " + mService.isPlaying());
}
}
/********************************************************************************************************/
@Override
public void onDownload() {
Toast.makeText(this, "Not yet available...", Toast.LENGTH_SHORT).show();
}
@Override
public void onComment() {
FragmentManager fm = getSupportFragmentManager();
DialogFragment newFragment = MyAlertDialogFragment.newInstance();
newFragment.show(fm, "comment_dialog");
}
@Override
public void onShare() {
showShareDialog();
}
/********************************************************************************************************/
private void startStreaming() {
Log.d(TAG, "@startLoading");
boolean isNetworkFound = Util.checkConnectivity(getApplicationContext());
if(isNetworkFound) {
Log.d(TAG, "network found");
mService.initializePlayer(stream);
isStreaming = true;
} else {
Toast.makeText(getApplicationContext(), "No internet connection found...", Toast.LENGTH_SHORT).show();
}
Log.d(TAG, "isStreaming: " + isStreaming);
Log.d(TAG, "isPlaying === SERVICE: " + mService.isPlaying());
}
@Override
public void onInitializePlayerStart() {
showLoadingDialog();
}
@Override
public void onInitializePlayerSuccess() {
dismissLoadingDialog();
PlayerFragment.play.setChecked(true);
Log.d(TAG, "isPlaying === SERVICE: " + mService.isPlaying());
}
@Override
public void onError() {
Toast.makeText(getApplicationContext(), "Not connected to the server...", Toast.LENGTH_SHORT).show();
}
@Override
public void onDestroy() {
Log.d(TAG, "onDestroy");
super.onDestroy();
uiHelper.onDestroy();
Log.d(TAG, "isPlaying === SERVICE: " + mService.isPlaying());
if (mBound) {
mService.unRegister();
unbindService(mConnection);
mBound = false;
}
Log.d(TAG, "service: " + Util.isFFmpegServiceRunning(getApplicationContext()));
}
@Override
public void onStop(){
Log.d(TAG, "onStop");
super.onStop();
}
/*******************************************************************************************************/
@Override
public boolean onOptionsItemSelected(MenuItem item) {
int itemId = item.getItemId();
switch (itemId){
case android.R.id.home:
onBackPressed();
break;
default:
break;
}
return true;
}
@Override
public boolean onKeyDown(int keyCode, KeyEvent event) {
Log.d(TAG, "@onKeyDown");
if (keyCode == KeyEvent.KEYCODE_BACK && event.getRepeatCount() == 0){
//this.moveTaskToBack(true);
onBackPressed();
return true;
}
return super.onKeyDown(keyCode, event);
}
}
public class FFmpegService extends Service implements IMediaPlayerThreadClient {
private FFmpegPlayerThread mMediaPlayerThread = new FFmpegPlayerThread(this);
private final Binder mBinder = new MediaPlayerBinder();
private IMediaPlayerServiceClient mClient;
//private StreamStation mCurrentStation;
private boolean mIsSupposedToBePlaying = false;
private boolean isPausedInCall = false;
private PhoneStateListener phoneStateListener;
private TelephonyManager telephonyManager;
@Override
public void onCreate(){
mMediaPlayerThread.start();
}
/**
* A class for clients binding to this service. The client will be passed an object of this class
* via its onServiceConnected(ComponentName, IBinder) callback.
*/
public class MediaPlayerBinder extends Binder {
/**
* Returns the instance of this service for a client to make method calls on it.
* @return the instance of this service.
*/
public FFmpegService getService() {
return FFmpegService.this;
}
}
/**
* Returns the contained StatefulMediaPlayer
* @return
*/
public StatefulMediaPlayer getMediaPlayer() {
return mMediaPlayerThread.getMediaPlayer();
}
public boolean isPlaying() {
return mIsSupposedToBePlaying;
}
@Override
public IBinder onBind(Intent arg0) {
return mBinder;
}
@Override
public int onStartCommand(Intent intent, int flags, int startId) {
telephonyManager = (TelephonyManager) getSystemService(Context.TELEPHONY_SERVICE);
phoneStateListener = new PhoneStateListener() {
@Override
public void onCallStateChanged(int state, String incomingNumber) {
// String stateString = "N/A";
Log.v("FFmpegService", "Starting CallStateChange");
switch (state) {
case TelephonyManager.CALL_STATE_OFFHOOK:
case TelephonyManager.CALL_STATE_RINGING:
if (mMediaPlayerThread != null) {
pauseMediaPlayer();
isPausedInCall = true;
}
break;
case TelephonyManager.CALL_STATE_IDLE:
// Phone idle. Start playing.
if (mMediaPlayerThread != null) {
if (isPausedInCall) {
isPausedInCall = false;
startMediaPlayer();
}
}
break;
}
}
};
// Register the listener with the telephony manager
telephonyManager.listen(phoneStateListener, PhoneStateListener.LISTEN_CALL_STATE);
return START_STICKY;
}
/**
* Sets the client using this service.
* @param client The client of this service, which implements the IMediaPlayerServiceClient interface
*/
public void setClient(IMediaPlayerServiceClient client) {
this.mClient = client;
}
public void initializePlayer(final String station) {
//mCurrentStation = station;
mMediaPlayerThread.initializePlayer(station);
}
public void startMediaPlayer() {
Intent notificationIntent = new Intent(getApplicationContext(), StreamingActivity.class);
//notificationIntent.putExtra("page", "0");
//notificationIntent.putExtra("isPlaying", isPlaying());
notificationIntent.addFlags(Intent.FLAG_ACTIVITY_SINGLE_TOP | Intent.FLAG_ACTIVITY_CLEAR_TOP);
PendingIntent contentIntent = PendingIntent.getActivity(getApplicationContext(), 0 , notificationIntent , PendingIntent.FLAG_UPDATE_CURRENT);
NotificationCompat.Builder mBuilder = new NotificationCompat.Builder(this)
.setSmallIcon(R.drawable.ic_launcher)
.setContentTitle("You are listening to Radio...")
.setContentText("test!!!")
.setContentIntent(contentIntent);
startForeground(1, mBuilder.build());
NotificationManager notificationManager = (NotificationManager) getSystemService(Context.NOTIFICATION_SERVICE);
notificationManager.notify(1, mBuilder.build());
mIsSupposedToBePlaying = true;
mMediaPlayerThread.startMediaPlayer();
}
public void dismissNotification(Context context) {
String ns = Context.NOTIFICATION_SERVICE;
NotificationManager mNotificationManager = (NotificationManager) getSystemService(ns);
mNotificationManager.cancel(1);
}
/**
* Pauses playback
*/
public void pauseMediaPlayer() {
Log.d("MediaPlayerService","pauseMediaPlayer() called");
mMediaPlayerThread.pauseMediaPlayer();
stopForeground(true);
mIsSupposedToBePlaying = false;
dismissNotification(this);
}
/**
* Stops playback
*/
public void stopMediaPlayer() {
stopForeground(true);
mMediaPlayerThread.stopMediaPlayer();
mIsSupposedToBePlaying = false;
dismissNotification(this);
}
public void resetMediaPlayer() {
mIsSupposedToBePlaying = false;
stopForeground(true);
mMediaPlayerThread.resetMediaPlayer();
dismissNotification(this);
}
@Override
public void onError() {
mIsSupposedToBePlaying = false;
mClient.onError();
dismissNotification(this);
}
@Override
public void onInitializePlayerStart() {
mClient.onInitializePlayerStart();
}
@Override
public void onInitializePlayerSuccess() {
startMediaPlayer();
mClient.onInitializePlayerSuccess();
mIsSupposedToBePlaying = true;
}
public void unRegister() {
this.mClient = null;
mIsSupposedToBePlaying = false;
dismissNotification(this);
}
}
Hoping someone can help me here...