Flutter demo: audioplayers in the background via audio_service

In this article I will show you how to use audioplayers together with audio_service.

You can find the full demo on GitHub.

I assume that you are familiar with bloc for state management and get_it for dependency injection.

audioplayers is a great library for playing audio in your Flutter project. But what if you want your music to keep playing while the phone is locked? The audio_service library solves this.

To reach our goal we will use the following libraries:

- audioplayers
- audio_service
- bloc
- get_it

When you use audioplayers normally, audio stops when the phone locks. This happens because the player runs in the same isolate as the UI, and that isolate is suspended when the phone locks. audio_service can use an existing isolate or create a new one for our player, and we need to move the player into it. This is the main idea of working in the background: move the player into that isolate and use actions and events to control playback.

First, go to the documentation for both of these libraries and apply the platform-specific setup for iOS and Android.

Now wrap your main route in an AudioServiceWidget:

void main() => runApp(MyApp());

class MyApp extends StatelessWidget {
  @override
  Widget build(BuildContext context) {
    return MaterialApp(
      title: 'Audio Service Demo',
      home: AudioServiceWidget(child: MusicPlayer()),
    );
  }
}

MusicPlayer is just a regular StatelessWidget with buttons and sliders to control playback. The most interesting parts are MediaPlayerCubit and AudioPlayerTask. You can think of AudioPlayerTask as the task that runs in the background isolate. MediaPlayerCubit keeps the player state and sends requests to AudioPlayerTask to control the player.
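For context, the state hierarchy that MediaPlayerCubit emits might look roughly like this. This is a sketch: the class names all appear later in the article, but the exact constructors and fields are assumptions.

```dart
// A sketch of the states MediaPlayerCubit emits. The field layout is an
// assumption; the article only shows each state constructed with a track.
abstract class MediaPlayerStateAbstract {
  const MediaPlayerStateAbstract(this.track);
  final AudioTrack track; // may be null before the first track is loaded
}

class MediaPlayerLoadingTrackState extends MediaPlayerStateAbstract {
  const MediaPlayerLoadingTrackState(AudioTrack track) : super(track);
}

class MediaPlayerPlayingState extends MediaPlayerStateAbstract {
  const MediaPlayerPlayingState(AudioTrack track) : super(track);
}

class MediaPlayerPausedState extends MediaPlayerStateAbstract {
  const MediaPlayerPausedState(AudioTrack track) : super(track);
}
```

The UI rebuilds on these states, so every user-visible change (loading, playing, paused) is a single `emit` call in the cubit.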

Run player in isolate

Let’s start with AudioPlayerTask. AudioPlayerTask needs to extend BackgroundAudioTask.

import 'package:audio_service/audio_service.dart';
import 'package:audioplayers/audioplayers.dart';

class AudioPlayerTask extends BackgroundAudioTask {
  final _player = AudioPlayer();
}

At this point we are finished with AudioPlayerTask. It's not yet ready to play audio, but we will return to it later.

Now move to MediaPlayerCubit. Declare a top-level function _entryPoint in media_player_cubit.dart:

void _entryPoint() {
  AudioServiceBackground.run(() => AudioPlayerTask());
}

Next, create a static method _startAudioService(). We will also use it later to recreate the isolate.

static Future<void> _startAudioService(
  MediaPlayerCubit mediaPlayer,
) async {
  await AudioService.start(backgroundTaskEntrypoint: _entryPoint);
}

Call _startAudioService() in the constructor so the isolate is created when the application starts.

factory MediaPlayerCubit() {
  var mediaPlayer = MediaPlayerCubit._init();

  _startAudioService(mediaPlayer);

  return mediaPlayer;
}

Let's discuss why we create the isolate at application start. Creating a new isolate is a time-consuming operation. If we created it only when the user taps play, the app would noticeably freeze. By creating the isolate at startup, playback starts immediately when the user taps play.

In MediaPlayerCubit add a property _audioTrack, which represents the track currently playing:

import 'package:audioplayersaudioservice/features/audio_track/domain/entities/audio_track.dart';

class MediaPlayerCubit extends Cubit<MediaPlayerStateAbstract> {
  AudioTrack _audioTrack;
}
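For completeness, here is a minimal sketch of the AudioTrack entity. Only url, author and title are used in this article; the exact shape of the class is an assumption about the demo's domain model.

```dart
// A minimal AudioTrack entity (sketch). Field names match how the
// article maps a track to a MediaItem; anything else is up to you.
class AudioTrack {
  const AudioTrack({this.url, this.author, this.title});

  final String url;    // remote URL of the audio file, used as MediaItem.id
  final String author; // shown as MediaItem.album
  final String title;  // shown as MediaItem.title
}
```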

Play track in background

To start playing audio, let's create a method play() in MediaPlayerCubit:

Future<void> play() async {
  emit(MediaPlayerLoadingTrackState(null));

  if (_audioTrack == null) {
    _audioTrack = await getAudioTrackUseCase.next();

    if (_audioTrack is AudioTrack) {
      AudioService.updateQueue([]);
      AudioService.addQueueItem(MediaItem(
        id: _audioTrack.url,
        album: _audioTrack.author,
        title: _audioTrack.title,
      ));

      AudioService.play();
    } else {
      errorBarCubit.show('no tracks in library');
    }
  } else {
    AudioService.customAction(AudioPlayerTask.RESUME_ACTION);
  }
  emit(MediaPlayerPlayingState(_audioTrack));
}

Here we check whether a track is already playing (_audioTrack == null). If there is one, we need to resume playback (more about resume later); otherwise, we need to add a new track to the player. getAudioTrackUseCase.next() extracts the first track from our improvised playlist. If we successfully got a track, we clear the AudioService queue with AudioService.updateQueue([]), then add the new track to the queue:

AudioService.addQueueItem(MediaItem(
  id: _audioTrack.url,
  album: _audioTrack.author,
  title: _audioTrack.title,
));

AudioService.addQueueItem() sends a command to the background isolate, and we need to handle it there by overriding onAddQueueItem() in AudioPlayerTask:

class AudioPlayerTask extends BackgroundAudioTask {
  ...
  @override
  Future<void> onAddQueueItem(MediaItem mediaItem) async {
    AudioServiceBackground.setMediaItem(mediaItem);
  }
}

Here we set the incoming audio track as current.

Back to MediaPlayerCubit.play(). The line AudioService.play() sends a play command to the background isolate, and we need to handle it there by overriding onPlay() in AudioPlayerTask:

class AudioPlayerTask extends BackgroundAudioTask {
  ...
  @override
  Future<void> onPlay() async {
    var mediaItem = AudioServiceBackground.mediaItem;
    await _player.play(
      mediaItem.id,
      position: Duration(seconds: 0),
      isLocal: false,
    );
  }
}

Here we take the parameters of the current media item and pass them to our _player.

Right now, if you tap the play button, audio starts playing, but you cannot stop it because we haven't implemented pause yet. Let's do it.

class MediaPlayerCubit extends Cubit<MediaPlayerStateAbstract> {
  Future<void> pause() async {
    emit(MediaPlayerPausedState(_audioTrack));
    AudioService.pause();
  }
}

and

class AudioPlayerTask extends BackgroundAudioTask {
  @override
  Future<void> onPause() async {
    if (_player.state != AudioPlayerState.PAUSED) {
      _player.pause();
    }
  }
}

Now you can pause your audio. But if you tap play again, the player starts from the beginning. This is because MediaPlayerCubit.play() only knows how to play new tracks, not how to resume. Since AudioService has no resume() method, we will use AudioService.customAction() to create our own RESUME_ACTION. Add the constant RESUME_ACTION to AudioPlayerTask; it represents the name of the custom action.

class AudioPlayerTask extends BackgroundAudioTask {
  ...
  static const RESUME_ACTION = 'RESUME_ACTION';
}

In MediaPlayerCubit.play() we already call:

AudioService.customAction(AudioPlayerTask.RESUME_ACTION);

Here AudioService will pass the name of our custom action to the isolate. To handle it in the isolate, override onCustomAction() in AudioPlayerTask:

class AudioPlayerTask extends BackgroundAudioTask {
  @override
  Future<dynamic> onCustomAction(String name, dynamic arguments) async {
    if (name == RESUME_ACTION) {
      return await _player.resume();
    }
  }
}

Great! Now we can play, pause and resume our audio track, and it keeps playing while the phone is locked.

But if you look closer at our UI, you will see that the slider is completely dead: it does not show playback progress. Next, let's bring it to life.

Pass current position to slider

To do this we need to listen to events from the background isolate, react to them and update our UI. This can be done in two ways: listen to AudioService streams directly in the UI widgets, or handle events centrally in MediaPlayerCubit via AudioService.customEventStream.

Personally, I prefer the second approach. To implement it we need to override the onStart() method in AudioPlayerTask:

class AudioPlayerTask extends BackgroundAudioTask {
  ...
  @override
  Future<void> onStart(Map<String, dynamic> params) async {
    _player.onAudioPositionChanged.listen((position) {
      AudioServiceBackground.sendCustomEvent(
          {POSITION_SECONDS_EVENT: position.inSeconds});
      _broadcastState();
    });
  }
}

sendCustomEvent() takes a parameter of type dynamic. Since we will have several custom events, it is convenient to pass a Map where the key is the name of the custom event and the value is the payload, in our case the position in seconds. We cannot pass Duration instances between isolates, so we pass a plain int representing the position in seconds.
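The event-name constants referenced throughout the article can live next to RESUME_ACTION in AudioPlayerTask. A sketch (the names match the ones used in the listeners, but their declaration is not shown in the article, so treat this as an assumption about the demo):

```dart
class AudioPlayerTask extends BackgroundAudioTask {
  // Custom action names (UI -> background isolate).
  static const RESUME_ACTION = 'RESUME_ACTION';

  // Custom event names (background isolate -> UI), used as Map keys
  // in AudioServiceBackground.sendCustomEvent().
  static const POSITION_SECONDS_EVENT = 'POSITION_SECONDS_EVENT';
  static const DURATION_SECONDS_EVENT = 'DURATION_SECONDS_EVENT';
  static const PAUSE_EVENT = 'PAUSE_EVENT';
  // ...
}
```

Using string constants on both sides keeps the cubit and the task in sync without sharing any mutable state across isolates.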

To catch this event in MediaPlayerCubit we need to listen to AudioService.customEventStream. The _startAudioService() method is a good place for it:

static Future<void> _startAudioService(MediaPlayerCubit mediaPlayer) async {
  AudioService.customEventStream.listen((event) {
    if (event[AudioPlayerTask.POSITION_SECONDS_EVENT] != null) {
      mediaPlayer.setPosition(
          Duration(seconds: event[AudioPlayerTask.POSITION_SECONDS_EVENT]));
    }
  });

  await AudioService.start(backgroundTaskEntrypoint: _entryPoint);
}

mediaPlayer.setPosition() is a cubit method which emits PlayingPositionState, and the UI reacts to this state.
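A minimal sketch of setPosition(), assuming PlayingPositionState carries the current position alongside the track (the constructor signature is not shown in the article):

```dart
class MediaPlayerCubit extends Cubit<MediaPlayerStateAbstract> {
  // ...

  // Called from the customEventStream listener above. Emitting a state
  // is all it takes: the slider widget rebuilds on PlayingPositionState.
  void setPosition(Duration position) {
    emit(PlayingPositionState(_audioTrack, position));
  }
}
```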

Now our UI shows the current playback position on the slider. But we still don't see the headset buttons, the Android lock screen and notification controls, or the iOS control center.

Show lock screen controllers

To fix this we need to call AudioServiceBackground.setState() whenever the player state changes.

class AudioPlayerTask extends BackgroundAudioTask {
  ...
  Future<void> _broadcastState() async {
    var position = Duration(milliseconds: await _player.getCurrentPosition());
    await AudioServiceBackground.setState(
      controls: [
        MediaControl.skipToPrevious,
        _player.state == AudioPlayerState.PLAYING
            ? MediaControl.pause
            : MediaControl.play,
        MediaControl.stop,
        MediaControl.skipToNext,
      ],
      systemActions: [
        MediaAction.seekTo,
        MediaAction.seekForward,
        MediaAction.seekBackward,
      ],
      androidCompactActions: [0, 1, 3],
      processingState: _getProcessingState(),
      playing: _player.state == AudioPlayerState.PLAYING,
      position: position,
    );
  }

  @override
  Future<void> onStart(Map<String, dynamic> params) async {
    _player.onDurationChanged.listen((duration) {
      if (duration != AudioServiceBackground.mediaItem.duration) {
        AudioServiceBackground.setMediaItem(
            AudioServiceBackground.mediaItem.copyWith(duration: duration));
        AudioServiceBackground.sendCustomEvent(
            {DURATION_SECONDS_EVENT: duration.inSeconds});
      }
    });
    _player.onAudioPositionChanged.listen((position) {
      AudioServiceBackground.sendCustomEvent(
          {POSITION_SECONDS_EVENT: position.inSeconds});
      _broadcastState();
    });
    _player.onPlayerStateChanged.listen((event) {
      _broadcastState();
    });
    _player.onPlayerCommand.listen((event) {
      _broadcastState();
    });
  }
}

It is very important to listen to _player.onDurationChanged and set the mediaItem:

AudioServiceBackground.setMediaItem(
    AudioServiceBackground.mediaItem.copyWith(duration: duration));

Without this listener, the iOS control center cannot show playback progress.

Now you can finally see the headset buttons, the Android lock screen and notification controls, and the iOS control center. You can read more about AudioServiceBackground.setState() in the audio_service documentation.

Controlling audio from background

So far we have only discussed controlling playback from the app UI. Next, we need to make the buttons in the background controls work.

To let our UI know about player state changes we will use AudioServiceBackground.sendCustomEvent(). Let's implement tapping pause on the lock screen. We have already overridden the onPause() method in AudioPlayerTask; now all we need to do is send a custom event:

class AudioPlayerTask extends BackgroundAudioTask {
  @override
  Future<void> onPause() async {
    if (_player.state != AudioPlayerState.PAUSED) {
      AudioServiceBackground.sendCustomEvent({PAUSE_EVENT: true}); // <-- add this line
      _player.pause();
    }
  }
}

and in MediaPlayerCubit add handling of this event:

static Future<void> _startAudioService(MediaPlayerCubit mediaPlayer) async {
  AudioService.customEventStream.listen((event) {
    ...
    if (event[AudioPlayerTask.PAUSE_EVENT] != null) {
      mediaPlayer.pause();
    }
  });

  await AudioService.start(backgroundTaskEntrypoint: _entryPoint);
}

Now MediaPlayerCubit will call its pause() method and the UI will respond to it.

Recreate isolate

If you tap the stop button on the Android lock screen, AudioService terminates the isolate, and only restarting the app would allow playback again. To improve the user experience we can check whether AudioService is running and, if not, start it again; _startAudioService() will help us here. First, create a new method _addActionToAudioService(). It takes a callback that is executed after checking that AudioService is running:

class MediaPlayerCubit extends Cubit<MediaPlayerStateAbstract> {
  ...
  Future<dynamic> _addActionToAudioService(Function callback) async {
    if (!AudioService.running) {
      await _startAudioService(this);
    }

    return callback();
  }
}

Now all calls to AudioService need to go through _addActionToAudioService(). For example, here is how the pause() method looks:

Future<void> pause() async {
  emit(MediaPlayerPausedState(_audioTrack));
  _addActionToAudioService(() => AudioService.pause());
}

That's it! You can find the full demo application on GitHub.
