I'm using the just_audio package. I have a playlist, and after playing all the tracks, the current index remains equal to the index of the last track. Is there a way to reset the index to 0 when the playlist ends?
final myPlayList = ConcatenatingAudioSource(
  children: listAudios,
);
await _player.setAudioSource(myPlayList, initialIndex: 0, preload: false);
Listen for the completed state and seek to index 0. Assuming you also want to stop playing at that point, call pause() too:
_player.processingStateStream.listen((state) async {
  if (state == ProcessingState.completed) {
    await _player.pause();
    await _player.seek(Duration.zero, index: 0);
  }
});
Regarding the call to pause(): just_audio's state is a composite of playing and processingState, and you will only hear audio when playing=true, processingState=ready, which is what you have during normal playback. The playing state only ever changes when you tell it to change, but the processingState may change independently depending on what is happening in the audio.
When reaching the end of the playlist, the player transitions to playing=true, processingState=completed, hence no audio. If you seek back to the beginning, it will be playing=true, processingState=ready, so the audio will automatically start back up. This may surprise you, but think of it as comparable to a YouTube video: after it completes, if you seek back to the beginning, it continues playing (because it's now ready, not completed).
If you want it to stop playing, you need to actually set playing=false with either pause() or stop(), where the former keeps things in memory, and the latter releases all resources.
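If it helps to see both halves of that composite state at once, just_audio also exposes a combined playerStateStream. A minimal sketch, reusing the same _player as above:
_player.playerStateStream.listen((playerState) {
  // playerState bundles both values described above.
  final playing = playerState.playing;
  final processingState = playerState.processingState;
  if (playing && processingState == ProcessingState.completed) {
    // End of the playlist: this is where the pause() + seek() above apply.
  }
});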
For more information about the state model and its rationale, see The State Model section of the README.
Related
I'm using Flutter's just_audio package and I'm creating a ConcatenatingAudioSource like this:
// Define the playlist
final playlist = ConcatenatingAudioSource(
  // Start loading next item just before reaching it
  useLazyPreparation: true,
  // Customise the shuffle algorithm
  shuffleOrder: DefaultShuffleOrder(),
  // Specify the playlist items
  children: [
    AudioSource.uri(Uri.parse('https://example.com/track1.mp3')),
    AudioSource.uri(Uri.parse('https://example.com/track2.mp3')),
    AudioSource.uri(Uri.parse('https://example.com/track3.mp3')),
  ],
);
This works perfectly, but I can't seem to prevent the next audio from playing when the current one ends.
All I want is a playlist where I have to explicitly seek to the next audio whenever the current one ends.
Is it possible to do that?
It sounds like ConcatenatingAudioSource is not the right solution to your problem, because what it does is "concatenate" the audio of multiple items together so that they can be played continuously.
If all you want is to have separate items that the user can switch between, then you can program it exactly that way:
final items = [
  AudioSource.uri(Uri.parse('https://example.com/track1.mp3')),
  AudioSource.uri(Uri.parse('https://example.com/track2.mp3')),
  AudioSource.uri(Uri.parse('https://example.com/track3.mp3')),
];

Future<void> selectItem(int index) async {
  await player.setAudioSource(items[index]);
}
Now, there is only ever one item in just_audio's playlist, which means that when it completes, it completes. You can call selectItem(i) to switch to item i.
This solution does mean that player.sequence will not contain a sequence of all items; it will only ever contain a single item. So if you were relying on that to render the list in your UI, you should instead use items (declared above).
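As a rough sketch of how that could be wired up in practice (currentIndex and playNext are my own illustrative names, not part of the answer):
int currentIndex = 0;

Future<void> playNext() async {
  // Only advance when explicitly asked to, e.g. from a "next" button.
  if (currentIndex + 1 < items.length) {
    currentIndex++;
    await selectItem(currentIndex);
    player.play();
  }
}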
I'm trying to play two sound tracks at the same time: the first one works as BGM and the other works as SFX.
What happens now is that the BGM is playing, and when an SFX sound starts to play, the BGM stops and then restarts from the beginning.
I want both sounds to play without stopping. I searched the docs and couldn't find a clear answer to this.
You can use assets_audio_player to open multiple audio instances.
// Play 3 songs in parallel
AssetsAudioPlayer.newPlayer().open(
  Audio.asset("assets/audios/song1.mp3"),
);
AssetsAudioPlayer.newPlayer().open(
  Audio.asset("assets/audios/song2.mp3"),
);
// Another way: create, open, play & dispose the player on finish
AssetsAudioPlayer.playAndForget(
  Audio.asset("assets/audios/song3.mp3"),
);
You can read the documentation for more info.
You can refer to this question for a more comprehensive answer, but you can simply create two players, one to play the BGM and one to play the SFX:
final bgmPlayer = AudioPlayer();
final sfxPlayer = AudioPlayer();
await bgmPlayer.setAsset('music.mp3');
await sfxPlayer.setAsset('effect.mp3');
bgmPlayer.play(); // start playing background music
// now you can play the sound effect any time you want
// while the BGM continues to play:
await sfxPlayer.play();
await sfxPlayer.play();
await sfxPlayer.play();
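One hedged caveat: if the effect has already run to completion, a bare play() by itself won't make it audible again (see the state model discussion in the first answer above), so you may want a small helper that rewinds first. For example (playSfx is my own name for it):
Future<void> playSfx() async {
  // Rewind so the effect plays from the start even if it completed earlier.
  await sfxPlayer.seek(Duration.zero);
  sfxPlayer.play();
}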
I want to have a variable length pause between tracks in a playlist created with a just_audio AudioPlayer instance (there is a background track which I want playing during this interval). Something to the effect of:
_voiceAudioPlayer.currentIndexStream.listen((event) {
  _voiceAudioPlayer.pause();
  Future.delayed(const Duration(seconds: 4), () => _voiceAudioPlayer.play());
});
This throws an error:
"Unhandled Exception: Bad state: Cannot fire new event. Controller is already firing an event"
Is there a clean way to do this? I'm considering adding silent MP3s between the tracks in the playlist, but feel there ought to be a better way.
This error happens because currentIndexStream is a "sync" broadcast stream, so you can't trigger another state change event while the current event is being processed (i.e. in the same cycle of the event loop). But you can get around that by scheduling a microtask to happen after the current cycle:
_voiceAudioPlayer.currentIndexStream.listen((index) {
  // scheduleMicrotask comes from dart:async.
  scheduleMicrotask(() async {
    _voiceAudioPlayer.pause();
    await Future.delayed(const Duration(seconds: 4));
    _voiceAudioPlayer.play();
  });
});
Still, I wouldn't depend on this callback being executed soon enough due to the gapless nature of just_audio's playlists. That is, the next audio track will begin playing immediately, so you're bound to hear at least a fraction of the next item's audio before the pause happens.
There is an open feature request for a SilenceAudioSource which could be inserted into a playlist (you can vote for that issue by clicking the thumbs-up button if you'd like it to be implemented). A silent audio file, which you proposed, is actually the simplest alternative to SilenceAudioSource.
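If you do go the silent-file route, a minimal sketch of what the playlist might look like (the asset name assets/silence_4s.mp3 and the track URLs are hypothetical, and the asset needs to be declared in pubspec.yaml):
final playlist = ConcatenatingAudioSource(children: [
  AudioSource.uri(Uri.parse('https://example.com/track1.mp3')),
  // just_audio can load Flutter assets via the asset:// URI scheme.
  AudioSource.uri(Uri.parse('asset:///assets/silence_4s.mp3')),
  AudioSource.uri(Uri.parse('https://example.com/track2.mp3')),
]);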
Otherwise, another approach would be to not use the gapless playlists feature at all (since you don't need the gapless feature anyway), and just implement your own logic to advance the queue:
final queue = [source1, source2, source3, ...];
for (var source in queue) {
  await _voiceAudioPlayer.setAudioSource(source);
  await _voiceAudioPlayer.play();
  await Future.delayed(const Duration(seconds: 4));
}
The above example does not handle pause/resume logic, but it is just to show that it is possible for you to take the playlist logic into your own hands if you don't require the gapless feature.
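If you do need pause/resume on top of that, one hedged variation is to advance only when the current item has actually finished, by waiting for the completed state rather than for play() to return:
for (var source in queue) {
  await _voiceAudioPlayer.setAudioSource(source);
  _voiceAudioPlayer.play();
  // Wait for this item to finish; a user-initiated pause() won't trigger this.
  await _voiceAudioPlayer.processingStateStream
      .firstWhere((state) => state == ProcessingState.completed);
  await Future.delayed(const Duration(seconds: 4));
}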
I've been playing around with Flutter and trying to get it so I can record the front facing camera (using the camera plugin [https://pub.dev/packages/camera]) as well as playing a video to the user (using the video_player plugin [https://pub.dev/packages/video_player]).
Next I use ffmpeg to horizontally stack the videos together. This all works fine, but when I play back the final output the audio is slightly out of sync. I'm calling Future.wait([cameraController.startVideoRecording(filePath), videoController.play()]); but there is a slight delay before these tasks actually start. I don't even need them to fire at exactly the same time (which I'm realising is probably impossible); instead, if I knew exactly when each task began, I could use the time difference to sync the audio with ffmpeg or similar.
I've tried adding a listener on the videoController to see when isPlaying first returns true, and also watching the output directory for when the recorded video appears on the filesystem:
listener = () {
  if (videoController.value.isPlaying) {
    isPlaying = DateTime.now().microsecondsSinceEpoch;
    log('isPlaying ' + isPlaying.toString());
  }
  videoController.removeListener(listener);
};
videoController.addListener(listener);

var watcher = DirectoryWatcher('${extDir.path}/');
watcher.events.listen((event) {
  if (event.type == ChangeType.ADD) {
    fileAdded = DateTime.now().microsecondsSinceEpoch;
    log('added ' + fileAdded.toString());
  }
});
Then likewise for checking if the camera is recording:
var listener;
listener = () {
  if (cameraController.value.isRecordingVideo) {
    log('isRecordingVideo ' + DateTime.now().microsecondsSinceEpoch.toString());
    //cameraController.removeListener(listener);
  }
};
cameraController.addListener(listener);
This results in (for example) the following order and microseconds for each event:
is playing: 1606478994797247
is recording: 1606478995492889 (695642 microseconds later)
added: 1606478995839676 (346787 microseconds later)
However, when I play back the video, the syncing is off by approximately 0.152 seconds, which doesn't marry up with the time differences reported above.
Does anyone have any idea how I could accomplish near perfect syncing when combining 2 videos? Thanks.
For example, when the user starts watching the stream, I will initialize the player, and it starts playing from the latest position in the live stream. If the user then pauses the video, they will be x seconds behind the latest stream position. I want to add functionality to allow them to seek directly to the latest live position.
My current approach is to simply reinitialize the stream...
_videoPlayerController1 = VideoPlayerController.network('theUrl/index.m3u8');
_chewieController = ChewieController(
  videoPlayerController: _videoPlayerController1,
  autoPlay: true,
  isLive: true,
  autoInitialize: true,
);
However, as the two mentioned libraries heavily emphasize the need to call dispose() when finished with them I'm worried about creating memory leaks here. Is this approach okay?
I did try calling dispose() on them both before calling the above code...
_videoPlayerController1.dispose();
_chewieController.dispose();
Although that led to the following error: 'A VideoPlayerController was used after being disposed. Once you have called dispose() on a VideoPlayerController, it can no longer be used.'
Does my approach lead to memory leaks and is there a better way to seek the latest live stream position?
I figured it out. The simplest way to jump to the latest position in a live stream is to call:
_chewieController.seekTo(Duration(days: 30));
Under the hood, Chewie calls the seekTo method on its underlying VideoPlayerController, which is documented as follows:
/// If [moment] is outside of the video's full range it will be automatically
/// and silently clamped.
So it seems that if you don't know the stream's latest position, you can pass in a Duration value that is much larger than you expect the stream's latest position to be (30 days in this example), and the VideoPlayerController will clamp its position to the latest known position.
Try the following code:
_videoPlayerController.seekTo(_videoPlayerController.value.duration);
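As a hedged composite of the two answers above (seekToLiveEdge is my own name, and this assumes a recent video_player where value.duration is non-null): prefer the reported duration when the controller knows it, and otherwise fall back to a deliberately huge value that gets clamped to the live edge.
Future<void> seekToLiveEdge() async {
  final duration = _videoPlayerController.value.duration;
  await _videoPlayerController.seekTo(
    duration > Duration.zero ? duration : const Duration(days: 30),
  );
}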