I have a list of songs on the song screen. When the user taps an item in the list, I call loadFirstPlayList() to load the songs (all songs in the album) into the queue, then skip to that queue item and play. This works on Android, but I get the following error on iOS.
GitHub Source Code
[NowPlaying] [MRNowPlaying] Ignoring setPlaybackState because application does not contain entitlement com.apple.mediaremote.set-playback-state for platform
Future<void> loadFirstPlayList(List<MediaItem> playlist, int index) async {
  await emptyPlaylist();
  if (playlist.isNotEmpty) {
    await _audioHandler.addQueueItems(playlist);
    await _audioHandler.skipToQueueItem(index);
    await _audioHandler.play();
  }
}
Audio Handler Methods
@override
Future<void> addQueueItems(List<MediaItem> mediaItems) async {
  // manage Just Audio
  final audioSource = mediaItems.map(_createAudioSource);
  _playlist.addAll(audioSource.toList());
  // notify system
  final newQueue = queue.value..addAll(mediaItems);
  queue.add(newQueue);
}

@override
Future<void> skipToQueueItem(int index) async {
  if (index < 0 || index >= queue.value.length) return;
  if (_player.shuffleModeEnabled) {
    index = _player.shuffleIndices![index];
  }
  await _player.seek(Duration.zero, index: index);
}

@override
Future<void> play() => _player.play();
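For context, _createAudioSource is referenced above but not shown. A typical implementation, assuming each MediaItem carries its stream URL under extras['url'] (that key is an assumption, not something from the question), looks like:

UriAudioSource _createAudioSource(MediaItem mediaItem) {
  // 'url' is a hypothetical extras key; use whatever key your MediaItems carry.
  return AudioSource.uri(
    Uri.parse(mediaItem.extras!['url'] as String),
    tag: mediaItem,
  );
}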
I don't know if you have figured it out already, but I had the same issue when loading an empty playlist. I was following this example, which includes a _loadEmptyPlaylist method; when I implemented it, the player failed silently. It now works since I stopped calling loadAudioSource on an empty audio sequence.
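In other words, guard the load. A minimal sketch of that guard, assuming _playlist is a just_audio ConcatenatingAudioSource and the load goes through setAudioSource (names follow the common audio_service tutorial, not your exact code):

Future<void> _loadPlaylist() async {
  // Skip the load entirely when there is nothing to play;
  // loading an empty sequence made the player fail silently.
  if (_playlist.children.isEmpty) return;
  await _player.setAudioSource(_playlist);
}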
Related
I have implemented this method so that when a user clicks a dynamic link, they are redirected to a specific page. Everything works fine while the app is running, but when I kill/close the app and try the same thing, it opens the app on the initial screen (Home Page). How can I make it work in this case?
Future<void> initDynamicLinks() async {
  FirebaseDynamicLinks.instance.onLink.listen((dynamicLinkData) {
    id = dynamicLinkData.link
        .toString()
        .substring(dynamicLinkData.link.toString().lastIndexOf('/') + 1);
    Get.to(
      () => Page(
        id: id,
      ),
    );
  }).onError((error) {
    if (kDebugMode) {
      print(error.message);
    }
  });
}
@override
void initState() {
  initDynamicLinks();
  super.initState();
}
I think the .onLink.listen() callback only fires when the app is resumed from the background.
If you want your deep link to work when the app starts fresh, run this code before setting up the .onLink.listen() handler...
WidgetsFlutterBinding.ensureInitialized();
await Firebase.initializeApp();
final PendingDynamicLinkData? data = await FirebaseDynamicLinks.instance.getInitialLink();
final Uri? deepLink = data?.link;
// Here you should navigate to your desired screen
Hope it helps you
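Putting both pieces together, a sketch of the full flow (the _openPageFromLink helper is hypothetical; the id parsing and Get.to call follow the question's code):

Future<void> initDynamicLinks() async {
  // 1. Handle the link that launched the app from a terminated state.
  final PendingDynamicLinkData? initialData =
      await FirebaseDynamicLinks.instance.getInitialLink();
  if (initialData != null) {
    _openPageFromLink(initialData.link);
  }
  // 2. Handle links that arrive while the app is running or backgrounded.
  FirebaseDynamicLinks.instance.onLink.listen((dynamicLinkData) {
    _openPageFromLink(dynamicLinkData.link);
  });
}

void _openPageFromLink(Uri link) {
  final id = link.toString().substring(link.toString().lastIndexOf('/') + 1);
  Get.to(() => Page(id: id));
}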
I am developing a podcast app. I want to save the index of every episode the user has started listening to into a box in Hive.
So I am using a listener to catch the current/latest index from the player's positionStream (I need the position as well).
But Hive doesn't write to the box inside the listener. The same code works when wrapped inside a button with an onTap event (and tapped, of course). Why is that? Is there any way around it?
No error message is thrown whatsoever.
Below is a minimal reproduction example which doesn't write to boxBools.
Future<void> main() async {
  WidgetsFlutterBinding.ensureInitialized();
  final appDocDir = await getApplicationDocumentsDirectory();
  await Hive.initFlutter(appDocDir.path);
  await Hive.openBox<List<bool>>("Bools");
}
void listenToChanges() {
  final boxBools = Hive.box<List<bool>>("Bools");
  int? _index;
  List<bool> randomList = [false, false, false, false, false, false, false, false, false, false];
  _player.positionStream.listen((event) {
    if (event.inSeconds != 0) {
      _index = _player.currentIndex;
      randomList[_index!] = true;
      print("RandomList => $randomList");
      boxBools.put("randomPod", randomList);
    }
  });
}
The print statement prints the correct values, but randomList never gets written to boxBools.
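One way to narrow this down: Box.put returns a Future, so a diagnostic sketch that awaits the write and immediately reads the value back will tell you whether the write itself fails or the data is lost later (making the callback async is the only change to the listener; boxBools, randomList, and _player come from the question's code):

_player.positionStream.listen((event) async {
  if (event.inSeconds != 0) {
    final index = _player.currentIndex;
    if (index == null) return;
    randomList[index] = true;
    await boxBools.put("randomPod", randomList); // wait for the write
    print("Read back => ${boxBools.get("randomPod")}"); // verify it landed
  }
});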
I am developing an app that can record and play sound. Everything works fine on my emulator, but when I run the app on a physical device, it always gives me this error:
java.lang.NullPointerException: Attempt to invoke virtual method 'long android.os.storage.StorageVolume.getMaxFileSize()' on a null object reference
The way I implement the recording feature is to record the audio to a temporary file and play back from it. Here is the corresponding code; I am using flutter_sound, by the way:
String pathToSaveAudio = '';

class SoundRecorder {
  FlutterSoundRecorder? _audioRecorder;
  bool _isRecordingInitialised = false;

  bool get isRecording => _audioRecorder!.isRecording;

  // getters
  bool getInitState() {
    return _isRecordingInitialised;
  }

  FlutterSoundRecorder? getRecorder() {
    return _audioRecorder;
  }

  /// init recorder
  Future<void> init() async {
    _audioRecorder = FlutterSoundRecorder();
    final status = await Permission.microphone.request();
    if (status != PermissionStatus.granted) {
      throw RecordingPermissionException('Microphone permission denied');
    }
    await _audioRecorder!.openRecorder();
    _isRecordingInitialised = true;
    var tempDir = await getTemporaryDirectory();
    pathToSaveAudio = '${tempDir.path}/audio.mp4';
  }

  /// dispose recorder
  void dispose() {
    _audioRecorder!.closeRecorder();
    _audioRecorder = null;
    _isRecordingInitialised = false;
  }

  /// start recording
  Future<void> _record() async {
    assert(_isRecordingInitialised);
    await _audioRecorder!
        .startRecorder(toFile: pathToSaveAudio, codec: Codec.aacMP4);
  }

  /// stop recording
  Future<void> _stop() async {
    if (!_isRecordingInitialised) return;
    await _audioRecorder!.stopRecorder();
  }
}
I think what I'm doing is recording the sound into a file in that temp directory, but apparently the app wants to access the file before I even start recording. At this point I don't know what to do; please help.
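To narrow down where it fails, here is a diagnostic sketch that verifies the temporary directory exists before starting and surfaces the native exception; it reuses the question's path and codec, and recordSafely is a hypothetical helper, not part of the original code:

import 'dart:io';
import 'package:flutter/services.dart';
import 'package:flutter_sound/flutter_sound.dart';
import 'package:path_provider/path_provider.dart';

Future<void> recordSafely(FlutterSoundRecorder recorder) async {
  final tempDir = await getTemporaryDirectory();
  // Confirm the directory is actually reachable on this device.
  if (!await Directory(tempDir.path).exists()) {
    print('Temporary directory is missing: ${tempDir.path}');
    return;
  }
  try {
    await recorder.startRecorder(
      toFile: '${tempDir.path}/audio.mp4',
      codec: Codec.aacMP4,
    );
  } on PlatformException catch (e) {
    // Shows which native call throws the NullPointerException.
    print('startRecorder failed: ${e.message}');
  }
}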
I am working on a Flutter Web project that involves a speech-to-text component. I intend to use Google's Speech-to-Text API (https://cloud.google.com/speech-to-text/docs). One of my critical requirements is to use the API's single-utterance capability to recognize automatically when a speaker is done speaking. This requires streaming audio directly to Google's API from the Flutter client so that I can receive that event and act on it. I am using the google_speech Dart plugin (https://pub.dev/packages/google_speech) and am fairly certain it will meet my needs.
What I am struggling with is finding an implementation that works in Flutter Web and can record audio to a stream that I can then send to Google.
So far the only one I can find that seems to meet my needs is the flutter_sound plugin (https://pub.flutter-io.cn/packages/flutter_sound), as it claims to support Flutter Web and can record to a Dart Stream without using a File. I have an initial implementation, but I seem to be missing something somewhere, as the library seems to hang.
Here is my implementation so far:
class _LandingPageState extends State<LandingPage> {
  FlutterSoundRecorder _mRecorder = FlutterSoundRecorder();
  String _streamText = 'not yet recognized';
  final String interviewID;

  _LandingPageState(this.interviewID);

  @override
  Widget build(BuildContext context) {
    // Build method works fine
  }
  // Called elsewhere to start the recognition
  _submit() async {
    // Do some stuff
    await recognize();
    // Do some other stuff
  }
  Future recognize() async {
    // Set up the Google Speech (google_speech plugin) recognition apparatus
    String serviceAccountPath = await rootBundle
        .loadString('PATH TO MY SERVICE ACCOUNT CREDENTIALS');
    final serviceAccount = ServiceAccount.fromString(serviceAccountPath);
    final speechToText = SpeechToText.viaServiceAccount(serviceAccount);
    final config = _getConfig();

    // Create the stream controller (flutter_sound plugin)
    final recordingDataController = StreamController<Food>();

    // Start the recording and specify what stream sink it is using,
    // which is the above stream controller's sink
    await _mRecorder.startRecorder(
      toStream: recordingDataController.sink,
      codec: Codec.pcm16,
      numChannels: 1,
      sampleRate: 44000,
    );

    // Set up the recognition stream and pass it the stream
    final responseStream = speechToText.streamingRecognize(
      StreamingRecognitionConfig(
        config: config,
        interimResults: true,
        singleUtterance: true,
      ),
      recordingDataController.stream,
    );
    responseStream.listen((data) {
      setState(() {
        _streamText =
            data.results.map((e) => e.alternatives.first.transcript).join('\n');
      });
    }, onDone: () {
      setState(() {
        print("STOP LISTENING");
        print("STREAM TEXT = ");
        print("--------------------------------------");
        print(_streamText);
        print("--------------------------------------");
        // Stop listening to the mic
        recordingDataController.close();
      });
    });
  }
  init() async {
    await Future.delayed(Duration(seconds: 1));
    await _submit();
  }

  @override
  void initState() {
    super.initState();
    _openRecorder();
  }

  @override
  void dispose() {
    super.dispose();
    _stopRecorder();
  }
  RecognitionConfig _getConfig() => RecognitionConfig(
      encoding: AudioEncoding.LINEAR16,
      model: RecognitionModel.basic,
      enableAutomaticPunctuation: true,
      sampleRateHertz: 16000,
      languageCode: 'en-US');

  Future<void> _openRecorder() async {
    // These Permission calls don't seem to work on Flutter Web
    // var status = await Permission.microphone.request();
    // if (status != PermissionStatus.granted) {
    //   throw RecordingPermissionException('Microphone permission not granted');
    // }
    await _mRecorder.openAudioSession();
  }

  Future<void> _stopRecorder() async {
    await _mRecorder.stopRecorder();
  }
}
When I debug this, the library hangs on starting the recorder with the message "Waiting for the recorder being opened", and it just waits there forever. I've tried to debug the library, but it is unclear what is going on. I worry that this library does not support Flutter Web after all. Could the library be hanging because microphone permissions have not been granted?
I've been using this example for flutter_sound to implement this: https://github.com/Canardoux/tau/blob/master/flutter_sound/example/lib/recordToStream/record_to_stream_example.dart
Is there another library or approach that supports recording audio to a Dart stream in Flutter Web?
The problem is that to record to a stream you must use the PCM16 codec, but this codec cannot be used to record audio in the browser. See the associated documentation: https://tau.canardoux.xyz/guides_codec.html
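You can also confirm this at runtime: FlutterSoundRecorder exposes isEncoderSupported, so a quick check before starting the recorder avoids the silent hang. A sketch, assuming an already-opened recorder (the helper name is mine):

Future<bool> canRecordPcm16(FlutterSoundRecorder recorder) async {
  // Expected to be false in browsers, which is why
  // startRecorder(toStream: ...) never gets going on Flutter Web.
  final supported = await recorder.isEncoderSupported(Codec.pcm16);
  if (!supported) {
    print('PCM16 recording is not supported on this platform.');
  }
  return supported;
}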
I use audioplayers to listen to a radio channel (streaming) in Flutter. It works well in the background, except when the user receives a call: the playing stops but doesn't restart automatically at the end of the call.
AudioPlayer audioPlayer = AudioPlayer();
const kUrl = "http://5.39.71.159:8865/stream";

Future play() async {
  int result = await audioPlayer.play(kUrl);
  setState(() {
    playerState = PlayerState.playing;
  });
}

Future stop() async {
  int result = await audioPlayer.stop();
}
How can I pause my radio when a call comes in and restart the playing at the end of the call?
Try this
audioPlayer.onPlayerStateChanged.listen((AudioPlayerState s) {
  print('Current player state: $s');
  setState(() => playerState = s);
});
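Building on that, a sketch of resuming after an interruption. This follows the older audioplayers API used in the question; the _wantPlaying flag is hypothetical, and you would clear it whenever the user pauses manually so a deliberate pause isn't overridden:

bool _wantPlaying = false; // set to true when the user starts the radio

void watchInterruptions() {
  audioPlayer.onPlayerStateChanged.listen((AudioPlayerState s) {
    // If the OS paused us (e.g. an incoming call) while we still want
    // playback, try to resume as soon as the player reports PAUSED.
    if (_wantPlaying && s == AudioPlayerState.PAUSED) {
      audioPlayer.resume();
    }
  });
}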