I'm building a Flutter app that uses flutter_sound for voice recording and playback.
It works fine on Android devices, but not on iOS.
Right after launching the app, I play a sound fetched from Firebase, but there is no sound at all. However, after I record a new voice clip, the previously fetched sound plays properly.
This is the code for playing a sound:
void startPlay() {
  isPlaying = true;
  _player = FlutterSoundPlayer();
  _player!.openPlayer();
  _player!.startPlayer(
      fromDataBuffer: widget.bodyBytes,
      codec: Codec.pcm16,
      sampleRate: 48000,
      whenFinished: () {
        isPlaying = false;
        if (mounted) {
          setState(() {});
        }
      });
  setState(() {});
}
This is the code for recording:
Future<bool> record() async {
  var status = await Permission.microphone.request();
  if (status != PermissionStatus.granted) {
    status = await Permission.microphone.request();
    return false;
  }
  recordingDataController = StreamController<Food>.broadcast();
  // write to data
  _sink = File(_tempFilePath!).openWrite();
  recordingDataSubscriptor = recordingDataController!.stream.listen((buffer) {
    if (buffer is FoodData) {
      _sink.add(buffer.data!);
    }
  });
  await _recorder!.openRecorder();
  await _recorder!.startRecorder(
    toStream: recordingDataController!.sink,
    codec: Codec.pcm16,
    numChannels: 1,
    sampleRate: 48000,
  );
  return true;
}
Please help me, love you all <3
The stream() method below is triggered for every camera frame:
stream() async {
  _isStreaming = true;
  var counter = 0;
  channel = WebSocketChannel.connect(
    Uri.parse('ws://192.168.11.8:8000/real-time/'),
  ) as IOWebSocketChannel?;
  _cameraController?.startImageStream((CameraImage img) async {
    final imageBytes = await convertYUV420toImageColor(img);
    if (_isStreaming) {
      channel?.sink.add(imageBytes);
    }
  });
  channel?.stream.listen((message) {
    print(message);
    var jsonDecodex = jsonDecode(message);
    var prediction = jsonDecodex['prediction'];
    var predictProba = jsonDecodex['predict_proba'];
    _prediction = prediction;
    _accuracy = predictProba + "%";
    if (mounted) {
      setState(() {});
    }
  });
}
I'm working on a Flutter app that opens the camera stream and sends every frame to a Django server via WebSockets. However, after roughly a minute of streaming, an 'out of memory' error occurs.
I am developing an app that can record and play sound. Everything works fine on my emulator. But when I try to run the app on my device, it always gives me this error:
java.lang.NullPointerException: Attempt to invoke virtual method 'long android.os.storage.StorageVolume.getMaxFileSize()' on a null object reference
The way I implement the recording feature is to record the audio to a temporary file and play it back from there. This is the corresponding code (I am using flutter_sound, by the way):
String pathToSaveAudio = '';

class SoundRecorder {
  FlutterSoundRecorder? _audioRecorder;
  bool _isRecordingInitialised = false;

  bool get isRecording => _audioRecorder!.isRecording;

  // getters
  bool getInitState() {
    return _isRecordingInitialised;
  }

  FlutterSoundRecorder? getRecorder() {
    return _audioRecorder;
  }

  /// init recorder
  Future init() async {
    _audioRecorder = FlutterSoundRecorder();
    final status = await Permission.microphone.request();
    if (status != PermissionStatus.granted) {
      throw RecordingPermissionException('Microphone permission denied');
    }
    await _audioRecorder!.openRecorder();
    _isRecordingInitialised = true;
    var tempDir = await getTemporaryDirectory();
    pathToSaveAudio = '${tempDir.path}/audio.mp4';
  }

  /// dispose recorder
  void dispose() {
    _audioRecorder!.closeRecorder();
    _audioRecorder = null;
    _isRecordingInitialised = false;
  }

  /// start record
  Future _record() async {
    assert(_isRecordingInitialised);
    await _audioRecorder!
        .startRecorder(toFile: pathToSaveAudio, codec: Codec.aacMP4);
  }

  /// stop record
  Future _stop() async {
    if (!_isRecordingInitialised) return;
    await _audioRecorder!.stopRecorder();
  }
}
I think what I'm doing is recording the sound into a file in that temp directory. But apparently the app wants to access the file before I even start recording. At this point I don't know what to do; please help me.
I have a list of songs on the song screen. If the user taps an item in the list, I call loadFirstPlaylist() to load the list of songs (all songs in the album) into the queue, then skip to the tapped item in the queue and play. It works on Android, but I get the following error on iOS.
GitHub Source Code
[NowPlaying] [MRNowPlaying] Ignoring setPlaybackState because application does not contain entitlement com.apple.mediaremote.set-playback-state for platform
Future<void> loadFirstPlayList(List<MediaItem> playlist, int index) async {
  await emptyPlaylist();
  if (playlist.isNotEmpty) {
    await _audioHandler.addQueueItems(playlist);
    await _audioHandler.skipToQueueItem(index);
    await _audioHandler.play();
  }
}
Audio Handler Methods
@override
Future<void> addQueueItems(List<MediaItem> mediaItems) async {
  // manage Just Audio
  final audioSource = mediaItems.map(_createAudioSource);
  _playlist.addAll(audioSource.toList());
  // notify system
  final newQueue = queue.value..addAll(mediaItems);
  queue.add(newQueue);
}

@override
Future<void> skipToQueueItem(int index) async {
  if (index < 0 || index >= queue.value.length) return;
  if (_player.shuffleModeEnabled) {
    index = _player.shuffleIndices![index];
  }
  _player.seek(Duration.zero, index: index);
}

@override
Future<void> play() => _player.play();
I do not know if you have figured this out already, but I had the same issue when trying to load an empty playlist. I was following this example, which includes a _loadEmptyPlaylist method; when I implemented it, it caused the player to fail silently. It now seems to be working since I stopped calling loadAudioSource on an empty audio sequence.
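In case it helps, here is a minimal sketch of that guard, assuming the just_audio player and ConcatenatingAudioSource fields from that tutorial (_player and _playlist; the helper name _loadPlaylistIfNotEmpty is mine), so adapt it to your own handler:

import 'package:just_audio/just_audio.dart';

class MyAudioHandler /* extends BaseAudioHandler in the real app */ {
  final AudioPlayer _player = AudioPlayer();
  final ConcatenatingAudioSource _playlist =
      ConcatenatingAudioSource(children: []);

  // Only hand the source to the player once it actually has children;
  // loading it while still empty is what made the player fail silently for me.
  Future<void> _loadPlaylistIfNotEmpty() async {
    if (_playlist.children.isEmpty) return;
    try {
      await _player.setAudioSource(_playlist);
    } catch (e) {
      print('Error loading audio source: $e');
    }
  }
}

Calling this after addQueueItems, rather than once at startup, means the player is never given an empty source.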
Well, I'm stuck on this problem. I have code for an audio service (audioplayer.dart) that takes a queue to play. In audioplayer.dart I get the queue from playlist.dart using ModalRoute and save it in a global variable queue. Then I initialize the AudioPlayerService. Everything up to here is fine, but inside the AudioPlayerTask class, which extends BackgroundAudioTask, when I try to access the variable (inside onStart) it comes out to be an empty list. I don't know where the problem is, and I'm not very familiar with the BackgroundAudioTask class. Here's what it looks like:
import .....

List<MediaItem> queue = [];

class TempScreen extends StatefulWidget {
  @override
  _TempScreenState createState() => _TempScreenState();
}

class _TempScreenState extends State<TempScreen> {
  @override
  Widget build(BuildContext context) {
    queue = ModalRoute.of(context).settings.arguments;
    // NOW HERE THE QUEUE IS FINE
    return Container(.....all ui code);
  }

  // I'm using this button to start the service
  audioPlayerButton() {
    AudioService.start(
      backgroundTaskEntrypoint: _audioPlayerTaskEntrypoint,
      androidNotificationChannelName: 'Audio Service Demo',
      androidNotificationColor: 0xFF2196f3,
      androidNotificationIcon: 'mipmap/ic_launcher',
      androidEnableQueue: true,
    );
    AudioService.updateQueue(queue);
    print('updated queue at the start');
    print('queue now is $queue');
    AudioService.setRepeatMode(AudioServiceRepeatMode.none);
    AudioService.setShuffleMode(AudioServiceShuffleMode.none);
    AudioService.play();
  }
}

void _audioPlayerTaskEntrypoint() async {
  AudioServiceBackground.run(() => AudioPlayerTask());
}
class AudioPlayerTask extends BackgroundAudioTask {
  AudioPlayer _player = AudioPlayer();
  Seeker _seeker;
  StreamSubscription<PlaybackEvent> _eventSubscription;
  String kUrl = '';
  String key = "38346591";
  String decrypt = "";
  String preferredQuality = '320';

  int get index => _player.currentIndex == null ? 0 : _player.currentIndex;
  MediaItem get mediaItem => index == null ? queue[0] : queue[index];

  // This is just a function I'm using to get song URLs
  fetchSongUrl(songId) async {
    print('starting fetching url');
    String songUrl =
        "https://www.jiosaavn.com/api.php?app_version=5.18.3&api_version=4&readable_version=5.18.3&v=79&_format=json&__call=song.getDetails&pids=" +
            songId;
    var res = await get(songUrl, headers: {"Accept": "application/json"});
    var resEdited = (res.body).split("-->");
    var getMain = jsonDecode(resEdited[1]);
    kUrl = await DesPlugin.decrypt(
        key, getMain[songId]["more_info"]["encrypted_media_url"]);
    kUrl = kUrl.replaceAll('96', '$preferredQuality');
    print('fetched url');
    return kUrl;
  }
  @override
  Future<void> onStart(Map<String, dynamic> params) async {
    print('inside onStart of audioPlayertask');
    print('queue now is $queue');
    // NOW HERE QUEUE COMES OUT TO BE AN EMPTY LIST
    final session = await AudioSession.instance;
    await session.configure(AudioSessionConfiguration.speech());
    if (queue.length == 0) {
      print('queue is found to be null.........');
    }
    _player.currentIndexStream.listen((index) {
      if (index != null) AudioServiceBackground.setMediaItem(queue[index]);
    });
    // Propagate all events from the audio player to AudioService clients.
    _eventSubscription = _player.playbackEventStream.listen((event) {
      _broadcastState();
    });
    // Special processing for state transitions.
    _player.processingStateStream.listen((state) {
      switch (state) {
        case ProcessingState.completed:
          AudioService.currentMediaItem != queue.last
              ? AudioService.skipToNext()
              : AudioService.stop();
          break;
        case ProcessingState.ready:
          break;
        default:
          break;
      }
    });
    // Load and broadcast the queue
    print('queue is');
    print(queue);
    print('Index is $index');
    print('MediaItem is');
    print(queue[index]);
    try {
      if (queue[index].extras == null) {
        queue[index] = queue[index].copyWith(extras: {
          'URL': await fetchSongUrl(queue[index].id),
        });
      }
      await AudioServiceBackground.setQueue(queue);
      await _player.setUrl(queue[index].extras['URL']);
      onPlay();
    } catch (e) {
      print("Error: $e");
      onStop();
    }
  }
  @override
  Future<void> onSkipToQueueItem(String mediaId) async {
    // Then default implementations of onSkipToNext and onSkipToPrevious will
    // delegate to this method.
    final newIndex = queue.indexWhere((item) => item.id == mediaId);
    if (newIndex == -1) return;
    _player.pause();
    if (queue[newIndex].extras == null) {
      queue[newIndex] = queue[newIndex].copyWith(extras: {
        'URL': await fetchSongUrl(queue[newIndex].id),
      });
      await AudioServiceBackground.setQueue(queue);
      // AudioService.updateQueue(queue);
    }
    await _player.setUrl(queue[newIndex].extras['URL']);
    _player.play();
    await AudioServiceBackground.setMediaItem(queue[newIndex]);
  }

  @override
  Future<void> onUpdateQueue(List<MediaItem> queue) {
    AudioServiceBackground.setQueue(queue = queue);
    return super.onUpdateQueue(queue);
  }
  @override
  Future<void> onPlay() => _player.play();

  @override
  Future<void> onPause() => _player.pause();

  @override
  Future<void> onSeekTo(Duration position) => _player.seek(position);

  @override
  Future<void> onFastForward() => _seekRelative(fastForwardInterval);

  @override
  Future<void> onRewind() => _seekRelative(-rewindInterval);

  @override
  Future<void> onSeekForward(bool begin) async => _seekContinuously(begin, 1);

  @override
  Future<void> onSeekBackward(bool begin) async => _seekContinuously(begin, -1);

  @override
  Future<void> onStop() async {
    await _player.dispose();
    _eventSubscription.cancel();
    await _broadcastState();
    // Shut down this task
    await super.onStop();
  }
  Future<void> _seekRelative(Duration offset) async {
    var newPosition = _player.position + offset;
    // Make sure we don't jump out of bounds.
    if (newPosition < Duration.zero) newPosition = Duration.zero;
    if (newPosition > mediaItem.duration) newPosition = mediaItem.duration;
    // Perform the jump via a seek.
    await _player.seek(newPosition);
  }

  void _seekContinuously(bool begin, int direction) {
    _seeker?.stop();
    if (begin) {
      _seeker = Seeker(_player, Duration(seconds: 10 * direction),
          Duration(seconds: 1), mediaItem)
        ..start();
    }
  }
  /// Broadcasts the current state to all clients.
  Future<void> _broadcastState() async {
    await AudioServiceBackground.setState(
      controls: [
        MediaControl.skipToPrevious,
        if (_player.playing) MediaControl.pause else MediaControl.play,
        MediaControl.stop,
        MediaControl.skipToNext,
      ],
      systemActions: [
        MediaAction.seekTo,
        MediaAction.seekForward,
        MediaAction.seekBackward,
      ],
      androidCompactActions: [0, 1, 3],
      processingState: _getProcessingState(),
      playing: _player.playing,
      position: _player.position,
      bufferedPosition: _player.bufferedPosition,
      speed: _player.speed,
    );
  }
  AudioProcessingState _getProcessingState() {
    switch (_player.processingState) {
      case ProcessingState.idle:
        return AudioProcessingState.stopped;
      case ProcessingState.loading:
        return AudioProcessingState.connecting;
      case ProcessingState.buffering:
        return AudioProcessingState.buffering;
      case ProcessingState.ready:
        return AudioProcessingState.ready;
      case ProcessingState.completed:
        return AudioProcessingState.completed;
      default:
        throw Exception("Invalid state: ${_player.processingState}");
    }
  }
}
This is the full code for the AudioService, in case it's needed.
(Answer update: since v0.18, this sort of pitfall no longer exists because the UI and background code run in a shared isolate. The answer below is only relevant for v0.17 and earlier.)
audio_service runs your BackgroundAudioTask in a separate isolate. In the README, it is put this way:
Note that your UI and background task run in separate isolates and do not share memory. The only way they communicate is via message passing. Your Flutter UI will only use the AudioService API to communicate with the background task, while your background task will only use the AudioServiceBackground API to interact with the UI and other clients.
The key point there is that isolates do not share memory. If you set a "global" variable in the UI isolate, it will not be set in the background isolate, because the background isolate has its own separate block of memory. That is why your global queue variable comes out empty in the background task. It is not actually the same variable: you now have two copies of it, one in the UI isolate which has been given a value, and the other in the background isolate which has not (yet) been given one.
Now, your background isolate does "later" set its own copy of the queue variable, and this happens via the message passing API: you pass the queue from the UI isolate into updateQueue, and the background isolate receives that message and stores it into its own copy of the variable in onUpdateQueue. If you were to print out the queue after this point, it would no longer be empty.
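To see this in isolation from audio_service, here is a minimal, self-contained sketch using plain dart:isolate (names are purely illustrative): a top-level variable assigned in the main isolate still has its initial value in the spawned isolate, because each isolate gets its own copy.

import 'dart:isolate';

List<String> queue = [];

void backgroundEntrypoint(SendPort sendPort) {
  // This isolate has its own copy of `queue`, still the initial empty list.
  sendPort.send('background sees queue = $queue');
}

Future<void> main() async {
  queue = ['song1', 'song2']; // set in the main ("UI") isolate only
  final receivePort = ReceivePort();
  await Isolate.spawn(backgroundEntrypoint, receivePort.sendPort);
  print(await receivePort.first); // prints: background sees queue = []
}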
There is also a line in your onStart where you are attempting to set the queue; you should probably delete that code and let the queue be set only in onUpdateQueue. You should not attempt to access the queue in onStart, since it won't receive its value until onUpdateQueue runs. If you want to avoid any null pointer exception before it's set, you can initialise the queue in the background isolate to an empty list; it will eventually be replaced by a non-empty list in onUpdateQueue without ever being null.
I would also suggest you avoid making queue a global variable. Global variables are generally bad, and in this case the global may be confusing you into thinking that the queue variable is the same in both the UI and the background isolate, when in reality each isolate has its own copy of it, possibly with different values. Your code will be clearer if you use two separate "local" variables: one inside the UI and one inside the background task.
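For example, a minimal v0.17-style sketch of the background side (illustrative, not your full task) would keep its own queue field and only populate it when onUpdateQueue delivers the message from the UI isolate:

class AudioPlayerTask extends BackgroundAudioTask {
  // This copy lives in the background isolate only; it starts out empty and
  // is filled when the UI isolate calls AudioService.updateQueue(...).
  List<MediaItem> _queue = [];

  @override
  Future<void> onUpdateQueue(List<MediaItem> queue) async {
    _queue = queue; // store the copy sent across from the UI isolate
    await AudioServiceBackground.setQueue(_queue); // broadcast it to clients
  }
}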
One more suggestion: note that the methods in the message passing API are asynchronous. You should wait for the audio service to start before you send messages to it (such as setting the queue), and you should wait for the queue to be set before you try to play from it:
await AudioService.start(....);
// Now the service has started, it is safe to send messages.
await AudioService.updateQueue(...);
// Now the queue has been updated, it is safe to play from it.
await AudioService.play();