Hello Flutter developers,
I am trying to play a beeping noise with a duration of 250 milliseconds at a frequency of 30 to 300 Hz. This is part of a simulated medical monitor. I have tried multiple audio packages, and my most recent attempt uses just_audio, which works well on every platform except desktop Safari. The app can be tested at benmed.org/sim (login required, unfortunately).
Here is the audio playlist:
final playlist = ConcatenatingAudioSource(children: [
AudioSource.uri(Uri.parse('asset:///assets/audio/beepfiles/98.wav')),
AudioSource.uri(Uri.parse('asset:///assets/audio/beepfiles/96.wav')),
AudioSource.uri(Uri.parse('asset:///assets/audio/beepfiles/94.wav')),
AudioSource.uri(Uri.parse('asset:///assets/audio/beepfiles/92.wav')),
]);
Here is the initialisation of the audio player:
Future<void> _initAudio() async {
await player.setLoopMode(LoopMode.one);
final session = await AudioSession.instance;
await session.configure(const AudioSessionConfiguration.speech());
try {
await player.setAudioSource(playlist);
} catch (e) {
debugPrint("Error loading audio source: $e");
}
}
And here is the function that plays the beeping sounds with delays for variation in heart rate:
void playBeeps(WidgetRef ref) async {
var obs = ref.watch(obsProvider);
obs.whenData((obs) async {
Future.doWhile(() async {
var obs2 = ref.read(testLocalObsProvider); // read, not watch, inside an async loop
int hr = obs2.heartRate;
int sats = obs2.saturationsO2;
bool audioIsMuted = ref.read(muteAudio);
int track = (50 - ((sats / 2).round())).clamp(0, 15);
await player.seek(Duration.zero, index: track);
await Future.delayed(
Duration(milliseconds: ((60 / hr) * 1000).round()));
player.play().catchError((e) {
debugPrint('$e');
});
await Future.delayed(const Duration(milliseconds: 250));
player.pause().catchError((e) {
debugPrint('$e');
});
return !audioIsMuted;
});
});
}
I have tried just_audio among many other audio packages, but nothing has worked.
I tried removing "/assets" from the file paths.
I tried both .mp3 and .wav files.
I wonder if the sequence of loading and playing the audio files is handled incorrectly by Safari.
Many thanks for your help with this issue.
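One thing worth checking for desktop Safari specifically: Safari blocks audio that was not started from a user gesture, so a loop that calls play() programmatically can stay silent even though the same code works on other platforms. Below is a minimal sketch of an "unlock" step, assuming the `player` and `playBeeps` from above; the button and its placement are hypothetical:

```dart
// Sketch: "unlock" audio on the first user tap before the beep loop starts.
// Assumes `player` is the existing just_audio AudioPlayer.
ElevatedButton(
  onPressed: () async {
    // Calling play() inside a gesture handler satisfies Safari's
    // autoplay policy; later programmatic play() calls then work.
    await player.setVolume(0);
    player.play();           // starts (silently) from the user gesture
    await player.pause();
    await player.setVolume(1);
    playBeeps(ref);          // now start the beep loop
  },
  child: const Text('Start monitor'),
)
```

After the first gesture-initiated play(), subsequent programmatic play()/pause() calls on the same player are generally allowed by the browser.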
Related
I am looping over a set of byte streams (TTS-generated voices), and it seems that awaiting just_audio's play() does not wait for the audio to finish playing.
When I add a 3-second sleep after the await call, all the audio clips are played, so it seems that awaiting play() does not work in this case.
Code (the last function is using the azure tts package for getting the byte stream):
final player = AudioPlayer();
...
playAudioByteStream(Uint8List? audio) async {
print('Play START');
await player.setAudioSource(ByteAudioSource(audio!), preload: false);
await player.play();
print('Play END');
}
...
class ByteAudioSource extends StreamAudioSource {
final Uint8List _buffer;
ByteAudioSource(this._buffer) : super(tag: 'MyAudioSource');
@override
Future<StreamAudioResponse> request([int? start, int? end]) async {
// Returning the stream audio response with the parameters
return StreamAudioResponse(
sourceLength: _buffer.length,
contentLength: (end ?? _buffer.length) - (start ?? 0),
offset: start ?? 0,
stream: Stream.fromIterable([_buffer.sublist(start ?? 0, end)]),
contentType: 'audio/mpeg',
);
}
}
...
Future<Uint8List> getAudioByteStream(String text, Voice voice) async {
TtsParams params = TtsParams(
voice: voice,
audioFormat: AudioOutputFormat.audio16khz32kBitrateMonoMp3,
rate: 1.0,
text: text);
final ttsResponse = await AzureTts.getTts(params) as AudioSuccess;
return ttsResponse.audio;
}
I'm building a Flutter app in which I use flutter_sound for voice recording and playing.
It works totally fine on Android devices but not on an iOS device.
Right after building the app, I play a sound fetched from Firebase, but there is no sound at all. However, after I record a new voice, the previously fetched sound plays properly.
This is the code for playing sound
void startPlay() async {
isPlaying = true;
_player = FlutterSoundPlayer();
await _player!.openPlayer(); // openPlayer must complete before startPlayer
await _player!.startPlayer(
fromDataBuffer: widget.bodyBytes,
codec: Codec.pcm16,
sampleRate: 48000,
whenFinished: () {
isPlaying = false;
if(mounted) {
setState(() {});
}
});
setState(() {});
}
This is the code for recording
Future<bool> record() async {
var status = await Permission.microphone.request();
if (status != PermissionStatus.granted) {
return false;
}
recordingDataController = StreamController<Food>.broadcast();
// write to data
_sink = File(_tempFilePath!).openWrite();
recordingDataSubscriptor = recordingDataController!.stream.listen((buffer) {
if (buffer is FoodData) {
_sink.add(buffer.data!);
}
});
await _recorder!.openRecorder();
await _recorder!.startRecorder(
toStream: recordingDataController!.sink,
codec: Codec.pcm16,
numChannels: 1,
sampleRate: 48000,
);
return true;
}
Please help me, Love you all <3
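One possibility worth ruling out: on iOS, playback can stay silent until an audio session has been configured and activated, and recording activates one as a side effect, which would match "sound only works after recording once". A sketch using the audio_session package (an assumption; it is not part of the code above), run once before startPlay():

```dart
import 'package:audio_session/audio_session.dart';

Future<void> initPlaybackSession() async {
  final session = await AudioSession.instance;
  // Configure and activate the audio session for speech playback on iOS.
  await session.configure(const AudioSessionConfiguration.speech());
  await session.setActive(true);
}
```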
I want to chain two other mp3 files, but only one mp3 file plays because the audio player does not play the mp3 files sequentially.
So I tried delaying each subsequent mp3 file by a growing offset, like this:
void getAudio() async{
var url = _API_AUDIO_PREFIX;
var path;
int duration = 0;
Map<String, dynamic> json = await server.postReq(data);
print(json["voices"]);
for (int i = 0; i < json["voices"].length; i++) {
path = url + json["voices"][i]["title"];
print(path);
duration = i * 500;
Future.delayed(const Duration(milliseconds: duration), (){
playAudio(path);
});
}
}
Future<void> playAudio(var url) async {
int result = await audioPlayer.play(url);
if (result == 1){
print("success");
}
//duration = await audioPlayer.getDuration();
}
But the milliseconds argument gives an error like this:
Evaluation of this constant expression throws an exception.
How do I fix this? And how do I make the mp3 files continue from one to the next?
I fixed the error
Evaluation of this constant expression throws an exception.
by removing const from const Duration(milliseconds: duration).
But it still doesn't continue to the next file, because by the time each delayed callback runs, the path variable has already changed to its last value. :(
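Both problems can be addressed at once by giving each iteration its own local path and waiting for each clip to finish before starting the next, instead of scheduling fixed delays. A sketch, assuming the pre-1.0 audioplayers API implied by int result = await audioPlayer.play(url), where onPlayerCompletion fires once per finished clip:

```dart
for (int i = 0; i < json["voices"].length; i++) {
  // A fresh variable per iteration: each await sees its own path,
  // not the last value of a variable shared across the whole loop.
  final path = url + json["voices"][i]["title"];
  await audioPlayer.play(path);               // starts this clip
  await audioPlayer.onPlayerCompletion.first; // waits until it finishes
}
```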
I'm using ImagePicker to upload videos, either picked from the gallery or captured with the camera.
The problem is that I don't want the video to exceed 1 minute. In gallery-picking mode I check the duration of the selected video and show a message if it is longer than 1 minute.
How can I do something like Retrica: open the camera, but with a limit on the video duration?
Use the maxDuration parameter provided by image_picker:
final PickedFile videoFile = await picker.getVideo(
source: ImageSource.camera,
maxDuration: const Duration(seconds: 60),
);
I don't think you can do this with ImagePicker, because that plugin captures video through the phone's default camera app, so you have no access to check or manage the duration while capturing; you only get the result after the user stops recording and returns to your application.
But if you use the camera plugin, you can: it captures video inside your own application, so you can check the video duration while the user is recording.
https://pub.dev/packages/camera
You can't control it with ImagePicker. If you want this feature, use the camera plugin
https://pub.dev/packages/camera
and use a timer to stop recording:
// Timer
timer = Timer.periodic(const Duration(seconds: 60), (Timer t) {
_onStopButtonPressed();
timer.cancel();
});
//stop recording when click on the button
void _onStopButtonPressed() {
setState(() {
buttonColor = Colors.white;
});
_stopVideoRecording().then((_) {
if (mounted) setState(() {});
});
timer.cancel(); //when user close it manually
}
// stop function
Future<void> _stopVideoRecording() async {
if (!controller.value.isRecordingVideo) {
return null;
}
try {
await controller.stopVideoRecording();
} on CameraException catch (e) {
_showCameraException(e);
return null;
}
}
Also, you can use the video_player plugin to replay the video after recording:
https://pub.dev/packages/video_player#-installing-tab-
I'm using image_picker to record from the camera, and my app is something like Snapchat,
but I couldn't limit the recording time to 5 seconds.
Is there any way to do that?
this link for the plugin https://pub.dev/packages/image_picker
code :
File _video;
Future getVideo() async {
var video = await ImagePicker.pickVideo(source: ImageSource.camera);
setState(() {
_video = video;
});
}
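If your image_picker version supports it, pickVideo also takes a maxDuration parameter, which asks the camera app to cap the recording length. A sketch against the same static API used above; enforcement can vary by platform camera app:

```dart
File _video;

Future getVideo() async {
  var video = await ImagePicker.pickVideo(
    source: ImageSource.camera,
    maxDuration: const Duration(seconds: 5), // request a 5 s cap
  );
  setState(() {
    _video = video;
  });
}
```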