I'm trying to use the audioplayers plugin for my recorded audio (it works perfectly on Android), but when I try to play the audio on iOS I get this error:
"iOS => call setVolume, playerId 57c0b5cc-3c75-4f8f-bf99-ad8ddbcb7709"
"iOS => call setUrl, playerId 57c0b5cc-3c75-4f8f-bf99-ad8ddbcb7709"
2022-06-23 18:27:00.168194-0300 Runner[16264:695114] flutter: AVPlayerItem.Status.failed
Any idea what I can do to solve this problem? Here is my code:
await audioPlayer2.setVolume(1.0);
await audioPlayer2.setUrl(filepath, isLocal: true);
await audioPlayer2.play(filepath, isLocal: true);

audioPlayer2.onPlayerStateChanged.listen((event) {
  setState(() {
    _isPlaying = false;
  });
});
Instead of using setUrl and then play, I only used this:
final _audioPlayer = AudioPlayer()..setReleaseMode(ReleaseMode.stop);
await _audioPlayer.play(UrlSource(widget.postData.audioURL));
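If the file being played is a local recording rather than a remote URL, the newer audioplayers API (1.0+) has a DeviceFileSource for that. A minimal sketch, assuming filepath points at an existing file on the device:

final audioPlayer = AudioPlayer()..setReleaseMode(ReleaseMode.stop);

// The old isLocal flag is gone; the source type now carries that information.
await audioPlayer.play(DeviceFileSource(filepath));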
You have to add capabilities in Xcode:
go to Xcode -> Runner -> add capabilities -> Inter-App Audio.
Thanks
If anyone can write the code to run on Flutter version 3.7.3, I would be very appreciative.
How can I download my audios from Firebase Storage in a Flutter app, show them in a ListView, and make a button to play them? For now I have like 10000 audios in my firebase_storage, please some help ❤
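Not an official answer, but a minimal sketch of the Firebase Storage + ListView idea from the comment above, assuming an audios/ folder in the default bucket; with 10000 files you would want to paginate with list() rather than use listAll():

import 'package:audioplayers/audioplayers.dart';
import 'package:firebase_storage/firebase_storage.dart';
import 'package:flutter/material.dart';

class AudioListPage extends StatefulWidget {
  const AudioListPage({super.key});

  @override
  State<AudioListPage> createState() => _AudioListPageState();
}

class _AudioListPageState extends State<AudioListPage> {
  final AudioPlayer _player = AudioPlayer();
  late final Future<List<String>> _urls = _loadAudioUrls();

  // Fetch a download URL for every file under the (assumed) audios/ folder.
  Future<List<String>> _loadAudioUrls() async {
    final ListResult result =
        await FirebaseStorage.instance.ref('audios').listAll();
    return Future.wait(result.items.map((ref) => ref.getDownloadURL()));
  }

  @override
  Widget build(BuildContext context) {
    return FutureBuilder<List<String>>(
      future: _urls,
      builder: (context, snapshot) {
        if (!snapshot.hasData) {
          return const Center(child: CircularProgressIndicator());
        }
        final urls = snapshot.data!;
        return ListView.builder(
          itemCount: urls.length,
          itemBuilder: (context, index) => ListTile(
            title: Text('Audio ${index + 1}'),
            trailing: IconButton(
              icon: const Icon(Icons.play_arrow),
              // Stream the selected file straight from its download URL.
              onPressed: () => _player.play(UrlSource(urls[index])),
            ),
          ),
        );
      },
    );
  }
}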
If you are getting a URL or an audio file, you can simply pass the URL of your audio file to the audioplayers package.
Code:
AudioPlayer audioPlayer = AudioPlayer(mode: PlayerMode.LOW_LATENCY);
play() async {
  int result = await audioPlayer.play(url);
  if (result == 1) {
    // success
  }
}
For more control and customization you can also use the audio_service package; a sketch follows below.
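A minimal sketch of wiring a player into audio_service (0.18.x), assuming a recent audioplayers release for the underlying player; the notification channel id and name are placeholders:

import 'package:audio_service/audio_service.dart';
import 'package:audioplayers/audioplayers.dart';

// Exposes the player to the system (lock screen / notification controls).
class MyAudioHandler extends BaseAudioHandler {
  final AudioPlayer _player = AudioPlayer();

  Future<void> setUrl(String url) => _player.setSourceUrl(url);

  @override
  Future<void> play() => _player.resume();

  @override
  Future<void> pause() => _player.pause();

  @override
  Future<void> stop() => _player.stop();
}

Future<AudioHandler> initAudioService() {
  return AudioService.init(
    builder: () => MyAudioHandler(),
    config: const AudioServiceConfig(
      // Placeholder values; use your own channel id and name.
      androidNotificationChannelId: 'com.example.app.audio',
      androidNotificationChannelName: 'Audio playback',
    ),
  );
}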
Recently I have been using a package named flutter_sound v9.1.7. Here is some of the code.
String _mPath = 'tau_file.mp4';
Codec _codec = Codec.aacMP4;
File? file;
FlutterSoundPlayer? _mPlayer = FlutterSoundPlayer();
FlutterSoundRecorder? _mRecorder = FlutterSoundRecorder();
void record() async {
  _mRecorder!
      .startRecorder(
        toFile: _mPath,
        codec: _codec,
        audioSource: AudioSource.microphone,
      )
      .then((value) {});
  setState(() {
    recording = true;
  });
}
I have succeeded in recording and playing audio, but when I finish recording and try to analyze the resulting audio file tau_file.mp4, for example to get the length of the file, an error occurs:
Cannot retrieve length of file, path = 'tau_file.mp4' (OS Error: No such file or directory, errno = 2).
The analysis code is here:
file = File(_mPath);
print(file?.path);
print(file?.absolute);
print(await file?.length());
I tried to find an answer in the source code, but only found an interface... So is the audio really written to the file tau_file.mp4? Or is my analysis process wrong?
This is the first time I have used flutter_sound. Thanks for your help.
void stopRecorder() async {
  await _mRecorder!.stopRecorder().then((value) {
    setState(() {
      //var url = value;
      recordedUrl = value;
      debugPrint('path : -------- $recordedUrl');
      _mplaybackReady = true;
    });
  });
}
This is the URL of your recorded file; you get it when you stop the recording.
The documentation includes this code. It gives you //var url = value;. Then you can handle it, e.g. var recordedFile = File(url);. It's not a temporary one; you can upload it somewhere or do whatever you like with it.
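To connect this back to the original question, a small sketch (assuming recordedUrl holds the path returned by stopRecorder) of analyzing the finished recording instead of the relative name 'tau_file.mp4':

import 'dart:io';

Future<void> analyzeRecording(String recordedUrl) async {
  // Use the absolute path returned by stopRecorder(), not the relative name
  // that was passed to startRecorder().
  final File recordedFile = File(recordedUrl);
  if (await recordedFile.exists()) {
    final int lengthInBytes = await recordedFile.length();
    print('Recorded file: ${recordedFile.path}, $lengthInBytes bytes');
  }
}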
I recently got started with the Flutter package cast in order to communicate with Chromecast devices, but I couldn't find any details on how to use it. If you could give me some help in actually playing a media file such as a song or a video, that would be wholesome!
My current code:
CastSession session;

Future<void> _connect(BuildContext context, CastDevice object) async {
  session = await CastSessionManager().startSession(object);

  session.stateStream.listen((state) {
    if (state == CastSessionState.connected) {
      // Close my custom GUI
      Navigator.pop(context);
      _sendMessage(session);
    }
  });

  session.messageStream.listen((message) {
    print('receive message: $message');
  });
}

// My video playing code
session.sendMessage(CastSession.kNamespaceReceiver, {
  'type': 'MEDIA',
  'link': 'http://somegeneratedurl.com',
});
Ok, so I found an answer to my question. There is unfortunately no built-in command in the cast package to play a video file; I've looked through the Google Cast protocol reference and the package exposes no command for playing video files. I found this package that can cast videos, and I'm going to use that package instead.
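For anyone who wants to stay with the cast package, the underlying Cast v2 protocol itself does define media playback: you launch the default media receiver and then send a LOAD request on the media namespace. A rough sketch, assuming session.sendMessage accepts a raw namespace string and that the package routes the message to the launched receiver app (which may depend on the package version):

// Ask the Chromecast to start the default media receiver app.
session.sendMessage(CastSession.kNamespaceReceiver, {
  'type': 'LAUNCH',
  'appId': 'CC1AD845', // id of the default media receiver
  'requestId': 1,
});

// Once the receiver app is running, ask it to load and play a stream.
session.sendMessage('urn:x-cast:com.google.cast.media', {
  'type': 'LOAD',
  'autoplay': true,
  'requestId': 2,
  'media': {
    'contentId': 'http://somegeneratedurl.com/video.mp4',
    'contentType': 'video/mp4',
    'streamType': 'BUFFERED',
  },
});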
I have a page where the camera is initialized and ready, with a button that will record and stop the video, so I tried this:
FlatButton(
  onPressed: () => {
    !isRecording
        ? {
            setState(() {
              isRecording = true;
            }),
            cameraController.prepareForVideoRecording(),
            cameraController.startVideoRecording('assets/Videos/test.mp4')
          }
        : cameraController.stopVideoRecording(),
  },
............
but it throws this error: Unhandled Exception: CameraException(videoRecordingFailed, assets/Videos/test.mp4: open failed: ENOENT (No such file or directory)).
I don't understand; I don't want to open this file, I want to save it there. Is there something wrong with my code?
In the new version, the method startVideoRecording doesn't take any string parameter.
When you want to start the recording, just check whether a video is already being recorded; if not, start:
if (!_controller.value.isRecordingVideo) {
  _controller.startVideoRecording();
}
And when you want to finish the recording, you can call the method stopVideoRecording() and it will give you an object of the class XFile, which has the path to your video.
if (_controller.value.isRecordingVideo) {
  XFile videoFile = await _controller.stopVideoRecording();
  print(videoFile.path); // and there is more in this XFile object
}
This has worked for me. I am new to Flutter, so please improve my answer if you know more.
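As a possible follow-up (a sketch, not part of the original answer): the XFile usually lands in a temporary/cache directory, so it can be copied somewhere permanent, assuming path_provider is available:

import 'package:camera/camera.dart';
import 'package:path_provider/path_provider.dart';

Future<String> stopAndSaveRecording(CameraController controller) async {
  // Copy the finished recording out of the cache into the app documents folder.
  final XFile videoFile = await controller.stopVideoRecording();
  final appDirectory = await getApplicationDocumentsDirectory();
  final savedPath =
      '${appDirectory.path}/${DateTime.now().millisecondsSinceEpoch}.mp4';
  await videoFile.saveTo(savedPath);
  return savedPath;
}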
You are trying to save a video in your assets folder, which is not possible.
What you need to do is save it to the device locally, either to common folders like Downloads or to the app directory.
Here is an example of how to go about it.
dependencies:
  path_provider:
A Flutter plugin for getting commonly used locations on host platform file systems, such as the temp and app data directories.
We will be saving the video to the app directory.
We need to get the path to the directory where the file is or will be. Usually a file is put in the application's documents directory, in the application's cache directory, or in external storage. To get the path easily and reduce the chance of typos, we can use path_provider:
import 'dart:io';

import 'package:camera/camera.dart';
import 'package:path_provider/path_provider.dart';

Future<String> _startVideoRecording() async {
  if (!controller.value.isInitialized) {
    return null;
  }
  // Do nothing if a recording is already in progress
  if (controller.value.isRecordingVideo) {
    return null;
  }

  // Get the storage path
  final Directory appDirectory = await getApplicationDocumentsDirectory();
  final String videoDirectory = '${appDirectory.path}/Videos';
  await Directory(videoDirectory).create(recursive: true);
  final String currentTime = DateTime.now().millisecondsSinceEpoch.toString();
  final String filePath = '$videoDirectory/$currentTime.mp4';

  try {
    await controller.startVideoRecording(filePath);
    videoPath = filePath;
  } on CameraException catch (e) {
    _showCameraException(e);
    return null;
  }

  // Gives you the path where the video was stored
  return filePath;
}
I'm using OneSignal notifications with my Flutter app. I've tested the app on three Samsung devices and notifications work perfectly on all of them when the app is in the foreground, in the background, and also when I swipe it away.
After that I tested the app on a Huawei device running EMUI 9.0.1.
The notifications only work if the app is active or in the background;
if I swipe it away I can't receive any notifications.
Any help would be greatly appreciated; I've been struggling with this for a long time. I'll post my code for setting up OneSignal below.
Future<void> initPlatformState() async {
  if (!mounted) return;

  OneSignal.shared.setLogLevel(OSLogLevel.verbose, OSLogLevel.none);
  OneSignal.shared.setRequiresUserPrivacyConsent(true);
  OneSignal.shared.consentGranted(true);

  var settings = {
    OSiOSSettings.autoPrompt: false,
    OSiOSSettings.promptBeforeOpeningPushUrl: true
  };

  OneSignal.shared.setNotificationReceivedHandler((notification) {
    this.setState(() {
      print('Notification received');
    });
  });

  OneSignal.shared
      .setNotificationOpenedHandler((OSNotificationOpenedResult result) {
    this.setState(() {
      newUrl = result.notification.payload.additionalData['url'].toString();
    });
    Navigator.of(context).pushReplacement(
        MaterialPageRoute(builder: (context) => WebNotification(newUrl)));
  });

  // NOTE: Replace with your own app ID from https://www.onesignal.com
  await OneSignal.shared
      .init("xxxx-xxxx-xxxx-xxxx-xxxx", iOSSettings: settings);

  OneSignal.shared
      .setInFocusDisplayType(OSNotificationDisplayType.notification);
  OneSignal.shared.inFocusDisplayType();
}
You need to set up a service extension. Take a look at our docs on Background Notifications. Also, consider Notification Behavior when designing your implementation.
Make sure you use the latest version of OneSignal.
For Huawei, use HMS if possible (OneSignal supports HMS).
In your root build.gradle, under buildscript, add the following two new lines to your existing repositories and dependencies sections:
buildscript {
    repositories {
        // ...
        maven { url 'https://plugins.gradle.org/m2/' } // Gradle Plugin Portal
    }
    dependencies {
        // ...
        // OneSignal-Gradle-Plugin
        classpath 'gradle.plugin.com.onesignal:onesignal-gradle-plugin:[0.12.8, 0.99.99]'
    }
}
Add the following to the top of your app/build.gradle:
apply plugin: 'com.onesignal.androidsdk.onesignal-gradle-plugin'
You also need to have the user add your app to the battery optimization ignore list; on Huawei this is called "protected apps".
In Flutter you can make a button that takes the user to the app settings (use this plugin: https://pub.dev/packages/app_settings), as in the sketch below.
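A small sketch of such a button, assuming a recent app_settings version (the exact method names vary between plugin versions, so check the package docs):

import 'package:app_settings/app_settings.dart';
import 'package:flutter/material.dart';

// Button that sends the user to this app's settings page, where they can
// mark the app as protected / excluded from battery optimization on Huawei.
class OpenSettingsButton extends StatelessWidget {
  const OpenSettingsButton({super.key});

  @override
  Widget build(BuildContext context) {
    return ElevatedButton(
      onPressed: () => AppSettings.openAppSettings(),
      child: const Text('Allow notifications in background'),
    );
  }
}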