I need to send a voice message to my contacts using Flutter. I have searched a lot, but I couldn't find anything on this. Is there any way to do it?
If you want to share a voice message in a chat (e.g. a Firebase chat), you can use flutter_sound: record a voice file, upload it to your cloud storage, and play it back with flutter_sound as well.
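A minimal sketch of that flow, assuming flutter_sound 9.x and firebase_storage are already set up (recordAndUpload and the 'voice_messages' path are illustrative, not an official API):
import 'dart:io';
import 'package:flutter_sound/flutter_sound.dart';
import 'package:firebase_storage/firebase_storage.dart';
final FlutterSoundRecorder _recorder = FlutterSoundRecorder();
// Record a voice message, upload it, and return the download URL
// to store in the chat message document.
Future<String> recordAndUpload() async {
  await _recorder.openRecorder();
  await _recorder.startRecorder(toFile: 'voice_message.aac');
  // ... wait until the user taps "send" ...
  final path = await _recorder.stopRecorder();
  final ref = FirebaseStorage.instance.ref('voice_messages/voice_message.aac');
  await ref.putFile(File(path!));
  return ref.getDownloadURL();
}
On the receiving side you can pass that URL to FlutterSoundPlayer's startPlayer(fromURI: url) to play the message back.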
You can use the audio_recorder package:
https://pub.dev/packages/audio_recorder
// Import the package
import 'package:audio_recorder/audio_recorder.dart';
Future<void> record() async {
  // Check permissions before starting
  bool hasPermissions = await AudioRecorder.hasPermissions;
  // Get the state of the recorder
  bool isRecording = await AudioRecorder.isRecording;
  // Start recording (_controller is a TextEditingController holding the target path)
  await AudioRecorder.start(
      path: _controller.text, audioOutputFormat: AudioOutputFormat.AAC);
  // Stop recording
  Recording recording = await AudioRecorder.stop();
  print("Path : ${recording.path}, Format : ${recording.audioOutputFormat}, "
      "Duration : ${recording.duration}, Extension : ${recording.extension}");
}
If anyone can write code that runs on Flutter 3.7.3, I would really appreciate it.
How can I download my audios from Firebase Storage in a Flutter app, show them in a ListView, and add a button to play them? For now I have about 10,000 audios in my Firebase Storage. Please, some help ❤
If you are getting a URL for the audio file, then you can simply pass that URL to the audioplayers package.
Code:
import 'package:audioplayers/audioplayers.dart';
// Pre-1.0 audioplayers API: play() takes the URL directly and returns 1 on success
AudioPlayer audioPlayer = AudioPlayer(mode: PlayerMode.LOW_LATENCY);
play() async {
  int result = await audioPlayer.play(url);
  if (result == 1) {
    // success
  }
}
For more control and customization you can also use audio_service.
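For the ListView part of the question above, here is a minimal sketch, assuming firebase_storage and the 1.x audioplayers API (play(UrlSource(url)) rather than the older play(url) shown above) and a hypothetical 'audios' folder. Note that listAll() fetches every reference at once; with ~10,000 files the paginated list() API is a better fit:
import 'package:flutter/material.dart';
import 'package:firebase_storage/firebase_storage.dart';
import 'package:audioplayers/audioplayers.dart';
class AudioList extends StatelessWidget {
  final AudioPlayer _player = AudioPlayer();
  // Fetch the references of all uploaded audio files
  Future<List<Reference>> _loadRefs() =>
      FirebaseStorage.instance.ref('audios').listAll().then((r) => r.items);
  @override
  Widget build(BuildContext context) {
    return FutureBuilder<List<Reference>>(
      future: _loadRefs(),
      builder: (context, snapshot) {
        if (!snapshot.hasData) return const CircularProgressIndicator();
        final refs = snapshot.data!;
        return ListView.builder(
          itemCount: refs.length,
          itemBuilder: (context, i) => ListTile(
            title: Text(refs[i].name),
            trailing: IconButton(
              icon: const Icon(Icons.play_arrow),
              onPressed: () async {
                // Resolve the download URL lazily, only when played
                final url = await refs[i].getDownloadURL();
                await _player.play(UrlSource(url));
              },
            ),
          ),
        );
      },
    );
  }
}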
Is it possible to save the processed image as a File?
Here is what I'm trying to do: our app has a KYC (Know Your Customer) flow, and we implemented face detection to make users do several poses. What I want is to save each pose as an image file and upload it to the database.
Example Scenario:
The app asks the user to smile > the user smiles > save the image.
Here is what I have right now:
Where the app checks if the user smiled
if (faces.isNotEmpty) {
  if (inputImage.inputImageData?.size != null &&
      inputImage.inputImageData?.imageRotation != null) {
    if (faces[0].smilingProbability! > 0.85) {
      await _getImg();
    }
  }
}
Then I call a function that stops the image stream and takes a picture. This works, but on some physical devices it crashes; and if I don't stop the image stream and call takePicture() right away, it crashes every time.
_getImg() async {
  setState(() {
    globalBusy = true;
  });
  // Stop the preview stream before capturing, otherwise takePicture()
  // crashes immediately
  await _controller.stopImageStream();
  var img = await _controller.takePicture();
  VerificationVarHandler.livelinesImgsPaths.add(img.path);
}
As you can see, this isn't the best approach, at least I think so. Maybe I could use the inputImage from _processCameraImage(), since it carries the image bytes? Then I could pass those bytes to a decoder and save them locally when I trigger a function. Or is there a more elegant way of achieving this?
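The bytes idea can work. Below is a minimal sketch, assuming an Android YUV420 CameraImage and the 3.x API of the image package; saveCameraImage is a hypothetical helper and the YUV-to-RGB constants are the standard conversion, not something from the camera plugin:
import 'dart:io';
import 'package:camera/camera.dart';
import 'package:image/image.dart' as img;
// Convert a YUV420 CameraImage to a JPEG file at the given path
Future<File> saveCameraImage(CameraImage cameraImage, String path) async {
  final width = cameraImage.width;
  final height = cameraImage.height;
  final yPlane = cameraImage.planes[0];
  final uPlane = cameraImage.planes[1];
  final vPlane = cameraImage.planes[2];
  final out = img.Image(width, height);
  for (int y = 0; y < height; y++) {
    for (int x = 0; x < width; x++) {
      final yIndex = y * yPlane.bytesPerRow + x;
      final uvIndex =
          (y ~/ 2) * uPlane.bytesPerRow + (x ~/ 2) * (uPlane.bytesPerPixel ?? 1);
      final yp = yPlane.bytes[yIndex];
      final up = uPlane.bytes[uvIndex];
      final vp = vPlane.bytes[uvIndex];
      // Standard YUV -> RGB conversion
      final r = (yp + 1.402 * (vp - 128)).clamp(0, 255).toInt();
      final g = (yp - 0.344136 * (up - 128) - 0.714136 * (vp - 128)).clamp(0, 255).toInt();
      final b = (yp + 1.772 * (up - 128)).clamp(0, 255).toInt();
      out.setPixelRgba(x, y, r, g, b);
    }
  }
  return File(path).writeAsBytes(img.encodeJpg(out));
}
This avoids stopping the image stream entirely: keep the latest CameraImage from _processCameraImage() and write it out when the pose check passes.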
You can check out this gfg article - https://www.geeksforgeeks.org/face-detection-in-flutter-using-firebase-ml-kit/
I am creating a streaming app for mobile phones: I launch the front camera and display its footage on the phone. The main purpose is to send that footage to my backend and then display it on a website.
How am I supposed to do this? Do I need some server for it?
I am using https://pub.dev/packages/flutter_webrtc
final RTCVideoRenderer _localRenderer = RTCVideoRenderer();
// Initialize the renderer, then attach the local camera/mic stream
await _localRenderer.initialize();
var stream = await navigator.mediaDevices
    .getUserMedia({'video': true, 'audio': true});
setState(() {
  _localRenderer.srcObject = stream;
});
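Yes, you need a server: WebRTC needs a signaling channel to exchange session descriptions and ICE candidates, and to show the stream on a website you either connect the browser as a direct peer or forward the stream to a media server (an SFU such as Janus or mediasoup). A minimal sketch of the sending side with flutter_webrtc, where signaling.send is a hypothetical stand-in for your own backend channel:
// Create a peer connection with a public STUN server
final pc = await createPeerConnection({
  'iceServers': [
    {'urls': 'stun:stun.l.google.com:19302'},
  ],
});
// Add the camera/mic tracks captured with getUserMedia above
for (final track in stream.getTracks()) {
  await pc.addTrack(track, stream);
}
// Create an SDP offer and hand it to the signaling server, which
// forwards it to the receiving peer (browser or SFU)
final offer = await pc.createOffer();
await pc.setLocalDescription(offer);
signaling.send({'type': 'offer', 'sdp': offer.sdp}); // hypothetical channel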
So I recently got started with the Flutter package cast in order to communicate with Chromecast devices, but I couldn't find any details on how to use it. If you could give me some help with actually playing a media file such as a song or a video, that would be wholesome!
My current code:
CastSession session;

Future<void> _connect(BuildContext context, CastDevice object) async {
  session = await CastSessionManager().startSession(object);

  session.stateStream.listen((state) {
    if (state == CastSessionState.connected) {
      // Close my custom GUI
      Navigator.pop(context);
      _sendMessage(session);
    }
  });

  session.messageStream.listen((message) {
    print('receive message: $message');
  });
}
// My video playing code
session.sendMessage(CastSession.kNamespaceReceiver, {
  'type': 'MEDIA',
  'link': 'http://somegeneratedurl.com',
});
OK, so I found an answer to my question. There is unfortunately no command for playing a video file: I've looked through the Google Cast protocol reference and there is no such command. I found a package that can cast videos, and I'm going to use that one instead.
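For reference, the Cast protocol's default media receiver (appId CC1AD845) does accept a LOAD request on the media namespace, so a sketch like the following might work with the cast package too; the payload follows the Cast media channel documentation, and passing the namespace string literally is an assumption:
// Launch the default media receiver app on the device
session.sendMessage(CastSession.kNamespaceReceiver, {
  'type': 'LAUNCH',
  'appId': 'CC1AD845', // default media receiver
});
// Ask it to load and play a media URL (Cast media channel message)
session.sendMessage('urn:x-cast:com.google.cast.media', {
  'type': 'LOAD',
  'requestId': 1,
  'autoplay': true,
  'currentTime': 0,
  'media': {
    'contentId': 'http://somegeneratedurl.com',
    'contentType': 'video/mp4',
    'streamType': 'BUFFERED',
  },
});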
I need to record a voice message and send it with Firebase. I am using flutter_sound, but the plugin doesn't generate a recorded file and I don't know why.
Future<String> result = flutterSound.startRecorder(null, androidEncoder: AndroidEncoder.AMR_WB);

result.then((value) {
  print('startRecorder: $value');

  // Print the elapsed recording time as mm:ss:SS
  _recorderSubscription = flutterSound.onRecorderStateChanged.listen((e) {
    DateTime date = DateTime.fromMillisecondsSinceEpoch(e.currentPosition.toInt());
    String txt = DateFormat('mm:ss:SS', 'en_US').format(date);
    print(txt);
  });
});
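One thing to check: the recorder only finalizes the file when it is stopped, so nothing shows up on disk while recording is still running. A minimal sketch using the same pre-9.x flutter_sound API as above, where stopAndCheck is a hypothetical helper and path is the value printed by startRecorder:
import 'dart:io';
// Stop the recorder, cancel the position updates, and verify the file
Future<void> stopAndCheck(String path) async {
  await flutterSound.stopRecorder();
  await _recorderSubscription?.cancel();
  print('file exists: ${File(path).existsSync()}');
}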