I have a Flutter project (iOS, Android) that uses WebRTC. I need to send both camera video (which already works) and a screen capture over WebRTC. How do I share the screen over WebRTC with the flutter_webrtc package?
You can use the flutter_webrtc plugin's getDisplayMedia method to capture the display, like this:
class ScreenSharing {
  MediaStream? _localStream;
  final RTCVideoRenderer _localRenderer = RTCVideoRenderer();

  Future<void> _makeScreenSharing() async {
    final mediaConstraints = <String, dynamic>{'audio': true, 'video': true};
    try {
      // getDisplayMedia prompts the user to pick a screen/window to capture.
      var stream =
          await navigator.mediaDevices.getDisplayMedia(mediaConstraints);
      _localStream = stream;
      _localRenderer.srcObject = _localStream;
    } catch (e) {
      print(e.toString());
    }
  }
}
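To actually send the captured stream to the remote peer, attach its tracks to your RTCPeerConnection and renegotiate. A minimal sketch, assuming an already-created peer connection named pc (the helper name is illustrative; addTrack and getTracks are part of flutter_webrtc):

import 'package:flutter_webrtc/flutter_webrtc.dart';

Future<void> shareScreenOverWebRTC(RTCPeerConnection pc) async {
  final mediaConstraints = <String, dynamic>{'audio': false, 'video': true};
  final stream =
      await navigator.mediaDevices.getDisplayMedia(mediaConstraints);
  // Add each screen-capture track to the existing peer connection;
  // a new offer/answer exchange must then be performed as usual.
  for (final track in stream.getTracks()) {
    await pc.addTrack(track, stream);
  }
}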
The plug-in's functions return the OS version instead of a path.
The same behaviour occurs in all of these functions: getImagesPath(), getVideoPath(), getAudioPath(), getFilePath().
Plug-in: https://pub.dev/packages/storage_path , storage_path: ^0.2.0
Future<void> getImagesPath() async {
  String imagespath = "";
  try {
    imagespath = await StoragePath.imagesPath; // returns the iOS version (example: "iOS 15.2")
    var response = jsonDecode(imagespath);
    print(response);
    var imageList = response as List;
    List<FileModel> list = imageList
        .map<FileModel>((json) => FileModel.fromJson(json))
        .toList();
    setState(() {
      imagePath = list[11].files[0];
    });
  } on PlatformException {
    imagespath = 'Failed to get path';
  }
}
I got it. I had missed the README: this plug-in is only available on Android.
ONLY FOR ANDROID
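Since the plug-in is Android-only, it is worth guarding the call at runtime. A minimal sketch, assuming you only need the Android path (Platform.isAndroid comes from dart:io, and loadImagesPathSafely is an illustrative name):

import 'dart:io' show Platform;
import 'package:storage_path/storage_path.dart';

Future<void> loadImagesPathSafely() async {
  if (!Platform.isAndroid) {
    // storage_path has no iOS implementation; bail out early.
    print('storage_path is Android-only');
    return;
  }
  final imagespath = await StoragePath.imagesPath;
  print(imagespath);
}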
I am trying to play a remote audio file with flutter_sound:

try {
  await _audioPlayer!.startPlayer(
    fromURI: 'https://URL/TestFiles/sssssssss.acc',
    codec: Codec.aacADTS,
  );
} catch (e) {
  print(e);
}
The exception is a PlatformException: error unknown, startPlayer() error, null.
In the debug console I get:
FlutterSoundPlayer.log (package:flutter_sound/public/flutter_sound_player.dart:500:13)
MethodChannelFlutterSoundPlayer.channelMethodCallHandler (package:flutter_sound_platform_interface/method_channel_flutter_sound_player.dart:161:19)
I am using a real Android device, flutter doctor reports no issues, and this remote file plays fine in an Angular client.
It plays now. I was missing that the file extension is .aac, not .acc, so it should be:

try {
  await _audioPlayer!.startPlayer(
    fromURI: 'https://URL/TestFiles/sssssssss.aac',
  );
} catch (e) {
  print(e);
}
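For completeness, a minimal end-to-end sketch of playing a remote AAC file with flutter_sound: the audio session has to be opened once before any playback call (playRemoteAac is an illustrative name, and this assumes the flutter_sound v8 API used in this thread):

import 'package:flutter_sound/flutter_sound.dart';

final FlutterSoundPlayer player = FlutterSoundPlayer();

Future<void> playRemoteAac(String url) async {
  // Open the session once before playing; startPlayer fails otherwise.
  await player.openAudioSession();
  await player.startPlayer(
    fromURI: url,
    codec: Codec.aacADTS,
    whenFinished: () => print('playback finished'),
  );
}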
To play audio I use this library:
audioplayers: ^0.19.1
Declare these variables pointing at the audio assets:
String audioCorrect = "audio/access_granted.mp3";
String audioInCorrect = "audio/access_denied.mp3";
Method to initialize the player:
void initPlayer() {
  advancedPlayer = AudioPlayer();
  audioCache = AudioCache(fixedPlayer: advancedPlayer);
}
Call initPlayer() from the initState() method, then play the audio this way:
audioCache.play(audioCorrect);
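Putting it together, a sketch of a widget that wires this up, assuming the audioplayers 0.19.x API where AudioCache resolves files under the assets folder and plays through the fixedPlayer (the widget and button are illustrative):

import 'package:flutter/material.dart';
import 'package:audioplayers/audioplayers.dart'; // AudioCache may need audio_cache.dart in some 0.19.x releases

class AccessSoundPlayer extends StatefulWidget {
  @override
  _AccessSoundPlayerState createState() => _AccessSoundPlayerState();
}

class _AccessSoundPlayerState extends State<AccessSoundPlayer> {
  late AudioPlayer advancedPlayer;
  late AudioCache audioCache;
  final String audioCorrect = "audio/access_granted.mp3";

  @override
  void initState() {
    super.initState();
    // Initialize the player once, as described above.
    advancedPlayer = AudioPlayer();
    audioCache = AudioCache(fixedPlayer: advancedPlayer);
  }

  @override
  Widget build(BuildContext context) {
    return ElevatedButton(
      onPressed: () => audioCache.play(audioCorrect),
      child: Text('Play'),
    );
  }
}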
Flutter 2 was announced just a few days ago, and some packages were updated for web, like image_picker. I'm trying to pick an image and upload it to Cloud Storage. The image picker works very well, but when I upload the image to Storage it gives me this error:
Unsupported operation: Platform._operatingSystem
UI code:
onImageSelect: (selectedImage) async {
  try {
    final url = await Get.find<FirebaseStorageService>()
        .uploadProfilePicture(customer.uuid, File(selectedImage.path));
    print(url);
  } catch (e) {
    print(e);
  }
  customer.profileImageUrl = selectedImage.path;
  _isImageSelected = true;
}),
Service class function:
class FirebaseStorageService {
  final FirebaseStorage _firebaseStorage = FirebaseStorage.instance;
  Reference _storageReference;

  Future<String> uploadProfilePicture(
      String userID, File uploadingImagePath) async {
    _storageReference = _firebaseStorage
        .ref()
        .child(userID)
        .child('images')
        .child("profile-photo.png");
    var uploadTask = _storageReference.putFile(uploadingImagePath);
    var url = await (await uploadTask).ref.getDownloadURL();
    return url;
  }
}
I tried flutter clean and flutter upgrade. I also searched older questions; some of them used html.File instead of the dart:io File, but the firebase_storage 7.0.0 package only accepts the File class from dart:io. The code works on Android and iOS but not on the web. I'm using the stable channel.
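One web-friendly workaround is to avoid dart:io File entirely and upload raw bytes instead. A sketch under these assumptions: a firebase_storage version that exposes putData, and an image_picker XFile whose readAsBytes works on all platforms including web (the function name is illustrative):

import 'dart:typed_data';
import 'package:firebase_storage/firebase_storage.dart';
import 'package:image_picker/image_picker.dart';

Future<String> uploadProfilePictureBytes(String userID, XFile picked) async {
  // Read the picked image as bytes; this works on web, Android, and iOS.
  final Uint8List bytes = await picked.readAsBytes();
  final ref = FirebaseStorage.instance
      .ref()
      .child(userID)
      .child('images')
      .child('profile-photo.png');
  // putData uploads raw bytes, so no dart:io File is needed.
  final snapshot = await ref.putData(bytes);
  return await snapshot.ref.getDownloadURL();
}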
I am working on a Flutter Web project that involves a Speech to Text component. I intend to use Google's Speech to Text API (https://cloud.google.com/speech-to-text/docs). One of the critical requirements that I have is to use the API's Single Utterance capability to recognize when a speaker is done speaking automatically. This requires that I stream audio directly to Google's API from the Flutter client so that I can receive that event and do something with it. I am using the google_speech dart plugin (https://pub.dev/packages/google_speech) and am fairly certain it will meet my needs.
What I am struggling with is finding an implementation that works in Flutter Web and can record audio to a stream that I can then send to Google.
So far the only one I can find that seems to meet my needs is the flutter_sound plugin (https://pub.flutter-io.cn/packages/flutter_sound), as it claims to support Flutter Web and seems to be able to record to a Dart Stream without using a File. I have an initial implementation, but I seem to be missing something somewhere, as the library seems to hang.
Here is my implementation so far:
class _LandingPageState extends State<LandingPage> {
  final FlutterSoundRecorder _mRecorder = FlutterSoundRecorder();
  String _streamText = 'not yet recognized';
  final String interviewID;

  _LandingPageState(this.interviewID);

  @override
  Widget build(BuildContext context) {
    // Build method works fine
  }
  // Called elsewhere to start the recognition
  _submit() async {
    // Do some stuff
    await recognize();
    // Do some other stuff
  }

  Future recognize() async {
    // Set up the Google Speech (google_speech plugin) recognition apparatus
    String serviceAccountPath = await rootBundle
        .loadString('PATH TO MY SERVICE ACCOUNT CREDENTIALS');
    final serviceAccount = ServiceAccount.fromString(serviceAccountPath);
    final speechToText = SpeechToText.viaServiceAccount(serviceAccount);
    final config = _getConfig();

    // Create the stream controller (flutter_sound plugin)
    StreamController<Food> recordingDataController = StreamController<Food>();

    // Start the recording and specify what stream sink it is using,
    // which is the above stream controller's sink
    await _mRecorder.startRecorder(
      toStream: recordingDataController.sink,
      codec: Codec.pcm16,
      numChannels: 1,
      sampleRate: 16000, // must match sampleRateHertz in _getConfig() below
    );
    // Set up the recognition stream and pass it the stream
    final responseStream = speechToText.streamingRecognize(
      StreamingRecognitionConfig(
        config: config,
        interimResults: true,
        singleUtterance: true,
      ),
      recordingDataController.stream,
    );
    responseStream.listen((data) {
      setState(() {
        _streamText =
            data.results.map((e) => e.alternatives.first.transcript).join('\n');
      });
    }, onDone: () {
      setState(() {
        print("STOP LISTENING");
        print("STREAM TEXT = ");
        print("--------------------------------------");
        print(_streamText);
        print("--------------------------------------");
        // Stop listening to the mic
        recordingDataController.close();
      });
    });
  }
  init() async {
    await Future.delayed(Duration(seconds: 1));
    await _submit();
  }

  @override
  void initState() {
    super.initState();
    _openRecorder();
  }

  @override
  void dispose() {
    super.dispose();
    _stopRecorder();
  }
  RecognitionConfig _getConfig() => RecognitionConfig(
      encoding: AudioEncoding.LINEAR16,
      model: RecognitionModel.basic,
      enableAutomaticPunctuation: true,
      sampleRateHertz: 16000,
      languageCode: 'en-US');

  Future<void> _openRecorder() async {
    // These Permission calls don't seem to work on Flutter web
    // var status = await Permission.microphone.request();
    // if (status != PermissionStatus.granted) {
    //   throw RecordingPermissionException('Microphone permission not granted');
    // }
    await _mRecorder.openAudioSession();
  }

  Future<void> _stopRecorder() async {
    await _mRecorder.stopRecorder();
  }
}
When this is debugged, the library hangs on starting the recorder with the message "Waiting for the recorder being opened", and it waits there forever. I've tried to debug the library, but it is very unclear what is going on. I worry that this library does not support Flutter Web after all. Could the library be hanging because microphone permissions have not been granted?
I've been using this example for flutter_sound to implement this: https://github.com/Canardoux/tau/blob/master/flutter_sound/example/lib/recordToStream/record_to_stream_example.dart
Is there another library or approach that supports recording audio to a dart stream in Flutter web?
The problem is that recording to a stream requires the PCM16 codec, but that codec is not supported for recording audio in the browser. See the associated documentation: https://tau.canardoux.xyz/guides_codec.html
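One way to confirm this at runtime is to ask the recorder whether the codec is supported before starting. A minimal sketch using flutter_sound's isEncoderSupported (the helper name is illustrative):

import 'package:flutter_sound/flutter_sound.dart';

Future<bool> canRecordPcm16(FlutterSoundRecorder recorder) async {
  // On web this is expected to return false, which explains the hang:
  // PCM16 recording is not available in the browser.
  final supported = await recorder.isEncoderSupported(Codec.pcm16);
  print('PCM16 recording supported: $supported');
  return supported;
}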
I am able to implement voice and video calls using the agora.io library, which is available at https://www.agora.io/ and https://github.com/AgoraIO/Flutter-SDK.
However, to start a call both users have to join a particular channel name, entered manually or set automatically, which is not practical.
Is there a way to build a separate signalling system (perhaps using Node.js sockets, Firebase, or OneSignal notifications)?
What simultaneous/parallel approach could be used alongside Agora?
Or is there a complete alternative?
Agora.io doesn't provide any method other than passing a channel name manually or using a default string. What you can do instead is use a Firebase Dynamic Link to share the channel name: the link redirects to the page where the channel name is taken as input, and the field is pre-filled from the link's query parameters. Your code will look something like this:
class _AgoraImplementationState extends State<AgoraImplementation> {
  @override
  void initState() {
    super.initState();
    initDynamicLinks();
  }

  initDynamicLinks() async {
    await Future.delayed(Duration(seconds: 3));
    // Handle the link that launched the app, if any.
    var data = await FirebaseDynamicLinks.instance.getInitialLink();
    var deepLink = data?.link;
    final queryParams = deepLink.queryParameters;
    if (queryParams.length > 0) {
      var channelName = queryParams['channel_name'];
      openFormScreen(channelName);
    }
    // Handle links received while the app is running.
    FirebaseDynamicLinks.instance.onLink(onSuccess: (dynamicLink) async {
      var deepLink = dynamicLink?.link;
      final queryParams = deepLink.queryParameters;
      if (queryParams.length > 0) {
        var channelName = queryParams['channel_name'];
        openFormScreen(channelName);
      }
      debugPrint('DynamicLinks onLink $deepLink');
    }, onError: (e) async {
      debugPrint('DynamicLinks onError $e');
    });
  }

  openFormScreen(String channelName) {
    Navigator.of(context)
        .pushNamed("routeFormScreen", arguments: {"channelName": channelName});
  }
}
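For the sending side, you also need to build the link that carries the channel name. A sketch assuming the older firebase_dynamic_links API that matches the receiving code above; the uriPrefix, link URL, package name, and bundle ID are placeholders you must replace with your own values:

import 'package:firebase_dynamic_links/firebase_dynamic_links.dart';

Future<Uri> buildChannelLink(String channelName) async {
  final parameters = DynamicLinkParameters(
    // Placeholder prefix; use your own Dynamic Links domain.
    uriPrefix: 'https://example.page.link',
    // The channel name travels as a query parameter.
    link: Uri.parse('https://example.com/join?channel_name=$channelName'),
    androidParameters: AndroidParameters(packageName: 'com.example.app'),
    iosParameters: IosParameters(bundleId: 'com.example.app'),
  );
  final shortLink = await parameters.buildShortLink();
  // Share shortLink.shortUrl with the other user (e.g. via a share sheet).
  return shortLink.shortUrl;
}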