I am working on a Flutter Web project that involves a Speech to Text component. I intend to use Google's Speech to Text API (https://cloud.google.com/speech-to-text/docs). One of my critical requirements is to use the API's Single Utterance capability to automatically recognize when a speaker is done speaking. This requires that I stream audio directly to Google's API from the Flutter client so that I can receive that event and act on it. I am using the google_speech Dart plugin (https://pub.dev/packages/google_speech) and am fairly certain it will meet my needs.
What I am struggling with is finding an implementation that works in Flutter Web and can successfully record audio to a stream that I can then send to Google.
So far the only one I can find that seems to meet my needs is the flutter_sound plugin (https://pub.flutter-io.cn/packages/flutter_sound), as it claims to support Flutter Web and appears able to record to a Dart Stream without the use of a File. I have an initial implementation, but I seem to be missing something somewhere, as the library seems to hang.
Here is my implementation so far:
class _LandingPageState extends State<LandingPage> {
FlutterSoundRecorder _mRecorder = FlutterSoundRecorder();
String _streamText = 'not yet recognized';
final String interviewID;
_LandingPageState(this.interviewID);
@override
Widget build(BuildContext context) {
//Build method works fine
}
// Called elsewhere to start the recognition
_submit() async {
// Do some stuff
await recognize();
// Do some other stuff
}
Future recognize() async {
// Set up the Google Speech (google_speech plugin) recognition apparatus
String serviceAccountPath = await rootBundle
.loadString('PATH TO MY SERVICE ACCOUNT CREDENTIALS');
final serviceAccount = ServiceAccount.fromString(serviceAccountPath);
final speechToText = SpeechToText.viaServiceAccount(serviceAccount);
final config = _getConfig();
// Create the stream controller (flutter_sound plugin)
StreamController recordingDataController = StreamController<Food>();
// Start the recording and specify what stream sink it is using
// which is the above stream controller's sink
await _mRecorder.startRecorder(
toStream: recordingDataController.sink,
codec: Codec.pcm16,
numChannels: 1,
sampleRate: 44000,
);
// Set up the recognition stream and pass it the stream
final responseStream = speechToText.streamingRecognize(
StreamingRecognitionConfig(
config: config,
interimResults: true,
singleUtterance: true,
),
recordingDataController.stream,
);
responseStream.listen((data) {
setState(() {
_streamText =
data.results.map((e) => e.alternatives.first.transcript).join('\n');
});
}, onDone: () {
setState(() {
print("STOP LISTENING");
print("STREAM TEXT = ");
print("--------------------------------------");
print(_streamText);
print("--------------------------------------");
// Stop listening to the mic
recordingDataController.close();
});
});
}
init() async {
await Future.delayed(Duration(seconds: 1));
await _submit();
}
@override
void initState() {
super.initState();
_openRecorder();
}
@override
void dispose() {
super.dispose();
_stopRecorder();
}
RecognitionConfig _getConfig() => RecognitionConfig(
encoding: AudioEncoding.LINEAR16,
model: RecognitionModel.basic,
enableAutomaticPunctuation: true,
sampleRateHertz: 16000,
languageCode: 'en-US');
Future<void> _openRecorder() async {
// These Permission calls don't seem to work on Flutter Web
// var status = await Permission.microphone.request();
// if (status != PermissionStatus.granted) {
// throw RecordingPermissionException('Microphone permission not granted');
// }
await _mRecorder.openAudioSession();
}
Future<void> _stopRecorder() async {
await _mRecorder.stopRecorder();
}
}
When this is debugged, the library hangs on starting the recorder with the message "Waiting for the recorder being opened", and it just waits there forever. I've tried to debug the library, but it is very unclear what is going on. I worry that this library does not support Flutter Web after all. Could the library be hanging because microphone permissions have not been granted?
I've been using this example for flutter_sound to implement this: https://github.com/Canardoux/tau/blob/master/flutter_sound/example/lib/recordToStream/record_to_stream_example.dart
Is there another library or approach that supports recording audio to a Dart stream in Flutter Web?
The problem is that to record to a stream you must use the PCM16 codec, but this codec is not supported for recording audio in the browser. See the associated documentation: https://tau.canardoux.xyz/guides_codec.html
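As a quick runtime check, here is a minimal sketch (assuming flutter_sound's isEncoderSupported API) that probes whether the current platform can encode PCM16 before attempting to record to a stream:
final recorder = FlutterSoundRecorder();
await recorder.openAudioSession();
// Recording to a Dart Stream requires Codec.pcm16; on Flutter Web this
// check is expected to come back false because browsers cannot encode it.
final canRecordPcm16 = await recorder.isEncoderSupported(Codec.pcm16);
if (!canRecordPcm16) {
  print('PCM16 recording is not supported on this platform.');
}
await recorder.closeAudioSession();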
Related
I have integrated Firebase Dynamic Links in my Flutter application to open the app and navigate users to a specific screen.
To do that, I first added the plugin below to my pubspec.yaml file:
firebase_dynamic_links: ^5.0.5
Then I created a separate class to handle the related logic, as below:
class DynamicLinkService {
late BuildContext context;
FirebaseDynamicLinks dynamicLinks = FirebaseDynamicLinks.instance;
Future<void> initDynamicLinks(BuildContext context) async {
this.context = context;
dynamicLinks.onLink.listen((dynamicLinkData) {
var dynamicLink=dynamicLinkData.link.toString();
if (dynamicLink.isNotEmpty &&
dynamicLink.startsWith(ApiConstants.baseUrl) &&
dynamicLink.contains("?")) {
//Getting data here and navigating...
...
...
...
}
}).onError((error) {
print("This is error >>> "+error.message);
});
}
}
Now, I am initializing the dynamic link handling as below in my home_screen:
final DynamicLinkService _dynamicLinkService = DynamicLinkService();
and then calling the method below in initState():
@override
void initState() {
  super.initState();
  SchedulerBinding.instance.addPostFrameCallback((_) async {
    await _dynamicLinkService.initDynamicLinks(context);
  });
}
This works like a charm when my application is in the recents list or in the background.
But the issue is that when the application is closed/killed, clicking the dynamic link just opens the app but does not navigate.
What might be the issue? Thanks in advance.
Let me answer my own question; it might be useful for someone!
In the above code, I forgot to handle the dynamic link while the app is in a closed/killed state.
We need to add this code separately:
//this is when the app is in closed/kill mode
final PendingDynamicLinkData? initialLink = await FirebaseDynamicLinks.instance.getInitialLink();
if (initialLink != null) {
handleDynamicLink(initialLink);
}
So the final code looks like this:
//this is when the app is in closed/kill mode
final PendingDynamicLinkData? initialLink = await FirebaseDynamicLinks.instance.getInitialLink();
if (initialLink != null) {
handleDynamicLink(initialLink);
}
//this is when the app is in recent/background mode
dynamicLinks.onLink.listen((dynamicLinkData) {
handleDynamicLink(dynamicLinkData);
}).onError((error) {
print("This is error >>> "+error.message);
});
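For completeness, here is a hypothetical sketch of the handleDynamicLink helper referenced above; the query parameter and the navigation call are assumptions, not part of the original code:
void handleDynamicLink(PendingDynamicLinkData data) {
  final Uri deepLink = data.link;
  // 'screen' is a hypothetical query parameter; adapt it to your link format.
  final screen = deepLink.queryParameters['screen'];
  if (screen != null) {
    Navigator.of(context).pushNamed(screen);
  }
}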
It's working like a charm now. That's all!
I have implemented this method so that when a user clicks the dynamic link they are redirected to a specific page. Everything works fine while the app is running, but when I kill/close the app and try the same thing, it opens the app on the initial screen (the home page). How can I make it work in this case?
Future<void> initDynamicLinks() async {
FirebaseDynamicLinks.instance.onLink.listen((dynamicLinkData) {
id = dynamicLinkData.link
.toString()
.substring(dynamicLinkData.link.toString().lastIndexOf('/') + 1);
Get.to(
() => Page(
id: id,
),
);
}).onError((error) {
if (kDebugMode) {
print(error.message);
}
});
}
@override
void initState() {
  initDynamicLinks();
  super.initState();
}
I think the .onLink.listen() function only gets hit when the app is resumed from the background.
If you want your deep link to work when the app has a fresh start, just put this code above the .onLink.listen() call:
WidgetsFlutterBinding.ensureInitialized();
await Firebase.initializeApp();
final PendingDynamicLinkData? data = await FirebaseDynamicLinks.instance.getInitialLink();
final Uri? deepLink = data?.link;
// Here you should navigate to your desired screen
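// For example (a hypothetical completion mirroring the question's onLink
// handler; not part of the original answer):
if (deepLink != null) {
  final id = deepLink.toString().substring(deepLink.toString().lastIndexOf('/') + 1);
  Get.to(() => Page(id: id));
}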
Hope it helps you
I am implementing a password recovery flow based on a URL sent to the user's email. Opening the app from that URL works. But instead of directly opening the required page in the app that is in the background, it launches a second instance of the app. Although it still takes me to the password recovery page, there are now two copies of the app running side by side.
Procedure
1. Enter your email to send the password reset link
2. Click submit
3. Open the email containing the recovery link
4. The app is duplicated and a password recovery page opens
What happens
The splash screen is the first page that opens in the app. I am trying to follow the instructions from the uni_links package, but still with no success. Currently the getInitialLink function has the effect of opening the app from the recovery link.
class SplashController extends GetxController {
final SharedPreferencesHelper _helper = Get.find<SharedPreferencesHelper>();
late StreamSubscription sub;
@override
void onReady() async {
super.onReady();
await checkToken();
}
Future<void> checkToken() async {
await Future.delayed(Duration(seconds: 3));
var token = _helper.getToken();
if (token == null) {
Get.offNamed(Routes.LOGIN);
} else {
Get.offNamed(Routes.MAIN);
}
}
@override
void onInit() {
super.onInit();
initUniLinks();
}
Future<void> initUniLinks() async {
// Platform messages may fail, so we use a try/catch PlatformException.
try {
String? initialLink = await getInitialLink();
if (initialLink != null) {
print("okay man");
Get.toNamed(Routes.RECOVERY);
}
sub = getLinksStream().listen((link) {
}, onError: (err) {
});
} on PlatformException {
// Handle exception by warning the user their action did not succeed
// return?
}
}
}
I found the solution; this answer is actually already on Stack Overflow, and it's really simple.
In the app's AndroidManifest.xml file, find android:launchMode and change its value to singleTask. Here is the result:
android:launchMode="singleTask"
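For reference, a minimal sketch of where the attribute lives in AndroidManifest.xml; the activity name shown is the Flutter default and may differ in your project:
<activity
    android:name=".MainActivity"
    android:launchMode="singleTask"
    ...>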
I have a list of songs on the song screen. If the user clicks an item in the list, I call loadFirstPlaylist() to load the list of songs (all songs in the album) into the queue, then skip to the selected queue item and play. It works on Android, but I get the following error on iOS.
GitHub source code
[NowPlaying] [MRNowPlaying] Ignoring setPlaybackState because application does not contain entitlement com.apple.mediaremote.set-playback-state for platform
Future<void> loadFirstPlayList(List<MediaItem> playlist, int index) async {
await emptyPlaylist();
if (playlist.isNotEmpty) {
await _audioHandler.addQueueItems(playlist);
await _audioHandler.skipToQueueItem(index);
await _audioHandler.play();
}
}
Audio handler methods:
@override
Future<void> addQueueItems(List<MediaItem> mediaItems) async {
// manage Just Audio
final audioSource = mediaItems.map(_createAudioSource);
_playlist.addAll(audioSource.toList());
// notify system
final newQueue = queue.value..addAll(mediaItems);
queue.add(newQueue);
}
@override
Future<void> skipToQueueItem(int index) async {
if (index < 0 || index >= queue.value.length) return;
if (_player.shuffleModeEnabled) {
index = _player.shuffleIndices![index];
}
_player.seek(Duration.zero, index: index);
}
@override
Future<void> play() => _player.play();
I do not know if you have figured it out already, but I had the same issue when trying to load an empty playlist. I was following this example, which includes a _loadEmptyPlaylist method; when I implemented it, it caused the player to fail silently. It now seems to work when I avoid calling loadAudioSource on an empty audio sequence.
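In other words, guard the load on the playlist length. Here is a minimal sketch of that guard, assuming a just_audio player and a ConcatenatingAudioSource (the _player and _playlist names mirror the question's handler):
Future<void> _loadPlaylistIfNotEmpty() async {
  // Only hand the source to the player once it has at least one child;
  // loading an empty source is what caused the silent failure described above.
  if (_playlist.children.isNotEmpty) {
    await _player.setAudioSource(_playlist);
  }
}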
I am able to implement voice and video calls using the agora.io library, which is available at https://www.agora.io/ and https://github.com/AgoraIO/Flutter-SDK.
However, the process for starting a call requires both users to join a particular channel name, defined by the user manually or automatically, which is not a practical approach.
Is there any way to create a separate signalling system (maybe using Node.js sockets, Firebase, or OneSignal notifications)?
What could be used simultaneously/in parallel alongside that?
Or what is the complete alternative?
Agora.io doesn't provide any method other than passing a channel name manually or using a default string. But what you can do is use a Firebase Dynamic Link to share the channel name. This link will redirect to the page where you take the channel name as input, and you can fill in the channel name according to the parameters passed. So your code will look something like this:
class AgoraImplementationState extends State<AgoraImplementation> {
  @override
  void initState() {
    super.initState();
    initDynamicLinks(context);
  }
  initDynamicLinks(BuildContext context) async {
    await Future.delayed(Duration(seconds: 3));
    // Handle the link that launched the app from a cold start
    var data = await FirebaseDynamicLinks.instance.getInitialLink();
    var deepLink = data?.link;
    final queryParams = deepLink?.queryParameters;
    if (queryParams != null && queryParams.isNotEmpty) {
      var channelName = queryParams['channel_name'];
      if (channelName != null) openFormScreen(channelName);
    }
    // Handle links received while the app is running or in the background
    FirebaseDynamicLinks.instance.onLink.listen((dynamicLink) {
      var deepLink = dynamicLink.link;
      final queryParams = deepLink.queryParameters;
      if (queryParams.isNotEmpty) {
        var channelName = queryParams['channel_name'];
        if (channelName != null) openFormScreen(channelName);
      }
      debugPrint('DynamicLinks onLink $deepLink');
    }, onError: (e) {
      debugPrint('DynamicLinks onError $e');
    });
  }
  openFormScreen(String channelName) {
    Navigator.of(context).pushNamed("routeFormScreen",
        arguments: {"channelName": channelName});
  }
}
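To complete the flow, the sender also needs to create the link that carries the channel name. Here is a minimal sketch using firebase_dynamic_links' buildShortLink; the URI prefix, link URL, and package name are placeholders you must replace with your own:
Future<Uri> buildChannelLink(String channelName) async {
  final parameters = DynamicLinkParameters(
    // Placeholder values: use your own Dynamic Links domain and app IDs.
    uriPrefix: 'https://example.page.link',
    link: Uri.parse('https://example.com/join?channel_name=$channelName'),
    androidParameters: AndroidParameters(packageName: 'com.example.app'),
  );
  final shortLink =
      await FirebaseDynamicLinks.instance.buildShortLink(parameters);
  return shortLink.shortUrl;
}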