How to implement an audio listening stream in Flutter Web? - flutter

I'm making a Flutter Web app which has to access the microphone and stream the audio data as an array of integers for further processing.
I already succeeded doing this in plain JavaScript.
Things I've tried:
The flutter_sound library, but I couldn't get it to work. I also can't find any working examples for that library.
dart:web_audio seems to be a thing, but apparently you can't even import it yet in normal Flutter apps.
dart:js is what I'm trying right now. I was able to create an AudioContext with var audioContext = JsObject(context['AudioContext']);. However, after that I don't know what syntax to use to translate the JavaScript code into Dart. Here is what I'm doing in JavaScript:
function initAudio() {
  try {
    audioCtx = new AudioContext();
    const GotAudioStream = function(stream) {
      const audioSource = audioCtx.createMediaStreamSource(stream);
      const audioProcessor = audioCtx.createScriptProcessor(bufSize, 1, 1);
      audioSource.connect(audioProcessor);
      audioProcessor.connect(audioCtx.destination);
      audioStarted = true;
      audioProcessor.onaudioprocess = function(e) {
        checkAudioBuffer(e.inputBuffer);
      };
    };
    navigator.mediaDevices.getUserMedia({ audio: true, video: false }).then(GotAudioStream);
  } catch (err) {
    console.log(err);
  }
}
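A rough sketch of what I imagine the Dart equivalent would look like, assuming a web-only build where dart:html and dart:web_audio can actually be imported (untested; onSamples and bufSize are just placeholders for my own handler and buffer size):
// Untested sketch: assumes dart:html and dart:web_audio are importable,
// i.e. the app is built for the web target only.
import 'dart:html' as html;
import 'dart:typed_data';
import 'dart:web_audio';

const int bufSize = 4096; // same role as bufSize in the JavaScript version

Future<void> initAudio(void Function(Float32List samples) onSamples) async {
  final audioCtx = AudioContext();
  final stream = await html.window.navigator.mediaDevices!
      .getUserMedia({'audio': true, 'video': false});
  final source = audioCtx.createMediaStreamSource(stream);
  final processor = audioCtx.createScriptProcessor(bufSize, 1, 1);
  source.connectNode(processor);
  processor.connectNode(audioCtx.destination!);
  // Counterpart of onaudioprocess: hand each input buffer to a callback,
  // where the Float32List samples could be converted to integers.
  processor.onAudioProcess.listen((e) {
    onSamples(e.inputBuffer!.getChannelData(0));
  });
}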
Does anyone have experience with the dart:js library, or another idea on how to implement a simple (live!) audio stream in Flutter Web?
Regards,
Kaisky

Related

Flutter - how to cancel a SSE Stream

Hey, I can't figure out how to cancel an SSE stream in my Flutter app.
I am using this package for SSE in my app: https://pub.dev/packages/sse_channel
I feel like its documentation is seriously lacking.
The SSE itself works fine; I just don't know how to stop it from a function.
I'm using this to start the SSE:
channel = SseChannel.connect(Uri.parse(dotenv.env['BASE_URL'] + '/sse/rn-updates'));
and
channel.stream.listen((message) { /* code... and some other stuff */ });
That part works, but I don't know how to close it.
I've tried things like this:
static void cancelStream() {
  if (channel != null) {
    channel.sink.close();
  }
}
You should create a field in your class:
StreamSubscription? stream;
assign the subscription to it:
stream = channel.stream.listen((message) { /* code... and some other stuff */ });
and then you can use this to cancel the stream:
stream?.cancel();
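Putting it together, a minimal sketch of that pattern (the class and method names are just illustrative; it assumes the SseChannel API exactly as used in the question):
import 'dart:async';

import 'package:sse_channel/sse_channel.dart';

class SseUpdates {
  SseChannel? _channel;
  StreamSubscription? _subscription; // keep a handle so the stream can be cancelled later

  void start(Uri uri) {
    _channel = SseChannel.connect(uri);
    _subscription = _channel!.stream.listen((message) {
      // handle the incoming SSE message here
    });
  }

  Future<void> stop() async {
    // Cancelling the subscription stops listening to the SSE stream.
    await _subscription?.cancel();
    _subscription = null;
  }
}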

How to get audio buffers from the flutter webrtc plugin?

I am using the flutter-webrtc plugin and would like to record both the local and remote audio streams. Is there any way for me to get audio buffers from the media streams? I have tried using the AudioFileRenderer in the unified-plan branch. In the startRecording function of MediaRecorderImpl.java, I supplied the file storage path, e.g. "storage/emulated/0/Android/data"; a file is successfully created every time I end my call, but the recording is broken and can't be played. There are no errors in the terminal. I'm using Flutter v1.22.6 and forked flutter-webrtc from 0.5.8. I added the AudioFileRenderer file to flutter-webrtc 0.5.8; my code is below:
public void startRecording(File file) throws Exception {
    recordFile = file;
    if (isRunning)
        return;
    isRunning = true;
    //noinspection ResultOfMethodCallIgnored
    file.getParentFile().mkdirs();
    if (videoTrack != null) {
        System.out.println("try123 1");
        videoFileRenderer = new VideoFileRenderer(
            file.getAbsolutePath(),
            EglUtils.getRootEglBaseContext(),
            audioInterceptor != null
        );
        videoTrack.addSink(videoFileRenderer);
        if (audioInterceptor != null)
            audioInterceptor.attachCallback(id, videoFileRenderer);
    } else {
        Log.e(TAG, "Video track is null");
        if (audioInterceptor != null) {
            //TODO(rostopira): audio only recording
            // throw new Exception("Audio-only recording not implemented yet");
            Log.d(TAG, "Try to use onWebrtcSamplesReady");
            audioFileRenderer = new AudioFileRenderer(file);
            audioInterceptor.attachCallback(id, audioFileRenderer);
        }
    }
}
Any help is appreciated! Thanks!
I am also looking for the same solution; none has been found so far.
So I am using a webview for the RTC part (communication and recording), while keeping the Firebase messaging and EventSource/SSE (I'm not using sockets) in Flutter.
This doesn't directly answer your question, it just provides an alternative; it's better than having no solution at all. Perhaps in the future, when flutter-webrtc is updated to support voice-only recording, we can update the apps we develop.

Any API documentation for the Cast package?

So I recently got started with the Flutter package cast in order to communicate with Chromecast devices, but I couldn't find any details on how to use it. If you could give me some help in actually playing a media file, such as a song or a video, that would be wholesome!
My current code:
CastSession session;

Future<void> _connect(BuildContext context, CastDevice object) async {
  session = await CastSessionManager().startSession(object);
  session.stateStream.listen((state) {
    if (state == CastSessionState.connected) {
      // Close my custom GUI
      Navigator.pop(context);
      _sendMessage(session);
    }
  });
  session.messageStream.listen((message) {
    print('receive message: $message');
  });
}

// My video playing code
session.sendMessage(CastSession.kNamespaceReceiver, {
  'type': 'MEDIA',
  'link': 'http://somegeneratedurl.com',
});
OK, so I found a solution. There is unfortunately no command to play a video file. I've looked through the Google Cast protocol reference and there is no command for playing video files. I found this package that can cast videos, and I'm going to use that package instead.
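For what it's worth, at the raw Google Cast protocol level playback works by launching the default media receiver and then sending a LOAD request on the media namespace. Something along these lines might work through the package's generic sendMessage (untested sketch; the message format comes from the Cast v2 protocol, not from this package's docs, and I haven't verified that the package routes media-namespace messages to the launched app):
// Untested sketch based on the Cast v2 protocol, not on a documented package API.
void _sendMessage(CastSession session) {
  // Ask the device to launch the default media receiver app.
  session.sendMessage(CastSession.kNamespaceReceiver, {
    'type': 'LAUNCH',
    'appId': 'CC1AD845', // default media receiver
  });

  // Once the receiver app is running, request playback on the media namespace.
  session.sendMessage('urn:x-cast:com.google.cast.media', {
    'type': 'LOAD',
    'autoplay': true,
    'currentTime': 0,
    'media': {
      'contentId': 'http://somegeneratedurl.com',
      'contentType': 'video/mp4',
      'streamType': 'BUFFERED',
    },
  });
}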

Problems with WebAudio

I'm creating a research experiment that uses WebAudio API to record audio files spoken by the user.
I came up with a solution for this using recorder.js and everything was working fine... until I tried it yesterday.
I am now getting this error in Chrome:
"The AudioContext was not allowed to start. It must be resumed (or
created) after a user gesture on the page."
And it refers to this link: Web Audio API policy.
This appears to be a consequence of Chrome's new policy outlined at the link above.
So I attempted to solve the problem by using resume() like this:
var gumStream; // stream from getUserMedia()
var rec;       // Recorder.js object
var input;     // MediaStreamAudioSourceNode we'll be recording

// shim for AudioContext when it's not avb.
var AudioContext = window.AudioContext || window.webkitAudioContext;
var audioContext = new AudioContext; // new audio context to help us record

function startUserMedia() {
  var constraints = { audio: true, video: false };
  audioContext.resume().then(() => { // This is the new part
    console.log('context resumed successfully');
  });
  navigator.mediaDevices.getUserMedia(constraints).then(function(stream) {
    console.log("getUserMedia() success, stream created, initializing Recorder.js");
    gumStream = stream;
    input = audioContext.createMediaStreamSource(stream);
    rec = new Recorder(input, {numChannels: 1});
    audio_recording_allowed = true;
  }).catch(function(err) {
    console.log("Error");
  });
}
Now in the console I'm getting:
Error
context resumed successfully
And the stream is not initializing.
This happens in both Firefox and Chrome.
What do I need to do?
I just had this exact same problem! And technically, you helped me find the answer. My error message wasn't as complete as yours for some reason, and the link to those policy changes had the answer :)
Instead of resuming, it's best practice to create the audio context only after the user has interacted with the document (if you have a look at padenot's first comment of 28 Sept 2018 on this thread, he explains why in the first bullet point).
So instead of this:
var audioContext = new AudioContext; // new audio context to help us record

function startUserMedia() {
  audioContext.resume().then(() => { // This is the new part
    console.log('context resumed successfully');
  });
}
Just set the audio context like this:
var audioContext;

function startUserMedia() {
  if (!audioContext) {
    audioContext = new AudioContext;
  }
}
This should work, as long as startUserMedia() is executed after some kind of user gesture.

reading videos using cordova-plugin-media-streaming close the window automatically

I'm working on an Ionic mobile application where I need to play videos via streaming, by providing the URI of the video online. So I used the cordova-plugin-media-streaming plugin offered by Cordova.
My problem is that the window playing the video closes automatically after the video finishes, and the user can't play the video again in this window.
In the official documentation of the plugin [that I found here], there is an attribute called shouldAutoClose that should be set to false to avoid this problem, but that didn't work for me.
Here is the code I use to play a streaming video:
startVideo(item: Multimediasendtrust) {
  let options = {
    successCallback: () => { console.log('Finished Video') },
    errorCallback: (e) => { console.log('Error: ', e) },
    orientation: 'portrait',
    controls: true,
    shouldAutoClose: false
  };
  console.log('those are option ', options);
  console.log('the link of the video ', item.url_media);
  this.streamingMedia.playVideo(item.url_media, options);
}
Can anyone help, please? Thanks in advance.