I am creating a streaming app.
I have the camera open and I implemented scanning QR codes in the background using https://pub.dev/packages/google_ml_kit.
Here is my code for that:
var stream = await navigator.mediaDevices
    .getUserMedia({'video': true, 'audio': true});
setState(() {
  _localRenderer.srcObject = stream;
});
streamTrack = stream.getVideoTracks().first;
await Future.delayed(Duration(seconds: 2));
_getSnapshotTimer = Timer.periodic(Duration(seconds: 1), (timer) async { // QR code scanning
  final frame = await streamTrack.captureFrame();
  File file = await File('${_tempDir.path}/image.png').create();
  file.writeAsBytesSync(frame.asUint8List());
  final _qrCodes =
      await _qrCodeScanner.processImage(InputImage.fromFile(file));
My problem is that, because of this, the video from the camera lags every second; there is a little freeze.
Is there some option to improve this, to make the video from the camera smooth all the time?
Running the QR code scanner while your device is running a dev version and tethered to your computer capturing debug data can slow it down. I have an app with a QR scanner that works great in production but shows the same lagging symptoms in the development environment. I can't comment specifically on your project, as it seems like you're doing more than just capturing a QR code, but there is definitely a lag effect from running it in the development environment.
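Beyond the debug-build overhead, it can also help to make sure a new snapshot/scan never starts while the previous one is still writing to disk and running ML Kit, and to do the file write asynchronously. A minimal sketch, reusing streamTrack, _tempDir and _qrCodeScanner from the question; the _isScanning guard is a hypothetical new field, not part of the original code:

bool _isScanning = false;

_getSnapshotTimer = Timer.periodic(Duration(seconds: 1), (timer) async {
  if (_isScanning) return; // skip this tick if the previous scan is still running
  _isScanning = true;
  try {
    final frame = await streamTrack.captureFrame();
    final file = File('${_tempDir.path}/image.png');
    await file.writeAsBytes(frame.asUint8List()); // asynchronous write instead of writeAsBytesSync
    final qrCodes = await _qrCodeScanner.processImage(InputImage.fromFile(file));
    // ...handle qrCodes...
  } finally {
    _isScanning = false;
  }
});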
I am trying to build a MIDI player using the Web Audio API. I used Tone.js to parse the MIDI file into JSON. I am using mp3 files to play the notes. The relevant parts of the code follow:
//create audio samples
static async setupSample(audioContext, filepath) {
  const response = await fetch(filepath);
  const arrayBuffer = await response.arrayBuffer();
  const audioBuffer = await audioContext.decodeAudioData(arrayBuffer);
  return audioBuffer;
}

//play a single sample
static playSample(audioContext, audioBuffer, time) {
  const sampleSource = new AudioBufferSourceNode(audioContext, {
    buffer: audioBuffer,
    playbackRate: 1,
  });
  sampleSource.connect(audioContext.destination);
  sampleSource.start(time);
  return sampleSource;
}
Scheduling samples:
async start() {
  this.startTime = this.audioCtx.currentTime;
  this.play();
}

play() {
  let nextNote = this.notes[this.noteIndex];
  //schedule samples
  while ((nextNote.time + this.startTime) - this.audioCtx.currentTime <= 0.250) {
    let s = Audio.playSample(this.audioCtx, this.samples[nextNote.midi], this.startTime + nextNote.time);
    s.stop(this.startTime + nextNote.time + nextNote.duration);
    this.noteIndex++;
    if (this.noteIndex == this.notes.length) {
      break;
    }
    nextNote = this.notes[this.noteIndex];
  }
  if (this.noteIndex == this.notes.length) {
    return;
  }
  requestAnimationFrame(() => {
    this.play();
  });
}
I am testing the code with a MIDI file which contains a C major scale. I have tested the MIDI file using timidity and it is fine.
The code does play the MIDI file correctly except for a small problem: I hear some clicking sounds during playback. The clicking increases with increasing tempo but does not completely go away even at a tempo as low as 50 bpm. Any ideas what could be going wrong?
Full code can be viewed at : https://test.meedee.in/
Nothing is "wrong". You are observing a phenomenon intrinsic to the physics of audio.
Chopping audio samples arbitrarily like this creates clicks at the transitions. Any instantaneous change in level is heard as a click. To get rid of the clicks, apply an envelope to the sample, blend adjacent notes, or apply a low-pass filter.
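As an illustration only (a sketch, not the only way to do it): a variant of playSample() that takes the note duration and wraps each note in a short gain envelope. The 10 ms fade time is an arbitrary choice and assumes notes longer than about 20 ms:

//play a single sample with a short fade-in/fade-out to avoid clicks
static playSample(audioContext, audioBuffer, time, duration) {
  const sampleSource = new AudioBufferSourceNode(audioContext, {
    buffer: audioBuffer,
    playbackRate: 1,
  });
  const fade = 0.01; // 10 ms, illustrative
  const gain = new GainNode(audioContext, { gain: 0 });
  gain.gain.setValueAtTime(0, time);
  gain.gain.linearRampToValueAtTime(1, time + fade);          // fade in
  gain.gain.setValueAtTime(1, time + duration - fade);
  gain.gain.linearRampToValueAtTime(0, time + duration);      // fade out before the source stops
  sampleSource.connect(gain).connect(audioContext.destination);
  sampleSource.start(time);
  sampleSource.stop(time + duration);
  return sampleSource;
}

The caller would then pass nextNote.duration instead of calling s.stop() on the returned source.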
Currently my app uses the camera to take video input and process each frame. The app also works in the background for hours, so to avoid heating issues I need to turn the camera off and turn it back on later.
The camera is turned off when I use
_camera.dispose()
but it is not turned back on after disposing when I use
_camera.initialize()
CameraDescription description = await getCamera(_direction);
ImageRotation rotation = rotationIntToImageRotation(
  description.sensorOrientation,
);
_camera = CameraController(description, ResolutionPreset.low, enableAudio: false);
await _camera.initialize();
_camera.startImageStream((CameraImage image) {
  _baseStopwatch.start();
  if (_baseStopwatch.elapsed.inSeconds > 5) {
    _camera.stopImageStream();
  }
  if (_baseStopwatch.elapsed.inSeconds > 10) {
    // need to restart the camera streaming!!
  }
});
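For restarting, one option is to wrap the setup shown above in a method that can be called again after dispose(). A rough sketch, assuming the same _camera field and the getCamera/_direction helpers from the snippet:

Future<void> _restartCamera() async {
  // Release the old controller completely before creating a new one.
  await _camera.dispose();

  CameraDescription description = await getCamera(_direction);
  _camera = CameraController(description, ResolutionPreset.low, enableAudio: false);
  await _camera.initialize();

  _camera.startImageStream((CameraImage image) {
    // ...process frames as before...
  });
}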
Hope someone can help me. I am a blind guy and am creating an app to recognise ZA banknotes. I find my results are better after controller.stopImageStream() rather than running the image stream continuously. How can I restart controller.startImageStream() after the stop? Or is there a way to pause the image stream rather than stopping it? Thank you.
setState(() {});
controller.startImageStream((CameraImage img) {
  controller.stopImageStream();
  if (!isDetecting) {
    isDetecting = true;
    int startTime = new DateTime.now().millisecondsSinceEpoch;
    Tflite.runModelOnFrame(
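One pattern, assuming the camera plugin allows startImageStream() to be called again on the same initialized controller once stopImageStream() has completed, is to keep the frame handler in a named method so the stream can be resumed later. A sketch; _processImage, _pauseAndResumeStream and the delay are illustrative names and values, not part of the original code:

void _processImage(CameraImage img) {
  if (isDetecting) return;
  isDetecting = true;
  // ...run Tflite.runModelOnFrame(...) here, then set isDetecting = false...
}

Future<void> _pauseAndResumeStream() async {
  await controller.stopImageStream();                // "pause": frames stop arriving
  await Future.delayed(const Duration(seconds: 2));  // illustrative gap
  await controller.startImageStream(_processImage);  // resume on the same controller
}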
I'm coding a small game with the Flutter framework.
I'm using audioplayers for the sounds.
It works fine like this when calling it, for example, 2 times a second.
But when I call it more than 5 times, and again in the next second, at some point the sound gets a delay and then, after a second or so, all the sounds play at once :) That sounds weird.
I also tested the audioplayers example from GitHub on my iPhone. Repeating the sounds at low frequency is fine, but when I repeat clicking the button as fast as possible, at some point it gets some delay and the same thing happens.
Is there some way to stop the previous sound before playing the next one, or isn't this possible?
Or is there some other idea for how to deal with the sounds?
This is how I use it:
AudioCache upgradeSound = new AudioCache();

void playUpgradeSound() {
  _playUpgradeSound(upgradeSound);
}

void _playUpgradeSound(AudioCache ac) async {
  await ac.play('audio/upgr.mp3');
}
Thank you very much in advance
I solved a similar problem by having a singleton-style class; after the first play I can get the state and stop the previous playback.
class Audio {
  static final playerCache = AudioCache();
  static AudioPlayer audioPlayer;

  static void play(String audioName) async {
    audioPlayer = await playerCache.play('$audioName.mp3');
  }

  static void stop() {
    audioPlayer.stop();
  }
}
...
child: IconButton(
  onPressed: () {
    try {
      if (Audio.audioPlayer.state == AudioPlayerState.PLAYING) {
        Audio.stop();
      } else {
        Audio.play('bid');
      }
    } catch (e) {
      Audio.play('bid');
    }
  },
...
There is a relevant line of code in the package's own documentation.
To use the low latency API, better for gaming sounds, use:
AudioPlayer audioPlayer = AudioPlayer(mode: PlayerMode.LOW_LATENCY);
In this mode the backend won't fire any duration or position updates. Also, it is not possible to use the seek method to set the audio to a specific position. This mode is also not available on web.
I hope it is useful.
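For the AudioCache usage in the question, and depending on the audioplayers version, the cache's play() call may also accept the same mode parameter; a sketch under that assumption:

final AudioCache upgradeSound = AudioCache();

Future<void> playUpgradeSound() async {
  // LOW_LATENCY is intended for short, frequently repeated game sounds.
  await upgradeSound.play('audio/upgr.mp3', mode: PlayerMode.LOW_LATENCY);
}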
Essentially the app is like Snapchat. I take pics and reset back to camera mode; the issue comes when I record a video and reset. It goes back to camera mode, but the audio from the video keeps playing in the background. The functions are more or less the same as in the camera docs, with a few additions to reset the camera.
I added this:
_reset() {
  if (mounted)
    setState(() {
      if (this._didCapture) {
        this._didCapture = false;
        this._isRecording = false;
        this._isPosting = false;
        this._file = File('');
        this._fileType = null;
        this._captions.clear();
        this._textEditingControllers.clear();
        this._videoController = null;
        this._videoPlayerListener = null;
      }
    });
}
It works just fine, but the audio keeps playing in the background. I'm also wondering whether the video/picture is saved on the phone, which I don't want...
I had been looking for a similar answer, but I didn't find it. You could try to stop it by adding this to your function:
this._controller.setVolume(0.0);
That's what I did in my app.
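If muting is not enough, another option (assuming _videoController is a video_player VideoPlayerController, as in the camera docs the question mentions) is to actually stop and release the player before dropping the reference; a sketch:

// Run this (e.g. from an async variant of _reset) before clearing the reference in setState.
await this._videoController?.pause();
await this._videoController?.dispose();
this._videoController = null;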