Flutter: record front-facing camera at the exact same time as playing a video

I've been playing around with Flutter, trying to record the front-facing camera (using the camera plugin [https://pub.dev/packages/camera]) while simultaneously playing a video to the user (using the video_player plugin [https://pub.dev/packages/video_player]).
Next I use ffmpeg to horizontally stack the two videos together. This all works fine, but when I play back the final output the audio is slightly out of sync. I'm calling Future.wait([cameraController.startVideoRecording(filePath), videoController.play()]); but there is a slight delay before these tasks actually start. I don't even need them to fire at exactly the same time (which I'm realising is probably impossible); if I knew exactly when each task began, I could use the time difference to shift the audio into sync using ffmpeg or similar.
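For example, here's a rough sketch of what I mean by timing each task. Note the timestamps only mark when each future completes, which is not necessarily when the underlying work actually starts:
int? recordStart;
int? playStart;
await Future.wait([
  cameraController.startVideoRecording(filePath)
      .then((_) => recordStart = DateTime.now().microsecondsSinceEpoch),
  videoController.play()
      .then((_) => playStart = DateTime.now().microsecondsSinceEpoch),
]);
// A positive offset would mean recording began after playback.
final offsetMicros = recordStart! - playStart!;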
I've tried adding a listener on the videoController to see when isPlaying first returns true, and also watching the output directory for when the recorded video file appears on the filesystem:
var listener;
listener = () {
  if (videoController.value.isPlaying) {
    isPlaying = DateTime.now().microsecondsSinceEpoch;
    log('isPlaying $isPlaying');
    // Remove the listener only once playback has actually been observed.
    videoController.removeListener(listener);
  }
};
videoController.addListener(listener);
var watcher = DirectoryWatcher('${extDir.path}/');
watcher.events.listen((event) {
  if (event.type == ChangeType.ADD) {
    fileAdded = DateTime.now().microsecondsSinceEpoch;
    log('added $fileAdded');
  }
});
Then likewise for checking if the camera is recording:
var listener;
listener = () {
  if (cameraController.value.isRecordingVideo) {
    log('isRecordingVideo ${DateTime.now().microsecondsSinceEpoch}');
    //cameraController.removeListener(listener);
  }
};
cameraController.addListener(listener);
This results in (for example) the following order and microseconds for each event:
is playing: 1606478994797247
is recording: 1606478995492889 (695642 microseconds later)
added: 1606478995839676 (346787 microseconds later)
However, when I play back the combined video the sync is off by approximately 0.152 seconds, which doesn't match the time differences reported above.
Does anyone have any idea how I could accomplish near-perfect syncing when combining two videos? Thanks.

Related

Playlist completion and resetting the index in just_audio

I'm using the just_audio package. I have a playlist, and after playing all the tracks the current index remains equal to the index of the last track. Is there a way to make the index equal to 0 when the playlist ends?
final myPlayList = ConcatenatingAudioSource(
  children: listAudios,
);
await _player.setAudioSource(myPlayList, initialIndex: 0, preload: false);
Listen for the completed state and seek to index 0. Assuming you also want to stop playing at that point, call pause() too:
_player.processingStateStream.listen((state) async {
  if (state == ProcessingState.completed) {
    await _player.pause();
    await _player.seek(Duration.zero, index: 0);
  }
});
Regarding the call to pause(): just_audio's state is a composite of playing and processingState, and you will only hear audio when playing=true and processingState=ready, which is what you have during normal playback. The playing state only ever changes when you tell it to change, but the processingState may change independently depending on what is happening in the audio.
When the player reaches the end of the playlist, it transitions to playing=true, processingState=completed, hence no audio. If you then seek back to the beginning, it becomes playing=true, processingState=ready, so the audio automatically starts back up. This may surprise you, but it is comparable to a YouTube video: after it completes, seeking back to the beginning resumes playback (because it's now ready, not completed).
If you want it to stop playing, you need to actually set playing=false with either pause() or stop(); the former keeps resources in memory while the latter releases them.
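As a rough illustration (assuming _player is your AudioPlayer), you can watch the composite state and see that audio is only audible in one combination:
_player.playerStateStream.listen((state) {
  // Audible only while playing is true AND the processing state is ready.
  final audible = state.playing &&
      state.processingState == ProcessingState.ready;
  print('playing=${state.playing} '
      'processingState=${state.processingState} audible=$audible');
});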
For more information about the state model and its rationale, see The State Model section of the README.

Flutter video player seekTo beginning of a video takes time to play again on Android, works as expected on iOS

I'm developing a video app using the video_player package. I want to repeat each video n times. I'm doing this by adding a listener: when the video reaches the end and the current repetition is not yet equal to n, I simply call _videoPlayerController.seekTo(Duration.zero). It works as expected on iOS, but on Android it takes some time to play again, because of the buffer I guess. I have some logs. When I call seekTo, the first log is:
D/AudioTrack(26148): stop(59): called with 194560 frames delivered
After some time (3-4 seconds) the other logs arrive and the video plays after these logs:
D/AudioTrack(26148): getTimestamp_l(60): device stall time corrected using current time 21442303047753
D/AudioTrack(26148): getTimestamp_l(60): stale timestamp time corrected, currentTimeNanos: 21440694441753 < limitNs: 21442169902753 < mStartNs: 21442230902753
W/AudioTrack(26148): getTimestamp_l(60): retrograde timestamp time corrected, 21442169902753 < 21442303047753
I'm not sure the problem here is buffering, but if so, is there any way to prevent buffering when using seekTo on Android, or is there any other way to achieve this functionality?
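For reference, the repeat listener I described looks roughly like this (a simplified sketch; in practice you may need to guard against the end condition firing more than once):
int _repetition = 0;

void _onTick() {
  final value = _videoPlayerController.value;
  // When playback reaches the end, rewind until we've played n times.
  if (value.position >= value.duration && _repetition < n - 1) {
    _repetition++;
    _videoPlayerController.seekTo(Duration.zero);
    _videoPlayerController.play();
  }
}

_videoPlayerController.addListener(_onTick);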
I think this problem comes from the initialization timing. Try calling your initial seek code in the didChangeDependencies method, or wrap the initial code in a post-frame callback using the widgets binding:
WidgetsBinding.instance?.addPostFrameCallback((timeStamp) {
  _playController(index: 0, durationInSeconds: 5);
});
Alternatively, run the initialization in didChangeDependencies, which is called after initState:
@override
void didChangeDependencies() {
  super.didChangeDependencies();
  // Run the player initialization / first seek here.
}

Synchronizing Playback of Multiple Audio Files in AudioKit

I am developing a small audio sequencer application using AudioKit. I only need to play back 4 channels of audio, but I need them to be perfectly synchronized down to the sample level. When I run a test using just two audio files, I can hear that they are not synchronized. The difference is only a few samples, but even a one-sample discrepancy would be a problem. I am currently using multiple AKClipPlayer objects routed to an AKMixer object, and I play them with a basic for loop like this:
private var clipPlayers: [AKClipPlayer] = []

func play() {
    for player in clipPlayers {
        player.play()
    }
}
Is sample accurate playback timing of multiple audio files possible using AudioKit?
Yes, you need to schedule playback to start in the future with play(at:).
// This can take longer than expected, so do this before choosing a future time
clipPlayers.forEach { $0.prepare(withFrameCount: 10_000) }
let nearFuture = AVAudioTime.now() + 0.2
clipPlayers.forEach { $0.play(at: nearFuture) }
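Scheduling every player against the same AVAudioTime is what gives the sample-accurate start: each player's first output frame is aligned to the same host time. The 0.2-second lead just needs to be comfortably longer than the slowest prepare(withFrameCount:) call.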

Resume game from time at which it was paused

I am writing a game using CreateJS and distributing it with CocoonJS. The CocoonJS API includes a couple of listener functions that provide callbacks when the game is paused and resumed. The game's timer and (time-based) animations are driven by the Ticker event's "delta" property. The issue I'm having is that, on resuming the game after pausing it, the timer picks up from the time at which it paused plus the time spent while paused. For example, if I pause the game after 20 seconds for exactly 4 seconds, on resuming the timer carries on from 24 seconds (not 20 seconds, as intended). I've tried storing the ticker event's "runTime" property before pausing and then setting the ticker event's "runTime" to this stored value on resume, but this doesn't work.
A snippet of my original code (before tinkering) looks like the following:
createjs.Ticker.setFPS(60);
createjs.Ticker.on("tick", onTick, this);

Cocoon.App.on("activated", function() {
    console.log("---[[[[[ APP RESUMING ]]]]]---");
    createjs.Ticker.paused = false;
});

Cocoon.App.on("suspending", function() {
    console.log("---[[[[[ APP PAUSING ]]]]]---");
    createjs.Ticker.paused = true;
});

onTick = function (e) {
    if (!e.paused) {
        // e.delta is the ms elapsed since the last tick.
        animateUsingTicker(e.delta);
        countDownTimerUsingTicker(e.delta);
        //etc...
        stage.update();
    }
};
Can someone please assist me on this?
Many thanks
One really easy way to deal with this is to use Ticker.maxDelta. For example, if you are targeting 60fps you could set createjs.Ticker.maxDelta = 32 (roughly double the expected ~16ms delta) to prevent getting a huge delta value back on the first tick when resuming an app/game.

In iOS Core Audio, how do you find the true current play head position of a file player audio unit?

I have a program that uses a file player audio unit to play, pause and stop an audio file. I accomplish this by initializing the file player audio unit to play the file at position zero; then, when the user presses the pause button, I stop the AUGraph, capture the current position, and use that position as the start position when the user presses the play button. Everything works as it should, but every 3 or 4 times I hit pause and then play, the song starts playing half a second to a full second BEFORE the point where I hit pause.
I can't figure out why this is happening. Do any of you have any thoughts? Here is a simplified version of my code.
//initialize AUGraph and File player Audio unit
...
...
...
//Start AUGraph
...
...
...
// Pause playback
- (void)pauseAUGraph {
    // First stop the AUGraph.
    result = AUGraphStop(processingGraph);

    // Get the current play head position.
    AudioTimeStamp ts;
    UInt32 size = sizeof(ts);
    result = AudioUnitGetProperty(filePlayerUnit,
                                  kAudioUnitProperty_CurrentPlayTime,
                                  kAudioUnitScope_Global, 0, &ts, &size);

    // Save the play head position for use later.
    // It must be added to itself to handle multiple presses of the pause button.
    sampleFrameSavedPosition = sampleFrameSavedPosition + ts.mSampleTime;

    // This stops the file player unit from playing.
    AudioUnitReset(filePlayerUnit, kAudioUnitScope_Global, 0);
    NSLog(@"AudioUnitReset - stopped file player from playing");
}
// Stop playback
- (void)stopAUGraph {
    // Reset the saved play head position to zero so that on restart
    // we start from the beginning of the file.
    sampleFrameSavedPosition = 0;

    // Now stop the AUGraph.
    result = AUGraphStop(processingGraph);
}
Maybe you should use packet counts instead of timestamps, since you just want to pause and play the music, not display time information.
See BufferedAudioPlayer for an example of using this method.
It may be due to rounding problems in your code. For example, if every press of the pause button recorded a position 0.5/4 of a second (0.125 s) before the actual pause time, a single pause would still look correct; but after four presses the accumulated error would be 0.5/4 × 4 = 0.5 seconds, which is about the half second you seem to be experiencing.
So I would pay careful attention to the types you are using and make sure they don't round inappropriately. Try using a double-precision float for your sample times to alleviate the problem!
Hope this is clear and helpful! :)