How to know the delay of frames between 2 videos, to sync an audio from video 1 to video 2? - powershell

I have many videos that I want to compare one-to-one to check whether they are the same and, if so, to get the frame delay between them. What I do now is open both video files in VirtualDub and check manually, near the beginning of video 1, the position of a given frame, e.g., 4325. Then I find the same frame in video 2, e.g., at position 5500. That gives a delay of +1175 frames. Then near the end of video 1 I pick another frame, say at position 183038. I find it in video 2 as well (imagine it is at position 184213) and calculate the difference, again +1175: eureka, same video!
The frames I choose to compare aren't random: each must be one I can identify unambiguously in both videos (for example a scene change, an explosion that appears from one frame to the next, or a dark frame after a bright one). For the first comparison I always pick a frame within the first 10000 positions, and for the second check I pick one near the end.
What I do next is shift the audio from video 1 to match video 2 by calculating the number of milliseconds needed, but I don't need help with that. I'd love to automate the comparison so that I only have to select video 1 and video 2, nothing else; that way I could forget VirtualDub forever and save a lot of time.
I'm tagging this post as powershell too because I'm writing a script in which, at the moment, I have to enter the frame delay myself after comparing manually. It would be perfect if I could add this step at the beginning of the script.
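For reference, the manual arithmetic described above works out as follows (a Python sketch; the frame positions are the ones from the example, and the 25 fps frame rate is an assumption you would replace with the actual video frame rate):

```python
# Worked example of the manual delay calculation described above.
fps = 25.0                              # ASSUMED frame rate; use the real one

start_v1, start_v2 = 4325, 5500         # matching frame near the beginning
end_v1, end_v2 = 183038, 184213         # matching frame near the end

delay_start = start_v2 - start_v1       # +1175 frames
delay_end = end_v2 - end_v1             # +1175 frames

if delay_start == delay_end:
    # Same delay at both ends: same video. Convert the frame delay to ms
    # for the audio shift.
    delay_ms = delay_start / fps * 1000.0
    print(delay_start, delay_ms)        # 1175 47000.0
```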
Thanks!

Related

How to get acquired frames at full speed? - Image Event Listener does not seem to be executing after every event

My goal is to read out 1 pixel from the GIF camera in VIEW mode (live acquisition) and save it to a file every time the data is updated. The camera is ostensibly updating every 0.0001 seconds, because this is the minimum acquisition time Digital Micrograph lets me select in VIEW mode for this camera.
I can attach an Image Event Listener to the live image of the camera, with the message map (messagemap = "data_changed:MyFunctiontoExecute"), and MyFunctiontoExecute is successfully run, giving me a file with numerous pixel values.
However, if I let this event listener run for a second, I only obtain close to 100 pixel values, when I was expecting closer to 10,000 (if the live image is being updated every 0.0001 seconds).
Is this because the live image is not updated as quickly as I think?
The event listener certainly is executed at each event.
However, the live display of a high-speed camera will almost certainly not update for each acquired frame. It will perform some sort of cumulative or sampled display. The exact answer depends on the exact system you are on and how it is configured.
It should be noted that super-high frame rates can usually only be achieved by dedicated firmware and optimized systems. It's unlikely that a "general software approach" - in particular interpreted, non-compiled code - will be able to provide the necessary speed. This type of approach to the problem might be doomed from the start.
(Instead, one will likely have to create a buffer and then set up the system to acquire data directly into that buffer at the highest possible frame rate. This means coding the camera acquisition directly.)
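To put rough numbers on the sampled-display explanation (a sketch; the ~100 Hz display refresh rate is an assumption inferred from the observed counts, not something stated in the question):

```python
# Hypothetical numbers: a camera acquiring a frame every 0.0001 s whose
# live display only refreshes at ~100 Hz (ASSUMED display rate).
acquisition_interval = 0.0001      # seconds per acquired frame (10,000 fps)
display_rate = 100                 # display updates per second (assumed)

duration = 1.0                     # listening for one second
frames_acquired = duration / acquisition_interval
events_fired = duration * display_rate   # one data_changed event per display update

print(frames_acquired)  # 10000.0 frames actually acquired
print(events_fired)     # 100.0 listener callbacks -> matches the ~100 values seen
```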

Huge problems with reading in movie frames with MATLAB

I've been working on a project that reads in video frames, stores them in an array, and then performs operations on them. Each frame is split into 6 subsections that I have to analyze individually. I had previously been cropping the video beforehand and then loading it in. I now have the program let the user load in the whole movie and crop each sixth themselves, and the program then runs consecutively on each sixth. The problem is that MATLAB just crashes when loading in this now 6-times more pixel-dense video (it's about 120k frames). Assuming I can get the user to specify the 6 cropping areas beforehand, is there any way to load in only a specific area of the movie at a time? Rather than storing the whole frame, only store a sixth (unlike now, where I store the whole frame and THEN crop out a sixth; just store a sixth right off the bat)?
VideoReader does not allow you to load part of a frame into memory. However, it does allow you to load only certain frames from the video into MATLAB instead of loading the entire video. I agree with sam that loading 120K frames of video into MATLAB is a very bad idea. Consider using the READ syntax that lets you specify the start and stop frames, so that you read the video in chunks, after which you can use array indexing to slice each frame into 6 portions.
Dinesh
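The chunked read-and-crop strategy can be sketched as follows (in Python/NumPy rather than MATLAB, since the idea is language-independent; frame size, chunk size, and grid layout are made-up example values):

```python
import numpy as np

def crop_sixth(frame, k):
    """Return the k-th (0..5) region of a frame split into a 2x3 grid."""
    h, w = frame.shape[:2]
    row, col = divmod(k, 3)
    return frame[row * h // 2:(row + 1) * h // 2,
                 col * w // 3:(col + 1) * w // 3]

# Stand-in for reading one chunk of frames (in MATLAB this would be the
# READ(vidObj, [start stop]) syntax mentioned above).
chunk = np.zeros((10, 480, 720), dtype=np.uint8)   # 10 frames of 480x720

# Keep only the desired sixth of each frame as it is read, instead of
# storing whole frames and cropping later.
region = 4
cropped = np.stack([crop_sixth(f, region) for f in chunk])
print(cropped.shape)   # (10, 240, 240) -- one sixth of the pixels per frame
```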

How to transition from a prerecorded video to real time video?

I have come up with an algorithm in MATLAB that lets me recognize hand gestures in prerecorded videos. Now I would like to run the same code on real-time video, but I am not sure how to proceed after these 2 lines:
vid=videoinput('winvideo',1);
preview(vid);
(real time video is on)
I am thinking about a loop: while the video is on, repeatedly snap images in order to analyze them.
for k = 1:numFrames
    % my code is applied here
end
So, I would like to know how to make this transition from prerecorded videos to real-time video.
Your help is much appreciated!
I would suggest you first verify whether your algorithm can perform acquisition plus gesture recognition in real time. To do that, first read video frames in a loop, render or save them, and measure the reading-and-rendering overhead of a single frame, say t1. Also measure the time your algorithm takes to process one image, say t2. The throughput (number of frames processed per second) of your system will be
throughput = 1/(t1 + t2)
It is also important to know how many frames you need to recognize a gesture. First compute the minimum number of images you need to identify a gesture in a given time, then verify in real time whether you can process that number of images in the same time.
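As a worked example of the feasibility check above (a sketch with assumed timings; in practice t1 and t2 come from measuring your own read/render loop and your own algorithm):

```python
# ASSUMED timings for illustration only:
t1 = 0.005   # seconds to read + render one frame
t2 = 0.020   # seconds for the algorithm to process one frame

throughput = 1.0 / (t1 + t2)   # frames processed per second
print(throughput)              # 40.0 fps with these assumed timings

# Feasibility check: can we process enough frames to identify a gesture?
frames_per_gesture = 30        # ASSUMED frames needed to identify a gesture
gesture_window = 1.0           # seconds in which the gesture must be recognized
feasible = throughput >= frames_per_gesture / gesture_window
print(feasible)                # True: 40 fps covers 30 frames per second
```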

How Do I set multiple File Player Audio Units to start at the same time?

Right now I have a loop that goes through an array of file player audio units and tells each one what position in the audio file to start playing from (this works). In the same loop I have the following code to tell the units when to start playing (-1 makes them play on the next render cycle). The problem is that they do not start at the same time, because the first track starts playing before I have had a chance to tell the third track to play. What I want to say is: "track 1, you play in exactly 5 cycles; track 2, you play in exactly 4 cycles; track 3, you play in exactly 3 cycles..." so that they all start at the same time. Is this the right approach? If so, what value do you set for startTime.mSampleTime? I have not found any documentation that tells me how to do this. Thanks
// tell the file player AU when to start playing (-1 sample time means next render cycle)
AudioTimeStamp startTime;
memset (&startTime, 0, sizeof(startTime));
startTime.mFlags = kAudioTimeStampSampleTimeValid;
startTime.mSampleTime = -1;
AudioUnitSetProperty(fileUnitArray[mycount], kAudioUnitProperty_ScheduleStartTimeStamp, kAudioUnitScope_Global, 0, &startTime, sizeof(startTime));
I was not able to track down any information on setting mSampleTime to any value other than -1 (i.e., start on the next cycle), but I was able to work around the problem. Instead of keeping the AUGraph running, resetting the file player audio units with AudioUnitReset, and then using the code above to restart the file players, I store the current play-head position of each file player, stop the AUGraph completely, reinitialize the AUGraph with the stored play-head position instead of telling it to start at position zero, and then restart the AUGraph.
What you need to do is schedule the playback of the audio files from the render thread.
You load the files on a different thread, but you tell the files to play on the next render cycle from the render thread. That way all of your files will start at the same time.

MPMoviePlayerController: key frame issue

I have a MPMoviePlayerController in my project.
The documentation says that the following call:
moviePlayer.initialPlaybackTime = time;
starts at the closest key frame prior to the provided time.
Is it possible to start playing video from the specified time (not from the nearest key frame)?
No, it really isn't. Temporally compressed video streams can generally only start playback on a keyframe, because inter-frames depend on the keyframe for rendering. If seekability is important to you, consider making files with smaller keyframe intervals.
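The snapping behaviour can be illustrated like this (a sketch assuming a hypothetical stream with a keyframe exactly every 2.0 seconds; real streams may have irregular keyframe placement):

```python
# ASSUMED: one keyframe every 2.0 seconds in the hypothetical stream.
keyframe_interval = 2.0

def effective_start(requested):
    """Playback actually begins at the closest keyframe before the request."""
    return (requested // keyframe_interval) * keyframe_interval

print(effective_start(7.3))   # 6.0 -- playback snaps back 1.3 s
print(effective_start(8.0))   # 8.0 -- the request already lands on a keyframe
```

A smaller keyframe interval shrinks the worst-case snap-back, at the cost of a larger file.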