Distortion in making NetLogo video - netlogo

I am making a video of a simulation in NetLogo. The total length of the video is around 30 minutes. When I play the movie, it works fine for the first couple of minutes, then the picture starts distorting, and after some time a black screen appears. I tried making the video at different frame rates (i.e. 6, 12 and 24), but every time I got the same behavior in the movie. Any suggestions?

Related

Using hls.js to play a live stream with a rolling live window, how do you keep WebVTT captions aligned?

I can play a live stream with the hls.js player using the playlists master.m3u8, video.m3u8 and metadata.m3u8. The video is created with ffmpeg's HLS muxer, using a rolling live window with these args:
-hls_list_size 20 -hls_time 2 -hls_flags delete_segments
This creates video fragments from video0.ts through video19.ts, then starts removing the oldest fragments as it adds new ones. The video.m3u8 eventually looks like...
#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:2
#EXT-X-MEDIA-SEQUENCE:25
#EXTINF:2.002133,
video25.ts
#EXTINF:2.002133,
video26.ts
...
My metadata.m3u8 playlist looks similar, though I am creating it from a separate source. The video and metadata playlists are kept in sync and play fine from the start of the live stream.
#EXTM3U
#EXT-X-VERSION:6
#EXT-X-TARGETDURATION:2
#EXT-X-MEDIA-SEQUENCE:25
#EXTINF:2.000
sub_25.vtt
#EXTINF:2.000
sub_26.vtt
...
The problem starts when I reload the player page. On a reload, the player loads the playlists and will play correctly at the current live point.
I see it load fragments around video45.ts and sub_45.vtt. This seems correct: the media sequence in video.m3u8 is at 25, and adding the 20-fragment playlist size puts the live position around the 45th fragment, which is about 90 seconds into the live stream.
However, the media time in the player shows 40 seconds. It seems to derive that figure from the number of fragments in the playlist alone, even though the real live time is 90 seconds.
The resulting problem is that this 40-second media time is used to look up the text track cues, so the player shows the captions for the 40-second mark, not the 90-second mark where the video actually is.
Is there a way to get the player to reflect the 'real' time despite the rolling live window, so that the captions (which are correctly loaded) display at the correct time?
Or is rolling-window live playback simply not going to work with VTT subtitles?
If I disable the rolling window, I can reload the live stream many times; the 'full' live time loads and the captions line up fine.
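One thing worth trying (this is an assumption about your pipeline, not something I can verify from here): stamp both playlists with #EXT-X-PROGRAM-DATE-TIME tags. ffmpeg's HLS muxer can emit them via -hls_flags delete_segments+program_date_time, and my understanding is that hls.js uses program-date-time to align subtitle fragments with video fragments by wall-clock time rather than by playlist position, which survives the rolling window. The dates below are purely illustrative:

#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:2
#EXT-X-MEDIA-SEQUENCE:25
#EXT-X-PROGRAM-DATE-TIME:2023-06-01T12:00:50.050Z
#EXTINF:2.002133,
video25.ts
#EXT-X-PROGRAM-DATE-TIME:2023-06-01T12:00:52.052Z
#EXTINF:2.002133,
video26.ts
...

With matching tags in metadata.m3u8, the player can place cues by program date rather than by offset from the start of the trimmed playlist.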

iOS - trying to understand how movies work

I am working on an application that needs tight control of time flow during movie recording.
Apple says the iPhone 5 can capture HD video at up to 30 fps. If I shoot a video and play it in QuickTime, I see a variable FPS that reaches 30 fps at some moments, yet at the same time QuickTime reports the video as being 29.75 fps.
As far as I understand, an integer number of frames should be displayed for each second of video, not a fractional number. I first thought this could be related to dropped frames, so I designed a method to measure them and found that for every second of video the iPhone drops from 1 to 4 frames. I also discovered that every time a frame is dropped, the iPhone simply copies the last frame again to fill the gap, so in theory dropping a frame should make no difference to the total number of frames a movie has.
So, this is my problem: what is this 29.75 fps telling me, and how is this number obtained?
It's not so much that x frames are shown per second; rather, each frame is shown for 1/x seconds. NTSC (the TV standard in the US, Japan and others) runs at 29.97 fps, so each frame is shown for a bit more than 3/100ths of a second before the next frame is drawn. In your case, each frame is displayed for roughly 0.0336 seconds before the next one is shown.
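For what it's worth, the 29.75 figure is typically just the clip's average rate: total frame count divided by duration, which naturally comes out fractional on a variable-rate capture. A minimal sketch of reading it back with AVFoundation, assuming a local clip at a hypothetical path:

import AVFoundation

let asset = AVAsset(url: URL(fileURLWithPath: "movie.mov")) // hypothetical clip
if let track = asset.tracks(withMediaType: .video).first {
    // nominalFrameRate is the track's average rate (frames / duration),
    // so a variable-rate iPhone capture can legitimately report 29.75 here.
    let fps = track.nominalFrameRate
    print("average fps: \(fps), per-frame duration: \(1.0 / Double(fps)) s")
}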

Playing a real time video stream from iPhone camera on a 20 second delay

I am trying to see if it is possible to record video from the iPhone's camera and write it to a file, then have the video start playing on the screen a set time later. This all needs to happen continuously. For example, I want the video on the screen to always be 20 seconds behind what the camera is recording.
Some background:
I have a friend who is a coach and would like his players to be able to see their last play. This could be accomplished by a feed going from an iPad to a TV, always 20 seconds behind what is being recorded. It needs to run continuously until practice is over. (I would connect the iPad to the TV either with a cable or via AirPlay to an Apple TV.) The video never needs to be saved and can just be discarded after playing.
Is this even possible with the APIs AVFoundation offers? Will the iPhone let you write to a file and read from it at the same time? Or is there a better way to accomplish this?
Thanks for your time.
Instead of writing to a file, how about saving your frames in a circular buffer big enough to hold X seconds of video?
The way I would start is to look at what AVCaptureVideoDataOutput and its delegate methods provide (that's where you can get the frame data).
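A minimal sketch of that circular-buffer idea, assuming a capture session already configured to feed this delegate; the class name and the fixed 30 fps figure are mine, and memory is the real constraint (600 uncompressed frames is a lot, so in practice you'd use a low session preset or compress the frames):

import AVFoundation

final class DelayedPreview: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    let displayLayer = AVSampleBufferDisplayLayer() // add as a sublayer of your view
    private var frames: [CMSampleBuffer] = []       // FIFO of recent frames
    private let delayFrameCount = 20 * 30           // ~20 s at an assumed 30 fps

    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        // NOTE: the capture pipeline recycles its pixel buffers, so a real
        // app would copy or compress frames rather than retain 600 of them.
        frames.append(sampleBuffer)
        guard frames.count > delayFrameCount else { return } // still filling the delay window
        let oldest = frames.removeFirst()
        // Mark the frame for immediate display so the layer ignores
        // its original capture timestamp.
        if let attachments = CMSampleBufferGetSampleAttachmentsArray(oldest, createIfNecessary: true),
           CFArrayGetCount(attachments) > 0 {
            let dict = unsafeBitCast(CFArrayGetValueAtIndex(attachments, 0), to: CFMutableDictionary.self)
            CFDictionarySetValue(dict,
                                 Unmanaged.passUnretained(kCMSampleAttachmentKey_DisplayImmediately).toOpaque(),
                                 Unmanaged.passUnretained(kCFBooleanTrue).toOpaque())
        }
        displayLayer.enqueue(oldest)
    }
}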

AVPlayer - Benny Hill style speeded up video issue

I'm using AVPlayer to show a live stream. Works fine. Part of my app requires a modal view to appear over the video. Actually, it flips the video view to show the modal view. I want the video's audio to keep playing, which it does fine.
However, the problem comes when I return to the video. The visuals speed up for a few seconds, Benny Hill style, to make up for the period they were off screen, and then the video plays normally. Is there a way to stop this speed-up? If not, can I hide the visual element until the sped-up bit is over?
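I haven't verified this against your exact setup, but one way to avoid the catch-up is to jump straight to the live edge when the video view reappears instead of letting the player race through what it missed. A sketch, assuming you can hook the moment the view returns (resumeAtLiveEdge is my name, not an AVFoundation API):

import AVFoundation

func resumeAtLiveEdge(_ player: AVPlayer) {
    // The last seekable range ends at the current live edge.
    guard let range = player.currentItem?.seekableTimeRanges.last?.timeRangeValue else { return }
    player.seek(to: CMTimeRangeGetEnd(range),
                toleranceBefore: .zero,
                toleranceAfter: .zero)
}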

How to play a video slowly for marking

I am creating an application for coaching and I am stuck on marking up the video. I chose ffmpeg to convert the video into image frames, but that introduces delays as well as memory issues. I need to let the user play the video slowly, frame by frame. Is there any other way to do that without image conversion? V1 Golf does this very quickly. Please help me.
I would try converting video frames on a separate thread, extracting a few frames ahead as images in the background when the user enters 'slow motion mode'.
Here is an example for one frame, so the others should be quick: Video frame capture by NSOperation.
This should reduce delays, since the next frames can be converted while the user is still looking at the current one.
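A sketch of the same idea using AVAssetImageGenerator directly, which avoids the external ffmpeg conversion step entirely. The ten-frame lookahead, the 0.1 s spacing, and the prefetchFrames/onFrame names are my assumptions, not V1 Golf's actual approach:

import AVFoundation

func prefetchFrames(from asset: AVAsset, startingAt start: CMTime,
                    onFrame: @escaping (CGImage) -> Void) {
    let generator = AVAssetImageGenerator(asset: asset)
    // Zero tolerance so we get the exact frames, not the nearest keyframes.
    generator.requestedTimeToleranceBefore = .zero
    generator.requestedTimeToleranceAfter = .zero
    // Request the next ten frames, assumed to be 0.1 s apart.
    let times = (0..<10).map { i in
        NSValue(time: CMTimeAdd(start, CMTime(seconds: 0.1 * Double(i), preferredTimescale: 600)))
    }
    generator.generateCGImagesAsynchronously(forTimes: times) { _, image, _, result, _ in
        if result == .succeeded, let image = image {
            onFrame(image) // cache or display while the user steps through
        }
    }
}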