How to stop animating GIF in SDWebImage? - swift

I'm using SDWebImage to fetch images from the web and display them. But some of them are GIFs. How can I stop the GIF from animating?

What you can do to stop the animation is simply set your SDAnimatedImageView's image to a single frame, for example:
sdAnimatedImageView.image = SDAnimatedImage(named: "your.gif")?.animatedImageFrame(at: 0)
That way you get only one frame of your GIF, and it will not be animated. It's a hack, but it does the job.

Assuming you are using >= v4.0; from the docs:
Starting with the 4.0 version, we rely on FLAnimatedImage to take care of our animated images.
From FLAnimatedImage's docs, there are two ways to control playback:
// An FLAnimatedImageView can take an FLAnimatedImage and plays it automatically when in view hierarchy and stops when removed.
// The animation can also be controlled with the UIImageView methods -start/stop/isAnimating.
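A minimal Swift sketch of the second approach, assuming imageView is an FLAnimatedImageView (or an SDAnimatedImageView, which also subclasses UIImageView) that is already displaying an animated image:

// Freeze playback on the current frame.
imageView.stopAnimating()

// Resume playback later.
imageView.startAnimating()

// Query playback state.
let playing = imageView.isAnimating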

Related

flutter: record a video that has same duration as animated webp

what I'd like to do:
I would like to record a video using Flutter's CameraController that has the same duration as an animated webp. At the top of my screen, the animated webp is playing, and below it there is a CameraPreview() widget that records whatever my camera catches. This recorded video's duration should be exactly as long as the animated webp's duration.
what I've tried so far:
Since Giphy offers not only a webp-version, but also an mp4-version, I downloaded the mp4 version and used ffmpeg to get the duration of that file.
I then started recording with VideoController.startVideoRecording() and used a timer to call VideoController.stopVideoRecording() automatically after this duration.
what I'd expect to happen:
I'd expect this recorded video to be as long as the animated webp. Unfortunately, it's not.
So, my question is:
Do you guys have any idea how I could manage to record a video with the same duration as an animated webp?
Thanks :)
OK, I sort of found what the issue is: webp files (and also GIFs) in Flutter are played more slowly than in browsers. I don't know if that is the case for all webps and GIFs, but the ones I tested all animate faster in a desktop browser than in Flutter. So the animation time of those webps is not the same as the playing time of the respective .mp4 file.
I use those mp4 versions now instead, and that does the job.

How do I create an AVAsset with a UIImage captured from a camera?

I am a newbie trying to capture camera video images using AVFoundation, and I want to render the captured frames without using AVCaptureVideoPreviewLayer. I want a slider control to be able to slow down or speed up the rate at which camera images are displayed.

Using other people's code as examples, I can capture images and, using an NSTimer driven by my slider control, define on the fly how often to display them, but I can't convert the image into something I can display. I want to move these images into a UIView or UIImageView and render them in the timer's fire function.

I have looked at Apple's AVCam app (which uses an AVCaptureVideoPreviewLayer), but because it has its own built-in AVCaptureSession, I can't adjust how often the images are displayed. (Well, you can adjust the preview layer's frame rate, but that can't be done on the fly.)

I have looked at the AVFoundation Programming Guide, which talks about AVAssets and AVPlayer, etc., but I can't see how a camera image can be turned into an AVAsset. When I look at the AVFoundation guide and other demos that show how to define an AVAsset, they only give me the choice of creating the asset from HTTP stream data or from a URL pointing to an existing file. I can't figure out how to turn my captured UIImage into an AVAsset, in which case I guess I could use an AVPlayer, AVPlayerItems, and AVAssetTracks to show the image, with an observeValueForKeyPath function checking status and doing [myPlayer play]. (I also studied the WWDC session 405 "Exploring AV Foundation" to see how that is done.)

I have tried code similar to that in the WWDC session 409 "Using the Camera on iPhone." Like that FindMyiCone demo, I can set up the device, the input, the capture session, the output, and a callback that receives a CMSampleBuffer, and I can collect UIImages and size them, etc. At this point I want to send that image to a UIView or UIImageView. Session 409 just talks about doing it with CFShow(sampleBuffer). This wasn't explained, and I guess it just assumes a knowledge of Core Foundation I don't yet have. I think I am turning the captured output in the sample buffer into a UIImage, but I can't figure out how to render it. I created an IBOutlet UIImageView in my nib file, but when I try to stuff the image into that view, nothing gets displayed. Do I need an AVPlayerLayer?

I have looked at UIImagePickerController as an alternate method of controlling how often I display captured camera images, and I don't see that I can change the display timing on the fly using that controller either.

So, as you can see, I am learning this stuff from the Apple developer forum, the documentation, the WWDC videos, and various websites such as stackoverflow.com, but I have yet to see any example of going from camera to screen without using AVCaptureVideoPreviewLayer or UIImagePickerController, or without an AVAsset that isn't already a file or HTTP stream.
Can anybody make a suggestion? Thanks in advance.
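For reference, a minimal Swift sketch of one common approach: convert the CMSampleBuffer to a UIImage in the AVCaptureVideoDataOutput delegate callback and hand it to the UIImageView on the main thread. (A frequent culprit when "nothing gets displayed" is updating UIKit from the capture queue instead of the main thread.) The name imageView and the surrounding class context are assumptions:

import AVFoundation
import CoreImage
import UIKit

// AVCaptureVideoDataOutputSampleBufferDelegate callback, running on the capture queue.
func captureOutput(_ output: AVCaptureOutput,
                   didOutput sampleBuffer: CMSampleBuffer,
                   from connection: AVCaptureConnection) {
    guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
    let ciImage = CIImage(cvPixelBuffer: pixelBuffer)
    let context = CIContext()  // in real code, create one CIContext and reuse it
    guard let cgImage = context.createCGImage(ciImage, from: ciImage.extent) else { return }
    let image = UIImage(cgImage: cgImage)
    DispatchQueue.main.async {
        self.imageView.image = image  // imageView: the IBOutlet UIImageView from the question
    }
}

No AVAsset or AVPlayerLayer is needed for this; an AVAsset only comes into play once you have a complete media file or stream to play back.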

MPMoviePlayer or AVPlayer Frame advance

Is there a way to playback a video file frame by frame using either MPMoviePlayer or AVPlayer?
Or even another movie player that I do not know about?
Here is what I want to do. I want to load a video into a fullscreen player and move the content one frame at a time based on user interaction. This will need to be pretty solid as I would need to accurately control what frame the movie player was displaying at any one time.
Ideally I would love to know if it were possible to load a video and control the frame displayed using code.
I know that I could do this using a UIImageView animation, but tests show that this uses FAR too much memory.
When using AVPlayer, you can use the - (void)stepByCount:(NSInteger)stepCount method of its current AVPlayerItem to step forward or backward.
AVPlayer *mPlayer = [AVPlayer playerWithURL:url];
[mPlayer.currentItem stepByCount:1];
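A rough Swift equivalent, assuming url points to the video; stepping is only meaningful while the player is paused, and canStepForward/canStepBackward report whether the current item supports it:

import AVFoundation

let player = AVPlayer(url: url)
player.pause()
if player.currentItem?.canStepForward == true {
    player.currentItem?.step(byCount: 1)   // advance exactly one frame
}
if player.currentItem?.canStepBackward == true {
    player.currentItem?.step(byCount: -1)  // step back one frame
}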

How to play a video slowly for marking

I am creating an application for coaching. I am stuck on marking up the video. I chose ffmpeg to convert the video into image frames, but that introduces a time delay as well as memory issues. I need to let the user play the video slowly, frame by frame. Is there any other way to do that without converting to images? V1 Golf does that process very quickly. Please help me.
I would try converting video frames in a separate thread, extracting a few frames ahead as images in the background when the user enters 'slow motion mode'.
Here is an example for one frame, so you should be quick with the others: Video frame capture by NSOperation.
This should reduce delays, since frames can be converted while the user is still viewing the ones already extracted.
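A hedged Swift sketch of that read-ahead idea, using AVAssetImageGenerator instead of ffmpeg; videoURL, the 30 fps assumption, and the ten-frame window are illustrative:

import AVFoundation
import UIKit

let asset = AVAsset(url: videoURL)
let generator = AVAssetImageGenerator(asset: asset)
generator.requestedTimeToleranceBefore = .zero  // request exact frames, not nearby keyframes
generator.requestedTimeToleranceAfter = .zero

// Pre-request the next ten frames, assuming 30 fps.
let times = (0..<10).map { NSValue(time: CMTime(value: CMTimeValue($0), timescale: 30)) }
generator.generateCGImagesAsynchronously(forTimes: times) { _, cgImage, _, result, _ in
    if result == .succeeded, let cgImage = cgImage {
        let frame = UIImage(cgImage: cgImage)
        // Cache `frame` so it is ready by the time the user scrubs to it.
        _ = frame
    }
}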

Is it possible to have my own scrubber for MPMoviePlayerController?

I want to play videos. I am using MPMoviePlayer, but I don't want to use the controls provided by MPMoviePlayer, so I am trying to create my own custom controls. All the functionality like play, pause, fullscreen, forward, and backward is done. The only problem is with the scrubber. I have a UISlider, but I don't know exactly how to work with it. How do I track the currently playing video's time? How do I play the video from wherever I slide the thumb of the slider?
If anyone knows this kindly help me in this.
Thanks in advance.
I was having a similar problem. I figured out how to create custom movie controls and put the code up on GitHub. Let me know if that helps. Feel free to ask me any questions if you want details.
First, we should note that all of this is possible in iOS 3.2+, if you are OK not to support iOS 3.1.x.
In iOS 3.2+, MPMoviePlayerController implements the MPMediaPlayback protocol, meaning that it responds to play, stop, etc., all the controls you would expect -- sounds like you already have some of this working. Please see the reference for the MPMediaPlayback protocol.
To get the MPMoviePlayerController to stop showing its own controls, do this on initialization:
yourPlayer.controlStyle = MPMovieControlStyleNone;
Finally, to get the scrubber to work, you need to set the UISlider's valueChanged: callback to something and update the value of the currentPlaybackTime property. If you want to seek 10 seconds in:
yourPlayer.currentPlaybackTime = 10;
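Putting it together, a hedged Swift sketch (MPMoviePlayerController has since been deprecated, but this matches the question's API; player, slider, and the driving timer are illustrative names):

import MediaPlayer
import UIKit

// Called periodically, e.g. from a repeating Timer, to keep the thumb in sync.
func syncSlider() {
    slider.maximumValue = Float(player.duration)
    slider.value = Float(player.currentPlaybackTime)
}

// Wired to the slider's valueChanged event to seek.
@IBAction func sliderChanged(_ sender: UISlider) {
    player.currentPlaybackTime = TimeInterval(sender.value)
}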