Duration of GIF in Swift

I am setting a GIF in a UIImageView on my splash screen, as per the following code.
let url = Bundle.main.url(forResource: "Splash", withExtension: "gif")
self.imgGifSplash.sd_setImage(with: url!, completed: { (img, error, type, url) in
    // This block is called when the GIF has been loaded into the UIImageView,
    // not when it has finished playing once.
    DispatchQueue.main.asyncAfter(deadline: .now() + 3.2) { // Change 3.2 to the desired number of seconds.
        if let _ = UserData.shared.loginData?.token {
            let VC = mainStoryboard.instantiateViewController(withIdentifier: "HomeVCNav") as! UINavigationController
            objAppDelegate.window?.rootViewController = VC
        } else {
            let VC = mainStoryboard.instantiateViewController(withIdentifier: "LoginNav") as! UINavigationController
            objAppDelegate.window?.rootViewController = VC
        }
    }
})
I want the GIF to play only once, so I tried using a 3.2-second timer, but on different devices I get different results: sometimes it plays exactly once, sometimes twice, and sometimes it doesn't even finish one pass.
How can I achieve an animated splash screen that plays the GIF exactly once?
Hope to get a good answer. Thanks in advance.
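One way to avoid the hard-coded delay would be to read the GIF's actual duration from its frame metadata and schedule the transition after exactly one playthrough. A minimal sketch using ImageIO, assuming the asset is a standard GIF in the bundle (the helper name gifDuration is hypothetical, not from the question):

import Foundation
import ImageIO

// Sums the per-frame delays of a GIF so the splash transition can be
// scheduled after exactly one playthrough.
func gifDuration(at url: URL) -> TimeInterval? {
    guard let source = CGImageSourceCreateWithURL(url as CFURL, nil) else { return nil }
    var duration: TimeInterval = 0
    for index in 0..<CGImageSourceGetCount(source) {
        guard
            let properties = CGImageSourceCopyPropertiesAtIndex(source, index, nil) as? [CFString: Any],
            let gifProperties = properties[kCGImagePropertyGIFDictionary] as? [CFString: Any]
        else { continue }
        // Prefer the unclamped delay; fall back to the clamped value.
        duration += (gifProperties[kCGImagePropertyGIFUnclampedDelayTime] as? TimeInterval)
            ?? (gifProperties[kCGImagePropertyGIFDelayTime] as? TimeInterval)
            ?? 0
    }
    return duration > 0 ? duration : nil
}

With that, the asyncAfter delay in the snippet above could be gifDuration(at: url!) ?? 3.2 instead of a constant, so the transition fires after one full pass regardless of device.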

Related

Play icon missing in AVPlayer on iOS 14

I am working on a Swift project using Xcode 12, and after the iOS 14 upgrade the play icon is missing from AVPlayer. The issue is very strange: when I load the app for the first time I can see the play icon (ScreenShot1), but after going back to the home screen and returning, the play icon is no longer visible (ScreenShot2).
Currently, the AVPlayer is embedded inside a UIImageView. Below is the code:
imageView.isUserInteractionEnabled = true
mediaPlayer.view.translatesAutoresizingMaskIntoConstraints = false
imageView.addSubview(mediaPlayer.view)
mediaPlayer.view.pin(to: imageView)
guard let itemURL = URL(string: asset.AssetURL) else { return }
let item = AVPlayerItem(url: itemURL)
let player = AVPlayer(playerItem: item)
mediaPlayer.player = player
mediaPlayer.videoGravity = .resizeAspectFill
mediaPlayer.entersFullScreenWhenPlaybackBegins = true
mediaPlayer.exitsFullScreenWhenPlaybackEnds = true

How do you add an overlay while recording a video in Swift?

I am trying to record, and then save, a video in Swift using AVFoundation. This works. I am also trying to add an overlay, such as a text label containing the date, to the video.
For example, the saved video should contain not only what the camera sees, but the timestamp as well.
Here is how I am saving the video:
func fileOutput(_ output: AVCaptureFileOutput, didFinishRecordingTo outputFileURL: URL, from connections: [AVCaptureConnection], error: Error?) {
    saveVideo(toURL: movieURL!)
}

private func saveVideo(toURL url: URL) {
    PHPhotoLibrary.shared().performChanges({
        PHAssetChangeRequest.creationRequestForAssetFromVideo(atFileURL: url)
    }) { (success, error) in
        if success {
            print("Video saved to Camera Roll.")
        } else {
            print("Video failed to save.")
        }
    }
}
I have a movieOutput that is an AVCaptureMovieFileOutput. My preview layer does not contain any sublayers. I tried adding the timestamp label's layer to the previewLayer, but this did not succeed.
I have tried Ray Wenderlich's example as well as this Stack Overflow question. Lastly, I also tried this tutorial, all to no avail.
How can I add an overlay to my video that is in the saved video in the camera roll?
Without more information, it sounds like what you are asking for is a WATERMARK, not an overlay.
A watermark is markup rendered into the video and saved with it.
An overlay is generally shown as subviews on the preview layer and is not saved with the video.
Check this out here: https://stackoverflow.com/a/47742108/8272698
func addWatermark(inputURL: URL, outputURL: URL, handler: @escaping (_ exportSession: AVAssetExportSession?) -> Void) {
    let mixComposition = AVMutableComposition()
    let asset = AVAsset(url: inputURL)
    let videoTrack = asset.tracks(withMediaType: AVMediaType.video)[0]
    let timerange = CMTimeRangeMake(kCMTimeZero, asset.duration)
    let compositionVideoTrack: AVMutableCompositionTrack = mixComposition.addMutableTrack(withMediaType: AVMediaType.video, preferredTrackID: CMPersistentTrackID(kCMPersistentTrackID_Invalid))!

    do {
        try compositionVideoTrack.insertTimeRange(timerange, of: videoTrack, at: kCMTimeZero)
        compositionVideoTrack.preferredTransform = videoTrack.preferredTransform
    } catch {
        print(error)
    }

    let watermarkFilter = CIFilter(name: "CISourceOverCompositing")!
    let watermarkImage = CIImage(image: UIImage(named: "waterMark")!)
    let videoComposition = AVVideoComposition(asset: asset) { (filteringRequest) in
        let source = filteringRequest.sourceImage.clampedToExtent()
        watermarkFilter.setValue(source, forKey: "inputBackgroundImage")
        let transform = CGAffineTransform(translationX: filteringRequest.sourceImage.extent.width - (watermarkImage?.extent.width)! - 2, y: 0)
        watermarkFilter.setValue(watermarkImage?.transformed(by: transform), forKey: "inputImage")
        filteringRequest.finish(with: watermarkFilter.outputImage!, context: nil)
    }

    guard let exportSession = AVAssetExportSession(asset: asset, presetName: AVAssetExportPreset640x480) else {
        handler(nil)
        return
    }

    exportSession.outputURL = outputURL
    exportSession.outputFileType = AVFileType.mp4
    exportSession.shouldOptimizeForNetworkUse = true
    exportSession.videoComposition = videoComposition
    exportSession.exportAsynchronously { () -> Void in
        handler(exportSession)
    }
}
And here's how to call the function:
let outputURL = NSURL.fileURL(withPath: "TempPath")
let inputURL = NSURL.fileURL(withPath: "VideoWithWatermarkPath")
addWatermark(inputURL: inputURL, outputURL: outputURL, handler: { (exportSession) in
    guard let session = exportSession else {
        // Error
        return
    }
    switch session.status {
    case .completed:
        guard NSData(contentsOf: outputURL) != nil else {
            // Error
            return
        }
        // Now you can find the video with the watermark at outputURL
    default:
        // Error
        break
    }
})
Let me know if this code works for you.
It is in Swift 3, so some changes will be needed.
I am currently using this code in an app of mine and have not updated it to Swift 5 yet.
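For reference, a hedged sketch (not part of the original answer) of the Swift 5 spellings of the time APIs used above; everything else in the function can stay as written:

import AVFoundation

// Swift 3 -> Swift 5 spellings for the time APIs in the answer above:
//   kCMTimeZero                              -> CMTime.zero (or just .zero)
//   CMTimeRangeMake(kCMTimeZero, d)          -> CMTimeRange(start: .zero, duration: d)
//   insertTimeRange(_:of:at: kCMTimeZero)    -> insertTimeRange(_:of:at: .zero)
func modernTimeRange(for asset: AVAsset) -> CMTimeRange {
    return CMTimeRange(start: .zero, duration: asset.duration)
}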
I do not have an actual development environment for Swift that can utilize AVFoundation. Thus, I can't provide you with any example code.
To add metadata (date, location, timestamp, watermark, frame rate, etc.) as an overlay to the video while recording, you would have to process the video feed frame by frame, live, while recording. Most likely you would have to store the frames in a buffer and process them before actually recording them.
When it comes to the metadata, there are two types: static and dynamic. Static metadata such as a watermark is easy enough, as every frame gets the same thing.
However, for dynamic metadata such as a timestamp or GPS location, there are a few things to take into consideration. It takes computational power and time to process video frames, so depending on the type of dynamic data and how you obtain it, the processed value may not be correct. For example, say you get a frame at 1:00:01, process it, and add a timestamp to it, and pretend that processing took 2 seconds. The next frame arrives at 1:00:02, but you cannot process it until 1:00:03 because the previous frame took 2 seconds to process. Depending on how you obtain the timestamp for that new frame, its value may not be the one you wanted.
For dynamic metadata you should also take hardware lag into consideration. For example, suppose the software is supposed to add live GPS location data to each frame and there was no lag in development or testing. In real life, however, a user runs the software in an area with a bad connection and their phone lags while obtaining the GPS location, sometimes for as long as 5 seconds. What do you do in that situation? Do you set a timeout for the GPS location and use the last good position? Do you report an error? Do you defer that frame to be processed later when the GPS data becomes available (this may ruin live recording), or use an expensive algorithm to try to predict the user's location for that frame?
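To make the frame-by-frame idea concrete, here is a rough sketch (assumed, not from the original answer) of receiving live frames through an AVCaptureVideoDataOutput delegate and compositing an overlay image onto each one; writing the stamped frames out, for example with AVAssetWriter, is omitted, and the class name FrameStamper is hypothetical:

import AVFoundation
import CoreImage

// Receives each captured frame and composites a pre-rendered overlay
// (e.g. a timestamp label rendered to a CIImage) on top of it.
// Handing the stamped buffer to an AVAssetWriter is not shown.
final class FrameStamper: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    private let context = CIContext()
    private let overlayImage: CIImage

    init(overlayImage: CIImage) {
        self.overlayImage = overlayImage
        super.init()
    }

    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
        let frame = CIImage(cvPixelBuffer: pixelBuffer)
        // Composite the overlay on top of the camera frame.
        let stamped = overlayImage.composited(over: frame)
        // Render the result back into the original pixel buffer so a writer could consume it.
        context.render(stamped, to: pixelBuffer)
    }
}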
Besides those considerations, here are some references that I think may help you. The one from medium.com looks pretty good:
https://medium.com/ios-os-x-development/ios-camera-frames-extraction-d2c0f80ed05a
Adding watermark to currently recording video and save with watermark
Render dynamic text onto CVPixelBufferRef while recording video
Adding on to @Kevin Ng's answer, you can do an overlay on video frames with a UIViewController and a UIView.
The UIViewController will have:
a property to work with the video stream
    private var videoSession: AVCaptureSession?
a property to work with the overlay (the UIView class)
    private var myOverlay: MyUIView { view as! MyUIView }
a property to work with the video output queue
    private let videoOutputQueue = DispatchQueue(label: "outputQueue", qos: .userInteractive)
a method to create the video session
a method to process and display the overlay
The UIView will have the task-specific helper methods needed to act as the overlay. For example, if you are doing hand detection, this overlay class can have helper methods to draw points at coordinates (the ViewController class detects the coordinates of hand features, does the necessary coordinate conversions, then passes the coordinates to the UIView class to display them as an overlay), as in the sketch below.
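A hedged sketch of what such an overlay UIView might look like (the class and method names are hypothetical, not from the original answer):

import UIKit

// Transparent, touch-passthrough view: the view controller converts detected
// feature coordinates into this view's coordinate space and hands them over;
// the view just draws them.
final class PointsOverlayView: UIView {
    private var points: [CGPoint] = []

    override init(frame: CGRect) {
        super.init(frame: frame)
        backgroundColor = .clear          // let the camera preview show through
        isUserInteractionEnabled = false  // do not swallow touches
    }

    required init?(coder: NSCoder) {
        super.init(coder: coder)
        backgroundColor = .clear
        isUserInteractionEnabled = false
    }

    // Called by the view controller with already-converted coordinates.
    func show(points: [CGPoint]) {
        self.points = points
        setNeedsDisplay()
    }

    override func draw(_ rect: CGRect) {
        guard let context = UIGraphicsGetCurrentContext() else { return }
        context.setFillColor(UIColor.systemGreen.cgColor)
        for point in points {
            context.fillEllipse(in: CGRect(x: point.x - 4, y: point.y - 4, width: 8, height: 8))
        }
    }
}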

Failed to get video thumbnail from AVPlayer using Fairplay HLS

I'm trying to build a custom progress bar for a video player app on tvOS, and would like to show thumbnails of the video while the user scrubs through it.
I'm using AVPlayer and FairPlay HLS to play remote video files. I've tried two methods: one with AVAssetImageGenerator's copyCGImage, and the other with AVPlayerItemVideoOutput's copyPixelBuffer method. Both return nil.
When I tried with a local video file, the first method worked.
Method 1:
let imageGenerator = AVAssetImageGenerator(asset: playerItem.asset)
let progressSeconds = playerItem.duration.seconds * Double(progress)
let time = CMTime(seconds: progressSeconds, preferredTimescale: 5)
if let imageRef = try? imageGenerator.copyCGImage(at: time, actualTime: nil) {
    image = UIImage(cgImage: imageRef)
}
Method 2:
let videoThumbnailsOutput = AVPlayerItemVideoOutput(pixelBufferAttributes: [String(kCVPixelBufferPixelFormatTypeKey): NSNumber(value: kCVPixelFormatType_32BGRA)])
player?.currentItem?.add(videoThumbnailsOutput)
if let pixelBuffer = videoThumbnailsOutput.copyPixelBuffer(forItemTime: time, itemTimeForDisplay: nil) {
    let ciImage = CIImage(cvPixelBuffer: pixelBuffer)
    image = UIImage(ciImage: ciImage)
}
Any ideas what I'm doing wrong or is there any other way?
Thanks!
This is usually done by making use of the trick play stream associated with your actual stream.
https://en.wikipedia.org/wiki/Trick_mode
You can find it declared with the key EXT-X-I-FRAME-STREAM-INF in the manifest of your HLS stream. A regex may be needed to parse its value:
"#EXT-X-I-FRAME-STREAM-INF[^#]*URI=[^#]*"
Once you have the URL of the trick play stream, you can use a paused AVPlayer instance as the thumbnail, and when the user swipes left or right, seek that player to show the right frame.
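A rough sketch of that approach, assuming you have already downloaded the master playlist as a String; the regex is adapted from the one above to capture the quoted URI value, and names such as thumbnailPlayer are hypothetical:

import AVFoundation

// Pulls the trick play (I-frame) playlist URI out of an HLS master playlist.
// Assumes the URI attribute is an absolute, quoted URL.
func trickPlayURL(fromMasterPlaylist playlist: String) -> URL? {
    let pattern = "#EXT-X-I-FRAME-STREAM-INF[^#]*URI=\"([^\"]*)\""
    guard
        let regex = try? NSRegularExpression(pattern: pattern),
        let match = regex.firstMatch(in: playlist, options: [], range: NSRange(playlist.startIndex..., in: playlist)),
        let uriRange = Range(match.range(at: 1), in: playlist)
    else { return nil }
    return URL(string: String(playlist[uriRange]))
}

// A paused player on the trick play stream acts as the thumbnail source;
// seek it as the user scrubs.
let thumbnailPlayer = AVPlayer()

func showThumbnail(forMasterPlaylist playlist: String, at time: CMTime) {
    guard let url = trickPlayURL(fromMasterPlaylist: playlist) else { return }
    if thumbnailPlayer.currentItem == nil {
        thumbnailPlayer.replaceCurrentItem(with: AVPlayerItem(url: url))
        thumbnailPlayer.pause()
    }
    thumbnailPlayer.seek(to: time, toleranceBefore: .zero, toleranceAfter: .zero)
}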

SpriteKit scene deformed after modal presentation swift

Could use some help troubleshooting an issue with a SpriteKit scene.
I have a scene that displays some coins in the main section of the app.
When I present a view controller from the bottom I have no issue; same for tab bar navigation.
Here is the view as it should always be displayed.
The issue comes only when I present a view controller from the side.
When the new view controller is dismissed, the scene works but is distorted.
This is how it is displayed after a view controller is presented modally and later dismissed.
EDIT: I forgot to mention that if I swipe vertically on the distorted scene, the distortion is fixed and all is good.
Here is some of the code in viewDidAppear of the viewcontroller.
Thanks for the help.
EDIT 2:
I just tested the app on an iPhone 5 running iOS 10 and the issue doesn't happen. Any chance this might be iOS 11 related?
func configureScene(_ completion: () -> Void) {
    defer { completion() }
    guard wScene == nil else { return }

    let skView = SKView(frame: self.view.frame)
    skView.isUserInteractionEnabled = false
    skView.backgroundColor = .clear
    wScene = WScene(size: view.frame.size)
    wScene.backgroundColor = .clear
    skView.presentScene(wScene)
    view.insertSubview(skView, belowSubview: collectionView)

    if let buttonsObstacle = doubleButton?.buttonsView {
        let obstacleSize = CGSize(width: buttonsObstacle.frame.width, height: buttonsObstacle.frame.height)
        obstacle = SKSpriteNode(color: .clear, size: obstacleSize)
        guard let obstacle = obstacle else { return }
        obstacle.name = WScene.obstacleNodeName

        let convertedOrigin = view.convert(buttonsObstacle.center, from: buttonsObstacle.superview)
        let skConvertedOrigin = skView.convert(convertedOrigin, to: wScene)
        obstacle.position = skConvertedOrigin
        obstacle.physicsBody = SKPhysicsBody(rectangleOf: obstacleSize)
        obstacle.physicsBody?.allowsRotation = false
        obstacle.physicsBody?.isDynamic = false

        source.scrollHandler = { [weak self] (scrollView) in
            guard let strongSelf = self else { return }
            strongSelf.buttonsMoved(inView: skView, withScroll: scrollView)
        }

        wScene.addChild(obstacle)
        presenter.loadData()
    }
}
I solved my issue.
It was related to the new iOS 11 adjustedContentInset property.
My coin SpriteKit scene was being moved by the scroll handler when the view appeared after a modal transition.
My solution is to disable scrolling for the first 0.1 seconds after the view appears. That way iOS 11 no longer touches the coins, while users can still scroll normally because in practice they only interact with the view after at least 0.1 seconds; see the sketch below.
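A minimal sketch of that workaround, assuming the scroll view involved is reachable as collectionView (the property name is not confirmed by the original post):

override func viewDidAppear(_ animated: Bool) {
    super.viewDidAppear(animated)
    // Briefly disable scrolling so the iOS 11 adjustedContentInset update that
    // follows the modal dismissal cannot trigger the scroll handler and move the scene.
    collectionView.isScrollEnabled = false
    DispatchQueue.main.asyncAfter(deadline: .now() + 0.1) { [weak self] in
        self?.collectionView.isScrollEnabled = true
    }
}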

Play MP4 using MPMoviePlayerController() in Swift

I can't for the life of me figure out a way to play an MP4 that takes up the entire background of a UIViewController.
So far I have the following, which doesn't play the video at all.
I can confirm that the bokeh.mp4 video exists, because if I change the file name to something else it throws an error that the file is missing.
override func viewDidAppear(animated: Bool) {
    let filePath = NSBundle.mainBundle().pathForResource("bokeh", ofType: "mp4")
    self.moviePlayerController.contentURL = NSURL(string: filePath)
    self.moviePlayerController.prepareToPlay()
    self.moviePlayerController.repeatMode = .One
    self.moviePlayerController.controlStyle = .Embedded
    self.view.addSubview(self.moviePlayerController.view)
}
I get an error in the console:
2014-06-22 21:22:42.347 MoviePlayer[26655:60b] _itemFailedToPlayToEnd: {
    kind = 1;
    new = 2;
    old = 0;
}
I've also tried adding a UIView that takes up the entire screen and adding the player to that view, but it's the same problem. It doesn't start playing.
I'm trying to achieve the same effect that Vine has when you first load up the application where it has the video playing in the background.
So this was blindingly annoying:
All I did was change:
self.moviePlayerController.contentURL = NSURL(string: filePath)
TO:
self.moviePlayerController.contentURL = NSURL.fileURLWithPath(filePath)
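For clarity, a sketch of the question's snippet with that one-line fix applied (everything else unchanged, still using the old MPMoviePlayerController API from the question):

override func viewDidAppear(animated: Bool) {
    let filePath = NSBundle.mainBundle().pathForResource("bokeh", ofType: "mp4")
    // A path on disk must be wrapped as a file URL, not parsed as a URL string.
    self.moviePlayerController.contentURL = NSURL.fileURLWithPath(filePath)
    self.moviePlayerController.prepareToPlay()
    self.moviePlayerController.repeatMode = .One
    self.moviePlayerController.controlStyle = .Embedded
    self.view.addSubview(self.moviePlayerController.view)
}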