I am trying to display an image on top of a video for the first few seconds only. I am adding the image to a CALayer and then attempting to hide it using a CABasicAnimation. I have tried a few different iterations of the code below and cannot get the image to disappear. I have also tried setting the delegate property and having the parent view conform to CAAnimationDelegate with an implementation of animationDidStop; however, that seems to fire only once, when the view is rendered, and not at the desired time within the video. What is the proper way to perform animations when constructing videos using AVMutableComposition?
let mixComposition = AVMutableComposition()
let videoLayer = CALayer()
videoLayer.frame = CGRect(x: 0, y: 0, width: 414, height: 414)
let parentLayer = CALayer()
parentLayer.frame = CGRect(x: 0, y: 0, width: 414, height: 414)
parentLayer.addSublayer(videoLayer)
let imageLayer = CALayer()
imageLayer.backgroundColor = UIColor.green.cgColor
imageLayer.frame = CGRect(x: 100, y: 100, width: 50, height: 50)
imageLayer.contents = UIImage(named: "music")?.cgImage
let fadeOut = CABasicAnimation(keyPath: "opacity")
fadeOut.fromValue = 1.0
fadeOut.toValue = 0.0
fadeOut.duration = 2.0
fadeOut.setValue("video", forKey: "fadeOut")
fadeOut.fillMode = CAMediaTimingFillMode.forwards
imageLayer.add(fadeOut, forKey: nil)
parentLayer.addSublayer(imageLayer)
// Main video composition instruction
let mainInstruction = AVMutableVideoCompositionInstruction()
mainInstruction.timeRange = CMTimeRangeMake(start: CMTime.zero, duration: insertTime)
mainInstruction.layerInstructions = arrayLayerInstructions
mainInstruction.backgroundColor = UIColor.systemPink.cgColor
// Main video composition
let mainComposition = AVMutableVideoComposition()
mainComposition.instructions = [mainInstruction]
mainComposition.frameDuration = CMTimeMake(value: 1, timescale: 30)
mainComposition.renderSize = outputSize!
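For what it's worth, with AVVideoCompositionCoreAnimationTool the layer animations are evaluated on the video's timeline during the export, not on screen, which is why a CAAnimationDelegate on the parent view fires when the view is laid out rather than at the intended point in the video. A minimal sketch of the pieces usually added on top of the code above (the 3-second delay is only an illustrative value):
// In addition to the properties already set on fadeOut above:
fadeOut.beginTime = AVCoreAnimationBeginTimeAtZero + 3.0 // a beginTime of exactly 0 is replaced with CACurrentMediaTime(), so AVCoreAnimationBeginTimeAtZero anchors the animation to the start of the video
fadeOut.isRemovedOnCompletion = false // keep the final opacity for the rest of the video
// The layer tree is only rendered into the export if the tool is attached to the composition:
mainComposition.animationTool = AVVideoCompositionCoreAnimationTool(postProcessingAsVideoLayer: videoLayer, in: parentLayer)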
Related
I am trying to place an image layer on top of a video layer using CALayer.
For that, a player view is added to the video layer, which in turn is added to a parent layer. Next I am trying to add the image layer to the player view via the contentOverlayView property, but it is nil. How can I add the image layer to the video layer?
let videoURL = URL(string: "https://clips.vorwaerts-gmbh.de/big_buck_bunny.mp4")
var player: AVPlayer? = nil
if let videoURL = videoURL {
player = AVPlayer(url: videoURL)
}
let playerView = AVPlayerView(frame: self.view.frame)
playerView.player = player
let videoLayer = CALayer()
videoLayer.frame = playerView.frame
videoLayer.contents = playerView
let parentLayer = CALayer()
parentLayer.frame = self.view.frame
//playerView.player?.play()
//Image
let overlayImage = NSImageView(frame:NSRect(x: 10, y: 30, width: 72, height: 72))
let img = NSImage(contentsOfFile:"/Users/Documents/app/mages/image.png")
overlayImage.image = img
//playerView.addSubview(overlayImage)
let imageLayer = CALayer()
imageLayer.contents = img
imageLayer.frame = NSRect(x: 50, y: 30, width: 72, height: 72)
// parentLayer.addSublayer(videoLayer)
// parentLayer.addSublayer(imageLayer)
// let overlayVIew = NSView(frame: NSRect(x: 10, y: 30, width: 100, height: 100))
// playerView.contentOverlayView = overlayVIew
// self.view.layer = parentLayer
self.view.addSubview(playerView)
print("layering \(playerView.contentOverlayView)")
if let layer = playerView.contentOverlayView?.layer {
print("layer is available")
layer.addSublayer(imageLayer)
}
I want to know how to add an image on top of a video using CALayer on macOS.
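For a live player (as opposed to an export), one approach that sidesteps contentOverlayView entirely is to back a plain NSView with an AVPlayerLayer and add the image layer as a sibling sublayer. A minimal sketch, reusing the player and img created above (AppKit and AVFoundation assumed to be imported):
let containerView = NSView(frame: self.view.bounds)
containerView.wantsLayer = true // give the view a backing CALayer
let playerLayer = AVPlayerLayer(player: player)
playerLayer.frame = containerView.bounds
playerLayer.videoGravity = .resizeAspect
containerView.layer?.addSublayer(playerLayer)
let imageLayer = CALayer()
imageLayer.contents = img // on macOS an NSImage can be assigned directly to layer contents
imageLayer.frame = NSRect(x: 50, y: 30, width: 72, height: 72)
containerView.layer?.addSublayer(imageLayer) // added last, so it draws above the video
self.view.addSubview(containerView)
player?.play()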
I am trying to make a square video using animationTool.
See the code below. The video comes out enlarged (https://i.stack.imgur.com/HscTk.jpg); how can I fix it?
let exportSession = AVAssetExportSession(asset: composition, presetName: AVAssetExportPresetHighestQuality)
let strFilePath: String = generateMergedVideoFilePath()
try? FileManager.default.removeItem(atPath: strFilePath)
exportSession?.outputURL = URL(fileURLWithPath: strFilePath)
exportSession?.outputFileType = .mp4
exportSession?.shouldOptimizeForNetworkUse = true
let mutableVideoComposition = AVMutableVideoComposition(propertiesOf: composition)
mutableVideoComposition.instructions = instructions
mutableVideoComposition.frameDuration = CMTimeMake(value: 1, timescale: 30)
mutableVideoComposition.renderSize = CGSize(width: 1080, height: 1080)
let parentLayer = CALayer()
parentLayer.frame = CGRect(x: 0, y: 0, width: 1080, height: 1080)
let videoLayer = CALayer()
videoLayer.frame.size = videoSize
videoLayer.position = parentLayer.position
videoLayer.contentsGravity = .resizeAspectFill
parentLayer.addSublayer(videoLayer)
mutableVideoComposition.animationTool = AVVideoCompositionCoreAnimationTool(postProcessingAsVideoLayer: videoLayer, in: parentLayer)
mutableVideoComposition.renderSize = CGSize (width: 1080, height: 1080)
With this render size the video is taken relative to the top, so I moved the transform up:
let coeConst = videoAssetWidth / videoAssetHeight
transform = transform.translatedBy(x: -(videoAssetHeight - videoAssetHeight * coeConst) / 2, y: 0)
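As a side note, contentsGravity on videoLayer likely has no effect on the export: the composed frames are produced at renderSize and drawn into videoLayer's frame, so the scaling has to happen in the track's layer instruction. A sketch of one way to aspect-fit the track into the square (layerInstruction is an assumed name for the AVMutableVideoCompositionLayerInstruction of the video track; the track's preferredTransform is ignored for simplicity):
let renderSide: CGFloat = 1080
let scale = min(renderSide / videoSize.width, renderSide / videoSize.height) // use max(...) instead to fill and crop
let scaledSize = CGSize(width: videoSize.width * scale, height: videoSize.height * scale)
var fit = CGAffineTransform(scaleX: scale, y: scale)
fit = fit.concatenating(CGAffineTransform(translationX: (renderSide - scaledSize.width) / 2, y: (renderSide - scaledSize.height) / 2)) // center the scaled frame inside the square
layerInstruction.setTransform(fit, at: .zero)
videoLayer.frame = CGRect(x: 0, y: 0, width: renderSide, height: renderSide) // the video layer itself should cover the full render area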
Can someone tell me how I can animate a cross dissolve transition while changing the initial frame?
My code:
self.image = initialImage
UIView.transition(with: self, duration: 10.0, options: [.transitionCrossDissolve, .allowUserInteraction], animations: {
self.image = newImage
})
Also tried:
let transition = CATransition()
transition.duration = 10
transition.timingFunction = CAMediaTimingFunction(name: kCAMediaTimingFunctionEaseInEaseOut)
transition.type = kCATransitionFade
transition.delegate = self
self?.layer.add(transition, forKey: nil)
Changing the frame size while the animation runs leaves the initial image at the initial frame size, while the newImage adapts to the updated frame.
Thank you.
For an easier approach, create two image view instances and animate the alpha of the first one, while changing the frame for both of them.
let firstImageView = UIImageView(image: UIImage(named: "first"))
firstImageView.frame = CGRect(x: 100, y: 100, width: 200, height: 200)
let secondImageView = UIImageView(image: UIImage(named: "second"))
secondImageView.frame = firstImageView.frame
// the first image view must end up on top so it can fade out and reveal the second one, which is why the second one comes first in the array (and is added below it)
let imageViews = [secondImageView, firstImageView]
imageViews.forEach { view.addSubview($0) }
let newFrame = CGRect(x: 50, y: 50, width: 100, height: 100)
UIView.animate(withDuration: 10) {
firstImageView.alpha = 0
imageViews.forEach { $0.frame = newFrame }
}
I have an array of type [UIBezierPath] that I convert into CGPaths and then animate onto a CAShapeLayer I called shapeLayer. For some reason all the paths get drawn upside down. How can I fix this? I tried a couple of methods, but sadly none of them worked. I did, however, figure out how to scale the path. This is my code to draw swiftPath, a path made up of the UIBezierPaths returned by swiftBirdForm() in the Forms class. Drawing the path works fine; I just can't figure out how to flip it vertically.
@objc func drawForm() {
var swiftPath = Forms.swiftBirdForm()
let shapeLayer = CAShapeLayer()
shapeLayer.fillColor = UIColor.clear.cgColor
shapeLayer.strokeColor = UIColor.black.cgColor
shapeLayer.lineWidth = 1
shapeLayer.frame = CGRect(x: -120, y: 120, width: 350, height: 350)
let paths: [UIBezierPath] = swiftPath
guard let path = paths.first else { return }
paths.dropFirst().forEach { path.append($0) }
shapeLayer.transform = CATransform3DMakeScale(0.6, 0.6, 1)
shapeLayer.path = path.cgPath
self.view.layer.addSublayer(shapeLayer)
let strokeEndAnimation = CABasicAnimation(keyPath: "strokeEnd")
strokeEndAnimation.duration = 1.0
strokeEndAnimation.fromValue = 0.0
strokeEndAnimation.toValue = 1.0
shapeLayer.add(strokeEndAnimation, forKey: nil)
}
Using CATransform3D:
shapeLayer.transform = CATransform3DMakeScale(1, -1, 1)
Transforming the path:
let shapeBounds = shapeLayer.bounds
let mirror = CGAffineTransform(scaleX: 1, y: -1)
let translate = CGAffineTransform(translationX: 0, y: shapeBounds.size.height)
let concatenated = mirror.concatenating(translate)
bezierPath.apply(concatenated)
shapeLayer.path = bezierPath.cgPath
Transforming the layer:
let shapeFrame = CGRect(x: -120, y: 120, width: 350, height: 350)
let mirrorUpsideDown = CGAffineTransform(scaleX: 1, y: -1)
shapeLayer.setAffineTransform(mirrorUpsideDown)
shapeLayer.frame = shapeFrame
I am animating my GMSMarker so that it pulses once every couple of seconds.
func addWave()
{
// circleView.layer.cornerRadius = size / 2
//Scale
let scaleAnimation = CABasicAnimation(keyPath: "transform.scale")
scaleAnimation.fromValue = 1
scaleAnimation.toValue = zoom
//Opacity
let alphaAnimation = CABasicAnimation(keyPath: "opacity")
alphaAnimation.toValue = 0.0
//Corner radius
// let cornerRadiusAnimation = CABasicAnimation(keyPath: "cornerRadius")
// cornerRadiusAnimation.fromValue = size / 2
// cornerRadiusAnimation.toValue = (size * zoom)/2
//Animation Group
let animations: [CAAnimation] = [scaleAnimation, alphaAnimation]
let animationGroup = CAAnimationGroup()
animationGroup.duration = duration
animationGroup.animations = animations
animationGroup.timingFunction = CAMediaTimingFunction(name: kCAMediaTimingFunctionEaseOut)
animationGroup.repeatCount = 1
animationGroup.fillMode = kCAFillModeForwards
animationGroup.isRemovedOnCompletion = false
circleView.layer.add(animationGroup, forKey: "group")
}
The result looks like this:
And if I uncomment the corner radius section, it looks like this:
So I need some advice.
Based on my observations, I think it must be an issue with a wrong path in your CAShapeLayer, hence the circle being masked. I just wrote a radar animation; I hope it helps.
let scaleAnimation = CABasicAnimation(keyPath: "transform.scale")
scaleAnimation.fromValue = 0
scaleAnimation.toValue = 1
let alphaAnimation = CABasicAnimation(keyPath: "opacity")
alphaAnimation.fromValue = 1
alphaAnimation.toValue = 0
let animations = CAAnimationGroup()
animations.duration = 0.8
animations.repeatCount = Float.infinity
animations.timingFunction = CAMediaTimingFunction(name: kCAMediaTimingFunctionEaseOut)
animations.animations = [scaleAnimation, alphaAnimation]
circleView.layer.add(animations, forKey: "animations")
What I did was use two CAShapeLayers (one is the orange marker and the other is the radar layer behind the marker). The animations were applied to the radar layer.
radarLayer = CAShapeLayer()
radarLayer.frame = CGRect(x: 0, y: 0, width: self.frame.size.width, height: self.frame.size.height);
radarLayer.path = UIBezierPath(rect: radarLayer.frame).cgPath
radarLayer.fillColor = UIColor.orange.cgColor
radarLayer.position = CGPoint(x: self.frame.size.width/2, y: self.frame.size.height/2)
radarLayer.cornerRadius = radarLayer.frame.size.width/2
radarLayer.masksToBounds = true
radarLayer.opacity = 0
self.layer.addSublayer(radarLayer)
circleLayer = CAShapeLayer()
circleLayer.frame = CGRect(x: 0, y: 0, width: 16, height: 16);
circleLayer.path = UIBezierPath(rect: circleLayer.frame).cgPath
circleLayer.fillColor = UIColor.orange.cgColor
circleLayer.position = CGPoint(x: self.frame.size.width/2, y: self.frame.size.height/2)
circleLayer.cornerRadius = circleLayer.frame.size.width/2
circleLayer.masksToBounds = true
self.layer.addSublayer(circleLayer)
PS. I'm new to Swift, so enlighten me if I'm wrong somewhere :)
Example : http://i.imgur.com/v2jFWgw.jpg
The following is an excerpt from the Google documentation on markers:
The view behaves as if clipsToBounds is set to YES, regardless of its actual value. You can apply transforms that work outside the bounds, but the object you draw must be within the bounds of the object. All transforms/shifts are monitored and applied. In short: subviews must be contained within the view.
The consequence of that, at least for my case, was exactly the behaviour mentioned in the question: a squared circle animation. The image I wanted to add inside the marker, at the end of the animation, was bigger than the container, so once added to the marker, it was clipped to the bounds of it.
The following is what I did:
public var pulseImageView: UIImageView = {
let imageView = UIImageView(frame: CGRect(x: 0, y: 0, width: 300, height: 300))
imageView.image = PaletteElements.pulseLocation.value
imageView.contentMode = .center
let pulseAnimation = CABasicAnimation(keyPath: "transform.scale.xy")
pulseAnimation.repeatCount = Float.infinity
pulseAnimation.fromValue = 0
pulseAnimation.toValue = 2.0
pulseAnimation.isRemovedOnCompletion = false
let fadeOutAnimation = CABasicAnimation(keyPath: "opacity")
fadeOutAnimation.duration = 2.5
fadeOutAnimation.fromValue = 1.0
fadeOutAnimation.toValue = 0
fadeOutAnimation.repeatCount = Float.infinity
let animationGroup = CAAnimationGroup()
animationGroup.duration = 2.5
animationGroup.animations = [pulseAnimation, fadeOutAnimation]
animationGroup.timingFunction = CAMediaTimingFunction(name: CAMediaTimingFunctionName.easeOut)
animationGroup.repeatCount = .greatestFiniteMagnitude
animationGroup.fillMode = CAMediaTimingFillMode.forwards
animationGroup.isRemovedOnCompletion = false
imageView.layer.add(animationGroup, forKey: "pulse")
return imageView
}()
I defined a var inside my helper class so I can retrieve the image view whenever I need it. My original pulse image was 116x116, so I created the image view as 300x300 with contentMode = .center, so the small image sits in the center and is not stretched. I chose 300x300 because my animation scales the pulse image up to 2x its initial size (116 x 2 = 232), which leaves room for the entire animation to be performed.
Finally, I added it as a marker to the map:
let pulseMarker = GMSMarker(position: userLocation.coordinate)
pulseMarker.iconView = pulseImageView
pulseMarker.groundAnchor = CGPoint(x: 0.5, y: 0.5)
pulseMarker.map = googleMapView
Hope it can help.