How can I set the offset of a video in Swift 4? seek(to:) doesn't seem to work - swift

I am having an issue setting the offset of an AVQueuePlayer video. I have tried the seek(to:) function to set the offset, but it doesn't seem to work: the video always starts from 0 seconds. The other requirements, playing it with controls and looping back, are working fine.
I am kind of stuck at playing the video from any point other than 0 seconds.
func getVideoView() -> UIView {
    let videoViewContainer = UIView(frame: CGRect(x: 0, y: 0, width: 375, height: 375))
    let videoUrl = URL(string: "https://myvideourl.mp4")
    let item = AVPlayerItem(url: videoUrl!)
    player = AVQueuePlayer()
    playerLooper = AVPlayerLooper(player: player!, templateItem: item)
    let time = CMTime(seconds: 17.000000, preferredTimescale: CMTimeScale(1))
    player?.seek(to: time, completionHandler: { (handler) in
    })
    item.forwardPlaybackEndTime = CMTimeMake(20, 1) // For playing it for 20 seconds
    let layer: AVPlayerLayer = AVPlayerLayer(player: player)
    layer.frame = videoViewContainer.bounds
    layer.videoGravity = AVLayerVideoGravity.resizeAspectFill
    videoViewContainer.layer.addSublayer(layer)
    player?.play()
    return videoViewContainer
}

I got the answer: seek(to:) doesn't seem to work when used with AVPlayerLooper.
AVPlayerLooper itself takes a time range.
Example given below:
playerLooper = AVPlayerLooper(player: self.player!,
                              templateItem: item!,
                              timeRange: CMTimeRange(start: CMTime(seconds: Double(start), preferredTimescale: 1),
                                                     duration: CMTime(seconds: Double(duration), preferredTimescale: 1)))
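For reference, here is a minimal self-contained sketch of the same idea, looping only a sub-range of the video (the URL and the 17-20 second range are placeholders taken from the question; seek(to:) is not needed because the looper's timeRange controls where each loop starts and ends):

```swift
import AVFoundation

// Sketch: loop only seconds 17-20 of a video with AVPlayerLooper.
// The URL is a placeholder; attach an AVPlayerLayer to show the output.
let videoURL = URL(string: "https://myvideourl.mp4")!
let item = AVPlayerItem(url: videoURL)
let player = AVQueuePlayer()

// A timescale of 600 divides evenly by common frame rates;
// a timescale of 1 would truncate times to whole seconds.
let start = CMTime(seconds: 17, preferredTimescale: 600)
let duration = CMTime(seconds: 3, preferredTimescale: 600)
let range = CMTimeRange(start: start, duration: duration)

let looper = AVPlayerLooper(player: player, templateItem: item, timeRange: range)
player.play()
```

Keep a strong reference to the looper (e.g. a property), because AVPlayerLooper stops looping when it is deallocated.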

Related

SKVideoNode seamless loop using AVPlayerLooper

Hi, can anyone tell me why using AVPlayerLooper on an SKVideoNode causes a slight delay between loops, whereas putting it on an AVPlayerLayer is seamless? I need to use SKVideoNode due to the placement of the layer in an SKScene, but I also need it to be seamless. It works perfectly with AVPlayerLayer, but that then sits in front of the SKScene.
This is seamless...
let fileUrl = Bundle.main.url(forResource: "chicken", withExtension: "mp4")!
let asset = AVAsset(url: fileUrl)
let playerItem = AVPlayerItem(asset: asset)
queuePlayer = AVQueuePlayer(playerItem: playerItem)
playerLayer = AVPlayerLayer(player: queuePlayer)
playerLooper = AVPlayerLooper(player: queuePlayer, templateItem: playerItem)
playerLayer.frame = CGRect(x: 0, y: 525 * hR, width: view.frame.width, height: 294 * hR)
playerLayer.videoGravity = AVLayerVideoGravity.resize
playerLayer.zPosition = -1
view.layer.addSublayer(playerLayer)
queuePlayer.play()
This is not seamless:
let fileUrl = Bundle.main.url(forResource: "chicken", withExtension: "mp4")!
let asset = AVAsset(url: fileUrl)
let playerItem = AVPlayerItem(asset: asset)
queuePlayer = AVQueuePlayer(playerItem: playerItem)
playerLooper = AVPlayerLooper(player: queuePlayer, templateItem: playerItem)
skVideoNode = SKVideoNode(avPlayer: queuePlayer)
skVideoNode.position = CGPoint(x: frame.midX, y: view.frame.height - 672 * hR)
skVideoNode.size = CGSize(width: view.frame.width , height: 294 * hR)
addChild(skVideoNode)
skVideoNode.play()

Put buttons over PlayerLayer - Swift - Programmatically

I have a UIView which is supposed to play a video using AVPlayerLayer and AVPlayer.
I set the player in this way:
fileprivate func setUpPlayer() {
    let urlPathString = Bundle.main.path(forResource: "dance", ofType: "mp4")
    if let videoURL = urlPathString {
        let url = URL(fileURLWithPath: videoURL)
        player = AVPlayer(url: url)
        playerLayer = AVPlayerLayer(player: player)
        playerLayer.videoGravity = .resizeAspectFill
        self.playerView.layer.addSublayer(playerLayer)
        self.playerView.layer.masksToBounds = true
    }
}
override func layoutSubviews() {
    super.layoutSubviews() // was missing in the original
    self.layer.cornerRadius = 15
    playerLayer.frame = self.bounds
    self.setupShadow(opacity: 0.6, radius: 6, offset: CGSize.init(width: 0, height: 0), color: .darkGray)
}
The problem is that no matter how many subviews I place over the playerView, the video goes over everything, hiding all the subviews I added before.
Do you know how I can put buttons or other UIViews over the AVPlayerLayer without it hiding them?
Change
self.playerView.layer.addSublayer(playerLayer)
to
self.playerView.layer.insertSublayer(playerLayer, at: 0)
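The ordering logic behind this can be seen with plain layers; a minimal sketch (the button layer here is a stand-in for a real subview's backing layer, not part of the original code):

```swift
import AVFoundation
import QuartzCore

// Sketch: sublayers render back-to-front, so inserting the player layer
// at index 0 keeps every later layer (buttons, labels, ...) on top of it.
let container = CALayer()
let buttonLayer = CALayer() // stands in for a button's backing layer
container.addSublayer(buttonLayer)

let playerLayer = AVPlayerLayer()
container.insertSublayer(playerLayer, at: 0)
// playerLayer now draws behind buttonLayer.
```

Subviews added with addSubview work the same way: their backing layers sit above anything inserted at index 0.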

Can't show animated CALayer in video using AVVideoCompositionCoreAnimationTool

UPDATE 6:
I've managed to fix my issue completely, but I would still like a better explanation of why it didn't work, in case my guess about the reason is incorrect.
I've been trying to animate a sprite sheet over a video, but every time I export the video the end result is the sample video I started with.
Here's my code:
First up my custom CALayer to handle my own sprite sheets
class SpriteLayer: CALayer {
    var frameIndex: Int

    override init() {
        // Using 0 as a default state
        self.frameIndex = 0
        super.init()
    }

    required init?(coder aDecoder: NSCoder) {
        self.frameIndex = 0
        super.init(coder: aDecoder)
    }

    override func display() {
        let currentFrameIndex = self.frameIndex
        if currentFrameIndex == 0 {
            return
        }
        let frameSize = self.contentsRect.size
        self.contentsRect = CGRect(x: 0, y: CGFloat(currentFrameIndex - 1) * frameSize.height, width: frameSize.width, height: frameSize.height)
    }

    override func action(forKey event: String) -> CAAction? {
        if event == "contentsRect" {
            return nil
        }
        return super.action(forKey: event)
    }

    override class func needsDisplay(forKey key: String) -> Bool {
        return key == "frameIndex"
    }
}
Gif is a basic class with nothing fancy and works just fine. gif.strip is a UIImage of a vertical sprite sheet representing the gif.
Now comes the method that should export a new video (it is part of a larger class used for exporting).
func convertAndExport(to url: URL, completion: @escaping () -> Void) {
    // Get initial info and make sure our destination is available
    self.outputURL = url
    let stripCgImage = self.gif.strip!.cgImage!
    // This is used to time how long the export took
    let start = DispatchTime.now()
    do {
        try FileManager.default.removeItem(at: outputURL)
    } catch {
        print("Remove Error: \(error.localizedDescription)")
        print(error)
    }
    // Find and load "sample.mp4" as an AVAsset
    let videoPath = Bundle.main.path(forResource: "sample", ofType: "mp4")!
    let videoUrl = URL(fileURLWithPath: videoPath)
    let videoAsset = AVAsset(url: videoUrl)
    // Start a new mutable composition with the same base video track
    let mixComposition = AVMutableComposition()
    let compositionVideoTrack = mixComposition.addMutableTrack(withMediaType: .video, preferredTrackID: kCMPersistentTrackID_Invalid)!
    let clipVideoTrack = videoAsset.tracks(withMediaType: .video).first!
    do {
        try compositionVideoTrack.insertTimeRange(CMTimeRangeMake(kCMTimeZero, videoAsset.duration), of: clipVideoTrack, at: kCMTimeZero)
    } catch {
        print("Insert Error: \(error.localizedDescription)")
        print(error)
        return
    }
    compositionVideoTrack.preferredTransform = clipVideoTrack.preferredTransform
    // Quick access to the video size
    let videoSize = clipVideoTrack.naturalSize
    // Set up the CALayer and its animation
    let aLayer = SpriteLayer()
    aLayer.contents = stripCgImage
    aLayer.frame = CGRect(x: 0, y: 0, width: videoSize.width, height: videoSize.height)
    aLayer.opacity = 1.0
    aLayer.masksToBounds = true
    aLayer.bounds = CGRect(x: 0, y: 0, width: videoSize.width, height: videoSize.height)
    aLayer.contentsRect = CGRect(x: 0, y: 0, width: 1, height: 1.0 / 3.0)
    let spriteAnimation = CABasicAnimation(keyPath: "frameIndex")
    spriteAnimation.fromValue = 1
    spriteAnimation.toValue = 4
    spriteAnimation.duration = 2.25
    spriteAnimation.repeatCount = .infinity
    spriteAnimation.autoreverses = false
    spriteAnimation.beginTime = AVCoreAnimationBeginTimeAtZero
    aLayer.add(spriteAnimation, forKey: nil)
    // Set up layers for AVVideoCompositionCoreAnimationTool
    let parentLayer = CALayer()
    let videoLayer = CALayer()
    parentLayer.frame = CGRect(x: 0, y: 0, width: videoSize.width, height: videoSize.height)
    videoLayer.frame = CGRect(x: 0, y: 0, width: videoSize.width, height: videoSize.height)
    parentLayer.addSublayer(videoLayer)
    parentLayer.addSublayer(aLayer)
    // Create the mutable video composition
    let videoComp = AVMutableVideoComposition()
    videoComp.renderSize = videoSize
    videoComp.frameDuration = CMTimeMake(1, 30)
    videoComp.animationTool = AVVideoCompositionCoreAnimationTool(postProcessingAsVideoLayer: videoLayer, in: parentLayer)
    // Set the video composition to apply to the composition's video track
    let instruction = AVMutableVideoCompositionInstruction()
    instruction.timeRange = CMTimeRangeMake(kCMTimeZero, mixComposition.duration)
    let videoTrack = mixComposition.tracks(withMediaType: .video).first!
    let layerInstruction = AVMutableVideoCompositionLayerInstruction(assetTrack: videoTrack)
    instruction.layerInstructions = [layerInstruction]
    videoComp.instructions = [instruction]
    // Initialize the export session
    let assetExport = AVAssetExportSession(asset: mixComposition, presetName: AVAssetExportPresetPassthrough)!
    assetExport.videoComposition = videoComp
    assetExport.outputFileType = AVFileType.mp4
    assetExport.outputURL = self.outputURL
    assetExport.shouldOptimizeForNetworkUse = true
    // Export
    assetExport.exportAsynchronously {
        let status = assetExport.status
        switch status {
        case .failed:
            print("Export Failed")
            print("Export Error: \(assetExport.error!.localizedDescription)")
            print(assetExport.error!)
        case .unknown:
            print("Export Unknown")
        case .exporting:
            print("Export Exporting")
        case .waiting:
            print("Export Waiting")
        case .cancelled:
            print("Export Cancelled")
        case .completed:
            let end = DispatchTime.now()
            let nanoTime = end.uptimeNanoseconds - start.uptimeNanoseconds
            let timeInterval = Double(nanoTime) / 1_000_000_000
            // Function is now over, we can print how long it took
            print("Time to generate video: \(timeInterval) seconds")
            completion()
        }
    }
}
EDIT:
I based my code on the following links
SpriteLayer and how to use it
CABasicAnimation on a video
Using AVVideoCompositionCoreAnimationTool and AVAssetExportSession to save the new video
UPDATE 1:
I've tried removing the CABasicAnimation part of my code and playing around with my CALayer, but to no avail: I can't even get the image to show up.
To test things out I tried animating this sprite sheet using a CAKeyframeAnimation on contentsRect in an Xcode Playground, and it worked fine, so I don't think the issue is with the CABasicAnimation, and maybe not even with the CALayer itself. I could really use some help on this because I don't understand why I can't even get an image to show over my sample video on export.
UPDATE 2:
In response to matt's comment, I've tried forgetting about the sprite sheet for a bit and changed it into a CATextLayer, but I'm still not seeing anything on my video (it has dark images, so white text should be perfectly visible):
let aLayer = CATextLayer()
aLayer.string = "This is a test"
aLayer.fontSize = videoSize.height / 6
aLayer.alignmentMode = kCAAlignmentCenter
aLayer.foregroundColor = UIColor.white.cgColor
aLayer.bounds = CGRect(x: 0, y: 0, width: videoSize.width, height: videoSize.height / 6)
UPDATE 3:
As per Matt's request I tried changing parentLayer.addSublayer(aLayer) to videoLayer.addSublayer(aLayer), but nothing changed. I expected as much, because the documentation for AVVideoCompositionCoreAnimationTool reads:
convenience init(postProcessingAsVideoLayer videoLayer: CALayer,
                 in animationLayer: CALayer)
meaning my parentLayer is its animationLayer, which probably means any animations should be done in that layer.
UPDATE 4:
I'm starting to go crazy over here. For now I've given up on the idea of showing text or an animated image; I just want to affect my video in any way possible, so I changed aLayer to this:
let aLayer = CALayer()
aLayer.frame = CGRect(x: 0, y: 0, width: videoSize.width, height: videoSize.height)
aLayer.backgroundColor = UIColor.white.cgColor
Well, this does absolutely nothing: I still get my sample video at my outputUrl. (I started testing this in a playground with the following code, if you want to "play" along.)
import PlaygroundSupport
import UIKit
import Foundation
import AVFoundation
func convertAndExport(to url: URL, completion: @escaping () -> Void) {
    let start = DispatchTime.now()
    do {
        try FileManager.default.removeItem(at: url)
    } catch {
        print("Remove Error: \(error.localizedDescription)")
        print(error)
    }
    let videoPath = Bundle.main.path(forResource: "sample", ofType: "mp4")!
    let videoUrl = URL(fileURLWithPath: videoPath)
    let videoAsset = AVURLAsset(url: videoUrl)
    let mixComposition = AVMutableComposition()
    let compositionVideoTrack = mixComposition.addMutableTrack(withMediaType: .video, preferredTrackID: kCMPersistentTrackID_Invalid)!
    let clipVideoTrack = videoAsset.tracks(withMediaType: .video).first!
    do {
        try compositionVideoTrack.insertTimeRange(CMTimeRangeMake(kCMTimeZero, videoAsset.duration), of: clipVideoTrack, at: kCMTimeZero)
    } catch {
        print("Insert Error: \(error.localizedDescription)")
        print(error)
        return
    }
    compositionVideoTrack.preferredTransform = clipVideoTrack.preferredTransform
    let videoSize = clipVideoTrack.naturalSize
    print("Video Size Detected: \(videoSize.width) x \(videoSize.height)")
    let aLayer = CALayer()
    aLayer.frame = CGRect(x: 0, y: 0, width: videoSize.width, height: videoSize.height)
    aLayer.backgroundColor = UIColor.white.cgColor
    let parentLayer = CALayer()
    let videoLayer = CALayer()
    parentLayer.frame = CGRect(x: 0, y: 0, width: videoSize.width, height: videoSize.height)
    videoLayer.frame = CGRect(x: 0, y: 0, width: videoSize.width, height: videoSize.height)
    parentLayer.addSublayer(videoLayer)
    parentLayer.addSublayer(aLayer)
    aLayer.setNeedsDisplay()
    let videoComp = AVMutableVideoComposition()
    videoComp.renderSize = videoSize
    videoComp.frameDuration = CMTimeMake(1, 30)
    videoComp.animationTool = AVVideoCompositionCoreAnimationTool(postProcessingAsVideoLayer: videoLayer, in: parentLayer)
    let instruction = AVMutableVideoCompositionInstruction()
    instruction.timeRange = CMTimeRangeMake(kCMTimeZero, mixComposition.duration)
    let videoTrack = mixComposition.tracks(withMediaType: .video).first!
    let layerInstruction = AVMutableVideoCompositionLayerInstruction(assetTrack: videoTrack)
    instruction.layerInstructions = [layerInstruction]
    videoComp.instructions = [instruction]
    let assetExport = AVAssetExportSession(asset: mixComposition, presetName: AVAssetExportPresetPassthrough)!
    assetExport.videoComposition = videoComp
    assetExport.outputFileType = AVFileType.mp4
    assetExport.outputURL = url
    assetExport.shouldOptimizeForNetworkUse = true
    assetExport.exportAsynchronously {
        let status = assetExport.status
        switch status {
        case .failed:
            print("Export Failed")
            print("Export Error: \(assetExport.error!.localizedDescription)")
            print(assetExport.error!)
        case .unknown:
            print("Export Unknown")
        case .exporting:
            print("Export Exporting")
        case .waiting:
            print("Export Waiting")
        case .cancelled:
            print("Export Cancelled")
        case .completed:
            let end = DispatchTime.now()
            let nanoTime = end.uptimeNanoseconds - start.uptimeNanoseconds
            let timeInterval = Double(nanoTime) / 1_000_000_000
            print("Time to generate video: \(timeInterval) seconds")
            completion()
        }
    }
}
let outputUrl = FileManager.default.temporaryDirectory.appendingPathComponent("test.mp4")
convertAndExport(to: outputUrl) {
print(outputUrl)
}
Please someone help me understand what I'm doing wrong...
UPDATE 5:
I am running everything except the playground tests on an iPad Air 2 (so no simulator), because I use the camera to take pictures and then stitch them into a sprite sheet, which I then planned to animate over a video I would send by email. I started doing playground testing because every test on the iPad required me to go through the whole app cycle (countdown, photos, form, email sending/receiving).
OK, I finally got it to work the way I always wanted.
First off, even though he deleted his comments, thanks to Matt for the link to a working example that helped me piece together what was wrong with my code.
First off
let assetExport = AVAssetExportSession(asset: mixComposition, presetName: AVAssetExportPresetPassthrough)!
I needed to use AVAssetExportPresetHighestQuality instead of AVAssetExportPresetPassthrough. My guess is that the passthrough preset means you don't do any re-encoding, so setting it to highest (not medium, because my exported video is larger than 400x400) made it so that my video was actually re-encoded. I'm guessing this is what was stopping the exported video from containing any of the CALayers I was trying out (even covering the video in white).
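In other words, the fix was a one-line preset change. A hedged sketch of the corrected export setup (the helper function below is my own wrapper for illustration, not part of the original code):

```swift
import AVFoundation

// Sketch: AVAssetExportPresetPassthrough copies the source samples without
// re-encoding, so the videoComposition (and its Core Animation overlays) is
// never rendered. A re-encoding preset such as AVAssetExportPresetHighestQuality
// pushes every frame through the compositor, which is what burns the CALayers in.
func makeExportSession(for composition: AVMutableComposition,
                       videoComposition: AVMutableVideoComposition,
                       outputURL: URL) -> AVAssetExportSession? {
    guard let session = AVAssetExportSession(asset: composition,
                                             presetName: AVAssetExportPresetHighestQuality) else {
        return nil
    }
    session.videoComposition = videoComposition
    session.outputFileType = .mp4
    session.outputURL = outputURL
    session.shouldOptimizeForNetworkUse = true
    return session
}
```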
Secondly (not sure if this really matters, but I'll test it later):
parentLayer.addSublayer(aLayer)
I replaced this with:
videoLayer.addSublayer(aLayer)
I'm not sure if this really mattered, but my understanding was that videoLayer is actually the animation layer for AVVideoCompositionCoreAnimationTool, and parentLayer is just a container not meant to hold more than that, though I'm likely wrong.
The third change I made:
let spriteAnimation = CABasicAnimation(keyPath: "frameIndex")
spriteAnimation.fromValue = 1
spriteAnimation.toValue = 4
spriteAnimation.duration = 2.25
spriteAnimation.repeatCount = .infinity
spriteAnimation.autoreverses = false
spriteAnimation.beginTime = AVCoreAnimationBeginTimeAtZero
aLayer.add(spriteAnimation, forKey: nil)
I changed it to this:
let animation = CAKeyframeAnimation(keyPath: #keyPath(CALayer.contentsRect))
animation.duration = 2.25
animation.calculationMode = kCAAnimationDiscrete
animation.repeatCount = .infinity
animation.values = [
    CGRect(x: 0, y: 0, width: 1, height: 1 / 3.0),
    CGRect(x: 0, y: 1 / 3.0, width: 1, height: 1 / 3.0),
    CGRect(x: 0, y: 2 / 3.0, width: 1, height: 1 / 3.0)
] as [CGRect]
animation.beginTime = AVCoreAnimationBeginTimeAtZero
animation.fillMode = kCAFillModeBackwards
animation.isRemovedOnCompletion = false
aLayer.add(animation, forKey: nil)
This change was mainly about removing my custom animation for the sprite sheet (since it will always be the same, I first wanted a working example; then I'll generalise it and probably add it to my private UI pod). Most importantly, though: animation.isRemovedOnCompletion = false. I noticed that removing this makes the animation simply not play on the exported video. So for anyone whose CABasicAnimation is not animating on the video after an export, check whether isRemovedOnCompletion is set correctly on your animation.
I think that's pretty much all the changes I made.
Although I technically answered my question, my bounty remains for understanding how AVVideoCompositionCoreAnimationTool and AVAssetExportSession work, and why I had to make the changes I did to finally get it to work, if anyone is interested in explaining.
Thanks again to Matt, you helped me out by showing me how you did it.

Layout coordinates for a video-texture object in SceneKit with Swift

I'm working on creating an SCNNode instance that consists of an SCNPlane-type geometry, with a video playing on the surface of the node. This is what I have so far:
let plane = SCNPlane(width: 5, height: 5)
node = SCNNode(geometry: plane)
GlobalScene.shared.rootNode.addChildNode(node!)
let urlStr = Bundle.main.path(forResource: "test", ofType: "mov")
videoURL = NSURL(fileURLWithPath: urlStr!)
player = AVPlayer(playerItem: AVPlayerItem(url: videoURL! as URL))
let videoNode = SKVideoNode(avPlayer: player!)
player?.actionAtItemEnd = .none
let spritescene = SKScene(size: CGSize(width: 122, height: 431))
videoNode.size.width = spritescene.size.width
videoNode.size.height = spritescene.size.height
spritescene.addChild(videoNode)
plane.firstMaterial?.diffuse.contents = spritescene
The overall functionality so far works great! What I'm trying to figure out is: how do I get the node's ENTIRE surface to be made up of the video? So far, I've only gotten it to appear in the single corner shown below:
EDIT: it looks like the issue is that I'm only setting the first material of the plane node (rather than all of them), which is why I'm seeing the other 3 "quadrants" as blank.
EDIT 2: that first conclusion may not be correct; if I set:
plane.firstMaterial?.diffuse.contents = NSColor.green
I get the following result:
So... why won't it work when applying the contents of a SpriteKit scene?
Using an SKScene and an SKVideoNode is not necessary. You can directly set the AVPlayer as the contents of an SCNMaterialProperty instance. This allows for better performance and avoids having to deal with scaling and positioning the SpriteKit elements.
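A hedged sketch of that suggestion, reusing the plane and video file from the question (exact video-as-material behavior can vary by OS version, so treat this as an outline rather than a guaranteed implementation):

```swift
import SceneKit
import AVFoundation

// Sketch: set the AVPlayer itself as the material's diffuse contents.
// SceneKit maps the video across the whole plane, so no SpriteKit scene,
// scaling, or positioning is required. The file path is a placeholder.
let plane = SCNPlane(width: 5, height: 5)
let node = SCNNode(geometry: plane)

let player = AVPlayer(url: URL(fileURLWithPath: "test.mov"))
plane.firstMaterial?.diffuse.contents = player
player.play()
```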
With the help of this gist, I was able to solve my issue, which ended up involving both scaling the video node properly and setting its position properly. My working code is below:
GlobalScene.shared.rootNode.addChildNode(node!)
let urlStr = Bundle.main.path(forResource: "test", ofType: "mov")
videoURL = NSURL(fileURLWithPath: urlStr!)
player = AVPlayer(playerItem: AVPlayerItem(url: videoURL! as URL))
videoSpriteKitNode = SKVideoNode(avPlayer: player!)
player?.actionAtItemEnd = .none
NotificationCenter.default.addObserver(
    self,
    selector: #selector(self.playerItemDidReachEnd),
    name: NSNotification.Name.AVPlayerItemDidPlayToEndTime,
    object: player?.currentItem)
let spriteKitScene = SKScene(size: CGSize(width: 1276.0 / 2.0, height: 712.0 / 2.0))
videoSpriteKitNode?.size.width = spriteKitScene.size.width
videoSpriteKitNode?.size.height = spriteKitScene.size.height
node?.geometry?.firstMaterial?.diffuse.contentsTransform = SCNMatrix4Translate(SCNMatrix4MakeScale(1, -1, 1), 0, 1, 0)
videoSpriteKitNode?.position = CGPoint(x: spriteKitScene.size.width / 2.0, y: spriteKitScene.size.height / 2.0)
videoSpriteKitNode?.size = spriteKitScene.size
videoSpriteKitNode?.play()
spriteKitScene.addChild(videoSpriteKitNode!)
plane?.firstMaterial?.diffuse.contents = spriteKitScene

iOS AVMutableComposition Add text overlay

Can someone please advise?
I am trying to add a text overlay (title) to a video I am composing using AVFoundation. I found a few online resources (see http://stackoverflow.com/questions/21684549/add-a-text-overlay-with-avmutablevideocomposition-to-a-specific-timerange).
However, all these resources are in Objective-C.
My project is in Swift and I cannot find any related resources in Swift.
I am not able to get the text to overlay properly; it seems distorted, as if the frame in which it gets rendered is skewed...
See picture: Distorted text in AVPlayer
I have attempted to convert the Objective-C code I found to Swift, but obviously I am missing something.
Below is the code I am using.
(I used some code for the player and the video file from www.raywenderlich.com/90488/calayer-in-ios-with-swift-10-examples)
func MergeUnWeldedVideoByUserPref(showInBounds: CGRect) -> (AVMutableComposition, AVMutableVideoComposition) {
    let fps: Int32 = 30
    // 1 - Create AVMutableComposition object. This object will hold your AVMutableCompositionTrack instances.
    let mixComposition = AVMutableComposition()
    // 2 - Create a video track for each of the video assets. Add your media data to the appropriate tracks.
    let url = NSBundle.mainBundle().URLForResource("colorfulStreak", withExtension: "m4v")!
    let avAsset = AVAsset(URL: url)
    let track = mixComposition.addMutableTrackWithMediaType(AVMediaTypeVideo, preferredTrackID: Int32(kCMPersistentTrackID_Invalid))
    let segmentInMovie = CMTimeRangeMake(kCMTimeZero, avAsset.duration)
    let videoTrack = avAsset.tracksWithMediaType(AVMediaTypeVideo)[0]
    do {
        try track.insertTimeRange(segmentInMovie, ofTrack: videoTrack, atTime: kCMTimeZero)
    } catch {
        print("Failed to load track")
    }
    let mainInstruction = AVMutableVideoCompositionInstruction()
    mainInstruction.timeRange = CMTimeRangeMake(kCMTimeZero, avAsset.duration)
    let instruction = videoCompositionInstructionForTrack(showInBounds, track: track, asset: avAsset)
    mainInstruction.layerInstructions.append(instruction)
    let mainComposition = AVMutableVideoComposition()
    mainComposition.instructions = [mainInstruction]
    mainComposition.frameDuration = CMTimeMake(1, fps)
    mainComposition.renderSize = CGSize(width: showInBounds.width, height: showInBounds.height)
    let textLayer = CATextLayer()
    textLayer.backgroundColor = UIColor.clearColor().CGColor
    textLayer.foregroundColor = UIColor.whiteColor().CGColor
    textLayer.string = "T E S T"
    textLayer.font = UIFont(name: "Arial", size: 18)
    textLayer.shadowOpacity = 0.5
    textLayer.alignmentMode = kCAAlignmentCenter
    textLayer.frame = CGRectMake(5, 5, 100, 50)
    textLayer.shouldRasterize = true
    textLayer.rasterizationScale = showInBounds.width / videoTrack.naturalSize.width
    let parentLayer = CALayer()
    let videoLayer = CALayer()
    parentLayer.frame = CGRectMake(0, 0, showInBounds.width, showInBounds.height)
    videoLayer.frame = CGRectMake(0, 0, showInBounds.width, showInBounds.height)
    parentLayer.addSublayer(videoLayer)
    parentLayer.addSublayer(textLayer)
    mainComposition.animationTool = AVVideoCompositionCoreAnimationTool(postProcessingAsVideoLayer: videoLayer, inLayer: parentLayer)
    return (mixComposition, mainComposition)
}
There is nothing wrong with your Swift interpretation; it is rather an issue with the rendering engine of the simulator. I tried your code on the simulator and it indeed looked skewed and distorted, but when compiling to the device it worked beautifully.