SKVideoNode seamless loop using AVPlayerLooper - swift

Hi, can anyone tell me why using AVPlayerLooper with an SKVideoNode causes a slight delay between loops, whereas using it with an AVPlayerLayer is seamless? I need to use SKVideoNode because of where the layer has to sit in an SKScene, but I also need the loop to be seamless. It works perfectly with AVPlayerLayer, but that layer then sits in front of the SKScene.
This is seamless:
let fileUrl = Bundle.main.url(forResource: "chicken", withExtension: "mp4")!
let asset = AVAsset(url: fileUrl)
let playerItem = AVPlayerItem(asset: asset)
queuePlayer = AVQueuePlayer(playerItem: playerItem)
playerLayer = AVPlayerLayer(player: queuePlayer)
playerLooper = AVPlayerLooper(player: queuePlayer, templateItem: playerItem)
playerLayer.frame = CGRect(x: 0, y: 525 * hR, width: view.frame.width, height: 294 * hR)
playerLayer.videoGravity = AVLayerVideoGravity.resize
playerLayer.zPosition = -1
view.layer.addSublayer(playerLayer)
queuePlayer.play()
This is not seamless:
let fileUrl = Bundle.main.url(forResource: "chicken", withExtension: "mp4")!
let asset = AVAsset(url: fileUrl)
let playerItem = AVPlayerItem(asset: asset)
queuePlayer = AVQueuePlayer(playerItem: playerItem)
playerLooper = AVPlayerLooper(player: queuePlayer, templateItem: playerItem)
skVideoNode = SKVideoNode(avPlayer: queuePlayer)
skVideoNode.position = CGPoint(x: frame.midX, y: view.frame.height - 672 * hR)
skVideoNode.size = CGSize(width: view.frame.width , height: 294 * hR)
addChild(skVideoNode)
skVideoNode.play()
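One possible workaround (a hedged sketch, not something confirmed in this question): keep the seamless AVPlayerLayer, but make the SKView transparent so the video can sit behind the scene instead of in front of it. The names `skView` and `scene` below are assumptions for the SKView presenting your SKScene and the scene itself.

```swift
// Sketch: let an AVPlayerLayer show through from behind a transparent SKView.
// Assumes skView is a subview of `view` and playerLayer is the seamless
// AVPlayerLayer set up as in the first snippet above.
skView.allowsTransparency = true   // SKView must not draw an opaque backing
skView.backgroundColor = .clear
scene.backgroundColor = .clear     // a clear scene background reveals layers behind it
view.layer.insertSublayer(playerLayer, below: skView.layer)
```

With this arrangement the AVPlayerLooper keeps driving the AVPlayerLayer (which loops seamlessly), and the SKScene content is composited on top of it.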

Related

Rotate video not working as expected

I have a function to rotate a video. The first time I rotate it by pi / 2 it works fine, but if I rotate it by pi / 2 again, the video is still only rotated by pi / 2 when I expect a rotation of pi. Can anyone help? Thanks.
func rotateVideo(sourceURL: URL, outputURL: URL) { // illustrative signature; the original snippet omitted the function header
    let rotateTransform = CGAffineTransform(rotationAngle: CGFloat((self.rotateAngle * Double.pi) / 180))
    let videoAsset = AVURLAsset(url: sourceURL)

    let mixComposition = AVMutableComposition()
    let compositionVideoTrack = mixComposition.addMutableTrack(withMediaType: .video,
                                                               preferredTrackID: kCMPersistentTrackID_Invalid)
    let clipVideoTrack = videoAsset.tracks(withMediaType: .video)
    try? compositionVideoTrack?.insertTimeRange(CMTimeRange(start: .zero, duration: videoAsset.duration),
                                                of: clipVideoTrack[0], at: .zero)
    compositionVideoTrack?.preferredTransform = videoAsset.preferredTransform

    let videoTrack = videoAsset.tracks(withMediaType: .video)[0]
    let videoSize = videoTrack.naturalSize

    let videoComposition = AVMutableVideoComposition()
    videoComposition.renderSize = videoSize
    let timeScale = CMTimeScale(videoTrack.nominalFrameRate)
    videoComposition.frameDuration = CMTime(value: 1, timescale: timeScale)

    let instruction = AVMutableVideoCompositionInstruction()
    instruction.timeRange = CMTimeRange(start: .zero, duration: mixComposition.duration)

    let mixVideoTrack = mixComposition.tracks(withMediaType: .video)[0]
    mixVideoTrack.preferredTransform = rotateTransform

    let layerInstruction = AVMutableVideoCompositionLayerInstruction(assetTrack: mixVideoTrack)
    layerInstruction.setTransform(mixVideoTrack.preferredTransform, at: .zero)
    instruction.layerInstructions = [layerInstruction]
    videoComposition.instructions = [instruction]

    guard let exportSession = AVAssetExportSession(asset: mixComposition,
                                                   presetName: AVAssetExportPresetPassthrough) else {
        return
    }
    exportSession.outputURL = outputURL
    exportSession.outputFileType = .mov
    exportSession.videoComposition = videoComposition
    exportSession.exportAsynchronously { [weak self] in
        self?.handleExportSession(exportSession: exportSession, sourceURL: sourceURL, outputURL: outputURL)
    }
}
If you set the transform, it is always applied to the original orientation. So when you set a rotation of pi/2, the original is rotated by pi/2. If you set the same rotation matrix again, the original is again rotated by pi/2, not by pi. What you need is to combine the current transform with the new rotation and then set that resulting transform.
Something like:
let currentTransform = mixVideoTrack.preferredTransform // or wherever you get it from
let newTransform = currentTransform.rotated(by: CGFloat((self.rotateAngle * Double.pi)/180))
mixVideoTrack.preferredTransform = newTransform

Put buttons over PlayerLayer - Swift - Programmatically

I have a UIView which is supposed to play a video using AVPlayerLayer and AVPlayer.
I set the player in this way:
fileprivate func setUpPlayer() {
    let urlPathString = Bundle.main.path(forResource: "dance", ofType: "mp4")
    if let videoPath = urlPathString {
        let url = URL(fileURLWithPath: videoPath)
        player = AVPlayer(url: url)
        playerLayer = AVPlayerLayer(player: player)
        playerLayer.videoGravity = .resizeAspectFill
        self.playerView.layer.addSublayer(playerLayer)
        self.playerView.layer.masksToBounds = true
    }
}

override func layoutSubviews() {
    super.layoutSubviews() // don't skip the superclass implementation
    self.layer.cornerRadius = 15
    playerLayer.frame = self.bounds
    self.setupShadow(opacity: 0.6, radius: 6, offset: CGSize(width: 0, height: 0), color: .darkGray)
}
The problem is that no matter how many subviews I add on top of the playerView, the video goes over everything, hiding all the subviews I added before.
Do you know how I can put buttons or other UIViews over an AVPlayerLayer without the video hiding them?
Change
self.playerView.layer.addSublayer(playerLayer)
to
self.playerView.layer.insertSublayer(playerLayer, at: 0)
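An alternative sketch (an assumption on my part, not part of the original answer): keep addSublayer but give the player layer a negative zPosition, since sibling CALayers are composited by zPosition.

```swift
// Hypothetical alternative: a negative zPosition keeps the video layer
// behind the backing layers of playerView's subviews.
playerLayer.zPosition = -1
self.playerView.layer.addSublayer(playerLayer)
```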

How can I set the offset of a video in Swift 4? seek(to:) doesn't seem to work

I am having an issue setting the offset of an AVQueuePlayer video. I have tried the seek(to:) function, but it doesn't seem to work; the video always starts from 0 seconds. My other requirements, playing with controls and looping, work fine.
I'm stuck at playing the video from any point other than 0 seconds.
func getVideoView() -> UIView {
    let videoViewContainer = UIView(frame: CGRect(x: 0, y: 0, width: 375, height: 375))
    let videoUrl = URL(string: "https://myvideourl.mp4")
    let item = AVPlayerItem(url: videoUrl!)
    player = AVQueuePlayer()
    playerLooper = AVPlayerLooper(player: player!, templateItem: item)

    let time = CMTime(seconds: 17.0, preferredTimescale: CMTimeScale(1))
    player?.seek(to: time, completionHandler: { _ in })
    item.forwardPlaybackEndTime = CMTimeMake(value: 20, timescale: 1) // play it for 20 seconds

    let layer = AVPlayerLayer(player: player)
    layer.frame = videoViewContainer.bounds
    layer.videoGravity = .resizeAspectFill
    videoViewContainer.layer.addSublayer(layer)
    player?.play()
    return videoViewContainer
}
I found the answer: seek(to:) doesn't work when used with AVPlayerLooper.
AVPlayerLooper itself has a timeRange parameter.
Example below:
playerLooper = AVPlayerLooper(
    player: self.player!,
    templateItem: item!,
    timeRange: CMTimeRange(start: CMTime(seconds: Double(start), preferredTimescale: 1),
                           duration: CMTime(seconds: Double(duration), preferredTimescale: 1)))
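Putting it together for the numbers in the question (start at 17 seconds, play up to the 20-second mark), a sketch might look like this. The URL is the question's placeholder, and the timescale of 600 is an illustrative choice.

```swift
// Sketch: loop only the 17s-20s window of the video via AVPlayerLooper's timeRange.
let item = AVPlayerItem(url: URL(string: "https://myvideourl.mp4")!) // placeholder URL
let player = AVQueuePlayer()
let looper = AVPlayerLooper(
    player: player,
    templateItem: item,
    timeRange: CMTimeRange(start: CMTime(seconds: 17, preferredTimescale: 600),
                           duration: CMTime(seconds: 3, preferredTimescale: 600)))
player.play()
```

Note that the looper handles both the start offset and the end point, so neither seek(to:) nor forwardPlaybackEndTime is needed.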

iOS: How to make a custom resizable SKVideoNode match a resizable ARImageAnchor SCNPlane's position and size

Currently, when I build to my phone, I see the SCNPlane appear over an image the camera detects. The SCNPlane has no trouble adjusting to the size of the detected image. What I'm having trouble with is replacing the SCNPlane with an SKVideoNode (and its SKScene) that can also auto-adjust its size to the image.
Any ideas?
Thank you!
if let imageAnchor = anchor as? ARImageAnchor {
    // Plane sized to the detected image
    let plane = SCNPlane(width: imageAnchor.referenceImage.physicalSize.width,
                         height: imageAnchor.referenceImage.physicalSize.height)
    plane.firstMaterial?.diffuse.contents = UIColor(white: 1, alpha: 0.8)
    let planeNode = SCNNode(geometry: plane)
    planeNode.eulerAngles.x = -.pi / 2

    // Container node
    guard let container = ARSceneView.scene.rootNode.childNode(withName: "Container", recursively: false) else { return }
    container.removeFromParentNode()
    container.position.y = planeNode.position.y + 0.25
    container.position.z = planeNode.position.z + 0.1
    planeNode.addChildNode(container)
    node.addChildNode(planeNode)
    container.isHidden = false

    // Video scene
    let videoURL = Bundle.main.url(forResource: "video", withExtension: "mp4")!
    let videoPlayer = AVPlayer(url: videoURL)
    let videoPlane = SKScene(size: CGSize(width: imageAnchor.referenceImage.physicalSize.width,
                                          height: imageAnchor.referenceImage.physicalSize.height))
    //videoPlane = SKScene(size: CGSize(width: 720.0, height: 1280.0))
    let videoNode = SKVideoNode(avPlayer: videoPlayer)
    videoNode.play()
    videoPlane.addChild(videoNode)
    guard let video = container.childNode(withName: "Video", recursively: true) else { return }
    video.geometry?.firstMaterial?.diffuse.contents = videoPlane
}
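One detail that often matters in SpriteKit-on-SceneKit setups like this (a hedged suggestion, not a confirmed fix for this exact code): an SKVideoNode sits at the scene's origin by default, so it usually needs to be centered in, and sized to, its hosting SKScene.

```swift
// Sketch: center the video node and make it fill the hosting SKScene.
videoNode.position = CGPoint(x: videoPlane.size.width / 2, y: videoPlane.size.height / 2)
videoNode.size = videoPlane.size
// SpriteKit's y-axis is flipped relative to SceneKit texture coordinates,
// so the video may also need a vertical flip:
videoNode.yScale = -1
```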

Layout coordinates for a video-texture object in SceneKit with Swift

I'm working on creating an SCNNode instance that consists of an SCNPlane-type geometry, with a video playing on the surface of the node. This is what I have so far:
let plane = SCNPlane(width: 5, height: 5)
node = SCNNode(geometry: plane)
GlobalScene.shared.rootNode.addChildNode(node!)

let urlStr = Bundle.main.path(forResource: "test", ofType: "mov")
videoURL = NSURL(fileURLWithPath: urlStr!)
player = AVPlayer(playerItem: AVPlayerItem(url: videoURL! as URL))
let videoNode = SKVideoNode(avPlayer: player!)
player?.actionAtItemEnd = .none

let spriteScene = SKScene(size: CGSize(width: 122, height: 431))
videoNode.size = spriteScene.size
spriteScene.addChild(videoNode)
plane.firstMaterial?.diffuse.contents = spriteScene
The overall functionality so far works great! What I'm trying to figure out is how to get the node's ENTIRE surface to be covered by the video. So far, I've only gotten it to appear in the single corner shown below:
EDIT: it looks like the issue is that I'm only setting the first material of the plane node (rather than all of them), which is why the other three "quadrants" are blank.
EDIT2: That first conclusion may not be correct. If I set:
plane.firstMaterial?.diffuse.contents = NSColor.green
I get the following result:
So...why won't that work when applying the contents of a SpriteKit scene?
Using a SKScene and a SKVideoNode is not necessary. You can directly set the AVPlayer as the contents of a SCNMaterialProperty instance. This will allow for better performance and will avoid having to deal with scaling and positioning the SpriteKit elements.
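Following that suggestion, a minimal sketch (assuming `plane` and `videoURL` as set up in the question) might be:

```swift
// Sketch: use the AVPlayer directly as the material's contents,
// skipping the intermediate SKScene/SKVideoNode entirely.
let player = AVPlayer(url: videoURL)
plane.firstMaterial?.diffuse.contents = player
player.play()
```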
With the help of this gist, I was able to solve my issue, which ended up involving both scaling the video node properly and setting its position properly. My working code is below:
GlobalScene.shared.rootNode.addChildNode(node!)

let urlStr = Bundle.main.path(forResource: "test", ofType: "mov")
videoURL = NSURL(fileURLWithPath: urlStr!)
player = AVPlayer(playerItem: AVPlayerItem(url: videoURL! as URL))
videoSpriteKitNode = SKVideoNode(avPlayer: player!)
player?.actionAtItemEnd = .none

NotificationCenter.default.addObserver(
    self,
    selector: #selector(self.playerItemDidReachEnd),
    name: NSNotification.Name.AVPlayerItemDidPlayToEndTime,
    object: player?.currentItem)

let spriteKitScene = SKScene(size: CGSize(width: 1276.0 / 2.0, height: 712.0 / 2.0))

// Flip the texture vertically; SpriteKit and SceneKit use opposite y-axes
node?.geometry?.firstMaterial?.diffuse.contentsTransform = SCNMatrix4Translate(SCNMatrix4MakeScale(1, -1, 1), 0, 1, 0)

// Center the video node in the scene and size it to fill the scene
videoSpriteKitNode?.position = CGPoint(x: spriteKitScene.size.width / 2.0, y: spriteKitScene.size.height / 2.0)
videoSpriteKitNode?.size = spriteKitScene.size
videoSpriteKitNode?.play()
spriteKitScene.addChild(videoSpriteKitNode!)
plane?.firstMaterial?.diffuse.contents = spriteKitScene