While testing this app in the Xcode Simulator, I am unable to see the video in an SKVideoNode, but the sound is audible.
The same issue occurs for both .mp4 and .mov files.
func showVideo() {
    let strVideoFile = "Beach.mov"
    let spriteVideo = SKVideoNode(fileNamed: strVideoFile)
    spriteVideo.position = CGPoint(x: 0, y: 0)
    spriteVideo.setScale(0.5)
    spriteVideo.zPosition = 10
    spriteVideo.alpha = 1.0
    self.addChild(spriteVideo)
    spriteVideo.play()
    print("Playing \(strVideoFile)")
}
No error messages are displayed, and I know the node is added because I can see the node count increase.
I'm getting unexpected results when exporting the contents of a SceneKit scene to a Collada (.dae) file. Here's what I have so far.
I created a simple scene with 5 spheres along the x-axis:
var x: CGFloat = 0
for i in 0...4 {
    let sphere = SCNNode(geometry: SCNSphere(radius: 1))
    sphere.name = "sphere\(i+1)"
    sphere.position = SCNVector3(x: x, y: 0, z: 0)
    exportScene.rootNode.addChildNode(sphere)
    x += 2
}
and exported the contents with
let url = URL(fileURLWithPath: pathName)
exportScene.write(to: url, options: nil, delegate: nil) { totalProgress, error, stop in
    print("Export progress: \(totalProgress * 100.0)%")
}
When I load the .dae file into a 3D program (Cheetah 3D), I expect to see 5 identical spheres along the x-axis, but instead only a single sphere appears. I had similar issues exporting to an .obj file.
The answer to the following question says "Keep in mind that DAE doesn't handle all features of SceneKit, though", but it doesn't go into the limitations of the file format.
Easiest method to export a SceneKit scene as a Collada .dae file?
Does anyone know how to correctly export the contents of a SceneKit scene?
macOS app
Looks like the beginning of SceneKit's sunset.
Neither the .dae nor the .obj format is generated properly by a SceneKit macOS app. Moreover, the .usdz format is not exported at all.
iOS app
In an iOS app, only the .usdz format is exported correctly (it keeps all the nodes' transforms and names from the SCN scene), and the resulting .usdz can be opened in Maya. The exported .dae and .obj files, however, contain only one sphere instead of five.
If you have problems with USDZ textures in Maya, please read this post.
Note that the .usdz is not exported correctly when the nodes are created in a for-in loop, which is why the spheres below are created one by one.
import UIKit
import SceneKit

class ViewController: UIViewController {

    override func viewDidLoad() {
        super.viewDidLoad()
        let sceneView = self.view as! SCNView
        sceneView.backgroundColor = .black
        sceneView.scene = SCNScene()

        // Destination for the export (Simulator only; see the note below)
        let url = URL(string: "file:///Users/swift/Desktop/model.usdz")!

        // Create the spheres one by one instead of in a for-in loop
        let sphere1 = SCNNode(geometry: SCNSphere(radius: 0.5))
        sphere1.position = SCNVector3(x: -2, y: 0, z: 0)
        sceneView.scene!.rootNode.addChildNode(sphere1)

        // ...sphere2, sphere3, sphere4...

        let sphere5 = SCNNode(geometry: SCNSphere(radius: 0.5))
        sphere5.position = SCNVector3(x: 2, y: 0, z: 0)
        sceneView.scene!.rootNode.addChildNode(sphere5)

        // Give the scene a moment to render before exporting
        DispatchQueue.main.asyncAfter(deadline: .now() + 1) {
            sceneView.scene?.write(to: url, delegate: nil) { (prgs, _, _) in
                print("Export progress: \(prgs * 100.0)%")
            }
        }
    }
}
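A side note on the hardcoded desktop URL: that path only resolves when running in the Simulator. On a real device you would build the destination inside the app's sandbox instead, for example (the file name model.usdz is just the one used above):

// Write into the app's Documents directory on a real device; a hardcoded
// desktop path like the one above only works in the Simulator.
let documentsURL = FileManager.default.urls(for: .documentDirectory,
                                            in: .userDomainMask)[0]
let exportURL = documentsURL.appendingPathComponent("model.usdz")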
P.S. Tested on macOS 13.0 Ventura, Xcode 14.1, and the iOS 16.1 Simulator.
I am adding 60 fps video files to an AVMutableComposition, and I am wondering if it's possible to play that composition back at half speed.
This is where it's added to the video track:
do {
    try videoTrack?.insertTimeRange(CMTimeRangeMake(start: start, duration: duration),
                                    of: asset.tracks(withMediaType: AVMediaType.video)[0],
                                    at: lastTime)
} catch {
    print("Failed to insert video track")
}
And this is the part that plays it back in a window:
let videoPlayer = AVPlayer(playerItem: playerItem)
playerLayer.player = videoPlayer
videoPlayer.play()
Any pointers on how the playback could be slowed down to 30 fps?
videoPlayer.play()
videoPlayer.rate = 0.5
This seems to do the trick, but I don't know of a way to check whether it's playing the 60 fps video at 30 fps or the default frame rate at half speed.
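One way to check is to compare the source track's nominal frame rate with what the player item actually renders. A rough sketch, assuming asset and videoPlayer are the objects from the snippets above:

// The encoded frame rate of the source track (should print 60 here).
if let videoTrack = asset.tracks(withMediaType: .video).first {
    print("Nominal frame rate: \(videoTrack.nominalFrameRate) fps")
}

videoPlayer.rate = 0.5

// While playing, AVPlayerItemTrack reports the rate at which frames are
// actually being rendered, so this should read about 30 after rate = 0.5.
if let itemTrack = videoPlayer.currentItem?.tracks.first(where: { $0.assetTrack?.mediaType == .video }) {
    print("Current video frame rate: \(itemTrack.currentVideoFrameRate) fps")
}

Since rate scales the playback clock rather than dropping or re-timing frames, a 60 fps track at rate 0.5 is shown at an effective 30 fps, with every source frame on screen twice as long.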
I'm trying to build a custom progress bar for a video player app in tvOS, and would like to show thumbnails of the video while the user scans the video.
I'm using AVPlayer and FairPlay HLS to play remote video files. I've tried to do this using two methods: one with AVAssetImageGenerator's copyCGImage, and the other with AVPlayerItemVideoOutput's copyPixelBuffer method. Both return nil.
When I tried with a local video file, the first method worked.
Method 1:
let imageGenerator = AVAssetImageGenerator(asset: playerItem.asset)
let progressSeconds = playerItem.duration.seconds * Double(progress)
let time = CMTime(seconds: progressSeconds, preferredTimescale: 5)
if let imageRef = try? imageGenerator.copyCGImage(at: time, actualTime: nil) {
    image = UIImage(cgImage: imageRef)
}
Method 2:
let videoThumbnailsOutput = AVPlayerItemVideoOutput(pixelBufferAttributes: [String(kCVPixelBufferPixelFormatTypeKey): NSNumber(value: kCVPixelFormatType_32BGRA)])
player?.currentItem?.add(videoThumbnailsOutput)
if let pixelBuffer = videoThumbnailsOutput.copyPixelBuffer(forItemTime: time, itemTimeForDisplay: nil) {
    let ciImage = CIImage(cvPixelBuffer: pixelBuffer)
}
Any ideas what I'm doing wrong, or is there another way to do this?
Thanks!
This is usually done by making use of the trick play stream associated with your actual stream.
https://en.wikipedia.org/wiki/Trick_mode
You can find it declared with the EXT-X-I-FRAME-STREAM-INF key in the manifest of your HLS stream. A regex might be needed to parse its value:
"#EXT-X-I-FRAME-STREAM-INF[^#]*URI=[^#]*"
Once you have the URL of the trick play stream, you can use a paused instance of AVPlayer as the thumbnail view. When the user swipes left and right, seek that player so the thumbnail shows the right frame, as sketched below.
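A rough sketch of that parsing step and usage, using a slightly tightened variant of the regex above. It assumes the master playlist text has already been downloaded and that the URI attribute holds an absolute URL (real manifests often use relative ones); the helper name and manifestText are illustrative:

import AVFoundation

// Extract the I-frame (trick play) stream URI from the master playlist text.
func trickPlayURL(from masterPlaylist: String) -> URL? {
    let pattern = "#EXT-X-I-FRAME-STREAM-INF[^#]*URI=\"([^\"]*)\""
    guard let regex = try? NSRegularExpression(pattern: pattern),
          let match = regex.firstMatch(in: masterPlaylist,
                                       range: NSRange(masterPlaylist.startIndex..., in: masterPlaylist)),
          let uriRange = Range(match.range(at: 1), in: masterPlaylist)
    else { return nil }
    return URL(string: String(masterPlaylist[uriRange]))
}

// Usage sketch: keep a paused AVPlayer on the trick play stream and seek
// it as the user scrubs.
// if let url = trickPlayURL(from: manifestText) {
//     let thumbnailPlayer = AVPlayer(url: url)
//     thumbnailPlayer.pause()
//     // on scrub:
//     thumbnailPlayer.seek(to: scrubTime, toleranceBefore: .zero, toleranceAfter: .zero)
// }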
I am using SpriteKit to play video. This is my code to create the SKVideoNode from an AVPlayer:
func CreateVideoViewWithAVPlayer(player: AVPlayer) {
    video = SKVideoNode(AVPlayer: player)
    let frameLayer = AVPlayerLayer(player: player) // note: never used
    video?.size = CGSize(width: 1024, height: 768)
    println(player.currentItem)
    video?.anchorPoint = CGPoint(x: 0, y: 0)
    video?.position = CGPoint(x: 0, y: 0)
    backgroundColor = SKColor.blackColor()
    self.addChild(video!)
}
And I create an AVPlayer and pass it to the function:
let scene01 = GameScene(size: view.bounds.size)
scene01.CreateVideoViewWithAVPlayer(player!)
But when I call player.play(), it only shows the first frame and doesn't go any further. However, if I call play() on the video node inside the CreateVideoViewWithAVPlayer function, it does play.
Also, if I use this code (with the second line commented out):
skView.presentScene(scene01)
//self.view.addSubview(skView)
and then call player.play(), it plays (I can hear the sound).
But if I use this:
skView.presentScene(scene01)
self.view.addSubview(skView)
I again see only the first frame, and it does not play any further.
Is there something I'm doing wrong? Thanks a lot.
You can't use the AVPlayer to control an SKVideoNode's play or pause, but you can monitor the node's playback state via the AVPlayer.
To play or pause an SKVideoNode, you have to use:
video.play() / video.pause() // where video is the instance of SKVideoNode
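A minimal sketch of that split, in current Swift syntax (the question's code is older Swift), assuming video and player are the node and player from the question:

// Drive playback through the SKVideoNode itself...
video.play()

// ...and use the AVPlayer only to observe state. Store the observation
// in a property so it stays alive.
let rateObservation = player.observe(\.rate, options: [.new]) { player, _ in
    print(player.rate > 0 ? "playing" : "paused")
}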
At the moment, this is how I'm playing a video on the subview of my UIViewController:
override func viewDidAppear(animated: Bool) {
    super.viewDidAppear(animated)
    let filePath = NSBundle.mainBundle().pathForResource("musicvideo", ofType: "mp4")!
    self.moviePlayerController.contentURL = NSURL.fileURLWithPath(filePath)
    self.moviePlayerController.repeatMode = .One
    self.moviePlayerController.view.frame = self.view.bounds
    self.moviePlayerController.scalingMode = .AspectFill
    self.moviePlayerController.controlStyle = .None
    self.moviePlayerController.allowsAirPlay = false
    self.view.addSubview(self.moviePlayerController.view)
    self.moviePlayerController.play()
}
I've read about ways to disable the audio by doing the following (none of which work at all). Keep in mind I'm trying to disable it to the point of not interrupting the music currently playing via the Music app, Spotify, etc.
1) MPMusicPlayerController.applicationMusicPlayer().volume = 0
   // Playing media items with the applicationMusicPlayer will restore the user's Music state after the application quits.
   // The current volume of playing music, in the range of 0.0 to 1.0.
   // This property is deprecated -- use MPVolumeView for volume control instead.
2) MPVolumeView doesn't even have a setting for the actual volume? It's a control.
3) self.moviePlayerController.useApplicationAudioSession = false
So I found this answer.
This is the Swift code I ended up going with: I used an AVPlayerLayer added to the view as a sublayer, which works perfectly.
Thanks to the OP, who managed to get hold of an Apple technician and provided the original Objective-C code.
The only problems I'm facing now are that it:
1) interrupts current music playback, whether it's from Music, Spotify, etc., and
2) stops playing the video if I close the app and open it up again.
(A sketch addressing both follows the code below.)
override func viewDidAppear(animated: Bool) {
    super.viewDidAppear(animated)

    let filePath = NSBundle.mainBundle().pathForResource("musicvideo", ofType: "mp4")!
    let asset = AVURLAsset(URL: NSURL.fileURLWithPath(filePath), options: nil)
    let audioTracks = asset.tracksWithMediaType(AVMediaTypeAudio) as! [AVAssetTrack]

    // Mute all the audio tracks
    var allAudioParams = [AVMutableAudioMixInputParameters]()
    for track in audioTracks {
        let audioInputParams = AVMutableAudioMixInputParameters()
        audioInputParams.setVolume(0.0, atTime: kCMTimeZero)
        audioInputParams.trackID = track.trackID
        allAudioParams.append(audioInputParams)
    }
    let audioZeroMix = AVMutableAudioMix()
    audioZeroMix.inputParameters = allAudioParams

    // Create a player item with the muted audio mix
    let playerItem = AVPlayerItem(asset: asset)
    playerItem.audioMix = audioZeroMix

    // Create a player for the item and start playback
    let player = AVPlayer(playerItem: playerItem)
    player.actionAtItemEnd = .None
    player.play()

    // Display the video via an AVPlayerLayer sublayer
    let layer = AVPlayerLayer(player: player)
    layer.frame = self.view.bounds
    layer.videoGravity = AVLayerVideoGravityResizeAspectFill
    self.view.layer.addSublayer(layer)
}
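The two remaining problems aren't solved by the audio mix itself. A hedged sketch of the usual approach, in current Swift syntax (the code above is older Swift): use the ambient audio session category so the app doesn't take over audio from other apps, and restart playback when the app becomes active again. It assumes the player is stored in a property rather than a local.

import AVFoundation
import UIKit

// 1) Don't interrupt Music/Spotify: the .ambient session category mixes
//    with other audio instead of taking over the shared audio session.
//    Set this before starting playback.
try? AVAudioSession.sharedInstance().setCategory(.ambient)

// 2) Playback stops on backgrounding; resume when the app becomes active
//    again. Assumes `player` is reachable from the closure (e.g. a stored
//    property), and the token is kept so the observer can be removed later.
let token = NotificationCenter.default.addObserver(
    forName: UIApplication.didBecomeActiveNotification,
    object: nil,
    queue: .main) { _ in
    player.play()
}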