I've been trying to figure out how to get an NSWindow to perform a transition animation in Swift 3. I found a few examples in Objective-C, but I haven't been able to tease out the relevant details, translate them to the newer language and SDK, and apply them to the right object. This one is pretty flipping cool, but it's ~8 years old: http://www.theregister.co.uk/2009/08/21/cocoa_graphics_framework/ -- I would imagine there's a better way to do the CGSCube effect now in macOS Sierra with Swift.
Here's what I have so far:
class ViewController: NSViewController {

    func doAnimation() {
        if let layer = view.layer {
            layer.backgroundColor = CGColor.black
            let rotateAnimation = CABasicAnimation(keyPath: "transform.rotation")
            rotateAnimation.fromValue = 0.0
            rotateAnimation.toValue = CGFloat(CGFloat.pi * 2.0)
            rotateAnimation.duration = 10.0
            layer.add(rotateAnimation, forKey: nil)
        }
    }

    override func viewWillAppear() {
        if let window = self.view.window {
            window.styleMask = NSWindowStyleMask.borderless
            window.backingType = NSBackingStoreType.buffered
            window.backgroundColor = NSColor.clear
            window.isOpaque = false
            window.hasShadow = false
        }
    }

    override func viewDidLoad() {
        super.viewDidLoad()
        doAnimation()
    }
}
This doesn't really do the trick at all: I get a white background on my window instead of a transparent one, and my view rolls across the frame instead of the window frame itself being animated.
Ideally, I would like to do something more advanced, like these 3D transitions -- https://cocoapods.org/pods/MKCubeController & https://www.appcoda.com/custom-view-controller-transitions-tutorial/ & https://github.com/andresbrun/ABCustomUINavigationController#cube -- but I'm not quite sure how to translate the examples from the iOS SDK over to the macOS SDK without UIKit. (Annoyingly, I remember pulling this off a few years back in ObjC, but the project was lost somewhere between formats / new computers.)
How can I apply a transform to the NSWindow itself while segueing between view controllers? Any tips toward adding some 3D to this effect would be appreciated. I'm hoping there's maybe a CocoaPod that gets me halfway there.
Related
By setting the color of a material on the model property of a ModelEntity, I can alter the opacity/alpha of an object. But how do you animate this? My goal is to show objects at full opacity, then have them fade to a set opacity, such as 50%.
With SCNAction.fadeOpacity on a SCNNode in SceneKit, this was particularly easy.
let fade = SCNAction.fadeOpacity(by: 0.5, duration: 0.5)
node.runAction(fade)
An Entity conforms to HasTransform, but that only allows you to animate scale, position, and orientation -- nothing to do with animating the material for something like fading it in or out. The effect exists in Reality Composer if you create a behavior for animating hide or show, but there doesn't seem to be anything analogous to HasTransform that provides functionality for animating opacity.
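For example, a transform animation is easy enough -- a minimal sketch using move(to:relativeTo:duration:), where myEntity is a placeholder for any Entity:

var transform = myEntity.transform
transform.scale = SIMD3<Float>(repeating: 0.5) // halve the size
// Animates scale/position/orientation over one second -- but nothing material-related.
myEntity.move(to: transform, relativeTo: myEntity.parent, duration: 1.0)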
I've been all around the documentation looking for something. My next idea is essentially creating a custom animation to replace this behavior, but it seems like it should be available and I'm just not finding it.
I tested it using different techniques and came to a sad conclusion: you can't animate a material's opacity in the RealityKit framework, because RealityKit materials don't support animation at runtime (for now, I hope). Let's wait for RealityKit's next major update.
Here's some code you can use for testing (the arView.alpha property just works):
import UIKit
import RealityKit

class ViewController: UIViewController {

    @IBOutlet var arView: ARView!

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        arView.alpha = 1.0
        opacityAnimation()
    }

    func opacityAnimation() {
        UIView.animate(withDuration: 5.0,
                       animations: {
                           self.arView.alpha = 0.0
        })
    }
}
And use this code snippet to make sure that the material animation doesn't work (there's no animation process, just a value assignment):
import UIKit
import RealityKit

class ViewController: UIViewController {

    @IBOutlet var arView: ARView!
    let tetheringAnchor = AnchorEntity(world: [0, 0, 0])
    var material = SimpleMaterial()
    let mesh: MeshResource = .generateSphere(radius: 0.5)
    var sphereComponent: ModelComponent? = nil

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)

        material.metallic = .float(1.0)
        material.roughness = .float(0.0)
        material.baseColor = .color(.red)

        sphereComponent = ModelComponent(mesh: mesh,
                                    materials: [material])
        tetheringAnchor.components.set(sphereComponent!)
        arView.scene.anchors.append(tetheringAnchor)

        opacityAnimation()
    }

    func opacityAnimation() {
        UIView.animate(withDuration: 5.0,
                       animations: {
                           self.material.metallic = .float(1.0)
                           self.material.roughness = .float(0.0)
                           self.material.baseColor = .color(.green)
                           self.sphereComponent = ModelComponent(mesh: self.mesh,
                                                            materials: [self.material])
                           self.tetheringAnchor.components.set(self.sphereComponent!)
                           self.arView.scene.anchors.append(self.tetheringAnchor)
        })
    }
}
As @AndyFedo says, there is currently no way to animate the opacity or alpha of an Entity.
Even changing a SimpleMaterial at run time currently results in flickering.
Having said this, I was able to animate the alpha of a SimpleMaterial's color; however, based on testing, it is in no way optimal or recommended.
But just in case you want to experiment further with this avenue, see the attached example, which assumes that you only have a single SimpleMaterial:
class CustomBox: Entity, HasModel, HasAnchoring {

    var timer: Timer?
    var baseColour: UIColor!

    // MARK: - Initialization

    /// Initializes The Box With The Desired Colour
    /// - Parameter color: UIColor
    required init(color: UIColor) {
        self.baseColour = color
        super.init()
        self.components[ModelComponent.self] = ModelComponent(mesh: .generateBox(size: [0.2, 0.2, 0.2]),
                                                              materials: [SimpleMaterial(color: baseColour, isMetallic: false)])
    }

    required init() { super.init() }

    // MARK: - Example Fading

    /// Fades The Colour Of The Entity's Current Material
    func fadeOut() {
        var alpha: CGFloat = 1.0

        timer = Timer.scheduledTimer(withTimeInterval: 0.05, repeats: true) { timer in

            // Floating-point subtraction may never land exactly on zero, so use <=
            if alpha <= 0 {
                timer.invalidate()
                return
            }

            var material = SimpleMaterial()
            alpha -= 0.01
            material.baseColor = MaterialColorParameter.color(self.baseColour.withAlphaComponent(alpha))
            material.metallic = .float(Float(alpha))
            material.roughness = .float(Float(alpha))

            DispatchQueue.main.async {
                self.model?.materials = [material]
            }
        }
    }
}
As such, just to test, you can create the box and then call the function like so:
let box = CustomBox(color: .green)
box.position = [0,0,-0.5]
arView.scene.anchors.append(box)
box.fadeOut()
Also, I would politely ask that this answer not get downvoted, as I am simply reiterating the fact that (a) it isn't possible with any current built-in methods, and (b) it can only partly be achieved, to a very limited extent, and thus not currently in a way one would see fit for production.
I don't know if it suits your use case, but you could consider a video material. As you can see in this WWDC session (at 2:45), an entity can have a complex pulsating opacity:
https://developer.apple.com/videos/play/wwdc2020/10612/
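A minimal sketch of that approach, assuming a clip named "pulse.mp4" (a placeholder) is bundled with the app -- the video's frames carry the opacity effect and drive the entity's surface:

import AVFoundation
import RealityKit

// Build a material backed by a playing video and put it on a sphere.
let url = Bundle.main.url(forResource: "pulse", withExtension: "mp4")!
let player = AVPlayer(url: url)
let videoMaterial = VideoMaterial(avPlayer: player)
let sphere = ModelEntity(mesh: .generateSphere(radius: 0.5), materials: [videoMaterial])
player.play()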
You can also create the fade-in experience in Reality Composer and trigger the .rcproject file in Xcode. I haven't tested other interactions with .rcproject, but I know it can at least load a model that fades into the scene.
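A minimal sketch of triggering it, assuming the .rcproject is named "Experience" and contains a scene called "FadeScene" (Xcode generates the typed loader from those names):

// Load the Reality Composer scene whose behavior performs the fade-in.
let fadeScene = try! Experience.loadFadeScene()
arView.scene.anchors.append(fadeScene)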
I have a car image, and I want just the rear wheels to spin when the image is pressed. Is it possible to select a portion of the image to animate? I was thinking of uploading two images, the car and a wheel, since the car does not need to move yet, but I was hoping there was a cleaner way to do this.
Any suggestions would be great, thanks!
Add the following extension (if you don't know what an extension is or how to add one, click the question hyperlink):
extension UIView {
    func rotate360Degrees(duration: CFTimeInterval = 2) {
        let rotateAnimation = CABasicAnimation(keyPath: "transform.rotation")
        rotateAnimation.fromValue = 0.0
        rotateAnimation.toValue = CGFloat(Double.pi * 2)
        rotateAnimation.isRemovedOnCompletion = false
        rotateAnimation.duration = duration
        rotateAnimation.repeatCount = Float.infinity
        self.layer.add(rotateAnimation, forKey: nil)
    }
}
Usage:

override func viewDidLoad() {
    super.viewDidLoad()
    carFirstWheelImage.rotate360Degrees()
    carSecondWheelImage.rotate360Degrees() // you can change the time interval to change speed
}
Suggestion:
You can also cycle the background through multiple images; that will give your car a running-animation look and make your story more awesome!
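A minimal sketch of that background cycling, assuming a backgroundImageView outlet and frames named "road1"..."road3" (placeholder names) in the asset catalog:

// Cycle the background frames so the stationary car appears to drive.
backgroundImageView.animationImages = [UIImage(named: "road1")!,
                                       UIImage(named: "road2")!,
                                       UIImage(named: "road3")!]
backgroundImageView.animationDuration = 0.5 // seconds per full cycle
backgroundImageView.startAnimating()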
I'm trying to modify the color of an NSView with a sliding animation, like Google Trends:
let hexColors = ["56A55B", "4F86EC", "F2BC42", "DA5040"]

@IBAction func changeColor(sender: NSButton) {
    let randomIndex = Int(arc4random_uniform(UInt32(hexColors.count)))
    NSAnimationContext.runAnimationGroup({ _ in
        //duration
        NSAnimationContext.current.duration = 5.0
        self.view.animator().layer?.backgroundColor = NSColor(hex: hexColors[randomIndex]).cgColor
    }, completionHandler: {
        print("completed")
    })
}
I tried using NSAnimationContext to set the duration of the color change, but it does not work. However, it works with the alphaValue of the view.
I'm not sure if you have gotten your answer yet, but this might get it to work:
let hexColors = ["56A55B", "4F86EC", "F2BC42", "DA5040"]

@IBAction func changeColor(sender: NSButton) {
    let randomIndex = Int(arc4random_uniform(UInt32(hexColors.count)))
    NSAnimationContext.runAnimationGroup({ context in
        //duration
        context.duration = 5.0
        // This is the key property that needs to be set
        context.allowsImplicitAnimation = true
        self.view.animator().layer?.backgroundColor = NSColor(hex: hexColors[randomIndex]).cgColor
    }, completionHandler: {
        print("completed")
    })
}
Here is what the documentation says:
/* Determine if animations are enabled or not. Using the -animator proxy will
   automatically set allowsImplicitAnimation to YES. When YES, other properties
   can implicitly animate along with the initially changed property. For
   instance, calling [[view animator] setFrame:frame] will allow subviews to
   also animate their frame positions. This is only applicable when layer
   backed on Mac OS 10.8 and later. The default value is NO.
*/
@available(macOS 10.8, *)
open var allowsImplicitAnimation: Bool
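Note that NSColor(hex:) in the snippets above is not part of AppKit; a minimal sketch of such an initializer, assuming 6-digit RGB strings like those in hexColors:

extension NSColor {
    /// Parses a 6-digit hex string such as "4F86EC" (no leading "#").
    convenience init(hex: String) {
        var rgb: UInt64 = 0
        Scanner(string: hex).scanHexInt64(&rgb)
        self.init(red: CGFloat((rgb >> 16) & 0xFF) / 255.0,
                  green: CGFloat((rgb >> 8) & 0xFF) / 255.0,
                  blue: CGFloat(rgb & 0xFF) / 255.0,
                  alpha: 1.0)
    }
}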
Recently, I decided to apply my previous knowledge of C++ and Python to learning Swift, after which I decided to see what I could do with the SceneKit framework. After hours of checking through the documentation and consulting a tutorial, I have to wonder what's going wrong with my code:
class GameViewController: UIViewController {

    var gameView: SCNView!
    var gameScene: SCNScene!
    var cameraNode: SCNNode!

    override func viewDidLoad() {
        super.viewDidLoad()
        initScene()
        initView()
        initCamera()
    }

    func initView() {
        //initialize the game view - this view holds everything else in the game!
        gameView = self.view as! SCNView
        //allow the camera to move in response to gestures - mainly for testing purposes
        gameView.allowsCameraControl = true
        //use default lighting while still practicing
        gameView.autoenablesDefaultLighting = true
    }

    func initScene() {
        //initialize the scene
        gameScene = SCNScene()
        //set the scene in the gameView object to the scene created by this function
        gameView.scene = gameScene
    }

    func initCamera() {
        //create a node that will become the camera
        cameraNode = SCNNode()
        //since a node can be any object in the scene, this needs to be set up as a camera
        cameraNode.camera = SCNCamera()
        cameraNode.position = SCNVector3(x: 0, y: 5, z: 15)
    }
}
After more checking through the documentation and making sure that I was copying directly from the tutorial, I still have no luck with this. According to a lot of the other questions I found here on Stack Overflow, it looks like it has something to do with the forced unwrapping (the exclamation points), but I'm not exactly sure why that is.
I've probably been staring the answer in the face while combing through this documentation, but I'm not quite seeing what the problem is.
Also, apologies if my comments are a bit long and/or distracting.
You have the following problems:
1) You should re-order the initializations in your viewDidLoad, like so:

initView()   // must be initialized before the scene
initScene()  // you were crashing here on setting `gameView.scene`, because gameView was nil
initCamera()
2) cameraNode is not attached to the rootNode, so add the following code at the end of initCamera:
gameScene.rootNode.addChildNode(cameraNode)
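Putting both fixes together, viewDidLoad and initCamera end up looking like this:

override func viewDidLoad() {
    super.viewDidLoad()
    initView()   // the view must exist before initScene() touches gameView.scene
    initScene()
    initCamera()
}

func initCamera() {
    cameraNode = SCNNode()
    cameraNode.camera = SCNCamera()
    cameraNode.position = SCNVector3(x: 0, y: 5, z: 15)
    gameScene.rootNode.addChildNode(cameraNode) // attach the camera to the scene
}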
I want to make a mix of virtual reality and augmented reality.
The goal is to have a stereo camera (one view for each eye).
I tried to put two ARSCNViews in a view controller, but it seems ARKit enables only one ARWorldTrackingSessionConfiguration at a time. How can I do that?
I looked into copying the graphic representation of one view and pasting it into another view, but couldn't find a way. Please help me find the solution.
I found this link; maybe it can illuminate us:
ARKit with multiple users
Here's a sample of my issue:
https://www.youtube.com/watch?v=d6LOqNnYm5s
PS: before downvoting my post, please comment why!
The following code is basically what Hal described. I previously wrote a few lines on GitHub that might help you get started. (Simple code, no barrel distortion, no adjustment for the narrow FOV -- yet.)
Essentially, we connect the same scene to the second ARSCNView (so both ARSCNViews see the same scene; there's no need to get ARWorldTrackingSessionConfiguration working with two ARSCNViews). Then we offset its pointOfView so it's positioned as the second eye.
https://github.com/hanleyweng/iOS-Stereoscopic-ARKit-Template
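The core of the idea, in Swift, is a render-loop callback that clones the left view's pointOfView and shifts it sideways by roughly one interpupillary distance -- a condensed sketch of what the template does (leftSceneView/rightSceneView stand in for the two ARSCNView outlets):

func renderer(_ renderer: SCNSceneRenderer, updateAtTime time: TimeInterval) {
    DispatchQueue.main.async {
        // Clone the tracked camera node and nudge it along its local x-axis.
        guard let pov = self.leftSceneView.pointOfView?.clone() else { return }
        let eyeOffset: Float = 0.066 // ~66 mm between the eyes
        let right = simd_act(pov.simdOrientation, SIMD3<Float>(1, 0, 0)) // camera's x-axis in world space
        pov.simdPosition += right * eyeOffset
        self.rightSceneView.pointOfView = pov
    }
}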
The ARSession documentation says that ARSession is a shared object.
Every AR experience built with ARKit requires a single ARSession object. If you use an ARSCNView or ARSKView object to easily build the visual part of your AR experience, the view object includes an ARSession instance. If you build your own renderer for AR content, you'll need to instantiate and maintain an ARSession object yourself.
So there's a clue in that last sentence. Instead of two ARSCNView instances, use SCNView and share the single ARSession between them.
I expect this is a common use case, so it's worth filing a Radar to request stereo support.
How to do it right now?
The (singleton) session has only one delegate. You need two different delegate instances, one for each view. You could solve that with an object that forwards the delegate messages to each view; solvable, but a bit of extra work, as sketched below.
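A sketch of such a forwarding object (the type and names here are mine, not an ARKit API):

// Fans ARSessionDelegate callbacks out to several child delegates.
class SessionDelegateSplitter: NSObject, ARSessionDelegate {
    private let children: [ARSessionDelegate]

    init(children: [ARSessionDelegate]) {
        self.children = children
    }

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        for child in children {
            child.session?(session, didUpdate: frame)
        }
    }
    // ...forward the remaining ARSessionDelegate methods the same way.
}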
There's also the problem of needing two slightly different camera locations, one for each eye, for stereo vision. ARKit uses one camera, placed at the iOS device's location, so you'll have to fuzz that.
Then you have to deal with the different barrel distortions for each eye.
This, for me, adds up to writing my own custom object to intercept ARKit delegate messages, convert the coordinates to what I'd see from two different cameras, and manage the two distinct SCNViews (not ARSCNViews). Or perhaps use one ARSCNView (one eye), intercept its frame updates, and pass those frames on to a SCNView (the other eye).
File the Radar, post the number, and I'll dupe it.
To accomplish this, please use the following code:
import UIKit
import SceneKit
import ARKit

class ViewController: UIViewController, ARSCNViewDelegate {

    @IBOutlet weak var sceneView: ARSCNView!
    @IBOutlet weak var sceneView2: ARSCNView!

    override func viewDidLoad() {
        super.viewDidLoad()
        sceneView.delegate = self
        sceneView.showsStatistics = true
        let scene = SCNScene(named: "art.scnassets/ship.scn")!
        sceneView.scene = scene
        sceneView.isPlaying = true

        // SceneView2 Setup
        sceneView2.scene = scene
        sceneView2.showsStatistics = sceneView.showsStatistics
        // Now sceneView2 starts receiving updates
        sceneView2.isPlaying = true
    }

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        let configuration = ARWorldTrackingConfiguration()
        sceneView.session.run(configuration)
    }

    override func viewWillDisappear(_ animated: Bool) {
        super.viewWillDisappear(animated)
        sceneView.session.pause()
    }
}
And don't forget to activate the .isPlaying instance property for both ARSCNViews.
Objective-C version of Han's GitHub code, with the scene views created programmatically and the y and z positions not updated -- all credit to Han:
- (void)setup {
    // left
    leftSceneView = [ARSCNView new];
    leftSceneView.frame = CGRectMake(0, 0, w, h / 2);
    leftSceneView.delegate = self;
    leftSceneView.autoenablesDefaultLighting = true;
    [self.view addSubview:leftSceneView];

    // right
    rightSceneView = [ARSCNView new];
    rightSceneView.frame = CGRectMake(0, h / 2, w, h / 2);
    rightSceneView.playing = true;
    rightSceneView.autoenablesDefaultLighting = true;
    [self.view addSubview:rightSceneView];

    // scene
    SCNScene *scene = [SCNScene new];
    leftSceneView.scene = scene;
    rightSceneView.scene = scene;

    // tracking
    ARWorldTrackingConfiguration *configuration = [ARWorldTrackingConfiguration new];
    configuration.planeDetection = ARPlaneDetectionHorizontal;
    [leftSceneView.session runWithConfiguration:configuration];
}
- (void)renderer:(id<SCNSceneRenderer>)renderer updateAtTime:(NSTimeInterval)time {
    dispatch_async(dispatch_get_main_queue(), ^{
        // update right eye
        SCNNode *pov = self->leftSceneView.pointOfView.clone;

        SCNQuaternion orientation = pov.orientation;
        GLKQuaternion orientationQuaternion = GLKQuaternionMake(orientation.x, orientation.y, orientation.z, orientation.w);
        GLKVector3 eyePosition = GLKVector3Make(1, 0, 0);
        GLKVector3 rotatedEyePosition = GLKQuaternionRotateVector3(orientationQuaternion, eyePosition);
        SCNVector3 rotatedEyePositionSCNV = SCNVector3Make(rotatedEyePosition.x, rotatedEyePosition.y, rotatedEyePosition.z);

        float mag = 0.066f; // approximate interpupillary distance in metres
        float rotatedX = pov.position.x + rotatedEyePositionSCNV.x * mag;
        float rotatedY = pov.position.y; // + rotatedEyePositionSCNV.y * mag;
        float rotatedZ = pov.position.z; // + rotatedEyePositionSCNV.z * mag;
        [pov setPosition:SCNVector3Make(rotatedX, rotatedY, rotatedZ)];

        self->rightSceneView.pointOfView = pov;
    });
}