Repeating a texture over a plane in SceneKit (Swift)

I have a 32x32 .png image that I want to repeat over a SCNPlane. The code I've got (See below) results in the image being stretched to fit the size of the plane, rather than repeated.
CODE:
let planeGeo = SCNPlane(width: 15, height: 15)
let imageMaterial = SCNMaterial()
imageMaterial.diffuse.contents = UIImage(named: "art.scnassets/grid.png")
planeGeo.firstMaterial = imageMaterial
let plane = SCNNode(geometry: planeGeo)
plane.geometry?.firstMaterial?.diffuse.wrapS = SCNWrapMode.repeat
plane.geometry?.firstMaterial?.diffuse.wrapT = SCNWrapMode.repeat

I fixed it: the image was effectively zoomed in. Setting imageMaterial.diffuse.contentsTransform = SCNMatrix4MakeScale(32, 32, 0) makes the image repeat.
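Putting the two pieces together, a minimal sketch of the working setup (same 15x15 plane and texture path as above; the 32x32 scale is the value from the fix, i.e. the texture tiles 32 times in each direction):
import SceneKit
import UIKit

// Minimal sketch combining the wrap modes and the contentsTransform fix.
let planeGeo = SCNPlane(width: 15, height: 15)

let imageMaterial = SCNMaterial()
imageMaterial.diffuse.contents = UIImage(named: "art.scnassets/grid.png")
// Tile the texture instead of stretching it across the plane.
imageMaterial.diffuse.wrapS = .repeat
imageMaterial.diffuse.wrapT = .repeat
// The scale controls how many times the image repeats over the plane.
imageMaterial.diffuse.contentsTransform = SCNMatrix4MakeScale(32, 32, 0)

planeGeo.firstMaterial = imageMaterial
let plane = SCNNode(geometry: planeGeo)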

I faced an identical issue when implementing plane visualisation in ARKit: I wanted to visualise the detected plane as a checkerboard pattern. I fixed it by creating a custom SCNNode subclass, PlaneNode, with a correctly configured SCNMaterial. The material uses wrapS, wrapT = .repeat and computes the scale from the size of the plane itself.
Have a look at the code below; the inline comments contain the explanation.
import ARKit
import SceneKit

class PlaneNode: SCNNode {

    init(planeAnchor: ARPlaneAnchor) {
        super.init()

        // Create the 3D plane geometry with the dimensions reported
        // by ARKit in the ARPlaneAnchor instance.
        let planeGeometry = SCNPlane(width: CGFloat(planeAnchor.extent.x),
                                     height: CGFloat(planeAnchor.extent.z))

        // Instead of just visualizing the grid as a gray plane, we will render
        // it in some Tron-style colours.
        let material = SCNMaterial()
        material.diffuse.contents = PaintCode.imageOfViewARPlane

        // The scale gives the number of times the image is repeated.
        // ARKit gives the width and height in meters; in this case we want to repeat
        // the pattern every 2 cm = 0.02 m, so we divide the width/height by 0.02
        // to find the number of repetitions. We then round so that we always get
        // a clean repeat rather than a truncated one.
        let scaleX = (Float(planeGeometry.width) / 0.02).rounded()
        let scaleY = (Float(planeGeometry.height) / 0.02).rounded()

        // Apply the scaling.
        material.diffuse.contentsTransform = SCNMatrix4MakeScale(scaleX, scaleY, 0)

        // Set repeat mode in both directions, otherwise the pattern is stretched!
        material.diffuse.wrapS = .repeat
        material.diffuse.wrapT = .repeat

        // Apply the material.
        planeGeometry.materials = [material]

        // Attach the geometry to this node.
        self.geometry = planeGeometry

        // Move the plane to the position reported by ARKit.
        position.x = planeAnchor.center.x
        position.y = 0
        position.z = planeAnchor.center.z

        // Planes in SceneKit are vertical by default, so we need to rotate
        // 90 degrees to match planes in ARKit.
        transform = SCNMatrix4MakeRotation(-Float.pi / 2.0, 1.0, 0.0, 0.0)
    }

    required init?(coder aDecoder: NSCoder) {
        fatalError("init(coder:) has not been implemented")
    }

    func update(planeAnchor: ARPlaneAnchor) {
        guard let planeGeometry = geometry as? SCNPlane else {
            fatalError("update(planeAnchor: ARPlaneAnchor) called on a node that has no SCNPlane geometry")
        }

        // Update the size...
        planeGeometry.width = CGFloat(planeAnchor.extent.x)
        planeGeometry.height = CGFloat(planeAnchor.extent.z)

        // ...and the material properties.
        let scaleX = (Float(planeGeometry.width) / 0.02).rounded()
        let scaleY = (Float(planeGeometry.height) / 0.02).rounded()
        planeGeometry.firstMaterial?.diffuse.contentsTransform = SCNMatrix4MakeScale(scaleX, scaleY, 0)

        // Move the plane to the position reported by ARKit.
        position.x = planeAnchor.center.x
        position.y = 0
        position.z = planeAnchor.center.z
    }
}
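Not part of the original answer, but a typical way to wire this node up would be through the ARSCNViewDelegate callbacks; here is a sketch, where the ViewController name is just a placeholder:
extension ViewController: ARSCNViewDelegate {
    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
        guard let planeAnchor = anchor as? ARPlaneAnchor else { return }
        // Attach a freshly configured PlaneNode to the node ARKit created for the anchor.
        node.addChildNode(PlaneNode(planeAnchor: planeAnchor))
    }

    func renderer(_ renderer: SCNSceneRenderer, didUpdate node: SCNNode, for anchor: ARAnchor) {
        guard let planeAnchor = anchor as? ARPlaneAnchor,
              let planeNode = node.childNodes.first as? PlaneNode else { return }
        // Resize and re-scale the texture whenever ARKit refines the anchor.
        planeNode.update(planeAnchor: planeAnchor)
    }
}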

To do this in the SceneKit editor, select your plane (add one if needed) in the scene and then select the "Material Inspector" tab on the top right. Then, under "Properties", where it says "Diffuse", select your texture. Now expand the diffuse section by clicking the caret to the left of "Diffuse" and go down to where it says "Scale". Here you can increase the scaling so that the texture is repeated rather than stretched. For this question, the OP would have to set the scaling to 32x32.

You can also work this out in the SceneKit editor. Suppose you have an SCNPlane in your Scene Kit scene:
Create a scene file and drag in a plane whose width is 12 inches, which is 0.3048 in meters, and select your image as the diffuse contents.
Suppose the image contains a 4x4 grid of boxes.
We want one box to show per inch, so for a 12-inch plane we need 12 boxes by 12 boxes.
To calculate the scale, first convert 0.3048 meters to inches: meters / 0.0254, which gives 12.
But the image already contains 4 boxes per side, so we also divide: 12 / 4 = 3.
Now go to the Material Inspector and change the scale value to 3.
You will see 12 boxes across the 12-inch plane.
Hope this is helpful.
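The same arithmetic expressed in code, as a rough sketch (the 0.3048 m width and the 4-box grid texture are the assumptions from the example above):
import SceneKit

// Sketch of the scale calculation: a 12-inch (0.3048 m) plane and a texture
// that already contains a 4x4 grid of boxes, aiming for one box per inch.
let planeWidthInMeters: Float = 0.3048
let metersPerInch: Float = 0.0254
let boxesPerTextureSide: Float = 4

let inches = planeWidthInMeters / metersPerInch   // 12
let scale = inches / boxesPerTextureSide          // 3

let material = SCNMaterial()
material.diffuse.wrapS = .repeat
material.diffuse.wrapT = .repeat
material.diffuse.contentsTransform = SCNMatrix4MakeScale(scale, scale, 1)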

Related

Why does my 2D image change size when displayed using ARKit?

I'm trying to display this square on a horizontal plane in ARKit, but the square is a different size and shape every time it detects a horizontal plane. How could I make it so that the 2D square I'm trying to display is always the same size and shape?
I tried using the physicalSize property, but that didn't seem to solve my problem.
func renderer(_ renderer: SCNSceneRenderer, nodeFor anchor: ARAnchor) -> SCNNode? {
    let size = planes.count
    if size > 0 {
        return nil
    }
    // Creating the SCNNode that we are going to return.
    let ARAnchorNode = SCNNode()
    // Converting the ARAnchor to an ARPlaneAnchor to get access to ARPlaneAnchor's extent and center values.
    let anchor = anchor as? ARPlaneAnchor
    // Creating the plane geometry.
    planeNode.geometry = SCNPlane(width: CGFloat((anchor?.extent.x)!), height: CGFloat((anchor?.extent.z)!))
    // Transforming the node.
    planeNode.position = SCNVector3((anchor?.center.x)!, 0, (anchor?.center.z)!)
    planeNode.geometry?.firstMaterial?.diffuse.contents = UIImage(named: "boxone")
    planeNode.eulerAngles = SCNVector3(-Float.pi / 2, 0, 0)
    sceneView.debugOptions = []
    // Adding the plane node as a child of ARAnchorNode due to mandatory ARKit conventions.
    ARAnchorNode.addChildNode(planeNode)
    // Returning ARAnchorNode (we must return a node from this function to add it to the scene).
    planes.append(planeNode)
    return ARAnchorNode
}
I want the image I'm trying to display to be the same size and shape every time I place it in the real world.
I added this line of code to make the image the same size every time I display it in the ARKit environment:
//creates a box where node will appear
planeNode.geometry = SCNBox(width: 0.5, height: 0.5, length: 0, chamferRadius: 0)
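An equivalent sketch (my own assumption, not taken from the original post) keeps using an SCNPlane but gives it a fixed size instead of the anchor's extent, so the image no longer changes with each detected plane:
// Hypothetical alternative: a fixed-size plane rather than one sized from the anchor.
let fixedPlane = SCNPlane(width: 0.5, height: 0.5)           // always 0.5 m x 0.5 m
fixedPlane.firstMaterial?.diffuse.contents = UIImage(named: "boxone")
planeNode.geometry = fixedPlane
planeNode.eulerAngles = SCNVector3(-Float.pi / 2, 0, 0)      // lie flat, as in the question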

How to get the SCNVector3 position of the camera in relation to its direction (ARKit, Swift)

I am trying to attach an object in front of the camera, but the issue is that it is always in relation to the initial camera direction. How can I adjust/get the SCNVector3 position to place the object in front, even if the direction of the camera is up or down?
This is how I do it now:
let ballShape = SCNSphere(radius: 0.03)
let ballNode = SCNNode(geometry: ballShape)
let viewPosition = sceneView.pointOfView!.position
ballNode.position = SCNVector3Make(viewPosition.x, viewPosition.y, viewPosition.z - 0.4)
sceneView.scene.rootNode.addChildNode(ballNode)
Edited to better answer the question now that it's clarified in a comment
New Answer:
You are using only the position of the camera, so if the camera is rotated, it doesn't affect the ball.
What you can do is get the transform matrix of the ball and multiply it by the transform matrix of the camera, that way the ball position will be relative to the full transformation of the camera, including rotation.
e.g.
let ballShape = SCNSphere(radius: 0.03)
let ballNode = SCNNode(geometry: ballShape)
ballNode.position = SCNVector3Make(0.0, 0.0, -0.4)
let ballMatrix = ballNode.transform
let cameraMatrix = sceneView.pointOfView!.transform
let newBallMatrix = SCNMatrix4Mult(ballMatrix, cameraMatrix)
ballNode.transform = newBallMatrix
sceneView.scene.rootNode.addChildNode(ballNode)
Or, if you only want the SCNVector3 position (to answer your question exactly; this way the ball will not rotate):
...
let newBallMatrix = SCNMatrix4Mult(ballMatrix, cameraMatrix)
let newBallPosition = SCNVector3Make(newBallMatrix.m41, newBallMatrix.m42, newBallMatrix.m43)
ballNode.position = newBallPosition
sceneView.scene.rootNode.addChildNode(ballNode)
Old Answer:
You are using only the position of the camera, so when the camera rotates, it doesn't affect the ball.
SceneKit uses a hierarchy of nodes, so when a node is a "child" of another node, it follows the position, rotation and scale of its "parent". The proper way of attaching an object to another object, in this case the camera, is to make it a "child" of the camera.
Then, when you set the position, rotation or any other aspect of the transform of the "child" node, you are setting it relative to its parent. So if you set the position to SCNVector3Make(0.0, 0.0, -0.4), it's translated -0.4 units in Z on top of its "parent" translation.
So to make what you want, it should be:
let ballShape = SCNSphere(radius: 0.03)
let ballNode = SCNNode(geometry: ballShape)
ballNode.position = SCNVector3Make(0.0, 0.0, -0.4)
let cameraNode = sceneView.pointOfView
cameraNode?.addChildNode(ballNode)
This way, when the camera rotates, the ball follows exactly its rotation, but separated -0.4 units from the camera.

Aligning ARFaceAnchor with SpriteKit overlay

I'm trying to calculate the SpriteKit overlay content position (not just overlaying visual content) over specific geometry points of an ARFaceGeometry/ARFaceAnchor.
I'm using SCNSceneRenderer.projectPoint from the calculated world coordinate, but the result is y-inverted and not aligned to the camera image:
let vertex4 = vector_float4(0, 0, 0, 1)
let modelMatrix = faceAnchor.transform
let world_vertex4 = simd_mul(modelMatrix, vertex4)
let pt3 = SCNVector3(x: Float(world_vertex4.x),
                     y: Float(world_vertex4.y),
                     z: Float(world_vertex4.z))
let sprite_pt = renderer.projectPoint(pt3)

// To visualize sprite_pt
let dot = SKSpriteNode(imageNamed: "dot")
dot.size = CGSize(width: 7, height: 7)
dot.position = CGPoint(x: CGFloat(sprite_pt.x),
                       y: CGFloat(sprite_pt.y))
overlayScene.addChild(dot)
In my experience, the screen coordinates given by ARKit's projectPoint function are directly usable when drawing to, for example, a CALayer. This means they follow iOS coordinates as described here, where the origin is in the upper left and y is inverted.
SpriteKit has its own coordinate system:
The unit coordinate system places the origin at the bottom left corner of the frame and (1,1) at the top right corner of the frame. A sprite’s anchor point defaults to (0.5,0.5), which corresponds to the center of the frame.
Finally, SKNodes are placed in an SKScene, which has its origin at the bottom left. You should ensure that your SKScene is the same size as your actual view, or else the origin may not be at the bottom left of the view and your positioning of the node from view coordinates may be incorrect. The answer to this question may help, in particular checking the AspectFit or AspectFill scaling of your view to ensure your scene is being scaled down.
The Scene's origin is in the bottom left and depending on your scene size and scaling it may be off screen. This is where 0,0 is. So every child you add will start there and work its way right and up based on position. A SKSpriteNode has its origin in the center.
So the two basic steps to convert from view coordinates and SpriteKit coordinates would be 1) inverting the y-axis so your origin is in the bottom left, and 2) ensuring that your SKScene frame matches your view frame.
I can test this out more fully in a bit and edit if there are any issues
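A small helper capturing those two steps might look like the sketch below (my own illustration, assuming the scene size already matches the view size):
import SpriteKit

// Convert an iOS/view-style point (origin top-left, y pointing down) into
// SpriteKit scene coordinates (origin bottom-left, y pointing up).
// Assumes the SKScene is the same size as the view it is presented in.
func scenePoint(fromViewPoint point: CGPoint, in scene: SKScene) -> CGPoint {
    return CGPoint(x: point.x, y: scene.size.height - point.y)
}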
Found the transformation that works using camera.projectPoint instead of renderer.projectPoint.
To scale the points correctly in the SpriteKit scene, set scaleMode = .aspectFill.
I updated https://github.com/AnsonT/ARFaceSpriteKitMapping to demo this.
guard let faceAnchor = anchor as? ARFaceAnchor,
      let camera = sceneView.session.currentFrame?.camera,
      let size = overlayScene?.size
else { return }

let modelMatrix = faceAnchor.transform
let vertices = faceAnchor.geometry.vertices

for vertex in vertices {
    let vertex4 = vector_float4(vertex.x, vertex.y, vertex.z, 1)
    let world_vertex4 = simd_mul(modelMatrix, vertex4)
    let world_vector3 = simd_float3(x: world_vertex4.x, y: world_vertex4.y, z: world_vertex4.z)
    let pt = camera.projectPoint(world_vector3, orientation: .portrait, viewportSize: size)

    let dot = SKSpriteNode(imageNamed: "dot")
    dot.size = CGSize(width: 7, height: 7)
    dot.position = CGPoint(x: CGFloat(pt.x), y: size.height - CGFloat(pt.y))
    overlayScene?.addChild(dot)
}

SKCropNode Strange Behaviour

When using SKCropNode, I wanted the image I add to the crop node to adjust each individual pixel's alpha value in accordance with the corresponding mask pixel's alpha value.
After a lot of research, I came to the conclusion that the image pixel alpha values were not going to adjust to the mask. However, after continuing with my project, I noticed that one specific crop node image's pixels were in fact fading to the mask pixel alpha values, which was great! But after reproducing this, I don't understand why it does it.
import SpriteKit

var textureArray: [SKTexture] = []
var display: SKSpriteNode!

class GameScene: SKScene {

    override func didMoveToView(view: SKView) {
        anchorPoint = CGPointMake(0.5, 0.5)
        backgroundColor = UIColor.greenColor()
        fetchTexures()

        display = SKSpriteNode()
        let image = SKSpriteNode(texture: textureArray[0])
        display.addChild(image)

        let randomCropNode = SKCropNode()
        display.addChild(randomCropNode)

        let cropNode = SKCropNode()
        cropNode.maskNode = display

        let fill = SKSpriteNode(color: UIColor.whiteColor(), size: frame.size)
        cropNode.addChild(fill)
        cropNode.zPosition = 10
        addChild(cropNode)
    }

    func fetchTexures() {
        var x: Int = 0
        while x < 1 {
            let texture: SKTexture = SKTextureAtlas(named: "texture").textureNamed("\(x)")
            textureArray.append(texture)
            x += 1
        }
    }
}
The above code gives me my desired effect; however, if you remove the lines below, the image pixel alpha values no longer adjust in accordance with the mask. The code below isn't actually used in my project, but it's the only way I can make the pixel alpha values adjust.
let randomCropNode = SKCropNode()
display.addChild(randomCropNode)
Can anybody see what is causing this behaviour, or if there a better way of getting my desired effect?
Mask:
Result:
If I remove:
let randomCropNode = SKCropNode()
display.addChild(randomCropNode)
Result:
A crop node only turns pixels fully on or off depending on the mask's alpha; it does not apply a partial fade (the exact threshold is in the documentation quoted below).
However, to apply a fade when your mask is just black (with varying alpha levels) and transparent, you can apply the mask as a regular texture instead and let alpha blending take care of the fade effect.
As for your issues with the code: are you sure your crop node is actually cropping, and not just rendering the texture? I don't know what the texture looks like, so I can't try to reproduce this.
The node supplied to the crop node must not be a child of another node; however, it may have children of its own.
When the crop node's contents are rendered, the crop node first draws its mask into a private buffer. Then, it renders its children. When rendering its children, each pixel is verified against the corresponding pixel in the mask. If the pixel in the mask has an alpha value of less than 0.05, the image pixel is masked out. Any pixel not rendered by the mask node is automatically masked out.
https://developer.apple.com/library/ios/documentation/SpriteKit/Reference/SKCropNode_Ref/#//apple_ref/occ/instp/SKCropNode/maskNode
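For reference, a bare-bones crop-node setup matching the documented behaviour quoted above might look like this sketch (texture names are placeholders):
import SpriteKit

// Minimal SKCropNode sketch: the mask acts as an on/off stencil, so mask
// pixels below the documented alpha threshold hide the content and all
// other pixels show it; there is no gradual fade.
let cropNode = SKCropNode()
cropNode.maskNode = SKSpriteNode(imageNamed: "mask")      // stencil image (placeholder name)
cropNode.addChild(SKSpriteNode(imageNamed: "photo"))      // content being cropped (placeholder name)
// ...then add cropNode to the scene, e.g. scene.addChild(cropNode)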

SKShapeNode ellipseInRect, sprite does not appear in scene

I'm trying to create an ellipse in the center of my scene:
let center = (CGRectGetMidX(view.scene.frame), CGRectGetMidY(view.scene.frame))
let size = (view.scene.frame.size.width * 0.3, view.scene.frame.size.height * 0.3)
let ellipse = SKShapeNode (ellipseInRect: CGRectMake(center.0, center.1, size.0, size.1))
ellipse.strokeColor = UIColor.blackColor()
ellipse.position = CGPointMake(center)
self.addChild(ellipse)
This was added to didMoveToView, and the node count on the view shows 1, but I do not see the path. How do I add an ellipse to my scene using the SKShapeNode ellipseInRect API?
The problem lies in ellipse.position = CGPointMake(center). For some reason, this changes the position of the ellipse relative to itself rather than relative to the view, so if you did ellipse.position = CGPoint(x: 100, y: 100) it would set the position 100 up and 100 to the right of the ellipse itself, as opposed to (100, 100) on the scene. If you comment out this line, you should be able to see your ellipse on the screen; I certainly could when I tried it. Hope that helps you position it where you want.
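If you do want to set the position explicitly, one sketch (my own suggestion, written in current Swift syntax, inside the scene's didMove(to:)) is to build the rect around the node's own origin and then place the node at the scene's centre, so the two offsets are not added together:
// Create the ellipse centred on the node's own origin, then position the node
// at the middle of the scene.
let ellipseSize = CGSize(width: frame.width * 0.3, height: frame.height * 0.3)
let ellipseRect = CGRect(x: -ellipseSize.width / 2, y: -ellipseSize.height / 2,
                         width: ellipseSize.width, height: ellipseSize.height)
let ellipse = SKShapeNode(ellipseIn: ellipseRect)
ellipse.strokeColor = .black
ellipse.position = CGPoint(x: frame.midX, y: frame.midY)
addChild(ellipse)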