How can I declare and use a *constant* array in a WGSL vertex shader? - webgpu

Background
I'm trying to render a single triangle by encoding the vertices directly in my WGSL vertex shader.
My idea was to have a global constant array, TRI_VERTICES, containing the vertices of the triangle, from which I look up the appropriate vertex coordinates using the built-in vertex_index.
let TRI_VERTICES: array<vec4<f32>, 3> = array<vec4<f32>, 3>(
    vec4<f32>(0., 0., 0., 1.0),
    vec4<f32>(0., 1., 0., 1.0),
    vec4<f32>(1., 1., 0., 1.0),
);

@vertex
fn vs_main(
    @builtin(vertex_index) in_vertex_index: u32,
) -> @builtin(position) vec4<f32> {
    return TRI_VERTICES[in_vertex_index];
}

@fragment
fn fs_main(@builtin(position) in: vec4<f32>) -> @location(0) vec4<f32> {
    return vec4<f32>(in.x, in.y, 0.1, 1.0);
}
I am running a draw call (in Rust and wgpu) using 3 vertices and 1 instance as follows:
render_pass.draw(0..3, 0..1);
Unfortunately, I get the following error:
Shader validation error:
┌─ Shader:13:9
│
13 │ return TRI_VERTICES[in_vertex_index];
│ ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ naga::Expression [3]
Entry point vs_main at Vertex is invalid
Expression [3] is invalid
The expression [1] may only be indexed by a constant
Question
The above seems easily fixed if I just change let TRI_VERTICES to var<private> TRI_VERTICES, but I'm not sure if this is the "correct" solution. What I would like to know is:
Does using var<private> mean that the array entries are mutable within vs_main?
If so, is there a way to declare TRI_VERTICES as "constant" somehow?
What is the most appropriate way to declare TRI_VERTICES?

The following compiles correctly in Tint. This may be a case of Naga needing to catch up to spec changes; you can file Naga issues at https://github.com/gfx-rs/naga.
const TRI_VERTICES = array(
    vec4(0., 0., 0., 1.),
    vec4(0., 1., 0., 1.),
    vec4(1., 1., 0., 1.),
);

@vertex
fn vs_main(
    @builtin(vertex_index) in_vertex_index: u32,
) -> @builtin(position) vec4<f32> {
    return TRI_VERTICES[in_vertex_index];
}

@fragment
fn fs_main(@builtin(position) in: vec4<f32>) -> @location(0) vec4<f32> {
    return vec4(in.x, in.y, .1, 1);
}
To answer the questions:
A var<private> is mutable in the vertex shader, but only visible to the current invocation.
The new const keyword declares a true compile-time constant.
The spec allows dropping most of the type annotations, so declaring it as above should be sufficient.

Related

How do I use the non-convenience SCNGeometrySource initializer correctly?

Being new to Swift and SceneKit (though not new to programming) and trying to get my first triangle displayed (in preparation for rendering a more complex procedurally generated geometry), I just stumbled over the following issue:
When I initialise the SCNGeometrySource as shown in the code below, the triangle is in the wrong position. When using the shorter, commented-out version to create the vertices it works as expected. My understanding from the documentation and a couple of examples I found online was that these two should be equivalent.
Which embarrassing detail am I missing? Thanks!
assert( MemoryLayout< SCNVector3 >.size == 24 )
assert( MemoryLayout< SCNVector3 >.stride == 24 )

let varray = [ SCNVector3( -3.0, -3.0, 0.0 ),
               SCNVector3(  3.0,  3.0, 0.0 ),
               SCNVector3( -3.0,  3.0, 0.0 ) ]
let vdata = Data( bytes: varray, count: varray.count * 24 )
print( "size: \(vdata.count), data: \(hexDump( vdata ) )" ) // size: 72, data: 00000000000008c000000000000008c0000000000000000000000000000008400000000000000840000000000000000000000000000008c000000000000008400000000000000000

let vertices = SCNGeometrySource(data: vdata,
                                 semantic: .vertex,
                                 vectorCount: varray.count,
                                 usesFloatComponents: true,
                                 componentsPerVector: 3,
                                 bytesPerComponent: 8,
                                 dataOffset: 0,
                                 dataStride: 24 )
// let vertices = SCNGeometrySource(vertices: varray) // This works, the line above does not
Edit: It works correctly when the input data consists of (vectors of) 32-bit floats (and all the parameters are adjusted accordingly).

Get predict in TensorFlowLite for Swift

I am running the code from this guide: https://www.tensorflow.org/lite/guide/inference#load_and_run_a_model_in_swift
Everything works, but I don't understand how to get the prediction values. I tried print(outputTensor) but got:
Tensor(name: "Identity", dataType: TensorFlowLite.Tensor.DataType.float32, shape: TensorFlowLite.Tensor.Shape(rank: 2, dimensions: [1, 3]), data: 12 bytes, quantizationParameters: nil)
Did you try this part from the example guide?
// Copy output to `Data` to process the inference results.
let outputSize = outputTensor.shape.dimensions.reduce(1, { x, y in x * y })
let outputData = UnsafeMutableBufferPointer<Float32>.allocate(capacity: outputSize)
outputTensor.data.copyBytes(to: outputData)
Then just print the outputData buffer.

How to use SceneKit vortex field to create a tornato effect

In the SceneKit WWDC 2014, they have an example of a vortex field with this effect:
The particle system looks much like a tornato, as it spins inward with a hollow center.
However, the documentation for vortex fields has no information on how to achieve this effect. Right now, I have this:
// create the particle system
let exp = SCNParticleSystem()
exp.loops = true
exp.particleMass = 5
exp.birthRate = 10000
exp.emissionDuration = 10
exp.emitterShape = SCNTorus(ringRadius: 5, pipeRadius: 1)
exp.particleLifeSpan = 15
exp.particleVelocity = 2
exp.particleColor = UIColor.white
exp.isAffectedByPhysicsFields = true
scene.addParticleSystem(exp, transform: SCNMatrix4MakeRotation(0, 0, 0, 0))
// create the field
let field = SCNPhysicsField.vortex()
field.strength = -5
field.direction = SCNVector3(x: 0, y: 1, z: 0)
let fieldNode = SCNNode()
fieldNode.physicsField = field
scene.rootNode.addChildNode(fieldNode)
This creates this effect:
Here I am looking down at the particles, which rotate clockwise with a really big outward radius. It looks nothing like a tornato effect. How can I create this effect?
You say tornato, I say tornado, let’s call the whole thing off...
The SceneKit WWDC 2014 demo/slides is a sample code project, so you can see for yourself how they made any of the effects you see therein. In this case, it looks like the “vortex” demo isn’t actually using the vortexField API, but instead the custom field API that lets you supply your own math in an evaluator block. (See the link for the code in that block.)
You might be able to get similar behavior without a custom field by combining a vortex (causes rotation only) with radial gravity (attracts inward) with linear gravity (attracts downward), or some other combination (possibly something involving electric charge). But you’d probably have to experiment with tweaking the parameters quite a bit.
If anyone is still interested in this topic, here is a Swift 5 implementation of that legendary tornado effect. The following example function will create your tornado.
func addTornadoPhysicsField() {
    // Tornado particles field example
    guard let tornadoSystem = SCNParticleSystem(named: "tornado.scnp", inDirectory: nil) else { return }

    let emitterGeometry = SCNTorus(ringRadius: 1.0, pipeRadius: 0.2)
    emitterGeometry.firstMaterial?.transparency = 0.0

    let fieldAndParticleNode = SCNNode(geometry: emitterGeometry)
    fieldAndParticleNode.position = SCNVector3(0.0, 0.0, -20.0)
    tornadoSystem.emitterShape = emitterGeometry
    fieldAndParticleNode.addParticleSystem(tornadoSystem)
    yourScene.rootNode.addChildNode(fieldAndParticleNode)

    // Tornado
    let worldOrigin = SCNVector3Make(fieldAndParticleNode.worldTransform.m41,
                                     fieldAndParticleNode.worldTransform.m42,
                                     fieldAndParticleNode.worldTransform.m43)
    let worldAxis = simd_float3(0.0, 1.0, 0.0) // e.g. the Y axis

    // Custom field (tornado)
    let customVortexField = SCNPhysicsField.customField(evaluationBlock: { position, velocity, mass, charge, time in
        let l = simd_float3(worldOrigin.x - position.x, 1.0, worldOrigin.z - position.z)
        let t = simd_cross(worldAxis, l)
        let d2: Float = l.x * l.x + l.z * l.z
        let vs: Float = 27 / sqrt(d2) // diameter: the bigger the value, the wider it becomes (Apple default = 20)
        let fy: Float = 1.0 - Float(min(1.0, position.y / 240.0)) // rotations: a higher value means more turns (Apple default = 15.0)
        return SCNVector3Make(t.x * vs + l.x * 10 * fy, 0, t.z * vs + l.z * 10 * fy)
    })
    customVortexField.halfExtent = SCNVector3Make(100, 100, 100)
    fieldAndParticleNode.physicsField = customVortexField // attach the field
}
Additional Configuration Options:
Finally, all this can result in something like this:
Note: if you would like to move your static tornado around like a real tornado, you will have to find a way to re-apply the physics field on each rendered frame. If you don't, the world origin used in the evaluation block will not move, and that will distort your tornado.
Note: you can also split the particle/field node into two different nodes that move independently of each other. Constrain the field node to the position of the particle node and play around with the influence factor (you still need to re-apply the field each frame).
For more information on Custom Fields check out here.

Correct way to call "realloc" in Swift with a Float array?

I'm trying to figure out what size to pass to "realloc" when I call it through Swift. It seems that I have to add an extra byte, but I don't understand why.
typealias Floats = UnsafeMutablePointer<Float>
let FLOAT_SIZE = sizeof( Float )

func floats_realloc( floats: Floats, qty_of_floats: Int ) -> Floats {
    let KLUDGE = 1 // Why?
    let qty_of_bytes = ( qty_of_floats * FLOAT_SIZE ) + KLUDGE
    let realloced_floats = Floats(
        realloc( floats, UInt( qty_of_bytes ) )
    )
    return realloced_floats
}
If I set KLUDGE to 0 here, this is what happens when I attempt to make room for one new member in a three member array:
In: [0.9, 0.9, 0.9]
Out: [0.0, 0.0, 0.9, 0.0]
What I would expect is:
Out: [0.9, 0.9, 0.9, 0.0]
The arrays I'm sending it are created within Swift, using
var foo_floats = Floats.alloc(QTY_OF_FLOATS)
What's wrong with my call to realloc?
I've discussed this on Apple's Developer Forum. It turns out that using Swift's alloc allots space for a Swift array, not a C array. So if you want to use UnsafeMutablePointer for a C array and "realloc" (bridged from C), you need to stick with C functions, like "malloc" (bridged from C).
By adding the following function, and using it when I initially set up my Floats array, the "realloc" bug went away:
func floats_alloc( qty_of_floats: Int ) -> Floats {
    // return Floats.alloc( qty_of_floats )
    let qty_of_bytes = ( qty_of_floats * FLOAT_SIZE )
    let alloced_floats = Floats(
        malloc( UInt( qty_of_bytes ) )
    )
    return alloced_floats
}
I've tested for a couple of weeks now, and all is well.

Points or spheres in 3D cube with Perl

Let's say I have @points[$number][$x][$y][$z][$color] and, just for debug purposes, I want them visualized in a 3D cube to better observe what I have. Typically I export them to a *.txt file and use R's 3D plotting, but maybe there is an easy way to do this in Perl?
It would be even better to have spheres with a radius.
My answer: use the OpenGL Perl bindings.
I haven't quite written an exact answer to your question, but I'm sure you can adapt this code.
I hadn't done OpenGL before, but it was a fun little evening project.
use OpenGL qw/ :all /;
use constant ESCAPE => 27;

# Global variable for our window
my $window;

my $CubeRot  = 0;
my $xCord    = 1;
my $yCord    = 1;
my $zCord    = 0;
my $rotSpeed = 0.02;

($width, $height) = (1366, 768);

@points = ( [ 30,  40,  40,  [100, 0,   0  ] ], # red
            [ 100, 100, 40,  [0,   100, 0  ] ], # green
            [ 100, 10,  60,  [0,   100, 100] ], # turquoise
            [ 200, 200, 100, [0,   0,   100] ]  # blue
          );
sub reshape {
    glViewport(0, 0, $width, $height);  # Set our viewport to the size of our window
    glMatrixMode(GL_PROJECTION);        # Switch to the projection matrix so that we can manipulate how our scene is viewed
    glLoadIdentity();                   # Reset the projection matrix to the identity matrix so that we don't get any artifacts (cleaning up)
    gluPerspective(60, $width / $height, 1.0, 100.0); # Set the field of view angle (in degrees), the aspect ratio of our window, and the near and far planes
    glMatrixMode(GL_MODELVIEW);         # Switch back to the model view matrix, so that we can start drawing shapes correctly
    glOrtho(0, $width, 0, $height, -1, 1); # Map abstract coords directly to window coords.
    glScalef(1, -1, 1);                 # Invert Y axis so increasing Y goes down.
    glTranslatef(0, -$height, 0);       # Shift origin up to upper-left corner.
}
sub keyPressed {
    # Shift the unsigned char key, and the x,y placement, off @_, in that order.
    my ($key, $x, $y) = @_;
    # If escape is pressed, kill everything.
    if ($key == ESCAPE)
    {
        # Shut down our window
        glutDestroyWindow($window);
        # Exit the program...normal termination.
        exit(0);
    }
}
sub InitGL {
    # Shift the width and height off of @_, in that order
    my ($width, $height) = @_;
    # Set the background "clearing color" to black
    glClearColor(0.0, 0.0, 0.0, 0.0);
    # Enables clearing of the depth buffer
    glClearDepth(1.0);
    glDepthFunc(GL_LESS);
    # Enables depth testing with that type
    glEnable(GL_DEPTH_TEST);
    # Enables smooth color shading
    glShadeModel(GL_SMOOTH);
    # Reset the projection matrix
    glMatrixMode(GL_PROJECTION);
    glLoadIdentity;
    # Reset the modelview matrix
    glMatrixMode(GL_MODELVIEW);
}
sub display {
    glClearColor(1.0, 0.0, 0.0, 1.0);
    glClear(GL_COLOR_BUFFER_BIT);
    glLoadIdentity;
    glTranslatef(0.0, 0.0, -5.0); # Push everything 5 units back into the scene, otherwise we won't see the primitive
    #glPushMatrix();
    #glRotatef($CubeRot, $xCord, $yCord, $zCord);
    # This is where the drawing happens; adjust glTranslatef to match your coordinates.
    # The centre is at 0,0,0.
    for my $sphere ( @points ) {
        glPushMatrix();
        glColor3b( @{ $sphere->[3] } );
        glRotatef($CubeRot, $xCord, $yCord, $zCord);
        glTranslatef($sphere->[0]/50 - 2, $sphere->[1]/50 - 2, $sphere->[2]/50 - 2);
        glutWireSphere(1.0, 24, 24); # Render the primitive
        glPopMatrix();
    }
    $CubeRot += $rotSpeed;
    glFlush; # Flush the OpenGL buffers to the window
}
# Initialize GLUT state
glutInit;
# Single-buffered display mode
glutInitDisplayMode(GLUT_SINGLE);
# The window starts at the upper left corner of the screen
glutInitWindowPosition(0, 0);
# Open the window
$window = glutCreateWindow("Press escape to quit");
# Register the function to do all our OpenGL drawing.
glutDisplayFunc(\&display);
# Go fullscreen as soon as possible.
glutFullScreen;
glutReshapeFunc(\&reshape);
# Even if there are no events, redraw our gl scene.
glutIdleFunc(\&display);
# Register the function called when the keyboard is pressed.
glutKeyboardFunc(\&keyPressed);
# Initialize our window.
InitGL($width, $height);
# Start the event processing engine.
glutMainLoop;
return 1;