Precision of onCollisionStart in Flame with Flutter

I'm attempting to write a bouncing ball game using flame in flutter. To detect collisions the onCollision and onCollisionStart methods are provided. What I had hoped is that onCollisionStart would give a precise location when two objects first hit each other. However, instead it gives a list of positions indicating where the two objects overlap after the first game-tick when this happens (i.e. onCollisionStart is called at the same time as onCollision, but is not called a second time if the same two objects are still colliding on the next tick).
This is illustrated in the attached picture. The collision points are marked with red dots. If the ball were moving downwards, then the ball would have hit the top of the rectangle and so should bounce upwards. However, if the ball were moving horizontally, then its first point of contact would have been the top left corner of the box, and the ball would bounce upwards and to the left.
If I want to work out the correct angle at which the ball should fly off, then I would need to do some clever calculations to find the point where the ball first started hitting the other object (and those calculations would depend on the precise shape of the other object). Is there some way to work out the point at which the two objects first started colliding? Thanks

What you usually need for this is the normal of the collision, but unfortunately we don't have that in the collision detection system yet.
We do have it in the raytracing system though, so what you could do is send out a ray, see how it would bounce, and then just bounce the ball in the same way.
If you don't want to use raytracing, I suggest calculating the direction of the ball, which you might already have; if you don't, you can just store the last position and subtract it from the current position.
After that you need to find the normals of the edges where the intersection points are.
Let's say the ball direction vector is v, and the two normal vectors are n1 and n2.
Calculate the dot product (this is built into the vector_math library) of the ball direction vector and each of the normal vectors:
final dot1 = v.dot(n1);
final dot2 = v.dot(n2);
Compare the results of the dot products. Assuming the edge normals point outwards, the normal facing the ball is the one that points against the ball's direction of travel, i.e. the one with a negative dot product:
If dot1 < 0, n1 is facing the ball.
If dot2 < 0, n2 is facing the ball.
(If both come out negative, as can happen on a corner hit, take the one with the more negative value.)
After that you can use v.reflect(nx) to get the direction where your ball should be going (where nx is the normal facing the ball).
Hopefully we'll have this built into Flame soon!
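As a rough sketch of that last approach (the edge normals n1 and n2 are assumed to come from your own knowledge of the other hitbox's shape, since Flame doesn't hand them to you):
import 'package:vector_math/vector_math_64.dart'; // Flame's Vector2 comes from here

/// Picks whichever of the two candidate edges the ball actually hit and
/// reflects the ball's direction of travel off it.
/// [velocity] is the current direction of travel; [n1] and [n2] are the
/// outward normals of the two edges touching the intersection points.
Vector2 bounce(Vector2 velocity, Vector2 n1, Vector2 n2) {
  // The edge being hit is the one whose outward normal is most opposed
  // to the direction of travel (most negative dot product).
  final normal = velocity.dot(n1) < velocity.dot(n2) ? n1 : n2;
  // reflect() mirrors the vector about the normal in place, so work on a copy.
  return velocity.clone()..reflect(normal);
}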

Related

How to smoothly move a node in an ARKit scene view based on device motion?

Swift beginner struggling with moving a scene node in ARKit in response to device motion.
What I want to achieve is: first detect the floor plane, then place a sphere on the floor. From that point onwards, depending on the movement of the device, I want to move the sphere along its x and z axes to move it around the floor of the room. (Once created, the sphere needs to stay in the center of the device screen, locked to that view.)
So far I can detect the floor and place a node no problem. I can use device motion to obtain the device attitude (pitch, roll and yaw), but how do I translate these values into meaningful x, y, z positions that I can update my node with?
Are there any formulas or methods that are used to calculate such information, or is this the wrong approach? I would appreciate a link to some info or an explanation of how to go about this. I am also unsure how to ensure the node stays at the center of the device screen.
So, as far as I understand, you want the following workflow:
Step 1. You create a sphere on a plane (which is already done)
Step 2. Move the sphere with respect to the camera's horizontal plane (i.e. along its x and z axes to move it around the floor of the room, depending on the movement of the device)
Assuming that Step 1 is done, here is what you can do:
Get the position of the camera and the sphere
This should first be done within the function that is invoked after the sphere is created (be it a tap gesture recognizer handler, touchesBegan(), etc.).
You can do it by reading the position property of the sphere's SCNNode; for the camera's position and/or orientation, read sceneView.session.currentFrame's .camera.transform, which contains all the necessary parameters about the current pose of the camera.
Move the sphere as the camera moves
Having the sphere's position in the scene and the transformation matrix of the camera, you can work out the offset between them. Here you can find a good explanation of how exactly to do it.
After you have those, implement the appropriate logic within renderer(_:didUpdate:for:) to keep the sphere continuously locked with respect to the camera position.
If you are interested in the math behind it, you can start by reading more about transformation matrices, which are a big part of image processing and many other areas.
Hope this helps!
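As a rough sketch of the idea, assuming a sphereNode and a fixed distance in front of the camera (both made up here), and using the per-frame renderer(_:updateAtTime:) callback rather than the anchor-update delegate method mentioned above:
import UIKit
import ARKit
import SceneKit

class ViewController: UIViewController, ARSCNViewDelegate {
    @IBOutlet var sceneView: ARSCNView!
    var sphereNode: SCNNode?          // created once the floor plane is detected
    var floorY: Float = 0             // y of the detected floor plane
    let distanceFromCamera: Float = 0.5

    // Per-frame callback (SCNSceneRendererDelegate, which ARSCNViewDelegate inherits).
    func renderer(_ renderer: SCNSceneRenderer, updateAtTime time: TimeInterval) {
        guard let sphere = sphereNode,
              let camera = sceneView.session.currentFrame?.camera else { return }

        let transform = camera.transform
        // Camera position: translation column of the 4x4 transform.
        let position = SIMD3<Float>(transform.columns.3.x,
                                    transform.columns.3.y,
                                    transform.columns.3.z)
        // Camera forward direction: negative z axis of the transform.
        let forward = -SIMD3<Float>(transform.columns.2.x,
                                    transform.columns.2.y,
                                    transform.columns.2.z)

        // Take a point a fixed distance in front of the camera and drop it onto
        // the floor plane, so the sphere slides along x/z but stays on the floor.
        let target = position + forward * distanceFromCamera
        sphere.simdPosition = SIMD3<Float>(target.x, floorY, target.z)
    }
}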

Unity3D angle between vectors/directions on specific axis

I have two directions and I am trying to calculate the angle on a specific axis. The object from which the directions are derived is not a static object, so I'm struggling to get my head round the maths to work out what I need to do.
FYI, I have the start and end points that I used to calculate the directions, if they are needed.
Here's a diagram to show what I am looking for:
The above image is from a top-down view in Unity and it shows the angle I want.
The problem can be seen in the above image: the directions are not at the same height, so I can't use the Vector3.Angle function as it won't give me the correct angle value.
In essence I want to know how much I would have to rotate the red line to the left (top view) so that it would line up with the blue (top view).
The reason I need this is that I am trying to find a way of getting the side-to-side angles of fingers from my Leap Motion sensor.
This is a generic version of my other question:
Leap Motion - Angle of proximal bone to metacarpal (side to side movement)
It will provide more specific information as to the problem if you need it and it has more specific screenshots.
UPDATE:
After re-reading my question I can see it wasn't particularly clear, so here I will hopefully make it clearer. I am trying to calculate the angle of a finger from the Leap Motion tracking data. Specifically, the angle of the finger relative to the metacarpal bone (the bone in the back of the hand). An easy way to demonstrate what I mean would be for you to move your index finger side to side (i.e. towards your thumb and then away from your thumb).
I have put two diagrams below to hopefully illustrate this.
The blue line follows the metacarpal bone, which your finger would line up with in a resting position. What I want to calculate is the angle between the blue and red lines (marked with a green line). I am unable to use Vector3.Angle as this value also takes into account the bending of the finger. I need some way of 'flattening' the finger direction out, thus essentially ignoring the bending and just looking at the side-to-side angle. The second diagram will hopefully show what I mean.
In this diagram:
The blue line represents the actual direction of the finger (taken from the proximal bone - knuckle to first joint)
The green line represents the metacarpal bone direction (the direction to compare to)
The red line represents what I would like to 'convert' the blue line to, whilst keeping its side-to-side angle (as seen in the first hand diagram).
It is also worth mentioning that I can't just always look at the x and z axes, as the hand will be moving and rotating.
I hope this helps clear things up, and I truly appreciate the help received thus far.
If I understand your problem correctly, you need to project your two vectors onto a plane. The vectors might not be in that plane currently (your "bent finger" problem) and thus you need to "project" them onto the plane (think of a tree casting a shadow onto the ground; the tree is the vector and the shadow is the projection onto the "plane" of the ground).
Luckily Unity3D provides a method for projection (though the math is not that hard). Vector3.ProjectOnPlane https://docs.unity3d.com/ScriptReference/Vector3.ProjectOnPlane.html
Vector3 a = ...;
Vector3 b = ...;
Vector3 planeNormal = ...;
Vector3 projectionA = Vector3.ProjectOnPlane(a, planeNormal);
Vector3 projectionB = Vector3.ProjectOnPlane(b, planeNormal);
float angle = Vector3.Angle(projectionA, projectionB);
What is unclear in your problem description is which plane you need to project onto. The horizontal plane? If so, planeNormal is simply the vertical (Vector3.up). But if it is in reference to some other transform, you will need to define that first.
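If the reference frame is the hand itself (which seems to be the case here), one assumption-heavy sketch is to use the hand's up direction as the plane normal and Vector3.SignedAngle so you also get which side the finger is deviating towards (the fields below are stand-ins for wherever your Leap Motion data comes from):
using UnityEngine;

public class FingerSpread : MonoBehaviour
{
    public Transform hand;            // provides the projection plane via hand.up
    public Vector3 metacarpalDir;     // direction of the metacarpal bone
    public Vector3 proximalDir;       // direction of the proximal bone

    // Signed side-to-side angle of the finger, ignoring how bent it is.
    public float SideToSideAngle()
    {
        Vector3 planeNormal = hand.up;  // normal of the back-of-hand plane
        Vector3 a = Vector3.ProjectOnPlane(metacarpalDir, planeNormal);
        Vector3 b = Vector3.ProjectOnPlane(proximalDir, planeNormal);
        // SignedAngle's sign tells you which way the finger deviates
        // relative to the metacarpal.
        return Vector3.SignedAngle(a, b, planeNormal);
    }
}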

SpriteKit move node with physics rather than updating position

My main character moves by touching and holding him and then moving your finger left or right to move the character. To do this I simply update the node's x position (he walks left and right on a flat surface) in touchesMoved() with the x position of the touch location, and apply an animation depending on the direction he's moving.
I want to take this to the next level and accomplish the same effect, but using physics, so that when I'm done moving him and release my finger, he slides a little bit in the direction he was moving, given the speed I was moving him at, if that makes sense. Does anyone know how I can accomplish this effect?
Would I have to do as I'm currently doing and update the position as it moves, but also apply a force/impulse at the same time? I'm a bit confused about how to approach this.
Moving the physics body via force, impulse, or velocity will automatically update the player position.
So you will have to play around with the correct way to accomplish your goal. But based on this, what I would suggest is replacing your .position code with .physicsBody!.velocity code in your touchesMoved. Then, on your touchesEnded, you can apply a bit of an impulse to give the player that little bit of an "on ice" effect to keep them going a tad.
You will need to experiment with how much velocity you want to get the character to move at the correct speed, and then you will need to play with the impulse figures as well to get it just right.
This could get a bit tricky though, in touchesMoved... because at some point you will want to reset the velocity to 0 (so they stop moving when your finger stops).
To do that, you will need to use previousLocation(in:) from your touch object and compare the distance moved in X. If the delta X is above 0 (or some deadzone threshold), apply the velocity; if it is 0, set the velocity to 0.
This may actually be more complicated than just using .position to move the character, then having the character slide a bit with physics on touchesEnded.
You will have to play with it to get it right.
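A rough sketch of that approach, assuming a player node with a physics body and made-up speed/deadzone values:
import SpriteKit

class GameScene: SKScene {
    var player: SKSpriteNode!        // assumed to already have a physicsBody
    let moveSpeed: CGFloat = 40      // tune to taste
    let deadzone: CGFloat = 0.5

    override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
        guard let touch = touches.first else { return }
        let dx = touch.location(in: self).x - touch.previousLocation(in: self).x
        if abs(dx) > deadzone {
            // Drive the body with velocity so physics stays in charge of position.
            player.physicsBody?.velocity = CGVector(dx: dx * moveSpeed, dy: 0)
        } else {
            player.physicsBody?.velocity = .zero
        }
    }

    override func touchesEnded(_ touches: Set<UITouch>, with event: UIEvent?) {
        // A small impulse in the direction of travel gives the "slide" effect.
        let vx = player.physicsBody?.velocity.dx ?? 0
        player.physicsBody?.applyImpulse(CGVector(dx: vx * 0.05, dy: 0))
    }
}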

Unity raycast distance incorrect for plane collisions

I've got a raycast being created from the 'yellow X' in the pictures below. It is detecting the collision hit; however, the distance is way off (~21 instead of 7, etc.). The green X shows where the collision is taking place and hence how long the distance should be.
The Line Renderer is being drawn from the start point along the forward vector for the ray hit distance, so in the second picture, where I have a much smaller plane with better tessellation, you can see it works fine, but with the larger ground plane in the first picture it is way off. It also works fine with other objects in the scene, like small boxes.
I have tried using a box collider which still has the same problem.
Any ideas?

Getting the axis pointing up in a rotation (Unity)

I have an object in Unity which has a rotation described as the following:
x, y, z, where each is a rotation ranging from 0 to 360 around its respective axis.
Now I'm trying to find out which of the vectors points up the most. Essentially I have a 6-sided die, on which I use physics to emulate a dice throw. I now want to find out which of the 6 faces of the die points upwards. I can imagine some rather advanced if statements revolving around checking the rotations individually, but I'd like to know if there is a good way to do this.
You can get the face directions with:
transform.up
-transform.up
transform.right
-transform.right
transform.forward
-transform.forward
You need to associate each direction with the appropriate face value. The side facing up will be the one with the greatest Dot Product vs Vector3.up (the world "up" direction). A dot product of 1 means a face is pointing directly up. Note that this only works because all the directions are unit vectors.
Vector3.Dot(Vector3.up, transform.up);
Given that it's only 6 (or 3, if you are clever) if statements to find the max, that's probably the best way. If you are considering the general case, i.e. to support any die shape and number of faces, you could store a list of structs with a lambda expression denoting the face direction plus the face value, then use LINQ's Max().
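As a minimal sketch of that dot-product approach (the faceValues mapping below is a made-up example; the real mapping depends on how your die model is oriented):
using UnityEngine;

public class DieFaceReader : MonoBehaviour
{
    // Face value associated with each direction below, in the same order.
    private static readonly int[] faceValues = { 1, 6, 3, 4, 5, 2 };

    public int UpFace()
    {
        Vector3[] directions =
        {
            transform.up, -transform.up,
            transform.right, -transform.right,
            transform.forward, -transform.forward
        };

        int best = 0;
        float bestDot = -1f;
        for (int i = 0; i < directions.Length; i++)
        {
            // The face whose direction is most aligned with world up wins.
            float dot = Vector3.Dot(Vector3.up, directions[i]);
            if (dot > bestDot)
            {
                bestDot = dot;
                best = i;
            }
        }
        return faceValues[best];
    }
}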
You could, as you say, check the rotations manually.
Here's an alternative collider-based approach: put invisible child game objects with individual trigger colliders on each face of the die, then whenever one of them collides with the table surface, record the value of the opposite face.