Detect hand swipe gesture in Unity using Kinect with OpenNI

I have a 3D model in my Unity project and I have a JavaScript that rotates the camera based on keyboard arrow keys (left/right).
Now, I need to have a script that detects a horizontal swipe hand gesture and returns a vector that I would use to rotate the camera.
I am using the ZigFu SDK with PrimeSense OpenNI/NITE. The ZigFu SDK comes with sample scripts, one of which is SwipeDetector - I am wondering how it works.
My setup:
I have 3 GameObjects: a 3D model, a MainCamera, and a Directional Light.
So, how do I use the SwipeDetector script in my project? The way I do it right now is: 1) create an empty game object "SwipeDetection", 2) drag and drop the SwipeDetector script from ZigFu onto it. I've added logs to the SwipeDetector script, but I don't see them.

The Zigfu bindings (I'm assuming you're using version 1.4?) don't have a SwipeDetector sample, but they do include a SwipeDetector MonoBehaviour. The SwipeDetector detects vertical and horizontal swipes, but unfortunately doesn't detect the velocity of the swipe.
You have a few options:
Use the provided Swipe Detector, and rotate the camera by a fixed amount every time you detect a horizontal swipe (SwipeDetector_Left or SwipeDetector_Right events)
Use the provided Swipe Detector, start rotating on Swipe, and stop rotating on the SwipeDetector_Release event. This would be similar to pressing on the arrow keys (assuming you have the same behaviour on keydown/keyup events)
Keep track of the hand velocity, and check its value when the swipe occurs. Use this value to rotate the camera. You can keep track of velocity by creating a new MonoBehaviour and implementing Hand_Create, Hand_Update, and Hand_Destroy (look at any of the scripts in the HandpointControls folder). Keep a queue with the hand points from the last n frames. The delta between the newest and oldest points will be your velocity for those n frames (I recommend you start with 15 frames, or about half a second). A rough sketch of this tracker appears right after this list.
(This will be included in a future Zigfu release :))
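Here is a minimal sketch of the queue-based tracker from option 3. It assumes the ZigFu hand point messages (Hand_Create / Hand_Update / Hand_Destroy) deliver the hand position as a Vector3 via SendMessage; check the scripts in the HandpointControls folder of your binding for the exact signatures. The class and field names are placeholders.

using UnityEngine;
using System.Collections.Generic;

// Hypothetical velocity tracker (option 3). Assumes Hand_Create / Hand_Update
// receive the hand position as a Vector3 - verify against your ZigFu version.
public class HandVelocityTracker : MonoBehaviour
{
    public int frameWindow = 15;   // about half a second at 30 fps
    private Queue<Vector3> recentPoints = new Queue<Vector3>();

    // Delta between the oldest and newest points in the window.
    public Vector3 Velocity { get; private set; }

    void Hand_Create(Vector3 position)
    {
        recentPoints.Clear();
        recentPoints.Enqueue(position);
    }

    void Hand_Update(Vector3 position)
    {
        recentPoints.Enqueue(position);
        while (recentPoints.Count > frameWindow)
            recentPoints.Dequeue();

        Velocity = position - recentPoints.Peek();
    }

    void Hand_Destroy()
    {
        recentPoints.Clear();
        Velocity = Vector3.zero;
    }
}

When a swipe event fires you could read Velocity from this component and scale your camera rotation by its horizontal (x) component.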
Your game object setup sounds right - if you don't see any logs you may not be performing the 'focus gesture' correctly. Try waving or performing a tap towards the sensor - this should cause the Hand_Create event to be called. Once you have a valid handpoint you should get the proper events from the Swipe Detector.
Also worth mentioning: your swipe detection game object should have a HandPointControl component (added implicitly with RequireComponent), and 'ActiveOnStart' should be true.
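To illustrate options 1 and 2, here is a rough sketch of a listener for the SwipeDetector messages mentioned above, placed on the same game object. I'm assuming SwipeDetector_Left, SwipeDetector_Right and SwipeDetector_Release arrive as parameterless SendMessage calls; check the SwipeDetector source in your ZigFu version for the real signatures. The camera reference and rotation amounts are placeholders.

using UnityEngine;

// Hypothetical receiver for the swipe events named in this answer.
public class SwipeCameraRotator : MonoBehaviour
{
    public Transform cameraToRotate;      // assign your MainCamera here
    public float stepDegrees = 30f;       // option 1: fixed step per swipe
    public float continuousSpeed = 90f;   // option 2: degrees per second while "held"

    private float direction = 0f;         // -1, 0 or +1 between swipe and release

    void SwipeDetector_Left()
    {
        cameraToRotate.Rotate(0f, -stepDegrees, 0f, Space.World);  // option 1
        direction = -1f;                                           // option 2
    }

    void SwipeDetector_Right()
    {
        cameraToRotate.Rotate(0f, stepDegrees, 0f, Space.World);
        direction = 1f;
    }

    void SwipeDetector_Release()
    {
        direction = 0f;   // option 2: stop rotating, like releasing an arrow key
    }

    void Update()
    {
        if (direction != 0f)
            cameraToRotate.Rotate(0f, direction * continuousSpeed * Time.deltaTime, 0f, Space.World);
    }
}

In practice you would keep only one of the two behaviours (the fixed step or the continuous rotation), not both at once.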

Related

Flutter Flame game - animation speed (update duration)

for those who are familiar with using Flame in Flutter for game development, I'm wondering if you can just advise me whether I'm on the right track, or not - because I'm not sure if what I'm seeing in my testing is what I expected. I started out with Flame because I thought it seemed like a relatively simple way to make the basic game that I'm aiming to make.
I'm making a basic game where there are four boundaries defined on each edge of the screen, and a ball will bounce around the four boundaries. The boundaries are defined as widgets (because I want to control the properties of each - sometimes they'll be "electrified", meaning the ball shouldn't collide with them). And the ball is a widget as well, of course. I've got some basic code done where I can drag a line from the ball to indicate the direction that I want to start bouncing, and then the ball will bounce around the boundaries (just using basic angle of incidence = angle of reflection to determine the direction of movement).
The code to do the movement is in the "update" method of the ball widget - however, what I'm finding is that the time between updates is somewhere around 200-300 milliseconds, so if I want to show the ball moving at any kind of pace, it has to jump a good number of pixels at each "update" tick - and thus the movement looks "jerky".
Am I doing this the right way? Is there a different (better) approach that will make the movement appear smoother? Or, I'm wondering whether the duration of the "update" ticks is a result of running the code via debug in an Android emulator? (I'm using Android Studio for the emulation, and Visual Studio Code to build the project). I know I don't have actual code here in the question, because essentially I don't have an issue with my code not running - I would just like to understand if that duration of "update" ticks is "normal", and if the resulting "jerky" animation is just to be expected - or do I need to look at a different approach? Thanks in advance!
You should preferably not use widgets for moving game parts; use Flame components instead. For example, you could have 4 SpriteComponents as the walls and the ball as another SpriteComponent, and then use Flame's collision detection system to react when the ball touches one of the walls.
https://docs.flame-engine.org/main/collision_detection.html

Move player in all directions with touch?, Unity

I was creating a basic scene in Unity. The scene has 1 cube in the center of the room and 1 camera (the player).
I need to move the player around the cube as if it were flying (with movements up, down, left, right, in and out), very similar to how we move freely with the mouse in the editor's Scene view.
I need to make this movement with touch.
How can I do it?
Thanks!!
You can achieve almost all the movements you want using a standard FPS mobile controller: 1 joystick and a slide area for rotation. Your forward movement will be your player's forward direction (with W in Unity you always move forward), and of course the transform's left/right for strafing.
The tricky part is the up/down movement (even in the Unity editor you have to use 2 extra keys, Q and E), but you can always move up/down by simply looking in that direction.
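As a rough illustration of the touch-driven movement (not a full joystick controller), here is a minimal sketch that orbits the camera around the cube with a one-finger drag and moves in/out with a two-finger pinch. The component name, target field and tuning values are all assumptions.

using UnityEngine;

// Hypothetical touch controller: one-finger drag orbits the target,
// two-finger pinch moves the camera towards/away from it.
public class TouchOrbit : MonoBehaviour
{
    public Transform target;            // the cube in the centre of the room
    public float rotateSpeed = 0.2f;    // degrees per pixel dragged
    public float zoomSpeed = 0.01f;     // units per pixel of pinch

    void Update()
    {
        if (Input.touchCount == 1 && Input.GetTouch(0).phase == TouchPhase.Moved)
        {
            Vector2 delta = Input.GetTouch(0).deltaPosition;
            // Horizontal drag orbits around the world up axis,
            // vertical drag orbits around the camera's right axis.
            transform.RotateAround(target.position, Vector3.up, delta.x * rotateSpeed);
            transform.RotateAround(target.position, transform.right, -delta.y * rotateSpeed);
        }
        else if (Input.touchCount == 2)
        {
            // Pinch: the change in distance between the two touches moves in/out.
            Touch t0 = Input.GetTouch(0);
            Touch t1 = Input.GetTouch(1);
            float prevDist = Vector2.Distance(t0.position - t0.deltaPosition,
                                              t1.position - t1.deltaPosition);
            float currDist = Vector2.Distance(t0.position, t1.position);
            transform.position += transform.forward * (currDist - prevDist) * zoomSpeed;
        }
    }
}

Attach it to the camera and assign the cube as the target; a joystick asset (or the Cross Platform Input mentioned below) can then be layered on top for strafing.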
If you use the Unity standard asset 'Cross Platform Input' (which is available in the Standard Assets pack for free), then anything you program with a mouse event or click will automatically trigger the corresponding touch event when used on a phone.

crosshair pointer enter on vuforia unity project

I have successfully enabled 3D object detection through Vuforia in Unity. I have attached a crosshair (reticle) at the centre of the screen in a screen-space overlay. When the user moves their phone over the 3D object that is produced upon object detection, I want a label to appear when the crosshair crosses different parts of the 3D object. I tried many methods, including collision, cursor and reticle, but none of them worked.
Is there an easy way to implement this so that I can use an Event Trigger pointer-enter to make a few things happen in the game?
I successfully solved my problem. The solution is using a world-space crosshair.
Most of the crosshairs available in the assets are camera-space, therefore using a world-space crosshair solved my problem. It may be useful to someone in the future.
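For anyone trying the same thing, here is a rough sketch of one way a world-space crosshair can drive labels: cast a ray from the screen centre every frame, keep the reticle at the hit point, and show the name of whichever part of the detected object was hit. This is my own illustration, not the exact script used above, and it assumes each part of the 3D model has its own collider; all names are placeholders.

using UnityEngine;
using UnityEngine.UI;

// Hypothetical world-space crosshair with a hover label.
public class WorldSpaceCrosshair : MonoBehaviour
{
    public Camera arCamera;       // the AR camera rendering the detected object
    public Transform reticle;     // a small world-space quad/sprite used as the crosshair
    public Text label;            // a UI Text used to display the part name

    void Update()
    {
        Ray ray = arCamera.ViewportPointToRay(new Vector3(0.5f, 0.5f, 0f));
        RaycastHit hit;

        if (Physics.Raycast(ray, out hit))
        {
            // Keep the reticle just in front of the surface it is pointing at.
            reticle.position = hit.point + hit.normal * 0.01f;
            reticle.rotation = Quaternion.LookRotation(-hit.normal);

            label.text = hit.collider.name;   // e.g. the name of the model part
            label.enabled = true;
        }
        else
        {
            label.enabled = false;
        }
    }
}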

Make an object rotate according to mouse movement. [UNITY C#]

I’m starting a learning project. The idea is that you have an archer character that is static, that has a bow attached to it that shoots arrows to targets of varying difficulty.
Turns out that right at the start I’m stuck. How do I make it so that the bow rotates when the player clicks and holds the mouse anywhere on the screen? So I click+hold and move left/right and the bow rotates left/right to aim the shot. I’d like to also eventually make it portable to phones (so you’d tap+hold etc).
Stack Overflow isn't a code-writing service, but I will explain what you must do:
Method 1 (Exact Aiming):
For every frame the mouse is down:
Make a Ray from a screen point (hint: use camera.ScreenPointToRay).
Get a far point along the Ray using ray.GetPoint(distance);.
Call bow.transform.LookAt(newPoint, Vector3.up);.
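A compact sketch of Method 1, assuming the bow only needs to face the point under the cursor (the camera reference, distance and script name are placeholders):

using UnityEngine;

// Hypothetical Method 1: aim the bow at a far point under the mouse cursor.
public class BowExactAim : MonoBehaviour
{
    public Camera cam;              // usually Camera.main
    public float aimDistance = 50f; // how far along the ray to aim

    void Update()
    {
        if (Input.GetMouseButton(0))   // every frame the mouse is held down
        {
            Ray ray = cam.ScreenPointToRay(Input.mousePosition);
            Vector3 aimPoint = ray.GetPoint(aimDistance);
            transform.LookAt(aimPoint, Vector3.up);   // attach this script to the bow
        }
    }
}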
Method 2 (Continuous Movement):
Make a variable oldMousePos to store a Vector2 location.
Record your initial screen click position into that variable on a mouse down event.
Have a function that runs once every frame the mouse stays down.
For the direction of the rotation of the bow, you can use (newMousePos - oldMousePos).normalized;.
For the speed of rotation for your bow, you can use (newMousePos - oldMousePos).sqrMagnitude;.
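And a sketch of Method 2, rotating from the drag delta; the rotation axis and the speed scaling are assumptions you would tune for your scene:

using UnityEngine;

// Hypothetical Method 2: rotate the bow based on how far the mouse
// has been dragged since the initial click.
public class BowDragRotate : MonoBehaviour
{
    public float speedScale = 0.001f;   // tames the sqrMagnitude value
    private Vector2 oldMousePos;

    void Update()
    {
        if (Input.GetMouseButtonDown(0))
        {
            oldMousePos = Input.mousePosition;   // record the initial click position
        }
        else if (Input.GetMouseButton(0))        // every frame the mouse stays down
        {
            Vector2 newMousePos = Input.mousePosition;
            Vector2 direction = (newMousePos - oldMousePos).normalized;
            float speed = (newMousePos - oldMousePos).sqrMagnitude * speedScale;

            // Horizontal drag turns the bow left/right around the vertical axis.
            transform.Rotate(0f, direction.x * speed * Time.deltaTime, 0f);
        }
    }
}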

Dragging a Sprite (Cocos2D) while Chipmunk is simulating

I have a simple project built with Cocos2D and Chipmunk. So far it's just a Ball (body, shape & sprite) bouncing on the Ground (a static line segment at the bottom of the screen).
I implemented the ccTouchesBegan/Moved/Ended methods to drag the ball around.
I've tried both:
cpBodySlew(ballBody, touchPoint, 1.0/60.0f);
and
ballBody->p = cpv(touchPoint.x, touchPoint.y);
and while the Ball does follow my dragging, it's still being affected by gravity and it tries to go down (which causes velocity problems and others).
Does anyone know of the preferred way to Drag an active Body while the physics simulation is going on?
Do I need somehow to stop the simulation and turn it back on afterwards?
Thanks!
Temporarily remove the body from the space.
If you want the object to have inertia when you release it, that's a different story. The cleanest way is to attach a fairly stiff spring between the ball and a temporary sensor body that moves under the control of your finger. When you let go with your finger, the ball will retain whatever kinematics it had while you were dragging it. Be sure not to remove the ball from the space in this case.
You aren't updating the velocity of the object when you aren't using cpBodySlew(). That is why it falls straight down.
A better way to do it is to use a force clamped pivot joint like the official demos do to implement mouse control. http://code.google.com/p/chipmunk-physics/source/browse/trunk/Demo/ChipmunkDemo.c#296