How to make a perfect pinch zoom (Unity 3D) - unity3d

I am searching for a solution to how to make a perfect pinch zoom in Unity by moving the camera along the forward:
Setup:
A horizontal plane centred at the origin with all GameObjects.
A perspective camera with FOV 10, offset at (10,10,10), looking down at a 45-degree angle so that it looks at the origin (there is also a 45-degree rotation around the up axis to achieve this).
What I need:
When I place two fingers on the screen I am touching two GameObjects with them - so the screen coordinates under the fingers correspond to certain world coordinates. When I make a pinch movement (with moving two fingers or only one) I want the new screen coordinates to correspond to the same world coordinates that were under the fingers at the beginning of the whole interaction.
So to simplify even further - whenever I touch the screen with two fingers, I want the world coordinates corresponding to the screen coordinates under my fingers to always stay under the fingers (allowing a very small margin of error).
An example of the perfect zoom I am looking for can be seen in the mobile game Boom Beach from Supercell.
I have already tried moving the camera along its forward vector and repositioning it, and I get pretty good results, but the GameObjects underneath almost always 'slip' away from under my fingers, that is, at some point they are no longer underneath them. It would be great if there were a mathematical solution to this, but if it's necessary to compute the answer (through some search, for example) then this is totally fine.
If the setup/scenario is not clear, I could provide some sketches to clarify it a bit more.
Hope someone can help me! :)

I would set up a system that detects when the user is zooming in or out. If you are using GameObjects to pinpoint where the fingers are, that is easy to do with Vector3.Distance. After that, I would write a function that moves the camera towards your desired zoom level with Vector3.MoveTowards(cameraPosition, desiredPosition, movementSpeed), where I would set movementSpeed to Mathf.Sqrt(Vector3.Distance(cameraPosition, desiredPosition)).
As for the desired position, I would set that Vector3 as a fraction of the line between two GameObjects that represent your maximum and minimum zoom levels.
EDIT: With that, you should have a very nice camera system.
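A minimal sketch of this approach, assuming two anchor transforms marking the zoom limits and a made-up pinch sensitivity factor (neither is specified in the answer):

```csharp
using UnityEngine;

public class PinchZoomCamera : MonoBehaviour
{
    public Transform zoomedOutAnchor;  // hypothetical GameObject marking minimum zoom
    public Transform zoomedInAnchor;   // hypothetical GameObject marking maximum zoom

    [Range(0f, 1f)] public float zoomLevel = 0.5f; // fraction along the anchor line
    float lastPinchDistance;

    void Update()
    {
        if (Input.touchCount == 2)
        {
            Touch a = Input.GetTouch(0);
            Touch b = Input.GetTouch(1);
            float pinchDistance = Vector2.Distance(a.position, b.position);

            if (a.phase != TouchPhase.Began && b.phase != TouchPhase.Began)
            {
                // Widening pinch -> zoom in, narrowing pinch -> zoom out.
                // 0.001f is an assumed sensitivity; tune per device.
                zoomLevel = Mathf.Clamp01(zoomLevel + (pinchDistance - lastPinchDistance) * 0.001f);
            }
            lastPinchDistance = pinchDistance;
        }

        // Desired position is a fraction of the line between the two anchors.
        Vector3 desired = Vector3.Lerp(zoomedOutAnchor.position, zoomedInAnchor.position, zoomLevel);
        float speed = Mathf.Sqrt(Vector3.Distance(transform.position, desired));
        transform.position = Vector3.MoveTowards(transform.position, desired, speed * Time.deltaTime);
    }
}
```

Note this smoothly interpolates the camera between two fixed limits; it does not by itself guarantee that the world points under the fingers stay fixed, which is what the question asks for.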

Related

I've implemented horizontal swipe to spin a player 90 degrees, but spinning the camera breaks my swipe logic. How could I solve this problem?

I'm a beginner in Unity and I'm making a top-down dungeon crawler with on-rails movement, like a chessboard with a custom tile design. Swiping horizontally spins the player a fixed 90 degrees clockwise or counterclockwise depending on whether the swipe was to the right or to the left, and swiping up moves forward to the tile in front of the player; the dungeon is a graph and the tiles are nodes.
The problem is that it's important for the camera to spin along with the player, so I made the camera a child of the player object, and it spun as I predicted. But then I realized the axes for the touch input changed, as if I had spun my smartphone 90 degrees; the axes are tied to the camera.
I use Camera.main.ScreenToWorldPoint to get the first and last points of a swipe.
I am using the variation of x and y between the touch and release spots relative to the screen when Input.GetTouch(0).phase is TouchPhase.Ended:
deltaX = releaseX - touchX;
deltaY = releaseY - touchY;
touchX and touchY come from a variable that holds the first touch position relative to the screen; releaseX and releaseY come from the variable that holds the current touch position of GetTouch(0), since it changes every update.
Swipe up: if deltaY is over 1, there was a good swipe up, but since I don't want very diagonal swipes, deltaX needs to be within a tolerance of -1 to 1, otherwise the swipe is ignored.
Swipe right: if deltaX is over 1, there was a good swipe to the right; since I don't want very diagonal swipes, deltaY needs to be within a tolerance of -1 to 1, otherwise the swipe is ignored.
Swipe left: if deltaX is under -1, there was a good swipe to the left; since I don't want very diagonal swipes, deltaY needs to be within a tolerance of -1 to 1, otherwise the swipe is ignored.
The logic works until the camera is spun. Suppose I swipe to the right: since the camera has spun, its axes have also spun, as if I had spun my phone in my hands. The axes are spun counterclockwise, so swiping to the right has the effect of swiping down.
I got rid of ScreenToWorldPoint and watched the console to see the values of the points, but the code still doesn't work, because the values of x and y must be the same for the same spot on the screen, while the world values change whenever the camera moves. So moving the camera messes with the swipe logic, because a swipe is relative to the smartphone screen, not the world.
Imagine the point p(0,0), I think I need it to always be the center of the screen because I will use it as a reference to calculate the type of a swipe and if it's valid. Using ScreenToWorld and the camera was how I learned to do this. Maybe if I have an anchor in the center of the screen I could use ScreenToWorld with it, instead of using the camera. I think it is trivial for developers to deal with touch and camera, so there must be a conventional way of doing it.
How could I solve this problem?
You want to get the position of the touch relative to the camera. Use ScreenToViewportPoint instead of ScreenToWorldPoint. You'll probably need a smaller threshold, because viewport coordinates are expressed in terms of viewport coverage (0 to 1) rather than world units.
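A sketch of the swipe logic from the question rewritten in viewport space; the 0.1 threshold (10% of the screen) is an assumption standing in for the original 1-world-unit tolerance:

```csharp
using UnityEngine;

public class SwipeDetector : MonoBehaviour
{
    const float Threshold = 0.1f; // assumption: 10% of the viewport
    Vector3 touchStart;

    void Update()
    {
        if (Input.touchCount == 0) return;
        Touch touch = Input.GetTouch(0);

        if (touch.phase == TouchPhase.Began)
            touchStart = Camera.main.ScreenToViewportPoint(touch.position);

        if (touch.phase == TouchPhase.Ended)
        {
            Vector3 release = Camera.main.ScreenToViewportPoint(touch.position);
            float deltaX = release.x - touchStart.x;
            float deltaY = release.y - touchStart.y;

            // Viewport coordinates are a pure 2D mapping of the screen, so
            // these axes stay aligned with the physical device no matter how
            // the camera is rotated in the world.
            if (deltaY > Threshold && Mathf.Abs(deltaX) < Threshold)
                Debug.Log("Swipe up");
            else if (deltaX > Threshold && Mathf.Abs(deltaY) < Threshold)
                Debug.Log("Swipe right");
            else if (deltaX < -Threshold && Mathf.Abs(deltaY) < Threshold)
                Debug.Log("Swipe left");
        }
    }
}
```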

Normalize Vector3.Distance based on rotation

I am trying to measure distance between multiple positions but I do not want the rotation to affect the distance. In concept, I want to track the starting transform and upon each update track the distance traveled without regard to the change in rotation. I am using an HTC Vive controller and people tend to rotate their hands and I want to control for this.
I've tried resetting the Euler angles, but this doesn't seem to work.
Here is an analogy that should help.
Think of it like trying to draw and measure a line with a pencil: the position is at the eraser, and I can hold the pencil in any number of ways, and even change my grip in the middle of drawing the line, but my line will remain straight and the measurement will remain accurate.
Any help is appreciated.
I believe your problem lies around the position you are tracking. It sounds like you are tracking the transform.position of one of the child elements of the Vive controller model, leading to the situation that you're describing with the pencil eraser analogy.
Depending on where your script is attached, you could either move this to the top level element of the Vive controller, or alter your script to instead track transform.parent.position, which shouldn't be affected by the rotations of someone's hand.
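A sketch of the second option above, assuming the script sits on a child of the controller model and that `transform.parent` is the top-level tracked object:

```csharp
using UnityEngine;

public class DistanceTracker : MonoBehaviour
{
    Vector3 lastPosition;
    public float totalDistance;

    void Start()
    {
        // Assumption: the parent is the top-level controller object, whose
        // position is the tracked pivot rather than a point on the model.
        lastPosition = transform.parent.position;
    }

    void Update()
    {
        Vector3 current = transform.parent.position;
        // Only position deltas are summed; rotation never enters the
        // calculation, so twisting the wrist does not add distance.
        totalDistance += Vector3.Distance(lastPosition, current);
        lastPosition = current;
    }
}
```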

How can I rotate an object based on an angle?

I am working in Unity3D and I was wondering how I would rotate a cube based on the angle between the cube and the mouse position. By that I don't mean the mouse position in world space, but in pixels.
Here are some pages that'll lead you to your answer:
Input.mousePosition — this also includes an example of how to turn screen coordinates into a ray in world coordinates. If you know how far away from the camera you want your point, check out ScreenToWorldPoint to get a point instead of a ray.
transform.Rotate — to perform a rotation.
The rest of your question is kinda vague: rotating "based on" the angle between the cube and the mouse position could mean a lot of things. You shouldn't need much more information than this, though!
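One possible reading of the question, as a sketch: make the cube track the angle between its own screen position and the mouse position, both in pixels. The choice of the z axis as the rotation axis is an assumption:

```csharp
using UnityEngine;

public class RotateTowardsMouse : MonoBehaviour
{
    void Update()
    {
        // Project the cube's position into the same pixel space as the mouse.
        Vector3 cubeOnScreen = Camera.main.WorldToScreenPoint(transform.position);
        Vector3 toMouse = Input.mousePosition - cubeOnScreen;

        // Angle of the screen-space direction from the cube to the mouse.
        float angle = Mathf.Atan2(toMouse.y, toMouse.x) * Mathf.Rad2Deg;
        transform.rotation = Quaternion.Euler(0f, 0f, angle);
    }
}
```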

iPhone - detecting circle movements

I have an area where the user can draw with a finger. This area contains an object that I want the user to be able to rotate clockwise or anti-clockwise. My idea is to give the user the whole screen to control the object: if the user starts describing clockwise finger movements in a circular pattern, the object will rotate in that direction; if the movements are anti-clockwise, the object will rotate the other way.
So the idea is to detect whether the finger is describing circular movements, clockwise or anti-clockwise, and through what angle. This has to be real-time: as the user rotates their finger, the object rotates.
I have seen apps doing something like that, where the user draws a shape and, boom, the app replaces the clunky drawn shape with a pretty one. In essence the app detected that a circle, a triangle, etc. was drawn and replaced that gesture with a real, pretty shape.
How do I do this kind of stuff? I am just interested in circle movements.
Can you guys point me in the right direction?
Thanks.
Check out the "Hough transform for circle detection". Here is a good blog post to start with:
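For the real-time rotation part specifically, a simpler alternative to a Hough transform (this is my sketch, not part of the answer above) is to accumulate the signed change in the finger's angle around the object's centre between successive touch samples. The math is platform-agnostic; it is shown here in C#:

```csharp
using System;

public static class CircleGesture
{
    // Returns the signed angle (radians) swept around (centerX, centerY) as
    // the finger moves from (prevX, prevY) to (currX, currY).
    // Positive = anti-clockwise in a y-up coordinate system, negative = clockwise.
    public static double SweptAngle(
        double centerX, double centerY,
        double prevX, double prevY,
        double currX, double currY)
    {
        double a1 = Math.Atan2(prevY - centerY, prevX - centerX);
        double a2 = Math.Atan2(currY - centerY, currX - centerX);
        double delta = a2 - a1;

        // Wrap into (-pi, pi] so crossing the negative-x axis
        // doesn't produce a 2*pi jump.
        while (delta > Math.PI) delta -= 2 * Math.PI;
        while (delta <= -Math.PI) delta += 2 * Math.PI;
        return delta;
    }
}
```

Summing SweptAngle over successive touch-move events gives the total rotation so far; its sign tells you the direction, and you can apply the delta to the object on every event for real-time feedback.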

mouse joint is not working to restrict the ball in the half part of the screen

Hi guys, I am developing an application in cocos2d using the Box2D framework, but unfortunately I'm not able to restrict the gray ball to the half of the screen shown in the image here.
I want the ball not to go to the opposite part of the screen.
I have used b2MouseJoint to move the ball around the screen.
b2PrismaticJointDef restricts movement along a particular axis, but I want to restrict it to a particular rectangular area of the screen.
You could create a custom distance joint that restricts the global axes of the ball, but that will be hard if you have never written your own physics engine.
There are two easier ways to implement what you want.
Create four static "border" boxes around the area where the ball must stay, then place the ball and the boxes into one collision group.
However, the response of the "border" boxes will not be instant, so at high speed the ball will sometimes "sink" into the boxes before being popped out.
Alternatively, you can correct the position and reset the velocity of the ball manually in code when it crosses the bounds of the desired area. But this may lead to an unstable physics simulation.
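A sketch of that second, manual-correction approach, assuming `ball` is the b2Body driven by the mouse joint and the allowed area is given as an AABB in Box2D world coordinates; you would call this once per frame after stepping the world:

```cpp
#include <Box2D/Box2D.h>

void ClampBallToArea(b2Body* ball, const b2AABB& allowedArea)
{
    b2Vec2 p = ball->GetPosition();
    b2Vec2 clamped;
    clamped.x = b2Clamp(p.x, allowedArea.lowerBound.x, allowedArea.upperBound.x);
    clamped.y = b2Clamp(p.y, allowedArea.lowerBound.y, allowedArea.upperBound.y);

    if (clamped.x != p.x || clamped.y != p.y)
    {
        // Push the ball back inside the area and kill its velocity so it
        // does not immediately shoot out again on the next step.
        ball->SetTransform(clamped, ball->GetAngle());
        ball->SetLinearVelocity(b2Vec2(0.0f, 0.0f));
    }
}
```

Zeroing the velocity is the simplest choice; a gentler option is to zero only the velocity component that points out of the area.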