I'm looking for a camera zooming effect like the one used in Tiny Wings, where the camera zooms out based on the character's height.
I want the camera to start zooming out after the character reaches a set height, and I want the zooming to be non-linear, so that the character gradually gets closer to the camera bounds as it goes higher up the screen.
I'm currently using the following code to scale linearly:
camera.scale = MIN(1, SCREEN_HEIGHT*0.7 / player_position_y);
This results in the player always being 30% away from the top of the screen. I'm trying to find an elegant solution where the player's distance from the top edge goes from 30% down to 10%, depending on how high in the game world the character goes.
Just for completeness, I'm posting the solution I came up with.
float scalar = 4; // Had to tweak this number to get the difference in scales to feel right
float distance = player_position_y - SCREEN_HEIGHT*0.7;
float percentage = distance / (SCREEN_HEIGHT*2 - SCREEN_HEIGHT*0.7); // SCREEN_HEIGHT*2 is the max height the character can reach
percentage = 1 - (percentage/scalar);
camera.scale = MIN(1, SCREEN_HEIGHT*0.7 / (player_position_y * percentage));
Basically, I take the distance between the height where the character starts scaling and the maximum height the character can reach, expressed as a percentage of that range. I divide that percentage by a scalar and invert it, then multiply the player's height by the result inside the scale calculation. The effect is that the scale calculation uses a position that falls progressively further below the character as the character gains height.
Hi, I have a script that adjusts the distance of a camera in Unity to make sure an object is always fully in the camera's view. I do this like so:
Vector3 characterSize = UpdateBounds(totalPoints).size;
float objectSize = Mathf.Max(Mathf.Max(characterSize.x / 2, characterSize.y / 2), characterSize.z / 2);
float cameraView = 2f * Mathf.Tan(0.5f * Mathf.Deg2Rad * Camera.main.fieldOfView);
float rigRadius = cameraPadding * objectSize / cameraView;
In this case, rigRadius is the distance from the subject that makes sure the camera view contains the whole object.
The problem I am having is that when the object has a big change in size over a relatively small time period, the camera movement feels jerky and not smooth at all.
So how do I adjust this code to add some sort of smoothing value? I just can't seem to figure it out.
As far as I managed to figure out, I need to smooth the rigRadius value, but I don't know how :(
Have a target radius that your current radius smoothly moves towards.
For best(-ish) results, use a formula where the "speed" of the smoothing depends on how far the current value is from the target value (in this case, the value is the radius). In other words, zoom speed depends on the current zoom state rather than on a fixed starting point.
So really far from the target (for example, right after a big, fast change) means really fast (the start of the smoothing is quick), while really close means really slow (so the end of the smoothing is gentle).
Here's a decent tutorial on this for 2D: https://youtu.be/AvnrywsoTe0
Note how the lerp uses current camera ortho size rather than a fixed "starting" ortho size to apply the zoom, effectively making the effect I described, where zooming speed depends on current zoom and its "distance" to target zoom.
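A minimal, framework-free sketch of that idea (Python here rather than Unity C#; smooth_speed is a hypothetical tuning constant). Each step closes a fixed fraction of the remaining gap, so the absolute step size shrinks as the current value approaches the target:

```python
import math

def smooth_towards(current, target, smooth_speed, dt):
    # Frame-rate-independent exponential smoothing: the farther current
    # is from target, the larger the absolute step taken this frame.
    return current + (target - current) * (1.0 - math.exp(-smooth_speed * dt))
```

In Unity terms, you would feed rigRadius through something like this every frame instead of assigning the newly computed radius directly.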
I am building an AR application. I have some points which are real-world coordinates.
I can geolocate these points through Mapbox. My problem is that when I get far away from the points, they appear smaller. I want to see them at the same size, independently of the distance.
When I am near the points, I see them at their normal size. Even if I am 400 km away from a point, I want to see it at the same size. Is that possible?
You can try scaling the labels by some constant value times the distance to the object.
If the viewer's device is at device and the label is at target, it would be:
float experimentalScale = 0.5f;
This is the amplifier of the distance. If you increase the value, the label will get bigger at greater distances. Try out what works best for you.
float scaleFactor = Vector3.Distance(device.transform.position, target.transform.position) * experimentalScale;
target.transform.localScale = new Vector3(scaleFactor, scaleFactor, scaleFactor);
This only works if your object's original scale is 1. If it is something else, just multiply that scale by scaleFactor.
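The underlying math is just a distance times a constant; a small Python sketch of it (the positions are hypothetical 3-tuples):

```python
def scale_factor(device_pos, target_pos, experimental_scale=0.5):
    # Euclidean distance from the viewer to the label, amplified by the
    # tuning constant: twice as far away means twice the scale, so the
    # label's apparent on-screen size stays roughly constant.
    dist = sum((d - t) ** 2 for d, t in zip(device_pos, target_pos)) ** 0.5
    return dist * experimental_scale
```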
I have created a square that is 40x40, as shown above. I have a 4x40 strip that I'd like to use to animate (increase) the width of my square until it takes up the width of the whole screen within a second, regardless of the square's position. Quite similar to a progress bar loading from both sides.
UPDATE
I forgot to mention that the square is a physics body, hence the physics body must also increase as the sprite increases.
What you want to do is use SKAction.scaleXTo to achieve what you are looking for:
SKAction.scaleXTo(sceneWidth / spriteWidth, duration: 1)
Now if you want the left and right side to not scale evenly, but instead reach both edges at the same time, what you can do is change the anchor point.
The math behind this assumes that your original anchor point is (0.5,0.5)
sprite.anchorPoint = CGPointMake(sprite.position.x / scene.size.width, sprite.anchorPoint.y)
E.g., suppose the scene width is 100 and the sprite is at x = 75.
What this is basically saying is that your sprite sits at some percentage of the scene's width (75% in the example). By changing the anchor point to 0.75, the left side will fill faster than the right side when you expand the width, since the left side of the anchor point holds 75% of the width and the right side holds 25%.
Let's say we set the scale to 2: the left side of the anchor point will now be at 150%, while the right side will be at 50%.
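The arithmetic from that example can be checked directly (a Python sketch using the numbers above):

```python
scene_width = 100.0
sprite_x = 75.0

# Anchor point as a fraction of the scene width.
anchor_x = sprite_x / scene_width        # 0.75

# At scale 2, the width on each side of the anchor doubles.
scale = 2.0
left_share = anchor_x * scale            # left side reaches 150%
right_share = (1.0 - anchor_x) * scale   # right side reaches 50%
```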
In general, assuming the origin of your objects is at the top-left (or at least at the left, since we're only changing things on one axis): if you set start_x to the original x position of your square, start_width to its width, target_x to the x position of your strip, and target_width to its width, then:
x = start_x + (target_x - start_x) * a;
and
width = start_width + (target_width - start_width) * a;
And as a goes from 0.0 to 1.0, x and width will grow to match the strip.
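As a quick sketch of those two formulas together (Python; the concrete start and target numbers are hypothetical):

```python
def grow_towards(start_x, start_width, target_x, target_width, a):
    # a runs from 0.0 to 1.0 over the animation (e.g. elapsed / 1 second).
    x = start_x + (target_x - start_x) * a
    width = start_width + (target_width - start_width) * a
    return x, width
```

Driving a from 0 to 1 over one second animates the square from its own rect to the target rect.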
Hope this helps.
My Android game uses screen coordinates based on real-space coordinates. My conversion goes like this (it's all pseudocode, but I've highlighted it as code):
Real-space = position / 480; (so, for example, 240/480 would be halfway across the screen)
Velocity = 1 / Time; (Time in seconds)
Real-space = Real-space + (Velocity * delta time);
Screen coordinates = Real-space * screen width (or screen length, for the vertical axis);
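That pseudocode, written out as runnable Python (the 480 reference width is from the question; the 1080 screen width is a hypothetical value):

```python
REFERENCE_WIDTH = 480.0   # reference resolution used by the conversion
SCREEN_WIDTH = 1080.0     # hypothetical actual screen width

def to_real_space(position):
    # 240 / 480 -> 0.5, i.e. halfway across the screen.
    return position / REFERENCE_WIDTH

def step(real_space, velocity, delta_time):
    # Advance the real-space position by velocity (units per second).
    return real_space + velocity * delta_time

def to_screen(real_space):
    return real_space * SCREEN_WIDTH
```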
Now, the problem I have is that in my game I need to match the screen coordinates of two sprites so they move together (with one on top of the other), so I'm simply using something like:
Sprite1_Screen_Coords = Sprite2_Screen_Coords - Sprite1_Height
(All heights are scaled, so they are relative to the screen size the app is currently running on.)
This works OK, but I also need to match the 'real-space' coordinates.
Obviously I can do something like:
Sprite1-Real-Co-ordinates = Sprite2-Real-Co-ordinates
which means they would be the same, but what value would I subtract from this so one 'sits' perfectly on top of the other sprite? How do I derive this missing value for the sprites I have? So to summarise, I need something like:
Sprite1-Real-Co-ordinates = (Sprite2-Real-Co-ordinates - something representing the sprite's height)
Thanks all! :-)
The answer was that I simply divided the sprite's scaled height by the current screen height and used that value. It seems to work on the three resolutions I tested it on.
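In other words (a small Python sketch of that fix; the concrete screen height and sprite height are hypothetical):

```python
SCREEN_HEIGHT = 800.0   # hypothetical current screen height in pixels

def real_space_height(scaled_sprite_height):
    # Convert the sprite's scaled on-screen height into real-space units
    # (real space runs from 0 to 1 across the screen in this scheme).
    return scaled_sprite_height / SCREEN_HEIGHT

def stack_on_top(sprite2_real_y, sprite1_scaled_height):
    # Sprite 1 sits exactly one sprite-height above sprite 2.
    return sprite2_real_y - real_space_height(sprite1_scaled_height)
```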
I'd like to have a manual speedometer control where a user can drag the needle to a particular speed setting, like the following:
From the angle of the needle above, how would I read back the speed it corresponds to? For example, the needle above would read back a speed of 100.
I've done just this calculation myself (but in reverse: given a speed, adjust the needle angle) in a similar app.
Presumably your code knows the angle of the needle, since you're animating it, and you prevent the needle from going below 0 or above 160. I'll assume your needle image at rest points straight up (at 80 mph).
Figure out what the angle is when the needle points at 160 mph and at 0 mph: put in an NSLog that dumps the angle as you animate in response to touches, and note the values at which the needle just about perfectly displays 0 and 160 (you can tweak these in code later to get them exactly right). I'm also assuming that positive angles rotate clockwise and negative angles rotate counter-clockwise.
Then the speed for any arbitrary angle is:
double mphPerDegree = 160.0 / (angle160mph - angle0mph);
double speed = (currentAngleInDegrees * mphPerDegree) + 80.0; // 80 mph is the reading when the needle points straight up (angle 0)
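Sanity-checking that formula with concrete (hypothetical) calibration angles, assuming 0 degrees is the needle pointing straight up at 80 mph and the dial is symmetric:

```python
def speed_from_angle(angle_deg, angle_0mph=-120.0, angle_160mph=120.0):
    # mph per degree of rotation, derived from the two calibration angles.
    mph_per_degree = 160.0 / (angle_160mph - angle_0mph)
    # Straight up (0 degrees) is the midpoint of the dial: 80 mph.
    return angle_deg * mph_per_degree + 80.0
```

With these angles, a needle rotated 30 degrees clockwise reads 100 mph, matching the example in the question.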