Distance from light to flower box - unity3d

How can I measure the distance from a light to a flower box? I'm making a game where you can grow vegetables, and I would like to make it so that if the light is more than 5 meters away, your vegetables won't grow. Also, when the player picks up a light, it should show the distance to the nearest flower box. The flower boxes have their own tag, "flowerbox", and the lights have their own tag, "lights".
Thank you so much for your answers!

I could give you A solution, but it would make a lot more sense to craft a solution around your current work.
What you want to keep in mind is that if you're dealing with GameObjects in a 3D scene (something you didn't specify), you can find the distance between two objects by looking at their world positions (as opposed to their localPosition).
The distance between two GameObjects is simply:
var distance = (light.transform.position - flowerBox.transform.position).magnitude;
That will give you a float value, which is the distance in Unity units (de facto, 1 unit = 1 metre).
See the Vector3.magnitude documentation for details.
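To cover the "distance to the nearest flower box" part of the question, here is a minimal sketch of how that could look, assuming the script sits on each light and the boxes keep the "flowerbox" tag from the question (the class name and the growRange field are placeholders, not an official API):

    using UnityEngine;

    // Hypothetical example: reports the distance from this light to the
    // nearest GameObject tagged "flowerbox" and whether it is in grow range.
    public class LightRangeChecker : MonoBehaviour
    {
        public float growRange = 5f; // metres, assuming 1 Unity unit = 1 metre

        void Update()
        {
            GameObject nearest = null;
            float nearestSqr = float.MaxValue;

            // Compare squared distances to avoid a square root per box.
            foreach (GameObject box in GameObject.FindGameObjectsWithTag("flowerbox"))
            {
                float sqr = (box.transform.position - transform.position).sqrMagnitude;
                if (sqr < nearestSqr)
                {
                    nearestSqr = sqr;
                    nearest = box;
                }
            }

            if (nearest != null)
            {
                float distance = Mathf.Sqrt(nearestSqr);
                bool canGrow = distance <= growRange;
                Debug.Log("Nearest flower box: " + distance + " m, can grow: " + canGrow);
            }
        }
    }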

Related

Explanation of how to calculate transforms in Unity

I am getting started with Unity and am just trying to get my head around the units. What are these units? It seems they are their own 'quantity', and 2 units are simply treated as twice the value of 1 unit.
Anyway, I am trying to work out how to calculate transforms so that objects sit exactly where I want them to.
In my scene I have a terrain and a cylinder.
My cylinder is floating, and I want it to sit perfectly on top of the terrain.
My terrain is at transform 0,0,0 and scale 0,0,0 (not sure how to tell its dimensions yet).
My cylinder is part of a new object, FirstPersonPlayer.
My FirstPersonPlayer is at transform 85.9, 2.165, 51.8 and scale 1,1,1. My Cylinder is at localPosition 0,0,0 and local scale 1.2, 1.8, 1.2.
Now, the y position of FirstPersonPlayer appears to be what I need to correct.
Currently it is set to 2.165 and is floating a bit above the terrain.
Through manually shifting it, around 1.85 looks about right - but I want to know how to calculate that, rather than doing a finger in the air 'that looks about right'.
Can anyone help me? (Before you suggest using gravity etc., I actually am using it, but I don't want the player falling as soon as they start, however slight that may look or feel.)
Many thanks,
As per @Nikola Dimitroff, the answer is:
You don't have to compute anything; hold Shift + Control and drag the object. Every game engine ever made calls this "Snap to Ground".
I appreciate and agree with the other comments.
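If you do still want to calculate the resting height instead of snapping in the editor, one possible approach (my own sketch, not from the answers above) is to sample the terrain height under the player and add half the collider's height, since the FirstPersonPlayer pivot sits at the centre of the cylinder:

    using UnityEngine;

    // Sketch: positions this object so the bottom of its collider rests on the terrain.
    // Assumes the pivot is at the centre of the (cylinder) collider.
    public class PlaceOnTerrain : MonoBehaviour
    {
        void Start()
        {
            Terrain terrain = Terrain.activeTerrain;
            Collider col = GetComponentInChildren<Collider>();
            if (terrain == null || col == null) return;

            Vector3 pos = transform.position;
            // World-space height of the terrain surface directly below this object.
            float groundY = terrain.transform.position.y + terrain.SampleHeight(pos);
            // Lift the pivot by half the collider's height so the bottom touches the ground.
            pos.y = groundY + col.bounds.extents.y;
            transform.position = pos;
        }
    }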

Swift - how can I correct a sky map according to current time and location?

I'm new to Swift, and I'm trying to build a sky map app like the "Star Chart" application.
I already have a sky map image from NASA mapped onto an SCNSphere, and I have set a camera node at the center of the sphere so it looks like a 360-degree view. Furthermore, I use the accelerometer to check which direction the camera is looking.
I know that a sky map like "Star Chart" doesn't need the internet to update its data. So now the biggest problem is that I don't know how to correct the position of my sky map according to the user's current time and location.
Any good advice or help? Thanks in advance! I've tried very hard to find related information but have been stuck here for three weeks.
You just need to rotate your map by time + longitude around Earth's rotation axis, and by latitude around the axis at longitude = 90 degrees, with Earth placed at the center of your sphere. For stars the offset does not matter, so you can ignore the Sun-Earth distance and Earth's radius as well.
The time rotation must combine the daily and yearly rotations. On top of that, you have to apply precession and nutation if you want higher precision.
Of course the stars are moving too, so if you need really high precision or a large time interval to cover (hundreds or thousands of years), then this approach is not good enough and you should use a stellar catalog with proper motions implemented.
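To put numbers on the "time + longitude" rotation, one commonly used approximation (my addition, not part of the original answer) is to compute the local sidereal time and spin the star sphere by that angle around the celestial polar axis:

    \mathrm{GMST} \approx 280.46062^{\circ} + 360.98564737^{\circ}\, d \pmod{360^{\circ}}
    \mathrm{LST} = \mathrm{GMST} + \lambda_{\mathrm{east}}

Here d is the number of days (including the fraction of a day) since the J2000.0 epoch (JD 2451545.0) and λ_east is the observer's east longitude. Spin the sky by the LST angle (the sign depends on how your texture and axes are set up) about the polar axis, then tilt that axis away from the local zenith by 90° − φ, where φ is the observer's latitude; this is the "latitude around the longitude = 90° axis" rotation described above.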
For more info see related:
How to draw sky chart
Plotting a star chart efficiently
If you want to use a catalog and real colors, then you will also need:
Star B-V color index to apparent RGB color
simplified atmospheric scattering GLSL shader
And finally, here are some hints for such applications:
Is it possible to make realistic n-body solar system simulation in matter of size and mass?

SpriteKit Detect Ranges

I am making a small/mid-size RTS game, and I have the following need:
I want enemy units to attack allied units, and vice versa, when they are within each other's attack range.
My question is: what would be a good approach for finding whether enemy units are in attack range of allied units, or vice versa?
What I have tried:
For now I tried adding SKNodes with SKPhysicsBodies to each unit node.
I can see that the FPS drops when contacts happen, so I guess it wasn't the best way to detect whether enemies are in range.
I guess my alternative is to run a nested loop within the update method and check whether there are enemy units within the radius.
I am not sure it is the best approach; however, with this approach I can play with some parameters and maybe optimize the routine for my own needs.
I would like to know if there are better alternatives.
Look into GameplayKit; it may have some things you want.
Otherwise I would just use the Euclidean distance formula https://math.stackexchange.com/questions/139600/euclidean-manhattan-distance without the square root, and use this value based on squads, not individual troops (so if a group of 4 soldiers is attacking an enemy group of 5 soldiers, only 1 distance check is done).
The reason you do not take the square root is that you can compare against the squared allowable distance instead. If an enemy 10 pixels away should attract soldiers, then compare against 100.
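In symbols, for unit positions (x1, y1) and (x2, y2) and an attack range r, the in-range test without the square root is just:

    (x_1 - x_2)^2 + (y_1 - y_2)^2 \le r^2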
The best way to treat your soldiers as squads is to add them all to a parent SKNode (subclass it to add better functionality); then you just need to compare those squad SKNodes.
If you want to reduce the number of checks you make, consider turning your play area into a grid (like a chess board). Since you know the size of your tiles, you can easily check whether units are close enough to warrant a distance check. E.g. if you have a unit at a1 and an enemy at i9, you know just from the tile distance that the units are too far apart to attack each other.

New to Unity and particles & gravity

I am fairly new to Unity3D and thought I'd give it a go to make physics models.
Here is what I would like to do:
Particles, small white ones, that "stream" outwards from a point, 360 degrees, in the shape of a growing sphere. The particle system should release new bursts of these "expanding spheres", sometimes quite rapidly and sometimes less so. If I add another primitive to the scene, the particles should bounce back in the opposite direction when they hit its surface, and when they bounce each particle "loses energy" and eventually starts to fade away.
I found the "Particle System", but it does not seem to spawn a steady flow of particles to generate the "exploding effect" that I am after. I hope someone with more experience can help me out :)
You could use the option on the particle system called Bounce and set it from 0 to 1, depending on how much you want the particles to bounce off. Everything inside the Collision module is what you want. Adding a set number of bursts will give the impression of a wave; bursts are under the Emission module and need a count and a delay from the start of the system.
For the sphere shape, go to the Shape module and change the shape from Cone to Sphere.
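If you would rather configure this from a script than in the Inspector, here is a rough sketch of the same setup (the burst counts, bounce, and dampen values are placeholders to tune):

    using UnityEngine;

    // Rough sketch: configures an existing ParticleSystem for spherical bursts,
    // bouncing off world colliders, and fading out over the particle lifetime.
    [RequireComponent(typeof(ParticleSystem))]
    public class ExpandingSphereBurst : MonoBehaviour
    {
        void Start()
        {
            ParticleSystem ps = GetComponent<ParticleSystem>();

            // Emit in all directions from a sphere instead of the default cone.
            var shape = ps.shape;
            shape.shapeType = ParticleSystemShapeType.Sphere;

            // Replace the continuous flow with repeated bursts.
            var emission = ps.emission;
            emission.rateOverTime = 0f;
            emission.SetBursts(new ParticleSystem.Burst[]
            {
                new ParticleSystem.Burst(0.0f, 200), // time (s), particle count
                new ParticleSystem.Burst(0.5f, 200),
            });

            // Let particles collide with scene geometry and bounce back.
            var collision = ps.collision;
            collision.enabled = true;
            collision.type = ParticleSystemCollisionType.World;
            collision.bounce = 0.6f; // 0..1, how much energy is kept per bounce
            collision.dampen = 0.3f; // extra energy loss per bounce

            // Fade the particles out over their lifetime.
            var col = ps.colorOverLifetime;
            col.enabled = true;
            var gradient = new Gradient();
            gradient.SetKeys(
                new[] { new GradientColorKey(Color.white, 0f) },
                new[] { new GradientAlphaKey(1f, 0f), new GradientAlphaKey(0f, 1f) });
            col.color = gradient;
        }
    }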
hope this helps.

iPhone iOS: is it possible to create a rangefinder with 2 laser pointers and an iPhone?

I'm working on an iPhone robot that would be moving around. One of the challenges is estimating the distance to objects: I don't want the robot to run into things. I saw some very expensive (~$1000) laser rangefinders, and would like to emulate one using the iPhone.
I have one or two camera feeds and two laser pointers. The laser pointers are mounted about 6 inches apart, at an angle. The angle of the lasers in relation to the cameras is known, and the angle of the cameras to each other is known.
The lasers point ahead of the cameras, creating two dots in the camera feed. Is it possible to estimate the distance to the dots by looking at the distance between them in the camera image?
The lasers form a trapezoid between the laser mount and the wall:

       /  wall  \
      /          \
     / laser mount \
As the laser mount gets closer to the wall, the points should be moving further away from each other.
Is what I'm talking about feasible? Has anyone done something like that?
Would I need one or two cameras for such a calculation?
If you just don't want to run into things, rather than needing an accurate idea of the distance to them, then you could go "Dambusters" on it and just detect when the two points become one; this would happen at a known distance from the object.
For calculation, it is probably cheaper to have four lasers instead, in two pairs, each pair at a different angle, one pair above the other. Then a comparison between the relative differences of the dots would probably let you work out a reasonably accurate distance. Math Overflow for that one, though.
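For a rough sense of what the original two-laser, single-camera geometry gives you (a back-of-the-envelope model I'm adding here, assuming a pinhole camera with focal length f in pixels, a wall roughly perpendicular to the optical axis, and lasers mounted a baseline b apart, each tilted outward from the optical axis by angle θ, negative if they converge): the dots on a wall at distance D are physically b + 2D tan θ apart, so their separation p in the image is

    p = \frac{f\,(b + 2D\tan\theta)}{D} = \frac{f\,b}{D} + 2f\tan\theta
    \qquad\Longrightarrow\qquad
    D = \frac{f\,b}{p - 2f\tan\theta}

For parallel lasers (θ = 0) this reduces to the familiar D = f·b / p, and it also shows why the dots spread apart in the image as the mount approaches the wall: p grows as D shrinks.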
In theory, yes, something like this can work. Google "light striping" or "structured light depth measurement" for some good discussions of using this sort of idea on a larger scale.
In practice, your measurements are likely to be crude. There are a number of factors to consider: the camera intrinsic parameters (focal length, etc) and extrinsic parameters will affect how the dots appear in the image frame.
With only two sample points (note that structured light methods use lines, etc), the environment will present difficulties for distance measurement. Surfaces that are directly perpendicular to the floor (and direction of travel) can be handled reasonably well. Slopes and off-angle walls may be detectable, but you will find many situations that will give ambiguous or incorrect distance measures.