AndEngine: how to set a circular wall?

My physics world needs a circular boundary. How can that be set? The available primitive objects do not include a 'circle', so I can't specify the wall bounds directly. Kindly guide.

There is no direct support as far as I have tried, but you can place repeated line segments around the circumference of the circle to build up a circular collision wall.
You can use the following link to generate the points for those lines: Formula to find points on the circumference of a circle, given the center of the circle and the radius
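For illustration, here is a minimal sketch of that approach using the AndEngine GLES2 Box2D extension (PhysicsFactory, FixtureDef, Line); the center, radius and segment count are made-up example values, and the exact helper signatures may differ between AndEngine versions:

import org.andengine.entity.primitive.Line;
import org.andengine.extension.physics.box2d.PhysicsFactory;
import com.badlogic.gdx.physics.box2d.FixtureDef;

// Approximate a circular wall with short static line bodies around the circumference.
// Assumes 'scene', 'physicsWorld' and 'vertexBufferObjectManager' already exist.
final float centerX = 400, centerY = 240;  // circle center in scene coordinates (example values)
final float radius = 200;                  // wall radius (example value)
final int segments = 36;                   // more segments = smoother wall
final FixtureDef wallFixture = PhysicsFactory.createFixtureDef(0, 0.5f, 0.5f);
for (int i = 0; i < segments; i++) {
    final double a1 = 2 * Math.PI * i / segments;
    final double a2 = 2 * Math.PI * (i + 1) / segments;
    // points on the circumference: (cx + r*cos(a), cy + r*sin(a))
    final float x1 = (float) (centerX + radius * Math.cos(a1));
    final float y1 = (float) (centerY + radius * Math.sin(a1));
    final float x2 = (float) (centerX + radius * Math.cos(a2));
    final float y2 = (float) (centerY + radius * Math.sin(a2));
    final Line segment = new Line(x1, y1, x2, y2, vertexBufferObjectManager);
    PhysicsFactory.createLineBody(physicsWorld, segment, wallFixture);
    scene.attachChild(segment);  // optional: also draws the wall segment
}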

Related

How do I Convert an OpenLayers Polygon to a Circle?

I have a drawing feature where, in one case, a person can draw a circle using the methodology in the OL docs example. When that's saved, it needs to be converted to a polygon for the server, and I was able to do that using fromCircle.
Now I need to make the circle modifiable after it's been converted and saved, but I don't see a clear-cut way to get a Circle geometry out of the Polygon tools provided in the library. There is a Polygon.circular, but that doesn't sound like what I want.
I'm guessing the only way to do this is to grab the center and one of the segment points, and figure out the radius manually?
As long as fromCircle used sides set to a multiple of 4 and a rotation of zero (which are the defaults), the center and radius can easily be obtained to convert back to a circle:
center = ol.extent.getCenter(polygon.getExtent());
radius = ol.extent.getWidth(polygon.getExtent())/2;

How do I measure a prefab's range?

I am building a first-person firearm simulator. When I fire, a bullet-hole prefab appears on the target board.
This is my target: when I fire, the hole prefabs stick to the target board like the red rounds.
I need to get the range of the holes, i.e. measure whether each hole falls on the 4-inch, 6-inch or 10-inch ring.
First, place a GameObject at the center of the target. Then create a float variable for each ring that records how far that ring is from the center; my suggestion is to duplicate the center GameObject, move the copy along the x, y or z axis to each ring, and note the distance. Once you have those numbers, compute how far away the bullet is from the center. Finally, compare that distance against the ring distances using if statements with greater-than (>) and less-than (<) checks to find which rings the bullet lies between.
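The core of that answer is just a distance comparison. Here is a minimal sketch of the ring-classification logic, written in plain Java for illustration (in Unity you would express the same thing in C#, e.g. with Vector3.Distance); the ring radii and coordinates are made-up example values:

public class RingScorer {

    // Hypothetical ring radii in inches: 4", 6" and 10" rings have radii 2", 3" and 5".
    static final double[] RING_RADII = {2.0, 3.0, 5.0};
    static final String[] RING_NAMES = {"4-inch", "6-inch", "10-inch"};

    /** Returns the name of the smallest ring that contains the hit, or "miss". */
    static String classify(double centerX, double centerY, double hitX, double hitY) {
        double dx = hitX - centerX;
        double dy = hitY - centerY;
        double distance = Math.sqrt(dx * dx + dy * dy);  // distance of the hole from the center

        for (int i = 0; i < RING_RADII.length; i++) {
            if (distance <= RING_RADII[i]) {
                return RING_NAMES[i];
            }
        }
        return "miss";
    }

    public static void main(String[] args) {
        System.out.println(classify(0, 0, 1.2, 2.4));  // distance ~2.68 -> "6-inch"
    }
}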

How to find objects with a circular hole inside

Hello, in the following picture I need to find the objects that have a circular hole inside them. I tried to use the Euler number, but it let me find all holes, not only the circular ones.

How Does Unity Assign the Pivot Point Location on Script-Generated Meshes?

I have tried to find information on how Unity assigns pivot points to objects, but all I keep finding is threads on how to move pivot points and that it can't be done. I am creating a 2D game with a background that is randomly created from meshes wrapped in empty GameObjects. These objects are organically shaped, but they have a property that returns a rectangle bounding the object so that they can be placed without overlapping. The trouble is that the algorithm assumes the pivot point is going to be the center of the object. What I would like to know is how Unity decides where the pivot point will be set, so that I can predict how much I need to move my mesh inside the parent object to put the pivot point at the center of the bounding rectangle.
Possible fix:
Try creating the meshes at runtime and see if Unity always places the pivot point at a certain corner, or at least in relatively the same location.
If it does, you know where the pivot point is and can take it into account in your code, provided you also know the size of the mesh you spawn.
So I think the most general and correct answer I can come up with is that Unity assigns the pivot point to the center of the GameObject that you apply the Mesh to. Depending on how you create them, the local coordinates of the mesh's vertices might place the mesh so that its logical center is not the same as that of the empty GameObject it is attached to. What I did to fix the issue was to make a vector from the local point (0,0,0) to the center of the bounding rectangle and translate the vertices I use to build my mesh by that vector inverted. It wasn't perfect, but close enough to ensure that I won't have any overlapping meshes.
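For illustration, here is a minimal sketch of that translation step, written in plain Java (in Unity you would apply the same shift to the Vector3[] before assigning it to mesh.vertices); the 2D vertex representation is an assumption for brevity:

// Find the center of the vertices' bounding rectangle and shift every vertex by the
// inverse of that vector, so the mesh's logical center coincides with the GameObject's
// local origin (the pivot). vertices[i] = {x, y}.
static void recenterVertices(float[][] vertices) {
    float minX = Float.MAX_VALUE, minY = Float.MAX_VALUE;
    float maxX = -Float.MAX_VALUE, maxY = -Float.MAX_VALUE;
    for (float[] v : vertices) {
        minX = Math.min(minX, v[0]);  maxX = Math.max(maxX, v[0]);
        minY = Math.min(minY, v[1]);  maxY = Math.max(maxY, v[1]);
    }
    float centerX = (minX + maxX) / 2f;  // bounding-rectangle center
    float centerY = (minY + maxY) / 2f;
    for (float[] v : vertices) {         // translate by the inverted center vector
        v[0] -= centerX;
        v[1] -= centerY;
    }
}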

OpenGL ES 2.0: Touch detection

Hi guys, I am doing some work on iOS that requires the use of OpenGL ES. I now have a bunch of squares, cubes and triangles on the screen, and some of these geometries might overlap. Any ideas/approaches for touch detection?
Regards
To follow up on the answer already given, squares, cubes and triangles are convex shapes so you can perform ray-object intersection quite easily, even directly from the geometry rather than from the mathematical description of the perfect object.
You're going to need to be able to calculate the distance of a point from a plane and the intersection of a ray with a plane. As a simple test you can implement yourself very quickly: for each polygon on the convex shape, work out the intersection between the ray and the polygon's plane. Then check whether that point is behind all the planes defined by polygons that share an edge with the one you just tested. If so, the hit is on the surface of the object, though you should be careful about coplanar adjoining polygons and rounding errors.
Once you've found a collision you can easily get the length of the ray to the point of collision. The object with the shortest distance is the one that's in front.
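Here is a minimal Java sketch of those two tests; Vec3 and Plane are hypothetical helper types, not from any particular library:

final class Vec3 {
    final double x, y, z;
    Vec3(double x, double y, double z) { this.x = x; this.y = y; this.z = z; }
    Vec3 add(Vec3 o)     { return new Vec3(x + o.x, y + o.y, z + o.z); }
    Vec3 scale(double s) { return new Vec3(x * s, y * s, z * s); }
    double dot(Vec3 o)   { return x * o.x + y * o.y + z * o.z; }
}

final class Plane {
    final Vec3 normal;  // unit normal pointing out of the convex object
    final double d;     // plane equation: normal . p + d = 0
    Plane(Vec3 normal, double d) { this.normal = normal; this.d = d; }

    /** Signed distance of a point from the plane (negative = behind). */
    double distance(Vec3 p) { return normal.dot(p) + d; }

    /** Intersection of the ray origin + t*dir with this plane, or null if none in front. */
    Vec3 intersect(Vec3 origin, Vec3 dir) {
        double denom = normal.dot(dir);
        if (Math.abs(denom) < 1e-9) return null;          // ray is parallel to the plane
        double t = -(normal.dot(origin) + d) / denom;
        return t < 0 ? null : origin.add(dir.scale(t));   // only accept hits in front of the origin
    }
}

// For one face of the convex shape: intersect the ray with the face's plane, then accept
// the hit only if the point lies behind every neighbouring face's plane (a small epsilon
// guards against coplanar adjoining polygons and rounding errors).
static boolean hitsFace(Vec3 origin, Vec3 dir, Plane face, java.util.List<Plane> neighbours) {
    Vec3 p = face.intersect(origin, dir);
    if (p == null) return false;
    for (Plane n : neighbours) {
        if (n.distance(p) > 1e-6) return false;           // in front of a neighbour: outside the face
    }
    return true;
}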
If that's fast enough then great; otherwise you'll probably want to look into partitioning the world or breaking objects down to their silhouettes. Convex objects are really simple: consider all the edges that run between one polygon and the next. If exactly one of those polygons is front facing, then the edge is part of the silhouette. All the silhouette edges together can be projected to a convex 2D shape on the view plane, and you can then test touches by performing a 2D point-in-polygon test against that shape.
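For that final 2D step, a standard even-odd (crossing-number) point-in-polygon test is enough; here is a minimal Java sketch, with the projected silhouette given as parallel x/y arrays:

// Even-odd rule: count how many polygon edges a horizontal ray from the point crosses.
static boolean pointInPolygon(float px, float py, float[] xs, float[] ys) {
    boolean inside = false;
    for (int i = 0, j = xs.length - 1; i < xs.length; j = i++) {
        boolean crosses = (ys[i] > py) != (ys[j] > py)
                && px < (xs[j] - xs[i]) * (py - ys[i]) / (ys[j] - ys[i]) + xs[i];
        if (crosses) inside = !inside;  // each crossing toggles inside/outside
    }
    return inside;
}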
A further common alternative that eliminates most of the maths is picking. You'd render the scene to an invisible buffer with each object appearing as a solid blob in a suitably unique colour. To test for touch, you'd just do a glReadPixels and inspect the colour.
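For illustration, here is a minimal sketch of the read-back step, shown with the Android Java GLES20 bindings purely because the other sketches here are in Java; on iOS the equivalent C call to glReadPixels takes the same arguments. It assumes the picking pass has just been rendered with each object in a unique solid colour:

import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import android.opengl.GLES20;

// Read the colour under the touch point after rendering the picking pass.
static int readPickedObjectId(int touchX, int touchY, int viewportHeight) {
    ByteBuffer pixel = ByteBuffer.allocateDirect(4).order(ByteOrder.nativeOrder());

    // OpenGL's window origin is bottom-left; touch coordinates are usually top-left.
    int glY = viewportHeight - touchY - 1;
    GLES20.glReadPixels(touchX, glY, 1, 1, GLES20.GL_RGBA, GLES20.GL_UNSIGNED_BYTE, pixel);

    int r = pixel.get(0) & 0xFF;
    int g = pixel.get(1) & 0xFF;
    int b = pixel.get(2) & 0xFF;
    return (r << 16) | (g << 8) | b;  // decode the unique colour back into an object id
}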
For the purposes of GLU on the iPhone, you can grab SGI's implementation (as used by Mesa). I've used its tessellator in a shipping production project before.
I had that problem in the past. What I used is an implementation of gluUnProject that you can find on Google (it uses the inverse of the model-view-projection matrix and the viewport size). This allows you to map the 2D screen coordinates to a 3D vector into the world. Then you can use this vector to intersect with your objects and see which one intersects (or comes really close to doing so).
I do hope there are better ways of doing this, so I look forward to other answers as well!
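Here is a rough Java sketch of that unproject step (the same math gluUnProject performs); invMvp is assumed to be the already-inverted model-view-projection matrix in OpenGL's column-major float[16] layout, and the touch y coordinate is assumed to have its origin at the top of the screen. Unprojecting the touch at winZ = 0 and winZ = 1 gives the two ends of the pick ray:

// Map a window-space point (winX, winY, winZ in [0,1]) back to world space.
static float[] unproject(float winX, float winY, float winZ,
                         int viewportW, int viewportH, float[] invMvp) {
    // window coordinates -> normalized device coordinates in [-1, 1]
    float[] ndc = {
        2f * winX / viewportW - 1f,
        1f - 2f * winY / viewportH,  // flip y: touch origin is top-left
        2f * winZ - 1f,
        1f
    };
    // multiply by the inverse MVP (column-major 4x4 times vec4)
    float[] world = new float[4];
    for (int row = 0; row < 4; row++) {
        world[row] = invMvp[row] * ndc[0] + invMvp[4 + row] * ndc[1]
                   + invMvp[8 + row] * ndc[2] + invMvp[12 + row] * ndc[3];
    }
    // perspective divide
    return new float[] { world[0] / world[3], world[1] / world[3], world[2] / world[3] };
}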
Once you get the inverse model-view and cast your ray (vector), you still need to know if the ray intersects your geometry. One approach would be to grab the depth (z in the view coordinate system) of the object's center and extend (stretch) your vector just that far. Then see if the vector's "head" ends within the volume of your object or not (you need the object's center and, e.g., its radius if it's a sphere).
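A minimal sketch of that quick check, assuming everything is expressed in eye space and the object is approximated by a bounding sphere:

// Stretch the pick ray until its tip reaches the object's depth, then test whether the
// tip falls inside the object's bounding sphere. origin, dir and center are eye-space {x, y, z}.
static boolean roughHit(float[] origin, float[] dir, float[] center, float radius) {
    // scale factor that brings the ray tip to the object's depth (z of its center)
    float t = (center[2] - origin[2]) / dir[2];
    float px = origin[0] + dir[0] * t;
    float py = origin[1] + dir[1] * t;
    float pz = origin[2] + dir[2] * t;
    float dx = px - center[0], dy = py - center[1], dz = pz - center[2];
    return dx * dx + dy * dy + dz * dz <= radius * radius;
}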