How do I add lat/long coordinates to an Autodesk DWG?

For our house build, I have an Autodesk .dwg file and am trying to add coordinates (lat/long) to it so that I can then convert it to GeoJSON. Does anyone know how I can add coordinates so it georeferences properly?

Related

In Unity, how can I get the coordinates of the position of this object?

In Unity, when 3D objects are placed in reality, how can I get the coordinates of the position of such an object?
This is a very unclear question; please be more precise.
If you want to move the object, then these videos might help: https://www.youtube.com/watch?v=tNtOcDryKv4
https://www.youtube.com/watch?v=9ZEu_I-ido4
If you want to see the coordinates of your object, look at its Transform component in the Inspector.
If you want to set the coordinates of an object in a script, then look here: https://answers.unity.com/questions/935069/how-do-i-set-an-objects-coordinates-in-scripts.html
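For the scripting case, here is a minimal sketch (assuming the script is attached to the object in question) of reading and setting coordinates through the Transform:

using UnityEngine;

public class CoordinateExample : MonoBehaviour
{
    void Start()
    {
        // Read the object's current world-space coordinates.
        Vector3 current = transform.position;
        Debug.Log($"Object is at {current}");

        // Set the object's world-space coordinates.
        transform.position = new Vector3(1f, 2f, 3f);
    }
}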

Unity Mapbox SDK - Cut off water areas from terrain mesh

I'm currently experimenting with the Mapbox Unity plugin and I need to remove the water tiles from the generated terrain to use a custom water solution.
I've read in the API documentation that there is the class "AbstractTileFactory" that allows developers to "create a custom factory to fetch raw data and process it in a custom way, like creating terrain and then cut off water areas from terrain mesh."
I don't understand how to use it and I found no usage example; could someone point me in the right direction?
I tried to extend the "AbstractTileFactory" class but don't know exactly where to start.

How to add 3D elements into the Hololens 2 field of view

I'm trying to build a Remote Assistance solution using the HoloLens 2 for university, and I have already set up the MRTK WebRTC example with Unity. Now I want to give the desktop counterpart the ability to add annotations in the field of view of the HoloLens to support the remote guidance, but I have no idea how to achieve that. I was considering Azure Spatial Anchors, but I haven't found a good example of adding 3D elements to the remote field of view of the HoloLens from a 2D desktop environment. I'm also not sure whether Spatial Anchors is the right framework, as it is mostly for persistent markers in the AR environment, and I'm rather looking for a temporary visual indicator.
Has anyone already worked on such a solution and can give me a few frameworks/hints on where to start?
To find the actual world location of a point from a 2D image, you can refer to this answer: https://stackoverflow.com/a/63225342/11502506
In short, the cameraToWorldMatrix and projectionMatrix transforms define, for each pixel, a ray in 3D space representing the path taken by the photons that produced the pixel. But anything along a given ray shows up on the same pixel, so to find the actual world location of a point you'll need to use the Physics.Raycast method to calculate the impact point in world space where the ray hits the spatial mapping mesh.
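A minimal Unity C# sketch of that idea, assuming cameraToWorldMatrix, projectionMatrix, and the image dimensions come from the camera frame; the helper name, axis convention, and ray length are illustrative assumptions, not part of the linked answer:

using UnityEngine;

public static class PixelToWorld
{
    public static bool TryGetWorldPoint(
        Vector2 pixel,                 // pixel coordinates in the camera image
        int imageWidth, int imageHeight,
        Matrix4x4 cameraToWorldMatrix, // from the camera frame
        Matrix4x4 projectionMatrix,    // from the camera frame
        out Vector3 worldPoint)
    {
        // Convert the pixel to normalized device coordinates (-1..1 on both axes).
        Vector2 ndc = new Vector2(
            (pixel.x / imageWidth) * 2f - 1f,
            1f - (pixel.y / imageHeight) * 2f);

        // Unproject to get a direction in camera space for this pixel.
        // Depending on the matrix convention, the z sign may need flipping.
        Vector3 dirCamera = projectionMatrix.inverse.MultiplyPoint(
            new Vector3(ndc.x, ndc.y, 1f));

        // Transform the ray origin and direction into world space.
        Vector3 origin = cameraToWorldMatrix.MultiplyPoint(Vector3.zero);
        Vector3 dirWorld = cameraToWorldMatrix.MultiplyVector(dirCamera).normalized;

        // Anything along this ray maps to the same pixel, so intersect it
        // with the spatial mapping mesh to pick the actual surface point.
        if (Physics.Raycast(origin, dirWorld, out RaycastHit hit, 20f))
        {
            worldPoint = hit.point;
            return true;
        }

        worldPoint = Vector3.zero;
        return false;
    }
}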

Getting 3D points coordinates of a person as person is walking in Unity

For a research project, I need to find the coordinates of 3D points on the surface of a person's body as the person is walking straight. I know that Unity renders an object using a mesh based on 3D point coordinates.
I know very little about Unity. I wonder if I could use Unity to create a person character, make him walk, get the 3D points of that person every 50 ms or 1 s, etc., and save them to a file, so that I could read the point coordinates later using either C# or Python and run my simulation. How easy is that? Is there any sample code, example, or ready-made character which I could use in a relatively short time?
Any suggestion for a tool or software with which I could achieve that would also be great.
Thanks
The easiest thing to do, in my opinion, would be to use either Kinect or photogrammetry to create your model as a point cloud, which will have vertices on the surface only. This is one of the reasons I am suggesting a point cloud: this way you do not have to work out which vertices of a mesh lie on the surface.
Then import it into Unity using Point Cloud Viewer.
Finally, in Unity you can easily log all the global positions of the model over time using transform.TransformPoint(meshVert).
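A minimal sketch of such a logger, assuming the component sits on the object holding the mesh; the file path, sampling interval, and CSV format are illustrative assumptions:

using System.IO;
using UnityEngine;

public class VertexLogger : MonoBehaviour
{
    public float intervalSeconds = 0.05f;      // e.g. every 50 ms
    public string outputPath = "vertices.csv"; // assumed output location

    private Mesh mesh;
    private StreamWriter writer;

    void Start()
    {
        mesh = GetComponent<MeshFilter>().mesh;
        writer = new StreamWriter(outputPath);
        writer.WriteLine("time,index,x,y,z");
        InvokeRepeating(nameof(LogVertices), 0f, intervalSeconds);
    }

    void LogVertices()
    {
        Vector3[] vertices = mesh.vertices;
        for (int i = 0; i < vertices.Length; i++)
        {
            // Convert each local-space vertex to world space.
            Vector3 world = transform.TransformPoint(vertices[i]);
            writer.WriteLine($"{Time.time},{i},{world.x},{world.y},{world.z}");
        }
        writer.Flush();
    }

    void OnDestroy()
    {
        writer?.Close();
    }
}

For an animated (skinned) character, you would first bake the deformed mesh at each sample with SkinnedMeshRenderer.BakeMesh and read the vertices from the baked copy instead of the static MeshFilter mesh.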

How to set dynamic hotspot for 360 image with unity 3D

I am trying to build a visitor tour with Unity 3D. I have panoramic pictures of bedrooms within a hotel and I would like to add points (hotspots) to my pictures that lead to another picture.
The problem is that I want to add these points dynamically via a backend, and I can't find a way to achieve that in Unity.
I will try to answer this question.
Unity has an XYZ coordinate system that can be translated to the real world. I would measure the real distances to these points (from the center where you took your picture) in your location/room and send these coordinates via the backend to the Unity3D client.
In Unity you can create Vector3 positions or directions based on the coordinates you sent before. Use these positions/directions to instantiate 'hotspot' prefabs at the right positions and in the right directions. It might be necessary to adjust the scale/units to get the right result.
Once you have your 'hotspot' objects in place, add a script to them that will load a new scene (on click) with another location/image, and repeat the process.
This is a very brief suggestion on how to do it. The code would be quite simple.
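A rough sketch of what that could look like, assuming a JSON payload from the backend and a hotspot prefab with a collider; the field names, prefab, and scene names are placeholders, not part of the answer above:

using UnityEngine;
using UnityEngine.SceneManagement;

[System.Serializable]
public class HotspotData
{
    public float x, y, z;      // measured position relative to the camera center
    public string targetScene; // scene (panorama) this hotspot leads to
}

public class HotspotSpawner : MonoBehaviour
{
    public GameObject hotspotPrefab; // assumed prefab with a collider attached

    // Call this with the JSON payload fetched from your backend,
    // e.g. {"x":1.5,"y":0.0,"z":3.2,"targetScene":"Bedroom2"}.
    public void Spawn(string json)
    {
        HotspotData data = JsonUtility.FromJson<HotspotData>(json);
        Vector3 position = new Vector3(data.x, data.y, data.z);

        GameObject hotspot = Instantiate(hotspotPrefab, position, Quaternion.identity);
        hotspot.AddComponent<HotspotClick>().targetScene = data.targetScene;
    }
}

public class HotspotClick : MonoBehaviour
{
    public string targetScene;

    // Requires a collider on the prefab; called when the hotspot is clicked.
    void OnMouseDown()
    {
        SceneManager.LoadScene(targetScene);
    }
}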