I am new to AnyLogic and have been working through the learning resource AnyLogic in 3 Days.
I am currently on the Discrete Events section, which asks me to use 3D models. The figures in the PDF show the models both in the Agent's tab when zoomed to 500% and in the 2D simulation. In my Agent's tab, zooming to 500% displays nothing, although clicking on the 0, 0 coordinate does show a bounding box.
In the simulation window, the simulation runs like a ghost with no models showing up either, but in the 3D window the models appear and animate without a problem.
In the model properties, the Advanced section shows that "2D and 3D" is already selected.
Can anyone tell me whether there is any configuration that needs to be done in AnyLogic?
Currently, I am using AnyLogic 8 Personal Learning Edition 8.6.0, running on Windows 7 Service Pack 1.
My notebook has an Intel Core i7 @ 2.8 GHz, 16 GB of RAM, and an Nvidia Quadro M1000 (2 GB VRAM).
Any help would be appreciated. Thank you.
Some of the screenshots:
3D image not showing in the editor
Orientation in Properties also shows nothing
In the simulation window, the 3D objects are not showing in 2D, but they do show in the 3D window
Related
I am trying to superimpose a 3D model onto a 3D physical object, and I am currently using Vuforia and the Model Target Generator. This uses computer vision to first recognize my physical model and then superimpose my 3D models.
I want to be able to superimpose the hologram even if you can't see the full physical model. I'm thinking I could glue some sort of positional tracker to my physical model, so the HoloLens would know its position at all times, even when it's not visible. Additionally, my physical marker is not in a specific environment. Does anyone know a possible device that could do this?
I built an APK using the HelloAR scene (which is provided with the ARCore package). The app only detects horizontal surfaces, like a table, and creates its own semi-transparent plane over them. When I moved my phone around a bottle, the app again only created a horizontal plane cutting through the bottle. I expected ARCore to create planes along the bottle as I moved my phone around it, like polygons in a mesh.
In another scenario, I placed two books of different thickness on the floor. The HelloAR app created only one semi-transparent horizontal surface over the thicker book, instead of two surfaces (one for each book).
What is going wrong here? How can I fix it and make the HelloAR app work more precisely? Please help.
Software: Unity v2018.2, ARCore v1.11.0
ARCore generates an approximate point cloud as you move the device slowly; it identifies feature points by contrast in the different shapes. If you run your application in test mode in Unity, you can see how the points are placed in your otherwise empty scene.
Once the program has enough points at the "same height" (I don't know the exact precision), it generates the plane you can see, but it won't create separate planes for surfaces whose heights differ by only around 5 cm, or even somewhat more.
If you want to know the app's approximate accuracy, test it in Unity and write a script that captures the points used to generate the planes, then check the difference in their Y values to see what the tolerance distance is.
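A rough sketch of such a script (assuming the GoogleARCore Unity SDK that ships with the HelloAR sample; the class name PlaneHeightLogger is made up, and it logs each detected plane's center height rather than the raw feature points, which is a simpler way to compare surface heights):

    // Sketch only: logs the Y (height) of every detected plane so you can
    // compare how far apart two surfaces must be before ARCore splits them
    // into separate planes. Assumes the GoogleARCore Unity SDK (1.x).
    using System.Collections.Generic;
    using GoogleARCore;
    using UnityEngine;

    public class PlaneHeightLogger : MonoBehaviour
    {
        private readonly List<DetectedPlane> _planes = new List<DetectedPlane>();

        void Update()
        {
            if (Session.Status != SessionStatus.Tracking)
                return;

            // Query all planes detected so far in this session.
            Session.GetTrackables<DetectedPlane>(_planes, TrackableQueryFilter.All);

            foreach (var plane in _planes)
            {
                // CenterPose.position.y is the plane's height in world space.
                Debug.Log("Plane height (Y): " + plane.CenterPose.position.y);
            }
        }
    }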
Vuforia is currently one of the leading SDKs for augmented reality, providing a wide array of detection options (Images, Ground, Point, 3D objects, ...).
So regarding your question about detecting a bottle, I would definitely use the 3D model detection feature. You can read the official docs here.
You first need to create an approximation of the object in 3D modeling software and then use their program (the Model Target Generator) to produce the detection model. Then you put this in Unity and set up the detection (no coding needed).
I have some experience with this kind of detection: I used it to detect a large 2 m x 2 m scale model of an electric vehicle. It works great; you can walk around it and it keeps tracking it throughout. You can see a short official demo here.
Hope this short explanation helps!
I am getting the surface shown in the picture attached below. I want the triangles to have the same color and texture; could you please guide me on this?
One more thing: this happens because the triangles are not oriented consistently, which only matters if the output is supposed to be a true oriented surface without artifacts. Can anyone suggest a way to visualize the triangles in WPF without taking their orientation into account? I have used the Helix Toolkit in WPF, but it gives the same result. This is the 3D model:
You can either try to fix the orientation of the triangles or define the BackMaterial of your GeometryModel3D to be the same as the Material.
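A minimal sketch of the BackMaterial option (the mesh argument and the grey brush are placeholders; the point is only that BackMaterial is set to the same material as Material):

    using System.Windows.Media;
    using System.Windows.Media.Media3D;

    static class MeshHelpers
    {
        // Render back faces with the same material, so triangles whose winding
        // order is flipped no longer show up as dark or missing patches.
        public static GeometryModel3D BuildTwoSidedModel(MeshGeometry3D mesh)
        {
            var material = new DiffuseMaterial(Brushes.LightGray);
            return new GeometryModel3D
            {
                Geometry = mesh,
                Material = material,
                BackMaterial = material  // same material on both sides
            };
        }
    }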
I'm developing a 3D game using Unity3D 4.5.2 (free version, not Pro).
I have used a default Particle System to make a Waterfall. I have placed the Waterfall particle system in the scene in such a way that there is a 2D Sprite of a Mountain behind it.
This is to give an impression to the user that the Waterfall is 'falling' out from the Mountain.
However, after about 10 seconds of simulating this waterfall particle system, the particles suddenly become transparent, allowing the user to see the mountain behind it. I have provided screenshots below:
I would really appreciate it if anyone could help me out here, as I've looked at a lot of solutions and fiddled with all the particle system parameters in the Inspector, but to no avail.
To give you the correct answer to this question, I would need to see your particle system parameters. But I think something is wrong with the "Max Particles" amount or "Color over Lifetime". Also check "Start Lifetime"; maybe it is too high.
I have models created with 3D Studio Max.
When I save them as .FBX and import them into Unity 3D, they become grey. What is the reason for this and how can I solve it?
Also, I have huge trouble with the model's performance on mobile. What's the best approach to make the app faster?
When I save them as .FBX and import them into Unity 3D, they become grey. What is the reason for this and how can I solve it?
I ran into the same issue when importing the following 3D data visualization model.
The model was created using Cinema4D and exported as .fbx.
When I imported it into Unity using drag and drop, I got the following result:
As we can see in the Inspector (image above), the material is grey and can't be edited, and the textures were not imported. The materials of imported default assets cannot be edited directly, but there's a way to work around it.
I created a new material (right-click in Assets > Create > Material),
gave it a new name (green), and changed its Albedo to green.
Then I dragged the material onto the cube.
The end result now looks as expected.
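If you'd rather assign the material from a script than drag it in the editor, here is a small sketch; it assumes the Material asset is named "green" and has been placed in a Resources folder, which is not part of the workflow above:

    using UnityEngine;

    public class ApplyImportedMaterial : MonoBehaviour
    {
        void Start()
        {
            // Loads Assets/Resources/green.mat and applies it to this object's renderer.
            var green = Resources.Load<Material>("green");
            if (green != null)
            {
                GetComponent<Renderer>().material = green;
            }
        }
    }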
Also, I have huge trouble with the model's performance on mobile. What's the best approach to make the app faster?
It can be tricky to develop mobile VR apps, but the Game window has an info panel that displays useful live information (use the Stats button in the Game window to enable it).
There we can read how many SetPass calls, batches, and polygons are being drawn. Aim for around 50 SetPass calls; according to Unity experts, that's a good number.
Regarding optimizing performance, some very good developers have already explained in detail how to optimize your scenes. I suggest the following links:
Understanding VR Performance
Profiler window
1) All models grey: this means the textures were lost in the import. You can create materials and drag them onto the mesh. But Unity can import your .max files directly; you don't have to export to .FBX. If you try that, your textures and materials may be set up automatically.
2) Performance on mobile: this can have various causes. Maybe there is too much detail in your models, or, more likely, too many draw calls, or too heavy CPU usage. It's hard to say; the Unity docs have pointers on optimizing for mobile.