I am using Unreal Engine 4.26 for my project.
I need to get the "Impact Point" when the End Overlap event is called. What can I do to achieve this?
Both Blueprint and C++ are fine :)
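In case it helps, here is a minimal C++ sketch of one common workaround (class and member names like AMyActor and MyCollision are hypothetical). Unlike a blocking-hit event, the End Overlap delegate carries no FHitResult, so there is no built-in Impact Point; a reasonable approximation is to query the closest point on the other component's collision at the moment the overlap ends:

```cpp
// Hypothetical actor AMyActor with a collision component MyCollision.
// The handler must be a UFUNCTION() so AddDynamic can bind it.
void AMyActor::OnOverlapEnd(UPrimitiveComponent* OverlappedComp,
                            AActor* OtherActor,
                            UPrimitiveComponent* OtherComp,
                            int32 OtherBodyIndex)
{
    // Approximate "impact point": the point on the other body's
    // collision closest to our component when the overlap ended.
    FVector PointOnBody;
    const float Dist = OtherComp->GetClosestPointOnCollision(
        OverlappedComp->GetComponentLocation(), PointOnBody);

    if (Dist >= 0.f) // -1 means the shape does not support this query
    {
        UE_LOG(LogTemp, Log, TEXT("Approx impact point: %s"), *PointOnBody.ToString());
    }
}

// Binding, e.g. in BeginPlay():
//   MyCollision->OnComponentEndOverlap.AddDynamic(this, &AMyActor::OnOverlapEnd);
```

The same idea works in Blueprint via the "Get Closest Point on Collision" node, using the other overlapped component as the target.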
I'm trying to create a ship simulator in Unreal Engine (4.27). I tried to use the Water plugin and its content as a starting point, to shorten the time needed to get it working.
I created an empty game project, loaded Water/Maps/WaterTestMap, added an instance of Water/Blueprints/BP_BuoyancyExample to the simulation (above sea level), and started the simulation... it just sinks like anything else. I tried modifying some Buoyancy Data parameters, but it seems that no forces are applied to the body at all.
I tried again on 4.26.2, using the Water plugin contents, and there it seems to work, but if I create a floating cube BP that mimics BP_BuoyancyExample (in the same level as above), it sinks every time… the only way to make it work is to use EditorCube as the static mesh.
I cannot understand where the fault is, but it really sounds like a wonderful bug…
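For comparison, here is a minimal C++ sketch of a buoyant cube, assuming the Water plugin's UBuoyancyComponent API (the exact layout differs between versions: on some releases the pontoon array sits in BuoyancyData.Pontoons, on others directly on the component, so treat the field names below as assumptions). The symptom described above, with no force applied at all, matches a pontoon list that is empty or whose radius/position does not reach the water surface:

```cpp
// BuoyantCube.h -- hypothetical actor; requires the Water plugin to be
// enabled and "Water" added to the module's dependency list in Build.cs.
#pragma once

#include "CoreMinimal.h"
#include "GameFramework/Actor.h"
#include "Components/StaticMeshComponent.h"
#include "BuoyancyComponent.h" // from the Water plugin
#include "BuoyantCube.generated.h"

UCLASS()
class ABuoyantCube : public AActor
{
    GENERATED_BODY()

public:
    ABuoyantCube()
    {
        // Buoyancy only acts on a simulating physics body: the mesh
        // needs collision and Simulate Physics enabled.
        Mesh = CreateDefaultSubobject<UStaticMeshComponent>(TEXT("Mesh"));
        Mesh->SetSimulatePhysics(true);
        Mesh->SetCollisionEnabled(ECollisionEnabled::QueryAndPhysics);
        RootComponent = Mesh;

        Buoyancy = CreateDefaultSubobject<UBuoyancyComponent>(TEXT("Buoyancy"));

        // Without at least one pontoon the component applies no force
        // and the body sinks exactly as described above.
        FSphericalPontoon Pontoon;
        Pontoon.RelativeLocation = FVector::ZeroVector; // center of the cube
        Pontoon.Radius = 50.f;                          // match the mesh extents
        Buoyancy->BuoyancyData.Pontoons.Add(Pontoon);   // on 4.26 this may be Buoyancy->Pontoons
    }

    UPROPERTY(VisibleAnywhere) UStaticMeshComponent* Mesh;
    UPROPERTY(VisibleAnywhere) UBuoyancyComponent* Buoyancy;
};
```

Also worth checking: a custom static mesh with no simple collision cannot simulate physics at all, which would explain why only EditorCube (which ships with simple collision) floats.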
We're working on HoloLens 2 and have created our own button design, following an MRTK tutorial.
Now, sadly, we cannot activate the buttons using the gaze cursor on HoloLens 2.
We are using our own configuration profile, but the same holds when using the DefaultHoloLens2ConfigurationProfile.
There is also some odd behaviour (valid for both profiles mentioned above): when the app starts, the gaze cursor is visible; the moment my hands are recognized, the gaze cursor disappears (all good so far), but when I move my hands behind my back, the gaze cursor no longer reappears.
Does anybody have a similar problem, know how to solve it, or has observed something similar?
We are using:
Unity 2020.3.6f1
MRTK 2.7.0
All XR Packages up to date, except XR Plugin Management 4.0.1
Here are some screenshots showing which components our buttons have attached:
Cheers and thanks for the help
The reason is that MRTK is currently designed so that, at a distance, hand rays act as the prioritized focus pointers, so eye gaze is suppressed as a cursor input while hand rays are in use.
If you want to use both eye focus and hand rays at the same time, please follow this documentation: Use hand rays and eye-gaze input together. However, with this approach, voice commands will be the only way to interact with the hologram that is being focused on.
Besides, if you want to support a 'look and pinch' interaction, you need to disable the hand ray as described in this document: How to support look + hand motions (eye gaze & hand gestures)
I filed the following GitHub issue, and it's being investigated: "Select" voice command does not fire the appropriate events when using OpenXR on HoloLens 2
I'm making a sidescroller using a Pawn (a cube) with simulated physics, and I would like to make a magnetic grappling hook for movement.
Would you know a smart way to implement this?
Many thanks!
If I understand you correctly, you want to animate the cube as if it's attached to a magnetic grappling hook.
Luckily, it's quite simple, because there is something called the "Cable Component" that does exactly what you're looking for.
You just need to enable the plugin inside the engine and follow the documentation steps (Unreal docs: Cable Component). Scroll down to the section "Attaching Objects to the Cable ends" to get a solid start; then you can check how to use it inside Blueprint to drive the gameplay parameters you want.
Hope that works for you.
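Building on that, here is a minimal C++ sketch of a grappling pawn (all names here, such as AGrapplePawn and PullStrength, are hypothetical), assuming the CableComponent plugin is enabled and "CableComponent" is added to the module's dependencies. One caveat worth knowing: the Cable Component is purely visual, so the actual "magnetic" pull is sketched with a physics force toward the hook point:

```cpp
// GrapplePawn.h -- hypothetical pawn combining a cable visual with a pull force.
#pragma once

#include "CoreMinimal.h"
#include "GameFramework/Pawn.h"
#include "Components/StaticMeshComponent.h"
#include "CableComponent.h" // from the CableComponent plugin
#include "GrapplePawn.generated.h"

UCLASS()
class AGrapplePawn : public APawn
{
    GENERATED_BODY()

public:
    AGrapplePawn()
    {
        PrimaryActorTick.bCanEverTick = true;

        // The simulating cube body.
        CubeMesh = CreateDefaultSubobject<UStaticMeshComponent>(TEXT("Cube"));
        CubeMesh->SetSimulatePhysics(true);
        RootComponent = CubeMesh;

        // The rope visual; hidden until a grapple is attached.
        Cable = CreateDefaultSubobject<UCableComponent>(TEXT("Cable"));
        Cable->SetupAttachment(CubeMesh);
        Cable->SetVisibility(false);
    }

    // Call when the hook lands somewhere (e.g. from a line trace).
    void AttachGrapple(const FVector& HookPoint)
    {
        GrapplePoint = HookPoint;
        bGrappling = true;
        Cable->SetVisibility(true);
    }

    virtual void Tick(float DeltaSeconds) override
    {
        Super::Tick(DeltaSeconds);

        if (bGrappling)
        {
            // Keep the rope's free end pinned to the world-space hook
            // (EndLocation is expressed relative to the cable component).
            Cable->EndLocation =
                Cable->GetComponentTransform().InverseTransformPosition(GrapplePoint);

            // The "magnetic" pull: accelerate the body toward the hook.
            const FVector ToHook =
                (GrapplePoint - CubeMesh->GetComponentLocation()).GetSafeNormal();
            CubeMesh->AddForce(ToHook * PullStrength, NAME_None, /*bAccelChange=*/true);
        }
    }

private:
    UPROPERTY(VisibleAnywhere) UStaticMeshComponent* CubeMesh;
    UPROPERTY(VisibleAnywhere) UCableComponent* Cable;

    FVector GrapplePoint = FVector::ZeroVector;
    bool bGrappling = false;
    float PullStrength = 2000.f; // tuning value (cm/s^2, since bAccelChange is true)
};
```

The same structure maps directly to Blueprint: a Cable Component for the visual, plus an Add Force node on the simulating mesh each tick.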
I'm really new to Unreal Engine. I'd like to learn the quickest way to build a robot manipulator arm without complex inverse kinematics: just set up the joints, arm segments, and gripper, and control them directly.
This would be nice: https://www.youtube.com/watch?v=9DqRkLQ5Sv8. This would be even nicer: https://www.youtube.com/watch?v=UWsuBdhWqL0&t=24s.
I looked into the rigging and animation tool pack, but that's just for humanoids (is it?).
I'd appreciate any pointers.
Thanks
I recommend that you look into the FABRIK node or the newer Two Bone IK node. You will be able to choose the effector bone, take its position, and move it as you like. Hope this helps.
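For intuition about what that node does, here is a small self-contained sketch of the FABRIK algorithm in plain C++ (no engine types; in a real project you would simply place the FABRIK node in the Anim Graph and drive its Effector Target). FABRIK keeps segment lengths fixed while alternately pinning the tip to the target and the root back to the base:

```cpp
#include <cmath>
#include <vector>

struct Vec3 {
    float x, y, z;
    Vec3 operator+(Vec3 o) const { return {x + o.x, y + o.y, z + o.z}; }
    Vec3 operator-(Vec3 o) const { return {x - o.x, y - o.y, z - o.z}; }
    Vec3 operator*(float s) const { return {x * s, y * s, z * s}; }
    float Length() const { return std::sqrt(x * x + y * y + z * z); }
};

// joints: chain positions from root to tip; lengths[i] is the fixed
// distance between joints[i] and joints[i+1].
void FabrikSolve(std::vector<Vec3>& joints, const std::vector<float>& lengths,
                 Vec3 target, int iterations = 10, float tolerance = 0.01f)
{
    const Vec3 base = joints.front();
    for (int it = 0; it < iterations; ++it) {
        if ((joints.back() - target).Length() < tolerance) break;

        // Backward pass: pin the end effector to the target and pull
        // each parent to its fixed distance from the child.
        joints.back() = target;
        for (int i = (int)joints.size() - 2; i >= 0; --i) {
            Vec3 dir = joints[i] - joints[i + 1];
            joints[i] = joints[i + 1] + dir * (lengths[i] / dir.Length());
        }

        // Forward pass: pin the root back to the base and pull each
        // child to its fixed distance from the parent.
        joints.front() = base;
        for (size_t i = 1; i < joints.size(); ++i) {
            Vec3 dir = joints[i] - joints[i - 1];
            joints[i] = joints[i - 1] + dir * (lengths[i - 1] / dir.Length());
        }
    }
}
```

This is also why FABRIK suits an arbitrary robot arm better than Two Bone IK: it handles chains of any length, not just two segments.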
I'd like to implement an indoor navigation application using Unity3d on Project Tango.
Could anyone share their train of thought about this?
My rough idea is below:
1. Get the whole building mesh with the Tango Constructor.
2. Import it into Unity3d as an .obj.
3. Bake the whole mesh as a NavMesh.
4. Name and mark all addresses or positions of interest with the ADF, and save them together with the NavMesh.
5. Program the UI to receive the start/end address and generate the navigation path dynamically.
6. Use AR markers and add them along the navigation path on the floor plane.
Please correct my thoughts and share your experience; I am a newbie on Unity3d/Tango.
I am doing what you are doing, but without the NavMesh and ADF. What you might want to consider is reducing the poly count of the 3D .obj using a program like MeshLab. I don't know if they fixed that in the Mira release, but previously a 'small' room would yield something like 1.2 million triangles, which I can only assume would slow down your NavMesh generation quite a bit.
With a NavMesh, generating navigation should be very easy, so I think 1, 2, 3, 5 and 6 are no problem at all.
However, for nr. 4, I have no idea whether it works; that you must explore on your own. Naming/marking an address using the ADF? Are you thinking it will recognize itself in the environment and then provide the address? How will it be saved with the NavMesh? I am sure you will be able to make it work.
Good luck.