I have an application for the HoloLens 2 where I need to show other participants' hands.
I have the networking part working, but I now have a set of 25 HandJoint positions per hand. I would like to visualize the hand as a mesh in the same style as the one MRTK provides for the user's own hand.
I could use any hints about where to look next.
Visualized joints of the MRTK Hand
MRTK Hand Mesh (incl. Joints)
If you are looking for the visual assets/prefabs that MRTK provides, you can find them under [Unity MixedReality Toolkit Object/Component] -> [Input] -> [Articulated Hand Tracking] in MRTK 2.8+.
You may also find those resources at https://github.com/microsoft/MixedRealityToolkit-Unity/tree/main/Assets/MRTK/SDK/StandardAssets/Prefabs
Related
I saw the scene OutlineExamples in the MRTK examples package and recreated it in my own project.
The outlining works while I stay in Play Mode in Unity, but if I deploy it to the HoloLens, the object does not get an outline effect.
The OutlineExamples scene from the MRTKHub project works as expected on the HoloLens!
So I guess I missed something in my own project, but I can't find it. I compared the setup multiple times but can't find a difference. I also used the simplest object (the cube) from the example scene.
Setup for the cube
(the screenshot shows on the left side my project and on the right side the mrtkhub-project):
Mesh Filter (standard)
Mesh Renderer (standard)
Box Collider (standard)
MeshOutline with the Material "OutlineOrange" or "OutlineGreen" (added)
Object Manipulator (added)
Constraint Manager (added)
The only thing I had to set up after adding the components marked as "added" was the material for the MeshOutline component.
Is there something else one has to set up to see the outline shader on the HoloLens?
My Setup:
Unity 2020.3.30
MRTK 2.7.3
Visual Studio 2019
What else did I check?
The XR Plug-in Management is set up the same way
--EDIT
I noticed something strange, and I guess this will help someone who knows more about shaders!
I launched my application on the HoloLens, grabbed the cube, and placed it in front of a window in my room. While the cube was in front of the window, I saw the outline! But as soon as I moved it outside the window area, the outline disappeared! Another aspect is that I'm using the spatial mapping from MRTK. That means the window does not get meshed, only the walls. And I guess the walls have their own shader on them, right?
So the spatial mesh shader and the outline shader "don't like each other". Is this possible?
The user derHugo gave me a hint that led to the solution! I went to the material I use on the cube and changed the Render Queue Override property under Advanced Options to a value higher than that of the MRTK_Occlusion material, which is used for the spatial mapping.
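For context, Unity's render queue is essentially an integer sort key: materials with higher values are drawn later in the frame. The toy Python sketch below (the queue numbers are hypothetical, not the actual MRTK values, and this is a loose model rather than Unity's real pipeline) illustrates why raising the override makes the outline draw after the spatial-mapping occlusion material instead of before it:

```python
# Toy model of render-queue ordering: draws are sorted by ascending queue
# value, so whichever material has the higher queue is rendered last and
# can affect (or overwrite) what was drawn before it.

def draw_order(draws):
    """Return material names in the order they would be drawn."""
    return [name for name, queue in sorted(draws, key=lambda d: d[1])]

draws = [
    ("OutlineOrange", 2000),   # hypothetical default queue of the outline material
    ("MRTK_Occlusion", 2005),  # hypothetical queue of the occlusion material
]
print(draw_order(draws))       # occlusion draws after the outline here

# The fix: override the outline material's queue to exceed the occlusion queue.
draws[0] = ("OutlineOrange", 2010)
print(draw_order(draws))       # now the outline draws after the occlusion pass
```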
I'm trying to build a Remote Assistance solution using the HoloLens 2 for university; I already set up the MRTK WebRTC example with Unity. Now I want to add the ability for the desktop counterpart to add annotations in the field of view of the HoloLens to support the remote guidance, but I have no idea how to achieve that. I was considering Azure Spatial Anchors, but I haven't found a good example of adding 3D elements to the remote field of view of the HoloLens from a 2D desktop environment. Also, I'm not sure Spatial Anchors is the right framework, as it is mostly for persistent markers in the AR environment, and I'm rather looking for a temporary visual indicator.
Did anyone already work on such a solution and can give me a few frameworks/hints where to start?
To find the actual world location of a point from a 2D image, you can refer to this answer: https://stackoverflow.com/a/63225342/11502506
In short, the cameraToWorldMatrix and projectionMatrix transforms define, for each pixel, a ray in 3D space representing the path taken by the photons that produced the pixel. But anything along a given ray will show up on the same pixel. So to find the actual world location of a point, you'll need to use the Physics.Raycast method to calculate the impact point in world space where the ray hits the SpatialMapping.
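The depth ambiguity described above can be sketched numerically. Here is a plain-Python pinhole-camera toy (the intrinsics are hypothetical, not the HoloLens calibration, and it stands in for the projectionMatrix math): two points at different depths along one ray project to the same pixel, so only a raycast against known geometry such as the spatial mesh can pin down the actual world point.

```python
# Pinhole-camera sketch: every point along a pixel's ray projects to the
# same pixel, so a single pixel cannot recover depth on its own.

def project(point, fx=500.0, fy=500.0, cx=320.0, cy=240.0):
    """Project a 3D camera-space point (z > 0) to pixel coordinates."""
    x, y, z = point
    return (fx * x / z + cx, fy * y / z + cy)

def ray_point(direction, t):
    """Point at parameter t along a ray from the camera origin."""
    dx, dy, dz = direction
    return (dx * t, dy * t, dz * t)

ray_dir = (0.2, -0.1, 1.0)            # ray through some pixel
near = project(ray_point(ray_dir, 1.0))
far = project(ray_point(ray_dir, 5.0))
assert near == far                     # same pixel at both depths

# A "raycast" against a wall at z = 3 (the spatial mesh in a real app)
# resolves the ambiguity and yields the world-space hit point:
t_hit = 3.0 / ray_dir[2]
hit = ray_point(ray_dir, t_hit)
print("pixel:", near, "world hit:", hit)
```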
I'm trying to record and replay hand animations on the HoloLens 2. I managed to record the tracked Transforms of the joints and use the recordings to animate given hand rigs. Now I'm trying to also record the tracked hand mesh. I'm aware of OnHandMeshUpdated in the IMixedRealityHandMeshHandler interface. The following post also guided me in this direction (very helpful):
How to get hand mesh data from Hololens2 without turning on Hand Mesh Visualization option
My question is: Is there a simple way to simulate hand mesh data in the Unity Editor? At the moment I don't have access to my team's HoloLens, so I'm trying to figure out how to develop this feature directly in Unity.
AFAIK the OnHandMeshUpdated event is only called when there is actual mesh data on the HoloLens, but not in the Editor, where there are only the simulated joints of the controller and no hand mesh.
Any suggestions are welcome!
To simulate hand mesh input, you can use the RiggedHandVisualizer to drive a SkinnedMesh built from hand joint data to visualize the hands, and it works with InputSimulation in the Unity Editor. You can find an example in the RiggedHandVisualizer scene under MRTK/Examples/Experimental/RiggedHandVisualizer/Scenes; for more detail, see Rigged Hand Visualizer [Experimental].
Ok, so I have looked around the internet, but I cannot find the sprite mesh. I should be able to right-click my sprite > 2D Object > SpriteMesh.
The problem is that I don't see the "SpriteMesh" option anywhere.
Here's the deal. I created a bunch of 2D pieces for a character: head, body, two arms, two legs, two hands, and two feet. I imported the sprite as a PNG file and changed SpriteMode to Multiple. I used the Sprite Editor to slice the character into pieces automatically. There's also nothing inside the Sprite Editor that allows me to rig bones.
Now I need to rig the toon with bones and skin. However, I cannot find a way to do this. In a few tutorials I watched, the guy adds a SpriteMesh to each of the parts. However, when I try to do this, the option just doesn't exist. I see SpriteMask but no SpriteMesh.
I'm using Unity 2018.2.18f1.
I have zero experience with animations like this. Normally I create a player/enemy without legs/arms, so they just float, and I use the Animation tab to change size/shape to suggest movement. However, I'd like to take the next step and make the game look better.
How can I rig my toon? What steps do I need to follow?
All help is appreciated!
I guess you want to use the new 2D features from Unity to rig your 2D character.
I'm using Unity 2018.2.18f1.
You need to use Unity 2018.3 or later to use these tools.
I suggest using Unity Hub to manage multiple versions, including beta versions.
There is also a really nice video from Brackeys on this subject.
Once you have 2018.3 or a later version installed, open your project, go to Window > Package Manager, and install these packages:
I don't think you need the 2D Pixel Perfect but it's always nice to have.
I'm new to 3D game development, so I have kind of a dumb question: how do I move a mesh in a 3D engine (by "moving" I mean a walking animation)? And how do I put a skin on it?
What I have:
an open-source 3D OpenGL engine, NinevehGL (http://nineveh.gl/). It's super easy to load a mesh into. I'm pretty sure it will be an awesome engine when it is released!
a mesh model of a human: http://www.2shared.com/file/RTBEvSbf/female.html (it's a mesh of a female that I downloaded from some open-source website)
found a website from which I can download skeleton animations in these formats: dae (COLLADA), XML, BVH (?) - http://www.animeeple.com/details/bcd6ac4b-ebc9-465e-9233-ed0220387fb9
what I'm stuck on (see attached image)
So, how can I join all these things together and make a simple game in which the skinned human walks forward and backward?
This is difficult to answer because it would require knowledge of the engine's API. Also, you can't just stick a skeletal animation onto some mesh. You need a connection between the two, established in a process called rigging, in which you add "bones" (also called armatures) to the mesh. This is an artistic process done in a 3D modeller. Then you need to implement a skeletal animation system, which is far too complex a task to cover in a single Stack Overflow answer (it involves animation curve evaluation, quaternion interpolation, skinning matrices, etc.).
You should break down your question into smaller pieces.
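To give a concrete feel for the skinning step mentioned above, here is a minimal linear-blend-skinning sketch in Python (a deliberately tiny 2D, two-bone toy with made-up weights, not NinevehGL code): each skinned vertex is the weight-blended result of transforming the vertex by every bone that influences it.

```python
import math

# Minimal linear blend skinning in 2D: v' = sum(w_i * M_i(v)), where each
# M_i is a bone transform and the weights w_i sum to 1 per vertex.

def rotate(point, angle, pivot):
    """Rotate a 2D point around a pivot by the given angle (radians)."""
    px, py = pivot
    x, y = point[0] - px, point[1] - py
    c, s = math.cos(angle), math.sin(angle)
    return (px + c * x - s * y, py + s * x + c * y)

def skin(vertex, bones, weights):
    """Blend the vertex transformed by each bone, weighted per bone."""
    out_x = out_y = 0.0
    for (angle, pivot), w in zip(bones, weights):
        tx, ty = rotate(vertex, angle, pivot)
        out_x += w * tx
        out_y += w * ty
    return (out_x, out_y)

# Upper-arm bone stays fixed; forearm bone bends 90 degrees at the elbow (x = 1).
bones = [(0.0, (0.0, 0.0)), (math.pi / 2, (1.0, 0.0))]

print(skin((0.5, 0.0), bones, [1.0, 0.0]))  # mid upper arm: unchanged
print(skin((2.0, 0.0), bones, [0.0, 1.0]))  # forearm tip: swings upward
print(skin((1.0, 0.0), bones, [0.5, 0.5]))  # elbow: blend of both bones
```

A real system does the same thing with 4x4 matrices, bind-pose inverses, and per-frame interpolated bone poses; the blending idea is identical.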
I'm a beta tester for the Nineveh engine.
Currently, the engine does not support bones/skeleton animation. This will be part of their next release, which is coming in the next 4-8 months.
Future (Roadmap)
Version 0.9.3 : Q4 2011 - Q1 2012
Bones, Rigging and Mesh's Animations.
Mesh Morph.
You might want to check out http://nineveh.gl/docs/changelog/