I recently came across a video by "t3ssel8r" where he was discussing procedural animation. In the video, I saw this:
That graph in the inspector window would be very helpful for what I am doing right now: it updates when the variables change, which would be great for prototyping a weapon recoil system I've been making, where the strength of the recoil (the Y value) is proportional to the number of rounds fired (the X value).
Unfortunately, I cannot find out how to get that to display. While I could just keep using Desmos, I would prefer to see the graph in Unity. How do I get this graph to appear?
I found this specific demo, done with a custom editor script, in the Second Order Dynamics repository by lomakinsam on GitHub.
SecondOrderDemo.cs is the one from which you shared the screenshot.
It has a companion inspector/editor script that draws the graph: SecondOrderDemoInspector.cs.
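If you'd rather not pull in the whole repository, the technique itself is small: a custom editor samples your curve function and draws it with Handles over a rect reserved in the inspector layout. Below is a minimal sketch of that idea, not the actual code from the repo; the Recoil component with its maxRounds field and StrengthFor method is a hypothetical stand-in for your own recoil script.

```csharp
using UnityEditor;
using UnityEngine;

// Hypothetical runtime component: recoil strength as a function of rounds fired.
// (Shown together for brevity; in practice each class goes in its own file.)
public class Recoil : MonoBehaviour
{
    public float maxRounds = 30f;
    public float rampUp = 0.2f;

    public float StrengthFor(float rounds) =>
        1f - Mathf.Exp(-rampUp * rounds); // placeholder curve, range 0..1
}

// Editor script (must live in an Editor folder). The inspector repaints when
// a variable changes, so the graph follows your values live.
[CustomEditor(typeof(Recoil))]
public class RecoilInspector : Editor
{
    public override void OnInspectorGUI()
    {
        DrawDefaultInspector();

        var recoil = (Recoil)target;
        Rect rect = GUILayoutUtility.GetRect(10, 1000, 100, 100);
        EditorGUI.DrawRect(rect, new Color(0.15f, 0.15f, 0.15f));

        // Sample the curve across the rect and connect the points.
        const int samples = 64;
        var points = new Vector3[samples];
        for (int i = 0; i < samples; i++)
        {
            float t = i / (samples - 1f);
            float y = Mathf.Clamp01(recoil.StrengthFor(t * recoil.maxRounds));
            points[i] = new Vector3(rect.x + t * rect.width,
                                    rect.yMax - y * rect.height);
        }
        Handles.color = Color.green;
        Handles.DrawAAPolyLine(2f, points);
    }
}
```

GUILayoutUtility.GetRect reserves space in the inspector layout, and Handles.DrawAAPolyLine connects the sampled points in that GUI space.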
I am new to Unity and trying to get basic hands working, in terms of being able to see the hands and having them move in accordance with my own hands (preferably using controllers, which I know offer limited detection of what the hands are doing).
I configured OVRHandPrefab as shown in this article, but I do not see the hands. I have tried using it with my (physical) hands only as well, but I still don't see the hands. I tried disabling hand-tracking support, but that didn't help either.
I've tried all the options in "Hand Tracking Support" in OVRCameraRig, and am using the default values for the two OVRHandPrefab objects, except for changing one of them to match the right hand (since the left hand seems to be the default).
I also tried using the OVRCustomHandPrefab_L and ..._R, but while I do see the hands, they don't animate at all in accordance with me pressing buttons or triggers. I'm not sure if these prefabs are supposed to animate out of the box, though.
If anyone can offer any troubleshooting suggestions or steps to get basic animated hand models working, I'd appreciate it.
I'm using Unity 2020.3.18f1.
Use the OVRCustomHandPrefab_L and ..._R and click the "automap bones" button under OVR Custom Skeleton for each one.
I'm working on a project for an exhibition where an AR scene is supposed to be layered on top of a 3D printed object. Visitors will be given a device with the application pre-installed. Various objects should be seen around / on top of the exhibit, so the precision of tracking is quite important.
We're using Unity to render the scene, this is not something that can be changed as we're already well into development. However, we're somewhat flexible on the technology we use to recognize the 3D object to position the AR camera.
So far we've been using Vuforia. The 3D target feature didn't scan our object very well, so we're resorting to printing 2D markers and placing them on the table that the exhibit sits on. The tracking is precise enough, the downside is that the scene disappears whenever the marker is lost, e.g. when the user tries to get a closer look at something.
Now we've recently gotten our hands on a Lenovo Phab 2 pro and are trying to figure out if Tango can improve on this solution. If I understand correctly, the advantage of Tango is that we can use its internal sensors and motion tracking to estimate its trajectory, so even when the marker is lost it will continue to render the scene very accurately, and then do some drift correction once the marker is reacquired. Unfortunately, I can't find any tutorials on how to localize the marker in the first place.
Has anyone used Tango for 3D marker tracking before? I had a look at the Area Learning example included in the Unity plugin, by letting it scan our exhibit and table in a mostly featureless room. It does recognize the object in the correct orientation even when it is moved to a different location; however, the scene is always off by a few centimeters, which is not precise enough for our purposes. There is also a 2D marker detection API for Tango, but it looks like it only works with QR codes or AR tags (like this one), not arbitrary images like Vuforia.
Is what we're trying to achieve possible with Tango? Thanks in advance for any suggestions.
Option A) Sticking with Vuforia.
As Hristo points out, your marker-loss problem should be fixable with Extended Tracking. This definitely sounds worth testing.
Option B) Tango
Tango doesn't natively support markers other than AR tags and QR codes.
It also doesn't handle the area-learnt scene moving (much). If your 3D-printed objects stay stationary, you could scan an ADF and should get good-quality tracking; with everything still you should see a little drift, but not too much.
However, if you are moving those 3D-printed objects, that will definitely throw the tracking off, so moving objects shouldn't be part of the scanned scene.
You could make an ADF scan without the 3D objects present to track the user's position, and then track the 3D-printed objects with AR markers using Tango's AR marker detection (unsure: is that what you tried already?). If that approach doesn't work, I think your only Tango option is to add more features, lighting, etc. to the space to make the tracking more solid.
Overall, natural-feature tracking by Vuforia (or marker tracking for robustness) sounds better suited to what I think your project is doing, as users will mostly be looking at the ARTag/NFT objects. However, if its robustness is not up to scratch, Tango could provide a similar solution.
I've read a few different posts on how to display the particle system on the canvas in Unity but I don't seem to be understanding it.
I'm trying to use the Particle Ribbon asset by Moonflower in my UI but can't get it to display in the UI. I tried adding another Canvas as suggested in other posts, with Render mode set to Screen-Space Camera but no luck.
At one point I saw the particle system but it was very, very small and wouldn't change size regardless of scaling.
You can set the sorting order explicitly: ParticleSystemRenderer.sortingOrder / sortingLayerID on the particle system, and Canvas.overrideSorting / sortingOrder / sortingLayerID on the canvas. Give the particle system a higher sorting order than the canvas so it draws on top.
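As a minimal sketch (assuming a Screen Space - Camera canvas; the uiCanvas and particles fields are placeholders you'd wire up in the inspector):

```csharp
using UnityEngine;

// Forces a particle system to render in front of a Screen Space - Camera
// canvas by giving it a higher sorting order on the same sorting layer.
public class ParticleSorting : MonoBehaviour
{
    [SerializeField] private Canvas uiCanvas;          // your UI canvas
    [SerializeField] private ParticleSystem particles; // your particle system

    void Start()
    {
        // Put both on the same sorting layer, then order them explicitly.
        uiCanvas.overrideSorting = true;
        uiCanvas.sortingOrder = 0;

        var psRenderer = particles.GetComponent<ParticleSystemRenderer>();
        psRenderer.sortingLayerID = uiCanvas.sortingLayerID;
        psRenderer.sortingOrder = 1; // higher value draws on top of the canvas
    }
}
```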
I would recommend trying the UIParticleSystem script found here.
Generally speaking, this Unity UI Extension repository is full of amazing things created (and often updated) by the community: I'd advise you to bookmark it :)
Situation:
I am working on a project that allows the user to practice presentations in a VR room. This includes the use of PowerPoint/Keynote, which is displayed on a plane. Displaying images is easily possible, as is video.
Problem:
That's the problem: images don't contain movement, but a PowerPoint/Keynote file often does, and Unity does not support the PowerPoint or Keynote file formats. Exporting to HTML and programming our own parser for the JSON files to apply the animations doesn't seem worth the effort.
Current situation:
At the moment we have converted all slides to textures, without the animations.
Request:
In the past there were some plugins to display HTML on a plane (a flat surface), but these seem to be outdated. Is there anyone out there who has a solution to this problem?
Thanks in advance.
Although this answer doesn't address the specific request of displaying HTML on a quad (plane, whatever) in Unity, it is a solution that may be worth considering if it fits your scenario.
If the presentations are linear, why not record them as video? You can easily play the video on a quad in Unity using a RenderTexture and pause it at the right moments to wait for the user to trigger the next slide/animation, whereupon the video can be played again until the next stop point.
This will require little programming on your part, but isn't the most flexible solution as it requires a linear slideshow and for you to create pause-points in the video playback at the correct timings to match the points where the slideshow naturally awaits a mouseclick from the user.
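As a rough sketch of the pause-point idea, assuming a VideoPlayer that outputs to a RenderTexture on the quad (the pausePoints timestamps and the mouse-click trigger are placeholders for your own timings and input):

```csharp
using UnityEngine;
using UnityEngine.Video;

// Plays a recorded slideshow video and pauses at predefined timestamps,
// waiting for the user to trigger the next segment.
[RequireComponent(typeof(VideoPlayer))]
public class SlideshowVideo : MonoBehaviour
{
    // Seconds into the video where the slideshow would await a click.
    [SerializeField] private double[] pausePoints = { 5.0, 12.5, 20.0 };

    private VideoPlayer player;
    private int nextPause;

    void Start()
    {
        player = GetComponent<VideoPlayer>();
        player.Play();
    }

    void Update()
    {
        // Pause when the playhead reaches the next stop point.
        if (nextPause < pausePoints.Length && player.isPlaying &&
            player.time >= pausePoints[nextPause])
        {
            player.Pause();
            nextPause++;
        }

        // Any trigger (here: a mouse click) advances to the next segment.
        if (!player.isPlaying && Input.GetMouseButtonDown(0))
        {
            player.Play();
        }
    }
}
```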
I have an issue in Unity3D v5 where my joystick does not work as intended. When I plug in the joystick, moving it right from the center gives me values from -1 up to 1.
Keeping it completely centered gives me 1, and moving it left also gives me 1 (so there is no change in value when moving the stick left).
From what I've read, it has to do with Unity using RawInput and not DirectInput.
I have read a post where someone suggests a registry change to force Unity to use DirectInput, but it does nothing for me on Unity3D v5.
Can anyone please help? I am completely stuck on this, and getting the joystick to work is essential for my game :)
It's very difficult to find any information about this issue from an official Unity source, but take a look at the answer here: Joystick not working
And follow through the threads referred to there. The upshot of all this is that RawInput is all you can rely on for joystick input on Windows. The best solution to this problem I can think of is runtime calibration in your game/app. If you don't care to implement that yourself, there are third-party options that integrate with Unity, such as cInput 2.
I am running into this same problem with my Unity projects, and my plan is simply to present joystick users with the option to calibrate. You could save the calibration settings so the user doesn't have to calibrate every time the game is run. Not ideal, but again the most realistic solution to the problem I can come up with.
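To illustrate, here is a minimal sketch of what such a calibration could look like, assuming a single axis read through Input.GetAxisRaw("Horizontal"); the axis name and sampling flow are placeholders for your own setup:

```csharp
using UnityEngine;

// Runtime joystick calibration: sample the stick at rest and at both
// extremes, then remap raw readings into a clean -1..1 range.
public class JoystickCalibration : MonoBehaviour
{
    private float rawMin = -1f, rawCenter = 0f, rawMax = 1f;

    // Call these while prompting the user to hold the stick in each position.
    public void SampleCenter() { rawCenter = Input.GetAxisRaw("Horizontal"); }
    public void SampleLeft()   { rawMin    = Input.GetAxisRaw("Horizontal"); }
    public void SampleRight()  { rawMax    = Input.GetAxisRaw("Horizontal"); }

    // Remap the current raw axis value using the sampled calibration points.
    public float GetCalibratedAxis()
    {
        float raw = Input.GetAxisRaw("Horizontal");
        if (raw >= rawCenter)
            return Mathf.InverseLerp(rawCenter, rawMax, raw);  // maps to 0..1
        return -Mathf.InverseLerp(rawCenter, rawMin, raw);     // maps to -1..0
    }
}
```

Persisting the three sampled values (e.g. with PlayerPrefs) would let users skip recalibration on later runs, as suggested above.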