Unity3D vibrating and choppy scroll (with live demo + WebGL comparison)

Something must be wrong here, either with me or with my monitor or GPU. Others suffer from this phenomenon too; see the links below.
I want to emphasize that this problem exists, and being an optimist I think there is a solution. It may only be noticeable in special circumstances, and it is definitely not a performance (or GC) problem. If it were noticeable in all circumstances, Unity3D would be useless...
My reason for bringing this problem to a wider forum is my hope that the broader community can search for the root cause from more angles. I don't think it is related to the algorithm or to usage of the Time class. Maybe Time.deltaTime itself is buggy or misleading in some circumstances.
Even feedback on whether or not you can reproduce the vibrating effect would have diagnostic value. (The motion seems to oscillate horizontally, and the left and right edges are very blurry.)
Marrt created a web demo for Unity to demonstrate the effect:
http://marrt.elementfx.com/SmoothMovementTest.html
Whatever I do, this demo vibrates and stutters for me. NOTE: all of my own simple scroll projects, running in the editor or as standalone desktop builds, exhibit the same behavior, so this is not about 'how to use Time.deltaTime'.
I thought the problem was in my hardware, until today, when I saw this WebGL demo (use Chrome):
http://webglsamples.googlecode.com/hg/aquarium/aquarium.html
It is so smooth (the fish movement, the rotation, the light fades), I mean smooth. Why can't I (or anyone, it seems) produce this with Unity, at least on my machine?
If anyone is interested, there is an exact problem description here, though none of the answers solve the problem.
The problem in detail:
http://answers.unity3d.com/questions/275016/updatefixedupdate-motion-stutter-not-another-novic.html
I tried everything, including the programming variations and the quality settings (VSync etc.).
My very simple self-made projects run at 1000-2000 fps and are still vibrating and choppy.
I have spent two days trying to get rid of this poor visual experience. I did all the programming variations mentioned in the linked threads and tried to configure my video card (a Radeon 6770 with 1 GB, by the way).
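For reference, the per-frame movement code in my test projects is the standard frame-rate-independent pattern (a minimal sketch; the class and values are illustrative, not Marrt's actual demo code):

    using UnityEngine;

    public class SimpleScroller : MonoBehaviour
    {
        public float speed = 5f; // world units per second

        void Start()
        {
            // One of the quality settings I experimented with:
            // 1 = sync to every VBlank, 0 = VSync off.
            QualitySettings.vSyncCount = 1;
        }

        void Update()
        {
            // Scale movement by Time.deltaTime so it is frame-rate
            // independent, exactly as the linked threads recommend.
            transform.Translate(Vector3.right * speed * Time.deltaTime);
        }
    }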
Any thoughts?

Related

Unity's Particle System - shader glitch (mesh vertices reduction)

I have a problem on some Android devices with the particle system's rendering (weirdly enough, the problem seems to occur on devices with higher capabilities). The problem occurs when mesh-based particles (Renderer/Render Mode/Mesh) are being rendered. It seems like the meshes that are being spewed out and slowly shrunk over time are being reduced ("reverse-tessellated"), which results in a nasty visual effect. Does anyone know what might be the cause of this?
UPDATE: One thing I've noticed is that this problem gets worse over time: the longer the gameplay, the worse it gets.
UPDATE: What I've tried is making one particle system bigger (around 5x) in order to check whether that has any effect on its rasterization. Normally the particles are scaled down from 1 to 0 over their lifetime. What I've noticed, after sizing them up, is that the problem no longer occurs.
UPDATE: Visualisation of the problem:
Properly rendered:
Improperly rendered:
I was able to track the issue down. It turned out to be a problem within a toon shader I wrote a while ago. All the things I'd noticed were valid, but unfortunately it took some time for them to put me on the right track.
Taken from Unity's documentation:
"The half and fixed types only become relevant when targeting mobile GPUs, where these types primarily exist for power (and sometimes performance) constraints. Keep in mind that you need to test your shaders on mobile to see whether or not you are running into precision/numerical issues."
Unfortunately for me (and it cost me quite some time), I had used half3/4 values with POSITION semantics, which caused numerical precision issues on some Android devices; in this case the particles were getting smaller and smaller, size 0 to be exact. As a general rule, from what I've read in Unity's documentation, float3/4 should always be preferred in conjunction with POSITION semantics.
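To illustrate the change, the relevant struct declarations ended up looking roughly like this (a hedged sketch rather than my exact shader; the identifiers are illustrative):

    // Cg/HLSL vertex structs in a Unity shader.
    struct appdata
    {
        float4 vertex : POSITION; // was half4 -- the precision loss shrank particles toward size 0
        half2 uv : TEXCOORD0;     // half is still fine for low-range data such as UVs
    };

    struct v2f
    {
        float4 pos : SV_POSITION; // positions should always use full float precision
        half2 uv : TEXCOORD0;
    };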

Ocean simulation in Blender vs. Unity3D

I want to make an ocean simulation that is physically accurate.
The height and speed of the waves should be controlled by the keyboard at runtime.
In the ocean, there needs to be a boat that either moves along a path or is controlled by the keyboard.
So far I have made this simulation in Blender:
https://youtu.be/LJ6ncxv-k7w
The problems are as follows:
1. There is no collision with the ocean
2. There are no controllers for the boat's movement
3. I am able to control the waves, but not at runtime
I thought about switching to Unity because the user interface is obviously better, as it is a game engine. I do not want to use Blender's game engine as its future is uncertain at this point.
After reviewing the various Unity water simulation plugins, I came to these conclusions:
1. The buoyancy is great in most of them, such as in Aquas and SUIMONO.
2. None of them seems to offer physically realistic collision with the boat.
3. They do offer wave height control, but not much else as far as wave properties go.
4. Some of the plugins can be combined to get closer to satisfactory results.
My question is:
Should I go with Unity completely?
It seems perfect for my user control needs, but the plugins are lacking in the collision aspect. I came across this video, but no tutorial: https://www.youtube.com/watch?v=T0D_vrYm4FQ
Even if there was one, how could I combine it with the plugins?
Is there a way to build the scene in Blender and then import it into Unity?
Would I be able to control the waves and boat after importing them?
Thank you very much for your time and knowledge.
If you really mean an ocean, I suggest you check out NVIDIA WaveWorks. It is a C library and doesn't have an official integration with Unity3D, but since you have come this far, I guess you may have enough courage to try turning it into a usable plugin yourself.
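Whichever water system you end up with, a simple approximation of buoyancy is to sample the water height under a few points on the hull and push the boat's Rigidbody up in proportion to how deeply each point is submerged. A minimal sketch, where GetWaterHeight is a hypothetical stand-in for whatever height query your chosen plugin exposes:

    using UnityEngine;

    public class SimpleBuoyancy : MonoBehaviour
    {
        public Transform[] floatPoints;    // sample points spread along the hull
        public float buoyancyStrength = 10f;

        private Rigidbody body;

        void Start()
        {
            body = GetComponent<Rigidbody>();
        }

        void FixedUpdate()
        {
            foreach (Transform point in floatPoints)
            {
                float depth = GetWaterHeight(point.position) - point.position.y;
                if (depth > 0f) // only push while the point is under water
                {
                    body.AddForceAtPosition(Vector3.up * depth * buoyancyStrength,
                                            point.position);
                }
            }
        }

        // Hypothetical: replace with the wave-height query of the plugin
        // you choose (Aquas, SUIMONO, a WaveWorks wrapper, ...).
        private float GetWaterHeight(Vector3 worldPos)
        {
            return 0f; // placeholder: flat water at y = 0
        }
    }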

Unity on Xbox One - Camera/Rigidbody Visual Movement Hiccup

This is posted here since this is Xbox specific, but I am also posting this onto the Unity forums.
When testing my Unity game on Xbox One, I am getting a very large amount of visual "jitter" from the ball. This is a skeeball game where you control the movement of the ball; essentially, the core of the movement is similar to the Rollerball tutorials. On PC this works fine and there are no perceptible jitters. However, on Xbox I am seeing this a lot more. The object travels large distances with the camera following smoothly behind. None of the other objects or scenery are affected; I actually think the camera itself is moving perfectly, but the ball seems to glitch.
Changing my camera movement to LateUpdate seemed to minimize it the most on the PC, but that doesn't make sense to me, since I am still not convinced the camera is the problem.
Any help would be greatly appreciated. Perhaps a quality setting isn't playing nice with the Xbox?
Thanks!
Nick
Keep in mind the clock speed of the CPU on the Xbox is likely much slower than your PC (although there are more cores).
Unity is primarily single threaded, so that could explain the performance difference. Here are some things you can try:
* Make sure you are running the "Master" build on Xbox. The default is "Debug" which is significantly slower.
* It's possible it's something with the physics.
Once you've checked that you aren't running Debug, the next step would be to use the Unity profiler to see where your frame time is being spent, and then, depending on the cause, optimize that part.
Here is more information on the system resources:
https://learn.microsoft.com/en-us/windows/uwp/xbox-apps/system-resource-allocation
There is also a great post about the graphics debugger here:
https://tarhik.wordpress.com/2017/09/04/antimatter-instance-dev-log-entry-2-using-microsofts-graphic-debugger/
It looks like switching the Rigidbody to use "Extrapolate" instead of "Interpolate" fixed the issue I was seeing. I am not sure whether this works in every situation, but for the scale of the levels and the player physics of my game, it seemed to do the trick.
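In code, the two changes discussed above look roughly like this (a minimal sketch; the component split and the camera offset are illustrative):

    using UnityEngine;

    public class BallSetup : MonoBehaviour
    {
        void Start()
        {
            // Extrapolate predicts the render position from the current
            // velocity; Interpolate instead lags one physics step behind.
            GetComponent<Rigidbody>().interpolation =
                RigidbodyInterpolation.Extrapolate;
        }
    }

    public class FollowCamera : MonoBehaviour
    {
        public Transform target; // the ball
        public Vector3 offset = new Vector3(0f, 3f, -6f);

        // LateUpdate runs after all Update calls, so the camera only
        // moves once the ball's position for this frame is final.
        void LateUpdate()
        {
            transform.position = target.position + offset;
            transform.LookAt(target);
        }
    }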

How to prevent a large number of transitions in a Unity state machine?

I have a unity state machine with four states: idle left, idle up, idle down and idle right.
To transition between these states I had to create 12 transitions (the white arrows). This already seems unwieldy, but now I need to add four more states: running up, running down, running left and running right.
Does that mean I end up with 8 states and 56 transitions (8 × 7) running between all of them? That seems very unwieldy to me. What if I need to change something later?
I know I can transition by code, but that doesn't seem to be the recommended way of working.
animator.Play("runningright");
What would be the recommended way to work with lots of states?
As @Uri Popov said, you should consider using Blend Trees. They exist for exactly this purpose: they help blend between multiple similar animations. For example, walk and run animations are similar in that they both depend on the character's movement speed.
Look at the following links to learn more about blend trees. They only cover the basics, but they will surely help you with your problem.
Unity - Manual: Blend Trees
Blend Trees - Unity Official Tutorials
When to use a blend tree vs state machine for animations (just another question on gamedev.stackexchange)
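With a blend tree you keep a single state and drive it from code through parameters, so the transition web disappears entirely. A minimal sketch, assuming a 2D blend tree with float parameters named MoveX and MoveY:

    using UnityEngine;

    public class MovementAnimation : MonoBehaviour
    {
        private Animator animator;

        void Start()
        {
            animator = GetComponent<Animator>();
        }

        void Update()
        {
            // The blend tree selects idle/run and the facing direction
            // from these two parameters; no explicit transitions needed.
            animator.SetFloat("MoveX", Input.GetAxis("Horizontal"));
            animator.SetFloat("MoveY", Input.GetAxis("Vertical"));
        }
    }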

Eye-tracking for code editing

Is there a decent eye tracking package to replace the mouse for code editing?
I want to free up the mouse, but keep using my keyboard for editing code.
Having done some research on it, I concluded that proper eye tracking hardware is expensive. Using a webcam or high resolution video camera seems to be the most viable option.
Unfortunately, image-based tracking (as opposed to infra-red tracking) restricts the accuracy, and so not all features might be practical.
Desired eye-tracking IDE features:
Page scrolling
Tab selection
Setting cursor position
Selecting gaze-focused text with keyboard
A similar question recommends Opengazer for webcams, but I am particularly interested in speeding up basic text-editing. Any recommendations are appreciated, especially if you have experience with eye tracking and practical use cases.
The kind of accuracy you're looking for is pretty difficult to achieve, since text tends to be pretty small.
IR tracking is actually pretty easy to accomplish: a few IR LEDs and an IR camera (which is really just a normal camera with different filters) and your pupil lights up. This can be done for under $100, more if you want a better camera.
It's the head tracking that might be more of an issue.
You end up with quite a few degrees of freedom that you need to track, and your inaccuracies will just build up.
I'm pretty sure there is no out-of-the-box solution for this problem, but eyewriter.org has really nice instructions on how to build your own eye tracker. It is accurate enough to let someone "draw" graffiti using only their eyes, so it should be possible to convert eye movements into mouse events.
It can be done reasonably accurately (à la this article on how people read code), but I've never seen a commercial product that does what you're asking for.
Maybe take a look at Emotiv's headsets; they use thought patterns to perform tasks. They're designed for games, but you can probably repurpose them for normal tasks.
Regarding text cursor placement: Lightning (while I have not worked on this particular feature, I have previously contributed to the Text 2.0 project as a student), which is described in the paper
"Universal eye-tracking based text cursor warping",
will place the text cursor at the most salient target in the neighborhood of the gaze position reported by the eye tracker.
However, you need a Tobii eye tracker that supports the TET API. You might want to contact Tobii to verify that the Tobii X2-30 eye tracker, which costs < $10k, is compatible.
Just use vim. Do more with the keyboard, less with the mouse.
Personally, I had an issue with always having to reach for a normal mouse. I looked at various options (eye tracking, voice, touchscreen) and ended up changing my keyboard to one with an IBM TrackPoint. The end result is that my hands never leave the keyboard, and my typing speed and accuracy improved because I no longer have to reposition my right hand.
Eye Tribe has a $99 consumer-level eye tracker that is available now.
“Using a webcam or high resolution video camera seems to be the most viable option.”
Eye Tribe is a spinoff of Gaze Group, a research group located at the IT University of Copenhagen. The people of Gaze Group developed the open-source ITU GazeTracker software, which allows people to turn low-cost webcams into eye trackers.
http://www.gazegroup.org/downloads
Upon looking at the “downloads” section for Gaze Group, it seems that there are already some eye tracking applications to do some basic actions.
melhosseiny mentioned the Text 2.0 framework for creating eye tracking apps using HTML, CSS and JavaScript, and the Universal eye-tracking based text cursor warping feature for placing the text cursor at the most salient target.
Eye Tribe has its own SDK, but those things above could help if they work with Eye Tribe.