Use Unity3D particle system in UI

I've read a few different posts on how to display a particle system on the canvas in Unity, but I don't seem to be understanding it.
I'm trying to use the Particle Ribbon asset by Moonflower in my UI, but I can't get it to display there. I tried adding another Canvas as suggested in other posts, with the render mode set to Screen Space - Camera, but no luck.
At one point I saw the particle system, but it was very, very small and wouldn't change size regardless of scaling.

You can set the sorting order so the particle system is drawn above the canvas. The relevant properties are ParticleSystemRenderer.sortingOrder / sortingLayerID on the particle system, and Canvas.overrideSorting / sortingOrder / sortingLayerID on the canvas.
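A minimal sketch of that idea, assuming the canvas uses Screen Space - Camera (or World Space) render mode; a world-space particle system cannot be drawn over a Screen Space - Overlay canvas this way. The sorting layer name "UI" is only an example:

```csharp
using UnityEngine;

// Sketch: make a particle system render on top of a canvas by putting both
// on the same sorting layer and giving the particles a higher sorting order.
public class ParticleOverCanvas : MonoBehaviour
{
    public Canvas canvas;
    public ParticleSystem particles;
    public string sortingLayerName = "UI"; // example layer shared by canvas and particles

    void Start()
    {
        // Only needed if this is a nested canvas; a root canvas already
        // exposes its own sorting layer and order.
        canvas.overrideSorting = true;
        canvas.sortingLayerName = sortingLayerName;
        canvas.sortingOrder = 0;

        // Put the particle renderer on the same layer with a higher order,
        // so it is drawn after (on top of) the canvas.
        var psRenderer = particles.GetComponent<ParticleSystemRenderer>();
        psRenderer.sortingLayerName = sortingLayerName;
        psRenderer.sortingOrder = 1;
    }
}
```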

I would recommend trying the UIParticleSystem script found here.
Generally speaking, this Unity UI Extensions repository is full of amazing things created (and often updated) by the community: I'd advise you to bookmark it :)

Related

Show the Unity particle on the Canvas

I am using Unity version 2021.3.15f
I want to show the particles on the UI canvas.
I'd like to know two things: how to show particles on the canvas when the canvas render mode is Screen Space - Overlay, and when it is Screen Space - Camera.
Do I need to convert the particle system's transform into a RectTransform?
Or should I use methods like Camera.ScreenToWorldPoint?

You could place the particle system at a position computed with Camera.ScreenToWorldPoint and it will appear to work, but keep in mind this is just a band-aid fix that won't be robust or maintainable. Anything UI-related should usually go through Unity's UI rendering pipeline.
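For reference, a sketch of that band-aid approach, assuming a Screen Space - Camera canvas: the world-space particle system is repositioned each frame so it lines up with a UI element on screen. The field names here (uiCamera, targetUIElement, distanceFromCamera) are placeholders, not part of any existing API:

```csharp
using UnityEngine;

// Keeps a world-space particle system aligned with a UI element on a
// Screen Space - Camera canvas by converting UI position -> screen -> world.
public class ParticleFollowUI : MonoBehaviour
{
    public Camera uiCamera;               // the camera assigned to the canvas
    public RectTransform targetUIElement; // UI element the particles should sit on
    public float distanceFromCamera = 5f; // depth at which the particles are placed

    void LateUpdate()
    {
        // Project the UI element into screen space, then back into a
        // world-space point in front of the UI camera.
        Vector3 screenPos = RectTransformUtility.WorldToScreenPoint(uiCamera, targetUIElement.position);
        screenPos.z = distanceFromCamera;
        transform.position = uiCamera.ScreenToWorldPoint(screenPos);
    }
}
```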
There is a great resource for adding particle effects to UGUI in a GitHub repository.
Use: https://github.com/mob-sakai/ParticleEffectForUGUI
Take a look at the sample scenes; they have everything you need.

Unity rendertexture only renders a solid color

I'm making a game in Unity 2019.1.6f1 (PC, Mac & Linux Standalone). I want to project the game onto the pages of a book and believe that render textures can help achieve this.
They worked reasonably well on the placeholder book model, but when I try to apply the same render texture to a different model, it only seems to project a blown-out single pixel.
Here I placed them close to each other to show the problem:
I made screenshots of all the settings that may be related, though I did try to change all of them already. If any other information is necessary, please let me know.
The camera settings
The material settings
The mesh renderer settings
The texture settings
I tried changing the size variables to powers of 2, but that didn't work either.
I also found this post, but I don't believe this is the same problem.
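For context, a minimal sketch of how such a render-texture projection is typically wired up, assuming one camera that films the scene and a renderer on the book page; bookCamera and pageRenderer are placeholder names, not anything from the project above:

```csharp
using UnityEngine;

// Render one camera's view into a RenderTexture and show it on a book page.
public class BookPageProjection : MonoBehaviour
{
    public Camera bookCamera;     // camera that films the scene to project
    public Renderer pageRenderer; // mesh renderer of the book page model

    void Start()
    {
        var rt = new RenderTexture(1024, 1024, 16);
        bookCamera.targetTexture = rt;

        // The page mesh needs UVs spanning the 0-1 range for the whole
        // texture to be visible; collapsed UVs sample only a tiny region.
        pageRenderer.material.mainTexture = rt;
    }
}
```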

How do I use different Post-Processing effects on different cameras in Unity 2017.4.2f2?

Before I explain my situation, it's important to mention that I'm using an older version of Unity, 2017.4.2f2 (for PSVITA support). For this reason, I'm also using the old Post-Processing Stack.
I'm making a game in which I want a lot of bloom on the UI, but not as much for the rest of the game. I used one camera to render the UI (the Canvas is set to Screen Space - Camera) and gave it post-processing,
and another camera to render the rest of the game.
Each is given a different profile so they use different effects.
My expectation was that the UI renderer camera would only apply its effects to the Canvas. Instead, it also applies them to the camera beneath it: the game renderer camera.
As you can see, I set the clear flags to Don't Clear. I also tried Depth Only, to see if it would make a difference.
I'm lost as to what I could do. Grain and bloom get applied to everything,
yet the profile for those effects is only assigned to the UI renderer camera's Post Processing Behaviour script.
Does anyone have any suggestions or ideas? I'm lost.
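For reference, a minimal sketch of the two-camera setup described above, using the old Post-Processing Stack (v1) and its PostProcessingBehaviour component; the profile assets and layer names are placeholders:

```csharp
using UnityEngine;
using UnityEngine.PostProcessing; // old Post-Processing Stack (v1)

// Two stacked cameras, each with its own post-processing profile:
// the game camera renders everything except the UI layer, and the UI
// camera renders only the UI layer on top of it.
public class SplitPostProcessing : MonoBehaviour
{
    public Camera gameCamera;
    public Camera uiCamera;
    public PostProcessingProfile gameProfile; // mild bloom
    public PostProcessingProfile uiProfile;   // heavy bloom for the UI

    void Start()
    {
        int uiLayer = LayerMask.NameToLayer("UI");
        gameCamera.cullingMask = ~(1 << uiLayer); // everything except UI
        uiCamera.cullingMask = 1 << uiLayer;      // only UI
        uiCamera.clearFlags = CameraClearFlags.Depth;
        uiCamera.depth = gameCamera.depth + 1;    // draw after the game camera

        gameCamera.GetComponent<PostProcessingBehaviour>().profile = gameProfile;
        uiCamera.GetComponent<PostProcessingBehaviour>().profile = uiProfile;
    }
}
```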

Building custom Mixed Reality (Augmented Reality) setup in Unity

I'm tasked with developing an application, which would emulate augmented reality in a virtual reality application. We are using Google Cardboard (Google VR), and want to show the camera images (don't mind the actual camera setup, say I already have the images) to the user.
I'm wondering about the ways to implement it. Some ideas I had:
Substituting the images rendered for each eye by my custom camera images.
Here I have the following problems: I don't know how to actually replace the images that are rendered to the screen, let alone per eye, nor how to afterwards show some models overlaid on top of the image (I would assume by using the stencil buffer?).
Placing two planes in front of the camera with the custom images rendered onto them.
In this case, I'm not sure about the overall comfort of the user experience, as the planes would most likely be placed really close, so each eye should only see its own plane and not the other. It seems like it might put some strain on your eyes, because they would have to focus on something that is really close to you.
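A rough sketch of that second idea, assuming two per-eye quads parented to the head and layers named "LeftEyeOnly" / "RightEyeOnly" that you would create yourself (all names here are placeholders):

```csharp
using UnityEngine;

// Two quads show the incoming camera images; layers and per-eye culling
// masks keep each quad visible to only one eye.
public class StereoCameraFeed : MonoBehaviour
{
    public Camera leftEyeCamera;  // stereo target eye: Left
    public Camera rightEyeCamera; // stereo target eye: Right
    public Renderer leftQuad;     // on layer "LeftEyeOnly", parented to the head
    public Renderer rightQuad;    // on layer "RightEyeOnly", parented to the head

    void Start()
    {
        leftEyeCamera.stereoTargetEye = StereoTargetEyeMask.Left;
        rightEyeCamera.stereoTargetEye = StereoTargetEyeMask.Right;

        // Hide the other eye's quad from each camera.
        leftEyeCamera.cullingMask &= ~(1 << LayerMask.NameToLayer("RightEyeOnly"));
        rightEyeCamera.cullingMask &= ~(1 << LayerMask.NameToLayer("LeftEyeOnly"));
    }

    // Call this whenever new camera images arrive.
    public void UpdateImages(Texture2D leftImage, Texture2D rightImage)
    {
        leftQuad.material.mainTexture = leftImage;
        rightQuad.material.mainTexture = rightImage;
    }
}
```

Models overlaid on top of the images would then simply be regular scene objects on layers that both cameras can see.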
Somehow I haven't found a project that would try to achieve something like that, and especially with all the Windows Mixed Reality related stuff polluting the search results.
You can use Vuforia digital eyewear, here is the documentation for it.
And a simple tutorial on YouTube.

Unity profiler Device.present

When trying to optimize my game, the biggest problem seems to be Device.Present. I've been going through some forums and I couldn't really find any useful answers. What is usually the main problem associated with this?
There are many things that could cause this, but the main reason is that the main thread is being blocked by the graphics driver while it waits for the GPU to catch up.
These are the specific reasons:
1. Image Effects
Check your camera. If you have image effects such as Flare Layer, anti-aliasing and others, disable them.
2. UI Effects
Check all your Images, RawImages and Texts. If an Outline, Shadow or Position As UV1 component is attached to them, this could cause the problem, especially when several of these components are attached to a single Image, RawImage or Text.
3. Bad Light Settings
Select your light and make sure that the Resolution under Shadow Type is not set to Very High Resolution.
4. In the Quality Settings, change the V Sync Count to Don't Sync (see the sketch after this list).
5. Check for Sprites and Images with 0 alpha, then disable them (also covered in the sketch after this list).
6. In the Player Settings, disable Auto Graphics API, then change the Graphics API to OpenGLES2.
7. Custom Shaders
Are you using custom (non-standard) shaders? Disable them temporarily. An expensive or badly written shader could cause this problem.
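A small sketch for points 4 and 5, assuming a throwaway diagnostics script dropped into the scene; the class name and the logging are illustrative, and both changes can equally be made in the editor:

```csharp
using UnityEngine;
using UnityEngine.UI;

// Diagnostics for points 4 and 5: turn V Sync off from code and report
// UI Images that are fully transparent but still being rendered.
public class DevicePresentChecks : MonoBehaviour
{
    void Start()
    {
        // 4. Equivalent of setting "V Sync Count" to "Don't Sync" in Quality Settings.
        QualitySettings.vSyncCount = 0;

        // 5. Find Images whose color alpha is 0; they still cost fill rate while enabled.
        foreach (Image image in FindObjectsOfType<Image>())
        {
            if (Mathf.Approximately(image.color.a, 0f))
            {
                Debug.Log("Fully transparent Image still enabled: " + image.name, image);
                // image.enabled = false; // uncomment to disable them automatically
            }
        }
    }
}
```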
Those are the usual problems. It is quite possible that yours is something different. I suggest you enable/disable things one by one, and you will likely find the culprit.
If not, consider creating a new project and scene. Save your old game out as prefabs or assets, then import them one by one into the new project. Don't import them all at the same time, because then the problem might simply reappear without revealing its cause. Import, test, import again and test, until you find the problem. If the problem is no longer there, it's likely an editor settings problem.