Unity - Canvas Size changes from one computer to the other - unity3d

I am porting a video game from Xamarin to Unity.
The game uses, amongst other things, Unity UI functionality (hence a canvas).
I did some work on one computer, adapting and placing the UI elements I needed on the canvas, then saved and checked my work into Subversion.
I then checked out the code on another machine and reopened the project, only to find that the canvas size (and hence the layout of all the UI elements) was quite different and all over the place!
Why is that? Did I forget to check some important file (for example, metadata) into source control?
Thanks,
Régis

This is because the canvas height and width depend on the resolution of the main monitor of the machine running the game/editor.
You'll want to look into using anchors and layout components to make the canvas responsive.
Unity has a how-to article on building a responsive UI: https://docs.unity3d.com/Manual/HOWTO-UIMultiResolution.html
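For reference, a minimal sketch of the kind of setup that article describes, assuming a Canvas Scaler component on the canvas; the 1920x1080 reference resolution and 0.5 match value are example numbers, not values from the question:

```csharp
using UnityEngine;
using UnityEngine.UI;

// Example: configure a CanvasScaler so the canvas no longer depends on the
// monitor's native resolution. The reference resolution is an example value.
[RequireComponent(typeof(CanvasScaler))]
public class CanvasScalerSetup : MonoBehaviour
{
    void Awake()
    {
        var scaler = GetComponent<CanvasScaler>();
        scaler.uiScaleMode = CanvasScaler.ScaleMode.ScaleWithScreenSize;
        scaler.referenceResolution = new Vector2(1920f, 1080f);
        // 0 = match width, 1 = match height, 0.5 = blend of both.
        scaler.matchWidthOrHeight = 0.5f;
    }
}
```

In practice you would normally set these values once in the inspector rather than from code; the script just makes the relevant properties explicit.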

Related

Unity rendertexture only renders a solid color

I'm making a game in Unity 2019.1.6f1 (PC, Mac, Linux standalone). I want to project the game onto the pages of a book and believe that render textures can help achieve this.
They worked reasonably well on the placeholder book model, but when I try to apply the same render texture to a different model, it only seems to project a single blown-out pixel.
Here I placed them close to each other to show the problem:
I made screenshots of all the settings that may be related, though I did already try changing all of them. If any other information is necessary, please let me know.
Screenshots: the camera settings, the material settings, the mesh renderer settings, the texture settings.
I tried changing the size variables to powers of 2, but that didn't work either.
I also found this post, but I don't believe this is the same problem.
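No answer was posted for this one, but for context, the usual render-texture wiring looks roughly like the sketch below; the camera and material fields and the 1024x1024 size are hypothetical placeholders, not values from the question:

```csharp
using UnityEngine;

// Minimal sketch of a typical render-texture setup: one camera renders into a
// RenderTexture, and a material on the book page samples that texture.
// "bookCamera" and "pageMaterial" are hypothetical names, not from the question.
public class RenderTextureSetup : MonoBehaviour
{
    public Camera bookCamera;      // camera that films the game
    public Material pageMaterial;  // material on the book page mesh

    void Start()
    {
        var rt = new RenderTexture(1024, 1024, 16);   // width, height, depth buffer
        bookCamera.targetTexture = rt;                // camera now renders into rt
        pageMaterial.mainTexture = rt;                // page displays the camera's view
    }
}
```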

UE4 Navmesh isn't the same in viewport and in game

So I am making a navmesh in a scene which loads multiple levels. Everything works, except that when the navmesh is rebuilt on startup it isn't the same as the one I see in the viewport, and it is now much less accurate for some reason. (Reloading matters because multiple levels can be streamed, but for testing purposes I stream the same one that I see in the viewport.) Two links follow showing the viewport version and the in-game one when I simulate the game. The settings are the same in the project settings and on the actual RecastNavMesh, and I can't find anything on the net referring to my problem.
[1] In viewport : https://i.stack.imgur.com/FWVRE.jpg
[2] In Game : https://i.stack.imgur.com/q0RdO.jpg

CSS3D StereoEffect creating dual non synced webpages

This project is a combination of a few things I've found. First, embedding webpages in Three.js:
http://adndevblog.typepad.com/cloud_and_mobile/2015/07/embedding-webpages-in-a-3d-threejs-scene.html
and second, the custom CSS3DRenderer I found on Stack Overflow:
Three.js StereoEffect cannot be applied to CSS3DRenderer
The effect is almost exactly what I wanted, except that instead of simply re-drawing the output from one side to the other, it's loading two separate instances, which sort of defeats the point of going VR...
Any ideas? Here's the file:
https://drive.google.com/open?id=1UmXmdgyhZkbeuZlCrXFUXx-yYEKtLzFP
My goal was to render the Ace cloud editor in a VR environment using the stereo effect algorithm (which seems like it could be a fun new way to develop code if you had a wireless keyboard/trackpad with a VR headset, locking the camera in one location of course but still needing the mirrored view for the lenses)...
https://www.ebay.com/itm/Wireless-Bluetooth-Keyboard-with-Touchpad-for-iOS-Android-Smart-phone-Tablet-PC/112515393899?_trkparms=aid%3D222007%26algo%3DSIM.MBE%26ao%3D1%26asc%3D20161006002618%26meid%3D50ca4e61c27345df85a8461fb1a0e6d5%26pid%3D100694%26rk%3D7%26rkt%3D30%26sd%3D222631353709&_trksid=p2385738.c100694.m4598
https://www.walmart.com/ip/ONN-Virtual-Reality-Headset-White/187088616?wmlspartner=wlpa&selectedSellerId=0

Use Unity3d particle system in UI

I've read a few different posts on how to display a particle system on a canvas in Unity, but I don't seem to be understanding it.
I'm trying to use the Particle Ribbon asset by Moonflower in my UI but can't get it to display there. I tried adding another Canvas as suggested in other posts, with Render Mode set to Screen Space - Camera, but no luck.
At one point I saw the particle system but it was very, very small and wouldn't change size regardless of scaling.
You can set the sorting order so the particles draw on top of the canvas: ParticleSystemRenderer.sortingOrder / sortingLayerID on the particle system, and Canvas.overrideSorting / sortingOrder / sortingLayerID on the canvas.
(Screenshots: canvas settings, particle system settings.)
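A minimal sketch of that sorting-order idea, assuming a Screen Space - Camera canvas; the component, field names and order values here are placeholders, not from the original answer:

```csharp
using UnityEngine;

// Sketch of the sorting-order approach: put the canvas and the particle
// renderer on the same sorting layer and give the particles a higher order.
// The exact numbers are arbitrary example values.
public class ParticleOverUI : MonoBehaviour
{
    public Canvas canvas;
    public ParticleSystem particles;

    void Start()
    {
        canvas.overrideSorting = true;
        canvas.sortingOrder = 0;

        var psRenderer = particles.GetComponent<ParticleSystemRenderer>();
        psRenderer.sortingLayerID = canvas.sortingLayerID;
        psRenderer.sortingOrder = 1;   // higher than the canvas, so it draws on top
    }
}
```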
I would recommend trying the UIParticleSystem script found here.
Generally speaking, this Unity UI Extension repository is full of amazing things created (and often updated) by the community: I'd advise you to bookmark it :)

Develop 2D game Inside Canvas Scaler

I'm new to Unity and I've realized that it's difficult to do a multi-resolution 2D game in Unity without the paid third-party plugins available on the Asset Store.
I've made some tests and I'm able to get multi-resolution support this way:
1- Put all the UI (buttons etc.) inside a Canvas object in Render Mode Screen Space - Overlay, with a 16:9 reference resolution and fixed width.
2- Put the rest of the game objects inside a GameObject called GameManager that has a Canvas Scaler component, Render Mode Screen Space - Camera, a 16:9 reference resolution, fixed width and the Main Camera attached. After that, all game objects inside GameManager, like the player and platforms, need a RectTransform component, a CanvasRenderer component and, for example, an Image component.
Can I continue developing the game that way, or is this the wrong way to do things?
Regards
Also, don't forget GUI and Graphics. It's a common misconception that GUI is deprecated and slow. No, it's not. The GameObject helpers for GUI were bad and are deprecated, but the API you call inside OnGUI works great when all you need is to draw a texture or some text on the screen. They're called legacy, but there are no plans to remove them, as the whole Unity UI is built on top of them anyway.
I have made a few games with just these, using Unity as a very over-engineered multiplatform API for drawing quads.
There is also GL if you want something more.
Just remember - there will be no built-in physics, particle effects, path finding or anything - just a simple way to draw stuff on the screen. You will have total control over what will be drawn - and this is both a good and bad thing, depending on what you want to do.
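For illustration only, drawing directly from OnGUI looks roughly like this; the texture field, rectangles and text are placeholder example values, not anything from the answer:

```csharp
using UnityEngine;

// Immediate-mode GUI sketch: everything is drawn every frame from OnGUI.
// There are no GameObjects, layouts or physics involved - just draw calls.
public class ImmediateModeHud : MonoBehaviour
{
    public Texture2D sprite;   // any texture you want to splat on screen

    void OnGUI()
    {
        // Draw a texture in the top-left corner (position/size are example values).
        if (sprite != null)
            GUI.DrawTexture(new Rect(10, 10, 128, 128), sprite);

        // Draw some text next to it.
        GUI.Label(new Rect(150, 10, 300, 30), "Score: 42");
    }
}
```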
I would not recommend using the Canvas Scaler for developing a complete game. The intended purpose of the Canvas Scaler is to create menus, and you should use it for menus only.
2D games created without the Canvas Scaler don't cause many problems (mostly they don't cause any problems) across multiple resolutions.
So your step 1 is correct, but for step 2 you don't need a Canvas Scaler component attached.
Do remember to switch your scene view to 2D (not strictly necessary) and set your camera to orthographic (necessary) while developing 2D games, as in the sketch below.
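A small sketch of that last point, assuming sprites imported at Unity's default 100 pixels per unit (that value is my assumption, not from the answer):

```csharp
using UnityEngine;

// Sketch: force the camera into orthographic mode and derive its size from
// the screen height and an assumed pixels-per-unit value (100 is Unity's
// default sprite import setting, used here purely as an example).
[RequireComponent(typeof(Camera))]
public class Orthographic2DCamera : MonoBehaviour
{
    public float pixelsPerUnit = 100f;

    void Start()
    {
        var cam = GetComponent<Camera>();
        cam.orthographic = true;
        // orthographicSize is half the visible height in world units.
        cam.orthographicSize = Screen.height / (2f * pixelsPerUnit);
    }
}
```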