Making an interactable computer screen in a Unity 3D game - unity3d

I need to make an interactable computer screen in Unity 3D for my game, like the ones in Streamer's Life Simulator or Internet Cafe Simulator. I used a Canvas and tried a TextureRenderer, but that didn't work either. What systems should I work on for this?
When I tried to use a Canvas for this, the text was reversed (see the attached photo).

If your Canvas is facing the wrong way, you can rotate it.
On the Rect Transform component, rotate the Y axis by 180 degrees.

Use a plane and apply the render texture to the plane's material. It will give the proper output.
You can use this sample code to copy the RenderTexture into a Texture2D:
// Create a Texture2D the same size as the RenderTexture
Texture2D texture = new Texture2D(count_x, count_y, TextureFormat.RGB24, false);
Rect rectReadPicture = new Rect(0, 0, count_x, count_y);

// Make the RenderTexture the active render target, then read its pixels
RenderTexture.active = renderTexture;
texture.ReadPixels(rectReadPicture, 0, 0);
texture.Apply();
RenderTexture.active = null;
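For the plane approach itself, here is a rough sketch of how the pieces could be wired together (the component, field names and resolution are placeholders for illustration, not from the answer above): a second camera films the in-game screen's content and draws into a RenderTexture, and the monitor plane's material displays that texture.

using UnityEngine;

// Sketch: render a secondary "screen" camera into a RenderTexture and show
// that texture on the monitor plane. Names and size are placeholders.
public class MonitorScreen : MonoBehaviour
{
    public Camera screenCamera;   // camera that films the screen's UI/content
    public Renderer screenMesh;   // the plane or quad acting as the monitor

    RenderTexture rt;

    void Start()
    {
        rt = new RenderTexture(1024, 768, 16);
        rt.Create();

        screenCamera.targetTexture = rt;        // the camera now draws into the texture
        screenMesh.material.mainTexture = rt;   // the plane displays the live camera output
    }

    void OnDestroy()
    {
        if (screenCamera != null) screenCamera.targetTexture = null;
        if (rt != null) rt.Release();
    }
}

With a setup like this, the in-game computer shows whatever screenCamera sees, and ReadPixels is only needed if you actually have to copy the pixels into a Texture2D.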

Related

Fix the camera while walking unity 3D

Currently, when I walk, the camera follows the player even when he turns sideways. I want the X axis to be kept fixed, so that the camera only follows the player's movement and doesn't turn sideways with him. I'm trying:
Camera.main.transform.localPosition = new Vector3(0, 8, -10);
Camera.main.transform.localRotation = Quaternion.Euler(40,0,0);
Camera.main.transform.SetParent(transform);
Are you setting the camera as a child of the player GameObject? If that's the case, you should not do this: a child camera will always rotate together with the player object.
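A common alternative is to keep the camera un-parented and follow the player's position in LateUpdate with a fixed offset and rotation. A minimal sketch (the class and field names are placeholders, reusing the offset from the question's code):

using UnityEngine;

// Sketch of a follow camera that keeps a fixed offset and rotation without
// being parented to the player.
public class FollowCamera : MonoBehaviour
{
    public Transform player;                      // assign the player in the Inspector
    Vector3 offset = new Vector3(0, 8, -10);      // same offset as in the question's code

    void Start()
    {
        transform.rotation = Quaternion.Euler(40, 0, 0);   // fixed downward tilt
    }

    void LateUpdate()
    {
        // Copy only the player's position; the player's rotation is ignored.
        transform.position = player.position + offset;
    }
}

Because only the position is copied, turning the player never rotates the camera.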

Sprite Texture of a Sprite in a Spritesheet

I got a sprite sheet with multiple sprites (32 pixels per sprite). Sprite mode is set to multiple, sprites are split properly. Sprite sheet is set to no compression and filter mode point.
I am setting the rendered sprite (for testing purposes ofc) like this:
var texture = Object.Instantiate(MySprite.texture) as Texture2D;
renderer.sprite = Sprite.Create(texture, new Rect(0.0f, 0.0f, 32, 32), new Vector2(0.5f, 0.5f), 32);
If MySprite is a sprite from the sprite sheet, the sprite renderer renders a transparent sprite.
If MySprite is a "single" mode sprite it gets rendered properly. (I cut it out of the sprite sheet).
It seems as if Sprite#texture does not contain the texture for sprites from a sprite sheet.
How do I get the Texture2D object of a sprite that's part of a sprite sheet?
And how can I detect if the sprite is part of a sprite sheet, so I can select the texture according to that?
Note: Apart from the mode, both the "single" and "multiple" mode sprites have the same settings, so it shouldn't be a settings issue.
If I loop through the pixels, I get colored pixels (with colors from the actual sheet), but with alpha = 0.
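One possible explanation (an assumption on my part, not stated above): for a sprite packed in a sheet, Sprite.texture returns the whole sheet, so a fixed Rect(0, 0, 32, 32) samples the bottom-left cell of the sheet, which may well be empty. Sprite.textureRect gives the region the sprite actually occupies, roughly like this (RebuildSprite is a hypothetical helper):

// Sketch: recreate a sprite from the region it occupies inside its (possibly shared)
// texture. Works as long as the sprite is not tightly packed or rotated by a packer.
Sprite RebuildSprite(Sprite source, float pixelsPerUnit = 32f)
{
    Rect region = source.textureRect;   // the sprite's own cell inside the sheet
    return Sprite.Create(source.texture, region, new Vector2(0.5f, 0.5f), pixelsPerUnit);
}

For a "single" mode sprite, textureRect simply covers the whole texture, so the same code should handle both cases without needing to detect which one you have.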

Can someone explain Unity co-ordinate system in 2D to me please?

I'm trying to make a 2D game.
I have 4 GameObjects which I want to place in the four corners of the screen, i.e. when I run my app on my phone one should be visible in each corner.
So what I did in the script is,
//GameObject 1 script
void Start(){
transform.position = Camera.main.ScreenToWorldPoint(new Vector2(Screen.width, Screen.height));
}
//GameObject 2 script
void Start(){
transform.position = Camera.main.ScreenToWorldPoint(new Vector2(0, 0));
}
//And same for other 2 gameobjects...
But i am not seeing any of the objects on my screen.
You're doing it the right way, but there are a few things you should check:
Make sure that you're using the correct camera to translate coordinates from screen to world.
Make sure that the z position of your objects is inside the camera's clipping range (it must be somewhere between the camera's near clipping plane and its far clipping plane). You can start your game in the editor and check in the Scene view whether the GameObjects are actually inside the clipping range.
What you're doing places each GameObject so that its center is in the corner of the screen; that's not the same as the GameObject fitting into the corner. If you want the GameObjects to fit into the corners (rather than having their centers there), you should calculate an offset for each GameObject. You can use renderer.bounds for this purpose (see the sketch after this list).
Make sure that your camera renders the layer the GameObjects are on. You can check this in the camera's Culling Mask.
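A minimal sketch combining the points above (assuming an orthographic 2D camera sitting at negative z; the class name is a placeholder): pass an explicit distance to ScreenToWorldPoint so the object lands on the z = 0 plane, then pull it inwards by half its bounds so it fits the corner.

using UnityEngine;

// Sketch: anchor this object to the top-right screen corner, keeping it on the
// z = 0 plane and offset by its own bounds so the whole object stays visible.
// Mirror the signs for the other corners.
public class CornerAnchor : MonoBehaviour
{
    void Start()
    {
        Camera cam = Camera.main;
        float distance = -cam.transform.position.z;   // distance from the camera to the z = 0 plane

        Vector3 corner = cam.ScreenToWorldPoint(
            new Vector3(Screen.width, Screen.height, distance));

        // Pull the object inwards by half its rendered size so it fits the corner.
        Bounds b = GetComponent<Renderer>().bounds;
        transform.position = new Vector3(corner.x - b.extents.x,
                                         corner.y - b.extents.y,
                                         0f);
    }
}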

Static 2D text over 3D scene in javafx java

My goal is to overlay 2D text over a 3D scene in JavaFX.
Using a SubScene is not a valid choice, as I want the 3D model to be able to take up the entire screen.
I tried adding a label to the scene and turning depth buffering off, but once the model gets rotated (the actual camera changes position) the correct positioning breaks. (I used code to control the camera.)
Can I somehow overlay a static 2D GUI over my 3D scene maybe by using anchor panes and having a 2D scene with transparent background?
On stack overflow I only found these questions:
Question No.1
Question No.2
which don't correspond to my exact needs.
I had misunderstood the concept of SubScenes, as the examples I found all showed entirely separate controls. Overlaying 2D text over the 3D content is possible using the following structure:
Root Container (e.g. an Anchor Pane)
    2D Content (Label)
    SubScene
        perspective camera
        root 3D
            3D content
Code example:
// Root container that holds both the 2D overlay and the 3D SubScene
AnchorPane globalRoot = new AnchorPane();

// Add 2D content here
globalRoot.getChildren().add(new Label("Hello World"));

Scene scene = new Scene(globalRoot, 1024, 768, true);

SubScene sub = new SubScene(root3D, 1024, 768, false, SceneAntialiasing.BALANCED);
sub.setCamera(camera);
globalRoot.getChildren().add(sub);
// Add all 3D content to the root3D node

primaryStage.setScene(scene);
primaryStage.show();

Render a RenderTexture to a mesh

In Unity, I'm updating a RenderTexture procedurally (writing data into it) via a DirectX plugin. I do something like the following to create my RenderTexture initially:
RenderTexture myTexture = new RenderTexture (100, 100, 0);
myTexture.Create ();
transform.GetComponent<Renderer> ().material.mainTexture = myTexture;
transform.GetComponent<Renderer> ().enabled = true;
Then later on I modify the texture as needed. Yet this object's material (what it looks like in the real world) doesn't change. If I click on the object, then on its material, and then on the RenderTexture attached to it, I can see it updating; for some reason it just doesn't update on the actual mesh. Why is this? I've tried using different built-in shaders, but that hasn't seemed to help. Is there a way to write a shader that renders a RenderTexture to a mesh, as one idea?
I found the best option is simply to use a RawImage instead of a material, and apply the render texture to that RawImage's texture property (not mainTexture, just texture). A material can then even be applied to that RawImage if you want to use a shader.
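As a minimal sketch of that RawImage approach (the component and field names are placeholders; the RawImage is assumed to sit on a Canvas positioned where the texture should appear):

using UnityEngine;
using UnityEngine.UI;

// Sketch of the RawImage approach: assign the RenderTexture to RawImage.texture.
public class ScreenRawImage : MonoBehaviour
{
    public RawImage rawImage;   // RawImage on a Canvas covering the display area

    void Start()
    {
        RenderTexture rt = new RenderTexture(100, 100, 0);
        rt.Create();

        rawImage.texture = rt;   // note: the texture property, not mainTexture
        // A material with a custom shader can still be assigned via rawImage.material.
    }
}

Assigning the RenderTexture to rawImage.texture rather than to a material's mainTexture is the key difference from the original setup.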