I am learning Unity2D and trying to teach myself by making clones of older games; my first one is Spacewar (the 1962 game).
The problem I cannot wrap my head around is setting up a script (I think this is the correct approach) so that when my sprite goes off camera on one edge it appears on the opposite edge. If you are not familiar with Spacewar, an example would be Pac-Man going off one side of the screen and appearing on the opposite side.
How should I approach this? There are other games on my list of cloning projects that will share this same wrap-around mechanic.
I feel like I have the logic in my head perfectly, but at the same time I am so new to Unity that the syntax is preventing me from moving forward.
Should this be based on:
1) Having collision on my edges and just moving the sprite to the opposite side,
or
2) Going based on the camera edges?
I kinda think it might be something along the lines of #2, because what if the screen size is different on another computer?
What I was looking for was this: Mathf.Clamp(transform.position.x, 6.0, -6.0). This solved my problem of going off the screen on one edge and reappearing on the other. This is just for the left and right edges.
if(transform.position.x < -6.0 || transform.position.x > 6.0){
    // Note the swapped min/max arguments: anything past +6 ends up at -6 and
    // anything past -6 ends up at +6, which gives the wrap-around instead of an ordinary clamp.
    var xPos : float = Mathf.Clamp(transform.position.x, 6.0, -6.0);
    transform.position = Vector2(xPos, transform.position.y);
}
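If you do go with #2, a minimal sketch of the camera-based approach could look like this in C# (the class name ScreenWrap is just a placeholder, and it assumes the script sits on the ship and Camera.main is the camera rendering it):

using UnityEngine;

public class ScreenWrap : MonoBehaviour
{
    void LateUpdate()
    {
        // Viewport space runs from (0,0) at the bottom-left of the camera to (1,1) at the
        // top-right, so the check works for any screen size or aspect ratio.
        Vector3 viewPos = Camera.main.WorldToViewportPoint(transform.position);

        // Left one edge? Reappear on the opposite edge.
        if (viewPos.x > 1f) viewPos.x = 0f;
        else if (viewPos.x < 0f) viewPos.x = 1f;
        if (viewPos.y > 1f) viewPos.y = 0f;
        else if (viewPos.y < 0f) viewPos.y = 1f;

        transform.position = Camera.main.ViewportToWorldPoint(viewPos);
        // If the ship moves via a Rigidbody2D, set rb.position here instead of transform.position.
    }
}

Because the wrap happens in viewport space, the edges follow whatever resolution and aspect ratio the camera has, instead of hard-coded world values like 6.0.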
I am currently building a simple mixed reality application in Unity where you should be able to stack different sized cubes on top of each other. As soon as I add a third cube though, the collision seems to get weird. I have already played around with the rigidbody settings but so far nothing has worked. I have also enabled adaptive force.
You need to freeze the x and z axes on the rigidbody.
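In code that might look like the sketch below, assuming the rotation constraints are what is meant (you can also just tick the same checkboxes under Constraints on the Rigidbody in the Inspector):

using UnityEngine;

public class CubeSetup : MonoBehaviour
{
    void Start()
    {
        // Keep the cube from tipping over sideways while it sits in the stack.
        Rigidbody rb = GetComponent<Rigidbody>();
        rb.constraints = RigidbodyConstraints.FreezeRotationX | RigidbodyConstraints.FreezeRotationZ;
    }
}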
For those who are familiar with using Flame in Flutter for game development, I'm wondering if you can advise me whether I'm on the right track or not, because I'm not sure if what I'm seeing in my testing is what I expected. I started out with Flame because it seemed like a relatively simple way to make the basic game that I'm aiming to make.
I'm making a basic game where there are four boundaries defined on each edge of the screen, and a ball will bounce around the four boundaries. The boundaries are defined as widgets (because I want to control the properties of each - sometimes they'll be "electrified", meaning the ball shouldn't collide with them). And the ball is a widget as well, of course. I've got some basic code done where I can drag a line from the ball to indicate the direction that I want to start bouncing, and then the ball will bounce around the boundaries (just using basic angle of incidence = angle of reflection to determine the direction of movement).
The code to do the movement is in the "update" method of the ball widget - however, what I'm finding is that the time between updates is somewhere around 200-300 milliseconds, so if I want to show the ball moving at any kind of pace, it has to jump a good number of pixels at each "update" tick - and thus the movement looks "jerky".
Am I doing this the right way? Is there a different (better) approach that will make the movement appear smoother? Or, I'm wondering whether the duration of the "update" ticks is a result of running the code via debug in an Android emulator? (I'm using Android Studio for the emulation, and Visual Studio Code to build the project). I know I don't have actual code here in the question, because essentially I don't have an issue with my code not running - I would just like to understand if that duration of "update" ticks is "normal", and if the resulting "jerky" animation is just to be expected - or do I need to look at a different approach? Thanks in advance!
You should preferably not use widgets for moving game parts; use Flame components instead. You could, for example, have four SpriteComponents as the walls and the ball as another SpriteComponent, and then use the collision detection system to react when the ball touches one of the walls.
https://docs.flame-engine.org/main/collision_detection.html
Right now I'm using the WASD keys to move my ThirdPersonCharacter around.
But since I'm making an adventure game, or more of a quest game, I wonder if I should somehow make it point-and-click style?
The game I want to make is like the old-school adventure games where you click on items and select what to do with them: look/take/use. And since I'm doing it in 3D, I wonder how I should set up the game style.
Another thing: I didn't find any tutorials on how to make the point-and-click, nor on how to make the items. For example, if I put a cube in the spaceship, how do I make it so that when the user clicks on it, it displays small option icons like look/take/use?
Another sub-question: the first time I moved the character to the spaceship, it walked right through it, so I added a Mesh Collider to the spaceship. Is adding a Mesh Collider the right thing to do? It seems to be working now, I just wonder if it's correct.
A screenshot of the game scene before adding the collider, and scene3 after adding the collider:
To start, this decision is left up to you. We cannot answer that for you. However, there are some pros and cons of using either.
For point and click, there is basic code for doing this, but there are many parameters to take into account. Although you may only want to go from point A to B, the code must also handle avoiding obstacles, walking on the ground, rotating the character accordingly, responding to other stimuli in the environment, etc. You can start with something simple:
Vector3 end;
float distance, scalarLerp;
Transform player;
LayerMask layer;
RaycastHit touch;

void Update(){
    if(Input.GetMouseButtonDown(0)){
        // Cast a ray from the camera through the mouse position onto the clickable layer.
        if(Physics.Raycast(Camera.main.ScreenPointToRay(Input.mousePosition), out touch, distance, layer)){
            end = touch.point;
        }
    }
    // Keep moving the player toward the last clicked point.
    player.position = Vector3.MoveTowards(player.position, end, Time.deltaTime * scalarLerp);
}
This code is simple and you will realise the problems with point and click.
You could remain with the Third Person Controller but that is left up to you.
I am trying (for learning purposes) to make a Portal game. I have the basics working, I can place two portals, and walking within the collider of one makes me teleport to the other, however I can't seem to get the facing direction/rotation to work.
I want to face outwards from the new portal after the teleportation.
I have tried the following, with no success:
var angle = thisPortalCamera.transform.rotation.eulerAngles.y - otherPortalCamera.transform.rotation.eulerAngles.y;
playerChar.transform.Rotate(Vector3.up, angle);
My idea here was that only the y-axis rotation really matters, and I think I should rotate the player by the difference in axis between the two portals. This is probably really simple and easy, but I am pretty new to Unity. Any suggestions?
The easiest way would be to set up your portal so that its forward is the orientation you want your player to have. Then you just go with:
player.transform.rotation = portal.transform.rotation;
player.transform.position = portal.transform.position;
The aim is to have the blue arrow (the forward axis) of your portal point in the right direction.
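Put together, a minimal usage sketch (assuming a trigger collider on the portal and a field like exitPortal pointing at the destination portal, both of which are just illustrative) might be:

using UnityEngine;

public class PortalTeleport : MonoBehaviour
{
    public Transform exitPortal; // the portal the player should come out of

    void OnTriggerEnter(Collider other)
    {
        if (other.CompareTag("Player"))
        {
            // Drop the player at the exit portal, facing along its forward (blue) axis.
            other.transform.position = exitPortal.position;
            other.transform.rotation = exitPortal.rotation;
        }
    }
}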
One of the easiest ways is to use the LookAt() and Rotate() functions of the Transform. LookAt() takes one parameter, which is the position the object has to look at.
transform.LookAt(portal.position);
transform.Rotate(new Vector3(0, 180, 0));
This will make your character face the opposite way of your portal.
Hi, I am developing a simple Space Shooter styled 2D game and I am stuck at the point where the object should be restricted from moving beyond the left and right edges of the screen.
I implemented @Waz's solution from one of the answers on Unity Answers and it works great if the object is not a rigidbody. However, if it is applied to a rigidbody, the object starts to flicker. Below is the code that I used from @Waz:
float speed = 0.1f;
Vector3 viewPos = Camera.main.WorldToViewportPoint(transform.position);
viewPos.x = Mathf.Clamp01(viewPos.x);
viewPos.y = Mathf.Clamp01(viewPos.y);
transform.position = Camera.main.ViewportToWorldPoint(viewPos);
Here is the link where @Waz mentioned his piece of code:
http://answers.unity3d.com/questions/148790/detecting-the-edge-of-the-screen.html
Here is a link that says to use an alternative solution for rigidbody but this code does not work for me:
http://answers.unity3d.com/questions/62189/detect-edge-of-screen.html
I am not sure how to modify the above code so that the object I touch and move does not flicker. Any help would be great.
You are translating from arbitrary float coordinates to the range [0,1] and back again. Likely the issue you are running into is due to floating-point inaccuracies when your world-position is far away from 0.
There are multiple ways of solving this:
In your above script, only perform the transform if the object is actually touching the edge of the screen (see the sketch after this list).
Handle the OnBecameVisible() and OnBecameInvisible() messages. Don't let the player move off screen if it would cause them to "go invisible".
Use the IsVisibleFrom() callback from this wiki article. Some people prefer this because they claim that "OnBecameVisible()/OnBecameInvisible() are broken."
I don't know how/why they believe they're broken.
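For the first option, a rough sketch (assuming the ship uses a Rigidbody2D; the class name EdgeClamp and the MovePosition approach are just one way to do it) could be:

using UnityEngine;

public class EdgeClamp : MonoBehaviour
{
    Rigidbody2D rb;

    void Awake()
    {
        rb = GetComponent<Rigidbody2D>();
    }

    void FixedUpdate()
    {
        Vector3 viewPos = Camera.main.WorldToViewportPoint(rb.position);

        // Only adjust the position when the object is actually past an edge,
        // and move the rigidbody (rather than the transform) so the physics
        // engine stays in sync, which should reduce the flicker.
        if (viewPos.x < 0f || viewPos.x > 1f || viewPos.y < 0f || viewPos.y > 1f)
        {
            viewPos.x = Mathf.Clamp01(viewPos.x);
            viewPos.y = Mathf.Clamp01(viewPos.y);
            rb.MovePosition(Camera.main.ViewportToWorldPoint(viewPos));
        }
    }
}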
Have you tried using Screen.width and Screen.height to detect the edge of the screen? Maybe that can help prevent the flickering.
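For example (a rough sketch assuming an orthographic camera; the ScreenBoundsClamp name and cached min/max fields are just illustrative), the screen size can be converted to world-space bounds once and then used for the clamp:

using UnityEngine;

public class ScreenBoundsClamp : MonoBehaviour
{
    Vector2 min, max;

    void Start()
    {
        // Convert the screen corners (in pixels) to world coordinates once.
        min = Camera.main.ScreenToWorldPoint(Vector3.zero);
        max = Camera.main.ScreenToWorldPoint(new Vector3(Screen.width, Screen.height, 0f));
    }

    void LateUpdate()
    {
        Vector3 pos = transform.position;
        pos.x = Mathf.Clamp(pos.x, min.x, max.x);
        pos.y = Mathf.Clamp(pos.y, min.y, max.y);
        transform.position = pos;
    }
}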