Unity 2D/3D: Drag a sprite over checkpoints in a path

I want to implement a feature in a Unity application where the user drags an object along checkpoints on a given path. The sprite/object should not stray far from the path. I have implemented this, and it works fine when the object is moved automatically in the Update function, but when the user drags it, it can move to any point. I need to:
1.) drag the object through a sequence of checkpoints along a given path, and
2.) restrict/fix the sprite to the given path while dragging.

Just make an animation (using Unity's fantastic animation system) where the object moves from 0.0 to 1.0 along the path (or any path you want).
When the mouse is down, simply convert the mouse's x position to the 0->1 range and set the animation's normalized time accordingly.
It's that easy in Unity.
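For example, here is a minimal C# sketch of that idea. The script and Animator state names (PathScrubber, "FollowPath") are made up for illustration, and it assumes you have already authored a clip that moves the object from the start of the path to the end:

    using UnityEngine;

    // A sketch of the "scrub an animation along the path" idea described above.
    // Assumes the object has an Animator with a single state (here called
    // "FollowPath", a made-up name) whose clip moves it from the start to the
    // end of the path over the clip's length.
    public class PathScrubber : MonoBehaviour
    {
        [SerializeField] private string stateName = "FollowPath"; // hypothetical state name

        private Animator animator;

        void Awake()
        {
            animator = GetComponent<Animator>();
            animator.speed = 0f; // the drag drives the animation, not time
        }

        void Update()
        {
            if (Input.GetMouseButton(0))
            {
                // Map the mouse x position (0..Screen.width) to a 0..1 value.
                float t = Mathf.Clamp01(Input.mousePosition.x / Screen.width);

                // Jump the state to that normalized time; the pose (and therefore
                // the object's position on the path) follows the drag.
                animator.Play(stateName, 0, t);
            }
        }
    }

Because the drag drives the animation directly, the object can never leave the authored path.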

Related

How to make holograms move relative to the user's head in HoloLens 2?

I am creating a simple app on HoloLens 2 using Unity. I created two game objects and want them to move based on the user's head movement, i.e. they should not stay still at a place in space but should move relative to the user's head. However, I am not sure how to enable this behaviour. Can someone please help?
The Orbital Solver provided in MRTK can implement this idea without even writing any code. It can lock the object to a specified position and offset it from the player. It is recommended to refer to the SolverExamples.unity scene, located at /MRTK/Examples/Demos/Solvers/Scenes, to get started with the Solver components.
If I understand you correctly, you want to have an object at a specific distance from the HoloLens 2 at any given time.
So that (for example) a cube is always in the upper right corner of the users view.
If that is the case, you can position the desired object as a child of the Main Camera (located under the MixedRealityPlayspace) in the hierarchy view.
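If you would rather set this up from code than in the hierarchy view, a minimal sketch looks like the following; the script name and the offset values are just placeholders for illustration:

    using UnityEngine;

    // Sketch: parent this object to the main camera so it keeps a fixed offset
    // in the user's view. The offset values are arbitrary examples.
    public class AttachToCamera : MonoBehaviour
    {
        [SerializeField] private Vector3 localOffset = new Vector3(0.2f, 0.15f, 1.0f);

        void Start()
        {
            // In an MRTK scene, Camera.main is the Main Camera under the MixedRealityPlayspace.
            transform.SetParent(Camera.main.transform, worldPositionStays: false);
            transform.localPosition = localOffset;
            transform.localRotation = Quaternion.identity;
        }
    }

Because the object is parented to the camera, it follows every head movement automatically and stays at the same place in the user's view.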

HoloLens companion map

I am implementing a "companion map" for a HoloLens application using Unity and Visual Studio. My vision is for a small rectangular map to be affixed to the bottom right of the HoloLens view, and to follow the HoloLens user as they move about, much like the display of a video game.
At the moment my "map" is a .jpeg made into a material and put on an upright plane. Is there a way for me to affix the plane such that it is always in the bottom right of the user's view, as opposed to being fixed in the 3D space that the user moves through?
The Orbital Solver in MRTK can implement this idea without even writing any code. It can lock the map to a specified position and offset it from the player.
To use it what you need to do is:
Add Orbital Script Component to your companion map.
Modify the Local Offset and World Offset properties to keep the map in the bottom right of the user's view.
Set the Orientation Type to Face Tracked Object.
In addition, the SolverExamples scene provided by the MRTK v2 SDK is an excellent starting point for becoming familiar with the Solver components.
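The same configuration can also be applied from a script. The sketch below assumes the MRTK v2 Orbital/SolverHandler API; the exact namespaces, property names, and offset values may differ between MRTK versions, so treat it as a starting point rather than a drop-in script:

    using UnityEngine;
    using Microsoft.MixedReality.Toolkit.Utilities;
    using Microsoft.MixedReality.Toolkit.Utilities.Solvers;

    // Sketch: configure an Orbital solver on the companion map so it stays at an
    // offset towards the bottom right of the user's view. Offset values are examples.
    public class CompanionMapSetup : MonoBehaviour
    {
        void Start()
        {
            // The SolverHandler decides what the solver tracks; here, the user's head.
            var handler = gameObject.AddComponent<SolverHandler>();
            handler.TrackedTargetType = TrackedObjectType.Head;

            // Orbital keeps the object at an offset from the tracked head pose.
            var orbital = gameObject.AddComponent<Orbital>();
            orbital.LocalOffset = new Vector3(0.25f, -0.2f, 1.0f); // right, down, and in front of the head
            orbital.OrientationType = SolverOrientationType.FaceTrackedObject;
        }
    }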

How do I force orbital movement around a point in space, while maintaining forward/backwards capability on a player?

Long-winded question out of the way, I'll provide a diagram of what I am going for:
The red square represents the character, the blue rectangle represents the camera, the green dot represents the center of the "stage", and the black circle is the stage itself.
What I desire is to essentially lock the player's movement around the "center" of the stage, so that anytime you move left or right you are more or less rotating around said center. However, I also want the player to be able to move forwards and backwards to/from the center as well. Keep in mind I want the camera to always stay directly behind the player. I have tried many different methods, and the latest is the following:
I took a default actor, attached a spring arm, attached a child actor to that (which gets possessed to become the playable character), attached another spring arm, and finally the camera to that. I then added the Blueprint code to the first spring arm so that it was the one being controlled by the left/right controls. However, upon hitting play, the only thing that moves is the camera, and it can only move forwards and backwards.
I'm admittedly pretty new to Unreal Blueprints, so any help would be appreciated.
Alright, I figured it out.
Here's the setup needed if anyone else wants something similar.
For the player themselves, you'll need something like this:
The important thing is to center the root mesh where you want to rotate around. The spring arm's target arm length is what gets adjusted for the player mesh movement, giving the illusion that you are physically moving the character. The second spring arm isn't necessary unless you wish to have more control over the camera-to-player distance.
For the rotation Blueprint, you'll need this:
The target is whatever you named the root mesh (mine was called Center). Just drag and drop it from the hierarchy.
For the forward/backward movement, you'll need this:
The target is what you named the spring arm. (I left mine as the default "SpringArm") Again, just drag and drop it from the hierarchy.
Adjustments in Project Settings:
Yes, my inputs are backwards from what you'd expect. I felt it was quicker just to reverse the inputs instead of adjusting whatever was causing the movement to be backwards in the first place (it's probably just the sphere orientation). Also, you'll notice I have the W and S inputs set to 5 and -5 instead of 1 and -1, because the movement was too slow otherwise. I'm sure there's a fix that doesn't involve changing the input axis scale, but honestly I won't really have a reason to alter the values at any point in my project. If it ever comes up, I'm sure there's a way to change the values from within Blueprints anyway.
End result:
End Result Video
If I remember correctly, child actor components are a bit different from other components in that they are transformation independent, that is they do not update their transformation when their parent component moves around.
I find it a bit strange that you would separate your player actor and the camera component. Normally, the player "pawn" contains the mesh and camera components for one player.
I would suggest you do the following:
Create a player actor (e.g. a "pawn" or "character" class)
Create the following component hierarchy:
Root Scene -> Spring Arm -> Skeletal or Static Mesh -> Spring Arm -> Camera
Your root scene is the green center in your drawing. You can then basically use the blueprint you already have to rotate and move your player.

Keep object in place when AR track is found

I'm pretty new to Vuforia and Unity. What I'm looking for is to have a full-screen/in-place image/object (without movement) displayed when the target is tracked, and to keep it on screen until tracking is lost.
I was thinking of making the image target a child of the AR camera (so it keeps its relative position to the camera) and putting my image on top of it (a sample cube for now).
This gives me the correct positional result (the image is not moving), but there are some rendering issues (when I run it, the object blinks).
Any comments and suggestions would be really helpful.
The way Vuforia works is that the camera is manipulated so that it has the same position relative to the target as your device has to the physical target in the real world.* Making one a child of the other won't allow this to happen. The whole point is to replace the "marker" with the 3D object so that it looks like it actually exists in the real world, moving on screen while appearing fixed in place as the user moves their device around.
If you want something to be displayed relative to the screen, have it check whether the target is currently detected and enabled, and enable or disable itself depending on that state. This object can then be a child of the camera (so it doesn't move).
*Alternatively it moves the object around relative to the camera.
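A minimal C# sketch of that second approach is below. The script name and the ShowOverlay/HideOverlay methods are hypothetical; wire them up to whatever "target found"/"target lost" callbacks your Vuforia version exposes (for example, the events on the sample trackable event handler), since that wiring is version-specific:

    using UnityEngine;

    // Sketch of the second approach: the overlay lives as a child of the AR camera
    // (so it never moves relative to the screen) and is simply shown or hidden
    // depending on whether the image target is currently tracked.
    public class OverlayVisibility : MonoBehaviour
    {
        [SerializeField] private GameObject overlay; // the cube/image parented to the AR camera

        // Call this from your "target found" callback (version-specific wiring).
        public void ShowOverlay()
        {
            overlay.SetActive(true);
        }

        // Call this from your "target lost" callback.
        public void HideOverlay()
        {
            overlay.SetActive(false);
        }
    }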

Cocos2d - follow a path when dragging an image

I am trying to build a game using cocos2d where the user is able to drag an object along a path to a end goal. I have gotten the drag and drop to work but I am not sure how to implement the path so that when the user goes off the path it dies.
Any idea how to implement this so that I can have multiple levels?
From my understanding of your problem you don't want to do drag and drop at all.
Instead make the movement along the path an action and use the user's touch to set how far along that action they are. That way (like you would scrub backwards and forwards with a video) the user can scrub the object's action along its path.
I would first of all get your object moving along the path you desire using CCBezierTo or whatever. When you are happy with the results start the touch actions.
When the user touches the object, you want to know how far along the path they are, where 0 is the start and 1 is the end. Then move that number towards 1 as the touch moves closer to the end point, and towards 0 as it moves closer to the start point.
If your path is fairly straight, you can probably just do a simple calculation of where the touch is on a straight line between the start and end points. If it's a complicated curve, say going in circles, you will have some hard work/trig to do!
Then, as the touch moves, you will need to update the object's position to the new location on the path. Say you have worked out that the touch is now at 0.75 along your path; you then need to work out where the object should be at 0.75 * duration. You may need to extend or add a category to your action to allow you to set the position for a given elapsed value.
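The maths for the straight-line case is engine-agnostic; since the earlier answers on this page use Unity-style C#, here is a small sketch in that style purely to illustrate it (the class and method names are made up; a cocos2d version would do the same with CGPoint values from its touch callbacks):

    using UnityEngine;

    // Sketch of the straight-line case described above: work out how far along the
    // start->end segment the touch is (0 = start, 1 = end) and place the object at
    // that fraction. Written in Unity-style C# only for consistency with the rest
    // of this page; in cocos2d the same math applies to CGPoint touch locations.
    public static class PathScrubMath
    {
        // Normalized progress (0..1) of 'touch' projected onto the start->end segment.
        public static float ProgressAlongSegment(Vector2 start, Vector2 end, Vector2 touch)
        {
            Vector2 segment = end - start;
            float lengthSquared = segment.sqrMagnitude;
            if (lengthSquared < 1e-6f) return 0f; // degenerate segment

            float t = Vector2.Dot(touch - start, segment) / lengthSquared;
            return Mathf.Clamp01(t); // clamping keeps the object on the path
        }

        // The position on the path for a given progress value.
        public static Vector2 PointOnSegment(Vector2 start, Vector2 end, float t)
        {
            return Vector2.Lerp(start, end, t);
        }
    }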
Hope this steers you in a possible direction!