Send touch drag to a GameObject using TouchScript in Unity

I'm building a standalone Windows 7 Unity3D application for use with a multitouch IR frame.
I usually use TouchScript with no problems, but now I have a new problem I don't know how to solve.
I have a texture showing a uWebKit web page, and I need to send touch drags to that GameObject so I can pan a map and zoom.
I can send touch down and up events, but I don't know how to pass the move.
I'm fine with using a solution other than TouchScript if it works well.
Thank you.
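A minimal sketch of catching the move part of a drag with Unity's built-in Input API (acceptable here since solutions other than TouchScript are fine): it reads the Moved phase and the per-frame delta. The call that actually injects that movement into the uWebKit page is version-specific and is only marked with a comment below.

using UnityEngine;

// Detects touch movement with Unity's Input API and computes the per-frame drag delta.
// Forwarding the delta into the uWebKit texture depends on the uWebKit version and is
// only indicated by a TODO comment here.
public class TouchDragForwarder : MonoBehaviour
{
    void Update()
    {
        for (int i = 0; i < Input.touchCount; i++)
        {
            Touch t = Input.GetTouch(i);
            if (t.phase == TouchPhase.Moved)
            {
                Vector2 delta = t.deltaPosition; // movement since the last frame, in pixels
                // TODO: forward t.position and delta to the uWebKit view here, e.g. by
                // converting the hit point on the quad to browser coordinates and calling
                // whatever mouse/drag injection method your uWebKit version exposes.
                Debug.Log("Drag at " + t.position + " delta " + delta);
            }
        }
    }
}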

Related

onbuttonclick - Unity with Bolt

I am using Unity with Bolt visual scripting and having some issues. I have a 2D mobile app, and basically what I want is that when the user touches a sprite, it starts spinning on every axis. I am able to achieve the spinning from the Update function, but only without the user touching the sprite.
I want to trigger it through a touch on mobile. How can I do this? There is no On Button Click event for a sprite.
Well, I solved this by making a UI Button and changing its Source Image to the particular shape's .png, and then I was able to use the On Click function.
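For comparison, a minimal C# sketch of the same behaviour without the button workaround, assuming the sprite has a Collider2D and the scene uses an orthographic camera tagged MainCamera (an equivalent Bolt graph would use the same Unity calls as nodes):

using UnityEngine;

// When a touch begins on this sprite's Collider2D, start spinning it on every axis.
public class SpinOnTouch : MonoBehaviour
{
    public float speed = 180f;   // degrees per second, per axis
    bool spinning;

    void Update()
    {
        if (Input.touchCount > 0 && Input.GetTouch(0).phase == TouchPhase.Began)
        {
            // Assumes an orthographic 2D camera tagged MainCamera.
            Vector2 world = Camera.main.ScreenToWorldPoint(Input.GetTouch(0).position);
            Collider2D hit = Physics2D.OverlapPoint(world);
            if (hit != null && hit.gameObject == gameObject)
                spinning = true;
        }

        if (spinning)
            transform.Rotate(speed * Time.deltaTime, speed * Time.deltaTime, speed * Time.deltaTime);
    }
}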

Unity | disable EventSystem module input for one GameObject

I have an EventSystem with two input modules (gaze, for Cardboard, and touch). The gaze input is above the touch input, so Unity uses it as the main input module. Now I have one object that I want to trigger on touch input, but that is not working because of the gaze input. So my question is: is it possible to disable the gaze input just for this one GameObject?
EDIT: the object is a menu button, located in the bottom-right corner. It moves with the camera.
thanks
Use a raycast: when your camera is looking at the GameObject on which you want to use touch input, detect that via the raycast.
Attach a script (to the camera or an empty GameObject) that holds references to both input modules.
When the player looks at the object (detected through the raycast), simply disable the gaze input.
When the player looks away from that GameObject, enable the gaze input again. A minimal sketch of this follows.
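A minimal sketch of that approach, assuming the menu button has a Collider the physics raycast can hit and that the gaze input module is assigned in the Inspector; while the gaze module is disabled, the EventSystem falls back to the touch module below it:

using UnityEngine;
using UnityEngine.EventSystems;

// Disables the gaze input module while the camera looks straight at the menu button,
// so the touch input module underneath it handles the input instead.
public class GazeInputToggle : MonoBehaviour
{
    public BaseInputModule gazeModule;  // assign the gaze (Cardboard) input module here
    public Transform menuButton;        // the menu button; needs a Collider for this raycast

    void Update()
    {
        Ray ray = new Ray(Camera.main.transform.position, Camera.main.transform.forward);
        RaycastHit hit;
        bool lookingAtButton = Physics.Raycast(ray, out hit, 100f) && hit.transform == menuButton;

        // Gaze input is off while looking at the button, on otherwise.
        gazeModule.enabled = !lookingAtButton;
    }
}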

Does the EventTrigger component only work with the UI system?

I'm new to Unity and using version 4.6.
I have a prefab which is just a sprite, and I instantiate three of them.
I want to receive touch and mouse events from those instances.
So I added an EventTrigger component to them from a C# script and added a click event.
I also added IPointerClickHandler and implemented it.
But it never gets any event. What am I missing?
Does the EventTrigger only work with the UI system? I did not add any UI panel, canvas, etc. (instead I added an empty GameObject and attached the component to it).
If this is not a good way to do it, please point me in the right direction.
Thanks in advance.
Below is for mobile devices.
If you want to do in-game controls without UI, you can use the Input class.
The Unity documentation covers TouchPhase and lists all of the touch phases, and it has good example code, so you should check it; a minimal polling loop is sketched below.
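A minimal sketch of that approach: poll Input every frame and branch on each touch's phase.

using UnityEngine;

// Polls the active touches each frame and branches on their phase.
public class TouchInputExample : MonoBehaviour
{
    void Update()
    {
        for (int i = 0; i < Input.touchCount; i++)
        {
            Touch touch = Input.GetTouch(i);
            switch (touch.phase)
            {
                case TouchPhase.Began:
                    // a finger touched the screen at touch.position
                    break;
                case TouchPhase.Moved:
                    // the finger moved by touch.deltaPosition since the last frame
                    break;
                case TouchPhase.Ended:
                case TouchPhase.Canceled:
                    // the finger was lifted or the touch was interrupted
                    break;
            }
        }
    }
}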
For the event system to work there must also be an EventSystem object in your scene, and for non-UI objects such as sprites the camera also needs a raycaster component (for example a Physics2DRaycaster) plus a collider on the object, otherwise EventTrigger and IPointerClickHandler never receive events; a short example follows.
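A minimal sketch of handling clicks on the sprite directly through the event interfaces, assuming an EventSystem exists in the scene, a Physics2DRaycaster (or PhysicsRaycaster for 3D colliders) is on the camera, and the sprite has a Collider2D:

using UnityEngine;
using UnityEngine.EventSystems;

// Receives clicks/taps routed by the EventSystem; no EventTrigger component is needed.
public class SpriteClickHandler : MonoBehaviour, IPointerClickHandler
{
    public void OnPointerClick(PointerEventData eventData)
    {
        Debug.Log("Clicked " + gameObject.name + " at " + eventData.position);
    }
}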
You can also do it the easy way: in the editor, add a Collider2D to the sprite.
Then write this in a script and attach it to the GameObject.
void OnMouseDown() {
    // called when the collider on this GameObject is clicked or tapped
    // do something
}
Unity also converts touches into mouse events on touch devices, so you may not need to worry about handling simple taps separately.

Handling GearVR Touchpad input?

When I add an OVRPlayerController into a Unity3D scene, then build and run the scene for the GearVR, the built-in touchpad spins the camera around the vertical axis, which is redundant with head tracking. What do I need to change so that the touchpad instead moves the camera forward and backward, as if walking? Is there a thorough tutorial?
The Oculus SDK 0.4.3 comes with support for the Samsung GearVR gamepad.
All you need to do:
import the SDK.
overwrite the ProjectSettings folder of your project with the one that comes with the SDK.
add the OVRPlayerController to your scene.
add a GameObject below the OVRPlayerController, e.g. a plane or quad; this will act as the ground (and keep the player from falling).
add a collider to that GameObject, e.g. a mesh collider.
Once you run it you will see that you can move around using the gamepad as well as turn the camera around the vertical axis.
Basically, any first-person-shooter tutorial for Unity3D applies, and because the Oculus SDK comes with gamepad support you can get this working quickly; a touchpad-movement sketch follows the link below.
this link might help
https://www.youtube.com/watch?v=mbm9lPB5GPw
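For the original question about the headset touchpad, a hedged sketch is below. It assumes the GearVR touchpad reaches Unity as mouse input (a tap-and-hold reads as mouse button 0), which is how it commonly appears, and that the controller's touchpad-driven yaw has been disabled (how depends on the SDK version). Attach it to the player rig.

using UnityEngine;

// While the headset touchpad is held, move the player rig forward along the camera's
// look direction, flattened to the horizontal plane, to give a simple walking motion.
public class TouchpadWalk : MonoBehaviour
{
    public Transform head;      // the tracked camera (e.g. the centre eye anchor)
    public float speed = 2f;    // metres per second

    void Update()
    {
        if (Input.GetMouseButton(0))   // assumed: a touchpad hold shows up as mouse button 0
        {
            Vector3 forward = head.forward;
            forward.y = 0f;            // keep the movement horizontal
            transform.position += forward.normalized * speed * Time.deltaTime;
        }
    }
}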

How to create a controls script in Unity 4.6?

How can I make a control script for Unity3D 4.6? For example, rotating an element with the mouse and moving it with the keyboard.
I imagine that you are using uGUI.
In that case, you should control the rotation of the RectTransform:
this.gameObject.GetComponent<RectTransform>().rotation
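Building on that, a minimal sketch that sits on the uGUI element itself: horizontal mouse movement rotates it around its Z axis, and the Horizontal/Vertical axes (arrow keys or WASD) move it.

using UnityEngine;

// Rotate the uGUI element with the mouse and move it with the keyboard axes.
public class ControlScript : MonoBehaviour
{
    public float rotateSpeed = 90f;   // degrees per second per unit of mouse movement
    public float moveSpeed = 200f;    // UI units per second

    RectTransform rect;

    void Start()
    {
        rect = GetComponent<RectTransform>();
    }

    void Update()
    {
        // Mouse X rotates the element around its Z axis.
        rect.Rotate(0f, 0f, -Input.GetAxis("Mouse X") * rotateSpeed * Time.deltaTime);

        // Horizontal/Vertical (arrow keys or WASD) move the element.
        Vector2 move = new Vector2(Input.GetAxis("Horizontal"), Input.GetAxis("Vertical"));
        rect.anchoredPosition += move * moveSpeed * Time.deltaTime;
    }
}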