Changing the value of a joystick's Y axis - autohotkey

I'm currently trying to change the value of my joystick's (T.16000M) Y axis using AutoHotKey.
A little background first: I'm configuring some voice commands for Star Citizen, and I'm trying to write a script that would activate decoupled mode (so that I have 6 degrees of freedom), after which I would potentially adjust my ship's pitch by a full 180 degrees along the Y axis. I've discovered that I can't exactly do this using VoiceAttack alone, because within the game, only raw mouse input is accepted, and I cannot figure out how to simulate raw mouse input with either VoiceAttack or AutoHotKey.
Therefore, my next best solution would be to change the value of my joystick's Y axis, if this is at all possible in AutoHotKey. I've done tons of googling, and I've searched the entire documentation for any clue on how to achieve my goal.
I understand that JoyX, JoyY, JoyZ, etc., can be used to map joystick input to some other sort of control, like in this script that maps joystick input to the mouse cursor.
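For reference, reading an axis (as opposed to changing it) is straightforward; this minimal sketch just polls the Y axis, which AutoHotkey reports as a value from 0 to 100:
Loop
{
    joyY := GetKeyState("JoyY")   ; 0 = full up, 100 = full down, roughly 50 = centered
    ToolTip, Y axis: %joyY%
    Sleep 50
}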
I hope this post is appropriate for StackOverflow; I wasn't exactly sure where else to put it. Maybe superuser?
Anyway, thanks for your help.
By the way, here is the code I was planning to use for the mouse:
MouseMove, 0, 500, 10, R   ; move the cursor 500 px down, relative to its current position
MouseMove, 0, -500, 10, R  ; move it 500 px back up
This works in Windows, but not in the game, because the game only accepts raw mouse input.

To simulate raw mouse movement with AutoHotkey, use DllCall("mouse_event", ...); Google it, there are tons of examples (a short sketch is also included below).
To get VoiceAttack-like functionality in AHK, see HotVoice.
It is absolutely impossible to alter a physical joystick's axes in AHK the way you can alter keyboard input. You need to create a separate virtual joystick and send output to that instead; see CvJoyInterface (a sketch of this is below as well).
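A minimal sketch of the raw mouse movement call, assuming AutoHotkey v1 (0x0001 is MOUSEEVENTF_MOVE, i.e. a relative move; the step count and delay are arbitrary, and the example mirrors the MouseMove calls in the question):
RawMouseMove(dx, dy, steps := 10, delayMs := 10) {
    Loop %steps%
    {
        ; mouse_event with MOUSEEVENTF_MOVE injects relative movement, which raw-input games generally accept
        DllCall("mouse_event", "UInt", 0x0001, "Int", dx // steps, "Int", dy // steps, "UInt", 0, "UPtr", 0)
        Sleep %delayMs%
    }
}

RawMouseMove(0, 500)    ; pitch down
RawMouseMove(0, -500)   ; and back up

And a minimal sketch of the virtual joystick approach, assuming the vJoy driver and evilC's CvJoyInterface library are installed (the include path, device number, and 0-32768 axis range below are assumptions to check against the library's documentation):
#Include CvJoyInterface.ahk

vJoyInterface := new CvJoyInterface()
if (!vJoyInterface.vJoyEnabled()) {
    MsgBox, vJoy driver not found.
    ExitApp
}

myStick := vJoyInterface.Devices[1]   ; acquire virtual stick #1

myStick.SetAxisByName(16384, "Y")     ; roughly centered
myStick.SetAxisByName(32768, "Y")     ; push the virtual Y axis to its maximum

The game is then bound to the virtual stick instead of the physical T.16000M, and the script moves the virtual axis on command.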

Related

Is it possible to change the buttons in inputManager of Unity3D's gamesettings?

I am developing a 2D platform game, and I am going to implement a function, let's use F to represent it, which reverses a player's controls. For example, if pressing A originally makes the character go left, then after I use F, pressing A makes the character go right.
The problem is that in my game there will be two players using one keyboard to control two different characters. Both characters use
float moveX = Input.GetAxis("HorizontalP" + player_number);
player_position = this.transform.position;
player_position.x += moveX * speed * Time.deltaTime;   // apply the horizontal input (speed is the character's move speed)
this.transform.position = player_position;
which uses the buttons set up in the Input Manager.
I can't use lots of if/else branches to get the reverse-control function, so I tried to find a way to change the input button mapping from a script while the game is running. However, I couldn't find any answers that handle that. Is there any solution for this?
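One common workaround, sketched here only as an illustration (PlayerMovement, speed, and ReverseControls are illustrative names, not part of the original code): keep a per-player direction multiplier and flip its sign instead of remapping any buttons in the Input Manager.
using UnityEngine;

public class PlayerMovement : MonoBehaviour
{
    public int player_number = 1;
    public float speed = 5f;
    int direction = 1;                 // 1 = normal controls, -1 = reversed

    void Update()
    {
        // the same axis read as above, multiplied by the current direction
        float moveX = Input.GetAxis("HorizontalP" + player_number) * direction;
        Vector3 player_position = this.transform.position;
        player_position.x += moveX * speed * Time.deltaTime;
        this.transform.position = player_position;
    }

    // call this from the F power-up to reverse (or restore) this player's controls
    public void ReverseControls()
    {
        direction = -direction;
    }
}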

Unreal Engine 4 - Add offset to character movement

I just started (yesterday) using Unreal Engine, and I need to simulate a drunk character using Blueprints.
I'm using two camera shakes (one for standing still and one for walking), but I want to add some "displacement" to the character when he's walking.
Basically, I want to define a random float to be added to the X-axis location in order to make the character wobble smoothly.
It would even be acceptable if there's a way to make the character move along with the camera when it's shaking.
What I've tried so far is using AddActorLocalOffset and a timeline to lerp between the actor's location and the actor's location plus an offset, but both look very choppy to me.
Maybe it's a noob question, but as I said, I'm very new to this and need it for a quick job.
Any suggestion?
Thanks
If you are targeting a physically correct model, you should use AddForce (UE Docs). But this approach would require implementing a "drunk animation", where your character modifies its movement animation to "compensate" for this force by stepping aside, etc.
Another (much simpler) approach is to use AddMovementInput. An example can be seen here: UE Answers. In this case, you basically simulate the player's input by adding a small amount of sideways input here and there.
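The linked answer does this with Blueprint nodes; the same idea in C++ looks roughly like the sketch below (ADrunkCharacter, the sway range, and the interpolation speed are illustrative assumptions, not from the original answer):
#include "GameFramework/Character.h"
#include "DrunkCharacter.generated.h"

UCLASS()
class ADrunkCharacter : public ACharacter
{
    GENERATED_BODY()

public:
    ADrunkCharacter()
    {
        PrimaryActorTick.bCanEverTick = true;
    }

    virtual void Tick(float DeltaSeconds) override
    {
        Super::Tick(DeltaSeconds);

        // pick a small random sideways strength and smooth towards it so the wobble is not choppy
        const float TargetSway = FMath::FRandRange(-0.3f, 0.3f);
        CurrentSway = FMath::FInterpTo(CurrentSway, TargetSway, DeltaSeconds, 2.0f);

        // AddMovementInput blends with the player's own input, so the drift feels organic
        AddMovementInput(GetActorRightVector(), CurrentSway);
    }

private:
    float CurrentSway = 0.0f;
};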

Mouse position not consistent with HTML canvas ondrag

I am trying to drag some shapes in an HTML canvas, but I am encountering a problem with determining the change in mouse coordinates [dx, dy].
First of all, there is no problem with the coordinates themselves, stored in mousePos, as the rollover effects work flawlessly. What I am doing is saving the mouse coordinates upon first entering the shape:
pos = {x : mousePos[0] , y : mousePos[1]};
Then, onMotion updates the deltas every time the mouse moves, as well as recording the current position:
dx=mousePos[0]-pos.x;
dy=mousePos[1]-pos.y;
pos = {x : mousePos[0] , y : mousePos[1]};
Then I add the dx and dy values to the shape's coordinates (let's take a simple rectangle as an example):
ctx.fillRect(0 + this.dx, 0 + this.dy, 100, 100);   // fillRect takes (x, y, width, height)
As long as the mouse doesn't move too fast, it works relatively well (though not perfectly). If I move the mouse very quickly, without going out of the window, the rectangle does not catch up with the mouse. I can understand a delay in catching up to the mouse, but how can the delta values be off? Clearly we know where we started, and even if dozens or hundreds of pixels are skipped in the process, eventually the mouse should stop and the correct delta values should be calculated.
Any help would be greatly appreciated as I have hit a conceptual wall here.
You might try reading e.layerX/e.layerY when onMotion is fired to get the real position instead of the delta. That way it can't be "off".
To use this, place your shape into a div with style="padding:0px;margin:0px;", because the position is relative to the parent block.
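A minimal sketch of that idea: recompute the absolute mouse position on every move instead of accumulating deltas, so skipped events can never leave the shape out of sync. (This version uses getBoundingClientRect rather than e.layerX/Y, which avoids the padding/margin caveat; the canvas id and the 100x100 rectangle are assumptions.)
const canvas = document.getElementById("canvas");
const ctx = canvas.getContext("2d");

let dragging = false;
let grabOffset = { x: 0, y: 0 };   // where inside the rectangle the user grabbed it
let rectPos = { x: 0, y: 0 };
ctx.fillRect(rectPos.x, rectPos.y, 100, 100);

function mousePos(e) {
  // absolute position relative to the canvas, recomputed on every event
  const r = canvas.getBoundingClientRect();
  return { x: e.clientX - r.left, y: e.clientY - r.top };
}

canvas.addEventListener("mousedown", (e) => {
  const m = mousePos(e);
  if (m.x >= rectPos.x && m.x <= rectPos.x + 100 && m.y >= rectPos.y && m.y <= rectPos.y + 100) {
    dragging = true;
    grabOffset = { x: m.x - rectPos.x, y: m.y - rectPos.y };
  }
});

canvas.addEventListener("mousemove", (e) => {
  if (!dragging) return;
  const m = mousePos(e);
  rectPos = { x: m.x - grabOffset.x, y: m.y - grabOffset.y };
  ctx.clearRect(0, 0, canvas.width, canvas.height);
  ctx.fillRect(rectPos.x, rectPos.y, 100, 100);
});

canvas.addEventListener("mouseup", () => { dragging = false; });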

Setting up a power meter in cocos2d

I am a straight noob. Everyone else says it, but I'm dead serious.
My question is: what is the best way to make a power meter to move an object? Meaning, how do I set it up so that the longer the player holds, the more power they get? Also, how would I incorporate physics?
What I'd like to accomplish is to have the player holding onto something, so that when he taps and holds on the screen he powers up, and when he lets go he throws the object a certain distance.
Just checking whether there is a touch sequence or not is rather easy: you just have to override two functions in your scene class, one to inform you whenever a touch sequence begins and one to tell you the touch has ended. A source code example is described in this link. After that, I think you need a gauge to show how much power has been gathered so far. The easiest way is to use a texture with full power shown in it, set it as the sprite's texture, and then reveal it little by little as the power goes up, as in the code below:
// to create the gauge with zero power
CCSprite *s = [CCSprite spriteWithTexture:[[CCTextureCache sharedTextureCache] addImage:@"gauge.png"] rect:CGRectMake(0, 0, 0, 10)];
// and then whenever the power changes you call this method
[s setTextureRect:CGRectMake(0, 0, power, 10)];
Note that in my code I am using a 100x10 texture (power is something between 0 and 100, and the texture height is 10, which is the last parameter in both CGRectMake calls).
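A minimal sketch of the two touch callbacks mentioned above, assuming a CCLayer subclass with isTouchEnabled set to YES, a float ivar power, the gauge sprite from the code above (here called gauge), and a hypothetical throwObjectWithPower: method of your own for the physics part:
- (void)ccTouchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    // touch started: begin charging, adding power every 1/60 s while the finger is down
    [self schedule:@selector(chargePower:) interval:1.0 / 60.0];
}

- (void)chargePower:(ccTime)dt {
    power = MIN(power + 60.0f * dt, 100.0f);            // clamp at full power (0..100)
    [gauge setTextureRect:CGRectMake(0, 0, power, 10)];
}

- (void)ccTouchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    // touch ended: stop charging and throw with a force proportional to the gathered power
    [self unschedule:@selector(chargePower:)];
    [self throwObjectWithPower:power];
    power = 0;
}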

Graphing x-y coordinates from a mouse with Processing

I've found this really cool site on interfacing an Arduino to an optical mouse to read out x-y readings from it. I've done it, and it's working nicely.
Then I was thinking, 'Why not plot all of this as a graph?', and I came across Processing.
I am aware that Processing has an example named 'MouseSignal'.
This example is the EXACT thing I want to write with Processing. The only change is that I want to use the x-y coordinates from the mouse attached to the Arduino and have Processing generate a 'real-time' graph of the coordinates.
Thanks!
Change the spot in the code where it says:
xvals[width-1] = mouseX;
yvals[width-1] = mouseY;
Replace mouseX and mouseY with the values coming from the Arduino. You may need to scale these values to fit within the axes.
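A minimal sketch of that change, assuming the Arduino prints each reading as an "x,y" line over serial at 9600 baud (the port index 0 and the 0-1023 value range are assumptions to adjust for your setup):
import processing.serial.*;

Serial port;
int[] xvals, yvals;
int arduinoX, arduinoY;   // latest values received from the Arduino

void setup() {
  size(640, 240);
  xvals = new int[width];
  yvals = new int[width];
  port = new Serial(this, Serial.list()[0], 9600);
  port.bufferUntil('\n');
}

void serialEvent(Serial p) {
  String line = p.readStringUntil('\n');
  if (line == null) return;
  String[] parts = split(trim(line), ',');
  if (parts.length < 2) return;
  // scale the readings so they fit inside the window, as suggested above
  arduinoX = (int) map(float(parts[0]), 0, 1023, 0, height);
  arduinoY = (int) map(float(parts[1]), 0, 1023, 0, height);
}

void draw() {
  // shift the history left and append the newest readings, as in the MouseSignal example
  for (int i = 1; i < width; i++) {
    xvals[i - 1] = xvals[i];
    yvals[i - 1] = yvals[i];
  }
  xvals[width - 1] = arduinoX;
  yvals[width - 1] = arduinoY;

  background(0);
  for (int i = 1; i < width; i++) {
    stroke(255);                                               // white trace: X
    line(i - 1, height - xvals[i - 1], i, height - xvals[i]);
    stroke(0, 255, 0);                                         // green trace: Y
    line(i - 1, height - yvals[i - 1], i, height - yvals[i]);
  }
}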