Google Glass: Not all axes seem to work using Accelerometer

I'm trying to make a simple Glassware app that displays the accelerometer data on the screen. The code I am posting works on my Android phone, but when I use it on Google Glass, only the Y axis seems to work correctly (looking up and down).
When I say the Y axis, it's actually the third entry of the SensorEvent event.values[] array (instead of the second entry - see code).
All TextViews DO display a float value, just as they do on my Android phone, but the values only change slightly (around 0.1 to 0.2) when I move my head.
import android.app.Activity;
import android.content.Context;
import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;
import android.hardware.SensorManager;
import android.os.Bundle;
import android.widget.TextView;

public class SensorActivity extends Activity implements SensorEventListener {
    SensorManager mSensorManager;
    Sensor mAccel;
    TextView tvx, tvy, tvz;

    @Override
    protected void onCreate(Bundle bundle) {
        super.onCreate(bundle);
        setContentView(R.layout.main);
        tvx = (TextView) findViewById(R.id.tvx2);
        tvy = (TextView) findViewById(R.id.tvy2);
        tvz = (TextView) findViewById(R.id.tvz2);
        mSensorManager = (SensorManager) getSystemService(Context.SENSOR_SERVICE);
        mAccel = mSensorManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER);
    }

    @Override
    protected void onPause() {
        super.onPause();
        mSensorManager.unregisterListener(this);
    }

    @Override
    protected void onResume() {
        super.onResume();
        mSensorManager.registerListener(this, mAccel, SensorManager.SENSOR_DELAY_NORMAL);
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) {
    }

    @Override
    public void onSensorChanged(SensorEvent event) {
        float x = event.values[0];
        float y = event.values[1];
        float z = event.values[2];
        tvx.setText(Float.toString(x));
        tvy.setText(Float.toString(y));
        tvz.setText(Float.toString(z));
    }
}
The Glass Documentation pretty much points to the Android documentation, so I am assuming they should be the same.
Any insight would be greatly appreciated.
**** NEW UPDATE ****
Now, all of a sudden, the Y-axis is reading the Y direction (and the Z-axis is too). The main problem is that the Y-axis reading starts at a value of about 9 and goes to 0 in either direction (there is no negative direction), so determining where the user is looking is an issue. I am aware I could treat the Z-axis as the Y-axis, but that would just be masking a different issue.
It also turns out that the X-axis registers head tilt and NOT head turn (which is not in accordance with the documentation). I haven't found any documentation on calibrating the accelerometer (if that's even possible).
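The "starts at 9 and goes to 0 in either direction" behaviour is what you'd expect if the axis is mostly reading gravity (~9.81 m/s²): its magnitude alone cannot distinguish up from down. If what you ultimately want is a signed head-pitch angle, one option is to combine two axes with atan2. A minimal sketch in plain Java (assuming the device is roughly still so the accelerometer measures mostly gravity; the axis names here are illustrative, not tied to Glass's actual axis mapping):

```java
public class PitchFromGravity {
    // Derive a signed pitch angle (degrees) from two accelerometer axes.
    // A single axis reads ~9.81 when level and falls to 0 when tilted
    // 90 degrees either way, so it cannot tell "up" from "down";
    // atan2 over two perpendicular axes recovers a signed angle.
    public static double pitchDegrees(float forward, float up) {
        return Math.toDegrees(Math.atan2(forward, up));
    }

    public static void main(String[] args) {
        System.out.println(pitchDegrees(0f, 9.81f));   // level: 0.0
        System.out.println(pitchDegrees(9.81f, 0f));   // straight up: ~90
        System.out.println(pitchDegrees(-9.81f, 0f));  // straight down: ~-90
    }
}
```

In practice you would also low-pass filter the readings first, since head motion adds linear acceleration on top of gravity.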

Related

Sprite Component exits screen in flame flutter

I am completely new to Flame and game development. I know you can control components with the camera and viewport, but my component exits the screen and I don't know how to stop it from moving forward when it reaches the screen edge.
Any idea how?
You don't want to control how your SpriteComponents move with the camera; the camera moves what you are looking at in the "world", and you want to move your component in the world.
If you don't move the camera, the solution can be to just check whether the component is still within the screen by using a ScreenHitbox and the CollisionDetection system. (You could also just check the bounding box against the screen size.)
Something like this, where it just moves back to the position it was in before colliding with the screen, should work:
class YourGame extends FlameGame with HasCollisionDetection {
  @override
  Future<void> onLoad() async {
    add(ScreenHitbox());
  }
  ...
}
class YourComponent extends SpriteComponent with CollisionCallbacks {
  final Vector2 lastPosition = Vector2.zero();

  @override
  Future<void> onLoad() async {
    ...
    // Adds a hitbox that covers the size of your component,
    // so make sure that you have the size set.
    add(RectangleHitbox());
  }

  ...

  @override
  void update(double dt) {
    lastPosition.setFrom(position);
    // Update your position etc. here
  }

  @override
  void onCollisionStart(Set<Vector2> intersectionPoints, PositionComponent other) {
    super.onCollisionStart(intersectionPoints, other);
    // Move back to the position from before the collision with the screen.
    position.setFrom(lastPosition);
  }
}
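If you prefer the simpler bounding-box check mentioned above over the collision system, the math is just clamping the position so the component's rectangle stays inside the screen. A language-agnostic sketch of that check (written in plain Java here; the names are illustrative, not Flame API):

```java
public class ClampToScreen {
    // Clamp a value into the inclusive range [min, max].
    public static double clamp(double v, double min, double max) {
        return Math.max(min, Math.min(max, v));
    }

    // Clamp a component's top-left position so that a rectangle of
    // (width, height) stays fully inside a (screenW, screenH) screen.
    public static double[] clampPosition(double x, double y,
                                         double width, double height,
                                         double screenW, double screenH) {
        return new double[] {
            clamp(x, 0, screenW - width),
            clamp(y, 0, screenH - height),
        };
    }

    public static void main(String[] args) {
        // A 50x50 sprite trying to move to x=790 on an 800-wide screen
        // is pushed back to x=750 so it stays fully visible.
        double[] p = clampPosition(790, 100, 50, 50, 800, 600);
        System.out.println(p[0] + ", " + p[1]); // 750.0, 100.0
    }
}
```

Calling this at the end of your component's update() would keep it on screen without any hitboxes at all.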

How to scale something in unity without changing the aspect ratio?

I am trying to restrict the user in the Unity editor so that if they stretch my prefab, the object scales up and down according to the aspect ratio selected in the inspector. For example, if the user selects 4:3 and scales the object, it changes according to that aspect ratio. Kindly help me with that.
using UnityEngine;

[ExecuteInEditMode]
public class AspectScale : MonoBehaviour
{
    private Vector3 _baseScale;
    private Vector3 _current;
    private Transform _transform;

    private void OnEnable ()
    {
        _transform = transform;
        _baseScale = _transform.localScale;
        _current = _baseScale;
    }

    private void Update ()
    {
        if (_transform.hasChanged)
        {
            if (_current.x != _transform.localScale.x)
                SetScale(_transform.localScale.x / _baseScale.x);
            else if (_current.y != _transform.localScale.y)
                SetScale(_transform.localScale.y / _baseScale.y);
            else if (_current.z != _transform.localScale.z)
                SetScale(_transform.localScale.z / _baseScale.z);
        }
    }

    private void SetScale (float value)
    {
        _transform.localScale = _baseScale * value;
        _current = _transform.localScale;
    }
}
Disable the component when you want to change the stored aspect.
When you drag from the center of the scale handle, all three axes are scaled at once: so if you set a scale of, for example, (4, 3, 0) and drag the scale handle from its centre, the proportion is preserved.
You can check this on a quad, for example: it changes from (4, 3, 0) to (8, 6, 0) and so on, increasing or decreasing.
If you are doing that with code, you can arrange that whenever you change the scale of one axis of interest, you calculate and set the scale of the other axis so that the 4:3 relation is maintained.
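That last suggestion, recomputing the other axis whenever one changes so the ratio is preserved, is plain arithmetic. A minimal sketch (in Java for brevity; the 4:3 ratio and the names are illustrative, this is not Unity API):

```java
public class AspectLock {
    // Given a changed x-scale, return the y-scale that keeps a fixed
    // aspect ratio, e.g. 4:3 means ratioX = 4, ratioY = 3.
    public static float lockedY(float newX, float ratioX, float ratioY) {
        return newX * ratioY / ratioX;
    }

    public static void main(String[] args) {
        // Scaling x to 8 under a 4:3 ratio forces y to 6,
        // matching the (4,3) -> (8,6) quad example above.
        System.out.println(lockedY(8f, 4f, 3f)); // 6.0
    }
}
```

In the MonoBehaviour above you would apply this inside Update() whenever localScale.x differs from the cached value, writing the computed y back into localScale.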

How to calculate sensitivity based on the width/height of the screen?

Here's my idea: I wanted a scrollview where the user could both scroll the component in it and click on it.
After hours of testing/searching, I've finally managed to make this work with the following code.
My problem is the comparison Math.Abs(eventData.delta.x) > 1.0f. I consider that if the mouse "moves more than 1.0f" then it's a drag; otherwise I treat it as a click.
The value 1.0f works perfectly on all devices with a mouse (it's easy not to move when you click) and on big screens (tablets). But on smartphones, e.g. my Galaxy S6 or the 3 others I've tried, it's very sensitive and almost impossible to make a click.
How could you handle this programmatically? Is there a DPI or something to take into account and, based on it, multiply my 1.0f accordingly?
using System;
using UnityEngine;
using UnityEngine.EventSystems;
using UnityEngine.UI;

public class BoardHandler : EventTrigger
{
    private static GameObject _itemDragged;
    private static bool _isDragging;
    private static ScrollRect _scrollRect;

    public override void OnPointerClick(PointerEventData data)
    {
        if (_isDragging) {
            return;
        }
        Debug.Log("Click");
    }

    public override void OnBeginDrag(PointerEventData eventData)
    {
        if (Math.Abs(eventData.delta.x) > 1.0f ||
            Math.Abs(eventData.delta.y) > 1.0f) {
            _scrollRect.OnBeginDrag(eventData);
            _isDragging = true;
        }
    }

    public override void OnDrag(PointerEventData eventData)
    {
        if (_isDragging) {
            _scrollRect.OnDrag(eventData);
        }
    }

    public override void OnEndDrag(PointerEventData eventData)
    {
        if (!_isDragging) {
            return;
        }
        _scrollRect.OnEndDrag(eventData);
        _isDragging = false;
    }

    private void Start()
    {
        _scrollRect = GetComponentInParent<ScrollRect>();
    }
}
I've used TouchScript in the past to cover multiple devices within one project, from a 9-screen 6K screen array down to tablets. It has a set of utilities to handle multiple resolutions and DPIs.
Check out the UpdateResolution method in TouchManagerInstance.
Hope this helps.
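Whether or not you pull in TouchScript, one common approach is to express the drag threshold in physical units and convert it to pixels using the screen's DPI, so a fixed millimetre slop behaves the same on a mouse-driven monitor and a high-density phone. A minimal sketch of the conversion (in Java; in Unity you would read Screen.dpi, and the DPI numbers below are illustrative):

```java
public class DragThreshold {
    private static final float MM_PER_INCH = 25.4f;

    // Convert a physical slop distance in millimetres to pixels
    // for a screen of the given dots-per-inch.
    public static float thresholdPx(float slopMm, float dpi) {
        return slopMm * dpi / MM_PER_INCH;
    }

    public static void main(String[] args) {
        // The same 2 mm slop is ~7.6 px on a 96 dpi monitor but
        // ~45 px on a ~577 dpi phone screen, which is why a fixed
        // 1.0f pixel threshold feels impossibly sensitive on phones.
        System.out.println(thresholdPx(2f, 96f));
        System.out.println(thresholdPx(2f, 577f));
    }
}
```

You would then replace the hard-coded 1.0f in OnBeginDrag with the computed per-device threshold.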

How to get touched location in AndEngine?

I'm new to AndEngine and there are too many hard things for me right now.
I want to know where I touched when I touch anywhere on the screen, and trigger actions based on that location's x and y. Can anybody help me?
Go through the AndEngine examples. There are two methods. Either you register touch on an Entity, as shown for example in the TouchAndDrag example:
final Sprite sprite = new Sprite(centerX, centerY, this.mFaceTextureRegion, this.getVertexBufferObjectManager()) {
    @Override
    public boolean onAreaTouched(final TouchEvent pSceneTouchEvent, final float pTouchAreaLocalX, final float pTouchAreaLocalY) {
        this.setPosition(pSceneTouchEvent.getX() - this.getWidth() / 2, pSceneTouchEvent.getY() - this.getHeight() / 2);
        return true;
    }
};
Or you use the touch listener on a Scene. You have to implement the IOnSceneTouchListener and then set it in your scene:
public class MyListener implements IOnSceneTouchListener {
    @Override
    public boolean onSceneTouchEvent(Scene pScene, final TouchEvent pSceneTouchEvent) {
        if (pSceneTouchEvent.isActionDown()) {
            // execute action.
        }
        return false;
    }
}
...
scene.setOnSceneTouchListener(new MyListener());
You can find most of the code needed in the examples already. See this tutorial on how to add them to Eclipse.
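With the scene-touch route, deciding which entity a touch lands on boils down to a point-in-rectangle test against the entities you care about. A plain-Java sketch of that check (the names are illustrative, not AndEngine API; AndEngine's own contains() does this for you):

```java
public class HitTest {
    // True if the touch point (tx, ty) falls inside the axis-aligned
    // rectangle with top-left (x, y) and the given width and height.
    public static boolean contains(float tx, float ty,
                                   float x, float y, float w, float h) {
        return tx >= x && tx <= x + w && ty >= y && ty <= y + h;
    }

    public static void main(String[] args) {
        // A sprite at (100, 50) sized 64x64:
        System.out.println(contains(120f, 80f, 100f, 50f, 64f, 64f)); // true
        System.out.println(contains(10f, 10f, 100f, 50f, 64f, 64f));  // false
    }
}
```

Inside onSceneTouchEvent you would run this against pSceneTouchEvent.getX()/getY() for each entity of interest and dispatch the matching action.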

Touch and Drag Sprite in Andengine

I'm trying to make a sprite so that when you touch it and drag your finger, the sprite follows your movement. I've tried to follow the AndEngine examples, but they are part of GLES1; I'm using GLES2.
Within my code, the onAreaTouched is being called, but it is not continuously being called to update the sprite with my finger.
Thanks in advance.
public class RectangleFactory extends Sprite {
    public float randomNumber;

    public RectangleFactory(float pX, float pY, ITextureRegion pTextureRegion,
            VertexBufferObjectManager pVertexBufferObjectManager) {
        super(pX, pY, pTextureRegion, pVertexBufferObjectManager);
        Random random = new Random();
        randomNumber = (float) random.nextInt(BallShakeActivity.CAMERA_WIDTH);
    }

    @Override
    public boolean onAreaTouched(TouchEvent pSceneTouchEvent, float X, float Y) {
        Log.d("Mark", "circles are touched");
        this.setPosition(pSceneTouchEvent.getX() - this.getWidth() / 2, pSceneTouchEvent.getY() - this.getHeight() / 2);
        if (pSceneTouchEvent.isActionMove()) {
            Log.d("Mark", "finger is moving");
        }
        return true;
    }

    @Override
    protected void onManagedUpdate(final float pSecondsElapsed) {
        if (this.mY > 0f) {
            // still on screen
        } else {
            Random random = new Random();
            randomNumber = (float) random.nextInt(BallShakeActivity.CAMERA_WIDTH);
            this.setPosition(randomNumber, 800f);
        }
        super.onManagedUpdate(pSecondsElapsed);
    }
}
You can find some examples, including a touch-drag example, for AndEngine GLES2 in the AndEngineExamples repository on Nicolas' Github page.
https://github.com/nicolasgramlich/AndEngineExamples
The examples are mostly for the GLES2 branch, but there may be some for the GLES2-AnchorCenter branch as well.
Some of the code works with the GLES1/master branch, but you're mostly on your own for examples on that branch.