We recently launched our Android Wear app, Coffee (Google Play link)!
Users can launch the app by flicking their wrist. Essentially, when the watch wakes up we register a listener with the accelerometer and start looking for a flick motion, and when the watch dims we deregister it.
This works on all devices except the Moto 360. When debugging we're finding that the Moto 360's accelerometer amplitude is about 10% of what we're seeing on other devices, and we're only getting about 10 Hz from the accelerometer.
Has anyone else seen issues like these on the Moto 360?
OK, I figured this out.
The Moto 360 accelerometer is fine. The problem is that whenever you're attached for debugging over Bluetooth (which is how I captured the accelerometer data), everything slows down so badly on the Moto 360 that the accelerometer data is useless.
When using the MessageAPI to send the data without being attached to the debugger, the values are fine.
Related
Hello, and welcome to my first question on Stack Overflow.
I have found an app in the App Store called "heart rate free". The app tracks the pulse rate with the camera on the phone. To measure the pulse you have to put your finger on the camera while the flashlight is turned on. I compared the readings with my Apple Watch and they seem legit (I wore my watch after the measurements).
My question:
How is the app programmed to measure the pulse when the camera sees only a red point?
And is it possible to build this with a webcam and python?
screenshot of the app
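In case it helps: apps like this typically rely on photoplethysmography (PPG). With the flash on, the fingertip glows red, and the average brightness of each frame rises and falls slightly with every heartbeat, so the app only has to track the mean red intensity over time and count the peaks. Below is a minimal, hypothetical sketch of just that signal-processing step, written in Swift; it assumes you have already extracted one average red value per frame (from the phone camera or a webcam feed), and the function name, frame rate, and threshold are illustrative, not taken from any real app.

    import Foundation

    // Hypothetical helper: estimate beats per minute from a series of average
    // red-channel brightness values (one sample per frame) captured while a
    // finger covers the lit camera. Names and thresholds are illustrative.
    func estimateBPM(redSamples: [Double], framesPerSecond: Double) -> Double? {
        guard redSamples.count > 2, framesPerSecond > 0 else { return nil }

        // Remove the slow baseline drift by subtracting the overall mean.
        let mean = redSamples.reduce(0, +) / Double(redSamples.count)
        let detrended = redSamples.map { $0 - mean }

        // Count local maxima that rise above a simple threshold;
        // each one is treated as a heartbeat.
        let threshold = 0.5 * (detrended.max() ?? 0)
        var peaks = 0
        for i in 1..<(detrended.count - 1) {
            if detrended[i] > threshold,
               detrended[i] > detrended[i - 1],
               detrended[i] >= detrended[i + 1] {
                peaks += 1
            }
        }

        let seconds = Double(redSamples.count) / framesPerSecond
        return Double(peaks) / seconds * 60.0
    }

The same logic ports directly to Python: average the red channel of each webcam frame with OpenCV, then run a peak detector such as scipy.signal.find_peaks over the resulting series. A webcam without a flash pressed against the finger will give a much noisier signal, though.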
Working on a game where you move a ship by tilting the screen.
Using landscape right mode.
If the device isn't flat the ship moves, so I needed to calibrate the Y tilt when the app is run so that position counts as neutral for up/down movement.
I got it to work by reading the tilt when the app is launched and making that the neutral position.
Anyway, it works fine (took me hours to figure it out but I did it!)
It works great when I have my iPhone plugged in to MacBook and run it through Xcode.
But if I run it when the phone is not connected to Xcode, it does not work.
It acts like the calibration I set up was not even there.
I set it up to capture the Y tilt angle when the app launches, in viewDidLoad.
Works great when plugged in, but not when it isn't plugged into Xcode.
I tried force-quitting and relaunching, and even resetting the phone and launching the app again.
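For anyone trying to reproduce the setup: the calibration described above amounts to sampling the device attitude once at launch and subtracting it from every later reading. A rough Swift sketch of that idea using CoreMotion follows; the class name, the use of pitch for the Y tilt, and the moveShip call are assumptions for illustration, not the asker's actual code.

    import CoreMotion
    import UIKit

    class GameViewController: UIViewController {
        let motionManager = CMMotionManager()
        var neutralPitch: Double?   // baseline captured at launch

        override func viewDidLoad() {
            super.viewDidLoad()
            motionManager.deviceMotionUpdateInterval = 1.0 / 60.0
            motionManager.startDeviceMotionUpdates(to: .main) { [weak self] motion, _ in
                guard let self = self, let motion = motion else { return }
                // The first sample after launch becomes the "neutral" position.
                if self.neutralPitch == nil {
                    self.neutralPitch = motion.attitude.pitch
                }
                // Steer the ship using the tilt relative to that baseline.
                let calibratedTilt = motion.attitude.pitch - (self.neutralPitch ?? 0)
                self.moveShip(by: calibratedTilt)
            }
        }

        func moveShip(by tilt: Double) {
            // Game-specific movement code would go here.
        }
    }

This only illustrates the calibration step itself; it doesn't explain why the behaviour differs between running attached to Xcode and running standalone.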
I'm pretty new to Unity and have a problem. I programmed a small 2D game in Unity that works fine, runs smoothly, and doesn't lag at 60 fps. The problem occurs when I try to record a video of my game. No matter which screen recorder I tried (the Xbox recorder on Windows 10, as well as different recorders on a Samsung Galaxy S8 and S5 on Android), the video doesn't run smoothly (it seems to stutter). I have already tried different quality settings and different fps values, and switched off vSync, but nothing helped. When I cap my game below 60 FPS, the game itself starts to stutter and the FPS is not kept constant. The screen resolution is 1920x1080.
I am now very desperate and frustrated about this problem, because I would like to make good-quality videos of the game. So if anyone has any ideas or suggestions on how to solve this, I would be very thankful!
Try this link.
It will take you to VideoCapture, a free asset made for exactly your purposes. Since it's free you can add it right to your packages and start using it without trouble, and it eliminates the pesky problem of third-party recordings.
I would like to integrate a 360 video player into an existing app for mobile.
I use the Skybox/Panoramic shader and a RenderTexture for the video, plugged into the VideoPlayer component. In 2K (2048x1024, or 2048x2048 for 3D) everything is fine, but when I try to play a 4K video (4096x2048, or 4096x4096 for 3D) the app crashes instantly on an iPhone 6 Plus. I also tested it with a video clip in 3840x1920, but all I got from the debugger is:
INFO [vr/gvr/capi/src/gvr.cc:116] Initialized GVR version 1.120.0
[VRDevice] Successfully created device cardboard.
Message from debugger: Terminated due to memory issue
I thought it had something to do with the max texture size, but 4096x4096 should be OK for most devices today.
Do you guys have an idea what's going on here?
Thanks!
I am developing an app which plays interactive real-time streaming video. I use FFMPEG (don't worry, I'll be releasing my source code) to decode an MPEG2/H264 RTP stream. I simply cannot get the iPhone 3G to draw a screen full of pixels faster than 5 times per second.
I've tried an OpenGL texture, which was just as slow. I've also tried an array of 2D vertices covering the entire screen and using glDrawArrays, but that yielded 5 FPS as well. For now I've stuck to simply drawing a CGImage onto my view, which gives me about 7-8 FPS.
From what I gathered, the private CoreSurface framework seems to be the only way. Anyone have any tips or tricks to get at least 20-30 FPS? I'd hate to restrict my app to only the 3GS and iPod touches.
Thanks,
Andrew
If you want to play H.264 and MPEG2 video on the iPhone, why are you doing the decoding yourself? What do you need that MPMoviePlayer doesn't cover?
If you are decoding in software, you're never going to get a good frame rate. Even on faster hardware (iPhone 3GS and 2nd/3rd gen iPod touches), you're going to drain the battery very quickly.
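For completeness, playing video through the framework player mentioned above takes only a few lines. This is a minimal sketch using MPMoviePlayerController from the MediaPlayer framework (shown in Swift syntax for brevity; the class has since been deprecated in favour of AVPlayer on modern iOS, and the view-based embedding shown here requires iOS 3.2 or later).

    import MediaPlayer
    import UIKit

    class PlayerViewController: UIViewController {
        // Kept as a property so the player isn't deallocated while playing.
        var moviePlayer: MPMoviePlayerController?

        func playVideo(at url: URL) {
            let player = MPMoviePlayerController(contentURL: url)
            player.view.frame = view.bounds
            view.addSubview(player.view)
            player.prepareToPlay()
            player.play()
            moviePlayer = player
        }
    }

Letting the system player handle decoding is what avoids the software-decode cost the answer warns about.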