Analog clock using CALayers - iPhone

I'm trying to make an analog clock for the iPhone, in which the clock hands will automatically update to the current time. I also want the clock hands to be images, unlike this: http://iphone-dev-tips.alterplay.com/2010/03/analog-clock-using-quartz-core.html. What would be the easiest way to create this using CALayers to rotate the images/hands?

There are two properties of CALayer that will be of interest to you: anchorPoint and transform. Set the anchorPoint to the point around which you want each hand image to rotate, calculate the angle of rotation, build a rotation matrix from it (using CATransform3DMakeRotation around the appropriate axis), and set that as the layer's transform.
It's all explained in detail here.
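Here is a minimal sketch of that idea (illustrative only, not from the linked article). It assumes an hour-hand image named "hour-hand.png" that points straight up with its pivot at the bottom centre, and a placeholder clockFaceCenter point; drive the last three lines from an NSTimer to keep the hand on the current time.

CALayer *hourHand = [CALayer layer];
hourHand.contents = (id)[UIImage imageNamed:@"hour-hand.png"].CGImage;   // placeholder image name
hourHand.bounds = CGRectMake(0, 0, 10, 80);                              // size of the hand image
hourHand.anchorPoint = CGPointMake(0.5, 1.0);                            // rotate about the bottom centre of the image
hourHand.position = clockFaceCenter;                                     // centre of your clock face (placeholder)
[self.view.layer addSublayer:hourHand];

// Call this from an NSTimer (e.g. once a second) to keep the hand up to date.
NSDateComponents *now = [[NSCalendar currentCalendar] components:NSCalendarUnitHour | NSCalendarUnitMinute fromDate:[NSDate date]];
CGFloat hourAngle = ((now.hour % 12) + now.minute / 60.0) * (2.0 * M_PI / 12.0);
hourHand.transform = CATransform3DMakeRotation(hourAngle, 0, 0, 1);      // rotate about the z axis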

Related

Move 3d model forwards based on rotation

I have a 3D model in Xcode using SceneKit that can rotate around itself, and I would like it to move forwards based on its rotation. For example, if it is rotated 236 degrees about the z axis, it shouldn't go straight along x or y but a bit of both, so that it moves forwards. Is this possible? Do I have to get any plugins?
No plugins required.
You can do this two main ways:
Move the object's position relative to its rotation by changing its transform over time.
Apply forces (and/or impulses) over time (or instantly) in the direction you'd like your entity to travel.
Within these two approaches are a LOT of other considerations regarding scene size, resistance, speed, immediacy, etc.
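As a rough sketch of the first approach (illustrative only, not from the answer above): move the node a small distance along its own facing direction each frame. This assumes iOS 11+ for SCNNode's worldFront property and that the model's forward direction is the default -Z axis; adjust if your model faces another way.

- (void)moveNode:(SCNNode *)node forwardBy:(float)distance
{
    SCNVector3 front = node.worldFront;   // unit vector along the node's -Z axis, in world space
    SCNVector3 p = node.position;
    node.position = SCNVector3Make(p.x + front.x * distance,
                                   p.y + front.y * distance,
                                   p.z + front.z * distance);
}

For the second approach, give the node an SCNPhysicsBody and call applyForce:impulse: with a vector pointing in that same direction instead of setting the position directly.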

iPhone 3D compass

I am trying to build an app for the iPhone 4 which enables the user to "point" at a hardcoded destination and have a dot appear where the destination is located.
First, I use the compass to make a horizontal compass (this will cover the left/right rotation):
// Heading
nowHeading = heading.trueHeading;
// Shift image (horizontal compass)
float shift = bearing - nowHeading;
destinationImage.center = CGPointMake(shift+160, destinationImage.center.y);
I shift the dot 160 pixels because the screen is 320 pixels wide. My question is now: how can I expand this code to handle up and down as well? Meaning that if I point the phone down at the table, the dot won't show; I have to point at the destination (as if taking a picture) for it to be drawn on the screen. I've already implemented the accelerometer, but I don't know how to combine these components to solve my problem.
How far you shift per degree should depend on the field of view of the camera. For the iPhone 4 the horizontal field of view is 47.5°, so 320 points / 47.5° ≈ 6.7 points per degree; use that to shift horizontally. You also have to add an adaptive filter to the accelerometer readings; you can get one from Apple's AccelerometerGraph sample project.
You have the rotation about one axis (the bearing); you should get the rotation about the other two from the accelerometer. The atan2 of two axes gives you the rotation about the third. Look at UIAcceleration and, if it helps, imagine an axis physically piercing the device, then do double xAngle = atan2(acceleration.y, acceleration.z); Once you have the up/down rotation you can repeat what you did for the horizontal with the vertical field of view, e.g. 60° for the iPhone.
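Putting those two steps together, a rough illustrative sketch (signs and offsets will need tuning for your coordinate conventions; bearing, nowHeading, acceleration and destinationImage come from the question, and pitchToTarget is a hypothetical value for the target's elevation):

// Horizontal: 320 points across 47.5° of horizontal view on the iPhone 4.
float pointsPerDegreeX = 320.0f / 47.5f;
float x = 160.0f + (bearing - nowHeading) * pointsPerDegreeX;

// Vertical: pitch from the (filtered) accelerometer, 480 points across ~60° of vertical view.
double pitch = atan2(acceleration.y, acceleration.z) * 180.0 / M_PI;
float pointsPerDegreeY = 480.0f / 60.0f;
float y = 240.0f + (pitch - pitchToTarget) * pointsPerDegreeY;

destinationImage.center = CGPointMake(x, y);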
That is going to be one rough implementation :) since achieving smooth movement is difficult. One thing you can do is use the gyros to get a faster response and correct their signal periodically with the accelerometers. See this talk for the troubles ahead: Sensor Fusion on Android Devices. Here is a website dedicated to the Kalman filter. If you dare to take on quaternions, I recommend "Visualizing Quaternions" by Andrew J. Hanson.
It sounds like you are trying to do a style of augmented reality. If that is the case, there are several libraries and sample code suggested here:
Augmented Reality

How to detect a circle motion with UIGestureRecognizer

I want to be able to detect someone's finger drawing a circular motion on the screen - as if they were drawing an 'O'. Is this possible with UIGestureRecognizer?
I think the answer to this depends on your definition of circular motion and how you intend to use it. For example, do you want to know how many degrees along a circle the user's finger has travelled? Or do you only care about a circle being completed? What degree of accuracy do you require? Do you want to allow the motion to be interrupted, or does it have to be a single touch-down > draw-circle > touch-up motion?
One approach would be to define a bunch of rectangular zones along the circumference and detect if the user is touching these in sequence. This can provide you with direction and a coarse indication of angle.
Another approach is to store the points between touch down and touch up and do some filtering and curve fitting to figure out what shape is approximated by the points. First low-pass filter the points using a basic FIR filter, then look at the dx and dy from point to point. A circle (as a series of arcs) will have to fall within a certain range of slope changes from point to point; otherwise you have some other shape.
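As a rough sketch of that idea (assuming the points are collected into an NSArray of NSValue-wrapped CGPoints between touch down and touch up): skip tiny jittery steps as a crude stand-in for the FIR filter, walk the remaining deltas, and accumulate the change of direction; roughly a full turn of accumulated change suggests a circle.

- (BOOL)pointsApproximateCircle:(NSArray *)points
{
    if (points.count < 8) return NO;

    double totalTurn = 0.0;           // accumulated change of direction, in radians
    double previousHeading = 0.0;
    BOOL hasPreviousHeading = NO;

    for (NSUInteger i = 1; i < points.count; i++) {
        CGPoint a = [points[i - 1] CGPointValue];
        CGPoint b = [points[i] CGPointValue];
        double dx = b.x - a.x, dy = b.y - a.y;
        if (fabs(dx) + fabs(dy) < 2.0) continue;          // crude smoothing: skip tiny jittery steps

        double heading = atan2(dy, dx);                   // direction of travel for this segment
        if (hasPreviousHeading) {
            double turn = heading - previousHeading;
            while (turn > M_PI)   turn -= 2.0 * M_PI;     // wrap into (-pi, pi]
            while (turn <= -M_PI) turn += 2.0 * M_PI;
            totalTurn += turn;
        }
        previousHeading = heading;
        hasPreviousHeading = YES;
    }
    // One full clockwise or counter-clockwise loop accumulates roughly +/- 2*pi;
    // the sign of totalTurn tells you the direction.
    return fabs(totalTurn) > 1.8 * M_PI;
}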
Yet another approach is to use a Neural Network to take the points and tell you what the shape looks like.
I think this may be what you need
How to detect circular gesture via Gesture Recognizer?
Instead of using a gesture recognizer, this project reacts to circular motions by tracking the angle of UITouch events.
My answer to my question:
I used this: http://iphonedevelopment.blogspot.com/2009/04/detecting-circle-gesture.html
.. but turned the CircleView into a custom UIGestureRecognizer. Everything lovely.
No, it doesn't recognize natively a circular motion.
You have to implement your own method to do that.
Here's how I needed to do it using the touches callbacks in my view controller, but this could be made into a gesture recognizer too. Note: I was trying to detect multiple circle motions (two or more clockwise or counter-clockwise circles made during a single touch event); a rough sketch follows the steps below.
Store touchesMoved CGPoints in an array.
Create a min/max rect of all the points in your history array.
Divide this min/max rect into 4 smaller rects.
Assign each history point a quadrant using CGRectContainsPoint() for each of the 4 quadrants
A clockwise motion will have quadrants ascending. A counter-clockwise motion will have quadrants descending.
Check the ratio of width to height if you want to distinguish circles from ovals.
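A rough sketch of those steps (assuming the history points are stored as NSValue-wrapped CGPoints in an NSMutableArray called history):

// Steps 1-3: bounding rect of the history and its centre, which splits it into 4 quadrants.
CGRect bounds = CGRectNull;
for (NSValue *v in history) {
    CGPoint p = [v CGPointValue];
    bounds = CGRectUnion(bounds, CGRectMake(p.x, p.y, 0, 0));
}
CGFloat midX = CGRectGetMidX(bounds), midY = CGRectGetMidY(bounds);

// Step 4: assign each point a quadrant, numbered 0..3 clockwise from the top-left.
NSMutableArray *quadrants = [NSMutableArray array];
for (NSValue *v in history) {
    CGPoint p = [v CGPointValue];
    int q = (p.y < midY) ? (p.x < midX ? 0 : 1) : (p.x < midX ? 3 : 2);
    if (quadrants.count == 0 || [quadrants.lastObject intValue] != q) {
        [quadrants addObject:@(q)];
    }
}
// Step 5: a clockwise circle visits the quadrants in ascending order (0,1,2,3,0,...),
// a counter-clockwise circle in descending order; count the transitions in one
// consistent direction to decide how many circles were made.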

In my application I want to perform scaling and translation together

I want to perform scaling and translation of an image together, so how is that possible?
Just make the two detected multitouch points the new opposite corners of your image. You have to keep either the rotation or the aspect ratio fixed, of course. (In theory, you could change the aspect ratio rather than the rotation, but you probably want the rotation to change, not the aspect ratio.)
That is, override touchesBegan and touchesMoved to save the initial points (in touchesBegan) and calculate rotation, translation and zoom (in touchesMoved), then construct a CGAffineTransform to apply to the image view.
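A sketch of that idea (illustrative only; it assumes the view controller has startMid, startDistance, startAngle and startTransform ivars and a self.imageView property, with multipleTouchEnabled set on the view; in real code you would track the two UITouch objects so their order can't swap between callbacks):

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    NSArray *pts = [[event allTouches] allObjects];
    if (pts.count < 2) return;
    CGPoint a = [pts[0] locationInView:self.view];
    CGPoint b = [pts[1] locationInView:self.view];
    startMid       = CGPointMake((a.x + b.x) / 2, (a.y + b.y) / 2);
    startDistance  = hypot(b.x - a.x, b.y - a.y);
    startAngle     = atan2(b.y - a.y, b.x - a.x);
    startTransform = self.imageView.transform;
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    NSArray *pts = [[event allTouches] allObjects];
    if (pts.count < 2) return;
    CGPoint a = [pts[0] locationInView:self.view];
    CGPoint b = [pts[1] locationInView:self.view];
    CGPoint mid   = CGPointMake((a.x + b.x) / 2, (a.y + b.y) / 2);
    CGFloat scale = hypot(b.x - a.x, b.y - a.y) / startDistance;
    CGFloat angle = atan2(b.y - a.y, b.x - a.x) - startAngle;

    // Rotate and scale about the view's centre, then offset by the midpoint delta.
    CGAffineTransform t = CGAffineTransformMakeTranslation(mid.x - startMid.x, mid.y - startMid.y);
    t = CGAffineTransformScale(t, scale, scale);
    t = CGAffineTransformRotate(t, angle);
    self.imageView.transform = CGAffineTransformConcat(startTransform, t);
}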

iPhone Cocoa: how to drag an image along a path

I am trying to figure out how you can drag an image while constraining its movement along a certain path.
I tried several tricks including animation along a path, but couldn't get the animation to play and pause and play backwards - so that seems out of the question.
Any ideas? Anyone?
What you're basically trying to do is match finger movement to a translation transform.
As the user touches down and starts to move their finger you want to use the current touch point value to create a translation transform which you apply to your UIImageView. Here's how you would do it:
On touch down, save the imageview's starting x,y position.
On move, calculate the delta from old point to new one. This is where you can clamp the values. So you can ignore, say, the y change and only use the x deltas. This means that the image will only move left to right. If you ignore the x and use y, then it only moves up and down.
Once you have the 'new' calculated/clamped x,y values, use it to create a new transform using CGAffineTransformMakeTranslation(x, y). Assign this transform to the UIImageView. The image moves to that place.
Once the finger lifts, figure out the delta from the original starting x,y point to the lift-off point, then adjust the image view's center/frame accordingly and reset the transform to CGAffineTransformIdentity. This doesn't move the object, but it means subsequent accesses to the image view use the actual position and don't have to keep adjusting for transforms.
Moving along a grid is easy too. Just round the x,y values in step 2 so they're a multiple of the grid size (i.e. round to every 10 pixels) before you pass them on to make the translation transform.
If you want to make it extra smooth, surround the code where you assign the transform with UIView animation blocks. Mess around with the easing and timing settings. The image should drag behind a bit but smoothly 'rubber-band' from one touch point to the next.
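A minimal sketch of those steps, constraining the drag to the x axis and snapping to a 10-point grid (it assumes startTouch and startCenter CGPoint ivars and a self.imageView property):

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    startTouch  = [[touches anyObject] locationInView:self.view];
    startCenter = self.imageView.center;
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    CGPoint p  = [[touches anyObject] locationInView:self.view];
    CGFloat dx = p.x - startTouch.x;               // ignore the y delta: movement is clamped to the x axis
    dx = round(dx / 10.0) * 10.0;                  // optional: snap to a 10-point grid
    [UIView animateWithDuration:0.1 animations:^{
        self.imageView.transform = CGAffineTransformMakeTranslation(dx, 0);
    }];
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    // Bake the final position into the view and clear the transform, as described above.
    CGAffineTransform t = self.imageView.transform;
    self.imageView.transform = CGAffineTransformIdentity;
    self.imageView.center = CGPointMake(startCenter.x + t.tx, startCenter.y + t.ty);
}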
See this Sample Code : Move Me