Augmented Reality - iPhone

I want to draw a pin and information about a place over the camera image.
Can anyone please help me?
I have done the coding in the app delegate. The code is:
#define CAMERA_TRANSFORM 1.24299

overlay = [[UIView alloc] initWithFrame:[[UIScreen mainScreen] bounds]];
overlay.opaque = NO;
overlay.backgroundColor = [UIColor clearColor];
[window addSubview:overlay];

UIImagePickerController *uip;
@try {
    uip = [[[UIImagePickerController alloc] init] autorelease];
    uip.sourceType = UIImagePickerControllerSourceTypeCamera;
    uip.showsCameraControls = NO;
    uip.toolbarHidden = YES;
    uip.navigationBarHidden = YES;
    uip.wantsFullScreenLayout = YES;
    uip.cameraViewTransform = CGAffineTransformScale(uip.cameraViewTransform,
                                                     CAMERA_TRANSFORM, CAMERA_TRANSFORM);
}
@catch (NSException *e) {
    [uip release];
    uip = nil;
}
@finally {
    if (uip) {
        [overlay addSubview:[uip view]];
        [overlay release];
    }
}
This shows the camera. Now I want to detect a place and put a pin on it that shows information about that place.

Here is a more straightforward recipe to detect the presence of a camera:
BOOL isCameraAvailable = [UIImagePickerController isSourceTypeAvailable:UIImagePickerControllerSourceTypeCamera];
99% of the job is still ahead, I'm afraid. Roughly, you need the following:
Get the geographical location for the user.
Get the geographical location for the point of interest (POI) you want to show. You may need to use a 3rd-party service like Foursquare, Google Maps, or something like that.
Calculate the distance between the user and a POI using a right triangle between both points: h^2 = dx^2 + dy^2, where dx and dy are the coordinate differences. Note that the distance in spherical geometry is properly calculated with the haversine formula, but the loss of precision is irrelevant for small distances if we assume cartesian coordinates, so we will just do that.
Assuming that east is 0º, get the angle from the user to the POI, which is atan2(dy, dx) (y = latitude, x = longitude). dy is, of course, the difference between the latitude of the user and that of the POI.
Get the bearing from the compass and calculate the difference between the user bearing and the angle to the POI.
The position of the object on screen depends on the bearing and the device orientation. If the user is looking exactly at the POI, paint a label for the POI in the middle of the screen. If there is an offset from the exact angle, multiply offset * (width in pixels / horizontal field of view) to get the offset in pixels for the label representing the point. Do the same for the vertical offset.
If there is a rotation on the axis X (see the axis here), apply a vertical offset.
If there is a rotation on the axis Y, there will be an update in bearing from the compass.
If there is a rotation on the axis Z and the object is near, rotate the object by the opposite angle.
Scale the label according to the distance, with a minimum and a maximum.
To position the labels you may want to use a 3D engine, or rotate them in a circle around your device (x = x + r*cos(angle), y = y + r*sin(angle)) and use a billboard effect.
If that sounds like too much work, concentrate on implementing just the response to changes in bearing, using offset * (width in pixels / horizontal field of view); see the sketch after this list. The horizontal field of view is the visible angle for the camera. It is 180º for humans, 37.5º for the iPhone 3, and, hmm, was it 45º for the iPhone 4? The width is 320, so if you are looking 10º away from your target, you have to move the label horizontally 320*10/37.5 pixels away from the center.
If the readings from the compass have too much noise, add a low pass filter.
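
Here is a minimal sketch of the distance, angle, and screen-offset calculations above in Objective-C. It assumes the east = 0º convention, the 37.5º field-of-view guess, and a 320-point-wide screen; the function names are invented for illustration:

#import <CoreLocation/CoreLocation.h>

static const float kHorizontalFOV = 37.5f;   // assumed camera field of view, in degrees
static const float kScreenWidth   = 320.0f;  // screen width in points

// Cartesian distance between two coordinates (fine for small distances).
static float DistanceToPOI(CLLocationCoordinate2D user, CLLocationCoordinate2D poi)
{
    float dx = poi.longitude - user.longitude;
    float dy = poi.latitude  - user.latitude;
    return sqrtf(dx*dx + dy*dy);
}

// Angle from the user to the POI, with east = 0º, in degrees.
static float AngleToPOI(CLLocationCoordinate2D user, CLLocationCoordinate2D poi)
{
    float dx = poi.longitude - user.longitude;
    float dy = poi.latitude  - user.latitude;
    return atan2f(dy, dx) * 180.0f / M_PI;
}

// Horizontal screen position for the POI label, given the compass bearing
// converted to the same east = 0º convention.
static float POIScreenX(float heading, float angleToPOI)
{
    float offset = angleToPOI - heading;        // degrees away from screen center
    while (offset >  180.0f) offset -= 360.0f;  // normalize to -180º..180º
    while (offset < -180.0f) offset += 360.0f;
    return kScreenWidth / 2.0f + offset * (kScreenWidth / kHorizontalFOV);
}

// Simple low-pass filter for noisy compass readings
// (ignoring the 360º wrap-around for brevity).
static float LowPass(float previous, float raw, float alpha /* e.g. 0.1 */)
{
    return previous + alpha * (raw - previous);
}

A label whose x falls well outside 0..320 is behind the user or out of frame and can simply be hidden.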

Please go through
https://github.com/zac/iphonearkit.
It's the best Objective-C code available.

Related

Rotating UIImageView Moves the Image Off Screen

I have a simple rotation gesture implemented in my code, but the problem is that when I rotate the image it goes off the screen/out of the view, always to the right.
The center X of the image view being rotated drifts (increases), which is why it goes off the screen to the right, out of the view.
I would like it to rotate around its current center, but the center is changing for some reason. Any ideas what is causing this?
Code Below:
- (void)viewDidLoad
{
[super viewDidLoad];
CALayer *l = [self.viewCase layer];
[l setMasksToBounds:YES];
[l setCornerRadius:30.0];
self.imgUserPhoto.userInteractionEnabled = YES;
[self.imgUserPhoto setClipsToBounds:NO];
UIRotationGestureRecognizer *rotationRecognizer = [[UIRotationGestureRecognizer alloc] initWithTarget:self action:@selector(rotationDetected:)];
[self.view addGestureRecognizer:rotationRecognizer];
rotationRecognizer.delegate = self;
}
- (void)rotationDetected:(UIRotationGestureRecognizer *)rotationRecognizer
{
CGFloat angle = rotationRecognizer.rotation;
self.imageView.transform = CGAffineTransformRotate(self.imageView.transform, angle);
rotationRecognizer.rotation = 0.0;
}
You want to rotate the image around its center, but that's not what is actually happening. Rotation transforms take place around the origin. So what you have to do is apply a translate transform first to map the origin to the center of the image, and then apply the rotation transform, like so:
self.imageView.transform = CGAffineTransformTranslate(self.imageView.transform, self.imageView.bounds.size.width/2, self.imageView.bounds.size.height/2);
Please note that after rotating you'll probably have to undo the translate transform in order to correctly draw the image.
Hope this helps
Edit:
To quickly answer your question, what you have to do to undo the Translate Transform is to subtract the same difference you add to it in the first place, for example:
// The next line will add a translate transform
self.imageView.transform = CGAffineTransformTranslate(self.imageView.transform, 10, 10);
self.imageView.transform = CGAffineTransformRotate(self.imageView.transform, radians);
// The next line will undo the translate transform
self.imageView.transform = CGAffineTransformTranslate(self.imageView.transform, -10, -10);
However, after creating this quick project I realized that when you apply a rotation transform using UIKit (like the way you're apparently doing it) the rotation actually takes place around the center. It is only when using CoreGraphics that the rotation happens around the origin. So now I'm not sure why your image goes off the screen. Anyway, take a look at the project and see if any code there helps you.
Let me know if you have any more questions.
The 'Firefox' image is drawn using UIKit. The blue rect is drawn using Core Graphics.
You aren't rotating the image around its centre. You'll need to correct this manually by translating it back to the correct position.
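
To illustrate the Core Graphics case mentioned above, here is a minimal sketch (the rect and angle are invented for the example) of pivoting a drawing around its own center by translating the origin to the center, rotating, and translating back:

- (void)drawRect:(CGRect)rect
{
    CGContextRef ctx = UIGraphicsGetCurrentContext();
    CGRect box = CGRectMake(60.0f, 60.0f, 200.0f, 120.0f); // hypothetical rect

    CGContextSaveGState(ctx);
    // Move the origin to the center of the box, rotate, then move it back,
    // so the rotation pivots around the box center instead of (0, 0).
    CGContextTranslateCTM(ctx, CGRectGetMidX(box), CGRectGetMidY(box));
    CGContextRotateCTM(ctx, M_PI / 6.0);                   // 30 degrees
    CGContextTranslateCTM(ctx, -CGRectGetMidX(box), -CGRectGetMidY(box));

    CGContextSetFillColorWithColor(ctx, [UIColor blueColor].CGColor);
    CGContextFillRect(ctx, box);
    CGContextRestoreGState(ctx);
}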

Smooth aspect change during orientation change

When the orientation of the iPhone screen changes, I adjust the projection matrix of my 3D rendering to the new aspect value. However, doing this in either willRotateToInterfaceOrientation or didRotateFromInterfaceOrientation causes the aspect ratio to be wrong during the transition: at the beginning of the animation the screen size is still the same as before, and it changes to the new bounds gradually while being rotated. Therefore I want the aspect value used for my 3D projection matrix to change gradually as well. To achieve this, I retrieve the start time and duration of the rotation animation in willAnimateRotationToInterfaceOrientation:
- (void)willAnimateRotationToInterfaceOrientation:(UIInterfaceOrientation)toInterfaceOrientation duration:(NSTimeInterval)duration
{
_aspect.isChanging = YES;
_aspect.startedChanging = [[NSDate date] retain];
_aspect.changeDuration = duration;
_aspect.oldValue = self.renderer.aspect;
_aspect.currentValue = fabsf(self.view.bounds.size.width / self.view.bounds.size.height);
}
Note that the view's bounds size is already set to the new value that will be valid after the animation.
Then, in update, I do the following:
- (void)update
{
if (_aspect.isChanging) {
float f = MIN(1, [[NSDate date] timeIntervalSinceDate:_aspect.startedChanging] / _aspect.changeDuration);
self.renderer.aspect = _aspect.oldValue * (1-f) + _aspect.currentValue * f;
} else {
self.renderer.aspect = _aspect.currentValue;
}
[self.renderer update];
}
This already works quite well, but the rendered aspect change does not match the actual aspect change, because I'm interpolating it linearly. Therefore I tried to match the easing of the actual aspect change by throwing math functions at the problem. The best result I could get was by adding the following line:
f = 0.5f - 0.5f*cosf(f*M_PI);
This results in almost no visible stretching of the image during the rotation; however, if you look closely, it still seems slightly mismatched somewhere in between. I guess the end user won't notice it, but I'm asking here in case there is a better solution. These are my questions:
What is the actual easing function used for the change in aspect ratio during the rotation change animation?
Is there a way to get the actual width and height of the view as it is displayed during the orientation change animation? This would allow me to retrieve the in-between aspect directly.
On the first bullet point, you can use CAMediaTimingFunction (probably with kCAMediaTimingFunctionEaseInEaseOut) to get the control points for the curve that defines the transition. Then just evaluate the cubic Bézier curve formula; see the sketch below.
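
A minimal sketch of that idea, assuming (and it is only an assumption) that the system rotation animation uses the ease-in/ease-out curve:

#import <QuartzCore/QuartzCore.h>

// Evaluate the timing curve at time fraction x in 0..1: solve the cubic
// Bezier for the parameter t whose x-component equals x, then return the
// y-component at that t. The curve endpoints are (0,0) and (1,1).
static float EasedFraction(float x)
{
    CAMediaTimingFunction *fn =
        [CAMediaTimingFunction functionWithName:kCAMediaTimingFunctionEaseInEaseOut];
    float p1[2], p2[2];
    [fn getControlPointAtIndex:1 values:p1];
    [fn getControlPointAtIndex:2 values:p2];

    // Bisection solve for bezier_x(t) == x; x is monotonic in t here.
    float lo = 0.0f, hi = 1.0f, t = 0.5f;
    for (int i = 0; i < 20; i++) {
        t = 0.5f * (lo + hi);
        float omt = 1.0f - t;
        float bx = 3*omt*omt*t*p1[0] + 3*omt*t*t*p2[0] + t*t*t;
        if (bx < x) lo = t; else hi = t;
    }
    float omt = 1.0f - t;
    return 3*omt*omt*t*p1[1] + 3*omt*t*t*p2[1] + t*t*t;
}

In the update method from the question, f = EasedFraction(f); would then replace the cosine line.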
On the second bullet point, you can use [[view layer] presentationLayer] to get a version of that view's CALayer with all current animations applied as per their current state. So if you check the dimensions of that, you should get the then-current values. I guess if you act upon a CADisplayLink callback you'll be at most one frame behind; a sketch of this follows as well.
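
A sketch of that second point, sampling the presentation layer once per frame from a CADisplayLink (the renderer property is taken from the question's code):

- (void)startObservingRotation
{
    CADisplayLink *link = [CADisplayLink displayLinkWithTarget:self
                                                      selector:@selector(tick:)];
    [link addToRunLoop:[NSRunLoop mainRunLoop] forMode:NSRunLoopCommonModes];
}

- (void)tick:(CADisplayLink *)link
{
    // The presentation layer reflects the in-flight animated values.
    CALayer *presentation = [self.view.layer presentationLayer];
    CGSize size = presentation.bounds.size;
    if (size.height > 0.0f) {
        self.renderer.aspect = fabsf(size.width / size.height);
    }
}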

Cocos2D CCNode position in absolute screen coordinates

I've been looking around for a while, and I have not been able to find an answer to this for some reason. It seems simple enough, but maybe I just can't find the right function in the library.
I have a scene with a layer that contains a bunch of CCNodes with each one CCSprite in them.
During the application, I move the position of the main layer around, so that I "pan" a camera of sorts (i.e. I translate the entire layer so that the viewport changes).
Now I want to determine the absolute position of a CCNode in screen coordinates. The position property returns the position relative to the parent node, but I would really like this transformed to its actual position on screen.
Also, as an added bonus, it would be awesome if I could express this position in a coordinate system where 0,0 maps to the upper left of the screen and 1,1 maps to the lower right of the screen (so I stay compatible with all devices).
Edit: Note that the solution should work for any hierarchy of CCNodes preferably.
Every CCNode, and descendants thereof, has a method named convertToWorldSpace:(CGPoint)p
This returns the coordinates relative to your scene.
When you have this coordinate, flip your Y-axis, as you want 0,0 to be in the top left.
CCNode *myNode = [[CCNode alloc] init];
myNode.position = CGPointMake(150.0f, 100.0f);

CCSprite *mySprite = [[CCSprite alloc] init]; // CCSprite is a descendant of CCNode
[myNode addChild:mySprite];
mySprite.position = CGPointMake(-50.0f, 50.0f);

// We now have a node with a sprite in it. A node's position is expressed in
// its parent's coordinate space, so ask the parent to do the conversion (and
// note that the node should really be attached to a scene for the result to
// be meaningful):
CGPoint worldCoord = [myNode convertToWorldSpace:mySprite.position];

// To make Y = 0 the top, take the screen height and subtract the world Y.
CGSize winSize = [[CCDirector sharedDirector] winSize];
worldCoord = CGPointMake(worldCoord.x, winSize.height - worldCoord.y);

// This is dry coded, so it may contain an error here or there.
// Let me know if this doesn't work.
[mySprite release];
[myNode release];
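
For the bonus part of the question, here is a small helper along the same lines (the name NormalizedScreenPosition is made up) that works for any CCNode, however deep in the hierarchy, and returns 0,0 for the top left and 1,1 for the bottom right:

// Normalized screen position: (0,0) = top left, (1,1) = bottom right.
static CGPoint NormalizedScreenPosition(CCNode *node)
{
    // A node's position is expressed in its parent's space, so let the
    // parent convert it to world (screen) space.
    CGPoint world = [node.parent convertToWorldSpace:node.position];
    CGSize win = [[CCDirector sharedDirector] winSize];
    return CGPointMake(world.x / win.width,
                       1.0f - world.y / win.height);  // flip the Y-axis
}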

iPhone image not rotating properly according to location on camera view

I am developing an application which points towards a particular location. I calculate the angle between the current coordinate and a specific coordinate, and my image is rotated by that difference, but the image rotation is then fixed: when I change my direction, the image does not move, it remains fixed at a specific position. I want the image to move according to the actual position of the place we want to reach. Thanks for the help in advance; below is my code.
-(float) angleToRadians:(float) a {
return ((a/180)*M_PI);
}
- (float) getHeadingForDirectionFromCoordinate:(CLLocationCoordinate2D)fromLoc toCoordinate:(CLLocationCoordinate2D)toLoc
{
float fLat = [self angleToRadians:fromLoc.latitude];
float fLng = [self angleToRadians:fromLoc.longitude];
float tLat = [self angleToRadians:toLoc.latitude];
float tLng = [self angleToRadians:toLoc.longitude];
float result= atan2f(sin(tLng-fLng)*cos(tLat), cos(fLat)*sin(tLat)-sin(fLat)*cos(tLat)*cos(tLng-fLng));
return RadiansToDegrees(result);
}
- (void) scanButtonTouchUpInside {
UIImage *overlayGraphic = [UIImage imageNamed:@"GFI.jpg"];
overlayGraphicView = [[UIImageView alloc] initWithImage:overlayGraphic];
overlayGraphicView.frame = CGRectMake(50, 50, 50, 80);
[self addSubview:overlayGraphicView];
float angle = [self getHeadingForDirectionFromCoordinate:currentlocation toCoordinate:pointlocation];
if (angle < 0.0)
    angle += 360.0; // getHeadingForDirectionFromCoordinate: returns degrees
// CGAffineTransformMakeRotation expects radians, so convert back from degrees.
overlayGraphicView.transform = CGAffineTransformMakeRotation([self angleToRadians:angle]);
}
Judging by the name of the method that is rotating the image: scanButtonTouchUpInside, I'm guessing that it is called only when a user hits a button. If it is only called once, I would expect the image to only rotate once, then remain fixed. In order to keep the image rotating, you need to put that code somewhere that is called repeatedly. A (cheap hack) way to test this would be to add this line to the end of your - (void)scanButtonTouchUpInside method:
[self performSelector:@selector(scanButtonTouchUpInside) withObject:nil afterDelay:0.25f];
This will make the method call fire repeatedly. One thing to note though: It seems that your calculation of the angle of rotation depends on a variable called currentLocation. This solution will only work if you are updating currentLocation appropriately.
EDIT:
Just a thought, you're currently calculating the angle of rotation based on the location of the device and the location of the destination. You're not taking into account compass bearing at all. Is this intentional? With this approach, your angle will never change unless the device actually moves. Any change in orientation will be ignored.
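
To illustrate that last point, here is a minimal sketch of driving the rotation from compass updates instead of the button, reusing the question's angleToRadians: and getHeadingForDirectionFromCoordinate: helpers and assuming a CLLocationManager ivar named locationManager (an invented name):

- (void)startCompass
{
    locationManager = [[CLLocationManager alloc] init];
    locationManager.delegate = self;
    [locationManager startUpdatingHeading];  // needs a magnetometer-equipped device
}

- (void)locationManager:(CLLocationManager *)manager
       didUpdateHeading:(CLHeading *)newHeading
{
    float bearingToTarget = [self getHeadingForDirectionFromCoordinate:currentlocation
                                                          toCoordinate:pointlocation];
    // Rotate by the difference between where the POI lies and where the
    // device is pointing, converting degrees to radians for the transform.
    float degrees = bearingToTarget - newHeading.trueHeading;
    overlayGraphicView.transform =
        CGAffineTransformMakeRotation([self angleToRadians:degrees]);
}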

Position on UIImageView

Me again. I have a simple question. I have a UIImageView like the one shown below.
(Screenshot of the volume knob: http://img683.imageshack.us/img683/1999/volumen.png)
That UIImageView is supposed to be the knob that controls the volume in my iPhone project. My question is: how do I know the position of the bar on the UIImageView when it is rotated? The volume needs to be 0.5 when the little bar on the circle is vertical.
I have a piece of code (in the touchesMoved method):
float dx = locationT.x - imgVVolume.center.x;
float dy = locationT.y - imgVVolume.center.y;
CGFloat angleDif = 0.0f;
movedRotationAngle = atan2(dy,dx);
if (beganRotationAngle == 0.0) {
beganRotationAngle = movedRotationAngle;
initialTransform = imgVVolume.transform;
}
else {
angleDif = beganRotationAngle - movedRotationAngle;
CGAffineTransform newTrans = CGAffineTransformRotate(initialTransform, -angleDif);
imgVVolume.transform = newTrans;
}
Help please.
It depends on what input mechanism you want to use to control the rotation.
If the knob is to rotate based on a single finger touch dragging from side to side then you can create a UIPanGestureRecognizer and attach it to the knob UIImageView. The translationInView: method returns a CGPoint which is the amount of X and Y movement from the touch-down point. You can feed that into a formula like the one you post to get an angle of rotation. You'll want to keep track of delta from last position and also check for stop limits (like 0..360) to prevent over-rotation.
OTOH, if you're going to use two finger rotation then you'll want to use a UIRotationGestureRecognizer and look for the rotation value. Just feed that into a CGAffineTransformRotate and set it as the UIImageView transform. That takes care of all of the above for you. Again, you'll want to check for stop limits; a sketch follows.
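
A minimal sketch of that second option, with invented names (volumeKnob, totalRotation); it clamps the accumulated angle to half a turn either side of vertical, so the bar pointing straight up maps to a volume of 0.5:

// Assumed ivars: UIImageView *volumeKnob; CGFloat totalRotation;

- (void)setupKnob
{
    volumeKnob.userInteractionEnabled = YES;
    UIRotationGestureRecognizer *rec = [[UIRotationGestureRecognizer alloc]
        initWithTarget:self action:@selector(knobRotated:)];
    [volumeKnob addGestureRecognizer:rec];
    [rec release];
}

- (void)knobRotated:(UIRotationGestureRecognizer *)rec
{
    totalRotation += rec.rotation;   // accumulate, then reset the recognizer
    totalRotation = MAX(-M_PI, MIN(M_PI, totalRotation));  // stop limits
    rec.rotation = 0.0;

    volumeKnob.transform = CGAffineTransformMakeRotation(totalRotation);

    // totalRotation == 0 (bar vertical) -> 0.5; the full range maps to 0..1.
    float volume = 0.5f + totalRotation / (2.0f * M_PI);
    NSLog(@"volume = %f", volume);
}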