MKAnnotationView image is not displayed on the resulting snapshot image (iOS 7, iPhone)

I've created code similar to what was shown at WWDC for displaying a pin on snapshots, but the pin image is not displayed:
MKMapSnapshotOptions *options = [[MKMapSnapshotOptions alloc] init];
options.region = self.mapView.region;
options.scale = 2;
options.size = self.mapView.frame.size;

MKMapSnapshotter *snapshotter = [[MKMapSnapshotter alloc] initWithOptions:options];
[snapshotter startWithCompletionHandler:^(MKMapSnapshot *snapshot, NSError *error)
{
    MKAnnotationView *pin = [[MKAnnotationView alloc] initWithAnnotation:nil reuseIdentifier:@""];

    UIImage *image;
    UIImage *finalImage;
    image = snapshot.image;
    NSLog(@"%f", image.size.height);

    UIImage *pinImage = pin.image;
    CGPoint pinPoint = [snapshot pointForCoordinate:CLLocationCoordinate2DMake(self.longtitude, self.latitude)];
    CGPoint pinCenterOffset = pin.centerOffset;
    pinPoint.x -= pin.bounds.size.width / 2.0;
    pinPoint.y -= pin.bounds.size.height / 2.0;
    pinPoint.x += pinCenterOffset.x;
    pinPoint.y += pinCenterOffset.y;

    UIGraphicsBeginImageContextWithOptions(image.size, YES, image.scale);
    [image drawAtPoint:CGPointMake(0, 0)];
    [pinImage drawAtPoint:pinPoint];
    finalImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();

    NSData *data = UIImageJPEGRepresentation(finalImage, 0.95f);
    NSArray *pathArray = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
    NSString *path = [pathArray objectAtIndex:0];
    NSLog(@"%@", path);
    NSString *fileWithPath = [path stringByAppendingPathComponent:@"test.jpeg"];
    [data writeToFile:fileWithPath atomically:YES];
}];
Only the map snapshot is displayed, without the pin image.

If you're expecting the default pin image to appear, you need to create an MKPinAnnotationView instead of the plain MKAnnotationView (which has no default image; it's blank by default).
Also, please note that the latitude and longitude parameters are backwards in this line:
CGPoint pinPoint = [snapshot pointForCoordinate:CLLocationCoordinate2DMake(
self.longtitude, self.latitude)];
In CLLocationCoordinate2DMake, latitude should be the first parameter and longitude the second.
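For reference, a minimal sketch of the relevant lines of the completion handler with both fixes applied, keeping the property names from the question (including the misspelled longtitude):

    // Use MKPinAnnotationView so there is a default pin image to draw.
    MKPinAnnotationView *pin = [[MKPinAnnotationView alloc] initWithAnnotation:nil reuseIdentifier:@""];
    UIImage *pinImage = pin.image;

    // Latitude first, longitude second.
    CGPoint pinPoint = [snapshot pointForCoordinate:CLLocationCoordinate2DMake(self.latitude, self.longtitude)];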


Faster way to load Images in Collection View

I have a small scrolling-performance problem with my UICollectionView. I want to display PDF thumbnails just like Apple's iBooks app does, but when I scroll my collection view it's not really smooth. Here is how I load my pictures:
- (UICollectionViewCell *)collectionView:(UICollectionView *)cv cellForItemAtIndexPath:(NSIndexPath *)indexPath
{
    GridCell *cell = [cv dequeueReusableCellWithReuseIdentifier:@"gridCell" forIndexPath:indexPath];
    ...
    // Set Thumbnail
    dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
        if ([Tools checkIfLocalFileExist:cell.pdfDoc])
        {
            UIImage *thumbnail = [Tools generateThumbnailForFile:((PDFDocument *)[self.pdfList objectAtIndex:indexPath.row]).title];
            dispatch_async(dispatch_get_main_queue(), ^{
                [cell.imageView setImage:thumbnail];
            });
        }
    });
    ...
    return cell;
}
Method to generate the thumbnail:
+ (UIImage *)generateThumbnailForFile:(NSString *)fileName
{
    // ----- Check if thumbnail already exists
    NSString *documentsPath = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) objectAtIndex:0];
    NSString *thumbnailAddress = [documentsPath stringByAppendingPathComponent:[[fileName stringByDeletingPathExtension] stringByAppendingString:@".png"]];
    BOOL fileExists = [[NSFileManager defaultManager] fileExistsAtPath:thumbnailAddress];
    if (fileExists)
        return [UIImage imageWithContentsOfFile:thumbnailAddress];

    // ----- Generate Thumbnail
    NSString *filePath = [documentsPath stringByAppendingPathComponent:fileName];
    CFURLRef url = (__bridge CFURLRef)[NSURL fileURLWithPath:filePath];
    CGPDFDocumentRef documentRef = CGPDFDocumentCreateWithURL(url);
    CGPDFPageRef pageRef = CGPDFDocumentGetPage(documentRef, 1);
    CGRect pageRect = CGPDFPageGetBoxRect(pageRef, kCGPDFCropBox);

    UIGraphicsBeginImageContext(pageRect.size);
    CGContextRef context = UIGraphicsGetCurrentContext();
    CGContextTranslateCTM(context, CGRectGetMinX(pageRect), CGRectGetMaxY(pageRect));
    CGContextScaleCTM(context, 1, -1);
    CGContextTranslateCTM(context, -(pageRect.origin.x), -(pageRect.origin.y));
    CGContextDrawPDFPage(context, pageRef);

    // ----- Save Image
    UIImage *finalImage = UIGraphicsGetImageFromCurrentImageContext();
    NSData *imageData = [NSData dataWithData:UIImagePNGRepresentation(finalImage)];
    [imageData writeToFile:thumbnailAddress atomically:YES];
    UIGraphicsEndImageContext();
    return finalImage;
}
Do you have any suggestions?
Take a look at https://github.com/rs/SDWebImage. The library works great for asynchronous image loading, particularly the method setImageWithURL:placeholderImage:.
You can call that method with a placeholder (a loading image or a blank PNG), and once the image you are trying to retrieve has loaded, it will replace the placeholder. This should speed up your app quite a bit.
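For illustration, a minimal sketch of that call inside cellForItemAtIndexPath:, assuming the thumbnails are reachable at a URL (SDWebImage loads remote images; the URL and placeholder asset name below are made up):

    #import "UIImageView+WebCache.h"

    // Hypothetical thumbnail URL and placeholder asset; SDWebImage downloads asynchronously,
    // caches the result, and swaps the placeholder for the real image when it arrives.
    NSURL *thumbnailURL = [NSURL URLWithString:@"http://example.com/thumbnails/doc1.png"];
    [cell.imageView setImageWithURL:thumbnailURL
                   placeholderImage:[UIImage imageNamed:@"placeholder.png"]];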

How to retrieve images from Instagram which have a specific hashtag?

My client wants to share an image on Instagram. I have implemented sharing an image on Instagram, but I could not share it with a specific hashtag. Here is my code so far.
- (IBAction)sharePhotoOnInstagram:(id)sender {
    UIImagePickerController *imgpicker = [[UIImagePickerController alloc] init];
    imgpicker.delegate = self;
    [self storeimage];
    NSURL *instagramURL = [NSURL URLWithString:@"instagram://app"];
    if ([[UIApplication sharedApplication] canOpenURL:instagramURL])
    {
        CGRect rect = CGRectMake(0, 0, 612, 612);
        NSString *jpgPath = [NSHomeDirectory() stringByAppendingPathComponent:@"Documents/15717.ig"];
        NSURL *igImageHookFile = [[NSURL alloc] initWithString:[[NSString alloc] initWithFormat:@"file://%@", jpgPath]];
        dic.UTI = @"com.instagram.photo";
        dic.delegate = self;
        dic = [self setupControllerWithURL:igImageHookFile usingDelegate:self];
        dic = [UIDocumentInteractionController interactionControllerWithURL:igImageHookFile];
        dic.delegate = self;
        [dic presentOpenInMenuFromRect:rect inView:self.view animated:YES];
        // [[UIApplication sharedApplication] openURL:instagramURL];
    }
    else
    {
        // NSLog(@"instagramImageShare");
        UIAlertView *errorToShare = [[UIAlertView alloc] initWithTitle:@"Instagram unavailable " message:@"You need to install Instagram in your device in order to share this image" delegate:nil cancelButtonTitle:@"OK" otherButtonTitles:nil];
        errorToShare.tag = 3010;
        [errorToShare show];
    }
}

- (void)storeimage
{
    NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
    NSString *documentsDirectory = [paths objectAtIndex:0];
    NSString *savedImagePath = [documentsDirectory stringByAppendingPathComponent:@"15717.ig"];
    UIImage *NewImg = [self resizedImage:picTaken :CGRectMake(0, 0, 612, 612)];
    NSData *imageData = UIImagePNGRepresentation(NewImg);
    [imageData writeToFile:savedImagePath atomically:NO];
}
- (UIImage *)resizedImage:(UIImage *)inImage :(CGRect)thumbRect
{
    CGImageRef imageRef = [inImage CGImage];
    CGImageAlphaInfo alphaInfo = CGImageGetAlphaInfo(imageRef);
    // There's a weirdness with kCGImageAlphaNone and CGBitmapContextCreate;
    // see Supported Pixel Formats in the Quartz 2D Programming Guide,
    // Creating a Bitmap Graphics Context section:
    // only RGB 8-bit images with alpha of kCGImageAlphaNoneSkipFirst, kCGImageAlphaNoneSkipLast, kCGImageAlphaPremultipliedFirst,
    // and kCGImageAlphaPremultipliedLast, with a few other oddball image kinds, are supported.
    // The images on input here are likely to be png or jpeg files.
    if (alphaInfo == kCGImageAlphaNone)
        alphaInfo = kCGImageAlphaNoneSkipLast;

    // Build a bitmap context that's the size of the thumbRect
    CGContextRef bitmap = CGBitmapContextCreate(
        NULL,
        thumbRect.size.width,                  // width
        thumbRect.size.height,                 // height
        CGImageGetBitsPerComponent(imageRef),  // really needs to always be 8
        4 * thumbRect.size.width,              // rowbytes
        CGImageGetColorSpace(imageRef),
        alphaInfo
    );

    // Draw into the context; this scales the image
    CGContextDrawImage(bitmap, thumbRect, imageRef);

    // Get an image from the context and a UIImage
    CGImageRef ref = CGBitmapContextCreateImage(bitmap);
    UIImage *result = [UIImage imageWithCGImage:ref];
    CGContextRelease(bitmap); // ok if NULL
    CGImageRelease(ref);
    return result;
}
- (UIDocumentInteractionController *)setupControllerWithURL:(NSURL *)fileURL usingDelegate:(id <UIDocumentInteractionControllerDelegate>)interactionDelegate
{
    UIDocumentInteractionController *interactionController = [UIDocumentInteractionController interactionControllerWithURL:fileURL];
    interactionController.delegate = self;
    return interactionController;
}

- (void)documentInteractionControllerWillPresentOpenInMenu:(UIDocumentInteractionController *)controller
{
}

- (BOOL)documentInteractionController:(UIDocumentInteractionController *)controller canPerformAction:(SEL)action
{
    // NSLog(@"5dsklfjkljas");
    return YES;
}

- (BOOL)documentInteractionController:(UIDocumentInteractionController *)controller performAction:(SEL)action
{
    // NSLog(@"dsfa");
    return YES;
}

- (void)documentInteractionController:(UIDocumentInteractionController *)controller didEndSendingToApplication:(NSString *)application
{
    // NSLog(@"fsafasd;");
}
Note: this is working fine.
I have followed the documentation at http://instagram.com/developer/iphone-hooks/ but couldn't get a clear idea from it, and now I don't know what the next step is for sharing an image with a hashtag and other information.
Secondly, I want to retrieve into the application all the images shared with a particular hashtag.
Please guide me! Thanks in advance!
First, from iPhone Hooks, under 'Document Interaction':
To include a pre-filled caption with your photo, you can set the annotation property on the document interaction request to an NSDictionary containing an NSString under the key "InstagramCaption". Note: this feature will be available on Instagram 2.1 and later.
You'll need to add something like:
dic.annotation = [NSDictionary dictionaryWithObject:@"#yourTagHere" forKey:@"InstagramCaption"];
Second, you'll need to take a look at Tag Endpoints if you want to pull down images with a specific tag.
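For reference, a rough sketch of calling the v1 tags/{tag-name}/media/recent endpoint with NSURLSession is below; the access-token placeholder and the exact response keys are assumptions to verify against the Tag Endpoints documentation:

    // Fetch recent media for a tag (YOUR_ACCESS_TOKEN and the response parsing are illustrative).
    NSString *tag = @"yourTagHere";
    NSString *urlString = [NSString stringWithFormat:
        @"https://api.instagram.com/v1/tags/%@/media/recent?access_token=YOUR_ACCESS_TOKEN", tag];

    NSURLSessionDataTask *task = [[NSURLSession sharedSession]
        dataTaskWithURL:[NSURL URLWithString:urlString]
      completionHandler:^(NSData *data, NSURLResponse *response, NSError *error) {
          if (error != nil || data == nil) return;
          NSDictionary *json = [NSJSONSerialization JSONObjectWithData:data options:0 error:nil];
          for (NSDictionary *media in json[@"data"]) {
              // Each item carries image URLs, e.g. media[@"images"][@"standard_resolution"][@"url"].
              NSLog(@"image url: %@", media[@"images"][@"standard_resolution"][@"url"]);
          }
      }];
    [task resume];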

How to draw an image and text in an image context?

I am trying to draw text on an image. When I don't apply the transformations, the image is drawn at the bottom left corner and the image and text are fine (Fig 2), but I want the image at the top left of the view.
Below is my drawRect implementation.
How to flip the image so that text and image are aligned properly?
or
How to move the image to the top left of the view?
If I don't use the following function calls, the image gets created at the bottom of the view (Fig 2):
CGContextTranslateCTM(UIGraphicsGetCurrentContext(), 0.0, self.bounds.size.height);
CGContextScaleCTM(UIGraphicsGetCurrentContext(), 1.0, -1.0);
- (void)drawRect:(CGRect)rect
{
    UIGraphicsBeginImageContext(self.bounds.size);

    // Fig 2: comment these lines to have Fig 2
    CGContextTranslateCTM(UIGraphicsGetCurrentContext(), 0.0, self.bounds.size.height);
    CGContextScaleCTM(UIGraphicsGetCurrentContext(), 1.0, -1.0);
    // Applying the above transformations results in Fig 1

    UIImage *natureImage = [UIImage imageNamed:@"Nature"];
    CGRect natureImageRect = CGRectMake(130, 380, 50, 50);
    [natureImage drawInRect:natureImageRect];

    UIFont *numberFont = [UIFont systemFontOfSize:28.0];
    NSFileManager *fm = [NSFileManager defaultManager];
    NSString *aNumber = @"111";
    [aNumber drawAtPoint:CGPointMake(100, 335) withFont:numberFont];

    UIFont *textFont = [UIFont systemFontOfSize:22.0];
    NSString *aText = @"Hello";
    [aText drawAtPoint:CGPointMake(220, 370) withFont:textFont];

    self.image = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();

    NSData *imageData = UIImagePNGRepresentation(self.image);
    NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
    NSString *documentsDirectory = [paths objectAtIndex:0];
    NSString *filePath = [documentsDirectory stringByAppendingPathComponent:@"Image.png"];
    NSLog(@"filePath :%@", filePath);
    BOOL isFileCreated = [fm createFileAtPath:filePath contents:imageData attributes:nil];
    if (isFileCreated)
    {
        NSLog(@"File created at Path %@", filePath);
    }
}
Here is the code I used to draw the same image (except I did not use the nature image). It's not in drawRect:, but depending on your end goal it might be better to do this outside of drawRect: in a custom method and set self.image to the result of -(UIImage *)imageToDraw.
Here is the code:
- (UIImage *)imageToDraw
{
    UIGraphicsBeginImageContextWithOptions(CGSizeMake(320, 480), NO, [UIScreen mainScreen].scale);

    UIImage *natureImage = [UIImage imageNamed:@"testImage.png"];
    [natureImage drawInRect:CGRectMake(130, 380, 50, 50)];

    UIFont *numberFont = [UIFont systemFontOfSize:28.0];
    NSString *aNumber = @"111";
    [aNumber drawAtPoint:CGPointMake(100, 335) withFont:numberFont];

    UIFont *textFont = [UIFont systemFontOfSize:22.0];
    NSString *aText = @"Hello";
    [aText drawAtPoint:CGPointMake(220, 370) withFont:textFont];

    UIImage *resultingImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return resultingImage;
}
- (NSString *)filePath
{
    NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
    NSString *documentsDirectory = [paths objectAtIndex:0];
    return [documentsDirectory stringByAppendingPathComponent:@"Image.png"];
}

- (void)testImageWrite
{
    NSData *imageData = UIImagePNGRepresentation([self imageToDraw]);
    NSError *writeError = nil;
    BOOL success = [imageData writeToFile:[self filePath] options:0 error:&writeError];
    if (!success || writeError != nil)
    {
        NSLog(@"Error Writing: %@", writeError.description);
    }
}
Anyway, hope this helps.
Your image's frame of reference is the standard iOS one: the origin is at the top left corner. The text you are drawing with Core Text, however, uses the old frame of reference, with the origin at the bottom left corner. Just apply a transform to the text (not the graphics context), like so:
CGContextSetTextMatrix(context, CGAffineTransformMakeScale(1.0f, -1.0f));
Then all your content will be laid out as if the axis origin were at the top left corner.
If you have to write entire sentences with Core Text, as opposed to just words, take a look at this blog post.
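For illustration, a minimal Core Text sketch using that text-matrix flip, assuming a UIKit-style image context with a top-left origin (as created by UIGraphicsBeginImageContext); the string, font, and position are arbitrary:

    #import <CoreText/CoreText.h>

    CGContextRef context = UIGraphicsGetCurrentContext();

    // Flip only the text matrix (not the CTM) so the glyphs come out upright
    // in the top-left-origin context.
    CGContextSetTextMatrix(context, CGAffineTransformMakeScale(1.0f, -1.0f));

    CTFontRef font = CTFontCreateWithName(CFSTR("Helvetica"), 22.0, NULL);
    NSAttributedString *line = [[NSAttributedString alloc]
        initWithString:@"Hello"
            attributes:@{ (id)kCTFontAttributeName : (__bridge id)font }];
    CTLineRef ctLine = CTLineCreateWithAttributedString((__bridge CFAttributedStringRef)line);

    CGContextSetTextPosition(context, 220.0, 370.0);
    CTLineDraw(ctLine, context);

    CFRelease(ctLine);
    CFRelease(font);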
Note that the drawRect: method is meant for drawing into a view, not for creating an image. It is also a performance concern, because drawRect: is called a lot. For this purpose, viewDidLoad or some custom method is a better place.
I have made an example based on your request:
- (void)viewDidLoad {
    [super viewDidLoad];
    // Do any additional setup after loading the view, typically from a nib.
    UIGraphicsBeginImageContext(self.view.bounds.size);
    CGContextRef context = UIGraphicsGetCurrentContext();

    CGFloat margin = 10;
    CGFloat y = self.view.bounds.size.height * 0.5;
    CGFloat x = margin;

    UIImage *natureImage = [UIImage imageNamed:@"image"];
    CGRect natureImageRect = CGRectMake(x, y - 20, 40, 40);
    [natureImage drawInRect:natureImageRect];
    x += 40 + margin;

    UIFont *textFont = [UIFont systemFontOfSize:22.0];
    NSString *aText = @"Hello";
    NSMutableParagraphStyle *style = [[NSParagraphStyle defaultParagraphStyle] mutableCopy];
    style.alignment = NSTextAlignmentCenter;
    NSDictionary *attr = @{NSParagraphStyleAttributeName: style,
                           NSFontAttributeName: textFont};
    CGSize size = [aText sizeWithAttributes:attr];
    [aText drawInRect:CGRectMake(x, y - size.height * 0.5, 100, 40)
             withFont:textFont
        lineBreakMode:UILineBreakModeClip
            alignment:UITextAlignmentLeft];

    self.image = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    self.imageView.image = self.image;
}

iPhone UIImageView delays display when setImage

I am working on displaying several images in a scroll view, loaded from a map server that returns an image showing a map.
So what I did is create 4 UIImageViews and put them in an NSMutableDictionary.
Then, when scrolling to the desired image, it starts loading the data from the URL asynchronously.
So first I display a UIActivityIndicatorView, then the data loads, and at the end the UIActivityIndicatorView is hidden and the UIImageView is displayed.
Everything is working more or less fine, except that it takes very long until the image displays, although the image is not that big and I have a log statement indicating that the end of the function was reached. The log message appears immediately, but the image still does not show.
If I open the URL in a web browser, the image is shown immediately.
Below is my piece of code.
- (void)loadSRGImage:(int)page {
    UIImageView *currentSRGMap = (UIImageView *)[srgMaps objectForKey:[NSString stringWithFormat:@"image_%i", page]];
    UIActivityIndicatorView *currentLoading = (UIActivityIndicatorView *)[srgMaps objectForKey:[NSString stringWithFormat:@"loading_%i", page]];

    // if the image has been loaded already, do not load again
    if (currentSRGMap.image != nil) return;

    if (page > 1) {
        MKCoordinateSpan currentSpan;
        currentSpan.latitudeDelta = [[[srgMaps objectForKey:[NSString stringWithFormat:@"span_%i", page]] objectForKey:@"lat"] floatValue];
        currentSpan.longitudeDelta = [[[srgMaps objectForKey:[NSString stringWithFormat:@"span_%i", page]] objectForKey:@"lon"] floatValue];
        region.span = currentSpan;
        region.center = mapV.region.center;
        [mapV setRegion:region animated:TRUE];
        //[mapV regionThatFits:region];
    }

    srgLegende.hidden = NO;
    currentSRGMap.hidden = YES;
    currentLoading.hidden = NO;
    [currentLoading startAnimating];

    NSOperationQueue *queue = [NSOperationQueue new];
    NSInvocationOperation *operation = [[NSInvocationOperation alloc]
                                        initWithTarget:self
                                              selector:@selector(loadImage:)
                                                object:[NSString stringWithFormat:@"%i", page]];
    [queue addOperation:operation];
    [operation release];
}

- (void)loadImage:(NSInvocationOperation *)operation {
    NSString *imgStr = [@"image_" stringByAppendingString:(NSString *)operation];
    NSString *loadStr = [@"loading_" stringByAppendingString:(NSString *)operation];

    WGS84ToCH1903 *converter = [[WGS84ToCH1903 alloc] init];
    CLLocationCoordinate2D coord1 = [mapV convertPoint:mapV.bounds.origin toCoordinateFromView:mapV];
    CLLocationCoordinate2D coord2 = [mapV convertPoint:CGPointMake(mapV.bounds.size.width, mapV.bounds.size.height) toCoordinateFromView:mapV];

    int x1 = [converter WGStoCHx:coord1.longitude withLat:coord1.latitude];
    int y1 = [converter WGStoCHy:coord1.longitude withLat:coord1.latitude];
    int x2 = [converter WGStoCHx:coord2.longitude withLat:coord2.latitude];
    int y2 = [converter WGStoCHy:coord2.longitude withLat:coord2.latitude];

    NSString *URL = [NSString stringWithFormat:@"http://map.ssatr.ch/mapserv?mode=map&map=import/dab/maps/dab_online.map&mapext=%i+%i+%i+%i&mapsize=320+372&layers=DAB_Radio_Top_Two", y1, x1, y2, x2];
    NSData *imageData = [[NSData alloc] initWithContentsOfURL:[NSURL URLWithString:URL]];
    UIImage *image = [[UIImage alloc] initWithData:imageData];
    [imageData release];

    UIImageView *currentSRGMap = (UIImageView *)[srgMaps objectForKey:imgStr];
    UIActivityIndicatorView *currentLoading = (UIActivityIndicatorView *)[srgMaps objectForKey:loadStr];

    currentSRGMap.hidden = NO;
    currentLoading.hidden = YES;
    [currentLoading stopAnimating];
    [currentSRGMap setImage:image]; //UIImageView
    [image release];

    NSLog(@"finished loading image: %@", URL);
}
I had a similar thing in my app, and I used SDWebImage from https://github.com/rs/SDWebImage. It provides a category on UIImageView: you give the image view a placeholder image and a URL, and it displays the image once it has been downloaded. It also maintains a cache of downloaded images, which avoids repeated server calls.
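Since the image here really does come from a URL, a minimal sketch of that category call might look like the following (currentSRGMap and URL are the names from the question's loadImage: method; the placeholder asset name is made up):

    #import "UIImageView+WebCache.h"

    // Replaces the manual NSInvocationOperation + initWithContentsOfURL: flow:
    // SDWebImage downloads asynchronously, updates the image view on the main thread,
    // and caches the result so the same map image is not fetched twice.
    [currentSRGMap setImageWithURL:[NSURL URLWithString:URL]
                  placeholderImage:[UIImage imageNamed:@"loading.png"]];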

iPad AQGridView not showing my UIImages that are not in the bundle

I am using the ImageDemo template and it works great if I hard-code an image from the bundle, but when I try to load images from directories I have created in my app, I have no luck.
Code:
- (AQGridViewCell *)gridView:(AQGridView *)aGridView cellForItemAtIndex:(NSUInteger)index
{
    static NSString *PlainCellIdentifier = @"PlainCellIdentifier";
    AQGridViewCell *cell = nil;

    ImageDemoGridViewCell *plainCell = (ImageDemoGridViewCell *)[aGridView dequeueReusableCellWithIdentifier:PlainCellIdentifier];
    if (plainCell == nil)
    {
        plainCell = [[[ImageDemoGridViewCell alloc] initWithFrame:CGRectMake(0.0, 0.0, 200.0, 150.0)
                                                  reuseIdentifier:PlainCellIdentifier] autorelease];
        plainCell.selectionGlowColor = [UIColor lightGrayColor];
    }

    NSString *fileName = [NSString stringWithFormat:@"%@/%@_tn.jpg", [manufacturerID stringValue], [[[self.collectionItems objectAtIndex:index] valueForKey:@"PhotoName"] stringByReplacingOccurrencesOfString:@" " withString:@"%20"]];
    NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
    NSString *documentsPath = [paths objectAtIndex:0];
    NSString *savePath = [documentsPath stringByAppendingPathComponent:fileName];

    NSData *imageData = nil;
    BOOL fileExists = [[NSFileManager defaultManager] fileExistsAtPath:savePath];
    if (fileExists) {
        imageData = [NSData dataWithContentsOfFile:savePath];
    } else {
        NSString *filePath = [[NSBundle mainBundle] pathForResource:@"NoImageAvailableThumbNail" ofType:@"png"];
        imageData = [NSData dataWithContentsOfFile:filePath];
    }

    UIImage *myImage = [UIImage imageWithData:imageData];
    NSLog(@"%@", myImage);
    if (myImage == nil) {
        NSLog(@"NO IMAGE");
    }
    plainCell.image = myImage;

    cell = plainCell;
    return (cell);
}
I know the code I am using to pull the image data works, because I use it all over in other places in my app, and I am checking for nil when I set the property on the AQGridViewCell:
- (void)setImage:(UIImage *)anImage
{
    _imageView.image = anImage;
    [self setNeedsLayout];
}
I don't see a check for a nil UIImage in your example there. Are you certain it's being loaded correctly? Try creating the image on one line and setting it on another, so you can see it being created in the debugger. For reference, I do something similar with cached book cover images in the Kobo app, and that works. One difference, though, is that I use the C methods to load a UIImage from a PNG file. Something like UIImageFromPNGData(), off the top of my head (sorry, replying from my iPhone right now).
The only other thing I can think to check is whether the image view is hidden for some reason. Printing it in the debugger should tell you that, along with its size, layout, etc.
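A small debugging sketch along those lines, using the names from the question's cellForItemAtIndex: and setImage: (assuming _imageView is reachable there for logging):

    // Create the image on its own line so it can be inspected in the debugger, then assign it.
    UIImage *myImage = [UIImage imageWithData:imageData];
    NSLog(@"loaded image: %@, size: %@", myImage, NSStringFromCGSize(myImage.size));
    plainCell.image = myImage;

    // Inside the cell's setImage:, log the image view's state to rule out a hidden or zero-sized view.
    NSLog(@"imageView hidden: %d, frame: %@", _imageView.hidden, NSStringFromCGRect(_imageView.frame));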