Load static google map into UIImageView? - iphone

I'm trying to show a small static google map in my app. I'm wondering if it's possible to do this using UIImageView? It doesn't seem to be working for me but maybe I'm doing something wrong.
This is what I have (I stripped the URL so that it's easier to read):
NSURL *mapurl = [[NSURL alloc] initWithString:@"http://maps.google.com/maps/api/staticmap?[...]"];
NSData *mapdata = [[NSData alloc] initWithContentsOfURL:mapurl];
UIImage *uimap = [[UIImage alloc] initWithData:mapdata];
self.map.image = uimap; // map is my UIImageView
[mapurl release];
[mapdata release];
[uimap release];
I know my UIImageView is setup correctly because if I replace the URL with one of an image then it loads fine. Also I know the google maps URL is correct because if I load it straight into a browser it shows up fine.
I'm starting to think that this might not be possible. If anyone has any ideas or alternatives please share.
Thanks.

Well I ended up finding a solution for this so I'll post it in case someone else runs into the same issue.
First, I changed the image loading to use NSURLConnection. I don't think this particular change makes a difference, but it's what allowed me to find the problem: when trying to load the URL I got a "bad URL" error. Some searching revealed that the Google Static Maps URL contains invalid characters that must be escaped.
So basically just call this on the url and you should be good to go:
// The URL you want to load (snipped for readability)
NSString *mapurl = @"http://maps.google.com/maps/api/staticmap?[...]";
// Escape the string
mapurl = [mapurl stringByAddingPercentEscapesUsingEncoding:NSUTF8StringEncoding];
I haven't tested this solution with the code I have in my original post but I did test it with my new NSURLConnection implementation and it works perfectly. I assume it'll work fine with the code above as well since it was just a matter of escaping the string.
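Putting the pieces together, a minimal sketch of the original approach with the escaping applied (the Static Maps parameters here are illustrative placeholders, not the asker's actual URL):

```objc
// Hypothetical Static Maps URL; center/zoom/size values are placeholders.
NSString *raw = @"http://maps.google.com/maps/api/staticmap?"
                @"center=40.7,-74.0&zoom=12&size=200x200&sensor=false";
// Escape characters that are invalid in a URL (e.g. the '|' used by marker parameters).
NSString *escaped = [raw stringByAddingPercentEscapesUsingEncoding:NSUTF8StringEncoding];
NSURL *mapurl = [NSURL URLWithString:escaped];
NSData *mapdata = [NSData dataWithContentsOfURL:mapurl];
self.map.image = [UIImage imageWithData:mapdata]; // map is the UIImageView
```

Note that without the escaping step, URLWithString: simply returns nil, which is why the original code failed silently.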

Related

iOS FTP Upload with SCRFTPRequest on ARC

Hi everybody, I'm using SCRFTPRequest (https://github.com/Hackmodford/SCRFTPRequest) to upload a file to an FTP server. I've used it before and it worked great, but now I have a project that uses ARC; the library was converted to ARC a couple of days ago.
I'm getting a crash and I don't know why. It crashes without any error message after startRequest; the whole function executes, including startUploadRequest. I've tested on an iPad 1 with iOS 5.0 and an iPad 3 with iOS 6.1.
I've tried other files with the same result: the file is created on the FTP server with the correct name, but its size is 0 KB.
filePath = [[NSBundle mainBundle] pathForResource:@"Spiriva Mentiones Legales" ofType:@"pdf"];
SCRFTPRequest *ftpRequest = [[SCRFTPRequest alloc] initWithURL:[NSURL URLWithString:@"ftp://ftp.belersoft.ro/"]
toUploadFile:filePath];
ftpRequest.username = @"user";
ftpRequest.password = @"pass";
ftpRequest.delegate = self;
ftpRequest.customUploadFileName = [modifiedPathWithXLS lastPathComponent];
[ftpRequest startRequest];
I have implemented all the delegate callbacks, but only ftpRequestWillStart gets called.
In case anyone else is having the 'zero-byte file upload' issue, try replacing
[ftpRequest startRequest];
with
[ftpRequest startAsynchronous];
And add the following delegate method to your code:
- (void)ftpRequest:(SCRFTPRequest *)request didChangeStatus:(SCRFTPRequestStatus)status
(Note the difference between this method name and the one listed in the SCRFTPRequest docs.)
I'm not familiar enough with this library yet to be comfortable changing the library itself, but adjusting the delegate method name and using asynchronous worked for me.
Use [ftpRequest startAsynchronous];

Downloading image from server without file extension (NSData to UIImage)

From my server I'm pulling down a URL that is supposed to simply be a profile image. The relevant code for pulling down the image from the URL is this:
NSString *urlString = [NSString stringWithFormat:@"%@%@",kBaseURL,profile_image_url];
profilePic = [UIImage imageWithData:[NSData dataWithContentsOfURL:[NSURL URLWithString:urlString]]];
My URL is in this format (note there's no file extension on the end, since it's dynamically rendered):
localhost:8000/people/1/profile_image
If I load the url in my browser the image displays; however the code above for pulling down the UIImage does not work. I've verified that the code does pull an image from a random site on the interwebs.
Any thoughts on why this is happening?
Does it work if you feed NSURL
http://localhost:8000/people/1/profile_image
I imagine that the http:// is important here (so that the NSURL knows whether it's pointing at a file or remote URL).
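A defensive sketch of that idea (the variable names are illustrative): check for a scheme and prepend one before building the NSURL, since a bare host/path may be misparsed.

```objc
// Bare host/path from the question; NSURL would otherwise treat "localhost" as the scheme.
NSString *urlString = @"localhost:8000/people/1/profile_image";
if (![urlString hasPrefix:@"http://"] && ![urlString hasPrefix:@"https://"]) {
    urlString = [@"http://" stringByAppendingString:urlString];
}
NSURL *url = [NSURL URLWithString:urlString];
```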
The reason you can't display it in a UIImageView is that the image format (JPEG, PNG, etc.) isn't known.
A good rule for future coding: always use extensions. Don't take shortcuts; they tend to cause problems you'll have to fix over and over again.
First, check the following:
Permissions: change from read-only to read and write.
I have tried many methods in my project to display a picture from a server, and the most prominent issue I've found is that, by default, your localhost folders are read-only, so you need to change the permissions as well. Initially I thought read-only wouldn't matter if I'm just loading the picture, because technically I'm only reading the contents, right? But it did matter.
White spaces
Another thing you could check is whether your image name contains white spaces; if it does, you need to escape them when building the URL request.
Try adding extensions if nothing else works
If everything else is fine, try loading the image with an extension, which should obviously work.
Provide us with some initial code as well
Share your code snippets and console output; that would help the community reach a solution to the problem much faster.

iPhone RSS Reader, get image from rss feed

I have a problem.
I have an app that downloads an RSS feed from a website. It all works fine, but I want to show an image in the table view where the articles appear. How can I get the image from the RSS feed? I need the code.
For the title and description I've done it perfectly, but I'm not able to retrieve the image.
P.S.: sorry for my English, I'm Italian :)
Ok, so I presume you have the URL of the image. You can do this:
//Normal text setting code
NSString *imageURLString = [rssFeed objectForKey:@"img"];
NSURL *imageURL = [NSURL URLWithString:imageURLString];
NSData *imageData = [NSData dataWithContentsOfURL:imageURL];
cell.imageView.image = [UIImage imageWithData:imageData];
That is the quick and filthy method. It's running in your UI thread etc, so it's going to be a terrible user experience. You can fiddle with performance on your own terms, but I'd suggest one of the following:
Load all your data asynchronously and fill the data as it becomes available
Load all of your data in a secondary thread, but show a loading spinner
For both of the above, you would put your images into an NSArray and then call them by doing cell.imageView.image = [myImageArray objectAtIndex:indexPath.row];
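The first option can be sketched with GCD (a minimal sketch, assuming imageURL, tableView, and indexPath are in scope; caching and error handling are omitted):

```objc
// Fetch the image off the main thread, then hop back to update the UI.
dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
    NSData *imageData = [NSData dataWithContentsOfURL:imageURL];
    UIImage *image = [UIImage imageWithData:imageData];
    dispatch_async(dispatch_get_main_queue(), ^{
        // UIKit must be touched on the main thread only.
        UITableViewCell *cell = [tableView cellForRowAtIndexPath:indexPath];
        cell.imageView.image = image; // nil if the row scrolled off screen, which is safe
    });
});
```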
The best (and most complicated) solution, though, is called 'lazy loading'.
If you want to do lazy loading (the way facebook and twitter do it) you should check this out: LazyTableImages.
Happy coding, Zane
There is a plugin for it.
WP RSS Images Plugin
http://wordpress.org/extend/plugins/wp-rss-images/
After downloading and successfully installing it, you need to parse the enclosure tag.
It will be in the enclosure element, under the "url" attribute. If you need it, I can post the code as well.

Mapping data from RSS feed with no latitude/longitude using MapKit

Having followed the tutorial at http://mithin.in/2009/06/22/using-iphone-sdk-mapkit-framework-a-tutorial/ I've integrated MapKit into my application. However, I want to display an annotation for a dynamic "incident" on the map, and here's where I'm having problems.
Basically my apps an RSS reader, and it downloads everything and saves each item of the feed (as in Story 1, Story 2) as currentItem. The data I want to grab and use to map the annotation can be found under currentItem.title - but I can't seem to get it to work in this tutorial's code or find the right place to put it.
The RSS feed I'm using doesn't have any latitude/longitude information either - making it even harder.
Any help would be great - I've been working on this for a couple of days now with still no luck.
UPDATE: I'm even considering launching the Google Maps application instead of showing the annotation. But this in itself is raising issues. Below is the code I'm using, but it's throwing errors everywhere.
NSString *title = currentItem.title;
int zoom = 13;
NSString *stringURL = [NSString stringWithFormat:@"http://maps.google.com/maps?q=%@@%1.6f,%1.6f&z=%d", title, zoom];
NSURL *url = [NSURL URLWithString:stringURL];
[[UIApplication sharedApplication] openURL:url];
In the end I just mapped the title using reverse lookup, and added the country afterwards so results would only be in the country where the address is. Not ideal, but it seems to work OK.
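For reference, that lookup can be done by querying the Google Geocoding API directly. This is only an illustrative sketch: the country string and URL parameters are assumptions, and parsing the JSON response (to pull out the lat/lng for the annotation) is omitted.

```objc
// Hypothetical: bias the lookup by appending a country to the item title.
NSString *query = [NSString stringWithFormat:@"%@, United Kingdom", currentItem.title];
query = [query stringByAddingPercentEscapesUsingEncoding:NSUTF8StringEncoding];
NSString *geocodeURL = [NSString stringWithFormat:
    @"http://maps.googleapis.com/maps/api/geocode/json?address=%@&sensor=false", query];
// Synchronous for brevity; do this off the main thread in real code.
NSData *json = [NSData dataWithContentsOfURL:[NSURL URLWithString:geocodeURL]];
```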

Using TwitPic API from ObjectiveC/iPhone

There's a similar question on here about this which I've read and I've tried to follow the advice given there... and I think I'm 95% complete but the remaining 5%... well you know ;-)
So I'm trying to call the twitPic API to upload an image and I've got the image contained in a UIImageView which I'm displaying on the screen (I can see it so it's definitely there). My code to form the API call looks like this:
NSURL *url = [NSURL URLWithString:@"http://twitpic.com/api/upload"];
NSString *username = @"myUsername";
NSString *password = @"myPassword";
NSData *twitpicImage = UIImagePNGRepresentation(imageView.image);
// Now, set up the post data:
ASIFormDataRequest *request = [[[ASIFormDataRequest alloc] initWithURL:url] autorelease];
[request setPostValue:twitpicImage forKey:@"media"];
[request setPostValue:username forKey:@"username"];
[request setPostValue:password forKey:@"password"];
// Initiate the WebService request
[request start];
I get an error back from it stating 'image not found'.
Is it obvious what I'm doing wrong? Any hints at all? I'm only a week deep in ObjectiveC so it's quite likely it's a real newbie error.
On the same track - it's not clear to me how I can capture a success or failure properly in code here - I'm currently dumping the 'request responseString' to an alert which isn't the best thing - how can I check the result properly?
I've also seen the use of 'NSLog' - which I suspect is a debugging/console logging tool - but I can't see its output anywhere in Xcode; it doesn't seem to be shown in the debugger. Any clues at all?!
Sorry if the above is really dumb - I can take a little ridicule - but I'm kind of isolated with my iPhone adventures - no one to bounce anything off etc - so I'm venting it all off here ;-)
Cheers,
Jamie.
You need to use the setData method to copy the image data into the post, like this:
[request setData:twitPicImage forKey:@"media"];
You're making a synchronous call, which is going to stall your app while you upload all that image data - you might want to switch to using an NSOperationQueue, or the ASINetworkQueue subclass that allows you to show a progress bar.
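As a sketch, ASIHTTPRequest also offers block-based callbacks that let you handle success or failure without blocking the UI (check the exact API against the ASIHTTPRequest version you're using; the __block qualifier avoids a retain cycle under manual reference counting):

```objc
__block ASIFormDataRequest *request = [ASIFormDataRequest requestWithURL:url];
[request setData:twitpicImage forKey:@"media"];
[request setPostValue:username forKey:@"username"];
[request setPostValue:password forKey:@"password"];
[request setCompletionBlock:^{
    // HTTP-level success; inspect the body for TwitPic's own status code.
    NSLog(@"HTTP %d: %@", [request responseStatusCode], [request responseString]);
}];
[request setFailedBlock:^{
    NSLog(@"Upload failed: %@", [request error]);
}];
[request startAsynchronous];
```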
You should be able to see NSLog output in the debugger window of Xcode. Make sure you've switched to it (the control at the top left with a spray can on it). You can also launch the Console.